Dataset schema (column, dtype, and observed value counts or length ranges):

| column | dtype | range |
|---|---|---|
| sha | null | – |
| last_modified | null | – |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5–122 |
| tags | listlengths | 1–1.84k |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
| tokens_length | listlengths | 1–723 |
| input_texts | listlengths | 1–61 |
| embeddings | listlengths | 768–768 |
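Given this schema, one way to inspect the records below is via the `datasets` library. This is a minimal sketch; the dataset id `user/model-cards-dump` is a hypothetical placeholder, since the dump's actual source is not named here.

```python
from datasets import load_dataset

# "user/model-cards-dump" is a hypothetical placeholder id, not the real source.
ds = load_dataset("user/model-cards-dump", split="train")

row = ds[0]
print(row["id"], row["pipeline_tag"])   # model id and task, per the schema above
print(len(row["embeddings"]))           # always 768, per the schema
print(row["text"][:200])                # start of the raw model card text
```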
sha: null
last_modified: null
library_name: transformers
# DETR (End-to-End Object Detection) model with ResNet-50 backbone (dilated C5 stage)

DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr).

Disclaimer: The team releasing DETR did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and an MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100.

The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, the remaining 96 annotations simply have "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU losses (for the bounding boxes) are used to optimize the parameters of the model. A minimal code sketch of this matching step follows the training details below.

DETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.

## Intended uses & limitations

You can use the raw model for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models.

### How to use

Here is how to use this model:

```python
from transformers import DetrFeatureExtractor, DetrForSegmentation
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = DetrFeatureExtractor.from_pretrained('facebook/detr-resnet-50-dc5-panoptic')
model = DetrForSegmentation.from_pretrained('facebook/detr-resnet-50-dc5-panoptic')

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)

# model predicts COCO classes, bounding boxes, and masks
logits = outputs.logits
bboxes = outputs.pred_boxes
masks = outputs.pred_masks
```

Currently, both the feature extractor and model support PyTorch.

## Training data

The DETR model was trained on [COCO 2017 panoptic](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

## Training procedure

### Preprocessing

The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/detr/blob/master/datasets/coco_panoptic.py).

Images are resized/rescaled such that the shortest side is at least 800 pixels and the longest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).

### Training

The model was trained for 300 epochs on 16 V100 GPUs. Training takes 3 days, with 4 images per GPU (hence a total batch size of 64).
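As a concrete illustration of the bipartite matching described under Model description, here is a minimal sketch using `scipy.optimize.linear_sum_assignment` (a standard Hungarian-algorithm implementation). The shapes, random inputs, and cost weights are illustrative assumptions, not the exact DETR training code (the real matching cost also includes a generalized-IoU term):

```python
import torch
from scipy.optimize import linear_sum_assignment

# Toy sizes: N = 5 object queries, 3 ground-truth objects (DETR uses N = 100).
N, num_gt, num_classes = 5, 3, 91

pred_logits = torch.randn(N, num_classes)   # class logits, one row per query
pred_boxes = torch.rand(N, 4)               # normalized (cx, cy, w, h) boxes
gt_labels = torch.randint(0, num_classes, (num_gt,))
gt_boxes = torch.rand(num_gt, 4)

# Pairwise matching cost: negative class probability plus weighted L1 box
# distance (weights here are illustrative).
prob = pred_logits.softmax(-1)                      # (N, num_classes)
cost_class = -prob[:, gt_labels]                    # (N, num_gt)
cost_bbox = torch.cdist(pred_boxes, gt_boxes, p=1)  # (N, num_gt)
cost = cost_class + 5.0 * cost_bbox

# Hungarian algorithm: optimal one-to-one assignment of queries to targets.
query_idx, target_idx = linear_sum_assignment(cost.numpy())
print(list(zip(query_idx.tolist(), target_idx.tolist())))
# Queries left unmatched are trained to predict the "no object" class.
```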
## Evaluation results

This model achieves the following results on COCO 2017 validation: a box AP (average precision) of **40.2**, a segmentation AP (average precision) of **31.9**, and a PQ (panoptic quality) of **44.6**.

For more details regarding evaluation results, we refer to table 5 of the original paper.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
  author        = {Nicolas Carion and Francisco Massa and Gabriel Synnaeve and Nicolas Usunier and Alexander Kirillov and Sergey Zagoruyko},
  title         = {End-to-End Object Detection with Transformers},
  journal       = {CoRR},
  volume        = {abs/2005.12872},
  year          = {2020},
  url           = {https://arxiv.org/abs/2005.12872},
  archivePrefix = {arXiv},
  eprint        = {2005.12872},
  timestamp     = {Thu, 28 May 2020 17:38:09 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
{"license": "apache-2.0", "tags": ["image-segmentation"], "datasets": ["coco"], "widget": [{"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/dog-cat.jpg", "example_title": "Dog & Cat"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/construction-site.jpg", "example_title": "Construction Site"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/apple-orange.jpg", "example_title": "Apple & Orange"}]}
image-segmentation
facebook/detr-resnet-50-dc5-panoptic
[ "transformers", "pytorch", "safetensors", "detr", "image-segmentation", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.12872" ]
[]
TAGS #transformers #pytorch #safetensors #detr #image-segmentation #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us
tokens_length: [ 61, 125, 325, 34, 28, 40, 3, 94, 42, 74, 11 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #detr #image-segmentation #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# DETR (End-to-End Object Detection) model with ResNet-50 backbone (dilated C5 stage)\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team." ]
embeddings: 768-dimensional float vector; values omitted.
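The `embeddings` column holds one fixed-length 768-dimensional vector per record. The dump does not state what the vectors are for; assuming they are semantic text embeddings of the cards, here is a minimal sketch of comparing two records by cosine similarity:

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Placeholders for two records' embeddings, e.g. ds[0]["embeddings"].
emb_a = np.random.rand(768)
emb_b = np.random.rand(768)
print(cosine(emb_a, emb_b))
```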
sha: null
last_modified: null
library_name: transformers
# DETR (End-to-End Object Detection) model with ResNet-50 backbone (dilated C5 stage)

DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr).

Disclaimer: The team releasing DETR did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and an MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100.

The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, the remaining 96 annotations simply have "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU losses (for the bounding boxes) are used to optimize the parameters of the model.

## Intended uses & limitations

You can use the raw model for object detection. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models.

### How to use

Here is how to use this model:

```python
from transformers import DetrFeatureExtractor, DetrForObjectDetection
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = DetrFeatureExtractor.from_pretrained('facebook/detr-resnet-50-dc5')
model = DetrForObjectDetection.from_pretrained('facebook/detr-resnet-50-dc5')

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)

# model predicts bounding boxes and corresponding COCO classes
logits = outputs.logits
bboxes = outputs.pred_boxes
```

Currently, both the feature extractor and model support PyTorch.

## Training data

The DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

## Training procedure

### Preprocessing

The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/detr/blob/master/datasets/coco.py).

Images are resized/rescaled such that the shortest side is at least 800 pixels and the longest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225). A minimal sketch of this resize-and-normalize rule follows the training details below.

### Training

The model was trained for 300 epochs on 16 V100 GPUs. Training takes 3 days, with 4 images per GPU (hence a total batch size of 64).
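Here is a minimal sketch of the resize-and-normalize rule described under Preprocessing, written with `torchvision`. The function name and structure are our own; the real DETR pipeline (including training-time augmentations) lives in the repository linked above.

```python
import torchvision.transforms.functional as F
from PIL import Image

def resize_and_normalize(image: Image.Image, min_size: int = 800, max_size: int = 1333):
    # Scale so the shortest side reaches min_size, but cap the scale so the
    # longest side does not exceed max_size.
    w, h = image.size
    scale = min_size / min(w, h)
    if max(w, h) * scale > max_size:
        scale = max_size / max(w, h)
    image = F.resize(image, [round(h * scale), round(w * scale)])

    # To a tensor in [0, 1], then normalize with the ImageNet statistics above.
    tensor = F.to_tensor(image)
    return F.normalize(tensor, mean=[0.485, 0.456, 0.406],
                       std=[0.229, 0.224, 0.225])
```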
## Evaluation results

This model achieves an AP (average precision) of **43.3** on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
  author        = {Nicolas Carion and Francisco Massa and Gabriel Synnaeve and Nicolas Usunier and Alexander Kirillov and Sergey Zagoruyko},
  title         = {End-to-End Object Detection with Transformers},
  journal       = {CoRR},
  volume        = {abs/2005.12872},
  year          = {2020},
  url           = {https://arxiv.org/abs/2005.12872},
  archivePrefix = {arXiv},
  eprint        = {2005.12872},
  timestamp     = {Thu, 28 May 2020 17:38:09 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
{"license": "apache-2.0", "tags": ["object-detection", "vision"], "datasets": ["coco"], "widget": [{"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg", "example_title": "Savanna"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg", "example_title": "Football Match"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg", "example_title": "Airport"}]}
object-detection
facebook/detr-resnet-50-dc5
[ "transformers", "pytorch", "safetensors", "detr", "object-detection", "vision", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.12872" ]
[]
TAGS #transformers #pytorch #safetensors #detr #object-detection #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us
tokens_length: [ 62, 126, 295, 33, 28, 41, 3, 94, 42, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #detr #object-detection #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# DETR (End-to-End Object Detection) model with ResNet-50 backbone (dilated C5 stage)\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model." ]
embeddings: 768-dimensional float vector; values omitted.
null
null
transformers
# DETR (End-to-End Object Detection) model with ResNet-50 backbone

DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr). 

Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. 

The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.

DETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/detr_architecture.png)

## Intended uses & limitations

You can use the raw model for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models.

### How to use

Here is how to use this model:

```python
import io

import requests
from PIL import Image
import torch
import numpy

from transformers import DetrFeatureExtractor, DetrForSegmentation
from transformers.models.detr.feature_extraction_detr import rgb_to_id

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = DetrFeatureExtractor.from_pretrained("facebook/detr-resnet-50-panoptic")
model = DetrForSegmentation.from_pretrained("facebook/detr-resnet-50-panoptic")

# prepare image for the model
inputs = feature_extractor(images=image, return_tensors="pt")

# forward pass
outputs = model(**inputs)

# use the `post_process_panoptic` method of `DetrFeatureExtractor` to convert to COCO format
processed_sizes = torch.as_tensor(inputs["pixel_values"].shape[-2:]).unsqueeze(0)
result = feature_extractor.post_process_panoptic(outputs, processed_sizes)[0]

# the segmentation is stored in a special-format png
panoptic_seg = Image.open(io.BytesIO(result["png_string"]))
panoptic_seg = numpy.array(panoptic_seg, dtype=numpy.uint8)

# retrieve the ids corresponding to each mask
panoptic_seg_id = rgb_to_id(panoptic_seg)
```

Currently, both the feature extractor and model support PyTorch. 
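The ids recovered by `rgb_to_id` can be linked back to predicted classes. As a minimal sketch continuing from the snippet above (assuming, as in the original DETR post-processing, that each entry of `result["segments_info"]` carries `"id"`, `"isthing"` and `"category_id"` keys, and that `model.config.id2label` covers the panoptic categories):

```python
# map each panoptic segment id back to its predicted COCO category
for segment in result["segments_info"]:
    mask = panoptic_seg_id == segment["id"]  # boolean mask for this segment
    label = model.config.id2label[segment["category_id"]]
    print(f"segment {segment['id']}: {label} (isthing={segment['isthing']}), {mask.sum()} pixels")
```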
## Training data

The DETR model was trained on [COCO 2017 panoptic](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively. 

## Training procedure

### Preprocessing

The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/detr/blob/master/datasets/coco_panoptic.py). 

Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).

### Training

The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).

## Evaluation results

This model achieves the following results on COCO 2017 validation: a box AP (average precision) of **38.8**, a segmentation AP (average precision) of **31.1** and a PQ (panoptic quality) of **43.4**.

For more details regarding evaluation results, we refer to table 5 of the original paper.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
  author    = {Nicolas Carion and
               Francisco Massa and
               Gabriel Synnaeve and
               Nicolas Usunier and
               Alexander Kirillov and
               Sergey Zagoruyko},
  title     = {End-to-End Object Detection with Transformers},
  journal   = {CoRR},
  volume    = {abs/2005.12872},
  year      = {2020},
  url       = {https://arxiv.org/abs/2005.12872},
  archivePrefix = {arXiv},
  eprint    = {2005.12872},
  timestamp = {Thu, 28 May 2020 17:38:09 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
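For reference, the resize-and-normalize rule from the Preprocessing section above can be written out explicitly. A minimal sketch under those stated constants (`preprocess_like_detr` is a hypothetical helper, not the exact training pipeline):

```python
import torchvision.transforms.functional as F

IMAGENET_MEAN = [0.485, 0.456, 0.406]
IMAGENET_STD = [0.229, 0.224, 0.225]

def preprocess_like_detr(image, shortest=800, longest=1333):
    # scale so the shortest side reaches `shortest`, capped so the longest side stays <= `longest`
    w, h = image.size
    scale = shortest / min(w, h)
    if max(w, h) * scale > longest:
        scale = longest / max(w, h)
    image = F.resize(image, [round(h * scale), round(w * scale)])
    # convert to a [0, 1] tensor and normalize with the ImageNet statistics
    return F.normalize(F.to_tensor(image), mean=IMAGENET_MEAN, std=IMAGENET_STD)
```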
{"license": "apache-2.0", "tags": ["image-segmentation", "vision"], "datasets": ["coco"], "widget": [{"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg", "example_title": "Football Match"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/dog-cat.jpg", "example_title": "Dog & Cat"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/construction-site.jpg", "example_title": "Construction Site"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/apple-orange.jpg", "example_title": "Apple & Orange"}]}
image-segmentation
facebook/detr-resnet-50-panoptic
[ "transformers", "pytorch", "detr", "image-segmentation", "vision", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.12872" ]
[]
TAGS #transformers #pytorch #detr #image-segmentation #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# DETR (End-to-End Object Detection) model with ResNet-50 backbone DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model. DETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs. !model image ## Intended uses & limitations You can use the raw model for panoptic segmentation. See the model hub to look for all available DETR models. ### How to use Here is how to use this model: Currently, both the feature extractor and model support PyTorch. ## Training data The DETR model was trained on COCO 2017 panoptic, a dataset consisting of 118k/5k annotated images for training/validation respectively. ## Training procedure ### Preprocessing The exact details of preprocessing of images during training/validation can be found here. Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225). ### Training The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64). ## Evaluation results This model achieves the following results on COCO 2017 validation: a box AP (average precision) of 38.8, a segmentation AP (average precision) of 31.1 and a PQ (panoptic quality) of 43.4. For more details regarding evaluation results, we refer to table 5 of the original paper. ### BibTeX entry and citation info
[ "# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\nDETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.\n\n!model image", "## Intended uses & limitations\n\nYou can use the raw model for panoptic segmentation. See the model hub to look for all available DETR models.", "### How to use\n\nHere is how to use this model:\n\n\n\nCurrently, both the feature extractor and model support PyTorch.", "## Training data\n\nThe DETR model was trained on COCO 2017 panoptic, a dataset consisting of 118k/5k annotated images for training/validation respectively.", "## Training procedure", "### Preprocessing\n\nThe exact details of preprocessing of images during training/validation can be found here. \n\nImages are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).", "### Training\n\nThe model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).", "## Evaluation results\n\nThis model achieves the following results on COCO 2017 validation: a box AP (average precision) of 38.8, a segmentation AP (average precision) of 31.1 and a PQ (panoptic quality) of 43.4.\n\nFor more details regarding evaluation results, we refer to table 5 of the original paper.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #detr #image-segmentation #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\nDETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.\n\n!model image", "## Intended uses & limitations\n\nYou can use the raw model for panoptic segmentation. See the model hub to look for all available DETR models.", "### How to use\n\nHere is how to use this model:\n\n\n\nCurrently, both the feature extractor and model support PyTorch.", "## Training data\n\nThe DETR model was trained on COCO 2017 panoptic, a dataset consisting of 118k/5k annotated images for training/validation respectively.", "## Training procedure", "### Preprocessing\n\nThe exact details of preprocessing of images during training/validation can be found here. \n\nImages are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).", "### Training\n\nThe model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).", "## Evaluation results\n\nThis model achieves the following results on COCO 2017 validation: a box AP (average precision) of 38.8, a segmentation AP (average precision) of 31.1 and a PQ (panoptic quality) of 43.4.\n\nFor more details regarding evaluation results, we refer to table 5 of the original paper.", "### BibTeX entry and citation info" ]
[ 58, 118, 328, 34, 28, 40, 3, 94, 42, 73, 11 ]
[ "passage: TAGS\n#transformers #pytorch #detr #image-segmentation #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\nDETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.\n\n!model image" ]
null
null
transformers
# DETR (End-to-End Object Detection) model with ResNet-50 backbone

DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr). 

Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. 

The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/detr_architecture.png)

## Intended uses & limitations

You can use the raw model for object detection. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models. 
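To make the bipartite matching step described above concrete, here is a toy sketch of the Hungarian assignment using SciPy; the random cost matrix is a stand-in for DETR's actual cost terms (class probability, L1 box distance and generalized IoU):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

num_queries = 100
# toy cost between the N queries and the N padded ground-truth slots
cost = np.random.rand(num_queries, num_queries)
# optimal one-to-one matching (Hungarian algorithm)
query_idx, target_idx = linear_sum_assignment(cost)
# the losses (cross-entropy, L1 + generalized IoU) are then computed on these matched pairs
```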
### How to use

Here is how to use this model:

```python
from transformers import DetrImageProcessor, DetrForObjectDetection
import torch
from PIL import Image
import requests

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# you can specify the revision tag if you don't want the timm dependency
processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50", revision="no_timm")
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50", revision="no_timm")

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# convert outputs (bounding boxes and class logits) to COCO API
# let's only keep detections with score > 0.9
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    box = [round(i, 2) for i in box.tolist()]
    print(
        f"Detected {model.config.id2label[label.item()]} with confidence "
        f"{round(score.item(), 3)} at location {box}"
    )
```

This should output:

```
Detected remote with confidence 0.998 at location [40.16, 70.81, 175.55, 117.98]
Detected remote with confidence 0.996 at location [333.24, 72.55, 368.33, 187.66]
Detected couch with confidence 0.995 at location [-0.02, 1.15, 639.73, 473.76]
Detected cat with confidence 0.999 at location [13.24, 52.05, 314.02, 470.93]
Detected cat with confidence 0.999 at location [345.4, 23.85, 640.37, 368.72]
```

Currently, both the feature extractor and model support PyTorch.

## Training data

The DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively. 

## Training procedure

### Preprocessing

The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/detr/blob/master/datasets/coco.py). 

Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).

### Training

The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).

## Evaluation results

This model achieves an AP (average precision) of **42.0** on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
  author    = {Nicolas Carion and
               Francisco Massa and
               Gabriel Synnaeve and
               Nicolas Usunier and
               Alexander Kirillov and
               Sergey Zagoruyko},
  title     = {End-to-End Object Detection with Transformers},
  journal   = {CoRR},
  volume    = {abs/2005.12872},
  year      = {2020},
  url       = {https://arxiv.org/abs/2005.12872},
  archivePrefix = {arXiv},
  eprint    = {2005.12872},
  timestamp = {Thu, 28 May 2020 17:38:09 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
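As a follow-up to the detection snippet above: the returned boxes are in (xmin, ymin, xmax, ymax) pixel coordinates, so they can be drawn directly on the input image with PIL. A minimal sketch reusing the variables defined in that example:

```python
from PIL import ImageDraw

draw = ImageDraw.Draw(image)
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    x0, y0, x1, y1 = box.tolist()
    # outline each detection and tag it with the predicted class name
    draw.rectangle([x0, y0, x1, y1], outline="red", width=2)
    draw.text((x0, y0), model.config.id2label[label.item()], fill="red")
image.save("detections.png")
```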
{"license": "apache-2.0", "tags": ["object-detection", "vision"], "datasets": ["coco"], "widget": [{"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg", "example_title": "Savanna"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg", "example_title": "Football Match"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg", "example_title": "Airport"}]}
object-detection
facebook/detr-resnet-50
[ "transformers", "pytorch", "detr", "object-detection", "vision", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.12872" ]
[]
TAGS #transformers #pytorch #detr #object-detection #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# DETR (End-to-End Object Detection) model with ResNet-50 backbone DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model. !model image ## Intended uses & limitations You can use the raw model for object detection. See the model hub to look for all available DETR models. ### How to use Here is how to use this model: This should output: Currently, both the feature extractor and model support PyTorch. ## Training data The DETR model was trained on COCO 2017 object detection, a dataset consisting of 118k/5k annotated images for training/validation respectively. ## Training procedure ### Preprocessing The exact details of preprocessing of images during training/validation can be found here. Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225). ### Training The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64). ## Evaluation results This model achieves an AP (average precision) of 42.0 on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper. ### BibTeX entry and citation info
[ "# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\n!model image", "## Intended uses & limitations\n\nYou can use the raw model for object detection. See the model hub to look for all available DETR models.", "### How to use\n\nHere is how to use this model:\n\n\nThis should output:\n\n\nCurrently, both the feature extractor and model support PyTorch.", "## Training data\n\nThe DETR model was trained on COCO 2017 object detection, a dataset consisting of 118k/5k annotated images for training/validation respectively.", "## Training procedure", "### Preprocessing\n\nThe exact details of preprocessing of images during training/validation can be found here. \n\nImages are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).", "### Training\n\nThe model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).", "## Evaluation results\n\nThis model achieves an AP (average precision) of 42.0 on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #detr #object-detection #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\n!model image", "## Intended uses & limitations\n\nYou can use the raw model for object detection. See the model hub to look for all available DETR models.", "### How to use\n\nHere is how to use this model:\n\n\nThis should output:\n\n\nCurrently, both the feature extractor and model support PyTorch.", "## Training data\n\nThe DETR model was trained on COCO 2017 object detection, a dataset consisting of 118k/5k annotated images for training/validation respectively.", "## Training procedure", "### Preprocessing\n\nThe exact details of preprocessing of images during training/validation can be found here. \n\nImages are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).", "### Training\n\nThe model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).", "## Evaluation results\n\nThis model achieves an AP (average precision) of 42.0 on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper.", "### BibTeX entry and citation info" ]
[ 57, 119, 298, 33, 32, 41, 3, 94, 42, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #detr #object-detection #vision #dataset-coco #arxiv-2005.12872 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# DETR (End-to-End Object Detection) model with ResNet-50 backbone\n\nDEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper End-to-End Object Detection with Transformers by Carion et al. and first released in this repository. \n\nDisclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. \n\nThe model is trained using a \"bipartite matching loss\": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a \"no object\" as class and \"no bounding box\" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.\n\n!model image## Intended uses & limitations\n\nYou can use the raw model for object detection. See the model hub to look for all available DETR models." ]
[ -0.04844289645552635, 0.17289111018180847, -0.006628787145018578, … ] (768-dimensional embedding vector; remaining values omitted for readability)
null
null
transformers
# Vision Transformer (base-sized model, patch size 16) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper [Emerging Properties in Self-Supervised Vision Transformers](https://arxiv.org/abs/2104.14294) by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in [this repository](https://github.com/facebookresearch/dino). 

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. 

Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads. 

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=google/vit) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('facebook/dino-vitb16')
model = ViTModel.from_pretrained('facebook/dino-vitb16')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-14294,
  author    = {Mathilde Caron and
               Hugo Touvron and
               Ishan Misra and
               Herv{\'{e}} J{\'{e}}gou and
               Julien Mairal and
               Piotr Bojanowski and
               Armand Joulin},
  title     = {Emerging Properties in Self-Supervised Vision Transformers},
  journal   = {CoRR},
  volume    = {abs/2104.14294},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.14294},
  archivePrefix = {arXiv},
  eprint    = {2104.14294},
  timestamp = {Tue, 04 May 2021 15:12:43 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-14294.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
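As a short, illustrative extension of the card above (not part of the original card): the "Model description" notes that one typically places a linear layer on top of the [CLS] token for classification. The sketch below shows that setup on frozen features; the two-image batch (the same COCO sample twice) and its labels are made up for the example.

```python
# Illustrative sketch: a linear classification head on frozen [CLS]
# features from facebook/dino-vitb16. The batch and labels are toy data.
import torch
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

processor = ViTImageProcessor.from_pretrained('facebook/dino-vitb16')
model = ViTModel.from_pretrained('facebook/dino-vitb16').eval()

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

with torch.no_grad():  # the backbone stays frozen
    inputs = processor(images=[image, image], return_tensors="pt")
    cls_features = model(**inputs).last_hidden_state[:, 0]  # (batch, 768) [CLS] states

# Only this linear head is trained.
head = torch.nn.Linear(cls_features.shape[-1], 2)
labels = torch.tensor([0, 1])  # toy labels
loss = torch.nn.functional.cross_entropy(head(cls_features), labels)
loss.backward()
print(cls_features.shape, loss.item())
```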
{"license": "apache-2.0", "tags": ["dino", "vision"], "datasets": ["imagenet-1k"]}
feature-extraction
facebook/dino-vitb16
[ "transformers", "pytorch", "tf", "vit", "feature-extraction", "dino", "vision", "dataset:imagenet-1k", "arxiv:2104.14294", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.14294" ]
[]
TAGS #transformers #pytorch #tf #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (base-sized model, patch size 16) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. 

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. 

Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads. 

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

### BibTeX entry and citation info
[ "# Vision Transformer (base-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (base-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 64, 134, 273, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (base-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image." ]
[ -0.06740637123584747, 0.004214100074023008, -0.007464932277798653, … ] (768-dimensional embedding vector; remaining values omitted for readability)
null
null
transformers
# Vision Transformer (base-sized model, patch size 8) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper [Emerging Properties in Self-Supervised Vision Transformers](https://arxiv.org/abs/2104.14294) by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in [this repository](https://github.com/facebookresearch/dino). 

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. 

Images are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads. 

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=google/vit) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('facebook/dino-vitb8')
model = ViTModel.from_pretrained('facebook/dino-vitb8')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-14294,
  author    = {Mathilde Caron and
               Hugo Touvron and
               Ishan Misra and
               Herv{\'{e}} J{\'{e}}gou and
               Julien Mairal and
               Piotr Bojanowski and
               Armand Joulin},
  title     = {Emerging Properties in Self-Supervised Vision Transformers},
  journal   = {CoRR},
  volume    = {abs/2104.14294},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.14294},
  archivePrefix = {arXiv},
  eprint    = {2104.14294},
  timestamp = {Tue, 04 May 2021 15:12:43 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-14294.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
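As an illustrative follow-up (not from the original card): frozen DINO features are often compared directly with cosine similarity, e.g. for retrieval or the k-NN evaluation reported in the DINO paper. A minimal sketch, reusing the same COCO sample image twice so the similarity is trivially high:

```python
# Illustrative sketch: cosine similarity between frozen [CLS] features
# from facebook/dino-vitb8. Both inputs are the same COCO sample image.
import torch
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

processor = ViTImageProcessor.from_pretrained('facebook/dino-vitb8')
model = ViTModel.from_pretrained('facebook/dino-vitb8').eval()

def embed(url: str) -> torch.Tensor:
    image = Image.open(requests.get(url, stream=True).raw)
    with torch.no_grad():
        inputs = processor(images=image, return_tensors="pt")
        cls = model(**inputs).last_hidden_state[:, 0]  # (1, 768) [CLS] feature
    return torch.nn.functional.normalize(cls, dim=-1)

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
a, b = embed(url), embed(url)
print("cosine similarity:", (a @ b.T).item())  # ~1.0 for identical inputs
```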
{"license": "apache-2.0", "tags": ["dino", "vision"], "datasets": ["imagenet-1k"]}
feature-extraction
facebook/dino-vitb8
[ "transformers", "pytorch", "vit", "feature-extraction", "dino", "vision", "dataset:imagenet-1k", "arxiv:2104.14294", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.14294" ]
[]
TAGS #transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (base-sized model, patch size 8) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. 

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. 

Images are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads. 

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

### BibTeX entry and citation info
[ "# Vision Transformer (base-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (base-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 61, 133, 273, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (base-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image." ]
[ -0.07150421291589737, 0.02053859643638134, -0.006582166068255901, … ] (768-dimensional embedding vector; remaining values omitted for readability)
-0.0400569811463356, 0.06296047568321228, 0.04898896440863609, 0.03458215296268463, 0.011934198439121246, -0.07350122928619385, -0.07935137301683426, 0.12167662382125854, 0.060584455728530884, -0.11453264951705933, -0.180778369307518, -0.03523264825344086, -0.03404201939702034, 0.10841023921966553, -0.10234367102384567, 0.06770603358745575, 0.048681072890758514, 0.0017661782912909985, 0.03623788803815842, 0.06343904882669449, -0.0953260064125061, 0.026388030499219894, 0.06030352786183357, -0.05432281270623207, -0.09919703006744385, 0.012625805102288723, 0.13183583319187164, -0.06324233114719391, -0.02205335721373558, 0.20690561830997467, -0.04278264567255974, -0.007955607958137989, -0.009715345688164234, 0.03575904667377472, -0.016707710921764374, 0.03577764332294464, 0.0028979459311813116, -0.020613040775060654, -0.04793303832411766, 0.15190456807613373, 0.07849132269620895, -0.16397756338119507, -0.011451090686023235, 0.10742342472076416, -0.06309576332569122, -0.08027002215385437, -0.05821377784013748, 0.07184000313282013, -0.03193288296461105, -0.04613812640309334, -0.018904460594058037, -0.03882429376244545, 0.03417348861694336, 0.10883914679288864, 0.04592987149953842, 0.02070499211549759, -0.05261623114347458, 0.03964361175894737, -0.10234993696212769, 0.00789274089038372, 0.009856096468865871, 0.08959644287824631, -0.07315927743911743, 0.062081653624773026, 0.0010529453866183758, 0.06722081452608109, -0.011208198964595795, -0.07604459673166275, -0.08952128887176514, -0.02743626944720745, 0.013462444767355919, 0.04045390710234642, -0.06838015466928482, -0.0004808485973626375, 0.03527485206723213, 0.05640038102865219, 0.01689404994249344, 0.008816581219434738, -0.018171409144997597, -0.021729903295636177, -0.03599372133612633, 0.07534585893154144, -0.06175898760557175, 0.011755382642149925, 0.0378105528652668, -0.08041413873434067, 0.10158270597457886, -0.0593375526368618, -0.057581666857004166, -0.04239368066191673, -0.045197147876024246, -0.00439981697127223, -0.005839145742356777, 0.01617620885372162, -0.02296987920999527, -0.028705237433314323, 0.046662960201501846, -0.03273998573422432, -0.016492797061800957, -0.04606246575713158, 0.12986816465854645, -0.06260992586612701, 0.13948512077331543, -0.0014837458729743958, -0.030294643715023994, -0.09286192059516907, 0.024285664781928062, 0.04822876676917076, 0.12501022219657898, 0.03968086093664169, -0.07029351592063904, 0.051695071160793304, -0.0908162072300911, 0.009526478126645088, 0.01628389023244381, -0.0034233867190778255, 0.06799735128879547, -0.09687516838312149, 0.030538810417056084, 0.03310180827975273, 0.043288182467222214, 0.022845782339572906, -0.0338296964764595, -0.0002954806841444224, -0.049650970846414566, -0.09898335486650467, 0.03314518555998802, 0.021693509072065353, -0.0193930771201849, -0.027197713032364845, 0.013493645004928112, 0.05396803468465805, 0.029878539964556694, 0.2545260190963745, 0.07913285493850708, 0.10485012084245682, 0.07384909689426422, 0.09634266793727875, -0.0015914232935756445, 0.002162797376513481, -0.09554444253444672, 0.04727383330464363, -0.022838875651359558, 0.07511494308710098, -0.07010579854249954, -0.029102349653840065, 0.07335490733385086, -0.0829693078994751, 0.09249278157949448, 0.04222872480750084, -0.025233086198568344, -0.044118162244558334, -0.14471052587032318, -0.03421102464199066, -0.04835544899106026, -0.01745155267417431, -0.09101545810699463, -0.0037163717206567526, 0.07213848829269409, 0.029783600941300392, 0.0034958270844072104, 0.0991358608007431, 
-0.14810678362846375, -0.019649488851428032, 0.030374689027667046, 0.0020882291719317436, 0.03962429612874985, 0.05027097091078758, -0.03867266699671745, 0.005787472706288099, 0.05656423419713974, 0.013168117962777615, 0.07267884165048599, 0.15813423693180084, 0.019052958115935326, -0.022468211129307747, -0.05612378194928169, -0.011942044831812382, -0.048128459602594376, -0.010604632087051868, 0.16015133261680603, 0.010008998215198517, -0.08393286168575287, -0.006023131776601076, 0.15011578798294067, -0.03764200210571289, 0.0067197117023169994, -0.15881851315498352, 0.14266207814216614, -0.023119360208511353, 0.014373280107975006, -0.07173670828342438, -0.039053142070770264, -0.04352058470249176, 0.16713730990886688, 0.10243740677833557, 0.006225446704775095, 0.0020789324771612883, 0.03796015679836273, -0.010251596570014954, -0.0011065809521824121, 0.13027450442314148, -0.024245483800768852, 0.2369333654642105, -0.03404167667031288, 0.09439823031425476, -0.016063867136836052, 0.042513150721788406, -0.0895124226808548, 0.06571105122566223, -0.045510366559028625, 0.029768090695142746, -0.09470707923173904, 0.028019284829497337, 0.012612626887857914, -0.1668464094400406, 0.15203110873699188, -0.05579604208469391, -0.03329969942569733, 0.018330954015254974, -0.01437209639698267, -0.012664366513490677, 0.12451636046171188, -0.021933045238256454, -0.00680006667971611, 0.15068629384040833, 0.010487722232937813, -0.05091571435332298, -0.10792119801044464, 0.00045429676538333297, -0.07776220887899399, 0.27095749974250793, 0.001740248640999198, 0.019981281831860542, 0.07063429057598114, 0.03709256276488304, -0.10573951154947281, -0.0603586845099926, -0.06849504262208939, -0.05867477133870125, -0.06285696476697922, 0.1215313971042633, -0.06434435397386551, 0.14995083212852478, 0.02883380837738514, -0.12320853024721146, 0.02405530773103237, 0.04061795771121979, 0.012172510847449303, -0.05230448767542839, 0.04101935401558876, -0.10812828689813614, 0.11389386653900146, 0.11846940964460373, 0.02621421217918396, -0.049030158668756485, -0.03254235163331032, 0.03454349562525749, 0.038192588835954666, 0.05669895187020302, -0.005870674271136522, -0.09665457159280777, 0.010893757455050945, -0.15432848036289215, 0.03187289088964462, -0.09088481962680817, -0.11887099593877792, 0.047448158264160156, -0.006875165738165379, -0.03471885249018669, 0.05739056318998337, 0.01659552752971649, 0.0023269441444426775, -0.04006360098719597, -0.004766976460814476, 0.014021851122379303, 0.0586138553917408, -0.04555916413664818, -0.10252436995506287 ]
null
null
transformers
# Vision Transformer (small-sized model, patch size 16) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper [Emerging Properties in Self-Supervised Vision Transformers](https://arxiv.org/abs/2104.14294) by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in [this repository](https://github.com/facebookresearch/dino).

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels.

Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/dino) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('facebook/dino-vits16')
model = ViTModel.from_pretrained('facebook/dino-vits16')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-14294,
  author    = {Mathilde Caron and
               Hugo Touvron and
               Ishan Misra and
               Herv{\'{e}} J{\'{e}}gou and
               Julien Mairal and
               Piotr Bojanowski and
               Armand Joulin},
  title     = {Emerging Properties in Self-Supervised Vision Transformers},
  journal   = {CoRR},
  volume    = {abs/2104.14294},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.14294},
  archivePrefix = {arXiv},
  eprint    = {2104.14294},
  timestamp = {Tue, 04 May 2021 15:12:43 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-14294.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
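The card above describes the typical downstream recipe: a linear layer trained on top of the frozen encoder's [CLS] representation. The following is a minimal sketch of that linear-probe setup, not part of the original card; the frozen backbone, the `num_labels` value, and the random tensor standing in for processor output are illustrative assumptions.

```python
import torch
from transformers import ViTModel

# Hypothetical linear-probe sketch: a trainable linear layer on top of the
# frozen DINO encoder's [CLS] token. num_labels is an assumption.
backbone = ViTModel.from_pretrained('facebook/dino-vits16')
backbone.eval()
for p in backbone.parameters():
    p.requires_grad = False  # keep the pretrained encoder frozen

num_labels = 10  # illustrative; set to your dataset's number of classes
classifier = torch.nn.Linear(backbone.config.hidden_size, num_labels)

pixel_values = torch.randn(1, 3, 224, 224)  # stand-in for ViTImageProcessor output
with torch.no_grad():
    cls_embedding = backbone(pixel_values=pixel_values).last_hidden_state[:, 0]
logits = classifier(cls_embedding)  # shape (batch, num_labels); train only this layer
```

In practice one would feed real batches through `ViTImageProcessor` as in the snippet above and optimize only `classifier` with a standard cross-entropy loss.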
{"license": "apache-2.0", "tags": ["dino", "vision"], "datasets": ["imagenet-1k"]}
feature-extraction
facebook/dino-vits16
[ "transformers", "pytorch", "vit", "feature-extraction", "dino", "vision", "dataset:imagenet-1k", "arxiv:2104.14294", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.14294" ]
[]
TAGS #transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (small-sized model, patch size 16) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository.

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels.

Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

### BibTeX entry and citation info
[ "# Vision Transformer (small-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (small-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 61, 135, 273, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (small-sized model, patch size 16) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image." ]
[ -0.0703875944018364, 0.0011255335994064808, -0.007236064877361059, 0.08116310834884644, 0.1153019592165947, -0.008916741237044334, 0.10577498376369476, 0.05502481386065483, 0.004960934165865183, 0.06064682453870773, 0.025583745911717415, 0.00944080576300621, 0.04065285623073578, 0.08988512307405472, 0.13489246368408203, -0.2697257399559021, 0.016962392255663872, -0.022490832954645157, -0.023046670481562614, -0.001963919261470437, 0.09219077229499817, -0.09666194766759872, 0.12547257542610168, 0.021664703264832497, -0.08689450472593307, 0.016427606344223022, -0.04375899210572243, -0.016728002578020096, 0.08646248281002045, 0.08653199672698975, 0.11826875805854797, -0.0018833223730325699, 0.06120269000530243, -0.09837373346090317, 0.01094463374465704, 0.1002298891544342, -0.020627429708838463, 0.019258610904216766, 0.11750320345163345, 0.036242369562387466, -0.009183027781546116, -0.12908779084682465, 0.07104209065437317, 0.03194626420736313, -0.08147718757390976, -0.10160934180021286, -0.04746463522315025, 0.0438588447868824, 0.04934734106063843, 0.005633142776787281, 0.01404996495693922, -0.017261043190956116, 0.0068654934875667095, 0.06152814254164696, 0.16463032364845276, -0.18035514652729034, -0.06656456738710403, 0.0950162410736084, -0.008779377676546574, 0.08183383196592331, -0.024059303104877472, -0.026566199958324432, 0.00532363448292017, 0.024427976459264755, 0.05762398615479469, -0.04352835193276405, 0.06996513158082962, -0.07624978572130203, -0.11003204435110092, -0.012903758324682713, 0.08063963055610657, -0.0463346391916275, -0.09507256001234055, -0.09203541278839111, -0.061345603317022324, 0.03657863289117813, 0.012531431391835213, -0.01821773126721382, 0.06258733570575714, 0.040537554770708084, 0.023058265447616577, -0.0990750789642334, -0.104487843811512, -0.00575610576197505, -0.06304193288087845, 0.0922219455242157, 0.0313422866165638, 0.04555163159966469, -0.04743858426809311, 0.05325138568878174, -0.1062590628862381, -0.020727626979351044, -0.05333925038576126, -0.05504006892442703, -0.08984045684337616, -0.025322068482637405, -0.0695662572979927, -0.14525087177753448, -0.030740458518266678, 0.1274106800556183, 0.0046275025233626366, 0.07794997096061707, -0.005837593227624893, 0.06183629482984543, 0.05717630311846733, 0.14738918840885162, -0.061921872198581696, 0.04186444357037544, 0.008799416013062, -0.07331442832946777, 0.05408281460404396, -0.05067473649978638, -0.08928623795509338, 0.020752832293510437, -0.028143633157014847, -0.037645645439624786, -0.016901060938835144, 0.02899649739265442, -0.018772175535559654, -0.03708679974079132, 0.08414080739021301, -0.06692815572023392, 0.030295424163341522, -0.004452453460544348, -0.000936951779294759, 0.11505062133073807, 0.08026030659675598, -0.076075978577137, -0.031772781163454056, 0.08980223536491394, -0.07104478031396866, 0.0034198202192783356, -0.07945125550031662, -0.07030533999204636, 0.04005792737007141, -0.14434517920017242, -0.0360821895301342, -0.15651872754096985, -0.09080218523740768, -0.028342530131340027, 0.04340892285108566, 0.0004429716500453651, 0.01823517121374607, -0.008989816531538963, -0.00028153276070952415, -0.03127874806523323, 0.03574339300394058, -0.037356186658144, 0.026315949857234955, 0.004467527382075787, -0.06738665699958801, 0.0940277948975563, -0.07868728786706924, 0.0021110582165420055, -0.06851755082607269, 0.03544854000210762, -0.139535054564476, 0.0999540239572525, -0.008377731777727604, -0.022158190608024597, -0.08170536160469055, -0.04006492719054222, -0.06536997854709625, 
0.050777290016412735, 0.07135046273469925, 0.03375200182199478, -0.14034350216388702, -0.05976762995123863, 0.14193874597549438, -0.18545174598693848, 0.021933406591415405, 0.10229584574699402, -0.005415952764451504, 0.06429583579301834, 0.062498223036527634, 0.07479219138622284, 0.07792022824287415, -0.052868109196424484, -0.09717395156621933, 0.028899146243929863, -0.05621858686208725, 0.05797417089343071, 0.021602502092719078, 0.03773600980639458, 0.008576552383601665, 0.02062833309173584, 0.0017586176982149482, -0.03154149278998375, -0.02431643381714821, -0.019785597920417786, -0.005777445621788502, -0.006853383965790272, -0.0036310963332653046, -0.014039895497262478, -0.0004271847428753972, 0.01544368639588356, -0.04214018955826759, 0.053936440497636795, 0.0679672360420227, -0.08125858753919601, 0.0686994269490242, -0.07900228351354599, -0.010473329573869705, -0.08503281325101852, -0.012121027335524559, -0.1357375681400299, -0.09350377321243286, 0.06215532496571541, -0.10400722920894623, 0.06088016927242279, 0.0455118753015995, 0.03381047025322914, 0.06394027918577194, -0.00017830553406383842, -0.012597866356372833, -0.031011173501610756, -0.04779001325368881, -0.019518055021762848, -0.10966966301202774, -0.0562254898250103, -0.08044042438268661, 0.025046346709132195, -0.002084768610075116, 0.018396366387605667, 0.05934741720557213, 0.02497877925634384, 0.06104002520442009, -0.07069438695907593, -0.02025827206671238, 0.04420009255409241, -0.0029355131555348635, -0.04525743052363396, 0.047304362058639526, 0.05719597265124321, -0.03005264140665531, 0.046378105878829956, -0.11342239379882812, -0.11647581309080124, 0.05654109641909599, -0.07396303117275238, -0.08007147908210754, -0.03264809027314186, 0.019105827435851097, -0.032426729798316956, -0.08445832878351212, -0.03039497882127762, 0.1912001669406891, 0.013681950978934765, 0.08945930749177933, -0.07073498517274857, 0.05809842795133591, 0.06441262364387512, -0.05746660381555557, -0.06162268668413162, 0.014085767790675163, 0.11134335398674011, -0.073699451982975, 0.02193359099328518, 0.03233173117041588, -0.07618551701307297, 0.12661318480968475, 0.047781120985746384, -0.04819435253739357, -0.024740183725953102, 0.007199866697192192, 0.045422445982694626, 0.06354822963476181, -0.08015000075101852, -0.030417941510677338, 0.0014251548564061522, 0.030770905315876007, 0.011477544903755188, -0.07333468645811081, 0.085667684674263, 0.039547085762023926, 0.010951478965580463, 0.013298286125063896, -0.031111987307667732, -0.04604870826005936, 0.03182332590222359, 0.04342690482735634, 0.037496086210012436, 0.00538514694198966, -0.023293811827898026, -0.0578930601477623, 0.1315823495388031, -0.10261987894773483, -0.24161618947982788, -0.16093339025974274, -0.06726503372192383, -0.07086378335952759, 0.048763010650873184, 0.06645148247480392, -0.0868818536400795, -0.03360041603446007, -0.06562498956918716, 0.0686534121632576, -0.010249124839901924, 0.00252722785808146, -0.020900152623653412, 0.0011362003860995173, 0.01622709073126316, -0.09773893654346466, -0.009421388618648052, -0.03838751092553139, -0.1509692519903183, 0.06298348307609558, -0.00007699008710915223, 0.06225963681936264, 0.06269613653421402, 0.006679691839963198, 0.009820002131164074, -0.03243295103311539, 0.1530282199382782, -0.025283772498369217, 0.10623455792665482, 0.16774208843708038, -0.02487030439078808, 0.08554194867610931, 0.06581825017929077, 0.029133480042219162, -0.071621373295784, 0.015285467728972435, 0.033422987908124924, -0.11134009063243866, -0.09397812187671661, 
-0.06323602050542831, -0.031331177800893784, 0.05608454346656799, 0.146779865026474, 0.031567808240652084, -0.10287914425134659, 0.04855351150035858, 0.0008750628330744803, 0.06016217917203903, 0.0357966274023056, 0.09994663298130035, 0.08810671418905258, -0.03583556041121483, 0.07549340277910233, -0.052006207406520844, 0.014558052644133568, 0.07424567639827728, 0.07718047499656677, 0.19949686527252197, -0.049767762422561646, 0.1413441300392151, 0.04729575291275978, -0.03564835712313652, 0.04243965074419975, 0.0369693897664547, -0.04726070165634155, 0.024838725104928017, -0.04273802414536476, -0.04791508615016937, -0.028318915516138077, 0.06200610473752022, -0.016788413748145103, -0.040792278945446014, -0.0836620032787323, 0.0798589438199997, 0.022413907572627068, 0.2024000734090805, 0.004027392249554396, -0.16731910407543182, -0.04800964891910553, -0.01707315444946289, 0.008160537108778954, -0.06887556612491608, 0.014424856752157211, 0.16098754107952118, -0.08061432093381882, 0.033281609416007996, -0.02127762883901596, 0.053715236485004425, -0.15031901001930237, -0.008699245750904083, 0.0213500764220953, 0.14748962223529816, 0.028199510648846626, 0.027546999976038933, -0.10258166491985321, 0.0447348989546299, 0.010697382502257824, 0.051624804735183716, -0.02554010972380638, 0.0395968072116375, 0.01620057225227356, 0.06141316518187523, 0.09656935185194016, 0.03353871405124664, -0.08785858005285263, -0.0805959403514862, -0.05260329693555832, 0.02461126446723938, 0.11086128652095795, -0.008527747355401516, 0.04506240412592888, -0.016211379319429398, -0.0035382523201406, -0.03651312366127968, 0.03531418740749359, -0.022209512069821358, -0.17165343463420868, 0.03959501162171364, -0.031192652881145477, -0.05618105083703995, -0.06312379240989685, -0.017364326864480972, -0.007279808633029461, 0.09651108086109161, -0.049762971699237823, -0.035381123423576355, -0.08124371618032455, 0.01056445762515068, 0.01692485250532627, -0.07812849432229996, 0.07725004106760025, -0.0213764775544405, 0.1330329030752182, -0.07025658339262009, -0.08307566493749619, 0.03054889850318432, -0.09281843900680542, -0.07364863902330399, -0.06893398612737656, 0.016186613589525223, 0.05788138508796692, 0.0015351636102423072, 0.014833766035735607, 0.007066790480166674, 0.027597632259130478, -0.04189005866646767, 0.00634104385972023, 0.18653710186481476, -0.07397197932004929, 0.06288757920265198, -0.07934872806072235, 0.04893882945179939, -0.018047332763671875, 0.022446295246481895, 0.07875123620033264, 0.1274242252111435, -0.060053516179323196, 0.10148046165704727, 0.18006554245948792, -0.1244773417711258, -0.24884921312332153, -0.0017246095230802894, 0.06543303281068802, -0.011013410985469818, -0.03979265317320824, -0.26374033093452454, 0.14455807209014893, 0.09311661869287491, 0.018407804891467094, -0.052158791571855545, -0.23627632856369019, -0.06096356734633446, -0.03147611767053604, 0.049125608056783676, 0.19422926008701324, -0.045543450862169266, 0.009007670916616917, -0.005724241025745869, 0.02675238810479641, 0.06120013818144798, 0.014399964362382889, 0.09174837917089462, 0.008005144074559212, -0.07910323143005371, 0.029832884669303894, -0.010935656726360321, 0.083833247423172, -0.018605321645736694, 0.023083442822098732, -0.03715398162603378, 0.0529635064303875, -0.006142986938357353, -0.04673737287521362, 0.07007793337106705, 0.049154143780469894, 0.018232453614473343, 0.011315800249576569, -0.04494895786046982, -0.03980520740151405, 0.04372100159525871, -0.03336004540324211, -0.04829638823866844, 
-0.044229939579963684, 0.05900714546442032, 0.04534647986292839, 0.033723168075084686, 0.004881224595010281, -0.06300871819257736, -0.09323788434267044, 0.11703251302242279, 0.07082315534353256, -0.10247307270765305, -0.1921328455209732, -0.03446412459015846, -0.03044429048895836, 0.11802218109369278, -0.0964660793542862, 0.06460491567850113, 0.04991921782493591, -0.009291158057749271, 0.05157249793410301, 0.06302914768457413, -0.1071227490901947, 0.019237326458096504, 0.06636275351047516, -0.06182265654206276, -0.08252286911010742, 0.007642318028956652, 0.12054231762886047, -0.04915737360715866, -0.008471732027828693, 0.19295760989189148, -0.029941227287054062, -0.013415936380624771, -0.01703222654759884, 0.0432400144636631, -0.015624104999005795, 0.03331854194402695, 0.017150655388832092, -0.02539672888815403, -0.04910433664917946, 0.15421393513679504, 0.07447708398103714, -0.14348715543746948, 0.00420036306604743, 0.09910281002521515, -0.0681886151432991, -0.07515167444944382, -0.05913010239601135, 0.0924389660358429, -0.02752692438662052, -0.04669266566634178, -0.021819408982992172, -0.048078231513500214, 0.032483987510204315, 0.10362834483385086, 0.050678592175245285, 0.010506777092814445, -0.053093321621418, 0.04494307562708855, -0.094229556620121, 0.01621291972696781, 0.013250148855149746, 0.08805564045906067, -0.07342014461755753, 0.04235091432929039, 0.016862478107213974, 0.06600996851921082, -0.009414218366146088, -0.07434742152690887, -0.09300418943166733, -0.019206633791327477, 0.01549751777201891, 0.05046432465314865, -0.051856815814971924, 0.002017274731770158, 0.039857421070337296, 0.06239907070994377, 0.02118850313127041, 0.01915270835161209, -0.012311787344515324, -0.02163456380367279, -0.03204522654414177, 0.0674746036529541, -0.07019205391407013, 0.0037923073396086693, 0.03613371029496193, -0.0872555673122406, 0.09653428196907043, -0.06672529131174088, -0.058138664811849594, -0.038668710738420486, -0.03649674728512764, 0.0033715490717440844, 0.005125231109559536, 0.01780099794268608, -0.012627788819372654, -0.02865270897746086, 0.04102373868227005, -0.04007716849446297, -0.025518160313367844, -0.047817081212997437, 0.13475479185581207, -0.05757734179496765, 0.1489851474761963, 0.0030676363967359066, -0.041395775973796844, -0.09682192653417587, 0.038957662880420685, 0.040069643408060074, 0.11696211248636246, 0.060397811233997345, -0.06619690358638763, 0.05205630883574486, -0.07390343397855759, 0.0004989499575458467, 0.03155282512307167, 0.0013169115409255028, 0.07918708771467209, -0.10485237836837769, 0.027420679107308388, 0.01866244524717331, 0.04843979701399803, 0.03688910976052284, -0.027213290333747864, -0.009568660520017147, -0.03735673055052757, -0.11908300966024399, 0.03529176861047745, 0.017057126387953758, -0.03168413043022156, -0.01825817860662937, -0.016218064352869987, 0.04914254695177078, 0.049012355506420135, 0.23251426219940186, 0.0805995762348175, 0.09306525439023972, 0.09155924618244171, 0.10398595780134201, 0.007421333808451891, 0.011006860062479973, -0.12495984882116318, 0.04182204604148865, -0.008305205963551998, 0.06991688162088394, -0.05097624287009239, -0.020920641720294952, 0.06043042987585068, -0.08357246965169907, 0.08511874079704285, 0.0479920469224453, -0.03996475785970688, -0.05017808452248573, -0.14803138375282288, -0.032501958310604095, -0.05116232484579086, -0.01377732865512371, -0.09931374341249466, -0.008363636210560799, 0.05493611469864845, 0.026130013167858124, -0.0021674903109669685, 0.11323096603155136, -0.1357409954071045, 
-0.03191269189119339, 0.02165338397026062, 0.006201844196766615, 0.05079333111643791, 0.04448790103197098, -0.03824310004711151, 0.005701245740056038, 0.06400983780622482, 0.016747575253248215, 0.06615420430898666, 0.13918569684028625, 0.010966967791318893, -0.019244033843278885, -0.05482780560851097, -0.007894336245954037, -0.04415171593427658, -0.030963651835918427, 0.1519090086221695, 0.011914187110960484, -0.07823868095874786, -0.0011166554177179933, 0.12512482702732086, -0.03206203505396843, -0.0032051987946033478, -0.15080061554908752, 0.13801339268684387, -0.01948666386306286, 0.038410164415836334, -0.05617982894182205, -0.04457859322428703, -0.05666206404566765, 0.18751049041748047, 0.10839826613664627, 0.0042640515603125095, -0.0006485703634098172, 0.04783932864665985, -0.00397019786760211, -0.0014963858993723989, 0.12844590842723846, -0.015557012520730495, 0.23756283521652222, -0.024455171078443527, 0.09602473676204681, -0.019984139129519463, 0.046016354113817215, -0.09580741822719574, 0.03183014318346977, -0.040026772767305374, 0.022903770208358765, -0.09795849770307541, 0.023669350892305374, 0.002552185207605362, -0.16996270418167114, 0.1757362186908722, -0.05618777871131897, -0.03195309266448021, 0.00756043242290616, 0.02249239757657051, -0.022907443344593048, 0.12768153846263885, -0.01695035956799984, -0.021329311653971672, 0.17575298249721527, 0.015306299552321434, -0.03970392420887947, -0.09938140958547592, -0.006696545984596014, -0.11368890106678009, 0.2700389325618744, 0.0033319161739200354, 0.03559037670493126, 0.06997476518154144, 0.04930006340146065, -0.0831746757030487, -0.050087910145521164, -0.05984034761786461, -0.03758511692285538, -0.0562104657292366, 0.1095503494143486, -0.0663861557841301, 0.15311895310878754, 0.03483676537871361, -0.11839846521615982, 0.03035958670079708, 0.03916581720113754, 0.006199718918651342, -0.0435502752661705, 0.045593585819005966, -0.11447256058454514, 0.09946943819522858, 0.11789529770612717, 0.031157108023762703, -0.0540444552898407, -0.027575820684432983, 0.034813813865184784, 0.04299783334136009, 0.05468371883034706, -0.006198334973305464, -0.08706778287887573, 0.0015744147822260857, -0.12297214567661285, 0.054304737597703934, -0.0968846008181572, -0.11465182900428772, 0.040662720799446106, -0.008444040082395077, -0.03661589324474335, 0.06041637063026428, 0.03154083341360092, 0.005178244784474373, -0.044537559151649475, 0.023320583626627922, 0.00859616044908762, 0.05410826951265335, -0.04919101297855377, -0.09967551380395889 ]
null
null
transformers
# Vision Transformer (small-sized model, patch size 8) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper [Emerging Properties in Self-Supervised Vision Transformers](https://arxiv.org/abs/2104.14294) by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in [this repository](https://github.com/facebookresearch/dino).

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels.

Images are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/dino) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('facebook/dino-vits8')
model = ViTModel.from_pretrained('facebook/dino-vits8')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-14294,
  author    = {Mathilde Caron and
               Hugo Touvron and
               Ishan Misra and
               Herv{\'{e}} J{\'{e}}gou and
               Julien Mairal and
               Piotr Bojanowski and
               Armand Joulin},
  title     = {Emerging Properties in Self-Supervised Vision Transformers},
  journal   = {CoRR},
  volume    = {abs/2104.14294},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.14294},
  archivePrefix = {arXiv},
  eprint    = {2104.14294},
  timestamp = {Tue, 04 May 2021 15:12:43 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-14294.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
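A headline result of the DINO paper is that the [CLS] token's self-attention localizes objects, and the 8x8 patch size makes those maps especially fine-grained. As a follow-up to the snippet in the card, here is a hedged sketch (not part of the original card) of pulling those attention maps out; the 28x28 grid assumes the processor's default 224x224 input with 8x8 patches.

```python
import torch
import requests
from PIL import Image
from transformers import ViTImageProcessor, ViTModel

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('facebook/dino-vits8')
model = ViTModel.from_pretrained('facebook/dino-vits8')

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_attentions=True)

# Last-layer attention has shape (batch, num_heads, seq_len, seq_len), where
# seq_len = 1 + (224 / 8) ** 2 = 785 under the assumed default preprocessing.
attentions = outputs.attentions[-1]
num_heads = attentions.shape[1]
cls_attention = attentions[0, :, 0, 1:]  # [CLS] -> patch tokens, per head
attention_maps = cls_attention.reshape(num_heads, 28, 28)  # coarse map per head
```

Each per-head map can then be upsampled back to the image resolution for visualization.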
{"license": "apache-2.0", "tags": ["dino", "vision"], "datasets": ["imagenet-1k"]}
feature-extraction
facebook/dino-vits8
[ "transformers", "pytorch", "vit", "feature-extraction", "dino", "vision", "dataset:imagenet-1k", "arxiv:2104.14294", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.14294" ]
[]
TAGS #transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (small-sized model, patch size 8) trained using DINO

Vision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository.

Disclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels.

Images are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.

Note that this model does not include any fine-tuned heads.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.

## Intended uses & limitations

You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

### BibTeX entry and citation info
[ "# Vision Transformer (small-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (small-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 61, 134, 273, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #vit #feature-extraction #dino #vision #dataset-imagenet-1k #arxiv-2104.14294 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (small-sized model, patch size 8) trained using DINO \n\nVision Transformer (ViT) model trained using the DINO method. It was introduced in the paper Emerging Properties in Self-Supervised Vision Transformers by Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin and first released in this repository. \n\nDisclaimer: The team releasing DINO did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-1k, at a resolution of 224x224 pixels. \n\nImages are presented to the model as a sequence of fixed-size patches (resolution 8x8), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder.\n\nNote that this model does not include any fine-tuned heads. \n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image." ]
[ -0.0659715086221695, 0.010012286715209484, -0.006904518231749535, 0.08000825345516205, 0.11572901159524918, -0.011633431538939476, 0.11794175207614899, 0.05900506302714348, -0.0050387633964419365, 0.062023673206567764, 0.03473515808582306, 0.0005145324976183474, 0.037063807249069214, 0.08695169538259506, 0.1413852870464325, -0.27575477957725525, 0.025331899523735046, -0.03376473858952522, -0.005156481172889471, -0.007740978617221117, 0.08690034598112106, -0.09488385170698166, 0.1263594627380371, 0.016613183543086052, -0.0913228690624237, 0.017990175634622574, -0.03389721363782883, -0.014641011133790016, 0.08265502005815506, 0.09070642292499542, 0.11644305288791656, 0.009604768827557564, 0.07263283431529999, -0.10813049226999283, 0.014701958745718002, 0.10409166663885117, -0.017364468425512314, 0.012592054903507233, 0.11589246988296509, 0.03832446411252022, -0.004404440056532621, -0.11615631729364395, 0.06829071044921875, 0.03946312516927719, -0.0881713330745697, -0.09638536721467972, -0.05508768558502197, 0.046666402369737625, 0.03882654383778572, 0.004139185417443514, 0.015979807823896408, -0.017696039751172066, 0.0156632661819458, 0.06513439863920212, 0.161659836769104, -0.1860358715057373, -0.07117220759391785, 0.10273519903421402, -0.011463443748652935, 0.09084452688694, -0.027573805302381516, -0.02071804367005825, 0.0044056810438632965, 0.02116985246539116, 0.051967814564704895, -0.04799310863018036, 0.07666605710983276, -0.08293052762746811, -0.10207457095384598, -0.02843272127211094, 0.08130627870559692, -0.03773263841867447, -0.10404452681541443, -0.08686348795890808, -0.057152312248945236, 0.029436325654387474, 0.004279244691133499, -0.006149929016828537, 0.050558317452669144, 0.0426284521818161, 0.01524401642382145, -0.11086281388998032, -0.11178440600633621, 0.0005933358916081488, -0.056887030601501465, 0.09291872382164001, 0.03261701017618179, 0.05804819241166115, -0.03020545467734337, 0.06205381080508232, -0.10974563658237457, -0.01767224632203579, -0.04572398588061333, -0.04846801981329918, -0.10180612653493881, -0.02823646366596222, -0.061771802604198456, -0.1545075923204422, -0.02412571758031845, 0.11291913688182831, 0.014078086242079735, 0.07357748597860336, -0.02844257652759552, 0.05360589176416397, 0.06426931917667389, 0.15423767268657684, -0.05591185763478279, 0.038566432893276215, 0.0035166700836271048, -0.06578730791807175, 0.05527181550860405, -0.05266112461686134, -0.08436224609613419, 0.021469444036483765, -0.02507646568119526, -0.04314509034156799, -0.015261809341609478, 0.026538845151662827, -0.005284839775413275, -0.03241996839642525, 0.08526540547609329, -0.06133056432008743, 0.033140212297439575, 0.00277718179859221, 0.015030397102236748, 0.13291428983211517, 0.0683089941740036, -0.07952123880386353, -0.03878971189260483, 0.06813571602106094, -0.07747893780469894, 0.01692119613289833, -0.07656407356262207, -0.06725205481052399, 0.04662485048174858, -0.13115501403808594, -0.04604707285761833, -0.14795160293579102, -0.08712340891361237, -0.03366750106215477, 0.05018313229084015, 0.00361571554094553, 0.014264675788581371, 0.004338795784860849, -0.006265135016292334, -0.027418404817581177, 0.04412063956260681, -0.030751531943678856, 0.026204660534858704, 0.012892945669591427, -0.07228289544582367, 0.08187275379896164, -0.09298744797706604, 0.008581148460507393, -0.05785644054412842, 0.03321492299437523, -0.14845679700374603, 0.10549237579107285, -0.015875358134508133, -0.027271250262856483, -0.08052520453929901, -0.04804716631770134, -0.06195146590471268, 
0.04204460233449936, 0.0697464719414711, 0.03318985179066658, -0.13786113262176514, -0.05006582662463188, 0.1293206661939621, -0.20469719171524048, 0.04670795053243637, 0.1095808818936348, -0.0028237788937985897, 0.053804926574230194, 0.06042121723294258, 0.08142915368080139, 0.08774487674236298, -0.05494048446416855, -0.1030074879527092, 0.025685831904411316, -0.06358229368925095, 0.0510791577398777, 0.026671847328543663, 0.04731770604848862, 0.0071750059723854065, 0.02903733402490616, 0.01349891908466816, -0.033120591193437576, -0.03407733514904976, -0.017283715307712555, -0.021010788157582283, 0.0058484454639256, -0.02622741460800171, -0.019708668813109398, 0.0024610068649053574, 0.026048043742775917, -0.040130652487277985, 0.056318625807762146, 0.07488644123077393, -0.07448302209377289, 0.07233196496963501, -0.06937107443809509, -0.009282967075705528, -0.10725167393684387, -0.006678074132651091, -0.13419297337532043, -0.09914708137512207, 0.05682530254125595, -0.11479122191667557, 0.05935001000761986, 0.06708472967147827, 0.02030034549534321, 0.06360025703907013, -0.004259106703102589, -0.005159879103302956, -0.03174901008605957, -0.04616789147257805, -0.02687985636293888, -0.11061128973960876, -0.061913181096315384, -0.08481783419847488, 0.020914413034915924, -0.005628707818686962, 0.02599477395415306, 0.05359555408358574, 0.015050802379846573, 0.05854221433401108, -0.0837300717830658, -0.015061851590871811, 0.04752342775464058, -0.00001838431489886716, -0.050365712493658066, 0.0498000830411911, 0.06366579234600067, -0.026001278311014175, 0.04678947851061821, -0.12442072480916977, -0.07625944167375565, 0.05437334626913071, -0.064501091837883, -0.080166757106781, -0.0338427759706974, 0.010675269179046154, -0.034141603857278824, -0.10628566145896912, -0.0075959330424666405, 0.1609535664319992, 0.01872999779880047, 0.08919207751750946, -0.05779634788632393, 0.07978811115026474, 0.07485570013523102, -0.05367289483547211, -0.06476063281297684, 0.017597634345293045, 0.10468693822622299, -0.07282542437314987, 0.016563788056373596, 0.03644200414419174, -0.05850121006369591, 0.12259173393249512, 0.03766889497637749, -0.05781342089176178, -0.040906067937612534, 0.01414715126156807, 0.0402371920645237, 0.08225521445274353, -0.08275539427995682, -0.03396504744887352, -0.0022251196205615997, 0.03855742886662483, 0.0012126255314797163, -0.07741010934114456, 0.08380810171365738, 0.02635394036769867, 0.000801770540419966, -0.006556952837854624, -0.04345674067735672, -0.04467128589749336, 0.038243405520915985, 0.02890845574438572, 0.03914735093712807, -0.0013336580013856292, -0.019845865666866302, -0.05300510674715042, 0.12382367253303528, -0.08791616559028625, -0.2319929450750351, -0.16695745289325714, -0.05890381336212158, -0.09032212197780609, 0.037078309804201126, 0.06596792489290237, -0.07312742620706558, -0.021294016391038895, -0.07580837607383728, 0.05540190637111664, -0.015774749219417572, 0.0027046562172472477, -0.027947213500738144, 0.0008511024061590433, 0.01492980495095253, -0.09484615176916122, -0.01716957800090313, -0.03817641735076904, -0.1746758222579956, 0.06554341316223145, 0.014103250578045845, 0.043553486466407776, 0.06192554533481598, 0.010416488163173199, 0.013869980350136757, -0.033169541507959366, 0.13074226677417755, -0.02385002002120018, 0.10791018605232239, 0.1662605106830597, -0.02024579606950283, 0.08113980293273926, 0.06939155608415604, 0.036285750567913055, -0.06660372763872147, 0.016390586271882057, 0.04070451483130455, -0.12075511366128922, -0.08379897475242615, 
-0.05466460436582565, -0.01744593121111393, 0.06824670732021332, 0.135066419839859, 0.039677560329437256, -0.09823106974363327, 0.04172753542661667, 0.008520781062543392, 0.044148776680231094, 0.038591817021369934, 0.10046102851629257, 0.09062009304761887, -0.05162885785102844, 0.07676753401756287, -0.067547507584095, 0.01089315116405487, 0.07667697966098785, 0.08536028116941452, 0.19962583482265472, -0.04445255547761917, 0.13401591777801514, 0.04114554077386856, -0.035398487001657486, 0.04459485039114952, 0.029211455956101418, -0.018652429804205894, 0.018182724714279175, -0.049512770026922226, -0.04235498607158661, -0.022579265758395195, 0.057943195104599, -0.003987702075392008, -0.051027074456214905, -0.09393656998872757, 0.07239644229412079, 0.01751471497118473, 0.22656388580799103, 0.014764761552214622, -0.14190538227558136, -0.053864315152168274, -0.019110608845949173, -0.020087197422981262, -0.06778513640165329, 0.013578273355960846, 0.16507819294929504, -0.08647990971803665, 0.025298818945884705, -0.022136785089969635, 0.05263620615005493, -0.15084050595760345, -0.0027075728867202997, 0.0026222257874906063, 0.13544048368930817, 0.016796818003058434, 0.024150783196091652, -0.10628409683704376, 0.04900573939085007, 0.009240733459591866, 0.05334138870239258, -0.022235818207263947, 0.03406733646988869, 0.016698310151696205, 0.08324659615755081, 0.09536027163267136, 0.03808298707008362, -0.11017395555973053, -0.07394809275865555, -0.06858577579259872, 0.029069336131215096, 0.09813342243432999, 0.007312335539609194, 0.04857039451599121, -0.01897604577243328, 0.0015186199452728033, -0.042380400002002716, 0.0636141449213028, -0.016973914578557014, -0.16488242149353027, 0.049241818487644196, -0.027370477095246315, -0.05808195099234581, -0.06812260299921036, -0.019912848249077797, 0.009822111576795578, 0.09772305935621262, -0.021293941885232925, -0.02122880518436432, -0.09673094749450684, 0.03857366368174553, 0.021385347470641136, -0.07044398784637451, 0.0567881315946579, -0.036629702895879745, 0.11919843405485153, -0.06170789152383804, -0.07499495148658752, 0.011143200099468231, -0.07894635945558548, -0.07297373563051224, -0.06825201213359833, 0.012812845408916473, 0.03921892121434212, 0.006485506892204285, 0.007706731092184782, -0.004254981875419617, 0.035001128911972046, -0.053051430732011795, 0.014310907572507858, 0.17768676578998566, -0.08031617105007172, 0.050858527421951294, -0.08114070445299149, 0.03114480897784233, -0.016419732943177223, 0.016161896288394928, 0.07617374509572983, 0.13460402190685272, -0.06086834520101547, 0.1047331914305687, 0.1735474020242691, -0.12998683750629425, -0.24683088064193726, -0.007670259568840265, 0.06368382275104523, -0.014793556183576584, -0.041079167276620865, -0.2629474997520447, 0.14833928644657135, 0.10695446282625198, 0.013287796638906002, -0.04304088279604912, -0.23261532187461853, -0.057681381702423096, -0.03089888207614422, 0.050187848508358, 0.23645669221878052, -0.06410908699035645, 0.015319477766752243, 0.002226948970928788, 0.03976026922464371, 0.0568527914583683, 0.015155593864619732, 0.09831228107213974, 0.012860403396189213, -0.08524788916110992, 0.033416904509067535, -0.017994213849306107, 0.08555480092763901, -0.01130110677331686, 0.025004917755723, -0.036648545414209366, 0.039069101214408875, -0.01108311302959919, -0.05035603791475296, 0.0862625315785408, 0.06883827596902847, 0.0160080436617136, 0.013419654220342636, -0.04518149048089981, -0.045676007866859436, 0.044769514352083206, -0.030462156981229782, -0.0380835086107254, 
-0.024390211328864098, 0.05982295423746109, 0.036005280911922455, 0.02776188589632511, 0.019169922918081284, -0.07060025632381439, -0.0946260541677475, 0.12916897237300873, 0.047255147248506546, -0.09255711734294891, -0.18738052248954773, -0.04379834979772568, -0.03964800760149956, 0.10468818992376328, -0.09447880834341049, 0.06572671979665756, 0.046440478414297104, -0.010624714195728302, 0.05357114225625992, 0.06497088819742203, -0.09360339492559433, 0.031442876905202866, 0.06287617981433868, -0.0489511713385582, -0.1050737127661705, 0.003409954719245434, 0.10792980343103409, -0.049214594066143036, -0.005359054077416658, 0.209741473197937, -0.036077018827199936, -0.008400638587772846, -0.020212696865200996, 0.04359603673219681, -0.017855115234851837, 0.019981827586889267, 0.00848342664539814, -0.0270333644002676, -0.05390887334942818, 0.15974459052085876, 0.08626741915941238, -0.16416050493717194, 0.002218295121565461, 0.08763908594846725, -0.06499800086021423, -0.0844249501824379, -0.06714478135108948, 0.08839263021945953, -0.030813241377472878, -0.04325612634420395, -0.019425490871071815, -0.038016606122255325, 0.03808397799730301, 0.11760906875133514, 0.04797329381108284, 0.01761242374777794, -0.0583195686340332, 0.043407224118709564, -0.09918851405382156, 0.01573440618813038, 0.002393998671323061, 0.07705258578062057, -0.07749421894550323, 0.059692371636629105, 0.013854075223207474, 0.08030477911233902, -0.013931854628026485, -0.09357840567827225, -0.0894324854016304, -0.03229421004652977, 0.003085601609200239, 0.04040597379207611, -0.0687108188867569, -0.0014601288130506873, 0.05294119939208031, 0.04186581075191498, 0.022103367373347282, 0.013758177869021893, -0.0159020833671093, -0.02028723433613777, -0.04027322679758072, 0.07907374203205109, -0.0675089955329895, -0.0014661747263744473, 0.02894848957657814, -0.08527647703886032, 0.09769394993782043, -0.04878968000411987, -0.05970115587115288, -0.041657429188489914, -0.01660391502082348, 0.010357948951423168, -0.009673225693404675, 0.003959538880735636, -0.022214306518435478, -0.030591772869229317, 0.04376745596528053, -0.034253690391778946, -0.013471863232553005, -0.04644806310534477, 0.13434147834777832, -0.05422214791178703, 0.1303493082523346, -0.0030587278306484222, -0.049300678074359894, -0.09177537262439728, 0.03760339692234993, 0.04228619858622551, 0.1228654608130455, 0.05765960365533829, -0.06989949941635132, 0.059727925807237625, -0.08353590965270996, 0.00708031514659524, 0.02587132528424263, 0.005123279057443142, 0.07576391845941544, -0.1080964058637619, 0.01895846240222454, 0.033894091844558716, 0.04578664153814316, 0.033709920942783356, -0.020531723275780678, -0.008417686447501183, -0.0460093691945076, -0.12478774785995483, 0.03321569412946701, 0.01705973967909813, -0.019749902188777924, -0.022137675434350967, -0.005306866951286793, 0.04360309615731239, 0.03852418437600136, 0.23928265273571014, 0.0546795092523098, 0.0965217649936676, 0.08282618969678879, 0.10867755115032196, 0.0031736986711621284, 0.011227823793888092, -0.12142381072044373, 0.04094112291932106, -0.023730603978037834, 0.07958555966615677, -0.06905331462621689, -0.01947728544473648, 0.059992220252752304, -0.07401074469089508, 0.08968724310398102, 0.0585084967315197, -0.034525156021118164, -0.04813399538397789, -0.15075455605983734, -0.035583432763814926, -0.0525924488902092, -0.015896815806627274, -0.09326440095901489, 0.009264889173209667, 0.06214584782719612, 0.03349526226520538, 0.0032041259109973907, 0.10564098507165909, -0.12027360498905182, 
-0.02835572510957718, 0.030085494741797447, 0.005208045709878206, 0.05425506830215454, 0.04296206310391426, -0.02468103915452957, 0.006957733538001776, 0.05635783076286316, 0.018532680347561836, 0.08074915409088135, 0.14884668588638306, 0.018229257315397263, -0.038934919983148575, -0.0609959214925766, -0.011441485024988651, -0.05542823672294617, -0.022329742088913918, 0.16249723732471466, 0.015441518276929855, -0.08817338198423386, -0.0016980806831270456, 0.1309104710817337, -0.030304213985800743, -0.011698788963258266, -0.17056314647197723, 0.16586683690547943, -0.022939356043934822, 0.026895271614193916, -0.06262373179197311, -0.03569689020514488, -0.046070776879787445, 0.16723458468914032, 0.10887288302183151, 0.010665654204785824, -0.003782727988436818, 0.0573851615190506, -0.011200536042451859, 0.0054899766109883785, 0.13194425404071808, -0.03949228674173355, 0.23685437440872192, -0.03449929878115654, 0.10151238739490509, -0.024613257497549057, 0.043452322483062744, -0.09603454172611237, 0.032145753502845764, -0.0409453809261322, 0.027904793620109558, -0.09951402992010117, 0.027762334793806076, 0.012777691707015038, -0.16576015949249268, 0.17406001687049866, -0.03814074397087097, -0.032853808254003525, 0.017001105472445488, 0.01179753802716732, -0.02036401815712452, 0.1263367384672165, -0.020624088123440742, -0.005141088739037514, 0.18177075684070587, 0.001348429941572249, -0.03597913682460785, -0.09849990904331207, 0.0023997435346245766, -0.10080672055482864, 0.25633227825164795, -0.0009761852561496198, 0.029380928725004196, 0.07140573859214783, 0.04042663797736168, -0.08905378729104996, -0.034079659730196, -0.07827087491750717, -0.03899422287940979, -0.06778494268655777, 0.1152632012963295, -0.06443940848112106, 0.157736137509346, 0.022482143715023994, -0.11434994637966156, 0.0337006039917469, 0.047541290521621704, -0.0005017513758502901, -0.03263970464468002, 0.04324239492416382, -0.10085704922676086, 0.10964307934045792, 0.12024196982383728, 0.034744132310152054, -0.04414470121264458, -0.03627205267548561, 0.031224098056554794, 0.04337753728032112, 0.05743981897830963, -0.0024407224263995886, -0.09101954847574234, 0.009200300090014935, -0.15155304968357086, 0.04108268767595291, -0.1016848236322403, -0.11555592715740204, 0.04768708348274231, -0.00882434006780386, -0.03766195848584175, 0.058071065694093704, 0.03129267692565918, 0.007348894141614437, -0.04360392689704895, 0.01887316256761551, 0.013675890862941742, 0.055257171392440796, -0.044995080679655075, -0.09954749792814255 ]
null
null
transformers
# `dpr-ctx_encoder-multiset-base`

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-ctx_encoder-multiset-base` is the context encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), and [CuratedTREC (TREC)](https://huggingface.co/datasets/trec).

- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
  - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
  - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
  - [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import DPRContextEncoder, DPRContextEncoderTokenizer

tokenizer = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
model = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
input_ids = tokenizer("Hello, is my dog cute ?", return_tensors="pt")["input_ids"]
embeddings = model(input_ids).pooler_output
```

## Uses

#### Direct Use

`dpr-ctx_encoder-multiset-base`, [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base), and [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
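In practice, this context encoder is paired with the question encoder at retrieval time: passages and questions are embedded separately and ranked by dot-product similarity. The sketch below illustrates that pairing, assuming the `dpr-question_encoder-multiset-base` checkpoint from the Related Models list above; the toy passages and question are invented for demonstration.

```python
import torch
from transformers import (
    DPRContextEncoder, DPRContextEncoderTokenizer,
    DPRQuestionEncoder, DPRQuestionEncoderTokenizer,
)

ctx_tokenizer = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
ctx_encoder = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
q_tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-multiset-base")
q_encoder = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-multiset-base")

passages = [
    "Paris is the capital and most populous city of France.",
    "The Great Barrier Reef is the world's largest coral reef system.",
]
question = "What is the capital of France?"

with torch.no_grad():
    # Embed passages and the question with their respective encoders.
    p_emb = ctx_encoder(**ctx_tokenizer(passages, padding=True, return_tensors="pt")).pooler_output
    q_emb = q_encoder(**q_tokenizer(question, return_tensors="pt")).pooler_output

# Rank passages by dot-product similarity to the question embedding.
scores = q_emb @ p_emb.T  # shape (1, num_passages)
print(passages[scores.argmax(dim=1).item()])
```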
## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the following datasets:

- **[Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open)** ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/))
- **[TriviaQA](https://huggingface.co/datasets/trivia_qa)** ([Joshi et al., 2017](https://aclanthology.org/P17-1147/))
- **[WebQuestions (WQ)](https://huggingface.co/datasets/web_questions)** ([Berant et al., 2013](https://aclanthology.org/D13-1160/))
- **[CuratedTREC (TREC)](https://huggingface.co/datasets/trec)** ([Baudiš & Šedivý, 2015](https://www.aminer.cn/pub/599c7953601a182cd263079b/reading-wikipedia-to-answer-open-domain-questions))

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and used FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) at inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).
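To make the metric concrete, here is a hedged sketch of how top-k retrieval accuracy could be computed over a FAISS index of passage embeddings. The paper uses FAISS for indexing, but the exact index type (`IndexFlatIP`) and the random placeholder vectors below are illustrative assumptions, not the authors' evaluation code.

```python
import faiss
import numpy as np

d = 768  # DPR embedding dimension (BERT-base hidden size)

# Placeholder embeddings; in practice these come from the DPR encoders.
passage_vecs = np.random.rand(1000, d).astype("float32")
question_vecs = np.random.rand(50, d).astype("float32")
gold = np.random.randint(0, 1000, size=50)  # gold passage id per question

# Exact maximum-inner-product index over all passages.
index = faiss.IndexFlatIP(d)
index.add(passage_vecs)

for k in (20, 100):
    _, topk = index.search(question_vecs, k)  # (num_questions, k) passage ids
    acc = np.mean([g in row for g, row in zip(gold, topk)])
    print(f"top-{k} accuracy: {acc:.3f}")
```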
#### Results

| | Top 20 | | | | | Top 100 | | | | |
|:----:|:------:|:--------:|:----:|:----:|:-----:|:-------:|:--------:|:----:|:----:|:-----:|
| | NQ | TriviaQA | WQ | TREC | SQuAD | NQ | TriviaQA | WQ | TREC | SQuAD |
| | 79.4 | 78.8 | 75.0 | 89.1 | 51.6 | 86.0 | 84.7 | 82.9 | 93.9 | 67.6 |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906).

- **Hardware Type:** 8 32GB GPUs
- **Hours used:** Unknown
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown

## Technical Specifications

See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details.

## Citation Information

```bibtex
@inproceedings{karpukhin-etal-2020-dense,
    title = "Dense Passage Retrieval for Open-Domain Question Answering",
    author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-main.550",
    doi = "10.18653/v1/2020.emnlp-main.550",
    pages = "6769--6781",
}
```

## Model Card Authors

This model card was written by the team at Hugging Face.
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open"], "inference": false}
null
facebook/dpr-ctx_encoder-multiset-base
[ "transformers", "pytorch", "tf", "dpr", "en", "dataset:nq_open", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us
'dpr-ctx\_encoder-multiset-base'
================================

Table of Contents
-----------------

* Model Details
* How To Get Started With the Model
* Uses
* Risks, Limitations and Biases
* Training
* Evaluation
* Environmental Impact
* Technical Specifications
* Citation Information
* Model Card Authors

Model Details
-------------

Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-ctx\_encoder-multiset-base' is the context encoder trained using the Natural Questions (NQ) dataset, TriviaQA, WebQuestions (WQ), and CuratedTREC (TREC).

* Developed by: See GitHub repo for model developers
* Model Type: BERT-based encoder
* Language(s): English
* License: CC-BY-NC-4.0, also see Code of Conduct
* Related Models:
	+ 'dpr-question\_encoder-multiset-base'
	+ 'dpr-reader-multiset-base'
	+ 'dpr-question\_encoder-single-nq-base'
	+ 'dpr-reader-single-nq-base'
	+ 'dpr-ctx\_encoder-single-nq-base'
* Resources for more information:
	+ Research Paper
	+ GitHub Repo
	+ Hugging Face DPR docs
	+ BERT Base Uncased Model Card

How to Get Started with the Model
---------------------------------

Use the code below to get started with the model.

Uses
----

#### Direct Use

'dpr-ctx\_encoder-multiset-base', 'dpr-question\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.

Risks, Limitations and Biases
-----------------------------

CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Training
--------

#### Training Data

This model was trained using the following datasets:

* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)
* TriviaQA (Joshi et al., 2017)
* WebQuestions (WQ) (Berant et al., 2013)
* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)

#### Training Procedure

The training procedure is described in the associated paper:

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and used FAISS (Johnson et al., 2017) at inference time to encode and index passages.
See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

Evaluation
----------

The following evaluation information is extracted from the associated paper.

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.

#### Results

Environmental Impact
--------------------

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

* Hardware Type: 8 32GB GPUs
* Hours used: Unknown
* Cloud Provider: Unknown
* Compute Region: Unknown
* Carbon Emitted: Unknown

Technical Specifications
------------------------

See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.

Model Card Authors
------------------

This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-ctx\\_encoder-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n", "#### Direct Use\n\n\n'dpr-ctx\\_encoder-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 69, 66, 206, 91, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n#### Direct Use\n\n\n'dpr-ctx\\_encoder-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)" ]
[ -0.07678969949483871, 0.08138057589530945, -0.004385398235172033, 0.07675784826278687, 0.038825586438179016, 0.003587803803384304, 0.10049929469823837, 0.05827005207538605, 0.07720630615949631, 0.05328727886080742, 0.06702835857868195, 0.04590446129441261, 0.06651099026203156, 0.04711957648396492, -0.056478098034858704, -0.1943323314189911, 0.03436686098575592, -0.007277053315192461, 0.04476241394877434, 0.11798305809497833, 0.1387459933757782, -0.04832854121923447, 0.06229594722390175, 0.019638249650597572, -0.04409570246934891, -0.049834467470645905, 0.0056261043064296246, -0.027554135769605637, 0.0890728235244751, 0.09433986991643906, 0.1121114045381546, 0.022224297747015953, 0.01895863004028797, -0.223174586892128, 0.03756904602050781, 0.04284743219614029, -0.014981282874941826, 0.07534434646368027, 0.08133853226900101, 0.0013281905557960272, 0.1410883665084839, -0.11693151295185089, 0.0735594779253006, 0.04843563586473465, -0.10947549343109131, -0.13836804032325745, -0.12695768475532532, 0.03298944979906082, 0.09003155678510666, 0.08052769303321838, -0.04936083406209946, 0.17105521261692047, -0.08662015944719315, 0.04472789540886879, 0.1497982144355774, -0.15138797461986542, -0.012284460477530956, -0.003949224017560482, -0.010728182271122932, 0.008049311116337776, -0.07381126284599304, -0.027653908357024193, 0.03410448879003525, 0.06215013191103935, 0.004339936189353466, -0.04886852204799652, 0.07504527270793915, -0.0701027438044548, -0.14424391090869904, -0.04118264093995094, 0.13522227108478546, 0.07268767058849335, -0.0717591643333435, -0.1530771553516388, -0.031786203384399414, 0.125954270362854, 0.04519175738096237, -0.08909230679273605, -0.0272311270236969, -0.046516261994838715, 0.07968156784772873, -0.023415740579366684, -0.06290700286626816, -0.011366128921508789, -0.10249251872301102, 0.17789719998836517, 0.04750385507941246, 0.015566115267574787, -0.05511317029595375, 0.06566867232322693, 0.01021349336951971, -0.10814943164587021, -0.06753770262002945, -0.09946676343679428, -0.08506055921316147, -0.04902515187859535, -0.038823846727609634, 0.007777538150548935, 0.010588463395833969, 0.14229711890220642, -0.13932150602340698, 0.029290586709976196, -0.007218018639832735, 0.02680022083222866, 0.11621570587158203, 0.039429280906915665, -0.1131783053278923, -0.021599553525447845, 0.06993886083364487, 0.05036711320281029, 0.03867747262120247, -0.02002297155559063, -0.009645744226872921, -0.10251069813966751, 0.03948970139026642, 0.06048238277435303, 0.08531729876995087, 0.07222673296928406, -0.08393243700265884, -0.0739303007721901, 0.12758344411849976, -0.11314166337251663, -0.047182705253362656, 0.02575521357357502, -0.07823291420936584, -0.005730230826884508, 0.007078362163156271, 0.010628631338477135, -0.058044835925102234, 0.023461708799004555, -0.08996538072824478, -0.015331410802900791, -0.06113581731915474, -0.12755000591278076, 0.0584828145802021, 0.03876987099647522, 0.0012200245400890708, -0.1100691631436348, -0.12732075154781342, -0.06005227565765381, 0.00042638537706807256, -0.042780157178640366, -0.03770973160862923, -0.02580142393708229, -0.029891984537243843, -0.008127679117023945, -0.015217505395412445, -0.010066208429634571, -0.018787285313010216, 0.0771900862455368, -0.0544290691614151, 0.034436531364917755, 0.03927689418196678, 0.004839124623686075, -0.10172397643327713, 0.056093793362379074, -0.1331835389137268, 0.13048356771469116, -0.03619470074772835, -0.08659122884273529, -0.08265822380781174, -0.07213742285966873, 0.019087277352809906, 
0.06772739440202713, 0.02891676500439644, 0.20221346616744995, -0.15746717154979706, 0.02500583976507187, 0.164190411567688, -0.1404298096895218, -0.09877043962478638, 0.11665032804012299, -0.10605015605688095, 0.05086737498641014, 0.11042214930057526, 0.08917836844921112, 0.04869954660534859, -0.08080287277698517, -0.0647742822766304, -0.024631710723042488, -0.05609841272234917, 0.1764707714319229, 0.03851741924881935, -0.02642950788140297, 0.009998819790780544, -0.04162032529711723, -0.06544501334428787, 0.01968449354171753, -0.029825979843735695, -0.036637865006923676, 0.029789036139845848, -0.026069916784763336, 0.1275741457939148, 0.00830210093408823, -0.05963633581995964, 0.020200463011860847, -0.10758023709058762, -0.047855060547590256, 0.09407594799995422, -0.032074734568595886, 0.007034834008663893, -0.118031345307827, 0.01835639961063862, 0.012171180918812752, 0.0015212586149573326, -0.17883585393428802, -0.13290473818778992, -0.028570013120770454, -0.1108221709728241, 0.06316963583230972, 0.1338498741388321, 0.02574152685701847, 0.029625162482261658, -0.06558916717767715, 0.027957536280155182, -0.03982560709118843, 0.01570187322795391, -0.05208088830113411, -0.15160271525382996, 0.023602809756994247, -0.04919375851750374, 0.10333888232707977, -0.21108326315879822, 0.01748414896428585, 0.12590214610099792, 0.05215020105242729, 0.022277384996414185, -0.01364180725067854, 0.05943116173148155, 0.023398056626319885, 0.008590481244027615, -0.0414714552462101, 0.006500109564512968, -0.028121620416641235, -0.10779839009046555, 0.08302231878042221, -0.20456628501415253, -0.04412462189793587, 0.08378620445728302, -0.057085443288087845, -0.1312849372625351, -0.08330626785755157, -0.004223153460770845, -0.01874023862183094, -0.035621367394924164, -0.030680909752845764, 0.1644594371318817, 0.0442720390856266, 0.01676960289478302, -0.09437431395053864, -0.06324934959411621, -0.033490438014268875, -0.04683100804686546, -0.016840171068906784, 0.05193920433521271, -0.060570456087589264, -0.2540154457092285, 0.09362474828958511, 0.05367691442370415, -0.06316007673740387, 0.08252700418233871, 0.03420455753803253, -0.06282190978527069, -0.041419677436351776, 0.013499069958925247, -0.02588973380625248, 0.09870949387550354, 0.021409589797258377, 0.05856647342443466, 0.04970453679561615, 0.035581860691308975, 0.00673943804576993, -0.08982208371162415, -0.0243638027459383, 0.007025713566690683, -0.012683724984526634, -0.1438532918691635, 0.03194325044751167, 0.05606340616941452, 0.12185805290937424, 0.025573043152689934, -0.011438300833106041, 0.07193013280630112, -0.06461472809314728, -0.15634222328662872, 0.18650172650814056, -0.06386885792016983, -0.21699734032154083, -0.02325195074081421, -0.003893124870955944, -0.03547042980790138, 0.008888836950063705, 0.010961201973259449, -0.044517673552036285, -0.05065577104687691, -0.07524821162223816, 0.04179752245545387, -0.005573438946157694, -0.07513544708490372, -0.05552563816308975, -0.0016265864251181483, 0.04877405986189842, -0.08952149748802185, 0.01954622194170952, -0.031264904886484146, -0.03371342271566391, 0.04779275506734848, 0.027673516422510147, 0.10952853411436081, 0.10061328858137131, 0.01815202459692955, -0.04125826060771942, -0.03729046881198883, 0.2714683413505554, -0.1549767255783081, 0.02138606645166874, 0.13749811053276062, -0.1373862475156784, 0.06485691666603088, 0.1750127524137497, 0.024866227060556412, -0.06928536295890808, 0.02541174739599228, 0.07457192987203598, -0.021815260872244835, -0.24140626192092896, -0.05664891377091408, 
-0.031532470136880875, -0.10038405656814575, 0.026077309623360634, 0.021138224750757217, 0.07733137160539627, 0.07685445994138718, -0.12144315987825394, -0.026123948395252228, 0.09868424385786057, 0.06797703355550766, 0.1469915509223938, 0.023449931293725967, 0.054039325565099716, -0.020722726359963417, -0.021418169140815735, 0.06288642436265945, -0.011539667844772339, 0.29293811321258545, -0.058637503534555435, 0.0862109437584877, 0.12786166369915009, 0.051946334540843964, 0.040099892765283585, -0.01984749734401703, -0.01047088485211134, 0.009433016180992126, -0.059241268783807755, -0.03427276387810707, -0.03837452828884125, 0.06045682728290558, 0.01579635590314865, -0.020186858251690865, -0.006692477967590094, -0.06682603806257248, 0.05214244872331619, 0.10241962224245071, 0.015292808413505554, -0.12401848286390305, -0.040609877556562424, 0.05997509881854057, -0.05356583744287491, -0.051657017320394516, 0.030827883630990982, 0.08070775866508484, -0.14581146836280823, 0.09502281993627548, -0.02033608965575695, 0.07888728380203247, -0.16413210332393646, 0.007479911670088768, -0.05471927300095558, 0.044355183839797974, -0.02448495291173458, 0.10233109444379807, -0.30167651176452637, 0.21272730827331543, 0.027056993916630745, 0.03142023831605911, -0.107169508934021, -0.03640731796622276, 0.044176261872053146, 0.013514332473278046, 0.11685516685247421, 0.02068697288632393, -0.005724433343857527, -0.10323479026556015, 0.016222409904003143, 0.033575378358364105, 0.009303275495767593, 0.03365074843168259, 0.08171679824590683, 0.024014582857489586, 0.048368748277425766, 0.00034728454193100333, 0.006225286051630974, -0.14899955689907074, -0.1469736546278, 0.06324662268161774, -0.11482314020395279, 0.012647937051951885, -0.05351515859365463, -0.04061319679021835, -0.017960987985134125, 0.07759331911802292, -0.1720118224620819, -0.10368902236223221, -0.07887387275695801, 0.026749230921268463, 0.06613752990961075, -0.11061830818653107, -0.008006785064935684, 0.05715201422572136, 0.08072881400585175, -0.02956964261829853, -0.03512845188379288, 0.05463431030511856, -0.060170289129018784, -0.1447123885154724, -0.025882912799715996, 0.02784706838428974, 0.1741257607936859, 0.07159483432769775, 0.024456825107336044, 0.00900112185627222, 0.02326522395014763, -0.1667618751525879, 0.0038535355124622583, 0.10751467198133469, 0.08234933763742447, 0.05456500127911568, 0.0302968118339777, -0.009394917637109756, -0.13879932463169098, -0.031739331781864166, 0.07569636404514313, 0.2500058710575104, -0.031373314559459686, 0.03560469299554825, 0.15986791253089905, -0.057464372366666794, -0.18533675372600555, 0.0007435049046762288, 0.030206721276044846, -0.008999736979603767, 0.11304126679897308, -0.1564464569091797, -0.03178434073925018, 0.06751696020364761, 0.016107525676488876, 0.06289681792259216, -0.16866718232631683, -0.09409898519515991, 0.17457447946071625, 0.07200179249048233, 0.06687361001968384, -0.07603278756141663, -0.03463653102517128, -0.00020792211580555886, -0.05210565775632858, 0.1689126193523407, -0.1362181305885315, 0.02038601040840149, 0.02955164574086666, 0.0744868740439415, 0.05101361125707626, -0.020045379176735878, 0.1891932189464569, 0.024253256618976593, 0.10772474110126495, -0.13164889812469482, -0.1001342311501503, -0.0340147465467453, -0.025007938966155052, 0.058897536247968674, 0.033425603061914444, 0.01812492124736309, -0.06627156585454941, -0.06300724297761917, -0.1270897537469864, 0.025383921340107918, -0.05418164283037186, -0.04621969163417816, -0.08419773727655411, 
0.09041447937488556, 0.09879196435213089, -0.020126651972532272, 0.024703852832317352, -0.08901200443506241, -0.011058936826884747, 0.04598058760166168, 0.24857133626937866, 0.10344744473695755, -0.05974644422531128, -0.0032864585518836975, -0.012542477808892727, 0.0999358668923378, -0.06904783099889755, 0.012051507830619812, 0.05690295994281769, 0.010055845603346825, 0.15464380383491516, -0.0010088194394484162, -0.12907136976718903, 0.036677516996860504, -0.015268010087311268, -0.08655648678541183, -0.12824030220508575, -0.009861191734671593, -0.010765121318399906, -0.13942059874534607, -0.12819647789001465, 0.11200803518295288, 0.0015975118149071932, -0.012640257366001606, 0.013803086243569851, 0.06820318102836609, 0.008376671001315117, 0.07640887051820755, 0.1032700464129448, 0.06441404670476913, -0.06725016981363297, 0.027236908674240112, 0.07104844599962234, -0.04538892209529877, 0.07474827766418457, 0.03883831575512886, -0.08009397983551025, -0.05408873409032822, -0.14526720345020294, 0.05249686539173126, -0.055783167481422424, -0.07006294280290604, -0.01197862159460783, -0.08700736612081528, -0.01885824091732502, 0.1532118171453476, 0.0329415500164032, 0.0652022436261177, 0.003981212619692087, -0.02205640822649002, -0.05044692009687424, 0.07877335697412491, 0.040256500244140625, 0.011164352297782898, -0.08300740271806717, 0.04668835923075676, 0.03895075246691704, 0.052998125553131104, -0.026417385786771774, -0.0015082949539646506, -0.12206672877073288, 0.012169144116342068, -0.19820424914360046, 0.06270337849855423, -0.12047779560089111, 0.04706187546253204, -0.037220533937215805, -0.06951713562011719, -0.011748786084353924, 0.011156183667480946, -0.0400274358689785, 0.02408023737370968, 0.04710251837968826, 0.05341025069355965, -0.1499779373407364, -0.05930418521165848, 0.08385922759771347, -0.07567635178565979, 0.08607582747936249, -0.02349889650940895, -0.05440811812877655, 0.010018279775977135, -0.15988799929618835, 0.029842648655176163, -0.015911724418401718, 0.027667934074997902, 0.0061019789427518845, -0.20845352113246918, -0.015740854665637016, -0.013908092863857746, -0.01911088265478611, 0.03245696425437927, 0.012233952060341835, -0.0611281655728817, 0.03184062987565994, 0.05321529880166054, -0.048518940806388855, -0.09270001947879791, 0.03917646035552025, 0.15063676238059998, -0.008772619068622589, 0.1243860051035881, -0.014040706679224968, 0.07999946922063828, -0.1356348693370819, -0.006986518856137991, 0.03934744745492935, 0.056623850017786026, 0.05084814876317978, -0.030784929171204567, 0.06674284487962723, -0.028570735827088356, 0.18658187985420227, -0.07419352233409882, -0.05550531670451164, 0.06468313932418823, 0.02162325568497181, -0.0032198652625083923, 0.012867423705756664, -0.030602827668190002, -0.06021881476044655, -0.031110860407352448, 0.01547897607088089, 0.0004370790848042816, -0.060376327484846115, -0.09349550306797028, 0.18171072006225586, 0.09242834895849228, 0.09695679694414139, -0.020908398553729057, -0.02454408071935177, -0.0850525051355362, 0.03705805167555809, 0.020061353221535683, 0.06445148587226868, -0.08439712226390839, -0.03217899799346924, 0.057057540863752365, 0.16594715416431427, -0.09784070402383804, 0.08281286805868149, -0.022102516144514084, -0.09599869698286057, -0.12145961821079254, -0.1692364364862442, -0.04832405969500542, 0.03404231369495392, 0.015826048329472542, -0.12708503007888794, 0.02516528218984604, 0.1883956789970398, -0.0006562840426340699, -0.05900142341852188, 0.08521852642297745, -0.0058216010220348835, 
-0.09916695952415466, -0.012844735756516457, 0.014062149450182915, 0.02283715270459652, 0.0262755174189806, 0.011300620622932911, 0.05665188655257225, 0.023309044539928436, 0.04824657738208771, 0.039768025279045105, 0.02532784454524517, -0.003026203718036413, -0.08960020542144775, -0.08616508543491364, -0.0024330399464815855, 0.07973599433898926, 0.13047923147678375, 0.2688167691230774, 0.06035510078072548, -0.004681388381868601, -0.00369972363114357, 0.12709443271160126, 0.0276712104678154, -0.04797626659274101, -0.13755521178245544, 0.1580677330493927, -0.007069095503538847, -0.006029844284057617, 0.04246636480093002, -0.12096317857503891, 0.09168004244565964, 0.11161356419324875, 0.14687837660312653, -0.13038833439350128, 0.0012075118720531464, -0.04666865989565849, 0.014045387506484985, 0.022461213171482086, 0.049269165843725204, 0.020456427708268166, 0.24872900545597076, -0.08554313331842422, 0.05094144865870476, -0.017561092972755432, 0.039729923009872437, 0.021616948768496513, 0.12879303097724915, 0.04126337170600891, 0.025597458705306053, -0.1167435571551323, 0.11580383777618408, -0.07435181736946106, -0.16427189111709595, -0.014048431999981403, -0.02455451898276806, -0.0928298681974411, 0.02400641329586506, -0.11211559921503067, 0.010756103321909904, 0.06621105968952179, 0.012908789329230785, -0.011234147474169731, 0.11627383530139923, 0.026473645120859146, -0.05880613625049591, -0.016225585713982582, 0.1311360001564026, 0.03201092779636383, 0.20157435536384583, 0.014336190186440945, 0.21802209317684174, 0.10577217489480972, 0.0010274245869368315, -0.08938774466514587, 0.039515502750873566, 0.03948849439620972, -0.000344532891176641, 0.011986393481492996, 0.14935898780822754, 0.016108565032482147, 0.0504034087061882, 0.13688339293003082, -0.06758154928684235, 0.05595342814922333, -0.06138598173856735, -0.08967019617557526, -0.0893116444349289, 0.10744316875934601, -0.08067189157009125, 0.144027978181839, 0.14781388640403748, -0.03433206304907799, 0.02133110910654068, -0.019135909155011177, -0.010161581449210644, -0.02473202347755432, 0.045545194298028946, 0.008634496480226517, -0.12700538337230682, 0.04659516364336014, 0.07143064588308334, 0.05436873808503151, -0.17361442744731903, -0.03118816949427128, 0.018606266006827354, -0.05398298799991608, 0.019958386197686195, 0.054664455354213715, -0.04731183871626854, 0.009269310161471367, -0.0431143082678318, -0.06356212496757507, -0.034224480390548706, 0.0946279764175415, -0.09748723357915878, -0.046481359750032425 ]
null
null
transformers
# `dpr-ctx_encoder-single-nq-base`

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-ctx_encoder-single-nq-base` is the context encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)).

- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
  - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
  - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
  - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.

```python
>>> from transformers import DPRContextEncoder, DPRContextEncoderTokenizer
>>> tokenizer = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
>>> model = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
>>> input_ids = tokenizer("Hello, is my dog cute ?", return_tensors="pt")["input_ids"]
>>> embeddings = model(input_ids).pooler_output
```

## Uses

#### Direct Use

`dpr-ctx_encoder-single-nq-base`, [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base), and [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
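The Direct Use section above pairs this context encoder with a question encoder and a reader model. As a hedged sketch of the final reading stage, the snippet below runs the related `dpr-reader-single-nq-base` checkpoint over one retrieved passage; the question, title, and passage text are invented for illustration.

```python
>>> from transformers import DPRReader, DPRReaderTokenizer

>>> tokenizer = DPRReaderTokenizer.from_pretrained("facebook/dpr-reader-single-nq-base")
>>> model = DPRReader.from_pretrained("facebook/dpr-reader-single-nq-base")
>>> encoded = tokenizer(
...     questions=["What is the capital of France?"],  # the user question
...     titles=["Paris"],                              # title of a retrieved passage
...     texts=["Paris is the capital and most populous city of France."],
...     return_tensors="pt",
... )
>>> outputs = model(**encoded)
>>> start_logits = outputs.start_logits          # answer-span start scores
>>> end_logits = outputs.end_logits              # answer-span end scores
>>> relevance_logits = outputs.relevance_logits  # passage re-ranking score
```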
## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)). The model authors write that:

> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and used FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) at inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).

#### Results

| | Top 20 | | | | | Top 100 | | | | |
|:----:|:------:|:--------:|:----:|:----:|:-----:|:-------:|:--------:|:----:|:----:|:-----:|
| | NQ | TriviaQA | WQ | TREC | SQuAD | NQ | TriviaQA | WQ | TREC | SQuAD |
| | 78.4 | 79.4 | 73.2 | 79.8 | 63.2 | 85.4 | 85.0 | 81.4 | 89.1 | 77.2 |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906).
- **Hardware Type:** 8 32GB GPUs - **Hours used:** Unknown - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details. ## Citation Information ```bibtex @inproceedings{karpukhin-etal-2020-dense, title = "Dense Passage Retrieval for Open-Domain Question Answering", author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau", booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)", month = nov, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2020.emnlp-main.550", doi = "10.18653/v1/2020.emnlp-main.550", pages = "6769--6781", } ``` ## Model Card Authors This model card was written by the team at Hugging Face.
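As a complement to the Training Procedure above, the sketch below illustrates the in-batch negatives objective the paper describes: each question in a batch is scored against every passage in the batch, and the loss is the negative log-likelihood of its own gold passage. The random tensors stand in for real encoder outputs; this is an illustration of the objective, not the authors' training code.

```python
import torch
import torch.nn.functional as F

# Toy stand-ins for a batch of DPR embeddings (BERT-base dimension d = 768).
batch_size, d = 8, 768
question_embs = torch.randn(batch_size, d)  # EQ(q_i) for each question
passage_embs = torch.randn(batch_size, d)   # EP(p_i), the gold passage for q_i

# Entry (i, j) scores question i against passage j; the off-diagonal entries
# act as in-batch negatives for each question.
sim = question_embs @ passage_embs.T        # shape (batch_size, batch_size)

# Negative log-likelihood of the positive passage = cross-entropy with the
# diagonal as targets.
targets = torch.arange(batch_size)
loss = F.cross_entropy(sim, targets)
print(loss.item())
```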
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open"], "inference": false}
null
facebook/dpr-ctx_encoder-single-nq-base
[ "transformers", "pytorch", "tf", "dpr", "en", "dataset:nq_open", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us
'dpr-ctx\_encoder-single-nq-base'
=================================

Table of Contents
-----------------

* Model Details
* How To Get Started With the Model
* Uses
* Risks, Limitations and Biases
* Training
* Evaluation
* Environmental Impact
* Technical Specifications
* Citation Information
* Model Card Authors

Model Details
-------------

Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-ctx\_encoder-single-nq-base' is the Context Encoder trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019).

* Developed by: See GitHub repo for model developers
* Model Type: BERT-based encoder
* Language(s): English
* License: CC-BY-NC-4.0, also see Code of Conduct
* Related Models:
	+ 'dpr-question\_encoder-single-nq-base'
	+ 'dpr-reader-single-nq-base'
	+ 'dpr-ctx\_encoder-multiset-base'
	+ 'dpr-question\_encoder-multiset-base'
	+ 'dpr-reader-multiset-base'
* Resources for more information:
	+ Research Paper
	+ GitHub Repo
	+ Hugging Face DPR docs
	+ BERT Base Uncased Model Card

How to Get Started with the Model
---------------------------------

Use the code below to get started with the model.

Uses
----

#### Direct Use

'dpr-ctx\_encoder-single-nq-base', 'dpr-question\_encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.

Risks, Limitations and Biases
-----------------------------

CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Training
--------

#### Training Data

This model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:

> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.

#### Training Procedure

The training procedure is described in the associated paper:

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.
The authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

Evaluation
----------

The following evaluation information is extracted from the associated paper.

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.

#### Results

Environmental Impact
--------------------

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

* Hardware Type: 8 32GB GPUs
* Hours used: Unknown
* Cloud Provider: Unknown
* Compute Region: Unknown
* Carbon Emitted: Unknown

Technical Specifications
------------------------

See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.

Model Card Authors
------------------

This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-ctx\\_encoder-single-nq-base', 'dpr-question-encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n", "#### Direct Use\n\n\n'dpr-ctx\\_encoder-single-nq-base', 'dpr-question-encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 69, 74, 206, 94, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n#### Direct Use\n\n\n'dpr-ctx\\_encoder-single-nq-base', 'dpr-question-encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>" ]
[ -0.07295844703912735, 0.0897628664970398, -0.0029659285210072994, 0.06659715622663498, 0.05409052222967148, -0.017509084194898605, 0.07582920044660568, 0.07092791050672531, 0.12430543452501297, 0.0644732192158699, 0.047316260635852814, 0.04000864550471306, 0.039222341030836105, 0.07052399963140488, -0.042833562940359116, -0.16291221976280212, 0.03804653510451317, -0.014633494429290295, 0.049826133996248245, 0.11508385092020035, 0.10585525631904602, -0.05377953127026558, 0.05751118063926697, 0.041486918926239014, -0.018777092918753624, -0.023771917447447777, -0.005611115135252476, -0.035783205181360245, 0.11285503953695297, 0.0851176530122757, 0.06953494250774384, 0.023566508665680885, 0.011188599281013012, -0.23311056196689606, 0.041970040649175644, 0.042802926152944565, -0.015102740377187729, 0.07106675952672958, 0.06920408457517624, 0.0012841939460486174, 0.11326829344034195, -0.0724741518497467, 0.07161519676446915, 0.075161412358284, -0.07069076597690582, -0.1683858036994934, -0.1003890261054039, 0.010649258270859718, 0.0379183366894722, 0.11298684775829315, -0.05930871143937111, 0.1683485060930252, -0.07802199572324753, 0.020742151886224747, 0.19446974992752075, -0.16323067247867584, 0.008020462468266487, 0.0038358252495527267, -0.007719174958765507, 0.011297527700662613, -0.054281823337078094, -0.002858673920854926, 0.03764532878994942, 0.061148501932621, 0.014777367934584618, -0.03389238193631172, 0.05049069970846176, -0.055938661098480225, -0.13834114372730255, -0.03691284731030464, 0.14062358438968658, 0.07599436491727829, -0.09293784201145172, -0.1462552547454834, -0.03397185355424881, 0.17942219972610474, 0.05657581239938736, -0.10163745284080505, -0.008182157762348652, -0.0537281259894371, 0.07671871036291122, -0.0434822179377079, -0.07162667065858841, -0.03026064671576023, -0.10044263303279877, 0.19780206680297852, 0.052941709756851196, 0.03361000865697861, -0.05487487092614174, 0.06813095510005951, -0.019930636510252953, -0.08037666976451874, -0.09594442695379257, -0.09769923239946365, -0.06938325613737106, -0.02406679466366768, -0.022433964535593987, -0.0000370812886103522, 0.019946301355957985, 0.16129803657531738, -0.1766546666622162, 0.005820156540721655, 0.019502799957990646, 0.025063280016183853, 0.12576021254062653, 0.03973018750548363, -0.09110970050096512, 0.008614630438387394, 0.08976193517446518, 0.04042312130331993, 0.020812049508094788, -0.02701348066329956, -0.009878790006041527, -0.09612920880317688, 0.038183435797691345, 0.05946863442659378, 0.08624967187643051, 0.061829857528209686, -0.08638869971036911, -0.05920758843421936, 0.10788554698228836, -0.10592801868915558, -0.05994441360235214, 0.012930598109960556, -0.11144448071718216, -0.03525467962026596, 0.038761842995882034, -0.004053981509059668, -0.06608603894710541, 0.020997531712055206, -0.09838937222957611, -0.04001324623823166, -0.04913489520549774, -0.12063194066286087, 0.07041865587234497, -0.0007522027590312064, 0.006902381777763367, -0.1559203714132309, -0.19432511925697327, -0.07789817452430725, 0.0018969964003190398, -0.03939373046159744, -0.043264828622341156, -0.02757892571389675, -0.041080355644226074, -0.0211869478225708, -0.04977588355541229, -0.023980582132935524, -0.02370382472872734, 0.057263992726802826, -0.05340980365872383, 0.02533639222383499, 0.06156301870942116, 0.01803000085055828, -0.11758944392204285, 0.05043531581759453, -0.14370973408222198, 0.11700170487165451, -0.051090992987155914, -0.08352818340063095, -0.1057715192437172, -0.08183638751506805, 0.008213804103434086, 
0.08164507150650024, 0.052472252398729324, 0.21663710474967957, -0.10117983818054199, 0.01562897115945816, 0.11139646172523499, -0.12359841912984848, -0.13071979582309723, 0.11965155601501465, -0.09554583579301834, 0.11381895840167999, 0.09436139464378357, 0.1291292905807495, 0.04454684257507324, -0.04965141415596008, -0.08085262775421143, -0.04662344232201576, -0.06201549246907234, 0.14215348660945892, 0.028287416324019432, -0.0670660138130188, 0.0352095328271389, -0.0463174469769001, -0.0676073282957077, 0.009727382101118565, -0.027934296056628227, -0.017382021993398666, 0.02683677151799202, -0.018936635926365852, 0.11140955984592438, -0.015452422201633453, -0.042082399129867554, 0.008652924560010433, -0.13164104521274567, -0.07692985981702805, 0.06800151616334915, -0.026041153818368912, 0.008327354677021503, -0.13004276156425476, 0.026955530047416687, -0.002465174300596118, 0.025468280538916588, -0.1981644183397293, -0.14176027476787567, -0.014839421957731247, -0.20551425218582153, 0.08855564892292023, 0.09474139660596848, 0.020244402810931206, -0.010480636730790138, -0.038761038333177567, 0.006139388307929039, -0.07397326827049255, 0.0011476472718641162, -0.05884438380599022, -0.1336348056793213, 0.03227803483605385, -0.03333127498626709, 0.03407791256904602, -0.19248127937316895, -0.0004055957542732358, 0.08644600957632065, 0.0370955653488636, 0.020110435783863068, -0.024799363687634468, 0.053574077785015106, 0.008298049680888653, -0.00039087579352781177, -0.03477887064218521, 0.000352356000803411, -0.03579854592680931, -0.11763636767864227, 0.12429194897413254, -0.1622927486896515, -0.04237399250268936, 0.06281041353940964, -0.05057288333773613, -0.10893881320953369, -0.048099976032972336, -0.008453307673335075, -0.021806320175528526, -0.07058525830507278, -0.043861012905836105, 0.22335229814052582, 0.028741708025336266, 0.025836283341050148, -0.1028752475976944, -0.04864994436502457, -0.028056422248482704, -0.026947153732180595, -0.01402334589511156, 0.04294759780168533, -0.015406335704028606, -0.22349998354911804, 0.07186567783355713, 0.025401443243026733, -0.062280673533678055, 0.10072356462478638, 0.047596581280231476, -0.0570170097053051, -0.022218601778149605, -0.020718373358249664, -0.04297836869955063, 0.11670297384262085, -0.016707787290215492, 0.05818590149283409, 0.0542883463203907, 0.0667390525341034, 0.023307805880904198, -0.09329566359519958, -0.023260939866304398, 0.014342709444463253, -0.009356127120554447, -0.129164919257164, 0.0522305965423584, 0.038275059312582016, 0.12791313230991364, 0.03512440621852875, -0.0032191986683756113, 0.05688362941145897, -0.06816253066062927, -0.17022264003753662, 0.20535193383693695, -0.03725864365696907, -0.20721162855625153, -0.01781454309821129, 0.07344089448451996, -0.013032685033977032, -0.002555091166868806, 0.03944672644138336, -0.054172582924366, -0.0710703432559967, -0.10130702704191208, 0.06518091261386871, -0.0054442621767520905, -0.08042377233505249, -0.04925161972641945, -0.02053668722510338, 0.036260880529880524, -0.09450401365756989, 0.007124255411326885, -0.036265574395656586, -0.023455379530787468, 0.04884634539484978, 0.02518751658499241, 0.13246141374111176, 0.0806046798825264, 0.012733026407659054, -0.05043458566069603, -0.03916846960783005, 0.22253161668777466, -0.14528270065784454, 0.03353618085384369, 0.18335312604904175, -0.07561289519071579, 0.07138367742300034, 0.14092880487442017, 0.01048464234918356, -0.07108701765537262, 0.04204215481877327, 0.07604186236858368, -0.0363534614443779, -0.28440648317337036, 
-0.057944174855947495, -0.008225449360907078, -0.056532490998506546, 0.02039780095219612, 0.02240782603621483, 0.07441898435354233, 0.07954423874616623, -0.10979234427213669, -0.02555873431265354, 0.07668944448232651, 0.0753001868724823, 0.1584247648715973, 0.02499445155262947, 0.04695485159754753, -0.004272592719644308, -0.0009792819619178772, 0.05926791951060295, -0.026629040017724037, 0.31997251510620117, -0.07332810759544373, 0.10416432470083237, 0.12089043855667114, 0.02782507799565792, 0.04493540897965431, -0.0014503314159810543, 0.02547757513821125, 0.026945510879158974, -0.0545848049223423, -0.020300423726439476, -0.019265325739979744, 0.08113780617713928, -0.024761026725172997, -0.03390682488679886, -0.019086135551333427, -0.030767112970352173, 0.04290163889527321, 0.09060855954885483, 0.008811467327177525, -0.10387387871742249, -0.07550731301307678, 0.036406710743904114, -0.07262886315584183, -0.06104752793908119, 0.06010861322283745, 0.10487712174654007, -0.1640489101409912, 0.045120496302843094, -0.01690748706459999, 0.08536066859960556, -0.1303797960281372, -0.0004122104437556118, -0.053952936083078384, 0.05577509105205536, -0.036274805665016174, 0.10525120049715042, -0.25926363468170166, 0.19121797382831573, 0.008635452017188072, 0.043799303472042084, -0.10851985961198807, -0.04436730593442917, 0.05034874752163887, -0.06819919496774673, 0.13665714859962463, 0.009036186151206493, -0.009714058600366116, -0.058431416749954224, 0.020327258855104446, 0.049269866198301315, 0.007318622432649136, -0.002679761964827776, 0.10311386734247208, 0.024069195613265038, 0.05991188436746597, -0.007511392701417208, 0.029619550332427025, -0.14979322254657745, -0.11222584545612335, 0.031359631568193436, -0.0726807713508606, 0.021027199923992157, -0.06540784984827042, -0.050159718841314316, 0.006280913483351469, 0.057892587035894394, -0.1283588707447052, -0.0990513265132904, -0.07376638054847717, 0.06888114660978317, 0.06193714961409569, -0.10355541855096817, 0.021481119096279144, 0.055808473378419876, 0.06935450434684753, -0.021927887573838234, -0.04858439415693283, 0.05867132917046547, -0.07474508136510849, -0.15449683368206024, -0.034885939210653305, 0.012858809903264046, 0.1715914011001587, 0.07029012590646744, 0.012973352335393429, 0.007175033912062645, -0.005510410759598017, -0.16126279532909393, 0.03571466729044914, 0.11212946474552155, 0.06025785580277443, 0.11278121918439865, 0.044679876416921616, -0.010485214181244373, -0.12964247167110443, -0.06964343786239624, 0.06988881528377533, 0.23795372247695923, -0.037295158952474594, 0.029487481340765953, 0.14982959628105164, -0.05736488103866577, -0.21152661740779877, 0.03329557552933693, 0.0387021079659462, -0.01383371651172638, 0.10284330695867538, -0.12737296521663666, -0.0007705772295594215, 0.04785158112645149, -0.005723554641008377, 0.09188612550497055, -0.18662028014659882, -0.10871313512325287, 0.14613878726959229, 0.07541903108358383, 0.08087509870529175, -0.08704448491334915, -0.026688892394304276, 0.008119717240333557, -0.054469700902700424, 0.14516809582710266, -0.12139277905225754, 0.01831051893532276, 0.01660604402422905, 0.10395392775535583, 0.05033494159579277, -0.031040195375680923, 0.1588650941848755, 0.011405589990317822, 0.09285715967416763, -0.10352382808923721, -0.05239420011639595, -0.046319130808115005, -0.028099896386265755, 0.11138474196195602, 0.05218098312616348, 0.03561782091856003, -0.05656395107507706, -0.06364694237709045, -0.13180159032344818, 0.030274147167801857, -0.04087651148438454, -0.04571125656366348, 
-0.09011540561914444, 0.09148922562599182, 0.10406894981861115, -0.004187362734228373, 0.028953230008482933, -0.09785906970500946, 0.0013542009983211756, 0.039364706724882126, 0.24517890810966492, 0.10732036828994751, -0.11499731242656708, 0.014036862179636955, -0.01490709651261568, 0.10464317351579666, -0.13840635120868683, 0.004312965553253889, 0.08719892799854279, 0.03623740375041962, 0.13601627945899963, 0.009454120881855488, -0.1660398542881012, 0.05376182496547699, -0.024139663204550743, -0.09347739070653915, -0.18781296908855438, -0.011273435316979885, -0.005292569287121296, -0.1257074922323227, -0.08922728896141052, 0.1190854087471962, -0.024443022906780243, -0.008294501341879368, -0.0003415816754568368, 0.05491213873028755, 0.028562579303979874, 0.08913836628198624, 0.09475304931402206, 0.05962904170155525, -0.056523196399211884, 0.06924127787351608, 0.057831261307001114, -0.0683140829205513, 0.10416840761899948, 0.023086898028850555, -0.08033578097820282, -0.05333453789353371, -0.14279243350028992, 0.0497857928276062, -0.11516012251377106, -0.0774170532822609, -0.044641636312007904, -0.08790343999862671, -0.012854021042585373, 0.1625642031431198, 0.017241019755601883, 0.021302800625562668, 0.021982142701745033, -0.01922936923801899, -0.04008400812745094, 0.0894588902592659, 0.0400727242231369, 0.0015087057836353779, -0.09089680016040802, -0.0009317563381046057, 0.048796746879816055, 0.05649581179022789, -0.02726208046078682, 0.0024876552633941174, -0.120641328394413, 0.019954456016421318, -0.2417220026254654, 0.04879452660679817, -0.08484239131212234, 0.026324542239308357, -0.023716485127806664, -0.054940346628427505, -0.012544522061944008, 0.03308820724487305, -0.04862765967845917, 0.0017415375914424658, 0.055371321737766266, 0.044121336191892624, -0.14694459736347198, -0.0711154043674469, 0.061193957924842834, -0.06516961753368378, 0.08820310235023499, 0.0008699504542164505, -0.0659903883934021, 0.005423697177320719, -0.14909672737121582, 0.04298603907227516, -0.01789586991071701, 0.02372768148779869, 0.01874607428908348, -0.1735268384218216, -0.017825312912464142, -0.01679406315088272, -0.041246477514505386, 0.020225241780281067, -0.030544638633728027, -0.059288136661052704, 0.025412920862436295, 0.05090922862291336, -0.018075382336974144, -0.10774032771587372, 0.027274176478385925, 0.10609202086925507, 0.008629035204648972, 0.11433643102645874, -0.023159164935350418, 0.09788835793733597, -0.13637304306030273, -0.010250132530927658, 0.052426911890506744, 0.06773634254932404, 0.057541895657777786, -0.009514185599982738, 0.059550631791353226, -0.04374009370803833, 0.1778380125761032, -0.04848882183432579, -0.015250587835907936, 0.059136901050806046, -0.00009410226630279794, -0.008341838605701923, -0.007384396158158779, -0.104965440928936, -0.08136581629514694, -0.027106652036309242, 0.023839641362428665, 0.020640447735786438, -0.045202940702438354, -0.08973965048789978, 0.20452450215816498, 0.06875665485858917, 0.10421673208475113, 0.007110188715159893, -0.013150637969374657, -0.09956645220518112, -0.0025542189832776785, -0.0037484201602637768, 0.07612544298171997, -0.06496153771877289, -0.016319021582603455, 0.04162995517253876, 0.15905317664146423, -0.08138179779052734, 0.09490962326526642, -0.05130958929657936, -0.06348294019699097, -0.12164033204317093, -0.18121255934238434, -0.0506257563829422, -0.0067864577285945415, 0.0303247831761837, -0.12445524334907532, -0.0028699557296931744, 0.1722414791584015, -0.009707941673696041, -0.08170830458402634, 0.10552606731653214, 
-0.004893937613815069, -0.1269891858100891, 0.019841281697154045, 0.003994084428995848, 0.02758082002401352, 0.059759706258773804, 0.03565766289830208, 0.059710364788770676, 0.014903870411217213, 0.044243764132261276, 0.058657508343458176, -0.01894957199692726, 0.01866823621094227, -0.07501407712697983, -0.07509360462427139, -0.013467502780258656, 0.08360139280557632, 0.10471219569444656, 0.24200206995010376, 0.06724639981985092, -0.011607978492975235, -0.004793247673660517, 0.166112020611763, 0.04027334228157997, 0.02114785835146904, -0.10927765816450119, 0.20267289876937866, 0.00502400379627943, -0.025330262258648872, 0.06713847070932388, -0.13668210804462433, 0.07514432072639465, 0.1095677986741066, 0.11666753888130188, -0.12202358990907669, -0.00460050581023097, -0.039396848529577255, 0.005664028227329254, 0.03162391856312752, 0.019867319613695145, 0.0667726919054985, 0.31810498237609863, -0.08628949522972107, 0.072409987449646, -0.009960969910025597, 0.053554732352495193, 0.026601873338222504, 0.12624114751815796, 0.04179093986749649, 0.045512352138757706, -0.13853655755519867, 0.08927576243877411, -0.08651857823133469, -0.22208259999752045, 0.0004611718177329749, -0.04338202625513077, -0.0650092139840126, -0.0026339413598179817, -0.11991534382104874, -0.007326595019549131, 0.0866699367761612, 0.002051533432677388, -0.0018361745169386268, 0.059443771839141846, 0.019016914069652557, -0.06236134096980095, -0.008455348201096058, 0.11790880560874939, 0.014376135542988777, 0.23276658356189728, 0.007498623803257942, 0.2176937311887741, 0.10410235077142715, -0.001610449398867786, -0.10076773166656494, 0.05242547765374184, 0.032978497445583344, -0.021303143352270126, 0.02145817130804062, 0.15270750224590302, 0.02756599709391594, 0.05458131060004234, 0.14742407202720642, -0.05713973566889763, 0.05398385226726532, -0.008243243210017681, -0.07273364067077637, -0.11008748412132263, 0.09728504717350006, -0.10251683741807938, 0.146932914853096, 0.12867094576358795, -0.04162716120481491, 0.01506168581545353, -0.016371188685297966, 0.012235641479492188, -0.02402012050151825, 0.05630616098642349, 0.023494575172662735, -0.09981103241443634, 0.08168446272611618, 0.04846572503447533, 0.03824499249458313, -0.2220836579799652, -0.014149350114166737, 0.02558225765824318, -0.05349740758538246, 0.024806639179587364, 0.036662742495536804, -0.01845088228583336, 0.020265335217118263, -0.05225227400660515, -0.06754567474126816, -0.034941237419843674, 0.11440073698759079, -0.0974632129073143, -0.06151984632015228 ]
null
null
transformers
# `dpr-question_encoder-multiset-base`

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-question_encoder-multiset-base` is the question encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), and [CuratedTREC (TREC)](https://huggingface.co/datasets/trec).

- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
  - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
  - [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base)
  - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer

tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-multiset-base")
model = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-multiset-base")

input_ids = tokenizer("Hello, is my dog cute?", return_tensors="pt")["input_ids"]
# The pooled output is the dense question embedding used for retrieval
embeddings = model(input_ids).pooler_output
```

## Uses

#### Direct Use

`dpr-question_encoder-multiset-base`, [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base), and [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al., 2021](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al., 2021](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the following datasets:

- **[Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open)** ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/))
- **[TriviaQA](https://huggingface.co/datasets/trivia_qa)** ([Joshi et al., 2017](https://aclanthology.org/P17-1147/))
- **[WebQuestions (WQ)](https://huggingface.co/datasets/web_questions)** ([Berant et al., 2013](https://aclanthology.org/D13-1160/))
- **[CuratedTREC (TREC)](https://huggingface.co/datasets/trec)** ([Baudiš & Šedivý, 2015](https://www.aminer.cn/pub/599c7953601a182cd263079b/reading-wikipedia-to-answer-open-domain-questions))

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and use FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}): the fraction of questions for which at least one of the top k retrieved passages contains the answer. The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad). A sketch of how this metric can be computed is given at the end of this model card.
#### Results

|      | Top 20 |           |    |      |       | Top 100|           |    |      |       |
|:----:|:------:|:---------:|:--:|:----:|:-----:|:------:|:---------:|:--:|:----:|:-----:|
|      | NQ     | TriviaQA  | WQ | TREC | SQuAD | NQ     | TriviaQA  | WQ | TREC | SQuAD |
|      | 79.4   | 78.8      |75.0| 89.1 | 51.6  | 86.0   | 84.7      |82.9| 93.9 | 67.6  |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906).

- **Hardware Type:** 8 32GB GPUs
- **Hours used:** Unknown
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown

## Technical Specifications

See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details.

## Citation Information

```bibtex
@inproceedings{karpukhin-etal-2020-dense,
    title = "Dense Passage Retrieval for Open-Domain Question Answering",
    author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-main.550",
    doi = "10.18653/v1/2020.emnlp-main.550",
    pages = "6769--6781",
}
```

## Model Card Authors

This model card was written by the team at Hugging Face.
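As a companion to the results table above, here is a minimal sketch of how top-k retrieval accuracy can be computed with this question encoder and its multiset context encoder. The toy corpus, the two evaluation questions, and the naive substring answer-matching rule are illustrative assumptions; the paper evaluates against a full Wikipedia passage index with more careful answer matching.

```python
# A minimal sketch, assuming a toy in-memory corpus; the paper's evaluation
# retrieves from roughly 21 million Wikipedia passages.
import torch
from transformers import (
    DPRContextEncoder,
    DPRContextEncoderTokenizer,
    DPRQuestionEncoder,
    DPRQuestionEncoderTokenizer,
)

ctx_tok = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
ctx_enc = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-multiset-base")
q_tok = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-multiset-base")
q_enc = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-multiset-base")

passages = [  # illustrative placeholder corpus
    "Paris is the capital and most populous city of France.",
    "Mount Everest is Earth's highest mountain above sea level.",
    "The Nile is a major north-flowing river in Africa.",
]
eval_set = [  # (question, answer string) pairs, also illustrative
    ("What is the capital of France?", "Paris"),
    ("Which mountain is the highest on Earth?", "Everest"),
]

k = 2
with torch.no_grad():
    # Encode every passage once; each row is one passage vector
    p_vecs = ctx_enc(**ctx_tok(passages, padding=True, return_tensors="pt")).pooler_output

    hits = 0
    for question, answer in eval_set:
        q_vec = q_enc(**q_tok(question, return_tensors="pt")).pooler_output
        scores = q_vec @ p_vecs.T                    # dot-product similarity
        top_ids = scores.squeeze(0).topk(k).indices  # indices of the k closest passages
        # Count a question as a hit if any retrieved passage contains the
        # answer string (naive substring matching for illustration only)
        hits += any(answer.lower() in passages[i].lower() for i in top_ids.tolist())

print(f"top-{k} accuracy: {hits / len(eval_set):.2f}")
```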
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open", "trivia_qa", "web_questions", "trec"], "inference": false}
feature-extraction
facebook/dpr-question_encoder-multiset-base
[ "transformers", "pytorch", "tf", "dpr", "feature-extraction", "en", "dataset:nq_open", "dataset:trivia_qa", "dataset:web_questions", "dataset:trec", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us
'dpr-question\_encoder-multiset-base'
=====================================

Table of Contents
-----------------

* Model Details
* How To Get Started With the Model
* Uses
* Risks, Limitations and Biases
* Training
* Evaluation
* Environmental Impact
* Technical Specifications
* Citation Information
* Model Card Authors

Model Details
-------------

Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-question\_encoder-multiset-base' is the question encoder trained using the Natural Questions (NQ) dataset, TriviaQA, WebQuestions (WQ), and CuratedTREC (TREC).

* Developed by: See GitHub repo for model developers
* Model Type: BERT-based encoder
* Language(s): English
* License: CC-BY-NC-4.0, also see Code of Conduct
* Related Models:
	+ 'dpr-ctx\_encoder-multiset-base'
	+ 'dpr-reader-multiset-base'
	+ 'dpr-ctx\_encoder-single-nq-base'
	+ 'dpr-question\_encoder-single-nq-base'
	+ 'dpr-reader-single-nq-base'
* Resources for more information:
	+ Research Paper
	+ GitHub Repo
	+ Hugging Face DPR docs
	+ BERT Base Uncased Model Card

How to Get Started with the Model
---------------------------------

Use the code below to get started with the model.

Uses
----

#### Direct Use

'dpr-question\_encoder-multiset-base', 'dpr-ctx\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.

Risks, Limitations and Biases
-----------------------------

CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Training
--------

#### Training Data

This model was trained using the following datasets:

* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)
* TriviaQA (Joshi et al., 2017)
* WebQuestions (WQ) (Berant et al., 2013)
* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)

#### Training Procedure

The training procedure is described in the associated paper:

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.
The authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

Evaluation
----------

The following evaluation information is extracted from the associated paper.

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.

#### Results

Environmental Impact
--------------------

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

* Hardware Type: 8 32GB GPUs
* Hours used: Unknown
* Cloud Provider: Unknown
* Compute Region: Unknown
* Carbon Emitted: Unknown

Technical Specifications
------------------------

See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.

Model Card Authors
------------------

This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-question\\_encoder-multiset-base', 'dpr-ctx\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n", "#### Direct Use\n\n\n'dpr-question\\_encoder-multiset-base', 'dpr-ctx\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 97, 66, 206, 91, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n#### Direct Use\n\n\n'dpr-question\\_encoder-multiset-base', 'dpr-ctx\\_encoder-multiset-base', and 'dpr-reader-multiset-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)" ]
[ -0.05253488942980766, 0.05612237751483917, -0.004882460925728083, 0.07535283267498016, 0.028257176280021667, 0.005732703488320112, 0.08478092402219772, 0.06588038802146912, 0.08593540638685226, 0.050556544214487076, 0.07033122330904007, 0.0711122676730156, 0.0414542518556118, 0.03251410275697708, -0.007166997995227575, -0.1661292016506195, 0.040120869874954224, -0.0065282247960567474, 0.07617823779582977, 0.14494942128658295, 0.10316848754882812, -0.06368836760520935, 0.060284968465566635, 0.007248703856021166, -0.035974953323602676, -0.05818852782249451, 0.00880265049636364, -0.036589574068784714, 0.10876785218715668, 0.07354111224412918, 0.0745156854391098, 0.023265615105628967, 0.009082222357392311, -0.21512824296951294, 0.037532199174165726, 0.06558798998594284, -0.026082757860422134, 0.07119350135326385, 0.11258583515882492, -0.004850368015468121, 0.15378861129283905, -0.05467857047915459, 0.03925711289048195, 0.07676558941602707, -0.1176522895693779, -0.13946783542633057, -0.11875128000974655, 0.03750089183449745, 0.08110591769218445, 0.10420165210962296, -0.07242801785469055, 0.14657112956047058, -0.07998169213533401, 0.054617151618003845, 0.14754682779312134, -0.07706204056739807, -0.011473631486296654, -0.052035100758075714, 0.01720847561955452, 0.023230988532304764, -0.08146551251411438, -0.013016786426305771, 0.009660286828875542, 0.05769530311226845, 0.023805690929293633, -0.06658101081848145, 0.03446659818291664, -0.051119156181812286, -0.12243431806564331, -0.019824650138616562, 0.09223253279924393, 0.062176335602998734, -0.055636513978242874, -0.1510414481163025, -0.041266877204179764, 0.141396626830101, 0.04528863728046417, -0.08909975737333298, -0.02024511806666851, -0.037818487733602524, 0.05605597421526909, -0.009396703913807869, -0.04686318710446358, 0.001710359356366098, -0.10058965533971786, 0.16236832737922668, 0.03564370423555374, 0.03927329182624817, -0.07409016788005829, -0.0076461853459477425, -0.01172124408185482, -0.08933226764202118, -0.09150288254022598, -0.11040221899747849, -0.08279116451740265, -0.023677075281739235, -0.06810310482978821, -0.08208437263965607, 0.031849946826696396, 0.089935801923275, -0.12167299538850784, 0.029630349949002266, -0.09669239819049835, 0.04092029854655266, 0.16274741291999817, 0.04669729247689247, -0.12564139068126678, 0.01832444593310356, 0.026051541790366173, 0.04351592808961868, 0.02821403555572033, -0.02156670019030571, -0.0074441940523684025, -0.06525246053934097, -0.030470125377178192, 0.030737387016415596, 0.08778641372919083, 0.09930248558521271, -0.08963297307491302, -0.08350364863872528, 0.01274827215820551, -0.07241278141736984, -0.07811175286769867, 0.041669897735118866, -0.05199218913912773, 0.00597676495090127, 0.0037269259337335825, 0.002816862892359495, -0.014326093718409538, 0.0242652278393507, -0.07356436550617218, 0.014094826765358448, -0.07201132923364639, -0.09439558535814285, 0.07077095657587051, 0.006620689760893583, -0.03333894535899162, -0.1340317577123642, -0.15022918581962585, -0.07253635674715042, 0.02636599726974964, -0.030662845820188522, 0.02985726110637188, -0.04861051216721535, -0.021361084654927254, -0.0088939955458045, -0.027895338833332062, -0.09142518788576126, -0.0367206335067749, 0.08007735759019852, -0.07746753841638565, 0.08395767211914062, 0.0405179001390934, 0.020419424399733543, -0.11665859073400497, 0.044808708131313324, -0.13845182955265045, 0.15313255786895752, -0.060494858771562576, -0.07175726443529129, -0.07350770384073257, -0.04014371708035469, -0.0242154560983181, 
0.08377788960933685, 0.028028201311826706, 0.18911948800086975, -0.16670681536197662, 0.0029470149893313646, 0.18475764989852905, -0.09840713441371918, -0.12830401957035065, 0.12938988208770752, -0.11812923848628998, 0.10161097347736359, 0.10476730763912201, 0.13088130950927734, 0.007009145803749561, -0.04053272306919098, -0.07546094059944153, -0.04825504496693611, -0.07634752988815308, 0.18297024071216583, 0.05009162053465843, -0.04157276079058647, 0.026033766567707062, -0.03572872653603554, -0.025392934679985046, 0.03901398554444313, -0.018923383206129074, -0.031053077429533005, 0.029349226504564285, -0.0041678776033222675, 0.21417969465255737, -0.016812218353152275, -0.08253128826618195, 0.04401379078626633, -0.10856450349092484, -0.06331299990415573, 0.0681629478931427, -0.008699627593159676, 0.03439382091164589, -0.10689054429531097, -0.045032888650894165, 0.009740308858454227, 0.038518454879522324, -0.1930021196603775, -0.08590953052043915, -0.04667488485574722, -0.06811219453811646, 0.0891985073685646, 0.07914279401302338, 0.009841703809797764, -0.021493788808584213, -0.05036541447043419, 0.023075513541698456, -0.07068084180355072, 0.0014121747808530927, -0.04415738210082054, -0.12365730106830597, 0.03129998967051506, -0.049581367522478104, 0.017542969435453415, -0.09787227213382721, 0.01395196933299303, 0.06033201515674591, 0.023120857775211334, 0.01860566809773445, -0.016094453632831573, 0.0550650991499424, 0.03841732442378998, 0.0059297713451087475, -0.010991443879902363, 0.02447177655994892, -0.056444428861141205, -0.10073898732662201, 0.1166735589504242, -0.13408216834068298, -0.05862870812416077, 0.06671945750713348, -0.07200824469327927, -0.1052391305565834, -0.024578304961323738, -0.007569462992250919, -0.015045533888041973, -0.05324460566043854, -0.056856654584407806, 0.17962999641895294, 0.04439977928996086, -0.012394200079143047, -0.10322368890047073, -0.058575380593538284, -0.023413624614477158, -0.030483458191156387, -0.015102832578122616, 0.053711358457803726, 0.015523658134043217, -0.2542004883289337, 0.08668011426925659, 0.1089368686079979, -0.06884917616844177, 0.08531657606363297, 0.032258037477731705, -0.0666501373052597, -0.0501754954457283, -0.03401860222220421, -0.042866673320531845, 0.11233652383089066, 0.029089489951729774, 0.05286911502480507, 0.039653826504945755, 0.004338239785283804, -0.0002008637966355309, -0.04980538785457611, -0.025074543431401253, 0.0033055769745260477, 0.022747306153178215, -0.1474202573299408, 0.03806375339627266, 0.07547643780708313, 0.11750283092260361, 0.036423683166503906, -0.008837862871587276, 0.06669249385595322, -0.04843350872397423, -0.14791452884674072, 0.17337851226329803, -0.0968787744641304, -0.2924869656562805, -0.012887299060821533, 0.011884117498993874, -0.05774471536278725, -0.0023334180004894733, 0.034191545099020004, -0.09117849916219711, -0.037911560386419296, -0.06793633848428726, 0.08388020843267441, 0.00777073809877038, -0.06913195550441742, -0.010138125158846378, -0.004053360782563686, 0.003929606173187494, -0.06809081882238388, 0.02900707721710205, -0.061743173748254776, -0.025708716362714767, 0.052723608911037445, 0.02053764835000038, 0.11597058176994324, 0.06696392595767975, 0.043492477387189865, -0.05423145741224289, -0.0541340634226799, 0.18328003585338593, -0.14873570203781128, 0.014701150357723236, 0.14069758355617523, -0.12932313978672028, 0.0670558363199234, 0.159096360206604, 0.01964743807911873, -0.05010908097028732, 0.037020135670900345, 0.0835888534784317, -0.027371151372790337, 
-0.2362094223499298, -0.05751042813062668, -0.015740549191832542, -0.03769448772072792, 0.010360815562307835, -0.0057596019469201565, 0.10081518441438675, 0.05353274941444397, -0.091493621468544, -0.06306208670139313, 0.08988617360591888, 0.045143257826566696, 0.1461700201034546, -0.004089384339749813, 0.06219774857163429, 0.014391033910214901, -0.0167180635035038, 0.04383474215865135, -0.054322317242622375, 0.31618162989616394, -0.04342374950647354, 0.07730454206466675, 0.13574810326099396, 0.04397749528288841, 0.020167764276266098, -0.0332777202129364, -0.02274574339389801, 0.033266402781009674, -0.0589015893638134, -0.04179518669843674, -0.07304194569587708, 0.05071644112467766, 0.028901901096105576, -0.04164052754640579, 0.015756333246827126, -0.056185174733400345, 0.08932559192180634, 0.06227957084774971, -0.027325822040438652, -0.11198782175779343, -0.04171590879559517, 0.01359778456389904, -0.028030071407556534, -0.05057810992002487, 0.06336884200572968, 0.0802563726902008, -0.12450665235519409, 0.0555635541677475, -0.052342768758535385, 0.08317017555236816, -0.09302067011594772, 0.03504723682999611, -0.027866387739777565, 0.007340988144278526, -0.014636454172432423, 0.06406610459089279, -0.25128939747810364, 0.20030154287815094, 0.0348508395254612, 0.016373323276638985, -0.09968230128288269, -0.04102382808923721, 0.05043333023786545, -0.006909082178026438, 0.12798109650611877, 0.017664624378085136, 0.0013315677642822266, -0.08751744031906128, -0.006257845554500818, 0.03414180129766464, 0.043825194239616394, -0.00714495312422514, 0.08460905402898788, 0.031132791191339493, 0.042630426585674286, -0.01595497690141201, 0.03639949858188629, -0.1593317836523056, -0.12805749475955963, 0.06273314356803894, -0.08894746750593185, 0.06860262900590897, -0.032758794724941254, -0.0313684456050396, 0.010517836548388004, 0.028439365327358246, -0.12549231946468353, -0.08412884175777435, -0.09562397003173828, -0.009053850546479225, 0.06456684321165085, -0.10184630006551743, -0.027206962928175926, 0.05825985595583916, 0.050285372883081436, -0.054643310606479645, -0.11420152336359024, 0.08071944117546082, -0.05745924264192581, -0.12155170738697052, -0.060321107506752014, 0.011072621680796146, 0.18397881090641022, 0.08970579504966736, 0.01657242700457573, 0.0476260669529438, 0.030897188931703568, -0.128109872341156, 0.015684684738516808, 0.1746363788843155, 0.04483594000339508, 0.059957411140203476, 0.03440393507480621, -0.05999864265322685, -0.15000197291374207, -0.0266452357172966, 0.08905918896198273, 0.18019291758537292, -0.028515012934803963, 0.13881029188632965, 0.16254264116287231, -0.07074616104364395, -0.2590903341770172, 0.02156982012093067, 0.042248643934726715, -0.03387872874736786, 0.11442475020885468, -0.17741571366786957, 0.0044032796286046505, 0.07205075025558472, 0.015460862778127193, -0.006099684629589319, -0.20779868960380554, -0.10289350152015686, 0.15023890137672424, 0.01955820620059967, 0.04739110916852951, -0.06692475080490112, -0.031102990731596947, -0.0030616887379437685, -0.02541719563305378, 0.1958339363336563, -0.12887586653232574, 0.023424267768859863, 0.01376174483448267, 0.0882936641573906, 0.040852002799510956, -0.009461650624871254, 0.12792763113975525, 0.03357556089758873, 0.07281218469142914, -0.10466320067644119, -0.10275174677371979, 0.018579330295324326, -0.03833722323179245, 0.0620286725461483, 0.03308038040995598, 0.0035913661122322083, -0.09807190299034119, -0.036355163902044296, -0.15866036713123322, 0.029557999223470688, -0.037296928465366364, 
-0.04536380618810654, -0.08318930864334106, 0.08563321828842163, 0.12733282148838043, -0.027796877548098564, 0.08435717970132828, -0.08388636261224747, 0.029537586495280266, 0.051902011036872864, 0.2387794554233551, 0.09260185807943344, -0.12744499742984772, -0.024257326498627663, -0.014367313124239445, 0.11389907449483871, -0.12969639897346497, -0.0008388662827201188, 0.06535742431879044, 0.006487572565674782, 0.12899214029312134, 0.03204130381345749, -0.09989096969366074, 0.08253896981477737, -0.02487412840127945, -0.07899705320596695, -0.14097584784030914, -0.019495200365781784, 0.02346213161945343, -0.1639215499162674, -0.06967227160930634, 0.13720495998859406, -0.01992352120578289, -0.025714358314871788, 0.02033354714512825, 0.09538254141807556, 0.020204247906804085, 0.021786849945783615, 0.058241818100214005, 0.04313119500875473, -0.048580143600702286, 0.01408854778856039, 0.027286188676953316, -0.10122106969356537, 0.09176025539636612, 0.051687091588974, -0.07611165195703506, -0.05851909890770912, -0.1192355826497078, 0.04752255231142044, -0.05256430804729462, -0.04282552748918533, 0.029927093535661697, -0.07696904987096786, 0.00561140663921833, 0.2367118000984192, 0.012319992296397686, 0.05824770778417587, -0.019540928304195404, -0.0058334288187325, -0.0012913690879940987, 0.0899345874786377, 0.008174126967787743, 0.011341629549860954, -0.02329706773161888, 0.017676254734396935, 0.01626488007605076, 0.0666879341006279, -0.037377603352069855, -0.03389601409435272, -0.12203465402126312, 0.023689284920692444, -0.21929270029067993, 0.04806070029735565, -0.11808084696531296, 0.028480220586061478, -0.038940832018852234, -0.02416452392935753, 0.011453451588749886, 0.011142976582050323, -0.045803721994161606, 0.012870854698121548, 0.05730431526899338, 0.05492072179913521, -0.16223850846290588, -0.029187055304646492, 0.07903622090816498, -0.060421451926231384, 0.0984877198934555, 0.013157752342522144, -0.06881235539913177, 0.0222731102257967, -0.15235406160354614, 0.01550600677728653, -0.033721186220645905, 0.032577551901340485, -0.00856659933924675, -0.10995977371931076, -0.034323737025260925, -0.03375624865293503, -0.020443102344870567, 0.045254338532686234, 0.008950582705438137, -0.05201796814799309, 0.07815880328416824, 0.0438661091029644, -0.04165385663509369, -0.08965978771448135, 0.016767509281635284, 0.10569238662719727, 0.03394289314746857, 0.11777129769325256, -0.032555557787418365, 0.1024497002363205, -0.1124669760465622, -0.004835314583033323, 0.056947771459817886, 0.05709003657102585, 0.0067141614854335785, -0.0656542032957077, 0.054948821663856506, -0.04524330422282219, 0.128137469291687, -0.017348257824778557, -0.07227720320224762, 0.05604300647974014, 0.023112304508686066, -0.05998431146144867, 0.02014322206377983, -0.07027560472488403, -0.06252843141555786, -0.058735255151987076, 0.024081159383058548, 0.004499041009694338, -0.05816378444433212, -0.10860683023929596, 0.16880977153778076, 0.12283489853143692, 0.1973704695701599, -0.013947493396699429, 0.017048994079232216, -0.0252163615077734, 0.06632068008184433, 0.029776059091091156, 0.03499715030193329, -0.04406588524580002, -0.01754346303641796, 0.10588441789150238, 0.09162712842226028, -0.0408521369099617, 0.08776701241731644, -0.04052533954381943, -0.07120812684297562, -0.08777505159378052, -0.12357872724533081, -0.03464442119002342, -0.0075387428514659405, 0.01906510815024376, -0.11978890746831894, 0.01026740949600935, 0.17126987874507904, -0.00046382626169361174, -0.06476246565580368, 0.10272058844566345, 
0.004913468845188618, -0.1292562186717987, 0.05589508265256882, -0.007431806996464729, 0.010984862223267555, 0.012873005121946335, 0.0069725727662444115, 0.0648900493979454, 0.03699074685573578, 0.06031659618020058, 0.06934398412704468, -0.006861341185867786, -0.019413825124502182, -0.13538207113742828, -0.09440343081951141, 0.025684047490358353, 0.07785699516534805, 0.08459359407424927, 0.18847189843654633, 0.07411491125822067, -0.0140744149684906, 0.010633789002895355, 0.11789573729038239, 0.01606782153248787, 0.021011635661125183, -0.1289588212966919, 0.1533285230398178, -0.03454437106847763, -0.005853723734617233, 0.013036562129855156, -0.1044982522726059, 0.04320606216788292, 0.11002423614263535, 0.14155644178390503, -0.12817156314849854, -0.005179948173463345, -0.05775018781423569, 0.025629714131355286, 0.03059258870780468, 0.05821043998003006, 0.037245701998472214, 0.30013707280158997, -0.09082120656967163, 0.06616505980491638, 0.0008638757863081992, 0.012212988920509815, 0.04611850902438164, 0.037764739245176315, 0.011896586045622826, 0.015815040096640587, -0.1233653873205185, 0.1437501460313797, -0.14491386711597443, -0.15417295694351196, 0.0003554810828063637, -0.041519276797771454, -0.08072280883789062, 0.007352154236286879, -0.0181833915412426, 0.016034184023737907, 0.0861675962805748, 0.03411174193024635, -0.012530224397778511, 0.116585873067379, 0.018953803926706314, -0.06405679136514664, 0.0015937539283186197, 0.0995204970240593, -0.037985119968652725, 0.19167664647102356, 0.013688414357602596, 0.18729187548160553, 0.08800123631954193, 0.01980023831129074, -0.08775506913661957, -0.007919357158243656, 0.037009693682193756, -0.033663056790828705, -0.01901381090283394, 0.12966381013393402, 0.02211136370897293, 0.11066686362028122, 0.14828073978424072, -0.10162966698408127, 0.05987846851348877, -0.02854984812438488, -0.0792180597782135, -0.07542400062084198, 0.11332804709672928, -0.07918286323547363, 0.12803389132022858, 0.13292300701141357, -0.02636275254189968, -0.007829989306628704, -0.01491216104477644, -0.03866248205304146, -0.029284190386533737, 0.030354825779795647, -0.024540912359952927, -0.11860479414463043, 0.0004844492650590837, 0.06263548880815506, 0.04390709102153778, -0.21155261993408203, -0.030732538551092148, 0.04015246778726578, -0.010402572341263294, 0.04914253577589989, 0.034502867609262466, -0.025328975170850754, -0.028721526265144348, -0.016432741656899452, -0.09716310352087021, 0.008459486067295074, 0.10394909232854843, -0.06533295661211014, -0.02875460311770439 ]
null
null
transformers
# `dpr-question_encoder-single-nq-base`

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-question_encoder-single-nq-base` is the question encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)).

- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
  - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
  - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
  - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer

tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
model = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
input_ids = tokenizer("Hello, is my dog cute?", return_tensors="pt")["input_ids"]
embeddings = model(input_ids).pooler_output
```

## Uses

#### Direct Use

`dpr-question_encoder-single-nq-base`, [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base), and [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
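Beyond producing an embedding, the intended use of this encoder is to score questions against passages embedded by the matching context encoder. The sketch below is illustrative rather than part of the official card: the question and passages are invented, and it assumes the companion `facebook/dpr-ctx_encoder-single-nq-base` checkpoint listed under Related Models.

```python
import torch
from transformers import (
    DPRContextEncoder,
    DPRContextEncoderTokenizer,
    DPRQuestionEncoder,
    DPRQuestionEncoderTokenizer,
)

q_tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
q_encoder = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
ctx_tokenizer = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
ctx_encoder = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")

question = "Who wrote the opera Tosca?"  # invented example question
passages = [                             # invented example passages
    "Tosca is an opera by Giacomo Puccini, first performed in Rome in 1900.",
    "The Eiffel Tower is a wrought-iron lattice tower on the Champ de Mars in Paris.",
]

with torch.no_grad():
    q_emb = q_encoder(**q_tokenizer(question, return_tensors="pt")).pooler_output
    p_emb = ctx_encoder(**ctx_tokenizer(passages, padding=True, return_tensors="pt")).pooler_output

# DPR's similarity function is the inner product between question and passage vectors.
scores = q_emb @ p_emb.T
print(passages[scores.argmax().item()])  # -> the Puccini passage
```

The inner product is the similarity DPR is trained with, so the highest-scoring passage is the retrieval candidate passed on to a reader model.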
## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al., 2021](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al., 2021](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)). The model authors write that:

> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and use FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives. A minimal FAISS sketch of this encode-index-retrieve loop appears at the end of this card.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).

#### Results

| Top-k accuracy | NQ | TriviaQA | WQ | TREC | SQuAD |
|:--------------:|:----:|:--------:|:----:|:----:|:-----:|
| k = 20  | 78.4 | 79.4 | 73.2 | 79.8 | 63.2 |
| k = 100 | 85.4 | 85.0 | 81.4 | 89.1 | 77.2 |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906).
- **Hardware Type:** 8 32GB GPUs - **Hours used:** Unknown - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details. ## Citation Information ```bibtex @inproceedings{karpukhin-etal-2020-dense, title = "Dense Passage Retrieval for Open-Domain Question Answering", author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau", booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)", month = nov, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2020.emnlp-main.550", doi = "10.18653/v1/2020.emnlp-main.550", pages = "6769--6781", } ``` ## Model Card Authors This model card was written by the team at Hugging Face.
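As noted in the Training Procedure section above, run-time retrieval reduces to maximum-inner-product search over an index of passage vectors. The following is a minimal, illustrative FAISS sketch of that encode-index-retrieve loop; it is not from the DPR repository, and the random vectors stand in for the outputs of the passage encoder EP(·) and question encoder EQ(·).

```python
import faiss
import numpy as np

d = 768  # hidden size of the BERT-base encoders used by DPR
passage_vectors = np.random.rand(1000, d).astype("float32")  # stand-in for EP(passage) over 1000 passages
question_vector = np.random.rand(1, d).astype("float32")     # stand-in for EQ(question)

# Exact maximum-inner-product index over all passage vectors.
index = faiss.IndexFlatIP(d)
index.add(passage_vectors)

k = 5
scores, ids = index.search(question_vector, k)  # ids[0] holds the top-k passage indices
```

`IndexFlatIP` performs exact search; for large corpora, FAISS also offers approximate indexes (e.g. IVF or HNSW variants) that trade a little accuracy for much faster search.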
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open"], "inference": false}
feature-extraction
facebook/dpr-question_encoder-single-nq-base
[ "transformers", "pytorch", "tf", "dpr", "feature-extraction", "en", "dataset:nq_open", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us
'dpr-question\_encoder-single-nq-base'
======================================

Table of Contents
-----------------

* Model Details
* How To Get Started With the Model
* Uses
* Risks, Limitations and Biases
* Training
* Evaluation
* Environmental Impact
* Technical Specifications
* Citation Information
* Model Card Authors

Model Details
-------------

Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-question\_encoder-single-nq-base' is the question encoder trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019).

* Developed by: See GitHub repo for model developers
* Model Type: BERT-based encoder
* Language(s): English
* License: CC-BY-NC-4.0, also see Code of Conduct
* Related Models:
	+ 'dpr-ctx\_encoder-single-nq-base'
	+ 'dpr-reader-single-nq-base'
	+ 'dpr-ctx\_encoder-multiset-base'
	+ 'dpr-question\_encoder-multiset-base'
	+ 'dpr-reader-multiset-base'
* Resources for more information:
	+ Research Paper
	+ GitHub Repo
	+ Hugging Face DPR docs
	+ BERT Base Uncased Model Card

How to Get Started with the Model
---------------------------------

Use the code below to get started with the model.

Uses
----

#### Direct Use

'dpr-question\_encoder-single-nq-base', 'dpr-ctx\_encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.

Risks, Limitations and Biases
-----------------------------

CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

Training
--------

#### Training Data

This model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:

> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.

#### Training Procedure

The training procedure is described in the associated paper:

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.
The authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

Evaluation
----------

The following evaluation information is extracted from the associated paper.

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.

#### Results

Environmental Impact
--------------------

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

* Hardware Type: 8 32GB GPUs
* Hours used: Unknown
* Cloud Provider: Unknown
* Compute Region: Unknown
* Carbon Emitted: Unknown

Technical Specifications
------------------------

See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.

Model Card Authors
------------------

This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-question\\_encoder-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n", "#### Direct Use\n\n\n'dpr-question\\_encoder-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 75, 75, 206, 94, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #feature-extraction #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n#### Direct Use\n\n\n'dpr-question\\_encoder-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-reader-single-nq-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>" ]
[ -0.0777929425239563, 0.09039101749658585, -0.0025135488249361515, 0.06719013303518295, 0.05821390450000763, -0.016167959198355675, 0.08708067983388901, 0.0842328667640686, 0.1059616282582283, 0.05392447113990784, 0.056535620242357254, 0.056134939193725586, 0.034569449722766876, 0.09352144598960876, -0.018778767436742783, -0.16647928953170776, 0.017933489754796028, -0.010241306386888027, 0.06326161324977875, 0.12467759847640991, 0.07965897768735886, -0.06669623404741287, 0.06481067836284637, 0.026408473029732704, -0.036272380501031876, -0.04548395425081253, -0.00851080846041441, -0.03676030412316322, 0.09827613085508347, 0.09192893654108047, 0.08028078079223633, 0.005509940907359123, 0.010733992792665958, -0.2375611662864685, 0.04030744358897209, 0.056303855031728745, -0.01677294261753559, 0.0786423534154892, 0.0955045074224472, 0.013807595707476139, 0.14357313513755798, -0.053173940628767014, 0.04227166622877121, 0.09618313610553741, -0.07038794457912445, -0.1315969079732895, -0.08662856370210648, -0.0016117143677547574, 0.06293626129627228, 0.1164940819144249, -0.05372854322195053, 0.1592394858598709, -0.06697406619787216, 0.029993733391165733, 0.13770754635334015, -0.12250090390443802, -0.0061815944500267506, -0.042439449578523636, -0.012001140974462032, 0.006471787579357624, -0.0635058805346489, 0.0035278620198369026, 0.03192820027470589, 0.049182236194610596, 0.04401147738099098, -0.049187932163476944, 0.03732487931847572, -0.04744661599397659, -0.14816118776798248, -0.03391537815332413, 0.13629049062728882, 0.05506976321339607, -0.08588805794715881, -0.1713249385356903, -0.046054668724536896, 0.16388531029224396, 0.056962814182043076, -0.09936217218637466, -0.016174716874957085, -0.035065487027168274, 0.0945788323879242, -0.07682199031114578, -0.07248614728450775, -0.031492628157138824, -0.08939671516418457, 0.1844983547925949, 0.05188033729791641, 0.03159702196717262, -0.07208529114723206, 0.03848307952284813, -0.0061678229831159115, -0.07695803791284561, -0.08789627999067307, -0.10919371992349625, -0.06799258291721344, -0.022742051631212234, -0.03247201442718506, -0.0152210034430027, 0.011902169324457645, 0.1236913874745369, -0.12405035644769669, 0.00857703946530819, -0.018932195380330086, 0.03984924033284187, 0.13857755064964294, 0.04566023871302605, -0.11075348407030106, 0.00891639944165945, 0.0703323557972908, 0.02029893733561039, -0.005657521542161703, -0.03465238958597183, -0.01622913032770157, -0.06694146990776062, 0.014296326786279678, 0.04323163628578186, 0.1193738579750061, 0.07247879356145859, -0.04772397130727768, -0.08318659663200378, 0.09886433929204941, -0.09805548191070557, -0.08264358341693878, 0.008360614068806171, -0.09557768702507019, -0.0056472234427928925, 0.026586472988128662, -0.03824527934193611, -0.06994298845529556, 0.013222793117165565, -0.09879062324762344, -0.009614318609237671, -0.069480299949646, -0.11449406296014786, 0.05339428782463074, 0.017609942704439163, -0.020179864019155502, -0.1449376940727234, -0.16937994956970215, -0.09343919157981873, 0.015441165305674076, -0.03547846898436546, 0.004849007353186607, -0.03332807868719101, -0.0583341084420681, -0.018402179703116417, -0.041606202721595764, -0.07838105410337448, -0.02907801792025566, 0.06128055602312088, -0.05934426188468933, 0.0335647352039814, 0.018712176010012627, 0.025374282151460648, -0.11174950003623962, 0.030421270057559013, -0.1378997266292572, 0.12590636312961578, -0.06062018498778343, -0.07451937347650528, -0.08785094320774078, -0.06000390276312828, 0.007712306920439005, 
0.08696694672107697, 0.03458312898874283, 0.20394043624401093, -0.16347244381904602, 0.0268661268055439, 0.1765255630016327, -0.11668092757463455, -0.1396210789680481, 0.11179832369089127, -0.08900490403175354, 0.1253504753112793, 0.08912888914346695, 0.1119300052523613, 0.09046421200037003, -0.08681099861860275, -0.08159994333982468, -0.05642946809530258, -0.08143997192382812, 0.11623474955558777, 0.026171432808041573, -0.05960763618350029, 0.0612812377512455, -0.02685597725212574, -0.08902477473020554, 0.03571062162518501, -0.01604842208325863, -0.025825463235378265, 0.02986983209848404, -0.024525301530957222, 0.11366331577301025, -0.005557190626859665, -0.03593456745147705, 0.05034951865673065, -0.12423902750015259, -0.03077968768775463, 0.05971745401620865, -0.010839764960110188, 0.013123424723744392, -0.11776955425739288, 0.004861228168010712, 0.007026315201073885, 0.032804399728775024, -0.2053331732749939, -0.1207016259431839, -0.02818135730922222, -0.1568923145532608, 0.09964495152235031, 0.08577807247638702, 0.009251197800040245, -0.003069893456995487, -0.040336184203624725, 0.011906088329851627, -0.06581873446702957, -0.0006199919153004885, -0.05034291744232178, -0.130625918507576, 0.036178361624479294, -0.03718820586800575, -0.01376383937895298, -0.12973695993423462, 0.00960841029882431, 0.053875524550676346, 0.012550140731036663, 0.016033073887228966, -0.0225677452981472, 0.053130876272916794, 0.02991710789501667, -0.0047507681883871555, -0.03141717612743378, 0.017569130286574364, -0.05410890281200409, -0.07438201457262039, 0.12807200849056244, -0.20054660737514496, -0.06210048869252205, 0.07883692532777786, -0.1022508516907692, -0.11634045094251633, -0.054789453744888306, -0.012425419874489307, -0.020367972552776337, -0.06452689319849014, -0.04797714948654175, 0.23532846570014954, 0.027209222316741943, 0.029490342363715172, -0.09749287366867065, -0.05384000763297081, -0.019430603832006454, -0.014553500339388847, -0.002137033501639962, 0.014866757206618786, 0.020917246118187904, -0.20752175152301788, 0.07512150704860687, 0.054307721555233, -0.04315486177802086, 0.15256167948246002, 0.0501951240003109, -0.0887838676571846, -0.030698705464601517, 0.012714562006294727, -0.049453254789114, 0.12803968787193298, -0.00609286455437541, 0.022663939744234085, 0.04672505706548691, 0.040748342871665955, 0.04266141727566719, -0.0809350460767746, -0.005971712525933981, 0.006258235778659582, -0.006645176559686661, -0.11084780842065811, 0.026643378660082817, 0.050645869225263596, 0.12965001165866852, 0.03835461288690567, 0.004249535501003265, 0.038814324885606766, -0.06616164743900299, -0.15704533457756042, 0.19040468335151672, -0.0693371370434761, -0.2333092987537384, -0.029160184785723686, 0.07800416648387909, -0.024516131728887558, 0.008992485702037811, 0.028833001852035522, -0.06189374625682831, -0.05441684275865555, -0.08443304151296616, 0.07085596024990082, -0.02828189916908741, -0.07785756140947342, -0.07155725359916687, 0.015153327025473118, 0.015307179652154446, -0.07895292341709137, 0.022827858105301857, -0.0483841672539711, -0.03600134328007698, 0.049955662339925766, 0.04811958223581314, 0.11950736492872238, 0.06733173131942749, 0.020476773381233215, -0.039906758815050125, -0.05552700534462929, 0.17416220903396606, -0.14022944867610931, 0.03545895218849182, 0.1676623672246933, -0.07289206981658936, 0.07271357625722885, 0.1378096640110016, 0.005061788484454155, -0.0705769881606102, 0.042583249509334564, 0.07776281237602234, -0.04185852035880089, -0.24416010081768036, 
-0.08343558013439178, -0.030170517042279243, -0.03375955671072006, 0.025213489308953285, 0.016312843188643456, 0.06812019646167755, 0.05540880188345909, -0.1045057475566864, -0.062334492802619934, 0.05829256400465965, 0.07233475893735886, 0.16165342926979065, -0.002143091754987836, 0.05559568479657173, 0.01857772096991539, -0.0031658466905355453, 0.07300911843776703, -0.07542303204536438, 0.33150243759155273, -0.06857780367136002, 0.05901385843753815, 0.12803404033184052, 0.05405596271157265, 0.029671937227249146, -0.0072527495212852955, 0.04131651669740677, 0.04300970211625099, -0.0479138046503067, -0.042064279317855835, -0.04906100779771805, 0.08991970866918564, -0.02347792126238346, -0.07602889835834503, 0.0039156354032456875, -0.033073633909225464, 0.03884897753596306, 0.08326040953397751, -0.01331940945237875, -0.08894092589616776, -0.0652642473578453, 0.017841756343841553, -0.05623208358883858, -0.0795099213719368, 0.061981141567230225, 0.11916089802980423, -0.14750978350639343, 0.02396024949848652, -0.03664805740118027, 0.09349294751882553, -0.12537914514541626, 0.011586582288146019, -0.034695789217948914, 0.01630662940442562, -0.013786222785711288, 0.08146482706069946, -0.23009556531906128, 0.1964137852191925, 0.031957659870386124, 0.02828010357916355, -0.1160251572728157, -0.02009933441877365, 0.055797260254621506, -0.09202488511800766, 0.13272707164287567, 0.0187526922672987, 0.011939279735088348, -0.10003826767206192, -0.0066548497416079044, 0.039799027144908905, 0.05138646066188812, -0.00653481762856245, 0.08555048704147339, 0.023727547377347946, 0.04871164262294769, -0.010435463860630989, 0.036210350692272186, -0.1436290442943573, -0.1252836138010025, 0.04923715814948082, -0.07135917991399765, 0.03176535665988922, -0.03320867940783501, -0.035582058131694794, 0.02803615666925907, 0.08267199993133545, -0.1413867473602295, -0.09412195533514023, -0.09395864605903625, 0.03252773359417915, 0.07157698273658752, -0.10141202807426453, 0.023323625326156616, 0.06849197298288345, 0.04314751923084259, -0.0063843270763754845, -0.0876225158572197, 0.07937302440404892, -0.08200670033693314, -0.11632974445819855, -0.037423405796289444, 0.0013545092660933733, 0.18256404995918274, 0.06285933405160904, 0.0121184466406703, 0.013141436502337456, -0.0028761671856045723, -0.16567525267601013, 0.007467416115105152, 0.1676224023103714, 0.052050065249204636, 0.08897801488637924, 0.018390439450740814, -0.06303949654102325, -0.11808348447084427, -0.03690968081355095, 0.06146308407187462, 0.19809302687644958, -0.033485978841781616, 0.07697474211454391, 0.14187553524971008, -0.07023938000202179, -0.23370586335659027, 0.0019178345100954175, 0.036158595234155655, -0.026550758630037308, 0.11902537941932678, -0.15936192870140076, 0.017120469361543655, 0.0314774326980114, 0.00016273774963337928, 0.09572028368711472, -0.13502366840839386, -0.11248132586479187, 0.16402384638786316, 0.04276802018284798, 0.1066012978553772, -0.08300791680812836, -0.028192393481731415, 0.03374336287379265, 0.00999448448419571, 0.17732588946819305, -0.09560275077819824, 0.01267335657030344, 0.008714640513062477, 0.10649954527616501, 0.05586143955588341, -0.0313059464097023, 0.11298055946826935, 0.014671415090560913, 0.06829240173101425, -0.11268014460802078, -0.018376708030700684, 0.010958435945212841, -0.029942534863948822, 0.08993147313594818, 0.06421788781881332, 0.03577353060245514, -0.11440590023994446, -0.05296730995178223, -0.12364116311073303, 0.014862766489386559, -0.02328190766274929, -0.06095439940690994, 
-0.06991057097911835, 0.08941216766834259, 0.11992267519235611, -0.012542945332825184, 0.002850652439519763, -0.0925958976149559, 0.007400524336844683, 0.07142174988985062, 0.2352445274591446, 0.08550950884819031, -0.10755903273820877, 0.0062090735882520676, -0.005408860743045807, 0.10875296592712402, -0.13584163784980774, 0.00386839359998703, 0.07989884912967682, 0.03336296230554581, 0.0964578166604042, 0.021990453824400902, -0.11732227355241776, 0.06370706856250763, -0.02912822552025318, -0.0950266569852829, -0.14214015007019043, -0.020126832649111748, 0.009011274203658104, -0.12288523465394974, -0.08479893207550049, 0.12012547254562378, -0.021349122747778893, -0.01751873642206192, 0.013354826718568802, 0.061362918466329575, 0.054703496396541595, 0.056774482131004333, 0.0646418035030365, 0.042017947882413864, -0.04605982452630997, 0.030290935188531876, 0.03435313701629639, -0.07170671224594116, 0.11304941028356552, 0.03281594067811966, -0.08345723897218704, -0.0651482343673706, -0.09810438752174377, -0.014440873637795448, -0.10547501593828201, -0.05491996183991432, 0.0010826833313331008, -0.038985319435596466, -0.017606569454073906, 0.17251281440258026, 0.02098754048347473, 0.022856784984469414, 0.0026449989527463913, -0.013590643182396889, -0.02146230638027191, 0.07164598256349564, -0.010203907266259193, -0.018193157389760017, -0.055385246872901917, 0.02697194553911686, 0.034979384392499924, 0.0543508306145668, -0.04452773928642273, -0.022950900718569756, -0.10847941786050797, 0.01641913130879402, -0.21820935606956482, 0.06668797135353088, -0.11925439536571503, 0.005067954305559397, -0.03046068176627159, -0.04083873704075813, -0.014507022686302662, 0.015804985538125038, -0.05481617897748947, 0.0037188015412539244, 0.05748661980032921, 0.05165810137987137, -0.14107221364974976, -0.033887382596731186, 0.051374875009059906, -0.050706006586551666, 0.0778365507721901, 0.005412581376731396, -0.06603719294071198, 0.019186941906809807, -0.12351299077272415, 0.02017989195883274, -0.028455225750803947, 0.03373385965824127, 0.014742200262844563, -0.11981707066297531, -0.02093592844903469, 0.0003809217887464911, -0.02348501607775688, 0.02155914157629013, 0.005688028410077095, -0.07075119763612747, 0.023171599954366684, 0.05154392123222351, -0.00849863700568676, -0.09085549414157867, 0.01385891530662775, 0.11256080120801926, 0.014267461374402046, 0.09222543239593506, -0.03441093862056732, 0.10164099186658859, -0.14159409701824188, -0.0029302218463271856, 0.0520482137799263, 0.09157788753509521, 0.07409918308258057, -0.025498591363430023, 0.06015375256538391, -0.04325965419411659, 0.18388424813747406, -0.02911847084760666, 0.008548635058104992, 0.05855010822415352, 0.013198846019804478, -0.017131706699728966, -0.017681671306490898, -0.06937555223703384, -0.060894351452589035, -0.024927303194999695, 0.04267231374979019, 0.03403458371758461, -0.0651247650384903, -0.09714479744434357, 0.1844026744365692, 0.07840125262737274, 0.09188051521778107, -0.0008101984276436269, -0.01051750686019659, -0.08421678096055984, -0.00016262142162304372, -0.010630580596625805, 0.04075715318322182, -0.040398772805929184, -0.04535995051264763, 0.08475936204195023, 0.1265076845884323, -0.04853641986846924, 0.11126477271318436, -0.01812324859201908, -0.06525396555662155, -0.12088849395513535, -0.17668363451957703, -0.02351500652730465, -0.015281178057193756, 0.016618814319372177, -0.11934002488851547, 0.018134063109755516, 0.15763066709041595, -0.012375871650874615, -0.0735340565443039, 0.10511929541826248, 
0.010508846491575241, -0.15933459997177124, 0.04408034309744835, 0.0022845109924674034, 0.035120993852615356, 0.015651501715183258, 0.027501191943883896, 0.04656904190778732, 0.033266328275203705, 0.05264472961425781, 0.05135224387049675, -0.0229700468480587, 0.009658107534050941, -0.12534132599830627, -0.07757425308227539, -0.02674012817442417, 0.08285143226385117, 0.0874752625823021, 0.23181743919849396, 0.05918489024043083, -0.03536928445100784, -0.007733779959380627, 0.18510843813419342, 0.005540978163480759, 0.020777855068445206, -0.11849850416183472, 0.18486815690994263, 0.009003400802612305, -0.023186039179563522, 0.029896622523665428, -0.117589071393013, 0.06481145322322845, 0.14063476026058197, 0.16164103150367737, -0.09610988944768906, -0.001962097827345133, -0.04842675104737282, 0.012491738423705101, 0.021052926778793335, 0.04237684980034828, 0.043279293924570084, 0.3482849895954132, -0.062089644372463226, 0.04667673259973526, -0.0024741976521909237, 0.029350923374295235, 0.027737323194742203, 0.07416857033967972, 0.0429806225001812, 0.02739799953997135, -0.12052623182535172, 0.11520017683506012, -0.11682197451591492, -0.1746339499950409, 0.0025123555678874254, -0.04897307604551315, -0.07072625309228897, 0.010946611873805523, -0.0817398950457573, 0.009206213057041168, 0.08198939263820648, 0.008639005944132805, 0.0021306362468749285, 0.10738325864076614, 0.009877067990601063, -0.09439297020435333, -0.01985667645931244, 0.08718510717153549, -0.04672863706946373, 0.20859840512275696, 0.007915528491139412, 0.1922055035829544, 0.09813356399536133, 0.011495688930153847, -0.10374584794044495, 0.03300146386027336, 0.03594020754098892, -0.07302805781364441, 0.015694214031100273, 0.17705696821212769, 0.010130996815860271, 0.07994706928730011, 0.1396106630563736, -0.05915633216500282, 0.038788892328739166, -0.03990798443555832, -0.0679742619395256, -0.09636175632476807, 0.10031458735466003, -0.09819891303777695, 0.1584334820508957, 0.11986277252435684, -0.0436645969748497, 0.005829922389239073, -0.026439949870109558, -0.015264363028109074, -0.04991058260202408, 0.03656903654336929, -0.01760827749967575, -0.10206539928913116, 0.03787955641746521, 0.04455169662833214, 0.03423256054520607, -0.23321466147899628, 0.0035597304813563824, 0.03849860653281212, -0.03021395206451416, 0.022797150537371635, 0.0444299653172493, -0.044207751750946045, 0.008680115453898907, -0.03946569934487343, -0.05738910287618637, -0.0169240552932024, 0.09244221448898315, -0.10353651642799377, -0.047680340707302094 ]
null
null
transformers
# `dpr-reader-multiset-base`

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-reader-multiset-base` is the reader model trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), and [CuratedTREC (TREC)](https://huggingface.co/datasets/trec).

- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
  - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
  - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
  - [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import DPRReader, DPRReaderTokenizer

tokenizer = DPRReaderTokenizer.from_pretrained("facebook/dpr-reader-multiset-base")
model = DPRReader.from_pretrained("facebook/dpr-reader-multiset-base")
encoded_inputs = tokenizer(
    questions=["What is love?"],
    titles=["Haddaway"],
    texts=["'What Is Love' is a song recorded by the artist Haddaway"],
    return_tensors="pt",
)
outputs = model(**encoded_inputs)
start_logits = outputs.start_logits
end_logits = outputs.end_logits
relevance_logits = outputs.relevance_logits
```

## Uses

#### Direct Use

`dpr-reader-multiset-base`, [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base), and [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
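The relevance, start, and end logits above still need to be turned into an answer string. A simple greedy decoding (a sketch written for this card, not the paper's exact inference procedure) picks the most relevant passage and the best span within it:

```python
# Pick the passage the reader scores as most relevant, then greedily pick the
# best start position and the best end position at or after it.
best_doc = outputs.relevance_logits.argmax().item()
start = outputs.start_logits[best_doc].argmax().item()
end = start + outputs.end_logits[best_doc][start:].argmax().item()

# Note: in a robust implementation the span should be constrained to passage
# tokens so it cannot bleed into the question or title segment.
answer_ids = encoded_inputs["input_ids"][best_doc][start : end + 1]
print(tokenizer.decode(answer_ids))
```

For a fuller n-best decoding over spans and passages, `transformers` also provides a `decode_best_spans` helper on the reader tokenizer.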
## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the following datasets:

- **[Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open)** ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/))
- **[TriviaQA](https://huggingface.co/datasets/trivia_qa)** ([Joshi et al., 2017](https://aclanthology.org/P17-1147/))
- **[WebQuestions (WQ)](https://huggingface.co/datasets/web_questions)** ([Berant et al., 2013](https://aclanthology.org/D13-1160/))
- **[CuratedTREC (TREC)](https://huggingface.co/datasets/trec)** ([Baudiš & Šedivý, 2015](https://www.aminer.cn/pub/599c7953601a182cd263079b/reading-wikipedia-to-answer-open-domain-questions))

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and use FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).
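To make the metric concrete, a simplified top-k accuracy can be sketched as below. This version (written for this card, not taken from the paper's evaluation code) checks lowercase substring containment, whereas the paper matches normalized gold answer spans:

```python
from typing import List

def top_k_accuracy(retrieved: List[List[str]], gold_answers: List[List[str]], k: int) -> float:
    """Fraction of questions for which at least one of the top-k retrieved
    passages contains one of the gold answer strings."""
    hits = 0
    for passages, answers in zip(retrieved, gold_answers):
        if any(ans.lower() in p.lower() for p in passages[:k] for ans in answers):
            hits += 1
    return hits / len(retrieved)

# e.g. top_k_accuracy(retrieved, gold_answers, k=20) and k=100
# correspond to the two halves of the results table below.
```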
#### Results | | NQ | TriviaQA | WQ | TREC | SQuAD | |:----------------:|:----:|:--------:|:----:|:----:|:-----:| | Top-20 accuracy | 79.4 | 78.8 | 75.0 | 89.1 | 51.6 | | Top-100 accuracy | 86.0 | 84.7 | 82.9 | 93.9 | 67.6 | ## Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906). - **Hardware Type:** 8 32GB GPUs - **Hours used:** Unknown - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details. ## Citation Information ```bibtex @inproceedings{karpukhin-etal-2020-dense, title = "Dense Passage Retrieval for Open-Domain Question Answering", author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau", booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)", month = nov, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2020.emnlp-main.550", doi = "10.18653/v1/2020.emnlp-main.550", pages = "6769--6781", } ``` ## Model Card Authors This model card was written by the team at Hugging Face.
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open", "trivia_qa", "web_questions", "trec"], "inference": false}
null
facebook/dpr-reader-multiset-base
[ "transformers", "pytorch", "tf", "dpr", "en", "dataset:nq_open", "dataset:trivia_qa", "dataset:web_questions", "dataset:trec", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #region-us
'dpr-reader-multiset-base' ========================== Table of Contents ----------------- * Model Details * How To Get Started With the Model * Uses * Risks, Limitations and Biases * Training * Evaluation * Environmental Impact * Technical Specifications * Citation Information * Model Card Authors Model Details ------------- Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-reader-multiset-base' is the reader model trained using the Natural Questions (NQ) dataset, TriviaQA, WebQuestions (WQ), and CuratedTREC (TREC). * Developed by: See GitHub repo for model developers * Model Type: BERT-based encoder * Language(s): English * License: CC-BY-NC-4.0, also see Code of Conduct * Related Models: + 'dpr-question\_encoder-multiset-base' + 'dpr-ctx\_encoder-multiset-base' + 'dpr-question\_encoder-single-nq-base' + 'dpr-reader-single-nq-base' + 'dpr-ctx\_encoder-single-nq-base' * Resources for more information: + Research Paper + GitHub Repo + Hugging Face DPR docs + BERT Base Uncased Model Card How to Get Started with the Model --------------------------------- Use the code below to get started with the model. Uses ---- #### Direct Use 'dpr-reader-multiset-base', 'dpr-question\_encoder-multiset-base', and 'dpr-ctx\_encoder-multiset-base' can be used for the task of open-domain question answering. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model. Risks, Limitations and Biases ----------------------------- CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes. Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. Training -------- #### Training Data This model was trained using the following datasets: * Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019) * TriviaQA (Joshi et al., 2017) * WebQuestions (WQ) (Berant et al., 2013) * CuratedTREC (TREC) (Baudiš & Šedivý, 2015) #### Training Procedure The training procedure is described in the associated paper: > > Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time. > > > > > Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vector and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector. > > > The authors report that for the encoders, they used two independent BERT (Devlin et al., 2019) networks (base, uncased) and used FAISS (Johnson et al., 2017) at inference time to encode and index passages. 
See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives. Evaluation ---------- The following evaluation information is extracted from the associated paper. #### Testing Data, Factors and Metrics The model developers report the performance of the model on five QA datasets, using top-k retrieval accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1. #### Results Environmental Impact -------------------- Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper. * Hardware Type: 8 32GB GPUs * Hours used: Unknown * Cloud Provider: Unknown * Compute Region: Unknown * Carbon Emitted: Unknown Technical Specifications ------------------------ See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details. Model Card Authors ------------------ This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-reader-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-ctx\\_encoder-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #region-us \n", "#### Direct Use\n\n\n'dpr-reader-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-ctx\\_encoder-multiset-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 87, 66, 206, 91, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #dataset-trivia_qa #dataset-web_questions #dataset-trec #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #region-us \n#### Direct Use\n\n\n'dpr-reader-multiset-base', 'dpr-question\\_encoder-multiset-base', and 'dpr-ctx\\_encoder-multiset-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the following datasets:\n\n\n* Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019)\n* TriviaQA (Joshi et al., 2017)\n* WebQuestions (WQ) (Berant et al., 2013)\n* CuratedTREC (TREC) (Baudiš & Šedivý, 2015)" ]
[ -0.07399694621562958, 0.0638347715139389, -0.0041732084937393665, 0.0716833844780922, 0.04122985899448395, -0.014524155296385288, 0.0711633488535881, 0.07417763769626617, 0.06974474340677261, 0.05679267272353172, 0.03525424748659134, 0.07246948778629303, 0.06949234753847122, 0.05305120348930359, -0.0424734391272068, -0.18431951105594635, 0.012464466504752636, -0.012178782373666763, 0.054454464465379715, 0.12158164381980896, 0.10358531773090363, -0.06796502321958542, 0.04875875636935234, 0.011823520064353943, -0.01280171051621437, -0.046476926654577255, -0.008491798304021358, -0.022270286455750465, 0.07167552411556244, 0.0988471582531929, 0.06154439598321915, 0.02139962837100029, 0.0015247061382979155, -0.2576061189174652, 0.04596957191824913, 0.04613377898931503, -0.022460753098130226, 0.0675082877278328, 0.12815237045288086, 0.011569992639124393, 0.1339721828699112, -0.06256499141454697, 0.04064677655696869, 0.09738817811012268, -0.09090307354927063, -0.12052661925554276, -0.10571709275245667, -0.00639856792986393, 0.10773897916078568, 0.08636278659105301, -0.06472307443618774, 0.1474694460630417, -0.09008108824491501, 0.04062237963080406, 0.12609551846981049, -0.1384480744600296, -0.024532485753297806, -0.06611420959234238, -0.012970729731023312, -0.0012071389937773347, -0.08917351812124252, -0.01595180109143257, 0.025377487763762474, 0.04957911744713783, 0.019544586539268494, -0.05840058624744415, 0.026742583140730858, -0.04492105171084404, -0.1495734453201294, -0.02940124087035656, 0.10841228812932968, 0.057318564504384995, -0.07044616341590881, -0.17850038409233093, -0.03218919411301613, 0.19550135731697083, 0.04101130738854408, -0.07617975026369095, -0.024150105193257332, -0.02996569126844406, 0.0773724764585495, -0.047371767461299896, -0.05962188169360161, -0.01639781892299652, -0.10588196665048599, 0.14714424312114716, 0.06404095888137817, 0.019887104630470276, -0.07090136408805847, 0.01988513208925724, -0.018549831584095955, -0.10603439062833786, -0.09037147462368011, -0.10275448113679886, -0.07215169072151184, -0.02345254458487034, -0.032990675419569016, -0.04691525921225548, 0.03967804089188576, 0.1313997060060501, -0.13846932351589203, 0.01720076985657215, -0.06968680769205093, 0.04234069585800171, 0.14195315539836884, 0.015399685129523277, -0.1120801568031311, -0.03415268287062645, 0.04295738413929939, 0.026857838034629822, -0.015131072141230106, -0.02066049724817276, 0.011339494027197361, -0.0662900060415268, -0.01410791277885437, 0.05983392149209976, 0.1361970752477646, 0.06156792491674423, -0.063408762216568, -0.07998757809400558, 0.11136529594659805, -0.10311982035636902, -0.07789172977209091, 0.02597455121576786, -0.08090038597583771, 0.01856779120862484, 0.0011577075347304344, -0.01791379228234291, -0.0634794756770134, -0.008233394473791122, -0.08315001428127289, 0.005548530723899603, -0.054157525300979614, -0.10282790660858154, 0.06941446661949158, 0.017117522656917572, -0.029957514256238937, -0.13256153464317322, -0.14082667231559753, -0.09243569523096085, 0.010209103114902973, -0.049778588116168976, 0.012179169803857803, -0.039644401520490646, -0.057307761162519455, -0.010000732727348804, -0.03377821668982506, -0.05405634641647339, -0.039921313524246216, 0.07995813339948654, -0.0662079006433487, 0.053899962455034256, 0.04741259664297104, 0.016773780807852745, -0.08390665054321289, 0.031069010496139526, -0.11900987476110458, 0.13145199418067932, -0.06774970889091492, -0.08487801998853683, -0.0833115503191948, -0.044532086700201035, 0.01350359246134758, 
0.06828363239765167, 0.01998669095337391, 0.19738395512104034, -0.1433066874742508, 0.01995796337723732, 0.20603151619434357, -0.1233440488576889, -0.13524948060512543, 0.11172038316726685, -0.0856926292181015, 0.09815359860658646, 0.10006525367498398, 0.10902150720357895, 0.09303710609674454, -0.10953962057828903, -0.0852917730808258, -0.04814894124865532, -0.049106888473033905, 0.13766111433506012, 0.053524743765592575, -0.06688810884952545, 0.08166979253292084, -0.03510356694459915, -0.03792552649974823, 0.04364112764596939, -0.03152230381965637, -0.03630626201629639, 0.044856369495391846, -0.03921566903591156, 0.13759848475456238, 0.0061515080742537975, -0.04824366793036461, 0.05921110883355141, -0.10745643079280853, -0.0567593052983284, 0.05989128351211548, -0.013367502018809319, 0.028271116316318512, -0.12804584205150604, -0.00471959263086319, 0.00045545693137682974, 0.04253271594643593, -0.20930519700050354, -0.10812532901763916, -0.03974514082074165, -0.10702212154865265, 0.07611463218927383, 0.08177398145198822, 0.020004449412226677, 0.01526164822280407, -0.06678648293018341, 0.025779128074645996, -0.03941287100315094, -0.0007041560602374375, -0.05186491832137108, -0.15027354657649994, 0.02096611261367798, -0.058261945843696594, 0.005634389817714691, -0.10956957191228867, 0.014439715072512627, 0.0839708149433136, 0.03730732202529907, 0.03272251412272453, -0.01155111100524664, 0.06429602205753326, 0.04471145197749138, 0.005118338391184807, -0.011351915076375008, 0.01295347698032856, -0.05943271145224571, -0.06801638007164001, 0.12210254371166229, -0.16453541815280914, -0.05101349204778671, 0.0639951229095459, -0.08032765984535217, -0.10077951103448868, -0.08881156146526337, -0.02045224979519844, -0.020985499024391174, -0.07234717905521393, -0.05054595693945885, 0.17657725512981415, 0.03583328425884247, 0.017526812851428986, -0.1051817536354065, -0.06631649285554886, -0.020577268674969673, -0.02788388729095459, 0.011366206221282482, 0.05074402689933777, -0.007337402552366257, -0.2079843431711197, 0.07860925793647766, 0.09402628988027573, -0.039349451661109924, 0.1546916663646698, 0.025018909946084023, -0.08727355301380157, -0.03908358886837959, 0.030643703415989876, -0.02787766419351101, 0.09976016730070114, -0.006336496211588383, 0.04117028787732124, 0.04734185338020325, 0.029724981635808945, 0.029563844203948975, -0.06678419560194016, -0.012591544538736343, -0.008091446943581104, -0.0063208164647221565, -0.11173339933156967, 0.04702742397785187, 0.07655677944421768, 0.11700570583343506, 0.035609420388936996, 0.014151477254927158, 0.042739178985357285, -0.07849449664354324, -0.14334654808044434, 0.18293732404708862, -0.07471810281276703, -0.1970885694026947, -0.02172371745109558, 0.028218720108270645, -0.050274066627025604, 0.0025673937052488327, 0.04937122017145157, -0.06576164066791534, -0.04897468909621239, -0.07526814937591553, 0.06692046672105789, 0.014557483606040478, -0.058993395417928696, -0.07920780032873154, 0.026494275778532028, 0.024703295901417732, -0.06829296797513962, 0.028712710365653038, -0.040072403848171234, -0.07197728753089905, 0.04671378433704376, 0.044072460383176804, 0.12079433351755142, 0.052740149199962616, 0.059354301542043686, -0.029591906815767288, -0.055703260004520416, 0.19840258359909058, -0.14327149093151093, 0.028295569121837616, 0.1807638555765152, -0.08745268732309341, 0.08014727383852005, 0.1614057868719101, 0.010856512933969498, -0.06563656777143478, 0.02982623130083084, 0.07898720353841782, -0.034014683216810226, -0.24821285903453827, 
-0.0643482431769371, -0.04501449316740036, -0.03979828953742981, 0.011556614190340042, 0.011809059418737888, 0.05716513469815254, 0.036745429039001465, -0.11185722798109055, -0.0821710005402565, 0.10576197504997253, 0.055870577692985535, 0.16307911276817322, 0.006450964603573084, 0.06159579008817673, 0.004156298004090786, -0.004841155838221312, 0.06703387200832367, -0.06361423432826996, 0.3384149968624115, -0.04927673935890198, 0.03189682960510254, 0.14909346401691437, 0.08719342947006226, 0.007399815134704113, -0.019686702638864517, 0.005231223069131374, 0.03576798364520073, -0.05386227369308472, -0.05616261810064316, -0.037291187793016434, 0.0771515965461731, 0.027493061497807503, -0.07955926656723022, 0.010609953664243221, -0.06629754602909088, 0.05522199347615242, 0.09084153920412064, -0.006145606283098459, -0.05538127198815346, -0.030910158529877663, 0.030672889202833176, -0.08047051727771759, -0.056649886071681976, 0.06360985338687897, 0.10819495469331741, -0.14470279216766357, 0.04942468926310539, -0.01879780739545822, 0.10762225836515427, -0.12676410377025604, 0.004342683590948582, -0.04429249092936516, -0.012596432119607925, -0.002873976482078433, 0.06948580592870712, -0.2162453532218933, 0.19511106610298157, 0.040708038955926895, 0.034394387155771255, -0.11011083424091339, -0.015066353604197502, 0.0671776682138443, -0.05654088780283928, 0.12488041818141937, 0.009740984998643398, 0.059722647070884705, -0.1185876876115799, -0.006558213382959366, 0.031679537147283554, 0.05589842423796654, -0.008648735471069813, 0.07926885783672333, 0.029325785115361214, 0.041850801557302475, -0.031052248552441597, 0.052390966564416885, -0.15587837994098663, -0.13747099041938782, 0.04594457149505615, -0.09791242331266403, 0.053255174309015274, -0.036661241203546524, -0.032760944217443466, 0.015700731426477432, 0.030875038355588913, -0.20356430113315582, -0.1014879122376442, -0.10044655948877335, 0.019119523465633392, 0.07289369404315948, -0.11339215189218521, -0.004486816469579935, 0.06121615320444107, 0.04835709556937218, -0.02848733216524124, -0.07513048499822617, 0.059585459530353546, -0.07227310538291931, -0.13812686502933502, -0.026073850691318512, 0.004852915648370981, 0.18192489445209503, 0.06237853318452835, 0.02354666404426098, 0.02741619199514389, 0.013834317214787006, -0.16126422584056854, -0.009375597350299358, 0.16483670473098755, 0.08651164919137955, 0.07357288151979446, 0.03539233282208443, -0.07833179831504822, -0.12433475255966187, -0.03008747473359108, 0.043637875467538834, 0.21873711049556732, -0.01618216186761856, 0.09896711260080338, 0.16529053449630737, -0.06047035753726959, -0.24410009384155273, -0.0334940068423748, 0.04750613123178482, -0.03917430341243744, 0.1289450079202652, -0.16055069863796234, 0.025260066613554955, 0.03668094426393509, 0.0006715779891237617, 0.05759293958544731, -0.14852891862392426, -0.10121814906597137, 0.1389213651418686, 0.022856010124087334, 0.06310780346393585, -0.09436433762311935, -0.049880579113960266, 0.030399082228541374, -0.015287710353732109, 0.1686461716890335, -0.16519632935523987, 0.02076895534992218, 0.010404844768345356, 0.06022025644779205, 0.042249493300914764, -0.023278864100575447, 0.13636985421180725, 0.03571219742298126, 0.06502547860145569, -0.10573484003543854, -0.011931944638490677, -0.00409617368131876, -0.024812299758195877, 0.055450841784477234, 0.04364527761936188, 0.01912039890885353, -0.1471177190542221, -0.048544880002737045, -0.13699747622013092, 0.009206969290971756, -0.030863454565405846, -0.038968127220869064, 
-0.07161573320627213, 0.0868050828576088, 0.14130817353725433, -0.008676618337631226, 0.06492632627487183, -0.09874384105205536, 0.0385860949754715, 0.1175977885723114, 0.22610221803188324, 0.05039563402533531, -0.09761910885572433, -0.03463747724890709, -0.008464512415230274, 0.1196012943983078, -0.1448776125907898, 0.009804325178265572, 0.06492482125759125, 0.01729649119079113, 0.12720194458961487, 0.015872644260525703, -0.1060909852385521, 0.07606778293848038, -0.0073951613157987595, -0.08074598014354706, -0.1265355795621872, -0.011220538057386875, 0.02248208411037922, -0.1426895260810852, -0.09505228698253632, 0.13139142096042633, 0.008609358221292496, -0.011227238923311234, 0.019075732678174973, 0.08317527174949646, 0.04110373184084892, 0.07389812171459198, 0.07664678245782852, 0.046784866601228714, -0.05813007429242134, 0.008618121035397053, 0.036535076797008514, -0.08042655885219574, 0.08024069666862488, 0.058028195053339005, -0.08010425418615341, -0.07039232552051544, -0.10361047089099884, 0.0024056292604655027, -0.06404494494199753, -0.05141562223434448, 0.011251010932028294, -0.04927970468997955, 0.00032113210181705654, 0.19476668536663055, 0.016806043684482574, 0.03193539381027222, 0.016768205910921097, -0.00509360758587718, 0.00839688815176487, 0.08031641691923141, 0.016484210267663002, -0.023922190070152283, -0.02047114446759224, 0.011274869553744793, 0.01956632174551487, 0.06952708214521408, -0.03526562452316284, -0.028063109144568443, -0.11033248901367188, 0.02854059264063835, -0.2154524177312851, 0.07400389760732651, -0.14610858261585236, 0.019503135234117508, -0.04498233273625374, -0.06202207878232002, -0.015735719352960587, 0.01168195903301239, -0.04424567520618439, 0.003911010455340147, 0.04640780761837959, 0.08618881553411484, -0.159254252910614, -0.02460111863911152, 0.06822343915700912, -0.06526131927967072, 0.0850488692522049, -0.023911749944090843, -0.06928175687789917, 0.03566494956612587, -0.09849817305803299, -0.0028816997073590755, -0.040843769907951355, 0.034649793058633804, 0.01832687482237816, -0.10859428346157074, -0.024668345227837563, -0.009457743726670742, -0.016601305454969406, 0.0433194525539875, 0.013186607509851456, -0.07596703618764877, 0.06476789712905884, 0.022283313795924187, -0.03144199773669243, -0.08506761491298676, 0.013365453109145164, 0.1322907656431198, 0.027175920084118843, 0.10159609466791153, -0.03295588493347168, 0.10790955275297165, -0.13852722942829132, -0.00017218213179148734, 0.059042103588581085, 0.0766899362206459, 0.08670394867658615, -0.046138495206832886, 0.06194179877638817, -0.04538629204034805, 0.13706986606121063, -0.03501102700829506, -0.016650540754199028, 0.0678238570690155, 0.024277914315462112, -0.020053250715136528, -0.0017389304703101516, -0.09558925777673721, -0.030352506786584854, -0.02622511051595211, 0.02461038902401924, 0.01972748339176178, -0.07542246580123901, -0.11510943621397018, 0.16869960725307465, 0.10238572955131531, 0.1221369132399559, -0.01363291498273611, -0.01058128010481596, -0.05877884849905968, 0.03733581304550171, 0.01757032424211502, 0.055140405893325806, -0.05751471966505051, -0.03455077484250069, 0.07318787276744843, 0.13847260177135468, -0.082279272377491, 0.09224944561719894, -0.0010629392927512527, -0.05975077301263809, -0.1184714138507843, -0.15237216651439667, -0.03313955292105675, -0.0011639796430245042, 0.0278102345764637, -0.12489044666290283, 0.044474732130765915, 0.19971849024295807, 0.00020609448256436735, -0.04201842099428177, 0.08851279318332672, 0.020876195281744003, 
-0.14072801172733307, 0.04280046746134758, -0.005657942965626717, 0.04198440536856651, 0.026037855073809624, 0.01944117806851864, 0.053328752517700195, 0.03523842245340347, 0.04123217985033989, 0.06955339759588242, -0.026898067444562912, -0.015687644481658936, -0.14387817680835724, -0.09189485013484955, -0.008066939190030098, 0.08342484384775162, 0.09446171671152115, 0.2495633363723755, 0.06218989938497543, -0.018602564930915833, -0.00154712307266891, 0.15918491780757904, 0.011238064616918564, -0.028715025633573532, -0.14651276171207428, 0.12040804326534271, 0.024551887065172195, -0.006555384956300259, 0.022827252745628357, -0.12241177260875702, 0.04794555902481079, 0.1381683200597763, 0.13711048662662506, -0.10860856622457504, 0.006358685437589884, -0.03622366115450859, 0.01571262627840042, 0.0388537235558033, 0.03385990858078003, 0.01934133842587471, 0.34051796793937683, -0.06410669535398483, 0.049940671771764755, 0.00013400003081187606, 0.0024585635401308537, 0.0185603778809309, 0.060413435101509094, 0.02354261837899685, 0.02389400266110897, -0.11751793324947357, 0.15615151822566986, -0.12390479445457458, -0.14853541553020477, 0.024566100910305977, -0.04355107247829437, -0.08707337081432343, 0.004078900907188654, -0.04577060788869858, 0.024130281060934067, 0.06899947673082352, 0.019101466983556747, 0.003954282961785793, 0.10048690438270569, 0.01871957816183567, -0.04967695474624634, -0.0247748252004385, 0.1030019074678421, -0.07044776529073715, 0.18783770501613617, 0.01354033499956131, 0.18817222118377686, 0.10270354896783829, 0.00308419531211257, -0.09637555480003357, 0.012026709504425526, 0.045931655913591385, -0.07661010324954987, -0.004555385559797287, 0.17936067283153534, 0.015336091630160809, 0.08749828487634659, 0.14296607673168182, -0.06061888486146927, 0.03829304128885269, -0.03380480036139488, -0.07065728306770325, -0.0865563303232193, 0.12231951206922531, -0.08752468228340149, 0.13310123980045319, 0.11460138857364655, -0.04183216392993927, 0.020902900025248528, -0.018554549664258957, -0.01922377571463585, -0.0503111258149147, 0.024702200666069984, -0.03157271072268486, -0.12691837549209595, 0.028211629018187523, 0.042260847985744476, 0.04941437393426895, -0.18858219683170319, -0.005393811967223883, 0.044423822313547134, -0.019302528351545334, 0.019128374755382538, 0.07012487202882767, -0.024095766246318817, -0.02003907971084118, -0.0257242601364851, -0.12036675959825516, -0.010520617477595806, 0.08442435413599014, -0.09021952003240585, -0.0328846238553524 ]
null
null
transformers
# `dpr-reader-single-nq-base` ## Table of Contents - [Model Details](#model-details) - [How To Get Started With the Model](#how-to-get-started-with-the-model) - [Uses](#uses) - [Risks, Limitations and Biases](#risks-limitations-and-biases) - [Training](#training) - [Evaluation](#evaluation-results) - [Environmental Impact](#environmental-impact) - [Technical Specifications](#technical-specifications) - [Citation Information](#citation-information) - [Model Card Authors](#model-card-authors) ## Model Details **Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-reader-single-nq-base` is the reader model trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)). - **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers - **Model Type:** QA Reader Model - **Language(s):** English - **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md) - **Related Models:** - [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base) - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base) - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base) - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base) - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base) - **Resources for more information:** - [Research Paper](https://arxiv.org/abs/2004.04906) - [GitHub Repo](https://github.com/facebookresearch/DPR) - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr) - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased) ## How to Get Started with the Model Use the code below to get started with the model. ```python from transformers import DPRReader, DPRReaderTokenizer tokenizer = DPRReaderTokenizer.from_pretrained("facebook/dpr-reader-single-nq-base") model = DPRReader.from_pretrained("facebook/dpr-reader-single-nq-base") encoded_inputs = tokenizer( questions=["What is love ?"], titles=["Haddaway"], texts=["'What Is Love' is a song recorded by the artist Haddaway"], return_tensors="pt", ) outputs = model(**encoded_inputs) # model predicts start/end positions of answer spans and passage relevance start_logits = outputs.start_logits end_logits = outputs.end_logits relevance_logits = outputs.relevance_logits ``` ## Uses #### Direct Use `dpr-reader-single-nq-base`, [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base), and [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base) can be used together for the task of open-domain question answering. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
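The Get Started snippet above stops at raw logits. As a rough, hypothetical continuation (not from the DPR repo), the answer span can be decoded by taking the most relevant passage and the argmax start/end positions within it; `transformers` also provides `DPRReaderTokenizer.decode_best_spans` for the full multi-passage case.

```python
# Continuing from `encoded_inputs`, `outputs`, and `tokenizer` defined above.
# Pick the passage the reader considers most relevant, then take the argmax
# start/end positions within it and decode that token span. A robust decoder
# would additionally enforce start <= end and a maximum span length.
best_passage = outputs.relevance_logits.argmax().item()
start = outputs.start_logits[best_passage].argmax().item()
end = outputs.end_logits[best_passage].argmax().item()
answer_ids = encoded_inputs["input_ids"][best_passage][start : end + 1]
print(tokenizer.decode(answer_ids, skip_special_tokens=True))
```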
## Risks, Limitations and Biases **CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.** Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al., 2021](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al., 2021](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. ## Training #### Training Data This model was trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)). The model authors write that: > [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators. #### Training Procedure The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf): > Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time. > Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vector and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector. The authors report that for the encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, uncased) and used FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) at inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives. ## Evaluation The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf). #### Testing Data, Factors and Metrics The model developers report the performance of the model on five QA datasets, using top-k retrieval accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad). #### Results | | NQ | TriviaQA | WQ | TREC | SQuAD | |:----------------:|:----:|:--------:|:----:|:----:|:-----:| | Top-20 accuracy | 78.4 | 79.4 | 73.2 | 79.8 | 63.2 | | Top-100 accuracy | 85.4 | 85.0 | 81.4 | 89.1 | 77.2 | ## Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906). 
- **Hardware Type:** 8 32GB GPUs - **Hours used:** Unknown - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details. ## Citation Information ```bibtex @inproceedings{karpukhin-etal-2020-dense, title = "Dense Passage Retrieval for Open-Domain Question Answering", author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau", booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)", month = nov, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2020.emnlp-main.550", doi = "10.18653/v1/2020.emnlp-main.550", pages = "6769--6781", } ``` ## Model Card Authors This model card was written by the team at Hugging Face.
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["dpr"], "datasets": ["nq_open"], "inference": false}
null
facebook/dpr-reader-single-nq-base
[ "transformers", "pytorch", "tf", "dpr", "en", "dataset:nq_open", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2004.04906", "1702.08734", "1910.09700" ]
[ "en" ]
TAGS #transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us
'dpr-reader-single-nq-base' =========================== Table of Contents ----------------- * Model Details * How To Get Started With the Model * Uses * Risks, Limitations and Biases * Training * Evaluation * Environmental Impact * Technical Specifications * Citation Information * Model Card Authors Model Details ------------- Model Description: Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. 'dpr-reader-single-nq-base' is the reader model trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). * Developed by: See GitHub repo for model developers * Model Type: QA Reader Model * Language(s): English * License: CC-BY-NC-4.0, also see Code of Conduct * Related Models: + 'dpr-ctx\_encoder-single-nq-base' + 'dpr-question\_encoder-single-nq-base' + 'dpr-ctx\_encoder-multiset-base' + 'dpr-question\_encoder-multiset-base' + 'dpr-reader-multiset-base' * Resources for more information: + Research Paper + GitHub Repo + Hugging Face DPR docs + BERT Base Uncased Model Card How to Get Started with the Model --------------------------------- Use the code below to get started with the model. Uses ---- #### Direct Use 'dpr-reader-single-nq-base', 'dpr-ctx\_encoder-single-nq-base', and 'dpr-question\_encoder-single-nq-base' can be used for the task of open-domain question answering. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model. Risks, Limitations and Biases ----------------------------- CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes. Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. Training -------- #### Training Data This model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that: > > [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators. > > > #### Training Procedure The training procedure is described in the associated paper: > > Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time. > > > > > Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vector and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector. 
> > > The authors report that for the encoders, they used two independent BERT (Devlin et al., 2019) networks (base, uncased) and used FAISS (Johnson et al., 2017) at inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives. Evaluation ---------- The following evaluation information is extracted from the associated paper. #### Testing Data, Factors and Metrics The model developers report the performance of the model on five QA datasets, using top-k retrieval accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1. #### Results Environmental Impact -------------------- Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper. * Hardware Type: 8 32GB GPUs * Hours used: Unknown * Cloud Provider: Unknown * Compute Region: Unknown * Carbon Emitted: Unknown Technical Specifications ------------------------ See the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details. Model Card Authors ------------------ This model card was written by the team at Hugging Face.
[ "#### Direct Use\n\n\n'dpr-reader-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-question\\_encoder-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ "TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n", "#### Direct Use\n\n\n'dpr-reader-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-question\\_encoder-single-nq-base' can be used for the task of open-domain question answering.", "#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------", "#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>", "#### Training Procedure\n\n\nThe training procedure is described in the associated paper:\n\n\n\n> \n> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.\n> \n> \n> \n\n\n\n> \n> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d- dimensional real-valued vectors and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.\n> \n> \n> \n\n\nThe authors report that for encoders, they used two independent BERT (Devlin et al., 2019) networks (base, un-cased) and use FAISS (Johnson et al., 2017) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.\n\n\nEvaluation\n----------\n\n\nThe following evaluation information is extracted from the associated paper.", "#### Testing Data, Factors and Metrics\n\n\nThe model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were NQ, TriviaQA, WebQuestions (WQ), CuratedTREC (TREC), and SQuAD v1.1.", "#### Results\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
We present the hardware type and based on the associated paper.\n\n\n* Hardware Type: 8 32GB GPUs\n* Hours used: Unknown\n* Cloud Provider: Unknown\n* Compute Region: Unknown\n* Carbon Emitted: Unknown\n\n\nTechnical Specifications\n------------------------\n\n\nSee the associated paper for details on the modeling architecture, objective, compute infrastructure, and training details.\n\n\nModel Card Authors\n------------------\n\n\nThis model card was written by the team at Hugging Face." ]
[ 69, 75, 206, 94, 313, 82, 135 ]
[ "passage: TAGS\n#transformers #pytorch #tf #dpr #en #dataset-nq_open #arxiv-2004.04906 #arxiv-1702.08734 #arxiv-1910.09700 #license-cc-by-nc-4.0 #has_space #region-us \n#### Direct Use\n\n\n'dpr-reader-single-nq-base', 'dpr-ctx\\_encoder-single-nq-base', and 'dpr-question\\_encoder-single-nq-base' can be used for the task of open-domain question answering.#### Misuse and Out-of-scope Use\n\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nCONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propogate historical and current stereotypes.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al., 2021 and Bender et al., 2021). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.\n\n\nTraining\n--------#### Training Data\n\n\nThis model was trained using the Natural Questions (NQ) dataset (Lee et al., 2019; Kwiatkowski et al., 2019). The model authors write that:\n\n\n\n> \n> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.\n> \n> \n>" ]
[ -0.06737291067838669, 0.09479616582393646, -0.003263421356678009, 0.061017267405986786, 0.061339862644672394, -0.01662256196141243, 0.07664076238870621, 0.07296371459960938, 0.1252620369195938, 0.060668472200632095, 0.04759601876139641, 0.046794235706329346, 0.042827315628528595, 0.07342306524515152, -0.03739260882139206, -0.17129580676555634, 0.04025930166244507, -0.011277362704277039, 0.042342882603406906, 0.11391867697238922, 0.1073613166809082, -0.05830804631114006, 0.05425593629479408, 0.03711966797709465, -0.018555602058768272, -0.027039553970098495, -0.008860793896019459, -0.03701377660036087, 0.11202897131443024, 0.08064299076795578, 0.0766327902674675, 0.024312088266015053, 0.014715013094246387, -0.23883141577243805, 0.04026404395699501, 0.040803082287311554, -0.007551396731287241, 0.06938977539539337, 0.06954080611467361, -0.0002458130184095353, 0.11407005786895752, -0.07446497678756714, 0.07557012140750885, 0.07369235903024673, -0.0723663717508316, -0.17404548823833466, -0.09890730679035187, 0.006088246125727892, 0.04508110508322716, 0.11332760006189346, -0.056364987045526505, 0.15551625192165375, -0.07883822172880173, 0.014576851390302181, 0.1887773722410202, -0.15839175879955292, 0.0083804652094841, -0.0068641421385109425, -0.010747994296252728, 0.010770701803267002, -0.05786618962883949, -0.005078967195004225, 0.03292131796479225, 0.06425735354423523, 0.02210668846964836, -0.03615989163517952, 0.06012898311018944, -0.049041569232940674, -0.13713906705379486, -0.04406211897730827, 0.14771215617656708, 0.0674365907907486, -0.09055929630994797, -0.14389534294605255, -0.03198336809873581, 0.17718720436096191, 0.05190015584230423, -0.10893350094556808, -0.004519830457866192, -0.05811002478003502, 0.07930201292037964, -0.038886137306690216, -0.07240008562803268, -0.028794461861252785, -0.10128863900899887, 0.19723495841026306, 0.051580991595983505, 0.03194799646735191, -0.05231737345457077, 0.06413687765598297, -0.0030566093046218157, -0.08270697295665741, -0.09391038119792938, -0.09631548821926117, -0.06686299294233322, -0.021253863349556923, -0.021356167271733284, 0.0028118600603193045, 0.024716902524232864, 0.14930741488933563, -0.17864611744880676, 0.004442519973963499, 0.021225133910775185, 0.024144243448972702, 0.13491035997867584, 0.04724643751978874, -0.09211385250091553, 0.0060818991623818874, 0.08152462542057037, 0.04462279751896858, 0.014404489658772945, -0.026890335604548454, -0.008099285885691643, -0.09027273952960968, 0.04255664348602295, 0.06581506878137589, 0.08978310972452164, 0.06258543580770493, -0.08718562871217728, -0.06174517422914505, 0.11718201637268066, -0.11084585636854172, -0.0651848092675209, 0.010693172924220562, -0.11616289615631104, -0.03488263860344887, 0.031092366203665733, -0.0034530984703451395, -0.06588628143072128, 0.030048485845327377, -0.09699364006519318, -0.03732353448867798, -0.05609080567955971, -0.12044277042150497, 0.06755485385656357, 0.008983339183032513, 0.003251201007515192, -0.1530359536409378, -0.19582651555538177, -0.0794147402048111, 0.00021005475718993694, -0.03903535380959511, -0.03988221660256386, -0.03049793839454651, -0.03550693020224571, -0.020314868539571762, -0.05145605653524399, -0.026224961504340172, -0.021282244473695755, 0.05973530560731888, -0.05338888615369797, 0.028403516858816147, 0.057222895324230194, 0.02155354432761669, -0.11916303634643555, 0.04788944497704506, -0.15187779068946838, 0.11807053536176682, -0.05570129305124283, -0.07750727981328964, -0.10964634269475937, -0.08421097695827484, 
0.008946427144110203, 0.07577560842037201, 0.04955131560564041, 0.2129458636045456, -0.11040622740983963, 0.015003046952188015, 0.10862021893262863, -0.1222471073269844, -0.1262466013431549, 0.115470752120018, -0.09205242246389389, 0.10358230769634247, 0.0933322086930275, 0.13105514645576477, 0.03668098896741867, -0.04672893509268761, -0.08070402592420578, -0.05162491276860237, -0.062063880264759064, 0.14295433461666107, 0.034715913236141205, -0.06843031197786331, 0.03343241661787033, -0.04264340177178383, -0.07864173501729965, 0.011507636867463589, -0.025872167199850082, -0.02269165590405464, 0.032914627343416214, -0.019682366400957108, 0.11303554475307465, -0.006218438036739826, -0.04313588887453079, 0.014924958348274231, -0.1292598843574524, -0.066481813788414, 0.06614077091217041, -0.027841717004776, 0.003966413903981447, -0.13004323840141296, 0.024197235703468323, -0.001995981205254793, 0.02503286488354206, -0.1950427144765854, -0.14790646731853485, -0.010395178571343422, -0.20482216775417328, 0.08740483224391937, 0.09642774611711502, 0.019341131672263145, -0.013395292684435844, -0.03896193578839302, 0.006310012191534042, -0.07645993679761887, 0.0015749408630654216, -0.0527932308614254, -0.135117307305336, 0.03256787732243538, -0.03301697596907616, 0.02503439597785473, -0.17522433400154114, -0.0011836810735985637, 0.07983957976102829, 0.040410127490758896, 0.015422300435602665, -0.022938892245292664, 0.0515989325940609, 0.011969641782343388, 0.0025619466323405504, -0.031834784895181656, 0.006308354903012514, -0.0414336659014225, -0.10726886987686157, 0.13314399123191833, -0.16975291073322296, -0.049962129443883896, 0.06072389334440231, -0.05558482185006142, -0.10530252754688263, -0.04168292507529259, -0.010326695628464222, -0.024676183238625526, -0.06580585241317749, -0.04507848247885704, 0.24101734161376953, 0.02355097606778145, 0.022724414244294167, -0.10274630039930344, -0.05137547105550766, -0.032770272344350815, -0.02355724573135376, -0.017741799354553223, 0.04121757298707962, -0.013133772648870945, -0.2306526005268097, 0.07576087862253189, 0.025287633761763573, -0.058581043034791946, 0.10597668588161469, 0.044244296848773956, -0.058335863053798676, -0.024016883224248886, -0.022240135818719864, -0.04610073193907738, 0.12437856942415237, -0.017128152772784233, 0.05505097657442093, 0.05398206785321236, 0.05777382105588913, 0.02539326436817646, -0.09041164070367813, -0.02095148339867592, 0.009689279831945896, -0.011155971325933933, -0.12531013786792755, 0.04513811320066452, 0.042409248650074005, 0.12957122921943665, 0.04115210846066475, -0.0043784151785075665, 0.05047186091542244, -0.06811700016260147, -0.17001335322856903, 0.2019427865743637, -0.0429665744304657, -0.21664129197597504, -0.00812511332333088, 0.07777249813079834, -0.0061853802762925625, -0.0020720548927783966, 0.03465523198246956, -0.0531754307448864, -0.06863736361265182, -0.09178539365530014, 0.076228067278862, -0.00891412328928709, -0.0831042006611824, -0.042425841093063354, -0.02113877609372139, 0.03497614711523056, -0.09607676416635513, 0.009149594232439995, -0.03583209961652756, -0.0221396517008543, 0.046202678233385086, 0.0260076392441988, 0.13271699845790863, 0.0845685750246048, 0.009528051130473614, -0.052656445652246475, -0.04061877727508545, 0.22155693173408508, -0.15110662579536438, 0.03228480741381645, 0.17313548922538757, -0.06873930990695953, 0.07217462360858917, 0.14930643141269684, 0.010915850289165974, -0.06741491705179214, 0.0379563570022583, 0.07467135041952133, -0.032834410667419434, 
-0.28060445189476013, -0.06508760154247284, -0.007512298878282309, -0.052272360771894455, 0.02121519111096859, 0.025552047416567802, 0.07253061980009079, 0.07673374563455582, -0.10882777720689774, -0.030720651149749756, 0.07288454473018646, 0.07649576663970947, 0.16109156608581543, 0.02611699514091015, 0.0511704683303833, -0.004334538709372282, -0.00668700784444809, 0.05658255144953728, -0.025742182508111, 0.3225015103816986, -0.07504183799028397, 0.09530915319919586, 0.12289100140333176, 0.0222200658172369, 0.04781148210167885, -0.0062796263955533504, 0.01975366845726967, 0.03340166062116623, -0.05415266752243042, -0.020370393991470337, -0.02859903685748577, 0.07771612703800201, -0.012646359391510487, -0.04110727459192276, -0.01620054803788662, -0.0443711020052433, 0.04576545208692551, 0.09218402951955795, 0.0062872800044715405, -0.10671929270029068, -0.07359223812818527, 0.0386061929166317, -0.07446838915348053, -0.06076226010918617, 0.05737035349011421, 0.09903963655233383, -0.16747066378593445, 0.049075979739427567, -0.017823169007897377, 0.08610652387142181, -0.1363748461008072, 0.0011141704162582755, -0.04239455237984657, 0.04748927801847458, -0.028660818934440613, 0.1057414636015892, -0.26712900400161743, 0.19153474271297455, 0.010156548582017422, 0.0467323400080204, -0.10508435219526291, -0.040843866765499115, 0.04807993397116661, -0.07312525808811188, 0.13628967106342316, 0.013164767064154148, -0.007294453680515289, -0.06344103813171387, 0.021089330315589905, 0.05073096975684166, 0.01125758420675993, -0.005686311516910791, 0.10206161439418793, 0.025606760755181313, 0.05886456370353699, -0.011971332132816315, 0.016064731404185295, -0.15071319043636322, -0.11512664705514908, 0.04078540578484535, -0.08059479296207428, 0.015171247534453869, -0.06329323351383209, -0.047123320400714874, -0.0018592396518215537, 0.055256474763154984, -0.12388195842504501, -0.10316826403141022, -0.07330716401338577, 0.05970286577939987, 0.06492311507463455, -0.09988284111022949, 0.022438576444983482, 0.05474746972322464, 0.06491472572088242, -0.019670486450195312, -0.05203350633382797, 0.05583212152123451, -0.07337602227926254, -0.1518450677394867, -0.03695150837302208, 0.017636457458138466, 0.17391061782836914, 0.07327872514724731, 0.01578708551824093, 0.012599443085491657, -0.003678625449538231, -0.1624525636434555, 0.035652004182338715, 0.1075698509812355, 0.058286599814891815, 0.10984749346971512, 0.03886553272604942, -0.0222451351583004, -0.1377129852771759, -0.06478426605463028, 0.07341214269399643, 0.2211119681596756, -0.04166262596845627, 0.03830433636903763, 0.15361501276493073, -0.0627254843711853, -0.21980775892734528, 0.022422075271606445, 0.041373543441295624, -0.015179632231593132, 0.09430497139692307, -0.14244595170021057, -0.0028772905934602022, 0.04976252093911171, -0.00846171472221613, 0.0896332636475563, -0.1896573156118393, -0.10971968621015549, 0.14970622956752777, 0.07373549789190292, 0.06701052933931351, -0.09089498966932297, -0.026125749573111534, 0.011960099451243877, -0.054357849061489105, 0.15183791518211365, -0.09664232283830643, 0.014874488115310669, 0.013318960554897785, 0.11360161751508713, 0.05053265765309334, -0.03030332550406456, 0.16477952897548676, 0.009595761075615883, 0.08226488530635834, -0.1048271656036377, -0.05842060595750809, -0.036160796880722046, -0.027910994365811348, 0.10892008244991302, 0.05306040868163109, 0.03345609828829765, -0.06045536324381828, -0.06195133179426193, -0.1341368854045868, 0.030480865389108658, -0.03880600258708, -0.04932139068841934, 
-0.09423041343688965, 0.09499251842498779, 0.10625382512807846, -0.006237120367586613, 0.019568925723433495, -0.10168593376874924, 0.008054578676819801, 0.02900308556854725, 0.25107502937316895, 0.11327117681503296, -0.11451132595539093, 0.014279777184128761, -0.013819427229464054, 0.1054840236902237, -0.1415545791387558, 0.008315106853842735, 0.09547223150730133, 0.031227638944983482, 0.13223478198051453, 0.012663458473980427, -0.16419613361358643, 0.053621258586645126, -0.022117793560028076, -0.09102899581193924, -0.19681327044963837, -0.006852142512798309, 0.0018130001844838262, -0.12377369403839111, -0.08717188984155655, 0.11307787150144577, -0.023423563688993454, -0.012901725247502327, 0.0071356273256242275, 0.060446493327617645, 0.03127500042319298, 0.0903392881155014, 0.08905352652072906, 0.06181943416595459, -0.058235254138708115, 0.06967482715845108, 0.06278306990861893, -0.07144873589277267, 0.10154945403337479, 0.03429865464568138, -0.07679761946201324, -0.056297868490219116, -0.14663434028625488, 0.05235818400979042, -0.1091575101017952, -0.07661669701337814, -0.04054838791489601, -0.09161041676998138, -0.013790784403681755, 0.1605452001094818, 0.015152343548834324, 0.026891354471445084, 0.02237691543996334, -0.020312940701842308, -0.03903929516673088, 0.09394286572933197, 0.030571890994906425, -0.0001512345188530162, -0.08094175159931183, -0.0022476273588836193, 0.04892420768737793, 0.06313718110322952, -0.03003213182091713, -0.0004381026665214449, -0.11461115628480911, 0.019746704027056694, -0.23094047605991364, 0.045788612216711044, -0.09132831543684006, 0.02325296215713024, -0.022028714418411255, -0.05344659462571144, -0.013403773307800293, 0.030877990648150444, -0.048665259033441544, 0.0020172460936009884, 0.05391673371195793, 0.0457281768321991, -0.14364685118198395, -0.06870078295469284, 0.05977046862244606, -0.06101016327738762, 0.08786410838365555, 0.0003663359966594726, -0.06359618902206421, 0.010712452232837677, -0.15157188475131989, 0.042869310826063156, -0.015477380715310574, 0.023931337520480156, 0.017631104215979576, -0.17711380124092102, -0.021358612924814224, -0.01122484914958477, -0.036817777901887894, 0.02178967371582985, -0.022963207215070724, -0.061604518443346024, 0.027869468554854393, 0.0545521043241024, -0.02027156576514244, -0.10790304094552994, 0.02716400846838951, 0.1039426177740097, 0.012264741584658623, 0.11541187763214111, -0.023467613384127617, 0.10061653703451157, -0.13275869190692902, -0.008382542990148067, 0.05118703842163086, 0.06924968957901001, 0.05438036844134331, -0.004918316844850779, 0.05914149805903435, -0.045564115047454834, 0.1824706494808197, -0.04811658337712288, -0.017032932490110397, 0.0636478066444397, 0.004523442592471838, -0.009291735477745533, -0.00325256260111928, -0.10279720276594162, -0.07958752661943436, -0.033402297645807266, 0.026038838550448418, 0.017541946843266487, -0.05777200683951378, -0.08492662012577057, 0.20778551697731018, 0.08195256441831589, 0.1180083379149437, 0.0061248959973454475, -0.012457405216991901, -0.10064743459224701, -0.006553593557327986, -0.0032771341502666473, 0.06887435168027878, -0.06195622310042381, -0.017380423843860626, 0.06108754128217697, 0.1581963151693344, -0.08485853672027588, 0.10132300853729248, -0.05401167273521423, -0.06678139418363571, -0.12002500146627426, -0.17593277990818024, -0.04684990644454956, -0.002206007018685341, 0.029704734683036804, -0.12381619215011597, -0.0043435352854430676, 0.1770518571138382, -0.01195169985294342, -0.08252111822366714, 0.10055093467235565, 
0.00045839190715923905, -0.1251920908689499, 0.02212054654955864, 0.007028818130493164, 0.03307865187525749, 0.057475727051496506, 0.04273247718811035, 0.05379636958241463, 0.023187968879938126, 0.0475897453725338, 0.057818301022052765, -0.02635212428867817, 0.01740219257771969, -0.07817179709672928, -0.07539166510105133, -0.009325983002781868, 0.08114229887723923, 0.09970617294311523, 0.24252404272556305, 0.06631767749786377, -0.008798016235232353, -0.0022060328628867865, 0.16955050826072693, 0.03980545327067375, 0.016187535598874092, -0.10822274535894394, 0.2052800953388214, 0.001683444599620998, -0.023194923996925354, 0.06830734014511108, -0.13547997176647186, 0.0684620589017868, 0.09987779706716537, 0.12834803760051727, -0.11694610863924026, -0.005354776047170162, -0.04484068602323532, 0.007611649576574564, 0.031011806800961494, 0.01955980435013771, 0.06797154247760773, 0.3323182463645935, -0.09125380963087082, 0.070433609187603, -0.011420524679124355, 0.05463296175003052, 0.023828690871596336, 0.12297271192073822, 0.04249938949942589, 0.042003411799669266, -0.1421770602464676, 0.08828287571668625, -0.09008133411407471, -0.21290013194084167, -0.0029269580263644457, -0.048323776572942734, -0.06887542456388474, -0.001029669540002942, -0.10946466028690338, -0.01205026637762785, 0.08364854007959366, 0.006796971894800663, 0.0005077742389403284, 0.06419553607702255, 0.021024147048592567, -0.07003235071897507, -0.003694275626912713, 0.11805607378482819, 0.01696597784757614, 0.2337733656167984, 0.007288891822099686, 0.21777112782001495, 0.10095388442277908, 0.0025484394282102585, -0.10433445870876312, 0.05005061626434326, 0.03711044788360596, -0.026547791436314583, 0.023106059059500694, 0.15887148678302765, 0.030540402978658676, 0.05519673973321915, 0.14912836253643036, -0.05638110637664795, 0.05631038919091225, -0.01365554891526699, -0.07878296822309494, -0.10605575144290924, 0.09721855819225311, -0.10422445833683014, 0.14151839911937714, 0.1299675703048706, -0.03622325882315636, 0.011922871693968773, -0.019584480673074722, 0.006749908905476332, -0.026791106909513474, 0.06605847924947739, 0.014241533353924751, -0.09883493930101395, 0.0732228234410286, 0.04123475402593613, 0.0405002161860466, -0.21726162731647491, -0.011637798510491848, 0.0250560212880373, -0.04874667152762413, 0.027464453130960464, 0.04362470656633377, -0.02721101976931095, 0.02095450647175312, -0.05299419164657593, -0.07524073123931885, -0.031825218349695206, 0.11236166954040527, -0.09774916619062424, -0.06020963191986084 ]
null
null
transformers
This repository has been deprecated and will be deleted shortly. All ESM models have been moved to their official names to match their naming at the original FAIR repo. You can now find the ESM-1b model at [facebook/esm1b_t33_650M_UR50S](https://huggingface.co/facebook/esm1b_t33_650M_UR50S).
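A minimal sketch of loading the checkpoint from its new name, assuming the standard `transformers` fill-mask classes (the card is tagged `fill-mask`):

```python
# Sketch: load ESM-1b from its new official repository name.
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/esm1b_t33_650M_UR50S")
model = AutoModelForMaskedLM.from_pretrained("facebook/esm1b_t33_650M_UR50S")

# Score a masked residue in an (illustrative) protein sequence.
inputs = tokenizer("MKTAYIAKQR<mask>ISFVKSHFSRQ", return_tensors="pt")
outputs = model(**inputs)
```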
{}
fill-mask
facebook/esm-1b
[ "transformers", "pytorch", "safetensors", "esm", "fill-mask", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[]
TAGS #transformers #pytorch #safetensors #esm #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us
This repository has been deprecated and will be deleted shortly. All ESM models have been moved to their official names to match their naming at the original FAIR repo. You can now find the ESM-1b model at facebook/esm1b_t33_650M_UR50S.
[]
[ "TAGS\n#transformers #pytorch #safetensors #esm #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ 46 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #esm #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ -0.043642301112413406, -0.03013250045478344, -0.006823097355663776, 0.003061715280637145, 0.0805453211069107, 0.01982920989394188, 0.10830280929803848, 0.06326580792665482, 0.09412148594856262, 0.031024718657135963, 0.16816860437393188, 0.14751504361629486, -0.05312659591436386, 0.2209298461675644, -0.053474217653274536, -0.21465204656124115, 0.10998006910085678, 0.01796245016157627, -0.07033541798591614, 0.09996090084314346, 0.08840698003768921, -0.11062948405742645, 0.058762725442647934, -0.03137993440032005, -0.06483380496501923, 0.05808717757463455, 0.029272548854351044, -0.12345153093338013, 0.13443328440189362, -0.0056047020480036736, 0.2505776286125183, 0.044583678245544434, -0.041668057441711426, -0.07736703008413315, 0.04640438035130501, 0.0293926689773798, -0.08842974901199341, 0.04389824718236923, -0.002956528915092349, -0.06581582874059677, -0.03694220259785652, -0.038697708398103714, 0.0474344827234745, 0.010638464242219925, -0.12113869935274124, -0.17723874747753143, -0.014957120642066002, 0.0033442885614931583, 0.03995855897665024, 0.07727870345115662, 0.008564517833292484, 0.23210765421390533, -0.13045501708984375, 0.08653043955564499, 0.1669541746377945, -0.28771519660949707, 0.017722919583320618, 0.10939525812864304, 0.11955919116735458, -0.013448398560285568, -0.032207418233156204, 0.059761542826890945, 0.035076290369033813, 0.020015016198158264, 0.1314399391412735, -0.03958774730563164, -0.059680450707674026, 0.019234968349337578, -0.10109799355268478, -0.06630916893482208, 0.16161638498306274, -0.04912515729665756, 0.06085073575377464, -0.04346011206507683, -0.1510874629020691, -0.04489557445049286, 0.02844373509287834, -0.0216229110956192, -0.003518914571031928, 0.005188747774809599, 0.013156392611563206, 0.004197500646114349, -0.1551821082830429, 0.0291384719312191, -0.21367177367210388, 0.24423286318778992, 0.00984447356313467, 0.06979316473007202, -0.15206900238990784, 0.045358557254076004, 0.018588600680232048, -0.14607682824134827, 0.07145899534225464, -0.09252811223268509, 0.034117866307497025, 0.020055091008543968, -0.06674163043498993, -0.03438473865389824, 0.08499454706907272, 0.14273661375045776, -0.040260303765535355, 0.016857828944921494, 0.09502604603767395, 0.11183662712574005, 0.041295647621154785, 0.06737970560789108, 0.0023398203775286674, -0.03747139871120453, 0.029836172237992287, -0.043534573167562485, 0.06645239144563675, -0.06177336350083351, -0.10487110167741776, -0.025480307638645172, 0.041791778057813644, 0.07564473897218704, 0.04876026138663292, 0.03458031266927719, -0.06143646314740181, 0.06232096999883652, 0.13262206315994263, -0.08331860601902008, 0.02286098152399063, -0.034412942826747894, 0.07666905224323273, 0.035802267491817474, 0.04086433723568916, -0.014829643070697784, 0.021907636895775795, 0.11938393861055374, -0.09432466328144073, -0.027284497395157814, -0.07279269397258759, -0.10892824828624725, 0.04183078184723854, -0.1186394914984703, 0.03195517882704735, -0.21592126786708832, -0.11722976714372635, 0.03427543491125107, 0.04060038551688194, 0.026354383677244186, -0.016378801316022873, 0.08066374063491821, -0.04998515173792839, 0.04135448485612869, -0.058729492127895355, -0.06613479554653168, -0.06491629779338837, 0.06826934963464737, -0.014903940260410309, 0.13414359092712402, -0.11325645446777344, 0.024018356576561928, -0.09441434592008591, 0.016358518972992897, -0.16432781517505646, -0.05922446399927139, -0.04892241954803467, 0.12733906507492065, 0.03692173212766647, -0.04965103045105934, -0.12719543278217316, 
0.07220586389303207, -0.009139124304056168, 0.1429440975189209, -0.12378254532814026, -0.07317215204238892, 0.18751122057437897, -0.1426561325788498, -0.1363702416419983, 0.08827339112758636, 0.005092463921755552, -0.05231265723705292, 0.006220370065420866, 0.14536504447460175, -0.03042079694569111, -0.13195903599262238, -0.011216022074222565, 0.12586118280887604, -0.10697987675666809, -0.09317868947982788, 0.03530168533325195, 0.02274332568049431, -0.06736239045858383, 0.007783695589751005, 0.0651874989271164, 0.08561043441295624, -0.06369218975305557, -0.05101276561617851, -0.05187295377254486, -0.04933018237352371, 0.15602359175682068, 0.01586224138736725, 0.08502792567014694, -0.10348384082317352, -0.07156310230493546, -0.004645455162972212, 0.018573902547359467, 0.05763547495007515, 0.01564006321132183, -0.07460478693246841, 0.13731159269809723, -0.052768535912036896, -0.01592261530458927, -0.14647766947746277, -0.16028647124767303, -0.0272325798869133, 0.006984163075685501, -0.049308765679597855, 0.15146024525165558, 0.10218241065740585, 0.011742009781301022, -0.0025014588609337807, -0.06294532120227814, 0.06361322104930878, 0.06987671554088593, -0.04272077605128288, -0.09532733261585236, -0.008674701675772667, -0.10130368173122406, -0.019707990810275078, 0.010360862128436565, 0.029970476403832436, -0.003076584544032812, 0.15366552770137787, -0.002611281583085656, 0.051248714327812195, -0.05182434991002083, 0.03011898510158062, -0.052898116409778595, -0.014533297158777714, 0.028751201927661896, -0.015416484326124191, -0.05149787291884422, 0.16668321192264557, -0.21201251447200775, 0.40837231278419495, 0.19893568754196167, -0.23316840827465057, -0.02583618462085724, 0.09697271138429642, -0.022916307672858238, 0.03046153485774994, -0.006340933032333851, -0.046803269535303116, -0.04590128734707832, -0.03688405826687813, 0.11766812205314636, -0.024064233526587486, -0.02881530672311783, 0.010368602350354195, -0.08563172072172165, -0.07459431886672974, 0.011795511469244957, 0.02487342432141304, -0.08745710551738739, 0.20822399854660034, 0.3348996043205261, -0.033334020525217056, 0.14547955989837646, 0.02245112508535385, 0.009178988635540009, -0.006431070156395435, -0.06945251673460007, -0.0685843750834465, 0.09990447759628296, -0.14816850423812866, -0.02956831455230713, 0.07385404407978058, -0.0017328762914985418, 0.06207244470715523, -0.1365213543176651, -0.047026000916957855, 0.045012298971414566, 0.06859994679689407, -0.07997436076402664, 0.1323734074831009, 0.04016641154885292, 0.09153741598129272, -0.054885637015104294, -0.08770188689231873, 0.061242952942848206, 0.0018668079283088446, -0.020052483305335045, 0.15162350237369537, -0.14713361859321594, -0.3543034493923187, -0.09756357222795486, -0.039428144693374634, 0.03752598538994789, 0.05142885074019432, 0.09625044465065002, -0.0841512605547905, -0.07031526416540146, -0.006028356496244669, -0.015489139594137669, -0.005036971531808376, 0.07544729113578796, -0.027200262993574142, 0.009104789234697819, -0.00869408156722784, -0.08701207488775253, -0.08331874758005142, -0.020057369023561478, -0.0325341522693634, 0.1525181233882904, 0.019920015707612038, 0.0775279551744461, 0.13513058423995972, -0.025580519810318947, 0.008840195834636688, -0.013474646024405956, 0.18343491852283478, -0.0705091655254364, 0.02523023635149002, 0.20147916674613953, -0.015647467225790024, 0.08612135797739029, 0.16780683398246765, 0.01933259889483452, -0.024343274533748627, 0.01723255217075348, -0.052818939089775085, -0.12892794609069824, -0.13528579473495483, 
-0.13610799610614777, -0.10030128806829453, -0.046545952558517456, -0.0035286960192024708, 0.07193537801504135, 0.11058435589075089, 0.0871368944644928, 0.024061016738414764, -0.06397849321365356, -0.08050662279129028, 0.027686666697263718, 0.1020621508359909, -0.0431671105325222, 0.1545427143573761, -0.04298581928014755, -0.15163949131965637, 0.04698676988482475, 0.0267562884837389, 0.08607311546802521, 0.0759240910410881, -0.08172854781150818, 0.06315944343805313, 0.17015598714351654, 0.13215935230255127, 0.15407676994800568, 0.05164576694369316, -0.0758204534649849, -0.019351497292518616, -0.012577343732118607, -0.020341427996754646, 0.05736600235104561, 0.10501256585121155, -0.08174745738506317, -0.03996555507183075, -0.09836604446172714, 0.09885665029287338, 0.06526927649974823, 0.06956770271062851, -0.2519178092479706, 0.014507900923490524, 0.0985606387257576, 0.006231577601283789, -0.045285653322935104, 0.03707535192370415, 0.08279310166835785, -0.05835643783211708, 0.06144290789961815, 0.0018552066758275032, 0.046280667185783386, 0.090080127120018, 0.08623048663139343, -0.08532246947288513, -0.07416000217199326, 0.020596714690327644, 0.047007232904434204, -0.2070169597864151, 0.2710329592227936, -0.03237498924136162, -0.05360366031527519, -0.07168518751859665, 0.006744648329913616, 0.05740239471197128, 0.13457165658473969, 0.10784163326025009, 0.04278756305575371, -0.13114218413829803, -0.13526751101016998, -0.009461917914450169, 0.01641683094203472, 0.07756827026605606, -0.013987284153699875, 0.01456611417233944, -0.0055303024128079414, -0.022279098629951477, 0.0317370742559433, 0.14752468466758728, -0.0635056272149086, -0.1149090975522995, 0.058640968054533005, 0.09735085815191269, -0.030295100063085556, -0.05583559349179268, -0.07320047914981842, -0.1707705855369568, 0.13330303132534027, -0.0171657744795084, -0.04303878918290138, -0.10454429686069489, -0.13250237703323364, 0.11454177647829056, -0.07577738910913467, 0.1112908273935318, -0.05645390972495079, 0.047815609723329544, -0.09697901457548141, -0.13340890407562256, 0.1488640159368515, -0.1591416746377945, -0.03211286664009094, -0.0746094211935997, 0.0972999855875969, -0.08928657323122025, 0.03668045625090599, 0.006482196971774101, 0.06578436493873596, -0.12516078352928162, -0.0636199563741684, 0.04014908894896507, -0.023902980610728264, 0.06541892886161804, 0.09066714346408844, -0.03225290775299072, -0.10920942574739456, 0.07889333367347717, 0.03408951684832573, 0.21002906560897827, 0.26082777976989746, -0.09343400597572327, 0.11305878311395645, 0.14102894067764282, 0.006072137504816055, -0.36282479763031006, -0.11007369309663773, -0.15379981696605682, 0.006244524847716093, 0.099392369389534, -0.029530396685004234, 0.08434149622917175, 0.002106210682541132, -0.08253441005945206, 0.0751098096370697, -0.15316283702850342, -0.09680082648992538, 0.22183969616889954, -0.015837766230106354, 0.3998285233974457, -0.1331380009651184, -0.021767916157841682, -0.04307515546679497, -0.05589123070240021, 0.0528496690094471, -0.07364941388368607, 0.07122008502483368, -0.0014427334535866976, 0.03678469732403755, 0.022287148982286453, -0.1000664159655571, 0.15853366255760193, -0.07865598797798157, 0.04721783474087715, -0.10450118035078049, -0.08013001084327698, 0.09508173167705536, -0.04951982572674751, 0.018669646233320236, -0.04563366249203682, 0.021239180117845535, -0.026077251881361008, 0.003606613492593169, -0.09838663041591644, 0.12366408109664917, 0.024317605420947075, -0.06464788317680359, 0.013035384006798267, 
-0.02005554735660553, 0.005399294663220644, -0.014439880847930908, 0.21772032976150513, 0.007948295213282108, 0.22821860015392303, 0.15182098746299744, -0.011748245917260647, -0.17342036962509155, -0.06036209687590599, 0.021781114861369133, -0.07139517366886139, 0.0938425064086914, -0.028926489874720573, 0.05886557698249817, 0.07061130553483963, -0.04459583759307861, 0.052682600915431976, 0.12644091248512268, 0.013274271041154861, -0.026699764654040337, 0.19509921967983246, -0.21227282285690308, -0.018059061840176582, 0.0046843779273331165, 0.03998492658138275, 0.06009431183338165, 0.07704531401395798, 0.1067391037940979, 0.01195308193564415, -0.023552248254418373, -0.033398497849702835, 0.020235195755958557, -0.06450481712818146, 0.07124070078134537, 0.06223756819963455, 0.08680540323257446, -0.0959678590297699, 0.009598999284207821, -0.012328080832958221, -0.17138999700546265, -0.008892915211617947, 0.054568812251091, -0.08227872848510742, -0.1373257040977478, -0.018631167709827423, 0.03296653553843498, -0.027207978069782257, -0.05206775665283203, -0.06422097235918045, -0.13818928599357605, 0.018810100853443146, 0.19557584822177887, 0.11817676573991776, 0.06465620547533035, 0.029552921652793884, 0.00881088338792324, 0.016664929687976837, 0.001835440518334508, 0.025033634155988693, 0.06047804653644562, -0.14378421008586884, 0.052960820496082306, 0.002512021455913782, 0.1307031512260437, -0.11702483147382736, 0.0066963802091777325, -0.14443063735961914, -0.0019338970305398107, -0.05823521316051483, -0.07166478037834167, -0.08892138302326202, -0.09655796736478806, 0.03817190229892731, -0.07262048870325089, -0.054202236235141754, -0.020882872864603996, -0.12236124277114868, 0.0219370499253273, 0.0270992498844862, 0.008099844679236412, -0.08876486867666245, -0.05911225453019142, 0.10255252569913864, -0.051653314381837845, 0.07531791925430298, 0.09608102589845657, -0.07956192642450333, 0.04690317064523697, -0.12765194475650787, -0.11460716277360916, 0.13117218017578125, 0.018342746421694756, 0.07574187964200974, 0.04860767722129822, -0.0030073660891503096, 0.04567644000053406, 0.03830963745713234, 0.0217631533741951, 0.04786121845245361, -0.08347699791193008, 0.080022431910038, 0.005492134019732475, -0.15874236822128296, -0.0010286483447998762, -0.1376686841249466, 0.10416898876428604, -0.03964206948876381, 0.1324930191040039, -0.03218935430049896, 0.058270689100027084, -0.08637861907482147, 0.03585420921444893, -0.041532550007104874, -0.15119214355945587, -0.0258746687322855, -0.002548916731029749, 0.03641224652528763, -0.022806845605373383, 0.29270094633102417, 0.035733506083488464, -0.023823313415050507, 0.04862778261303902, 0.07235411554574966, 0.03927871584892273, 0.012387467548251152, 0.11296198517084122, 0.0308120995759964, -0.05412563681602478, -0.10127945244312286, 0.0477970615029335, 0.03976769745349884, -0.151611790060997, 0.11590678989887238, 0.09298865497112274, 0.05865323916077614, 0.12639035284519196, 0.011871766299009323, -0.0025993180461227894, -0.13646413385868073, -0.24904605746269226, -0.04958206042647362, 0.044728972017765045, 0.0008354434394277632, -0.05075550451874733, 0.16986772418022156, -0.008265286684036255, 0.043704938143491745, -0.05655280500650406, 0.008409898728132248, -0.18555858731269836, -0.06727148592472076, -0.0728139579296112, -0.08882451057434082, -0.007627834100276232, -0.043479904532432556, -0.02310246229171753, 0.14941132068634033, 0.015285682864487171, 0.006215864792466164, 0.1741798222064972, 0.009960155934095383, 0.016495075076818466, 
-0.01571916975080967, 0.02510957419872284, 0.032971106469631195, 0.0024365587159991264, -0.03828815370798111, -0.13993288576602936, 0.01382170245051384, -0.06035218760371208, -0.005855437368154526, -0.11999771744012833, 0.02659377083182335, -0.08137024939060211, -0.10373072326183319, -0.07986771315336227, 0.014667367562651634, -0.05721710994839668, 0.08155760169029236, -0.01458114292472601, 0.040871284902095795, 0.0057990639470517635, 0.1694776713848114, -0.10213681310415268, -0.10923578590154648, -0.020169349387288094, 0.13921600580215454, 0.03855922818183899, 0.11751306802034378, -0.025659486651420593, 0.002264380222186446, -0.14328457415103912, 0.19941596686840057, 0.35756736993789673, -0.050093717873096466, 0.1078292578458786, 0.011542282067239285, 0.02641836181282997, 0.008516930043697357, 0.08247781544923782, 0.11059200018644333, 0.2900775074958801, -0.0915914922952652, -0.0012488322099670768, -0.06192365288734436, -0.010910283774137497, -0.1207585409283638, -0.017564505338668823, 0.008877793326973915, -0.021049365401268005, -0.05073197931051254, 0.041101280599832535, -0.1277959942817688, 0.07453709840774536, 0.07188096642494202, -0.22501707077026367, -0.03787766024470329, 0.035278335213661194, 0.25806865096092224, -0.020213507115840912, 0.11116326600313187, -0.038216862827539444, -0.09017948061227798, 0.012479323893785477, 0.002915996825322509, -0.14657336473464966, -0.028195425868034363, 0.08467691391706467, -0.019795244559645653, 0.13296571373939514, -0.027151014655828476, 0.043928246945142746, 0.08956289291381836, 0.049294400960206985, -0.035540398210287094, 0.09622900187969208, 0.02365650236606598, -0.10426110029220581, -0.047245677560567856, 0.005749685689806938, -0.014344369992613792, -0.1017456129193306, 0.04163627699017525, -0.22055265307426453, 0.03908073529601097, -0.04398807883262634, -0.029739275574684143, 0.01570737548172474, 0.023786840960383415, -0.024361025542020798, 0.04941626638174057, 0.010924851521849632, 0.0018262371886521578, -0.02021193318068981, -0.00800576712936163, 0.005466653034090996, 0.03862202912569046, -0.07964567840099335, -0.13155625760555267, -0.11648355424404144, -0.0015458023408427835, 0.03849586844444275, 0.007929718121886253, -0.09836500883102417, -0.06525382399559021, -0.09903603792190552, 0.015649868175387383, -0.13589069247245789, 0.010565188713371754, 0.09660414606332779, 0.03538167104125023, 0.0029676491394639015, -0.04172071814537048, 0.0209024827927351, 0.06788071244955063, -0.15380223095417023, -0.09671609848737717 ]
null
null
fairseq
# fastspeech2-en-200_speaker-cv4

[FastSpeech 2](https://arxiv.org/abs/2006.04558) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- English
- 200 male/female voices (random speaker when using the widget)
- Trained on [Common Voice v4](https://commonvoice.mozilla.org/en/datasets)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

# Load the model ensemble, its config, and the TTS task from the Hugging Face Hub
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/fastspeech2-en-200_speaker-cv4",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
generator = task.build_generator([model], cfg)  # build_generator expects a list of models

text = "Hello, this is a test run."

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
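Since the checkpoint bundles 200 voices, a specific one can be pinned rather than sampled at random. This is a hedged sketch, not official usage: it assumes the sample dict produced above carries the speaker id as a 1x1 long tensor under the "speaker" key, as in fairseq's multi-speaker TTS code path.

```python
# Sketch: synthesize with a fixed voice instead of a random one.
# Assumes `task`, `model`, `generator`, and `text` from the snippet above.
import torch

sample = TTSHubInterface.get_model_input(task, text)
sample["speaker"] = torch.LongTensor([[7]])  # hypothetical voice index in [0, 199]
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
```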
{"language": "en", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech", "multi-speaker"], "datasets": ["common_voice"], "task": "text-to-speech", "widget": [{"text": "Hello, this is a test run.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/fastspeech2-en-200_speaker-cv4
[ "fairseq", "audio", "text-to-speech", "multi-speaker", "en", "dataset:common_voice", "arxiv:2006.04558", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2006.04558", "2109.06912" ]
[ "en" ]
TAGS #fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us
# fastspeech2-en-200_speaker-cv4 FastSpeech 2 text-to-speech model from fairseq S^2 (paper/code): - English - 200 male/female voices (random speaker when using the widget) - Trained on Common Voice v4 ## Usage See also fairseq S^2 example.
[ "# fastspeech2-en-200_speaker-cv4\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us \n", "# fastspeech2-en-200_speaker-cv4\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 59, 66, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us \n# fastspeech2-en-200_speaker-cv4\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.0985691174864769, 0.023448171094059944, -0.0035697855055332184, -0.0381009541451931, 0.057796403765678406, -0.027762509882450104, 0.11115121841430664, 0.04737888649106026, -0.0013030405389145017, 0.019499532878398895, 0.045408107340335846, 0.049116455018520355, 0.052353385835886, 0.023730341345071793, -0.05917252227663994, -0.1872798651456833, 0.061193931847810745, -0.06018956005573273, 0.09235987812280655, 0.023671019822359085, 0.11743652075529099, -0.03660329431295395, -0.0008525529410690069, 0.05960456654429436, -0.045127104967832565, 0.021702153608202934, 0.0650402307510376, -0.0858878344297409, 0.09698012471199036, 0.06341629475355148, 0.020370256155729294, 0.07580424845218658, 0.04508539289236069, -0.17155593633651733, 0.02724919281899929, -0.025270063430070877, 0.10297659784555435, 0.02150566503405571, 0.002795495791360736, 0.0690799430012703, -0.05608499422669411, 0.08217109739780426, -0.024196306243538857, 0.07284180819988251, -0.0681876614689827, -0.18898560106754303, -0.009577051736414433, -0.1691906750202179, -0.04745059832930565, 0.011281492188572884, -0.05767957493662834, 0.03304142504930496, -0.1641106754541397, 0.02715986967086792, 0.1465696394443512, -0.32726478576660156, 0.001502367202192545, -0.07201444357633591, 0.04294683039188385, 0.054836686700582504, -0.07836800068616867, 0.01357205305248499, 0.05183960124850273, 0.026883801445364952, -0.1203136071562767, -0.07330206781625748, -0.18935392796993256, 0.005693752784281969, -0.10757704079151154, 0.12083801627159119, 0.2583053708076477, 0.06667619943618774, -0.04353329539299011, -0.14411184191703796, -0.027732905000448227, 0.20987974107265472, -0.04568041115999222, -0.052091676741838455, -0.005040891468524933, 0.04837533086538315, -0.0051956577226519585, -0.04541268199682236, -0.12785255908966064, -0.07465048879384995, -0.06643996387720108, 0.18459121882915497, -0.009630469605326653, 0.04278808459639549, -0.049626875668764114, -0.04261530563235283, -0.1300724595785141, -0.04699688404798508, 0.011178894899785519, -0.07792635262012482, -0.02075851894915104, 0.00856036227196455, -0.005477237515151501, -0.13548356294631958, 0.17194123566150665, -0.16348819434642792, -0.09739962220191956, 0.06654829531908035, -0.07534348219633102, 0.06786719709634781, 0.05426296964287758, -0.09540332853794098, 0.000496664026286453, -0.05622827634215355, 0.000708423089236021, -0.011036131531000137, 0.04671979323029518, -0.042242713272571564, -0.09464823454618454, -0.060924775898456573, -0.1207728385925293, 0.054988134652376175, -0.05945593863725662, -0.0011892052134498954, -0.08429142832756042, -0.003922571893781424, 0.06475646793842316, -0.03886694833636284, -0.020130198448896408, 0.09233258664608002, -0.018879961222410202, 0.08533196151256561, -0.07902491092681885, 0.09669248014688492, -0.03402433171868324, -0.13249868154525757, 0.001960460562258959, 0.009885061532258987, 0.038701657205820084, -0.10756109654903412, 0.024009259417653084, 0.03187796100974083, -0.011938736774027348, -0.09982944279909134, -0.06031734123826027, -0.0694475919008255, -0.09123989939689636, -0.004063067492097616, -0.09598588198423386, -0.15938782691955566, -0.05042249709367752, 0.06633518636226654, -0.04988851770758629, -0.021094754338264465, -0.10170963406562805, 0.056519828736782074, -0.06920147687196732, 0.1010441780090332, -0.06024092435836792, 0.024318262934684753, -0.013562634587287903, -0.05154941603541374, -0.08388931304216385, 0.15157124400138855, -0.03195006772875786, -0.08676495403051376, -0.033894218504428864, -0.02848636917769909, 
-0.11388502269983292, 0.08815553039312363, -0.016421062871813774, 0.2033604234457016, -0.2180432379245758, -0.05196807533502579, 0.11020638048648834, -0.061590440571308136, -0.023563126102089882, 0.13735252618789673, 0.023421375080943108, 0.10185865312814713, 0.07743725180625916, 0.3244386911392212, 0.07128937542438507, -0.16924385726451874, 0.00015375747170764953, 0.13658611476421356, -0.06141086295247078, 0.031236737966537476, 0.058717239648103714, -0.09398239105939865, 0.07983817160129547, -0.016617851331830025, 0.21191386878490448, 0.03273067995905876, -0.07641475647687912, -0.0127723328769207, 0.04654755815863609, -0.08955755084753036, 0.09618693590164185, -0.0403478741645813, -0.003994618076831102, -0.012549721635878086, 0.0036221046466380358, -0.022883515805006027, 0.10847976058721542, -0.06797003746032715, 0.04696888476610184, -0.1556272953748703, 0.0726931244134903, -0.02760004810988903, -0.009672342799603939, -0.14196990430355072, 0.028916366398334503, -0.024993358179926872, 0.11845669895410538, 0.09195894747972488, 0.20001095533370972, 0.048749249428510666, -0.009595942683517933, -0.05234925076365471, 0.03667265921831131, 0.18217681348323822, 0.05318084731698036, -0.05668431520462036, -0.19915542006492615, 0.09983894973993301, -0.07541478425264359, 0.1165488064289093, -0.1528955101966858, -0.03169509768486023, 0.15942274034023285, 0.06306218355894089, 0.04524854198098183, 0.002689506160095334, 0.0891580879688263, 0.017327044159173965, 0.028452450409531593, 0.026796424761414528, 0.038148168474435806, 0.027428574860095978, -0.055725716054439545, 0.24001508951187134, -0.10979588329792023, 0.06165193021297455, 0.06067361310124397, -0.11948411911725998, -0.06714003533124924, 0.06794149428606033, 0.02185925655066967, 0.004368143621832132, -0.0642862617969513, -0.07622414082288742, 0.15914730727672577, -0.06719805300235748, 0.10055280476808548, -0.07871811836957932, 0.06591396033763885, 0.012314999476075172, -0.06905677914619446, 0.045908816158771515, 0.10382572561502457, -0.07212331146001816, -0.10203434526920319, 0.0041609229519963264, 0.05762752518057823, -0.006465970538556576, 0.23537540435791016, -0.010542793199419975, -0.02083759382367134, -0.0016517684562131763, 0.01045804563909769, -0.058965347707271576, 0.04428250715136528, -0.143326073884964, -0.030319906771183014, 0.04893447086215019, 0.05721672251820564, 0.06957527995109558, -0.09959044307470322, -0.019025025889277458, -0.04859977960586548, -0.08067064732313156, -0.11412382870912552, 0.08189546316862106, 0.05372560769319534, 0.10749724507331848, -0.03734392672777176, -0.14203470945358276, 0.06557022035121918, -0.08046863228082657, -0.13492658734321594, 0.06502771377563477, -0.10868266224861145, -0.08360981196165085, -0.0644301176071167, -0.041703272610902786, 0.06063416600227356, 0.03593074530363083, 0.11070636659860611, -0.06387005001306534, -0.01782407984137535, -0.05721215531229973, 0.056020475924015045, -0.0038411396089941263, 0.01286544930189848, -0.021534476429224014, -0.0440310575067997, 0.03692547604441643, -0.08589205145835876, 0.014150271192193031, -0.04336690530180931, 0.06687929481267929, 0.01580040156841278, -0.04086173325777054, 0.013885327614843845, 0.18459133803844452, 0.11058152467012405, -0.05199096351861954, -0.013113981112837791, 0.23530824482440948, -0.10045964270830154, -0.030765870586037636, 0.13139626383781433, -0.0016528774285688996, 0.014142868109047413, 0.16425292193889618, 0.07459092885255814, -0.013020292855799198, -0.0003593026485759765, -0.01772751472890377, -0.017290301620960236, 
-0.1724691390991211, -0.08539987355470657, -0.07785619795322418, -0.007707118056714535, -0.19066187739372253, -0.013435893692076206, -0.12443222105503082, -0.051834944635629654, 0.010480678640305996, -0.08860400319099426, 0.09889210760593414, -0.033241596072912216, 0.23394805192947388, -0.09214423596858978, 0.08085742592811584, -0.06821353733539581, -0.08134020864963531, 0.11970499902963638, -0.10325250774621964, 0.06426502764225006, 0.05332217738032341, 0.1164473444223404, 0.012063142843544483, 0.05404895916581154, 0.13920623064041138, 0.0013310002395883203, 0.06547302007675171, -0.026474103331565857, -0.04185175523161888, -0.06861039996147156, 0.028982115909457207, 0.06157172843813896, 0.2286355048418045, -0.1017044335603714, 0.016040438786149025, -0.002623951993882656, 0.04694496467709541, 0.13520842790603638, 0.21752017736434937, -0.04037326201796532, 0.009377818554639816, 0.03667989745736122, -0.11057572066783905, -0.02767058089375496, 0.11197851598262787, 0.2595822513103485, -0.06117887422442436, 0.08112327009439468, 0.13808311522006989, 0.010408044792711735, -0.04491035267710686, 0.07457457482814789, -0.1685274988412857, 0.008836737833917141, -0.03571261465549469, 0.04414166510105133, -0.08751595765352249, 0.08363834768533707, 0.05636128410696983, 0.0477171465754509, -0.049187492579221725, -0.023686343804001808, 0.037530556321144104, 0.0074178497307002544, 0.08748596906661987, -0.014644729904830456, -0.042429957538843155, -0.08250097185373306, -0.016543472185730934, 0.014378798194229603, 0.13901977241039276, 0.16197624802589417, -0.02035663276910782, 0.045175790786743164, -0.004210804123431444, 0.033588021993637085, -0.11022245138883591, -0.12148644775152206, -0.03577245771884918, 0.012517686933279037, 0.24328316748142242, 0.037663038820028305, 0.003960924688726664, -0.03566773980855942, -0.12692216038703918, 0.01714983955025673, -0.1432752162218094, -0.019970085471868515, -0.030772706493735313, -0.1351841539144516, 0.09361661970615387, -0.010210121050477028, -0.048679858446121216, 0.04197124019265175, 0.06880971044301987, -0.001645029871724546, 0.023724615573883057, 0.08957140147686005, -0.03267397731542587, -0.093940868973732, -0.07065097242593765, 0.19483764469623566, 0.022651307284832, 0.09058574587106705, 0.04309595003724098, -0.008960138075053692, -0.02411257103085518, -0.07038138061761856, 0.07887047529220581, -0.05398530513048172, -0.11723769456148148, 0.06783939152956009, 0.041728559881448746, -0.0775180533528328, -0.021588219329714775, -0.06517267227172852, 0.23242920637130737, 0.2354511171579361, -0.03916211798787117, 0.11809435486793518, 0.2021460384130478, -0.010823930613696575, -0.23453791439533234, -0.083929643034935, 0.0488775335252285, 0.05180570110678673, 0.006399599369615316, -0.16415190696716309, 0.09109055250883102, 0.003502126317471266, -0.03761468082666397, 0.05133047327399254, -0.1405346393585205, -0.11374320089817047, 0.1414717733860016, -0.1462365984916687, 0.2927209436893463, -0.02813781052827835, -0.09319696575403214, -0.054456062614917755, 0.010430693626403809, 0.08100775629281998, -0.2482936531305313, 0.12363305687904358, 0.08277580142021179, 0.09235367178916931, 0.052888453006744385, 0.014887521043419838, 0.14296060800552368, 0.0010719961719587445, -0.019243977963924408, -0.0061924150213599205, -0.01919230818748474, -0.1045716255903244, 0.0606573149561882, 0.034581057727336884, -0.02107302099466324, 0.0010794267291203141, -0.10457877069711685, -0.031260304152965546, -0.0665179193019867, 0.013398803770542145, 0.07110041379928589, 0.03415529057383537, 
-0.061068855226039886, -0.07494426518678665, -0.014219389297068119, 0.018038788810372353, 0.16028569638729095, -0.14625439047813416, 0.07500244677066803, 0.137335866689682, 0.23876261711120605, -0.19394128024578094, 0.06866773217916489, -0.011313370428979397, -0.09426996111869812, 0.08813352882862091, -0.11683794856071472, 0.0317891500890255, 0.06551884114742279, 0.04223575443029404, 0.12471625953912735, 0.026683656498789787, -0.08265244215726852, 0.13058197498321533, 0.05505744367837906, -0.10840509831905365, -0.17606034874916077, -0.030491739511489868, 0.019619060680270195, -0.051377881318330765, -0.025977177545428276, 0.19077849388122559, 0.0386841744184494, 0.051793284714221954, -0.01846001110970974, 0.016149470582604408, -0.10937477648258209, 0.19349728524684906, 0.09115089476108551, 0.0451030507683754, -0.0939706414937973, 0.04227631539106369, 0.004498718306422234, 0.03947903960943222, -0.023990871384739876, 0.026911385357379913, -0.038142360746860504, -0.04388677328824997, -0.1228046789765358, 0.0704958364367485, 0.08230216056108475, -0.12015283852815628, -0.06998533755540848, -0.16958880424499512, 0.0027664725203067064, 0.19722002744674683, -0.011283084750175476, 0.07854075729846954, -0.007982447743415833, -0.03906669095158577, 0.0422852598130703, -0.01004260778427124, -0.00840462651103735, -0.05541443079710007, -0.08377110213041306, 0.11467532068490982, -0.04995191842317581, 0.07833006232976913, -0.06932318210601807, -0.05896713584661484, -0.07959399372339249, 0.050355397164821625, -0.10570169240236282, 0.03827586770057678, -0.07189512252807617, 0.01697452925145626, 0.017162060365080833, -0.021444914862513542, -0.02287394553422928, -0.00029656250262632966, -0.08117075264453888, 0.07196776568889618, 0.05198028311133385, 0.10166725516319275, -0.05932294949889183, 0.026551757007837296, 0.007037012837827206, -0.02693263627588749, 0.11442571133375168, 0.10571987926959991, -0.08915357291698456, 0.0841916874051094, -0.17448671162128448, -0.09367150813341141, 0.1260259598493576, 0.058671850711107254, -0.0173711609095335, 0.014289326034486294, -0.09358671307563782, 0.0677252933382988, 0.057438209652900696, -0.0017916652141138911, 0.035230398178100586, 0.009800937958061695, 0.12352217733860016, -0.11405014991760254, -0.05367310345172882, -0.08763886988162994, -0.0013749119825661182, 0.16046825051307678, 0.08108372986316681, 0.13689492642879486, -0.07360410690307617, 0.015018888749182224, 0.0019216154469177127, 0.07603532820940018, -0.015536329708993435, -0.06930235028266907, 0.03212388604879379, 0.003262836253270507, 0.07679429650306702, -0.08535166084766388, 0.10449805110692978, -0.12530648708343506, -0.06804556399583817, 0.0035636562388390303, 0.0008257575100287795, -0.05315367877483368, 0.039825137704610825, -0.050459347665309906, 0.10060685127973557, -0.04130574315786362, -0.08522582054138184, 0.019317157566547394, 0.0541289821267128, 0.2570737600326538, -0.00803740881383419, 0.0009144984069280326, 0.06846635043621063, 0.08582256734371185, 0.05717390775680542, -0.05962822586297989, 0.05181824788451195, 0.2079465687274933, -0.0574348159134388, -0.028333431109786034, -0.10645993053913116, 0.10527648031711578, 0.05190223827958107, 0.013128699734807014, -0.03761506453156471, -0.07374158501625061, -0.0572502426803112, -0.20345841348171234, 0.04780097305774689, -0.06114424020051956, -0.05777789652347565, 0.004020026884973049, -0.08463053405284882, 0.11560896039009094, 0.1635461151599884, 0.03563622757792473, 0.036690954118967056, 0.03552614152431488, -0.12052640318870544, 
-0.09522733837366104, 0.08021008223295212, -0.08167833089828491, 0.01228339970111847, -0.06506038457155228, -0.06151553988456726, 0.15603762865066528, 0.005659808404743671, -0.013205847702920437, 0.017224766314029694, -0.05225563794374466, -0.06035007908940315, -0.16519123315811157, -0.050483908504247665, -0.012190929614007473, 0.042023204267024994, 0.0748744010925293, 0.07037289440631866, 0.13030338287353516, -0.03750065341591835, 0.014716861769557, 0.11885170638561249, -0.056752171367406845, -0.15647311508655548, -0.1332295536994934, -0.02600196935236454, -0.08791288733482361, 0.089493528008461, -0.07525292783975601, -0.12996810674667358, -0.05578901618719101, 0.12160530686378479, 0.1936430037021637, -0.18465927243232727, 0.07574165612459183, -0.021278420463204384, -0.011098998598754406, -0.0003287196159362793, -0.05503948777914047, 0.04415104538202286, 0.16048221290111542, 0.02100275456905365, -0.04681526869535446, -0.04680423438549042, 0.023909030482172966, -0.04302484914660454, 0.1071784645318985, -0.05537824332714081, -0.009579610079526901, -0.053697507828474045, 0.10643373429775238, -0.1924418807029724, -0.12483826279640198, -0.19313807785511017, -0.079503633081913, -0.05056357756257057, 0.008859162218868732, -0.07804365456104279, 0.11324430257081985, 0.010228058323264122, -0.07525256276130676, -0.021970320492982864, -0.04892444610595703, -0.03663967177271843, -0.08047525584697723, 0.029591897502541542, 0.022384170442819595, -0.2470487654209137, -0.04203587770462036, 0.015845077112317085, 0.232784241437912, 0.03078397735953331, 0.07315277308225632, 0.010284092277288437, 0.1389378011226654, 0.023138852789998055, -0.12584786117076874, -0.01652812771499157, 0.11426956206560135, -0.029483981430530548, 0.2771137058734894, 0.08350782096385956, -0.11092348396778107, 0.06861485540866852, 0.1282966434955597, -0.13249100744724274, -0.04052300751209259, -0.0017583295702934265, -0.05547308921813965, 0.04784903675317764, -0.002490486018359661, -0.013749166391789913, -0.03137696161866188, 0.0589941181242466, -0.029010986909270287, -0.028327148407697678, -0.11094774305820465, -0.10918477177619934, -0.24435077607631683, 0.0001407736708642915, -0.046958986669778824, 0.013340430334210396, -0.08111848682165146, -0.019099365919828415, -0.06888444721698761, 0.05506312474608421, 0.050458431243896484, 0.05646933615207672, 0.09560158848762512, -0.03532923758029938, 0.004147582221776247, -0.03815995156764984, 0.07105223834514618, 0.13547246158123016, -0.0019038185710087419, -0.08732222765684128 ]
null
null
fairseq
# fastspeech2-en-ljspeech

[FastSpeech 2](https://arxiv.org/abs/2006.04558) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- English
- Single-speaker female voice
- Trained on [LJSpeech](https://keithito.com/LJ-Speech-Dataset/)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

# Load the model ensemble, its config, and the TTS task from the Hugging Face Hub
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/fastspeech2-en-ljspeech",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
generator = task.build_generator([model], cfg)  # build_generator expects a list of models

text = "Hello, this is a test run."

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/ljspeech_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
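To keep the audio beyond the notebook session, the waveform can be written to disk. A minimal sketch, assuming the `soundfile` package is installed and that `wav` and `rate` come from the snippet above:

```python
# Sketch: persist the synthesized waveform to a WAV file
# (requires `pip install soundfile`).
import numpy as np
import soundfile as sf

sf.write("test_run.wav", np.asarray(wav), rate)
```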
{"language": "en", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["ljspeech"], "task": "text-to-speech", "widget": [{"text": "Hello, this is a test run.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/fastspeech2-en-ljspeech
[ "fairseq", "audio", "text-to-speech", "en", "dataset:ljspeech", "arxiv:2006.04558", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2006.04558", "2109.06912" ]
[ "en" ]
TAGS #fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us
# fastspeech2-en-ljspeech FastSpeech 2 text-to-speech model from fairseq S^2 (paper/code): - English - Single-speaker female voice - Trained on LJSpeech ## Usage See also fairseq S^2 example.
[ "# fastspeech2-en-ljspeech\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us \n", "# fastspeech2-en-ljspeech\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 51, 52, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-2006.04558 #arxiv-2109.06912 #has_space #region-us \n# fastspeech2-en-ljspeech\n\nFastSpeech 2 text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.08861356228590012, 0.0584113635122776, -0.00599257554858923, -0.004323999863117933, 0.04171498119831085, -0.06038970127701759, 0.08856984972953796, 0.11319301277399063, -0.002036206191405654, -0.017862003296613693, 0.02016598731279373, 0.11396358907222748, 0.01010851003229618, 0.03332218900322914, -0.09027767181396484, -0.18067380785942078, 0.04168883338570595, -0.029444674029946327, -0.0026190646458417177, 0.02612590603530407, 0.14433170855045319, -0.029146689921617508, 0.008112075738608837, 0.032833024859428406, 0.0003615287714637816, 0.02953392267227173, 0.019603140652179718, -0.07788638770580292, 0.05456214398145676, 0.05759495869278908, -0.0014856174821034074, 0.045364636927843094, 0.0342293381690979, -0.18458791077136993, 0.026424122974276543, -0.03071237914264202, 0.0891367718577385, 0.020800378173589706, -0.07815729826688766, 0.0216413214802742, -0.028696829453110695, -0.08093930780887604, -0.0368846096098423, 0.009646783582866192, -0.06339100748300552, -0.20014026761054993, 0.01932557485997677, -0.18522778153419495, 0.010234071873128414, -0.000494328560307622, -0.03878089413046837, 0.02247309684753418, -0.18264885246753693, 0.05568835511803627, 0.21533875167369843, -0.29413095116615295, -0.0010290354257449508, -0.10510539263486862, 0.0415206141769886, 0.058253951370716095, -0.05627106502652168, 0.04486320540308952, 0.024781260639429092, -0.02845960296690464, -0.1401539444923401, -0.09761542081832886, -0.1892394721508026, 0.070014089345932, -0.12437192350625992, 0.11826831847429276, 0.314788818359375, 0.0148568544536829, -0.05215459689497948, -0.13311435282230377, -0.025805430486798286, 0.2264927178621292, -0.05699111148715019, -0.07997439801692963, 0.05179794877767563, 0.03329085931181908, 0.02190510928630829, -0.09924738854169846, -0.0985705703496933, -0.0549468994140625, -0.025388624519109726, 0.24854469299316406, 0.007253175135701895, 0.04789538308978081, -0.07905827462673187, -0.03220197558403015, -0.07630570232868195, -0.05160797759890556, 0.00973561406135559, -0.07186431437730789, 0.015070414170622826, 0.05502694472670555, 0.020782217383384705, -0.16143548488616943, 0.14802774786949158, -0.18681928515434265, -0.09715267270803452, 0.03166700899600983, 0.002932058647274971, 0.0910014659166336, 0.08061935007572174, -0.08865755051374435, 0.05065252259373665, -0.06262281537055969, 0.01797071099281311, 0.0012504858896136284, 0.06721284240484238, -0.027371007949113846, -0.07781345397233963, -0.08877163380384445, -0.18497325479984283, 0.07883325219154358, -0.022247837856411934, -0.029019445180892944, -0.10767407715320587, 0.00012254653847776353, 0.0625835433602333, -0.05352015420794487, 0.020927995443344116, 0.0480634942650795, -0.0544300451874733, 0.04798032343387604, -0.0070314169861376286, 0.10672920197248459, -0.02738480642437935, -0.15210673213005066, -0.0019050192786380649, 0.040925294160842896, 0.010414935648441315, -0.0931738018989563, 0.04053831472992897, 0.07566434890031815, 0.017896493896842003, -0.1197812631726265, -0.16779907047748566, -0.06571625918149948, -0.07179804146289825, -0.02076321840286255, -0.09182340651750565, -0.1762349158525467, -0.08065515011548996, 0.07695986330509186, -0.045663923025131226, -0.0633152425289154, -0.0973639190196991, 0.07975669205188751, -0.09999075531959534, 0.098670095205307, -0.076876699924469, 0.0386815145611763, -0.04551292955875397, -0.059218719601631165, -0.11944390088319778, 0.15894557535648346, -0.03345302864909172, -0.07400016486644745, 0.006245525553822517, -0.03153140842914581, -0.1272355318069458, 0.09074412286281586, 
0.004375865682959557, 0.214582160115242, -0.2564849257469177, -0.0363972969353199, 0.021922685205936432, -0.056590285152196884, -0.012223335914313793, 0.12461075931787491, 0.03319806978106499, 0.15115566551685333, 0.0672856792807579, 0.366942822933197, 0.15760990977287292, -0.1902935951948166, 0.0048782904632389545, 0.1699700653553009, -0.052854642271995544, 0.0034015472047030926, 0.07596345245838165, -0.04577219858765602, 0.11647665500640869, -0.01014937274158001, 0.12511958181858063, 0.04229170083999634, -0.04997635260224342, -0.0343078188598156, 0.023958317935466766, -0.08916239440441132, 0.11085177958011627, -0.007191403768956661, -0.02092757448554039, -0.017436329275369644, -0.024899518117308617, 0.06499559432268143, 0.11496379971504211, -0.05608191713690758, 0.0703631192445755, -0.12727048993110657, 0.06868283450603485, 0.012149387039244175, -0.0018530123634263873, -0.11756739020347595, 0.013248001225292683, -0.05311603099107742, 0.035944174975156784, 0.12012620270252228, 0.22187405824661255, 0.034745953977108, -0.02048550732433796, -0.013274182565510273, 0.06881989538669586, 0.1176261156797409, 0.036347486078739166, -0.08941536396741867, -0.1990913450717926, 0.12071824073791504, -0.06459015607833862, 0.11467315256595612, -0.1567309945821762, -0.02394483983516693, 0.13585011661052704, 0.013265501707792282, 0.06723221391439438, 0.015972474589943886, 0.11194390058517456, 0.025858169421553612, 0.061146482825279236, 0.04018514230847359, 0.039865534752607346, 0.024409160017967224, -0.020258309319615364, 0.2532656788825989, -0.07501956820487976, 0.08957355469465256, 0.02461133897304535, -0.10914777219295502, -0.005853652022778988, 0.08021148294210434, 0.0013455268926918507, 0.010753046721220016, -0.06546138226985931, -0.030933598056435585, 0.14260698854923248, -0.09821195900440216, 0.09251783788204193, -0.09029306471347809, 0.000056884011428337544, -0.03278672695159912, -0.12003026902675629, 0.02274281531572342, 0.11751113086938858, -0.0851951465010643, -0.05915108695626259, -0.019968871027231216, 0.15306562185287476, -0.06918692588806152, 0.3066916763782501, 0.005348552018404007, -0.0048631117679178715, -0.004555211402475834, 0.004170512314885855, -0.10774485021829605, 0.05071592330932617, -0.13028521835803986, -0.0483555942773819, 0.05346013233065605, 0.04761645942926407, 0.0785285010933876, -0.10594034940004349, -0.03912682086229324, -0.03905363380908966, -0.10574866086244583, -0.12872616946697235, 0.10611677914857864, 0.05952160805463791, 0.09964584559202194, -0.04760190472006798, -0.09523910284042358, 0.03480931371450424, -0.08565898984670639, -0.10268215090036392, 0.04872749373316765, -0.13301116228103638, -0.08798492699861526, -0.06482414156198502, 0.048124250024557114, 0.08529645949602127, 0.01999717205762863, 0.13189281523227692, -0.05185011401772499, -0.029439367353916168, -0.028345361351966858, 0.025190766900777817, 0.032121382653713226, -0.007561000529676676, -0.023614827543497086, -0.012667858973145485, 0.016092227771878242, -0.12458094209432602, 0.0036776354536414146, -0.026387561112642288, 0.09275144338607788, -0.016123218461871147, -0.00906469114124775, 0.034359052777290344, 0.14099620282649994, 0.07657968997955322, -0.04744955897331238, -0.005343315191566944, 0.21044103801250458, -0.11510243266820908, -0.018837308511137962, 0.1363719254732132, -0.018022816628217697, 0.024690929800271988, 0.10917473584413528, 0.05305430665612221, -0.009692414663732052, 0.008980427868664265, -0.012819948606193066, -0.03664559870958328, -0.16519838571548462, -0.09317813068628311, 
-0.06412084400653839, 0.05417380481958389, -0.2500503659248352, 0.008805745281279087, -0.20908582210540771, -0.045463670045137405, 0.06662025302648544, -0.05623175576329231, 0.1258573979139328, -0.03919033706188202, 0.19735944271087646, -0.09890837222337723, 0.08300218731164932, -0.08844608068466187, -0.09675221145153046, 0.11717460304498672, -0.08448393642902374, 0.050594512373209, -0.023973703384399414, 0.04949544742703438, 0.061306603252887726, 0.05795735865831375, 0.12010636180639267, -0.0215897373855114, 0.061451513320207596, 0.008203591220080853, -0.04994833469390869, -0.08789191395044327, -0.00042441734694875777, 0.07680576294660568, 0.1832946389913559, -0.08287132531404495, -0.012078370898962021, 0.09149648994207382, 0.06258181482553482, 0.07682131230831146, 0.19929274916648865, 0.01006299164146185, 0.05791907384991646, 0.1038464903831482, -0.08312313258647919, -0.01961136981844902, 0.1451263725757599, 0.24225126206874847, -0.0544208362698555, 0.02959146350622177, 0.1661273092031479, 0.004705706145614386, -0.08875186741352081, 0.05358438938856125, -0.13129489123821259, -0.04243537038564682, -0.0063165100291371346, 0.07659091055393219, -0.09440376609563828, 0.056770823895931244, 0.05538739264011383, 0.03856600821018219, -0.03367959335446358, -0.01872693933546543, 0.05077500641345978, -0.030267702415585518, 0.12108607590198517, 0.011771583929657936, -0.028485169634222984, -0.06459371000528336, -0.012077338993549347, 0.017608320340514183, 0.1377118080854416, 0.18373209238052368, -0.014825382269918919, 0.001772444462403655, -0.018090510740876198, 0.012684506364166737, -0.06535137444734573, -0.08586844801902771, -0.016564827412366867, 0.008349764160811901, 0.21810419857501984, 0.012869060970842838, 0.004085696768015623, -0.022863393649458885, -0.12083698809146881, 0.057747937738895416, -0.06750715523958206, -0.05747929587960243, -0.007664849050343037, -0.1663089096546173, 0.08754158765077591, -0.030581118538975716, -0.03603615611791611, 0.0761045292019844, 0.03376472741365433, 0.03238031268119812, 0.009402637369930744, 0.05477970466017723, -0.050228580832481384, -0.06379099935293198, -0.024203186854720116, 0.14525380730628967, 0.02182599902153015, 0.0737740695476532, 0.05852406099438667, 0.0008328155381605029, -0.09220538288354874, -0.07654286921024323, 0.05288029834628105, -0.021764447912573814, -0.12274788320064545, 0.10525691509246826, -0.020084097981452942, -0.054824527353048325, 0.00009727133146952838, -0.042418211698532104, 0.21318304538726807, 0.28315532207489014, -0.034752100706100464, 0.13543269038200378, 0.18667639791965485, -0.05322209373116493, -0.2785945534706116, -0.08525481075048447, 0.03226093575358391, 0.06374473869800568, 0.008046697825193405, -0.14929436147212982, 0.10896595567464828, 0.021446436643600464, -0.05130177363753319, 0.04233540594577789, -0.1766880601644516, -0.11701987683773041, 0.16937467455863953, -0.15139882266521454, 0.2719646096229553, -0.06745613366365433, -0.07971423864364624, -0.02796221897006035, -0.029873011633753777, 0.1443764567375183, -0.3648802936077118, 0.10250137001276016, 0.09125065058469772, 0.07938990741968155, 0.07306718081235886, 0.03891861438751221, 0.1433873474597931, 0.022927142679691315, 0.0316508449614048, -0.030355652794241905, -0.01762280985713005, -0.12119424343109131, 0.023112567141652107, 0.10503917932510376, -0.006484418176114559, 0.015652816742658615, -0.21080538630485535, 0.011729194782674313, -0.06237763911485672, -0.0014787581749260426, 0.07779521495103836, 0.0023958287201821804, -0.09934346377849579, 
-0.08625347912311554, -0.02086852863430977, 0.035441819578409195, 0.25730782747268677, -0.09504272788763046, 0.028101306408643723, 0.08847877383232117, 0.23202267289161682, -0.1650543361902237, 0.13564276695251465, -0.012637108564376831, -0.08575621247291565, 0.062122393399477005, -0.1856328248977661, 0.040159787982702255, 0.05432109162211418, 0.04490155726671219, 0.13143664598464966, 0.008585350587964058, -0.03781671077013016, 0.11134694516658783, 0.08744235336780548, -0.14768752455711365, -0.09554151445627213, -0.06738743185997009, -0.029446471482515335, -0.002634190721437335, -0.05417810380458832, 0.21976704895496368, 0.03384703770279884, 0.04363897442817688, -0.02279648743569851, 0.03659068048000336, -0.11538601666688919, 0.15943995118141174, 0.1385836899280548, 0.03225044533610344, -0.07551116496324539, 0.06747199594974518, -0.012255670502781868, 0.045342009514570236, -0.005003698635846376, 0.06460687518119812, -0.024830741807818413, -0.06910187005996704, -0.13846346735954285, 0.09440620988607407, 0.0739278793334961, -0.10154156386852264, -0.11682768911123276, -0.13419921696186066, -0.0295308418571949, 0.14217215776443481, 0.010031952522695065, 0.06161251291632652, 0.010489831678569317, -0.000047907356929499656, 0.056308094412088394, 0.05837579071521759, 0.043605461716651917, -0.09072582423686981, -0.10016247630119324, 0.16220563650131226, -0.08943870663642883, 0.13660448789596558, -0.06585491448640823, -0.014843511395156384, -0.08921679109334946, 0.07339651137590408, -0.05203722417354584, 0.018669389188289642, -0.04651050642132759, 0.007295566610991955, 0.011531025171279907, -0.033027905970811844, -0.023092202842235565, 0.0008411642629653215, -0.07887367159128189, 0.0593482181429863, 0.050024036318063736, 0.09307779371738434, -0.10339538753032684, -0.025859978049993515, 0.03540796414017677, -0.014916563406586647, 0.07831001281738281, 0.0987561047077179, -0.07136807590723038, 0.08480352908372879, -0.1141936331987381, -0.04658844694495201, 0.17303189635276794, 0.04202422499656677, 0.001812783069908619, 0.009370116516947746, -0.08375414460897446, 0.05979475751519203, 0.09085920453071594, 0.00389295001514256, 0.0031644105911254883, 0.0027816640213131905, 0.15650908648967743, -0.10991628468036652, -0.0736917033791542, -0.07814832031726837, -0.0032820957712829113, 0.10386870056390762, 0.12056365609169006, 0.14024922251701355, -0.05207386985421181, -0.016067009419202805, 0.0320521704852581, 0.05828716605901718, 0.00971956830471754, -0.07694446295499802, 0.07213840633630753, -0.03656986355781555, 0.06312817335128784, -0.10885767638683319, 0.06939925998449326, 0.014136920683085918, -0.13887536525726318, -0.005138725973665714, -0.016329750418663025, -0.051620181649923325, 0.02212265506386757, -0.10994377732276917, 0.06727800518274307, -0.036448001861572266, -0.12124717235565186, 0.03232472017407417, 0.06986033916473389, 0.17683470249176025, -0.022449888288974762, 0.02577018365263939, 0.10977955162525177, 0.10938303172588348, 0.033164117485284805, -0.04152687266469002, 0.000775556662119925, 0.1701515167951584, -0.0207818690687418, -0.020360203459858894, -0.14146770536899567, 0.11577639728784561, 0.05501400679349899, -0.008319328539073467, -0.005567348096519709, -0.048439282923936844, -0.06572345644235611, -0.2641996741294861, 0.07326644659042358, -0.04999230429530144, -0.03181444853544235, -0.01857154630124569, -0.09932605922222137, 0.13018107414245605, 0.1104912981390953, 0.011133337393403053, 0.03700247406959534, 0.03313661366701126, -0.166065976023674, -0.08116848766803741, 
0.07223611325025558, -0.08905211836099625, 0.04202689230442047, -0.08158805966377258, -0.014876690693199635, 0.09959328174591064, -0.006091256160289049, -0.02639157511293888, 0.029498156160116196, -0.07758353650569916, -0.06757387518882751, -0.15828199684619904, -0.0442313551902771, -0.05092541128396988, 0.06792259216308594, 0.08133549988269806, -0.0035435857716947794, 0.13990706205368042, -0.058559004217386246, 0.013874019496142864, 0.2119097113609314, -0.06795511394739151, -0.18256865441799164, -0.09266193956136703, -0.10344478487968445, -0.04872060567140579, 0.10585248470306396, -0.07775574177503586, -0.1336304396390915, -0.030880652368068695, 0.07457738369703293, 0.1917191743850708, -0.20453307032585144, 0.09235882014036179, 0.005168456584215164, 0.012496747076511383, -0.009724552743136883, -0.08640444278717041, 0.11220349371433258, 0.27490943670272827, 0.003084400435909629, -0.051977742463350296, -0.05742231011390686, 0.031263839453458786, -0.09601522982120514, 0.112904854118824, -0.08619704097509384, -0.008829954080283642, -0.0386853888630867, 0.08156178891658783, -0.1902502328157425, -0.07001757621765137, -0.22920243442058563, -0.11657154560089111, -0.03594615310430527, -0.003351100254803896, -0.07709707319736481, 0.1082477793097496, 0.034042179584503174, -0.0870736613869667, -0.049608487635850906, 0.018393905833363533, -0.026892544701695442, -0.10196253657341003, 0.06997434794902802, 0.017527692019939423, -0.20386269688606262, -0.062002554535865784, 0.023891961202025414, 0.2145654708147049, -0.0008586802869103849, 0.010356582701206207, -0.008598494343459606, 0.11418209969997406, 0.014225260354578495, -0.1074332743883133, -0.020777756348252296, 0.11008056998252869, -0.04539988934993744, 0.24917015433311462, 0.12438496947288513, -0.03250798210501671, 0.04246514290571213, 0.13212229311466217, -0.15247803926467896, -0.045240867882966995, -0.023567471653223038, -0.08500418812036514, 0.01699746586382389, 0.009424956515431404, 0.018169395625591278, 0.014789070934057236, 0.040388647466897964, -0.014539207331836224, -0.03616809844970703, -0.12994009256362915, -0.07137078046798706, -0.13144958019256592, -0.0029866534750908613, 0.011299674399197102, 0.0559372715651989, -0.08971738070249557, -0.022148314863443375, -0.08484899997711182, 0.0457887277007103, 0.03751520812511444, 0.04573450982570648, 0.13683189451694489, -0.03957449644804001, 0.018979357555508614, -0.06538385152816772, 0.08418750017881393, 0.11549253761768341, -0.00442548468708992, -0.09825257956981659 ]
null
null
transformers
# Hubert-Base

[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)

The base model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.

**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.

[Paper](https://arxiv.org/abs/2106.07447)

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

**Abstract**

Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.

# Usage

See [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information on how to fine-tune the model. Note that the class `Wav2Vec2ForCTC` has to be replaced by `HubertForCTC`.
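Since the card above describes a pretrained encoder with no tokenizer, the natural raw usage is feature extraction (matching this repository's `feature-extraction` pipeline tag). Below is a minimal sketch, assuming the `facebook/hubert-base-ls960` checkpoint ships a feature-extractor config; the random waveform is a stand-in for real 16kHz audio.

```python
import numpy as np
import torch
from transformers import HubertModel, Wav2Vec2FeatureExtractor

# Hubert has no tokenizer; only a feature extractor that normalizes raw audio.
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/hubert-base-ls960")
model = HubertModel.from_pretrained("facebook/hubert-base-ls960")
model.eval()

# One second of dummy mono audio at 16 kHz -- replace with a real waveform.
waveform = np.random.randn(16000).astype(np.float32)

inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Frame-level representations: (batch, frames, hidden_size); the convolutional
# front end downsamples by 320x, so about 50 frames per second of audio,
# with hidden_size 768 for the base model.
print(outputs.last_hidden_state.shape)
```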
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["librispeech_asr"]}
feature-extraction
facebook/hubert-base-ls960
[ "transformers", "pytorch", "tf", "hubert", "feature-extraction", "speech", "en", "dataset:librispeech_asr", "arxiv:2106.07447", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2106.07447" ]
[ "en" ]
TAGS #transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Hubert-Base Facebook's Hubert The base model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model. Paper Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed Abstract Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets. The original model can be found at URL. # Usage See this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'.
[ "# Hubert-Base \n\nFacebook's Hubert\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ "TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Hubert-Base \n\nFacebook's Hubert\n\nThe base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ 69, 505, 47 ]
[ "passage: TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ -0.07919494062662125, 0.13141120970249176, -0.00505525479093194, 0.05818694829940796, 0.04267424717545509, -0.0072942147962749004, 0.06678592413663864, 0.13642480969429016, -0.029694652184844017, -0.023821042850613594, 0.1374424248933792, 0.18145017325878143, -0.0009360109106637537, 0.002592973643913865, -0.04541778564453125, -0.21268366277217865, 0.07006362080574036, 0.023458782583475113, -0.02554764226078987, 0.07055280357599258, 0.11054744571447372, -0.04667684808373451, 0.06013930216431618, 0.028453340753912926, -0.034912481904029846, 0.03771228715777397, 0.02736423723399639, -0.10177940130233765, 0.12129774689674377, 0.06559909135103226, 0.012993082404136658, 0.03574557229876518, -0.031366247683763504, -0.14521917700767517, 0.024737179279327393, 0.005884829442948103, -0.05752914398908615, 0.05725880339741707, -0.022531356662511826, -0.023702088743448257, 0.12660279870033264, 0.011480569839477539, -0.0437985323369503, 0.06020243465900421, -0.09516840428113937, -0.26238560676574707, -0.0943387821316719, 0.12770837545394897, -0.029110588133335114, 0.10176090896129608, 0.021934494376182556, 0.13556118309497833, -0.08844934403896332, 0.07011954486370087, 0.20334267616271973, -0.3536343276500702, -0.00322158494964242, 0.01825152523815632, 0.07921844720840454, 0.03707380220293999, -0.02719428576529026, 0.02898847870528698, 0.039139725267887115, 0.021831827238202095, 0.02556987665593624, -0.07029290497303009, -0.18348920345306396, 0.07408943772315979, -0.08258059620857239, -0.06899736076593399, 0.3080272972583771, 0.02173897624015808, 0.04406512528657913, -0.0030165123753249645, -0.054682616144418716, 0.04096261411905289, -0.00044982644612900913, 0.03506949916481972, 0.0274066012352705, 0.08170309662818909, 0.061796121299266815, -0.07053601741790771, -0.1432649791240692, 0.010488501749932766, -0.18194733560085297, 0.05433452129364014, -0.011701306328177452, 0.0818093866109848, -0.1308644711971283, 0.03097895346581936, 0.013036217540502548, -0.13360026478767395, -0.01756453327834606, -0.048423007130622864, 0.10998484492301941, 0.07719472050666809, -0.07438944280147552, 0.0625745952129364, 0.10960172861814499, 0.11591887474060059, -0.004663505591452122, -0.0069204056635499, -0.028629081323742867, 0.10775700956583023, -0.009474843740463257, 0.09104857593774796, -0.04260270297527313, -0.05374607816338539, 0.0673985704779625, -0.04600222036242485, 0.06384193897247314, -0.03216797858476639, -0.13248012959957123, -0.046487562358379364, -0.050136443227529526, 0.05685780197381973, 0.112638920545578, 0.022109834477305412, -0.027627991512417793, 0.019843565300107002, 0.12138926982879639, -0.07722600549459457, 0.01377471536397934, 0.00365318707190454, 0.04133407771587372, 0.08098641782999039, 0.08162626624107361, 0.019927838817238808, -0.03851215913891792, -0.023949453607201576, -0.05519367754459381, 0.017797432839870453, 0.0037190522998571396, -0.04575127735733986, 0.08891016244888306, -0.09461700171232224, 0.033669620752334595, -0.15360908210277557, -0.06893738359212875, 0.02828192710876465, 0.03808426856994629, -0.00242256629280746, -0.07711409032344818, 0.0377008318901062, -0.06423159688711166, 0.06848257780075073, -0.08613333851099014, -0.009460683912038803, -0.08098626136779785, 0.07014082372188568, -0.06151101365685463, 0.07236040383577347, -0.18832330405712128, 0.06516671180725098, -0.09585260599851608, -0.002564126392826438, -0.017989246174693108, 0.00243579619564116, -0.07075253129005432, 0.11361668258905411, -0.04257004335522652, -0.04986962303519249, -0.0959477648139, 0.007832689210772514, 
-0.009955007582902908, 0.11015245318412781, -0.17932774126529694, -0.04542742669582367, 0.1470303237438202, -0.07714871317148209, -0.19664880633354187, 0.09503539651632309, 0.01894407346844673, -0.0011183987371623516, 0.03280860185623169, 0.24688827991485596, 0.041267648339271545, -0.11201786994934082, -0.023509832099080086, 0.16928347945213318, -0.0469120629131794, -0.14795291423797607, 0.07987440377473831, -0.06828078627586365, 0.04937469959259033, 0.024016546085476875, 0.01555046159774065, 0.10434603691101074, 0.023596778512001038, -0.08086905628442764, -0.08334184437990189, -0.05911567062139511, -0.02163686603307724, 0.00045903658610768616, 0.016930243000388145, -0.049716878682374954, -0.018674401566386223, 0.007046985439956188, 0.0415390245616436, 0.02805916965007782, 0.09004442393779755, -0.05460032448172569, 0.09316441416740417, 0.0016298881964758039, 0.00532523775473237, -0.13863182067871094, 0.08609085530042648, -0.06444276869297028, 0.049022093415260315, 0.05324900895357132, 0.126364603638649, 0.056353360414505005, -0.11150707304477692, -0.018120702356100082, 0.0194394588470459, 0.0697036013007164, 0.08049056679010391, -0.020866507664322853, -0.1535475105047226, 0.02351241372525692, -0.04931982234120369, 0.013137102127075195, -0.020751377567648888, -0.007612486369907856, 0.054647043347358704, 0.0956679955124855, -0.04490453004837036, 0.05569855123758316, -0.004653837997466326, -0.018315047025680542, -0.04597765579819679, -0.014499252662062645, 0.05730357766151428, 0.05467067286372185, -0.044583797454833984, 0.23126046359539032, -0.07787049561738968, 0.24806907773017883, 0.22295238077640533, -0.152666836977005, 0.0916910469532013, 0.0496179535984993, -0.021765582263469696, 0.007139365188777447, 0.061385031789541245, -0.03519479185342789, 0.06496387720108032, -0.019336361438035965, 0.11487884819507599, -0.06222977116703987, -0.015172048471868038, -0.008405626751482487, -0.041437070816755295, -0.028693338856101036, 0.06952376663684845, 0.047224998474121094, -0.13981008529663086, 0.1556750237941742, 0.33557966351509094, -0.028059355914592743, 0.11481456458568573, -0.06552727520465851, -0.044487230479717255, 0.011719267815351486, -0.04600747302174568, -0.06030477210879326, 0.12205521017313004, -0.2222350686788559, -0.026652561500668526, 0.10410764813423157, 0.012105559930205345, 0.051393136382102966, -0.13362234830856323, -0.04608936607837677, 0.02861572988331318, -0.030624141916632652, -0.1282636523246765, 0.07406571507453918, -0.0025938700418919325, 0.08930283039808273, -0.03432599455118179, -0.11455623060464859, 0.07819392532110214, 0.010927827097475529, -0.06827150285243988, 0.1005222275853157, -0.1508771926164627, -0.23255512118339539, -0.06625637412071228, -0.07655792683362961, -0.02377171628177166, -0.009211701340973377, 0.13553465902805328, -0.0502551794052124, -0.021631639450788498, -0.036834847182035446, -0.06880627572536469, -0.08247103542089462, 0.005566755309700966, -0.0051980819553136826, 0.0017449790611863136, 0.029686124995350838, -0.1472608745098114, -0.04757251963019371, -0.013594319112598896, 0.023939112201333046, 0.03353487327694893, 0.0034862454049289227, 0.08251269906759262, 0.09625771641731262, 0.04251120239496231, 0.050303880125284195, -0.025758538395166397, 0.2095925360918045, -0.027640460059046745, 0.003977563232183456, 0.20586678385734558, 0.010382112115621567, 0.055093914270401, 0.11741294711828232, 0.04096322879195213, -0.006898925639688969, -0.0282962154597044, -0.04298107698559761, -0.07059586048126221, -0.2068624049425125, -0.09416579455137253, 
-0.16162849962711334, 0.00893140584230423, 0.000128350846352987, 0.07614032179117203, 0.0800299420952797, 0.01830233633518219, -0.0029406624380499125, -0.0206651221960783, -0.05366665869951248, 0.010517017915844917, 0.2890206575393677, -0.06676483899354935, 0.09922564029693604, -0.1207234337925911, -0.035043876618146896, 0.11014581471681595, 0.08390549570322037, 0.10681568831205368, 0.09486804902553558, 0.0467941015958786, 0.06387373805046082, 0.23857542872428894, 0.018920719623565674, 0.08961313962936401, 0.01343919150531292, -0.017605971544981003, -0.03776831552386284, -0.06360481679439545, -0.012195009738206863, 0.08538086712360382, 0.11438022553920746, -0.1088457703590393, 0.015980014577507973, -0.14525063335895538, 0.06102462112903595, 0.15501630306243896, 0.07342829555273056, -0.11819197982549667, 0.0064066192135214806, 0.06150393187999725, -0.0034980953205376863, -0.028797978535294533, 0.08143221586942673, 0.022950399667024612, -0.04551032930612564, 0.08120866119861603, 0.0300443172454834, 0.09227822721004486, 0.0780395045876503, 0.03713129833340645, -0.07104791700839996, -0.13422784209251404, 0.04910112917423248, 0.08523760735988617, -0.2567797601222992, 0.23754729330539703, -0.010188085958361626, -0.04671461135149002, -0.05071556940674782, 0.005854888819158077, 0.07870908826589584, 0.10870476812124252, 0.1282290816307068, 0.0370292142033577, -0.06339109688997269, 0.0011218688450753689, -0.04206319525837898, 0.05321740359067917, 0.002716858172789216, 0.06478241831064224, -0.044600117951631546, -0.03527429699897766, -0.004327842034399509, 0.04854515939950943, 0.17214080691337585, -0.08183375746011734, -0.15739385783672333, 0.06144167482852936, 0.17413637042045593, -0.044713281095027924, -0.029407285153865814, -0.04478511959314346, -0.13928893208503723, 0.10173005610704422, -0.05353539437055588, -0.044784221798181534, -0.0741041749715805, -0.13874059915542603, 0.1214810460805893, -0.057268764823675156, 0.05174233019351959, -0.04128576070070267, -0.05340627580881119, -0.06643937528133392, -0.16436736285686493, 0.1156158596277237, -0.12186852842569351, 0.002333977958187461, -0.006939237471669912, 0.14649608731269836, -0.07434624433517456, 0.07448521256446838, 0.03423098847270012, 0.044496577233076096, -0.1446244865655899, -0.1039583683013916, 0.017303796485066414, -0.013406472280621529, 0.025218764320015907, -0.033104270696640015, -0.036486268043518066, -0.060758721083402634, 0.02456933818757534, -0.0144823482260108, 0.2314387559890747, 0.14593777060508728, -0.11265823990106583, 0.17009826004505157, 0.12651316821575165, -0.04863186553120613, -0.25323471426963806, -0.13887536525726318, -0.14690011739730835, -0.05047669634222984, 0.0238821841776371, -0.1111719161272049, 0.09982474893331528, 0.0173735823482275, -0.0972907543182373, 0.09001776576042175, -0.2136552482843399, -0.08336047828197479, 0.22784122824668884, -0.09808826446533203, 0.28563013672828674, -0.15546852350234985, -0.048941414803266525, -0.03370334580540657, -0.20025509595870972, 0.13780401647090912, -0.14708220958709717, 0.06293478608131409, -0.006154073867946863, 0.009799477644264698, 0.0028665498830378056, -0.04528845101594925, 0.13435152173042297, 0.01665845885872841, -0.01684623211622238, -0.08989575505256653, -0.038074690848588943, 0.09068470448255539, -0.02527998946607113, 0.08349818736314774, -0.1579759418964386, 0.027352774515748024, -0.16716188192367554, 0.018883561715483665, -0.13286948204040527, 0.08690838515758514, 0.015845440328121185, -0.03195920214056969, -0.06211647391319275, -0.03332984447479248, 
0.061653051525354385, 0.013954090885818005, 0.2198546677827835, -0.03349959850311279, 0.03365005552768707, 0.16292735934257507, 0.058005522936582565, -0.22761212289333344, -0.10023731738328934, -0.05301313474774361, -0.06219471991062164, 0.06817091256380081, -0.17069955170154572, 0.04081820696592331, 0.07573606073856354, 0.005966899450868368, 0.048975005745887756, 0.06324175745248795, -0.01175741944462061, -0.015532523393630981, 0.12106656283140182, -0.13052047789096832, -0.06105908378958702, -0.022204680368304253, 0.02825153060257435, 0.005135491956025362, 0.03785300999879837, 0.1403989940881729, -0.02448318526148796, -0.0027975065167993307, -0.018573254346847534, 0.009586883708834648, -0.12838546931743622, 0.03808626905083656, 0.10188262909650803, 0.0029745474457740784, -0.13173693418502808, 0.06663136929273605, -0.007476849015802145, -0.13721297681331635, 0.004902342334389687, 0.06305781751871109, -0.07199677079916, -0.1291922926902771, -0.10539545118808746, -0.06332825869321823, -0.12169777601957321, -0.07745230942964554, -0.020468367263674736, -0.10206782817840576, 0.052113451063632965, 0.13012397289276123, 0.07627148181200027, 0.06629083305597305, -0.04438067600131035, -0.06727828085422516, 0.05689113959670067, -0.003524879924952984, -0.040827181190252304, -0.005081584211438894, -0.1078963577747345, -0.0031094844453036785, 0.0037826006300747395, 0.11065641045570374, -0.042970895767211914, -0.0018378908280283213, -0.05825535207986832, 0.03082730621099472, -0.11349494010210037, -0.03349198400974274, -0.1039203628897667, -0.011443604715168476, 0.024057717993855476, -0.11168401688337326, -0.04767013341188431, 0.04309703782200813, -0.12644726037979126, -0.027713684365153313, 0.010274259373545647, 0.0876249447464943, -0.14793184399604797, -0.046069368720054626, 0.07258306443691254, -0.01332289632409811, 0.1061723604798317, 0.14255918562412262, -0.11223804950714111, 0.051641784608364105, -0.12330891191959381, -0.15362577140331268, 0.10386600345373154, 0.060022614896297455, 0.0408049076795578, -0.0011238537263125181, -0.008226403035223484, 0.11670679599046707, 0.020466404035687447, 0.01541847363114357, 0.009589364752173424, -0.1004154160618782, -0.04037807881832123, -0.044660281389951706, -0.10042229294776917, 0.00767287565395236, -0.08030027151107788, 0.1571660041809082, 0.03567971661686897, 0.13141991198062897, 0.003126365365460515, -0.009859663434326649, -0.10354539006948471, 0.025191696360707283, -0.03921277076005936, -0.13725407421588898, -0.09657478332519531, -0.04018917307257652, -0.00004449037805898115, -0.023957695811986923, 0.22997255623340607, 0.0003653682942967862, -0.12307078391313553, 0.04879182204604149, 0.056008268147706985, -0.003932245075702667, -0.020322412252426147, 0.28818976879119873, 0.050339024513959885, -0.03041735105216503, -0.05678965896368027, 0.013164871372282505, 0.0023002440575510263, 0.05423780903220177, 0.048240140080451965, 0.1502743512392044, 0.1375269591808319, 0.09529443830251694, 0.09862809628248215, -0.030953260138630867, -0.07053270190954208, -0.10604510456323624, 0.018831271678209305, 0.10752763599157333, -0.048693735152482986, 0.09556245803833008, 0.1555614173412323, -0.036094702780246735, 0.039009347558021545, -0.06331221759319305, 0.005235243123024702, -0.13576911389827728, -0.058647703379392624, -0.033031366765499115, -0.11977221816778183, -0.02434653602540493, -0.06307758390903473, 0.07494387775659561, 0.13370124995708466, 0.03325014188885689, -0.008035683073103428, -0.00036244114744476974, -0.0011797576444223523, -0.07020364701747894, 
0.037931136786937714, -0.016387304291129112, 0.017834194004535675, -0.0794178918004036, 0.004795365501195192, 0.0011265749344602227, -0.04163780435919762, -0.03476623445749283, 0.016863033175468445, -0.012458615005016327, 0.03219318017363548, -0.1197030320763588, -0.05708640068769455, -0.06129481643438339, 0.045147694647312164, 0.04068351164460182, 0.1957503706216812, 0.06248583272099495, 0.016057739034295082, 0.0634978711605072, 0.15265044569969177, -0.10679155588150024, -0.14481601119041443, -0.05386785790324211, 0.12438143044710159, 0.03097369521856308, 0.030630461871623993, -0.022866660729050636, -0.0022414634004235268, -0.045467112213373184, 0.227897509932518, 0.30017441511154175, -0.07352538406848907, 0.03960282355546951, -0.012043541297316551, 0.02478695847094059, 0.020013336092233658, 0.026631232351064682, 0.1686868965625763, 0.22585047781467438, -0.06285936385393143, -0.04776720330119133, -0.07416434586048126, 0.011928809806704521, -0.11871865391731262, 0.043339964002370834, -0.011060710996389389, -0.12056170403957367, 0.018906835466623306, 0.07080145180225372, -0.07538889348506927, 0.0408845990896225, -0.04580646753311157, -0.1780238002538681, -0.041600730270147324, -0.001626202603802085, 0.15222512185573578, 0.08667566627264023, 0.03463577851653099, -0.05620183050632477, -0.047536443918943405, 0.11379607766866684, -0.01432992983609438, -0.216848224401474, -0.029614174738526344, 0.10812225192785263, -0.12041781842708588, 0.09528655558824539, -0.018708430230617523, 0.048590660095214844, 0.0832471027970314, 0.11057524383068085, -0.07389511913061142, 0.07899967581033707, 0.04121261462569237, -0.08599803596735, -0.03572264686226845, -0.09393206983804703, 0.004323895554989576, -0.028181055560708046, 0.051293227821588516, -0.04008020833134651, 0.06391649693250656, 0.08139723539352417, -0.0169964749366045, -0.04723217710852623, -0.0050336201675236225, -0.06342693418264389, 0.0641254410147667, 0.012032979167997837, -0.04357945919036865, -0.06908175349235535, -0.042388685047626495, -0.060844965279102325, 0.06037034094333649, -0.1565706878900528, -0.12138047814369202, 0.014939992688596249, -0.01740441843867302, -0.03576413542032242, -0.009262523613870144, -0.051347579807043076, -0.06394370645284653, -0.02132025733590126, 0.007736891973763704, -0.06244191154837608, 0.011679155752062798, 0.06340111047029495, -0.005600892938673496, 0.019600791856646538, 0.010824378579854965, 0.04425475373864174, 0.0403396412730217, -0.09885864704847336, -0.06114698946475983 ]
null
null
transformers
# Hubert-Large

[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)

The large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.

**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.

The model was pretrained on [Libri-Light](https://github.com/facebookresearch/libri-light).

[Paper](https://arxiv.org/abs/2106.07447)

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

**Abstract**

Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.

# Usage

See [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information on how to fine-tune the model. Note that the class `Wav2Vec2ForCTC` has to be replaced by `HubertForCTC`.
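The Usage note above says to follow the linked blog with `Wav2Vec2ForCTC` swapped for `HubertForCTC`. A minimal sketch of that setup follows, under the assumption that you have already built a `vocab.json` character vocabulary from your labeled transcripts (the file name is hypothetical); the config keyword arguments mirror the blog's recipe rather than anything fixed by this checkpoint.

```python
from transformers import (
    HubertForCTC,
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
)

# Hypothetical character vocabulary built from your labeled text data.
tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor(
    feature_size=1,
    sampling_rate=16000,
    padding_value=0.0,
    do_normalize=True,
    return_attention_mask=True,
)
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# Load the pretrained encoder and attach a freshly initialized CTC head
# sized to the new vocabulary; transformers will warn about the new weights.
model = HubertForCTC.from_pretrained(
    "facebook/hubert-large-ll60k",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
# Freeze the convolutional front end (named freeze_feature_extractor in
# older transformers releases) before fine-tuning on small labeled sets.
model.freeze_feature_encoder()
```

From here, training proceeds exactly as in the blog: pad batches with the processor, and optimize the CTC loss returned by `model(input_values, labels=...)`.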
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["libri-light"]}
feature-extraction
facebook/hubert-large-ll60k
[ "transformers", "pytorch", "tf", "hubert", "feature-extraction", "speech", "en", "dataset:libri-light", "arxiv:2106.07447", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2106.07447" ]
[ "en" ]
TAGS #transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Hubert-Large Facebook's Hubert The large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model. The model was pretrained on Libri-Light. Paper Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed Abstract Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets. The original model can be found at URL. # Usage See this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'.
[ "# Hubert-Large \n\nFacebook's Hubert\n\nThe large model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nThe model was pretrained on Libri-Light.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ "TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Hubert-Large \n\nFacebook's Hubert\n\nThe large model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nThe model was pretrained on Libri-Light.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ 66, 516, 47 ]
[ "passage: TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ -0.07008452713489532, 0.13328340649604797, -0.004982049576938152, 0.05836464837193489, 0.021394439041614532, -0.008874346502125263, 0.07127107679843903, 0.1400901973247528, -0.04994043707847595, -0.018295355141162872, 0.1355748176574707, 0.1908401995897293, -0.020739799365401268, 0.0055093904957175255, -0.04216954484581947, -0.23270860314369202, 0.07171531021595001, 0.04435189068317413, -0.04172111302614212, 0.07718829065561295, 0.09972647577524185, -0.05821356922388077, 0.052465807646512985, 0.016896549612283707, -0.0823591947555542, 0.031256139278411865, 0.02845412865281105, -0.09586656093597412, 0.12425026297569275, 0.07366764545440674, 0.015192515216767788, 0.04177175834774971, -0.04336237907409668, -0.13215085864067078, 0.018561160191893578, 0.013791228644549847, -0.052305348217487335, 0.0602174773812294, -0.012064700946211815, -0.0026777489110827446, 0.1437937617301941, -0.009906419552862644, -0.02189253829419613, 0.045634590089321136, -0.07563649863004684, -0.26865020394325256, -0.0818825215101242, 0.13502883911132812, -0.05653386563062668, 0.1113860160112381, 0.02660949155688286, 0.16125696897506714, -0.07800764590501785, 0.0748918280005455, 0.23716676235198975, -0.3414047658443451, -0.014958901330828667, 0.025596313178539276, 0.08029089123010635, 0.00840219296514988, -0.04202081635594368, 0.05068632960319519, 0.04448726773262024, 0.02230546995997429, 0.04720161482691765, -0.06363968551158905, -0.17967018485069275, 0.08305087685585022, -0.08240093290805817, -0.05659438669681549, 0.2885434925556183, 0.019355617463588715, 0.04895232617855072, -0.017766650766134262, -0.07156161218881607, 0.04045403003692627, -0.005096367560327053, 0.03481182083487511, 0.021976640447974205, 0.06062671169638634, 0.020484615117311478, -0.08730769902467728, -0.1510000079870224, 0.009598679840564728, -0.17870667576789856, 0.1372177004814148, 0.0010513011366128922, 0.0833236575126648, -0.12179678678512573, 0.039856430143117905, 0.006113309413194656, -0.13573125004768372, -0.024691792204976082, -0.04955282434821129, 0.1211135983467102, 0.09010660648345947, -0.0769643560051918, 0.07808160036802292, 0.10110817849636078, 0.13758359849452972, -0.01750674843788147, -0.008318806998431683, -0.035972725600004196, 0.10903298854827881, -0.031728584319353104, 0.0821169763803482, -0.03641117736697197, -0.045528676360845566, 0.07367575168609619, -0.0882294625043869, 0.08370726555585861, -0.04351669177412987, -0.1258246898651123, -0.05630411207675934, -0.05415735021233559, 0.04986459016799927, 0.10911546647548676, 0.03849087283015251, -0.013887211680412292, 0.01920154318213463, 0.12724089622497559, -0.06302479654550552, 0.01635756529867649, 0.004212548490613699, 0.04348432645201683, 0.05869206786155701, 0.10479701310396194, 0.006381809711456299, -0.030432002618908882, -0.016120772808790207, -0.04398191720247269, -0.006343306973576546, -0.01861530914902687, -0.05703536793589592, 0.09013179689645767, -0.08003289252519608, 0.04264124110341072, -0.17148201167583466, -0.05405709519982338, 0.041997719556093216, 0.0564439482986927, -0.009853078052401543, -0.08086778223514557, 0.040192682296037674, -0.0950113832950592, 0.08845195919275284, -0.07547654211521149, 0.00877002440392971, -0.0689588263630867, 0.07813376188278198, -0.04175303876399994, 0.07332543283700943, -0.2265794426202774, 0.05904572457075119, -0.08224357664585114, 0.0019999449141323566, -0.0339263491332531, -0.008437392301857471, -0.06676941365003586, 0.10563334077596664, -0.020080069079995155, -0.06176068261265755, -0.06347876787185669, 0.00959921907633543, 
0.0043284171260893345, 0.11323662847280502, -0.16222088038921356, -0.05126942694187164, 0.11037691682577133, -0.09218363463878632, -0.19582390785217285, 0.07808255404233932, 0.013392954133450985, 0.0012965210480615497, 0.024105429649353027, 0.24295100569725037, 0.08951512724161148, -0.11495060473680496, -0.02231382392346859, 0.17025767266750336, -0.052461665123701096, -0.15188369154930115, 0.07386838644742966, -0.04353940486907959, 0.026758184656500816, 0.03171843662858009, -0.009467856958508492, 0.10673972964286804, 0.012288223952054977, -0.07420583814382553, -0.08036325871944427, -0.044991184026002884, 0.008155102841556072, -0.002988775260746479, 0.0273527130484581, -0.06483662128448486, -0.026556875556707382, 0.025410121306777, 0.07686760276556015, 0.01422121562063694, 0.09343540668487549, -0.06447923928499222, 0.07037170976400375, 0.019089791923761368, 0.014406553469598293, -0.13232053816318512, 0.09857039898633957, -0.04119941592216492, 0.040463149547576904, 0.04476848617196083, 0.12133215367794037, 0.05623868480324745, -0.08598529547452927, -0.025208884850144386, 0.00882842019200325, 0.05629776045680046, 0.08094457536935806, -0.0073610832914710045, -0.18757057189941406, 0.01609593629837036, -0.049897100776433945, -0.020300015807151794, -0.020530693233013153, -0.010727196000516415, 0.07365968078374863, 0.10231833904981613, -0.03523999825119972, 0.04675488546490669, -0.02597782388329506, -0.03236430883407593, -0.04572595655918121, -0.011193440295755863, 0.05290280282497406, 0.06398512423038483, -0.04177769273519516, 0.24205200374126434, -0.06104175001382828, 0.2639541029930115, 0.23135845363140106, -0.16132089495658875, 0.09692835062742233, 0.06900762766599655, -0.035829175263643265, -0.0003955892752856016, 0.07717249542474747, -0.03623083606362343, 0.06294352561235428, -0.030246861279010773, 0.09027480334043503, -0.0553167425096035, -0.037057049572467804, -0.00467958627268672, -0.016176952049136162, -0.040610771626234055, 0.07131995260715485, 0.06106796860694885, -0.13924844563007355, 0.14854662120342255, 0.2958432137966156, -0.024507811293005943, 0.11009114980697632, -0.06507943570613861, -0.01601802371442318, -0.0010905512608587742, -0.03712984174489975, -0.05562489107251167, 0.15624509751796722, -0.2536951005458832, -0.02462875097990036, 0.09572838991880417, -0.0025476724840700626, 0.0644628033041954, -0.13217195868492126, -0.0436895415186882, 0.030538950115442276, -0.021138930693268776, -0.08552030473947525, 0.07187628000974655, -0.01032167300581932, 0.08490526676177979, -0.024305926635861397, -0.1199529767036438, 0.07494199275970459, 0.007067006081342697, -0.07147134095430374, 0.09915509819984436, -0.16784243285655975, -0.2356334924697876, -0.05548693984746933, -0.06580100953578949, -0.028355294838547707, -0.011452145874500275, 0.1345309168100357, -0.0384552925825119, -0.022072305902838707, -0.027141589671373367, -0.05018456652760506, -0.0924147367477417, 0.02569025196135044, -0.020917261019349098, 0.0127571951597929, 0.014015085995197296, -0.15263476967811584, -0.04654829949140549, -0.0018349119927734137, 0.046137068420648575, 0.03271561115980148, -0.0067383041605353355, 0.09410345554351807, 0.09635835886001587, 0.04679098352789879, 0.04858700558543205, -0.030748112127184868, 0.19493700563907623, -0.03479328751564026, 0.0141395078971982, 0.19845040142536163, 0.030926473438739777, 0.04839291423559189, 0.12174159288406372, 0.04733917862176895, -0.009910805150866508, -0.021759895607829094, -0.04207328334450722, -0.06368790566921234, -0.17398013174533844, -0.10667183250188828, 
-0.1660265028476715, 0.005846844986081123, -0.0009538258891552687, 0.08158265054225922, 0.0726458877325058, 0.014913901686668396, -0.018538177013397217, -0.032682985067367554, -0.037913329899311066, 0.030722297728061676, 0.2954229414463043, -0.08289171755313873, 0.09499944001436234, -0.11255021393299103, -0.042677540332078934, 0.11725366115570068, 0.06228414177894592, 0.1302807480096817, 0.06359291076660156, 0.049876511096954346, 0.06703241914510727, 0.24282188713550568, 0.024187492206692696, 0.08734181523323059, 0.017813032492995262, -0.019703460857272148, -0.032934121787548065, -0.08326607197523117, -0.00421117851510644, 0.07182583957910538, 0.07944235950708389, -0.09011998772621155, 0.0055579873733222485, -0.15802469849586487, 0.06153049319982529, 0.14690083265304565, 0.07943771779537201, -0.12770819664001465, 0.00498352712020278, 0.06280402094125748, 0.013754556886851788, -0.04393274709582329, 0.07859901338815689, 0.07037317752838135, -0.03597613424062729, 0.07392523437738419, 0.010769733227789402, 0.09211792051792145, 0.06689698249101639, 0.03775260969996452, -0.06160658597946167, -0.14183439314365387, 0.04480874538421631, 0.0870881900191307, -0.2555846571922302, 0.22224199771881104, -0.02229744754731655, -0.019171878695487976, -0.05977129936218262, 0.004608705174177885, 0.06486519426107407, 0.13841654360294342, 0.1169542744755745, 0.03920431435108185, -0.008697778917849064, -0.0025081299245357513, -0.02597946487367153, 0.058098696172237396, 0.021030021831393242, 0.05293175205588341, -0.04049824923276901, -0.030724238604307175, -0.017866607755422592, 0.06056947633624077, 0.19451913237571716, -0.07263338565826416, -0.1549951285123825, 0.06100039556622505, 0.16373153030872345, -0.058175407350063324, -0.05152539908885956, -0.04740051180124283, -0.08637990057468414, 0.12922701239585876, -0.04566546529531479, -0.06104294955730438, -0.0801854282617569, -0.13663692772388458, 0.11712079495191574, -0.051698386669158936, 0.06468093395233154, -0.03550960496068001, -0.019400490447878838, -0.07728907465934753, -0.17948026955127716, 0.11016831547021866, -0.13221049308776855, 0.0074144466780126095, -0.011019871570169926, 0.14354142546653748, -0.077016681432724, 0.06351558119058609, 0.025282787159085274, 0.034452471882104874, -0.14188240468502045, -0.113873191177845, -0.005391952581703663, -0.023233497515320778, 0.016182061284780502, -0.054293323308229446, -0.018412109464406967, -0.033561013638973236, 0.03413902595639229, -0.008793933317065239, 0.2085028439760208, 0.1297818273305893, -0.12355884909629822, 0.15632303059101105, 0.11236817389726639, -0.04140632972121239, -0.2536526322364807, -0.12133648991584778, -0.13617752492427826, -0.060740891844034195, -0.014092725701630116, -0.11107929795980453, 0.08920936286449432, -0.015971990302205086, -0.09920358657836914, 0.10033977776765823, -0.23060736060142517, -0.08364938944578171, 0.20580166578292847, -0.06701282411813736, 0.28647106885910034, -0.1451350748538971, -0.05408930033445358, -0.03610985353589058, -0.20194697380065918, 0.15012799203395844, -0.12779571115970612, 0.06330570578575134, -0.007843765430152416, 0.002583194524049759, 0.00840829312801361, -0.04885847121477127, 0.12445568293333054, -0.02420433610677719, 0.0035947374999523163, -0.09105808287858963, -0.07106649875640869, 0.09099946916103363, -0.02272670343518257, 0.06420999020338058, -0.1537371426820755, 0.050663214176893234, -0.15119579434394836, 0.022411001846194267, -0.13200458884239197, 0.08242607861757278, -0.007877189666032791, -0.05219317600131035, -0.07254723459482193, 
-0.027602853253483772, 0.05105215311050415, 0.012873269617557526, 0.21256886422634125, -0.004042395390570164, 0.01236108411103487, 0.1297801285982132, 0.021366531029343605, -0.20483236014842987, -0.12225611507892609, -0.041982512921094894, -0.07378073781728745, 0.09317993372678757, -0.17531973123550415, 0.02478361874818802, 0.07620187103748322, 0.011760166846215725, 0.03277114778757095, 0.06573484092950821, -0.034258343279361725, -0.03639501705765724, 0.11692622303962708, -0.15267208218574524, -0.038076844066381454, -0.04068329930305481, 0.04321755841374397, 0.039510760456323624, 0.0634094700217247, 0.14371076226234436, -0.05559324100613594, 0.0031388706993311644, -0.002338707447052002, 0.0021004914306104183, -0.12996681034564972, 0.04281960800290108, 0.09563679248094559, 0.01319595891982317, -0.13349664211273193, 0.06176011264324188, -0.013472664169967175, -0.126723051071167, -0.005167579744011164, 0.08007393032312393, -0.07411546260118484, -0.12212911248207092, -0.03604612872004509, -0.04837731644511223, -0.12786997854709625, -0.08124510943889618, -0.02016931027173996, -0.11437664180994034, 0.06957434117794037, 0.14313623309135437, 0.07927954196929932, 0.07325275242328644, -0.05271363630890846, -0.03744438290596008, 0.051450878381729126, 0.008211075328290462, -0.079121895134449, 0.00983112957328558, -0.1114526093006134, 0.013714713044464588, 0.002709200605750084, 0.10214061290025711, -0.049987200647592545, 0.00890939962118864, -0.07563889026641846, 0.025436142459511757, -0.12299557775259018, -0.026047980412840843, -0.07817023247480392, -0.013414459303021431, 0.01018771342933178, -0.10859355330467224, -0.051216769963502884, 0.04864383488893509, -0.10839194804430008, -0.04438900947570801, 0.013209357857704163, 0.060949236154556274, -0.1515788733959198, -0.05424753949046135, 0.06383244693279266, -0.03428414836525917, 0.10986502468585968, 0.1409796178340912, -0.09164576232433319, 0.053878750652074814, -0.10671757906675339, -0.16226182878017426, 0.1016421839594841, 0.05870271474123001, 0.038213059306144714, -0.009483118541538715, -0.016275066882371902, 0.09822993725538254, 0.02004779316484928, 0.008758696727454662, 0.022884495556354523, -0.10196149349212646, -0.040488410741090775, -0.04266177490353584, -0.08965464681386948, -0.0006266444688662887, -0.08131132274866104, 0.12862466275691986, 0.03039027564227581, 0.10764510184526443, 0.004559967666864395, -0.008010108023881912, -0.09074350446462631, 0.02322225272655487, -0.039908356964588165, -0.14575521647930145, -0.08797885477542877, -0.018193485215306282, 0.0030049774795770645, -0.027591994032263756, 0.19418463110923767, -0.01598292589187622, -0.1413457691669464, 0.04231206327676773, 0.047917384654283524, -0.019116459414362907, -0.012931706383824348, 0.31337040662765503, 0.047131117433309555, -0.02579544298350811, -0.036237139254808426, 0.031466737389564514, 0.011992651969194412, 0.04036421701312065, 0.045102789998054504, 0.16178485751152039, 0.14717817306518555, 0.09267465025186539, 0.11997339874505997, -0.03464384377002716, -0.12110721319913864, -0.11716420203447342, 0.0306834876537323, 0.10338301956653595, -0.05809510126709938, 0.10178632289171219, 0.15577512979507446, -0.052203208208084106, 0.046352650970220566, -0.04478617012500763, 0.012359749525785446, -0.10546264797449112, -0.08992195129394531, -0.030006350949406624, -0.12239044904708862, -0.03052845597267151, -0.07780242711305618, 0.05716675519943237, 0.07800406217575073, 0.03058869205415249, -0.01529171597212553, 0.010645756497979164, 0.019704669713974, -0.07811499387025833, 
0.05678724870085716, -0.002311612479388714, -0.0018877548864111304, -0.0759756788611412, 0.005798527970910072, -0.002206770470365882, -0.021476836875081062, -0.0367211289703846, 0.024358857423067093, -0.013234623707830906, 0.04363205283880234, -0.08801079541444778, -0.05961860343813896, -0.052699096500873566, 0.021708399057388306, 0.06278076022863388, 0.19746197760105133, 0.054009079933166504, 0.010260621085762978, 0.0575016625225544, 0.1917266696691513, -0.11777287721633911, -0.15823301672935486, -0.03281024098396301, 0.10286055505275726, 0.0232688095420599, 0.044598955661058426, -0.012666384689509869, -0.009210141375660896, -0.08977440744638443, 0.2460186779499054, 0.31390586495399475, -0.04990452900528908, 0.04615180939435959, 0.018070153892040253, 0.0271205585449934, 0.013150081038475037, 0.0362471379339695, 0.16588932275772095, 0.2698742747306824, -0.05608178675174713, -0.05573868751525879, -0.07304761558771133, 0.028551388531923294, -0.10311003029346466, 0.05045146122574806, 0.007103755604475737, -0.1379128247499466, 0.009800916537642479, 0.057911403477191925, -0.055404555052518845, 0.02910761907696724, -0.02234596200287342, -0.2036673128604889, -0.04147718474268913, -0.00002635142664075829, 0.15929843485355377, 0.0675598680973053, 0.042162638157606125, -0.04964977130293846, -0.032601550221443176, 0.09307427704334259, -0.005322758108377457, -0.21947400271892548, -0.04006445035338402, 0.1074860543012619, -0.09653604030609131, 0.12433288991451263, -0.026376595720648766, 0.05312417447566986, 0.08879059553146362, 0.1042705699801445, -0.08713356405496597, 0.08101862668991089, 0.04056797921657562, -0.13776379823684692, -0.04650668427348137, -0.07720891386270523, -0.012860535643994808, -0.0036052456125617027, 0.05768025293946266, -0.04168606176972389, 0.04988689348101616, 0.11349838972091675, 0.0033780736848711967, -0.06072520464658737, -0.018884291872382164, -0.08244480937719345, 0.05494649335741997, 0.003729873802512884, -0.04321615397930145, -0.07629320025444031, -0.0394035279750824, -0.0478053018450737, 0.06345637142658234, -0.12960326671600342, -0.09329739958047867, -0.00027781553217209876, -0.01969565451145172, -0.04148796945810318, 0.002856305567547679, -0.0789089947938919, -0.06646028906106949, -0.020819272845983505, 0.011044261045753956, -0.06886670738458633, 0.0035215250682085752, 0.0662151575088501, -0.020679686218500137, 0.013200347311794758, -0.043876081705093384, 0.01949518546462059, 0.04544207081198692, -0.10469547659158707, -0.0686597153544426 ]
null
null
transformers
# Hubert-Large-Finetuned

[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)

The large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.

The model is a fine-tuned version of [hubert-large-ll60k](https://huggingface.co/facebook/hubert-large-ll60k).

[Paper](https://arxiv.org/abs/2106.07447)

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

**Abstract**

Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.

# Usage

The model can be used for automatic speech recognition as follows:

```python
import torch
from transformers import Wav2Vec2Processor, HubertForCTC
from datasets import load_dataset

# Load the processor (feature extractor + tokenizer) and the fine-tuned CTC model
processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-large-ls960-ft")
model = HubertForCTC.from_pretrained("facebook/hubert-large-ls960-ft")

# Small LibriSpeech validation subset for demonstration
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")

# The audio is already sampled at 16 kHz; batch size 1
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt").input_values

with torch.no_grad():
    logits = model(input_values).logits

# Greedy CTC decoding: take the most likely token per frame, then collapse
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.decode(predicted_ids[0])
# -> "A MAN SAID TO THE UNIVERSE SIR I EXIST"
```
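To sanity-check the reported word error rate, the snippet above can be extended into a small evaluation loop. The following is a minimal sketch, not part of the original card: it assumes the `evaluate` library (with `jiwer` installed for the WER metric) and that the dummy split exposes reference transcriptions in its `text` column.

```python
import torch
import evaluate  # pip install evaluate jiwer
from transformers import Wav2Vec2Processor, HubertForCTC
from datasets import load_dataset

processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-large-ls960-ft")
model = HubertForCTC.from_pretrained("facebook/hubert-large-ls960-ft")
model.eval()

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
wer_metric = evaluate.load("wer")

predictions, references = [], []
for sample in ds:
    input_values = processor(
        sample["audio"]["array"], sampling_rate=16_000, return_tensors="pt"
    ).input_values
    with torch.no_grad():
        logits = model(input_values).logits
    predictions.append(processor.decode(torch.argmax(logits, dim=-1)[0]))
    references.append(sample["text"])

# Returns the error rate as a fraction, e.g. 0.019 for the 1.9% test WER
# listed in the model index; the dummy split is far too small to be conclusive.
print(wer_metric.compute(predictions=predictions, references=references))
```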
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["libri-light", "librispeech_asr"], "model-index": [{"name": "hubert-large-ls960-ft", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 1.9, "name": "Test WER"}]}]}]}
automatic-speech-recognition
facebook/hubert-large-ls960-ft
[ "transformers", "pytorch", "tf", "hubert", "automatic-speech-recognition", "speech", "audio", "hf-asr-leaderboard", "en", "dataset:libri-light", "dataset:librispeech_asr", "arxiv:2106.07447", "license:apache-2.0", "model-index", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2106.07447" ]
[ "en" ]
TAGS #transformers #pytorch #tf #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
# Hubert-Large-Finetuned

Facebook's Hubert

The large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.

The model is a fine-tuned version of hubert-large-ll60k.

Paper

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

Abstract
Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at URL.

# Usage

The model can be used for automatic speech recognition as follows:
[ "# Hubert-Large-Finetuned\n\nFacebook's Hubert\n\nThe large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nThe model is a fine-tuned version of hubert-large-ll60k.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nThe model can be used for automatic-speech-recognition as follows:" ]
[ "TAGS\n#transformers #pytorch #tf #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n", "# Hubert-Large-Finetuned\n\nFacebook's Hubert\n\nThe large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nThe model is a fine-tuned version of hubert-large-ll60k.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nThe model can be used for automatic-speech-recognition as follows:" ]
[ 97, 468, 21 ]
[ "passage: TAGS\n#transformers #pytorch #tf #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n" ]
[ -0.11951669305562973, 0.1454727053642273, -0.004487011581659317, 0.02851836010813713, 0.0534772127866745, -0.02960859425365925, 0.10965791344642639, 0.11866012215614319, 0.050869353115558624, -0.01156110130250454, 0.08097206801176071, 0.14867731928825378, -0.005137368105351925, 0.04399462044239044, -0.037219882011413574, -0.1853320449590683, 0.056972768157720566, 0.010823899880051613, 0.003599982475861907, 0.07112065702676773, 0.10594962537288666, -0.043215103447437286, 0.059053558856248856, 0.01872187666594982, -0.045082591474056244, 0.03600800782442093, 0.042795468121767044, -0.12794212996959686, 0.12049543112516403, 0.06281840801239014, -0.019931893795728683, 0.048826105892658234, 0.027045970782637596, -0.11619884520769119, 0.02230253256857395, 0.0025337212719023228, -0.017490891739726067, 0.05481173098087311, -0.019012143835425377, -0.022290097549557686, 0.07547593116760254, 0.022491224110126495, -0.013297326862812042, 0.07414928823709488, -0.057662755250930786, -0.2575339674949646, -0.048830848187208176, 0.12727291882038116, -0.043790169060230255, 0.08153699338436127, -0.01013822853565216, 0.12533973157405853, -0.08176468312740326, 0.09467421472072601, 0.13436366617679596, -0.24660652875900269, 0.02367350459098816, -0.023479240015149117, 0.0029280728194862604, 0.04008546844124794, -0.036130309104919434, 0.041025448590517044, 0.025840148329734802, 0.016119955107569695, 0.03366291522979736, -0.07467800378799438, -0.19387078285217285, 0.031481679528951645, -0.10049191117286682, -0.037816133350133896, 0.2766824960708618, 0.018861111253499985, 0.026215078309178352, -0.05283088609576225, -0.04747067391872406, 0.06975127011537552, -0.03672833740711212, 0.03640776872634888, -0.005454147234559059, 0.0285933967679739, 0.023717887699604034, -0.047414619475603104, -0.10030502825975418, -0.04476981982588768, -0.14244335889816284, 0.13003024458885193, -0.015112096443772316, 0.053475432097911835, -0.10884355753660202, 0.02678457275032997, 0.006289943121373653, -0.11452246457338333, -0.0025463735219091177, -0.014999237842857838, 0.012076067738234997, 0.06490173935890198, -0.014745178632438183, 0.05344076454639435, 0.1429184228181839, 0.10455270111560822, 0.04374084249138832, 0.0030383046250790358, -0.05344189703464508, 0.08795387297868729, -0.03221945837140083, 0.12059988081455231, -0.07114624977111816, -0.04121587425470352, 0.05975497141480446, 0.013447241857647896, 0.0759337842464447, -0.045927032828330994, -0.10704361647367477, -0.031736552715301514, 0.011148052290081978, 0.029263991862535477, 0.06515667587518692, 0.035978395491838455, -0.021231872960925102, 0.009178844280540943, 0.10644912719726562, -0.12491670995950699, 0.009112116880714893, 0.049246057868003845, 0.03938077762722969, 0.0344502255320549, 0.07971081882715225, 0.017740774899721146, -0.05211932584643364, -0.012606061063706875, -0.024546287953853607, 0.00024671011487953365, 0.04100019484758377, -0.04338652640581131, 0.06527495384216309, -0.06637164950370789, 0.04237417131662369, -0.15995872020721436, -0.04708246514201164, 0.014121811836957932, 0.010342823341488838, -0.0008907804731279612, -0.08451936393976212, -0.006651194766163826, -0.06986715644598007, 0.06437085568904877, -0.10058405250310898, 0.03501878306269646, -0.07317385822534561, 0.0809396356344223, 0.03217194229364395, 0.10135632008314133, -0.16889579594135284, 0.08302582055330276, -0.07211542874574661, -0.017631947994232178, -0.0668611228466034, 0.08326197415590286, -0.11271659284830093, 0.0819966271519661, -0.058606572449207306, -0.030315902084112167, 
-0.07472983002662659, 0.04541230574250221, -0.023061292245984077, 0.09821965545415878, -0.1792878806591034, -0.11566898971796036, 0.10735072940587997, -0.08924935013055801, -0.13461747765541077, 0.11669027805328369, 0.027249345555901527, 0.017526915296912193, 0.06840936094522476, 0.2950775921344757, 0.034537672996520996, -0.10397769510746002, -0.022843174636363983, 0.11068189889192581, -0.05186968669295311, -0.1117277592420578, 0.06459973752498627, -0.055863156914711, -0.003916772082448006, 0.022609863430261612, -0.011759458109736443, 0.10563649237155914, 0.020301083102822304, -0.08846137672662735, -0.049628615379333496, -0.0863238051533699, -0.009467470459640026, 0.007169682066887617, 0.016121881082654, -0.026087375357747078, -0.028448883444070816, 0.010359234176576138, 0.0803854838013649, -0.017477916553616524, 0.05254779011011124, -0.11110352724790573, 0.06937956064939499, 0.0017771945567801595, 0.02419254556298256, -0.14931926131248474, 0.1692226082086563, -0.058706287294626236, 0.0169459767639637, 0.09946579486131668, 0.05017787590622902, 0.06286536157131195, -0.06650341302156448, -0.027813974767923355, -0.012157986871898174, 0.1262037605047226, 0.07053577154874802, 0.012415752746164799, -0.22289064526557922, 0.03967822715640068, -0.05723433569073677, 0.09850002825260162, -0.052772633731365204, -0.020827988162636757, 0.07289160043001175, 0.09616539627313614, -0.007818472571671009, 0.043511658906936646, 0.041846081614494324, -0.0103153046220541, 0.01096426136791706, -0.010731141082942486, 0.04152749851346016, 0.01377926953136921, -0.06266134232282639, 0.20074620842933655, -0.14747455716133118, 0.22954295575618744, 0.22669529914855957, -0.07398246973752975, 0.07300995290279388, 0.1146758422255516, -0.0007983248215168715, -0.013175644911825657, 0.07349249720573425, -0.050858695060014725, 0.12907426059246063, -0.030483728274703026, 0.12077835947275162, -0.05889581888914108, -0.013147738762199879, 0.006938230711966753, -0.022515753284096718, -0.010610655881464481, 0.121821828186512, -0.007825289852917194, -0.1301804929971695, 0.12139482796192169, 0.1631963700056076, -0.06439395248889923, 0.16446641087532043, -0.05603399500250816, -0.04324490576982498, 0.05096398666501045, -0.05336461588740349, -0.03421865031123161, 0.16444416344165802, -0.24630238115787506, -0.05775845795869827, 0.0882968083024025, -0.04780863597989082, 0.04740799963474274, -0.14137931168079376, -0.007000196725130081, -0.01985625922679901, -0.060119953006505966, -0.1109614446759224, 0.09240584820508957, -0.02214144915342331, 0.08826258033514023, -0.06112721934914589, -0.21345143020153046, 0.04374700412154198, -0.027220681309700012, -0.10309170931577682, 0.06624196469783783, -0.1127607673406601, -0.2593905031681061, -0.08181579411029816, -0.06259514391422272, -0.03554078936576843, 0.007327874191105366, 0.10513213276863098, -0.08896898478269577, -0.02321820892393589, -0.043086711317300797, -0.014967641793191433, -0.009113654494285583, 0.002639368874952197, 0.024167072027921677, -0.002183087170124054, 0.06218516826629639, -0.14740334451198578, -0.029154447838664055, -0.032260600477457047, 0.0698278471827507, 0.059265244752168655, 0.006192544940859079, 0.07372235506772995, 0.1518818736076355, 0.06957472115755081, 0.04285561293363571, -0.008867105469107628, 0.18478864431381226, -0.08538022637367249, 0.0011419354705139995, 0.13989096879959106, -0.007461744360625744, 0.005056182388216257, 0.1750093400478363, 0.05029300972819328, -0.013523822650313377, -0.03109143115580082, -0.023824337869882584, -0.057657964527606964, 
-0.16633006930351257, -0.12928029894828796, -0.1255609095096588, -0.0238910261541605, 0.016542017459869385, 0.09029281884431839, 0.06593779474496841, -0.0241383109241724, -0.000947886030189693, -0.03218648582696915, 0.012693008407950401, -0.0046242475509643555, 0.26647013425827026, -0.06190481409430504, 0.1019718274474144, -0.09594203531742096, -0.06825210899114609, 0.09182269126176834, 0.06887726485729218, 0.041859522461891174, 0.08826718479394913, 0.04763233661651611, 0.029167113825678825, 0.158735990524292, 0.06411296874284744, 0.07629986852407455, 0.01325473003089428, -0.012896834872663021, -0.008119440637528896, -0.10242476314306259, -0.03258267790079117, 0.07733941823244095, 0.10900623351335526, -0.053392160683870316, 0.018237827345728874, -0.0935870036482811, 0.03535793721675873, 0.155568465590477, 0.09739618748426437, -0.17752104997634888, 0.0028160910587757826, 0.04518726468086243, -0.04198061302304268, -0.027133138850331306, 0.06824108958244324, 0.03813275322318077, -0.022126028314232826, 0.08051794022321701, 0.041092079132795334, 0.07683204114437103, -0.022839784622192383, 0.04862361401319504, -0.08418937772512436, -0.05806959420442581, 0.031568776816129684, 0.04217223450541496, -0.2738925814628601, 0.23526529967784882, 0.02412068471312523, 0.026711484417319298, -0.023006219416856766, -0.0073932018131017685, 0.09881295263767242, 0.11557575315237045, 0.12787047028541565, 0.029176507145166397, 0.0012631947174668312, -0.07378785312175751, -0.08847244828939438, 0.06997014582157135, 0.0024694271851330996, 0.0742299035191536, -0.05941011384129524, -0.02389127016067505, -0.044022150337696075, 0.05184943228960037, 0.040639765560626984, -0.13693110644817352, -0.11391986906528473, 0.03636164590716362, 0.2613977789878845, 0.008450683206319809, -0.029324883595108986, -0.07419278472661972, -0.12449335306882858, 0.001374084735289216, -0.0917268618941307, -0.023065049201250076, -0.07000154256820679, -0.14943595230579376, 0.12706318497657776, -0.03968685865402222, 0.023802751675248146, -0.014817114919424057, -0.02768789418041706, -0.035436730831861496, -0.1499687135219574, 0.11004260927438736, -0.10501088201999664, -0.025552192702889442, -0.006282862275838852, 0.1756753772497177, -0.015755509957671165, 0.08705601841211319, 0.020116860046982765, 0.03911999985575676, -0.08436581492424011, -0.05461573973298073, 0.09882490336894989, 0.05552132800221443, -0.07463334500789642, 0.014262626878917217, -0.02727729082107544, -0.14459745585918427, -0.014028643257915974, 0.001994020538404584, 0.18633168935775757, 0.11734712868928909, -0.0731908455491066, 0.18273252248764038, 0.20028765499591827, -0.02482428587973118, -0.27157580852508545, -0.12484806030988693, -0.09794902801513672, -0.010029326193034649, -0.0640292838215828, -0.15146514773368835, 0.10663363337516785, -0.07856615632772446, -0.09780658781528473, 0.06172672659158707, -0.17410820722579956, -0.09747807681560516, 0.32593169808387756, -0.10907837003469467, 0.22218461334705353, -0.12770739197731018, -0.06575486063957214, -0.07521712779998779, -0.14078541100025177, 0.10049993544816971, -0.1315561980009079, 0.09445705264806747, -0.01099066250026226, 0.053191978484392166, 0.008430277928709984, -0.01956171728670597, 0.09520857036113739, 0.060435570776462555, -0.024882951751351357, -0.047265101224184036, -0.02986239828169346, 0.05587492883205414, 0.004819236230105162, 0.1171574667096138, -0.13041406869888306, 0.03325331583619118, -0.11338212341070175, -0.006656052079051733, -0.11745139211416245, 0.08592440187931061, 0.05563903599977493, 
-0.020740244537591934, -0.010980279184877872, -0.02477349154651165, 0.008316014893352985, 0.013422805815935135, 0.15131475031375885, -0.07603864371776581, -0.008627298288047314, 0.13657677173614502, 0.13156405091285706, -0.22062182426452637, -0.0961351990699768, -0.033376868814229965, -0.06092512980103493, 0.11069757491350174, -0.11661738157272339, 0.08361339569091797, 0.05287355184555054, 0.05876399204134941, 0.07272055000066757, 0.05002664774656296, -0.060791365802288055, -0.030267583206295967, 0.1035112738609314, -0.11854258924722672, -0.095227912068367, -0.009962055832147598, 0.030504491180181503, 0.0430963970720768, 0.08377330750226974, 0.1517850160598755, -0.04597072675824165, 0.003553495043888688, 0.004943255800753832, 0.012281326577067375, -0.12188193947076797, 0.11691470444202423, 0.13454170525074005, 0.032897815108299255, -0.1676805168390274, 0.08835116773843765, -0.008089547976851463, -0.05984450504183769, 0.026668837293982506, 0.03882022574543953, -0.07170198112726212, -0.1197190135717392, -0.08451810479164124, 0.027223704382777214, -0.07571709901094437, -0.1093909740447998, -0.03381113335490227, -0.11084765195846558, 0.050684116780757904, 0.1310683786869049, 0.06407367438077927, 0.04731928929686546, -0.0886451005935669, -0.07051675021648407, 0.03457752987742424, 0.029446203261613846, -0.05825350433588028, 0.01020379550755024, -0.13286077976226807, -0.024259405210614204, -0.0036977953277528286, 0.07039836794137955, -0.06973826140165329, -0.03710586577653885, -0.08141186088323593, 0.043042052537202835, -0.13023051619529724, -0.016953112557530403, -0.058114588260650635, 0.0287779588252306, 0.02407171204686165, -0.1076485738158226, -0.03192967548966408, 0.042149197310209274, -0.10621484369039536, -0.018815811723470688, 0.03317532315850258, 0.10430342704057693, -0.1588265299797058, -0.028555240482091904, 0.03037896193563938, -0.023674648255109787, 0.126139834523201, 0.14331580698490143, -0.14681582152843475, 0.0666809156537056, -0.1809680312871933, -0.19574707746505737, 0.1262538880109787, 0.05407477542757988, 0.04242865368723869, -0.06943899393081665, -0.021350426599383354, 0.10521835088729858, 0.05482785403728485, 0.03049846924841404, 0.11032178997993469, -0.06456213444471359, 0.01141330972313881, -0.11486544460058212, -0.06110408902168274, -0.013642574660480022, -0.007480943575501442, 0.1534537971019745, 0.07639926671981812, 0.12784282863140106, -0.03864666074514389, -0.03700874000787735, -0.09882940351963043, 0.032198403030633926, -0.0525595061480999, -0.14437894523143768, -0.12310270965099335, 0.0018521674210205674, 0.028272166848182678, -0.03420981019735336, 0.20293687283992767, -0.002632726216688752, -0.11019377410411835, 0.036864787340164185, 0.04128604754805565, -0.024668028578162193, -0.0015177142340689898, 0.291591614484787, 0.04800349101424217, -0.010178125463426113, 0.016123542562127113, 0.0004276527324691415, 0.039557892829179764, 0.1447211503982544, -0.0051447609439492226, 0.15559054911136627, 0.0927509292960167, 0.08889496326446533, 0.14543843269348145, -0.06260541826486588, -0.056906238198280334, 0.014454721473157406, -0.024184590205550194, 0.09223147481679916, -0.07215007394552231, 0.13577938079833984, 0.14853428304195404, -0.003727734787389636, 0.06745156645774841, -0.07369476556777954, -0.014394477941095829, -0.13612540066242218, -0.11558147519826889, -0.059405963867902756, -0.1414344608783722, -0.0024973927065730095, -0.04359251260757446, 0.019394539296627045, 0.07131992280483246, 0.022657044231891632, 0.002206019824370742, 0.01721786893904209, 
-0.00007707802433287725, -0.0438358411192894, 0.08061964809894562, -0.02903526835143566, -0.049770232290029526, -0.10370992124080658, 0.011330822482705116, 0.08726491779088974, 0.024039115756750107, -0.027002213522791862, -0.0006515486747957766, -0.06575309485197067, 0.05558959022164345, -0.10302910953760147, -0.0657002404332161, -0.021924864500761032, 0.00695126224309206, 0.061493661254644394, 0.12388154119253159, 0.0883445143699646, -0.044239144772291183, 0.06228227913379669, 0.1899348795413971, -0.09227511286735535, -0.1822562962770462, -0.053134817630052567, 0.13404454290866852, -0.04534146934747696, 0.04993404075503349, -0.031851354986429214, -0.051607999950647354, -0.06217401847243309, 0.19581156969070435, 0.30268123745918274, -0.07121067494153976, 0.047996774315834045, -0.0655510351061821, 0.026668010279536247, -0.045561064034700394, -0.004905915353447199, 0.15965117514133453, 0.2266111522912979, -0.011475619859993458, -0.05386835336685181, -0.05511312186717987, -0.03414656221866608, -0.050868406891822815, 0.06706329435110092, 0.007203899789601564, -0.10870586335659027, -0.026393767446279526, 0.07962123304605484, -0.08916327357292175, -0.048981212079524994, -0.12280920892953873, -0.1447460800409317, -0.05691242590546608, -0.017921019345521927, 0.10525640845298767, 0.107674241065979, -0.009836621582508087, -0.05707504227757454, -0.01421040017157793, 0.029832329601049423, -0.011984305456280708, -0.22019188106060028, -0.0037655881606042385, 0.05635393410921097, -0.12473499029874802, 0.07999535650014877, -0.02611680142581463, 0.10424339026212692, 0.05520205572247505, 0.10318280011415482, -0.05074518918991089, 0.10097429156303406, 0.007013389375060797, -0.13136081397533417, -0.002401861594989896, 0.06087769195437431, 0.024145949631929398, 0.030361667275428772, 0.06227996200323105, -0.02458910644054413, 0.04971952363848686, 0.0023430672008544207, -0.05741877481341362, -0.09608601778745651, -0.008820620365440845, -0.06266969442367554, 0.055460140109062195, -0.002593748737126589, -0.04178425297141075, -0.0389699786901474, -0.049474649131298065, -0.013524840585887432, 0.060627829283475876, -0.15140527486801147, -0.08159154653549194, -0.04908895120024681, -0.018870865926146507, -0.1280069500207901, -0.024210134521126747, -0.1445862203836441, -0.06069216504693031, -0.08291270583868027, -0.01597004570066929, -0.035785432904958725, 0.02113899402320385, 0.08305481821298599, 0.016084255650639534, 0.0038169282488524914, -0.05290323123335838, 0.05831032991409302, 0.08102557808160782, -0.13600876927375793, -0.06568288058042526 ]
null
null
transformers
# Hubert-Extra-Large

[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)

The extra large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Speaker Identification, Intent Classification, Emotion Recognition, etc.

The model was pretrained on [Libri-Light](https://github.com/facebookresearch/libri-light).

[Paper](https://arxiv.org/abs/2106.07447)

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

**Abstract**

Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.

# Usage

See [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information on how to fine-tune the model. Note that the class `Wav2Vec2ForCTC` has to be replaced by `HubertForCTC`.
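Since this checkpoint ships without a CTC head, the most direct way to use it before fine-tuning is as a frame-level feature extractor. The following is a minimal sketch, not part of the original card: the random dummy waveform and the use of `Wav2Vec2FeatureExtractor` for preprocessing are illustrative assumptions; substitute real speech sampled at 16 kHz in practice.

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, HubertModel

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/hubert-xlarge-ll60k")
model = HubertModel.from_pretrained("facebook/hubert-xlarge-ll60k")
model.eval()

# Dummy one-second waveform; replace with real speech sampled at 16 kHz
waveform = torch.randn(16_000)

inputs = feature_extractor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    # last_hidden_state has shape (batch, frames, hidden_size)
    features = model(**inputs).last_hidden_state
print(features.shape)
```

These frame-level features can feed a lightweight downstream classifier, or the model can be fine-tuned end-to-end as described in the blog post linked above.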
{"language": "en", "license": "apache-2.0", "tags": ["speech"], "datasets": ["libri-light"]}
feature-extraction
facebook/hubert-xlarge-ll60k
[ "transformers", "pytorch", "tf", "hubert", "feature-extraction", "speech", "en", "dataset:libri-light", "arxiv:2106.07447", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2106.07447" ]
[ "en" ]
TAGS #transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Hubert-Extra-Large

Facebook's Hubert

The extra large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Speaker Identification, Intent Classification, Emotion Recognition, etc.

The model was pretrained on Libri-Light.

Paper

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

Abstract
Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at URL.

# Usage

See this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'.
[ "# Hubert-Extra-Large \n\nFacebook's Hubert\n\nThe extra large model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Speaker Identification, Intent Classification, Emotion Recognition, etc...\n\nThe model was pretrained on Libri-Light.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ "TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Hubert-Extra-Large \n\nFacebook's Hubert\n\nThe extra large model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition, Speaker Identification, Intent Classification, Emotion Recognition, etc...\n\nThe model was pretrained on Libri-Light.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nSee this blog for more information on how to fine-tune the model. Note that the class 'Wav2Vec2ForCTC' has to be replaced by 'HubertForCTC'." ]
[ 66, 490, 47 ]
[ "passage: TAGS\n#transformers #pytorch #tf #hubert #feature-extraction #speech #en #dataset-libri-light #arxiv-2106.07447 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ -0.07008452713489532, 0.13328340649604797, -0.004982049576938152, 0.05836464837193489, 0.021394439041614532, -0.008874346502125263, 0.07127107679843903, 0.1400901973247528, -0.04994043707847595, -0.018295355141162872, 0.1355748176574707, 0.1908401995897293, -0.020739799365401268, 0.0055093904957175255, -0.04216954484581947, -0.23270860314369202, 0.07171531021595001, 0.04435189068317413, -0.04172111302614212, 0.07718829065561295, 0.09972647577524185, -0.05821356922388077, 0.052465807646512985, 0.016896549612283707, -0.0823591947555542, 0.031256139278411865, 0.02845412865281105, -0.09586656093597412, 0.12425026297569275, 0.07366764545440674, 0.015192515216767788, 0.04177175834774971, -0.04336237907409668, -0.13215085864067078, 0.018561160191893578, 0.013791228644549847, -0.052305348217487335, 0.0602174773812294, -0.012064700946211815, -0.0026777489110827446, 0.1437937617301941, -0.009906419552862644, -0.02189253829419613, 0.045634590089321136, -0.07563649863004684, -0.26865020394325256, -0.0818825215101242, 0.13502883911132812, -0.05653386563062668, 0.1113860160112381, 0.02660949155688286, 0.16125696897506714, -0.07800764590501785, 0.0748918280005455, 0.23716676235198975, -0.3414047658443451, -0.014958901330828667, 0.025596313178539276, 0.08029089123010635, 0.00840219296514988, -0.04202081635594368, 0.05068632960319519, 0.04448726773262024, 0.02230546995997429, 0.04720161482691765, -0.06363968551158905, -0.17967018485069275, 0.08305087685585022, -0.08240093290805817, -0.05659438669681549, 0.2885434925556183, 0.019355617463588715, 0.04895232617855072, -0.017766650766134262, -0.07156161218881607, 0.04045403003692627, -0.005096367560327053, 0.03481182083487511, 0.021976640447974205, 0.06062671169638634, 0.020484615117311478, -0.08730769902467728, -0.1510000079870224, 0.009598679840564728, -0.17870667576789856, 0.1372177004814148, 0.0010513011366128922, 0.0833236575126648, -0.12179678678512573, 0.039856430143117905, 0.006113309413194656, -0.13573125004768372, -0.024691792204976082, -0.04955282434821129, 0.1211135983467102, 0.09010660648345947, -0.0769643560051918, 0.07808160036802292, 0.10110817849636078, 0.13758359849452972, -0.01750674843788147, -0.008318806998431683, -0.035972725600004196, 0.10903298854827881, -0.031728584319353104, 0.0821169763803482, -0.03641117736697197, -0.045528676360845566, 0.07367575168609619, -0.0882294625043869, 0.08370726555585861, -0.04351669177412987, -0.1258246898651123, -0.05630411207675934, -0.05415735021233559, 0.04986459016799927, 0.10911546647548676, 0.03849087283015251, -0.013887211680412292, 0.01920154318213463, 0.12724089622497559, -0.06302479654550552, 0.01635756529867649, 0.004212548490613699, 0.04348432645201683, 0.05869206786155701, 0.10479701310396194, 0.006381809711456299, -0.030432002618908882, -0.016120772808790207, -0.04398191720247269, -0.006343306973576546, -0.01861530914902687, -0.05703536793589592, 0.09013179689645767, -0.08003289252519608, 0.04264124110341072, -0.17148201167583466, -0.05405709519982338, 0.041997719556093216, 0.0564439482986927, -0.009853078052401543, -0.08086778223514557, 0.040192682296037674, -0.0950113832950592, 0.08845195919275284, -0.07547654211521149, 0.00877002440392971, -0.0689588263630867, 0.07813376188278198, -0.04175303876399994, 0.07332543283700943, -0.2265794426202774, 0.05904572457075119, -0.08224357664585114, 0.0019999449141323566, -0.0339263491332531, -0.008437392301857471, -0.06676941365003586, 0.10563334077596664, -0.020080069079995155, -0.06176068261265755, -0.06347876787185669, 0.00959921907633543, 
0.0043284171260893345, 0.11323662847280502, -0.16222088038921356, -0.05126942694187164, 0.11037691682577133, -0.09218363463878632, -0.19582390785217285, 0.07808255404233932, 0.013392954133450985, 0.0012965210480615497, 0.024105429649353027, 0.24295100569725037, 0.08951512724161148, -0.11495060473680496, -0.02231382392346859, 0.17025767266750336, -0.052461665123701096, -0.15188369154930115, 0.07386838644742966, -0.04353940486907959, 0.026758184656500816, 0.03171843662858009, -0.009467856958508492, 0.10673972964286804, 0.012288223952054977, -0.07420583814382553, -0.08036325871944427, -0.044991184026002884, 0.008155102841556072, -0.002988775260746479, 0.0273527130484581, -0.06483662128448486, -0.026556875556707382, 0.025410121306777, 0.07686760276556015, 0.01422121562063694, 0.09343540668487549, -0.06447923928499222, 0.07037170976400375, 0.019089791923761368, 0.014406553469598293, -0.13232053816318512, 0.09857039898633957, -0.04119941592216492, 0.040463149547576904, 0.04476848617196083, 0.12133215367794037, 0.05623868480324745, -0.08598529547452927, -0.025208884850144386, 0.00882842019200325, 0.05629776045680046, 0.08094457536935806, -0.0073610832914710045, -0.18757057189941406, 0.01609593629837036, -0.049897100776433945, -0.020300015807151794, -0.020530693233013153, -0.010727196000516415, 0.07365968078374863, 0.10231833904981613, -0.03523999825119972, 0.04675488546490669, -0.02597782388329506, -0.03236430883407593, -0.04572595655918121, -0.011193440295755863, 0.05290280282497406, 0.06398512423038483, -0.04177769273519516, 0.24205200374126434, -0.06104175001382828, 0.2639541029930115, 0.23135845363140106, -0.16132089495658875, 0.09692835062742233, 0.06900762766599655, -0.035829175263643265, -0.0003955892752856016, 0.07717249542474747, -0.03623083606362343, 0.06294352561235428, -0.030246861279010773, 0.09027480334043503, -0.0553167425096035, -0.037057049572467804, -0.00467958627268672, -0.016176952049136162, -0.040610771626234055, 0.07131995260715485, 0.06106796860694885, -0.13924844563007355, 0.14854662120342255, 0.2958432137966156, -0.024507811293005943, 0.11009114980697632, -0.06507943570613861, -0.01601802371442318, -0.0010905512608587742, -0.03712984174489975, -0.05562489107251167, 0.15624509751796722, -0.2536951005458832, -0.02462875097990036, 0.09572838991880417, -0.0025476724840700626, 0.0644628033041954, -0.13217195868492126, -0.0436895415186882, 0.030538950115442276, -0.021138930693268776, -0.08552030473947525, 0.07187628000974655, -0.01032167300581932, 0.08490526676177979, -0.024305926635861397, -0.1199529767036438, 0.07494199275970459, 0.007067006081342697, -0.07147134095430374, 0.09915509819984436, -0.16784243285655975, -0.2356334924697876, -0.05548693984746933, -0.06580100953578949, -0.028355294838547707, -0.011452145874500275, 0.1345309168100357, -0.0384552925825119, -0.022072305902838707, -0.027141589671373367, -0.05018456652760506, -0.0924147367477417, 0.02569025196135044, -0.020917261019349098, 0.0127571951597929, 0.014015085995197296, -0.15263476967811584, -0.04654829949140549, -0.0018349119927734137, 0.046137068420648575, 0.03271561115980148, -0.0067383041605353355, 0.09410345554351807, 0.09635835886001587, 0.04679098352789879, 0.04858700558543205, -0.030748112127184868, 0.19493700563907623, -0.03479328751564026, 0.0141395078971982, 0.19845040142536163, 0.030926473438739777, 0.04839291423559189, 0.12174159288406372, 0.04733917862176895, -0.009910805150866508, -0.021759895607829094, -0.04207328334450722, -0.06368790566921234, -0.17398013174533844, -0.10667183250188828, 
-0.1660265028476715, 0.005846844986081123, -0.0009538258891552687, 0.08158265054225922, 0.0726458877325058, 0.014913901686668396, -0.018538177013397217, -0.032682985067367554, -0.037913329899311066, 0.030722297728061676, 0.2954229414463043, -0.08289171755313873, 0.09499944001436234, -0.11255021393299103, -0.042677540332078934, 0.11725366115570068, 0.06228414177894592, 0.1302807480096817, 0.06359291076660156, 0.049876511096954346, 0.06703241914510727, 0.24282188713550568, 0.024187492206692696, 0.08734181523323059, 0.017813032492995262, -0.019703460857272148, -0.032934121787548065, -0.08326607197523117, -0.00421117851510644, 0.07182583957910538, 0.07944235950708389, -0.09011998772621155, 0.0055579873733222485, -0.15802469849586487, 0.06153049319982529, 0.14690083265304565, 0.07943771779537201, -0.12770819664001465, 0.00498352712020278, 0.06280402094125748, 0.013754556886851788, -0.04393274709582329, 0.07859901338815689, 0.07037317752838135, -0.03597613424062729, 0.07392523437738419, 0.010769733227789402, 0.09211792051792145, 0.06689698249101639, 0.03775260969996452, -0.06160658597946167, -0.14183439314365387, 0.04480874538421631, 0.0870881900191307, -0.2555846571922302, 0.22224199771881104, -0.02229744754731655, -0.019171878695487976, -0.05977129936218262, 0.004608705174177885, 0.06486519426107407, 0.13841654360294342, 0.1169542744755745, 0.03920431435108185, -0.008697778917849064, -0.0025081299245357513, -0.02597946487367153, 0.058098696172237396, 0.021030021831393242, 0.05293175205588341, -0.04049824923276901, -0.030724238604307175, -0.017866607755422592, 0.06056947633624077, 0.19451913237571716, -0.07263338565826416, -0.1549951285123825, 0.06100039556622505, 0.16373153030872345, -0.058175407350063324, -0.05152539908885956, -0.04740051180124283, -0.08637990057468414, 0.12922701239585876, -0.04566546529531479, -0.06104294955730438, -0.0801854282617569, -0.13663692772388458, 0.11712079495191574, -0.051698386669158936, 0.06468093395233154, -0.03550960496068001, -0.019400490447878838, -0.07728907465934753, -0.17948026955127716, 0.11016831547021866, -0.13221049308776855, 0.0074144466780126095, -0.011019871570169926, 0.14354142546653748, -0.077016681432724, 0.06351558119058609, 0.025282787159085274, 0.034452471882104874, -0.14188240468502045, -0.113873191177845, -0.005391952581703663, -0.023233497515320778, 0.016182061284780502, -0.054293323308229446, -0.018412109464406967, -0.033561013638973236, 0.03413902595639229, -0.008793933317065239, 0.2085028439760208, 0.1297818273305893, -0.12355884909629822, 0.15632303059101105, 0.11236817389726639, -0.04140632972121239, -0.2536526322364807, -0.12133648991584778, -0.13617752492427826, -0.060740891844034195, -0.014092725701630116, -0.11107929795980453, 0.08920936286449432, -0.015971990302205086, -0.09920358657836914, 0.10033977776765823, -0.23060736060142517, -0.08364938944578171, 0.20580166578292847, -0.06701282411813736, 0.28647106885910034, -0.1451350748538971, -0.05408930033445358, -0.03610985353589058, -0.20194697380065918, 0.15012799203395844, -0.12779571115970612, 0.06330570578575134, -0.007843765430152416, 0.002583194524049759, 0.00840829312801361, -0.04885847121477127, 0.12445568293333054, -0.02420433610677719, 0.0035947374999523163, -0.09105808287858963, -0.07106649875640869, 0.09099946916103363, -0.02272670343518257, 0.06420999020338058, -0.1537371426820755, 0.050663214176893234, -0.15119579434394836, 0.022411001846194267, -0.13200458884239197, 0.08242607861757278, -0.007877189666032791, -0.05219317600131035, -0.07254723459482193, 
-0.027602853253483772, 0.05105215311050415, 0.012873269617557526, 0.21256886422634125, -0.004042395390570164, 0.01236108411103487, 0.1297801285982132, 0.021366531029343605, -0.20483236014842987, -0.12225611507892609, -0.041982512921094894, -0.07378073781728745, 0.09317993372678757, -0.17531973123550415, 0.02478361874818802, 0.07620187103748322, 0.011760166846215725, 0.03277114778757095, 0.06573484092950821, -0.034258343279361725, -0.03639501705765724, 0.11692622303962708, -0.15267208218574524, -0.038076844066381454, -0.04068329930305481, 0.04321755841374397, 0.039510760456323624, 0.0634094700217247, 0.14371076226234436, -0.05559324100613594, 0.0031388706993311644, -0.002338707447052002, 0.0021004914306104183, -0.12996681034564972, 0.04281960800290108, 0.09563679248094559, 0.01319595891982317, -0.13349664211273193, 0.06176011264324188, -0.013472664169967175, -0.126723051071167, -0.005167579744011164, 0.08007393032312393, -0.07411546260118484, -0.12212911248207092, -0.03604612872004509, -0.04837731644511223, -0.12786997854709625, -0.08124510943889618, -0.02016931027173996, -0.11437664180994034, 0.06957434117794037, 0.14313623309135437, 0.07927954196929932, 0.07325275242328644, -0.05271363630890846, -0.03744438290596008, 0.051450878381729126, 0.008211075328290462, -0.079121895134449, 0.00983112957328558, -0.1114526093006134, 0.013714713044464588, 0.002709200605750084, 0.10214061290025711, -0.049987200647592545, 0.00890939962118864, -0.07563889026641846, 0.025436142459511757, -0.12299557775259018, -0.026047980412840843, -0.07817023247480392, -0.013414459303021431, 0.01018771342933178, -0.10859355330467224, -0.051216769963502884, 0.04864383488893509, -0.10839194804430008, -0.04438900947570801, 0.013209357857704163, 0.060949236154556274, -0.1515788733959198, -0.05424753949046135, 0.06383244693279266, -0.03428414836525917, 0.10986502468585968, 0.1409796178340912, -0.09164576232433319, 0.053878750652074814, -0.10671757906675339, -0.16226182878017426, 0.1016421839594841, 0.05870271474123001, 0.038213059306144714, -0.009483118541538715, -0.016275066882371902, 0.09822993725538254, 0.02004779316484928, 0.008758696727454662, 0.022884495556354523, -0.10196149349212646, -0.040488410741090775, -0.04266177490353584, -0.08965464681386948, -0.0006266444688662887, -0.08131132274866104, 0.12862466275691986, 0.03039027564227581, 0.10764510184526443, 0.004559967666864395, -0.008010108023881912, -0.09074350446462631, 0.02322225272655487, -0.039908356964588165, -0.14575521647930145, -0.08797885477542877, -0.018193485215306282, 0.0030049774795770645, -0.027591994032263756, 0.19418463110923767, -0.01598292589187622, -0.1413457691669464, 0.04231206327676773, 0.047917384654283524, -0.019116459414362907, -0.012931706383824348, 0.31337040662765503, 0.047131117433309555, -0.02579544298350811, -0.036237139254808426, 0.031466737389564514, 0.011992651969194412, 0.04036421701312065, 0.045102789998054504, 0.16178485751152039, 0.14717817306518555, 0.09267465025186539, 0.11997339874505997, -0.03464384377002716, -0.12110721319913864, -0.11716420203447342, 0.0306834876537323, 0.10338301956653595, -0.05809510126709938, 0.10178632289171219, 0.15577512979507446, -0.052203208208084106, 0.046352650970220566, -0.04478617012500763, 0.012359749525785446, -0.10546264797449112, -0.08992195129394531, -0.030006350949406624, -0.12239044904708862, -0.03052845597267151, -0.07780242711305618, 0.05716675519943237, 0.07800406217575073, 0.03058869205415249, -0.01529171597212553, 0.010645756497979164, 0.019704669713974, -0.07811499387025833, 
0.05678724870085716, -0.002311612479388714, -0.0018877548864111304, -0.0759756788611412, 0.005798527970910072, -0.002206770470365882, -0.021476836875081062, -0.0367211289703846, 0.024358857423067093, -0.013234623707830906, 0.04363205283880234, -0.08801079541444778, -0.05961860343813896, -0.052699096500873566, 0.021708399057388306, 0.06278076022863388, 0.19746197760105133, 0.054009079933166504, 0.010260621085762978, 0.0575016625225544, 0.1917266696691513, -0.11777287721633911, -0.15823301672935486, -0.03281024098396301, 0.10286055505275726, 0.0232688095420599, 0.044598955661058426, -0.012666384689509869, -0.009210141375660896, -0.08977440744638443, 0.2460186779499054, 0.31390586495399475, -0.04990452900528908, 0.04615180939435959, 0.018070153892040253, 0.0271205585449934, 0.013150081038475037, 0.0362471379339695, 0.16588932275772095, 0.2698742747306824, -0.05608178675174713, -0.05573868751525879, -0.07304761558771133, 0.028551388531923294, -0.10311003029346466, 0.05045146122574806, 0.007103755604475737, -0.1379128247499466, 0.009800916537642479, 0.057911403477191925, -0.055404555052518845, 0.02910761907696724, -0.02234596200287342, -0.2036673128604889, -0.04147718474268913, -0.00002635142664075829, 0.15929843485355377, 0.0675598680973053, 0.042162638157606125, -0.04964977130293846, -0.032601550221443176, 0.09307427704334259, -0.005322758108377457, -0.21947400271892548, -0.04006445035338402, 0.1074860543012619, -0.09653604030609131, 0.12433288991451263, -0.026376595720648766, 0.05312417447566986, 0.08879059553146362, 0.1042705699801445, -0.08713356405496597, 0.08101862668991089, 0.04056797921657562, -0.13776379823684692, -0.04650668427348137, -0.07720891386270523, -0.012860535643994808, -0.0036052456125617027, 0.05768025293946266, -0.04168606176972389, 0.04988689348101616, 0.11349838972091675, 0.0033780736848711967, -0.06072520464658737, -0.018884291872382164, -0.08244480937719345, 0.05494649335741997, 0.003729873802512884, -0.04321615397930145, -0.07629320025444031, -0.0394035279750824, -0.0478053018450737, 0.06345637142658234, -0.12960326671600342, -0.09329739958047867, -0.00027781553217209876, -0.01969565451145172, -0.04148796945810318, 0.002856305567547679, -0.0789089947938919, -0.06646028906106949, -0.020819272845983505, 0.011044261045753956, -0.06886670738458633, 0.0035215250682085752, 0.0662151575088501, -0.020679686218500137, 0.013200347311794758, -0.043876081705093384, 0.01949518546462059, 0.04544207081198692, -0.10469547659158707, -0.0686597153544426 ]
null
null
transformers
# Hubert-Extra-Large-Finetuned

[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)

The extra-large model fine-tuned on 960h of LibriSpeech, on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.

The model is a fine-tuned version of [hubert-xlarge-ll60k](https://huggingface.co/facebook/hubert-xlarge-ll60k).

[Paper](https://arxiv.org/abs/2106.07447)

Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed

**Abstract**

Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.

The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.

# Usage

The model can be used for automatic speech recognition as follows:

```python
import torch
from transformers import Wav2Vec2Processor, HubertForCTC
from datasets import load_dataset

processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-xlarge-ls960-ft")
model = HubertForCTC.from_pretrained("facebook/hubert-xlarge-ls960-ft")

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")

input_values = processor(ds[0]["audio"]["array"], return_tensors="pt").input_values  # Batch size 1
logits = model(input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.decode(predicted_ids[0])  # -> "A MAN SAID TO THE UNIVERSE SIR I EXIST"
```
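Because the model only accepts 16kHz input, audio recorded at other sample rates has to be resampled before it reaches the processor. Below is a minimal sketch of that step; it assumes `torchaudio` is installed and uses the placeholder file name `speech.wav`, neither of which is part of the original card:

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, HubertForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-xlarge-ls960-ft")
model = HubertForCTC.from_pretrained("facebook/hubert-xlarge-ls960-ft")

# "speech.wav" is a placeholder for any local audio file.
waveform, sample_rate = torchaudio.load("speech.wav")  # (channels, num_samples)
waveform = waveform.mean(dim=0)                        # downmix to mono
if sample_rate != 16_000:
    # Resample to the 16kHz rate the model was fine-tuned on.
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

input_values = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt").input_values
with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.decode(logits.argmax(dim=-1)[0])
print(transcription)
```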
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["libri-light", "librispeech_asr"], "model-index": [{"name": "hubert-xlarge-ls960-ft", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 1.8, "name": "Test WER"}]}]}]}
automatic-speech-recognition
facebook/hubert-xlarge-ls960-ft
[ "transformers", "pytorch", "tf", "safetensors", "hubert", "automatic-speech-recognition", "speech", "audio", "hf-asr-leaderboard", "en", "dataset:libri-light", "dataset:librispeech_asr", "arxiv:2106.07447", "license:apache-2.0", "model-index", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2106.07447" ]
[ "en" ]
TAGS #transformers #pytorch #tf #safetensors #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
# Hubert-Extra-Large-Finetuned Facebook's Hubert The extra large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. The model is a fine-tuned version of hubert-xlarge-ll60k. Paper Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed Abstract Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets. The original model can be found under URL . # Usage The model can be used for automatic-speech-recognition as follows:
[ "# Hubert-Extra-Large-Finetuned\n\nFacebook's Hubert\n\nThe extra large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nThe model is a fine-tuned version of hubert-xlarge-ll60k.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nThe model can be used for automatic-speech-recognition as follows:" ]
[ "TAGS\n#transformers #pytorch #tf #safetensors #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n", "# Hubert-Extra-Large-Finetuned\n\nFacebook's Hubert\n\nThe extra large model fine-tuned on 960h of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16Khz. \n\nThe model is a fine-tuned version of hubert-xlarge-ll60k.\n\nPaper\n\nAuthors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed\n\nAbstract\nSelf-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.\n\nThe original model can be found under URL .", "# Usage\n\nThe model can be used for automatic-speech-recognition as follows:" ]
[ 102, 473, 21 ]
[ "passage: TAGS\n#transformers #pytorch #tf #safetensors #hubert #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-libri-light #dataset-librispeech_asr #arxiv-2106.07447 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n" ]
[ -0.12077707052230835, 0.14572905004024506, -0.005017600953578949, 0.02667686715722084, 0.046810489147901535, -0.039981186389923096, 0.12172199040651321, 0.10930124670267105, 0.031147578731179237, -0.00945903081446886, 0.08295365422964096, 0.1236763522028923, -0.011218080297112465, 0.08180810511112213, -0.043972671031951904, -0.15098018944263458, 0.08160193264484406, -0.0009421662543900311, 0.008213171735405922, 0.07361588627099991, 0.1056138426065445, -0.0524243526160717, 0.047978855669498444, -0.009992191568017006, -0.031003963202238083, 0.03078005090355873, 0.047390177845954895, -0.12639281153678894, 0.11030689626932144, 0.05870140343904495, 0.005327819846570492, 0.05941949412226677, 0.0169853363186121, -0.10830596834421158, 0.0322846956551075, 0.01671217940747738, -0.013043944723904133, 0.05205754190683365, -0.020399313420057297, -0.024179130792617798, 0.04177365079522133, 0.013512929901480675, -0.01568041369318962, 0.06367021054029465, -0.051848843693733215, -0.26261764764785767, -0.04094526171684265, 0.12805598974227905, -0.05072086304426193, 0.07949298620223999, -0.016533197835087776, 0.14678499102592468, -0.07812386006116867, 0.10463929921388626, 0.13386325538158417, -0.2147250771522522, 0.021293358877301216, -0.037235014140605927, -0.0015648057451471686, 0.05001048743724823, -0.03597026318311691, 0.024736514315009117, 0.023944642394781113, 0.003568163374438882, 0.050724003463983536, -0.06720012426376343, -0.2043301910161972, 0.004974901210516691, -0.1096772775053978, -0.040404994040727615, 0.2575310468673706, 0.021691851317882538, 0.01972508803009987, -0.06410667300224304, -0.06462382525205612, 0.0831802487373352, -0.03260083496570587, 0.03939763829112053, -0.015686091035604477, 0.010021020658314228, 0.03315914049744606, -0.04482585936784744, -0.10581827908754349, -0.046657830476760864, -0.12837232649326324, 0.1586628556251526, -0.012712928466498852, 0.041005589067935944, -0.09786314517259598, 0.02948235161602497, 0.02850775048136711, -0.1264147162437439, 0.01770744100213051, -0.018014440312981606, 0.027555517852306366, 0.07317429780960083, 0.0044417777098715305, 0.02714456059038639, 0.16959959268569946, 0.09061069041490555, 0.004051821772009134, -0.0005804902175441384, -0.06932394206523895, 0.08313130587339401, -0.044231392443180084, 0.09887510538101196, -0.07712511718273163, -0.025533080101013184, 0.0881040170788765, 0.05706872418522835, 0.09917774051427841, -0.03938606381416321, -0.08043195307254791, -0.010920151136815548, 0.054134853184223175, 0.05548475682735443, 0.0695078894495964, 0.04355374351143837, -0.016850121319293976, 0.027643976733088493, 0.11414606869220734, -0.14292871952056885, -0.006063986569643021, 0.05938170105218887, 0.0544639527797699, -0.0019953097216784954, 0.07351130247116089, 0.0074031841941177845, -0.059613365679979324, -0.002153486479073763, -0.02057463303208351, -0.014780119061470032, 0.040010131895542145, -0.05112016573548317, 0.06752058118581772, -0.03124525398015976, 0.04027562588453293, -0.17624160647392273, -0.04833047464489937, 0.016564929857850075, -0.0008200437878258526, 0.017017528414726257, -0.06175515428185463, 0.000549057440366596, -0.07286317646503448, 0.04774121940135956, -0.10223782062530518, 0.013499781489372253, -0.08133312314748764, 0.09096019715070724, 0.049235135316848755, 0.08996842801570892, -0.1612062305212021, 0.059279073029756546, -0.08972181379795074, -0.017087873071432114, -0.09480828791856766, 0.06028515100479126, -0.1259630024433136, 0.09301172941923141, -0.03602222725749016, -0.02115842141211033, -0.0855337530374527, 
0.04866113141179085, -0.020028825849294662, 0.10851360857486725, -0.16305531561374664, -0.10160194337368011, 0.14291587471961975, -0.12428594380617142, -0.14567221701145172, 0.1322421133518219, 0.035616640001535416, 0.002968834014609456, 0.07954319566488266, 0.30446168780326843, 0.0437496192753315, -0.100141741335392, -0.04234614595770836, 0.09821926802396774, -0.05386640876531601, -0.09343726933002472, 0.056829169392585754, -0.036883991211652756, -0.02125578187406063, 0.014711625874042511, -0.022481581196188927, 0.10789734125137329, 0.01807579956948757, -0.08851096034049988, -0.0547456257045269, -0.09866245836019516, 0.0025870141107589006, 0.008380299434065819, 0.004858288448303938, -0.042674995958805084, -0.034465860575437546, -0.017508378252387047, 0.08912698924541473, -0.046701084822416306, 0.03865981474518776, -0.11952963471412659, 0.09039245545864105, -0.002217394532635808, 0.0383518785238266, -0.1430942565202713, 0.12328577786684036, -0.05592457577586174, -0.01619819365441799, 0.07609671354293823, 0.019891977310180664, 0.07585498690605164, -0.044085487723350525, -0.03641140088438988, -0.039666175842285156, 0.13121992349624634, 0.06916797161102295, 0.022023068740963936, -0.2166776806116104, 0.06289403140544891, -0.06225527822971344, 0.11082888394594193, -0.06611141562461853, -0.011164878495037556, 0.0960114523768425, 0.10482827574014664, 0.009906393475830555, 0.03773817420005798, 0.053852781653404236, -0.016453593969345093, 0.027413466945290565, -0.02298472821712494, 0.03249744325876236, 0.01432088389992714, -0.06048902869224548, 0.1979818195104599, -0.18857282400131226, 0.2761460244655609, 0.2342420220375061, -0.033926524221897125, 0.07478697597980499, 0.12507086992263794, -0.004218378569930792, -0.008711948990821838, 0.07015057653188705, -0.06867020577192307, 0.09204153716564178, -0.031244970858097076, 0.11370937526226044, -0.06795179098844528, -0.020030608400702477, 0.011065732687711716, -0.02429201453924179, -0.004226251970976591, 0.12940697371959686, -0.05626015365123749, -0.12260334193706512, 0.12705206871032715, 0.16152800619602203, -0.07437451928853989, 0.17230379581451416, -0.07458416372537613, -0.043931398540735245, 0.05635958909988403, -0.021542515605688095, -0.022463766857981682, 0.17396026849746704, -0.20478370785713196, -0.037155650556087494, 0.07615657150745392, -0.04458906874060631, 0.04576306790113449, -0.1467643529176712, -0.00832043681293726, -0.03254716470837593, -0.07028912007808685, -0.08556579798460007, 0.07242576777935028, -0.02368459850549698, 0.10197132080793381, -0.08361741900444031, -0.2200399786233902, 0.04125245288014412, -0.02956182323396206, -0.1005532518029213, 0.07613909244537354, -0.10603702068328857, -0.306525856256485, -0.07404512166976929, -0.016714341938495636, -0.02784191444516182, 0.006779549177736044, 0.10766606777906418, -0.08181190490722656, -0.022327987477183342, -0.07936102151870728, -0.011135292239487171, 0.038313914090394974, 0.007989464327692986, 0.03367375582456589, 0.004403913393616676, 0.08830519765615463, -0.14751656353473663, -0.021310798823833466, -0.02836575172841549, 0.06821950525045395, 0.0704631507396698, 0.0205626729875803, 0.07772435247898102, 0.14117072522640228, 0.051480039954185486, 0.026199493557214737, -0.013905107043683529, 0.16449259221553802, -0.08935336023569107, 0.009147889912128448, 0.14141957461833954, -0.02343500405550003, 0.005064616911113262, 0.17100532352924347, 0.04127458482980728, -0.022960105910897255, -0.010624116286635399, -0.027014972642064095, -0.05590551346540451, -0.15948662161827087, 
-0.13346821069717407, -0.0977422371506691, -0.0012649975251406431, 0.006271261256188154, 0.08747004717588425, 0.04455985501408577, -0.02557825669646263, -0.01189421396702528, -0.05895805358886719, 0.04442429170012474, -0.013803456909954548, 0.24068838357925415, -0.06265190243721008, 0.09777072072029114, -0.0994105264544487, -0.084104523062706, 0.08933275938034058, 0.051726333796978, 0.021195024251937866, 0.053836192935705185, 0.024523863568902016, 0.022121019661426544, 0.15339772403240204, 0.06676237285137177, 0.09415344148874283, 0.003810015507042408, -0.020693978294730186, -0.00651832390576601, -0.11056245863437653, -0.0407496802508831, 0.04900550842285156, 0.0678972601890564, -0.03387369588017464, 0.02128995954990387, -0.07460928708314896, 0.058987654745578766, 0.13742153346538544, 0.08975907415151596, -0.1776110827922821, 0.002963114995509386, 0.04590548947453499, -0.038941312581300735, -0.022591622546315193, 0.07200362533330917, 0.07699097692966461, 0.005028128158301115, 0.06511193513870239, 0.03009662963449955, 0.059286851435899734, -0.04920709505677223, 0.05127714201807976, -0.12088197469711304, -0.039564747363328934, 0.01846790686249733, 0.04066918417811394, -0.26151031255722046, 0.2256292998790741, 0.026724694296717644, 0.054292671382427216, -0.012494048103690147, -0.010404939763247967, 0.09970947355031967, 0.11921938508749008, 0.13523715734481812, 0.025973621755838394, 0.0115898996591568, -0.08041374385356903, -0.1154739111661911, 0.08080985397100449, 0.006941898725926876, 0.08796092867851257, -0.05378258600831032, -0.02318192459642887, -0.047689590603113174, 0.05140256881713867, -0.0008237806032411754, -0.13613946735858917, -0.09542128443717957, 0.03416211158037186, 0.26189693808555603, 0.04127006232738495, -0.05087074264883995, -0.07740847021341324, -0.1134403869509697, -0.016981881111860275, -0.10042861849069595, -0.022553496062755585, -0.06391866505146027, -0.15493841469287872, 0.1166360005736351, -0.03559880331158638, 0.019812149927020073, -0.010556979104876518, -0.012008226476609707, -0.034973062574863434, -0.14304660260677338, 0.09803368896245956, -0.11224149912595749, -0.054009515792131424, 0.00036125138285569847, 0.18682989478111267, -0.024288618937134743, 0.07694035023450851, 0.01778860203921795, 0.0281615499407053, -0.06472287327051163, -0.05917717516422272, 0.09227592498064041, 0.05580445006489754, -0.06366796791553497, 0.0293655376881361, -0.030358627438545227, -0.1823599487543106, -0.008706390857696533, -0.0014971804339438677, 0.18380297720432281, 0.1451236754655838, -0.055768437683582306, 0.14630313217639923, 0.2341475635766983, -0.005162407178431749, -0.29722997546195984, -0.13247938454151154, -0.10806487500667572, -0.020614907145500183, -0.06268947571516037, -0.11484117060899734, 0.10655534267425537, -0.05499789118766785, -0.09799884259700775, 0.06414449214935303, -0.14610148966312408, -0.0975627452135086, 0.32380345463752747, -0.07441113889217377, 0.21293044090270996, -0.15508610010147095, -0.06250280886888504, -0.07745930552482605, -0.10984013974666595, 0.10753149539232254, -0.17449915409088135, 0.08066120743751526, 0.006771127227693796, 0.01688370481133461, 0.004376246593892574, -0.03416304662823677, 0.08676095306873322, 0.016993997618556023, -0.00896431040018797, -0.044221363961696625, -0.02578824572265148, 0.05480363219976425, -0.0005188201903365552, 0.13625788688659668, -0.13236215710639954, 0.046951957046985626, -0.07957752048969269, -0.005157444626092911, -0.108027383685112, 0.0824962928891182, 0.05688634514808655, -0.01919451355934143, 
0.004875496961176395, -0.025813430547714233, -0.0029913177713751793, 0.019249074161052704, 0.1560511738061905, -0.08304820209741592, 0.000050817317969631404, 0.13129016757011414, 0.14350667595863342, -0.21488448977470398, -0.06541217118501663, -0.027064407244324684, -0.06674949079751968, 0.11058424413204193, -0.10709889233112335, 0.107331283390522, 0.042610276490449905, 0.04530109837651253, 0.0779879167675972, 0.04271893948316574, -0.06602015346288681, -0.04086216911673546, 0.11502394825220108, -0.15105539560317993, -0.10590845346450806, -0.003949436824768782, 0.04082712158560753, 0.026399364694952965, 0.10717631876468658, 0.15878376364707947, -0.04683321714401245, 0.003269412089139223, -0.0010354315163567662, 0.025171801447868347, -0.10151022672653198, 0.13518604636192322, 0.14223520457744598, 0.040082044899463654, -0.16287264227867126, 0.09802766144275665, -0.017628049477934837, -0.06859991699457169, 0.028014983981847763, 0.04446034133434296, -0.07612306624650955, -0.12472199648618698, -0.07405771315097809, 0.03312692418694496, -0.0110141197219491, -0.1295306533575058, -0.05767279863357544, -0.10770657658576965, 0.026787044480443, 0.16169755160808563, 0.061028748750686646, 0.0593172125518322, -0.0530419796705246, -0.06465063989162445, 0.01877068169414997, 0.05398203432559967, -0.05422450974583626, 0.011027226224541664, -0.1461108773946762, -0.022327270358800888, -0.0005231409450061619, 0.045108355581760406, -0.074905164539814, -0.016978487372398376, -0.09232164174318314, 0.03352592885494232, -0.10794153809547424, 0.0023814504966139793, -0.04279550537467003, 0.03547579422593117, 0.021117115393280983, -0.10111305862665176, -0.016343412920832634, 0.043522924184799194, -0.0907987505197525, -0.006533356849104166, 0.03360344469547272, 0.08222596347332001, -0.1634102165699005, -0.032586414366960526, 0.019121328368782997, -0.028750861063599586, 0.13471545279026031, 0.13143600523471832, -0.1478690803050995, 0.07583678513765335, -0.21761183440685272, -0.18227623403072357, 0.13877414166927338, 0.04343380406498909, 0.02337164618074894, -0.07262848317623138, -0.026285959407687187, 0.11610028147697449, 0.06051851063966751, 0.033252567052841187, 0.12115547060966492, -0.061967551708221436, 0.038191813975572586, -0.10187985002994537, -0.051251932978630066, -0.013368586078286171, -0.02657965011894703, 0.14006659388542175, 0.059037115424871445, 0.14117594063282013, -0.058982282876968384, -0.04600823298096657, -0.10597433894872665, 0.03531492128968239, -0.040438711643218994, -0.1693798154592514, -0.13584434986114502, 0.026287291198968887, 0.03685836121439934, -0.02911675162613392, 0.19002021849155426, -0.009017424657940865, -0.1067533865571022, 0.04540310427546501, 0.011595935560762882, -0.010407530702650547, 0.011788520030677319, 0.26908573508262634, 0.024175576865673065, -0.01782841980457306, 0.00039345587720163167, -0.022970683872699738, 0.04254725202918053, 0.10611008107662201, -0.006989442743360996, 0.16648335754871368, 0.07510743290185928, 0.0851062461733818, 0.14246633648872375, -0.04398978129029274, -0.037494245916604996, 0.0028278545942157507, -0.05359815061092377, 0.07822921872138977, -0.061495091766119, 0.1407431960105896, 0.16230984032154083, 0.022186795249581337, 0.05941324681043625, -0.08471280336380005, -0.01963537372648716, -0.12437161803245544, -0.10875112563371658, -0.06606220453977585, -0.1329459697008133, -0.009698591195046902, -0.056942570954561234, -0.014158662408590317, 0.074685238301754, 0.023679211735725403, -0.000821031047962606, 0.06811203807592392, 0.036057520657777786, 
-0.028483279049396515, 0.08820010721683502, -0.024732418358325958, -0.05603603273630142, -0.07508978247642517, -0.011323753744363785, 0.09330625087022781, 0.03402819111943245, -0.023675210773944855, 0.006624523084610701, -0.06752482056617737, 0.06702570617198944, -0.10266174376010895, -0.07232581079006195, -0.018403485417366028, 0.011616591364145279, 0.05358509346842766, 0.117392398416996, 0.08847670257091522, -0.04721837118268013, 0.07141055166721344, 0.20911343395709991, -0.07891552150249481, -0.20162831246852875, -0.057332802563905716, 0.12462260574102402, -0.056148283183574677, 0.06765349954366684, -0.03939080983400345, -0.06230758875608444, -0.044954799115657806, 0.18758375942707062, 0.2962237000465393, -0.06764354556798935, 0.06680004298686981, -0.08299155533313751, 0.029244469478726387, -0.06363750994205475, -0.0004931112052872777, 0.16881142556667328, 0.23182295262813568, -0.002004594774916768, -0.04076819494366646, -0.0436755008995533, -0.02967868745326996, -0.04722573235630989, 0.05011726915836334, -0.0023631788790225983, -0.0816831886768341, -0.03690289705991745, 0.08025254309177399, -0.05866548418998718, -0.07832415401935577, -0.1517670899629593, -0.14142602682113647, -0.05181185528635979, -0.016248004510998726, 0.12298806011676788, 0.09291240572929382, -0.016932466998696327, -0.05409332364797592, -0.009987279772758484, -0.01153340470045805, -0.01548632699996233, -0.19524770975112915, -0.006176479160785675, 0.025771567597985268, -0.12959034740924835, 0.1040831208229065, -0.01508951373398304, 0.09849374741315842, 0.055083923041820526, 0.06744372844696045, -0.05925416946411133, 0.13972504436969757, 0.008145791478455067, -0.13820675015449524, 0.0007404878851957619, 0.0668814480304718, 0.02030791901051998, 0.05132794380187988, 0.05844194069504738, -0.021563313901424408, 0.039627574384212494, 0.00761266378685832, -0.0729343518614769, -0.08817724138498306, -0.004822654649615288, -0.059934768825769424, 0.05216280743479729, -0.0034171920269727707, -0.04419473558664322, -0.0185871459543705, -0.04987965524196625, 0.00018332814215682447, 0.04916193708777428, -0.14201128482818604, -0.05470229685306549, -0.07017229497432709, -0.01258524414151907, -0.11375666409730911, -0.0191828403621912, -0.16307397186756134, -0.04771123453974724, -0.09942281246185303, -0.0140214329585433, -0.05864318087697029, 0.017280468717217445, 0.0895596593618393, 0.024813586845993996, 0.0018440866842865944, -0.05543629080057144, 0.04989612475037575, 0.08601193875074387, -0.12439094483852386, -0.07635762542486191 ]
null
null
null
# <p align="center"> IC-GAN: Instance-Conditioned GAN </p>

Official PyTorch code of [Instance-Conditioned GAN](https://arxiv.org/abs/2109.05070) by Arantxa Casanova, Marlène Careil, Jakob Verbeek, Michał Drożdżal, Adriana Romero-Soriano.

![IC-GAN results](./figures/github_image.png?raw=true)

## Generate images with IC-GAN in a Colab Notebook

We provide a [Google Colab notebook](https://colab.research.google.com/github/facebookresearch/ic_gan/blob/main/inference/icgan_colab.ipynb) to generate images with IC-GAN and its class-conditional counterpart. We also invite users to check out the [demo on Replicate](https://replicate.ai/arantxacasanova/ic_gan), courtesy of [Replicate](https://replicate.ai/home).

The figure below depicts two instances, unseen during training and downloaded from [Creative Commons search](https://search.creativecommons.org), and the images generated by IC-GAN and class-conditional IC-GAN when conditioning on the class "castle":

![IC-GAN results transfer](./figures/icgan_transfer_all_github.png?raw=true)

Additionally, and inspired by [this Colab](https://colab.research.google.com/github/eyaler/clip_biggan/blob/main/ClipBigGAN.ipynb), the same Colab notebook provides the functionality to guide generations with text captions, using the [CLIP model](https://github.com/openai/CLIP). As an example, the following figure shows three instance conditionings and a text caption (top), followed by the resulting images generated with IC-GAN (bottom), when optimizing the noise vector along CLIP's gradient for 100 iterations.

![IC-GAN results transfer CLIP](./figures/icgan_clip.png?raw=true)

*Credit for the three instance conditionings, from left to right, that were modified with a resize and central crop:* [1: "Landscape in Bavaria" by shining.darkness, licensed under CC BY 2.0](https://search.creativecommons.org/photos/92ef279c-4469-49a5-aa4b-48ad746f2dc4), [2: "Fantasy Landscape - slolsss" by Douglas Tofoli is marked with CC PDM 1.0](https://search.creativecommons.org/photos/13646adc-f1df-437a-a0dd-8223452ee46c), [3: "How to Draw Landscapes Simply" by Kuwagata Keisai is marked with CC0 1.0](https://search.creativecommons.org/photos/2ab9c3b7-de99-4536-81ed-604ee988bd5f)

## Requirements

* Python 3.8
* Cuda v10.2 / Cudnn v7.6.5
* gcc v7.3.0
* PyTorch 1.8.0
* A conda environment can be created from `environment.yml` with the command `conda env create -f environment.yml`; it contains the aforementioned version of PyTorch and the other required packages.
* Faiss: follow the instructions in the [original repository](https://github.com/facebookresearch/faiss).

## Overview

This repository consists of four main folders:

* `data_utils`: A common folder to obtain and format the data needed to train and test IC-GAN, agnostic of the specific backbone.
* `inference`: Scripts to test the models both qualitatively and quantitatively.
* `BigGAN_PyTorch`: It provides the training, evaluation and sampling scripts for IC-GAN with a BigGAN backbone. The code base comes from the [Pytorch BigGAN repository](https://github.com/ajbrock/BigGAN-PyTorch), made available under the MIT License. It has been modified to [add additional utilities](#biggan-changelog) and to enable IC-GAN training on top of it.
* `stylegan2_ada_pytorch`: It provides the training, evaluation and sampling scripts for IC-GAN with a StyleGAN2 backbone.
The code base comes from [StyleGAN2 Pytorch](https://github.com/NVlabs/stylegan2-ada-pytorch), made available under the [Nvidia Source Code License](https://nvlabs.github.io/stylegan2-ada-pytorch/license.html). It has been modified to [add additional utilities](#stylegan-changelog) and to enable IC-GAN training on top of it.

## (Python script) Generate images with IC-GAN

Alternatively, we can <b>generate images with IC-GAN models</b> directly from a Python script, by following these steps:

1) Download the desired pretrained models (links below) and the [pre-computed 1000 instance features from ImageNet](https://dl.fbaipublicfiles.com/ic_gan/stored_instances.tar.gz) and extract them into a folder `pretrained_models_path`.

| model | backbone | class-conditional? | training dataset | resolution | url |
|-------------------|-------------------|-------------------|---------------------|--------------------|--------------------|
| IC-GAN | BigGAN | No | ImageNet | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_imagenet_res256.tar.gz) |
| IC-GAN (half capacity) | BigGAN | No | ImageNet | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_imagenet_res256_halfcap.tar.gz) |
| IC-GAN | BigGAN | No | ImageNet | 128x128 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_imagenet_res128.tar.gz) |
| IC-GAN | BigGAN | No | ImageNet | 64x64 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_imagenet_res64.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenet_res256.tar.gz) |
| IC-GAN (half capacity) | BigGAN | Yes | ImageNet | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenet_res256_halfcap.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet | 128x128 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenet_res128.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet | 64x64 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenet_res64.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet-LT | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenetlt_res256.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet-LT | 128x128 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenetlt_res128.tar.gz) |
| IC-GAN | BigGAN | Yes | ImageNet-LT | 64x64 | [model](https://dl.fbaipublicfiles.com/ic_gan/cc_icgan_biggan_imagenetlt_res64.tar.gz) |
| IC-GAN | BigGAN | No | COCO-Stuff | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_coco_res256.tar.gz) |
| IC-GAN | BigGAN | No | COCO-Stuff | 128x128 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_biggan_coco_res128.tar.gz) |
| IC-GAN | StyleGAN2 | No | COCO-Stuff | 256x256 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_stylegan2_coco_res256.tar.gz) |
| IC-GAN | StyleGAN2 | No | COCO-Stuff | 128x128 | [model](https://dl.fbaipublicfiles.com/ic_gan/icgan_stylegan2_coco_res128.tar.gz) |

2) Execute:

```
python inference/generate_images.py --root_path [pretrained_models_path] --model [model] --model_backbone [backbone] --resolution [res]
```

* `model` can be chosen from `["icgan", "cc_icgan"]` to use the IC-GAN or the class-conditional IC-GAN model respectively.
* `backbone` can be chosen from `["biggan", "stylegan2"]`.
* `res` indicates the resolution at which the model has been trained. For ImageNet, choose one in `[64, 128, 256]`, and for COCO-Stuff, one in `[128, 256]`.

A small scripted sweep over these options is sketched right below.
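As a convenience, the command above can also be driven from Python, for example to render grids for several model variants in one go. The following is a minimal sketch, not part of the repository; it only assumes that the pretrained models were extracted into a local `pretrained_models` folder and reuses the documented flags:

```python
# Hypothetical driver: sweep inference/generate_images.py over a few model variants.
import subprocess

PRETRAINED_ROOT = "pretrained_models"  # assumed extraction folder

configs = [
    ("icgan", "biggan", 256),     # IC-GAN, BigGAN backbone, ImageNet 256x256
    ("cc_icgan", "biggan", 128),  # class-conditional IC-GAN, ImageNet 128x128
    ("icgan", "stylegan2", 128),  # IC-GAN, StyleGAN2 backbone, COCO-Stuff 128x128
]

for model, backbone, res in configs:
    subprocess.run(
        [
            "python", "inference/generate_images.py",
            "--root_path", PRETRAINED_ROOT,
            "--model", model,
            "--model_backbone", backbone,
            "--resolution", str(res),
        ],
        check=True,  # stop the sweep if one run fails
    )
```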
This script results in a .PNG file where several generated images are shown, given an instance feature (each row) and a sampled noise vector (each grid position).

<b>Additional and optional parameters</b>:

* `index`: (None by default) is an integer from 0 to 999 that chooses a specific instance feature vector out of the 1000 instances that have been selected with k-means on the ImageNet dataset and stored in `pretrained_models_path/stored_instances`.
* `swap_target`: (None by default) is an integer from 0 to 999 indicating an ImageNet class label. This label will be used to condition the class-conditional IC-GAN, regardless of which instance features are being used.
* `which_dataset`: (ImageNet by default) can be chosen from `["imagenet", "coco"]` to indicate which dataset (training split) to sample the instances from.
* `trained_dataset`: (ImageNet by default) can be chosen from `["imagenet", "coco"]` to indicate the dataset on which the IC-GAN model has been trained.
* `num_imgs_gen`: (5 by default) changes the number of noise vectors to sample per conditioning. Increasing this number results in a bigger .PNG file to save and load.
* `num_conditionings_gen`: (5 by default) changes the number of conditionings to sample. Increasing this number results in a bigger .PNG file to save and load.
* `z_var`: (1.0 by default) controls the truncation factor for the generation.
* Optionally, the script can be run with the additional options `--visualize_instance_images --dataset_path [dataset_path]` to visualize the ground-truth images corresponding to the conditioning instance features, given a path to the dataset's ground-truth images `dataset_path`. Ground-truth instances will be plotted as the leftmost image of each row.

## Data preparation

<div id="data-preparation">
<details>
<summary>ImageNet</summary>
<br>
<ol>
<li>Download the dataset from <a href="https://image-net.org/download.php"> here </a>. </li>
<li>Download <a href="https://github.com/facebookresearch/swav"> SwAV </a> feature extractor weights from <a href="https://dl.fbaipublicfiles.com/deepcluster/swav_800ep_pretrain.pth.tar"> here </a>. </li>
<li> Replace the paths in data_utils/prepare_data.sh: <code>out_path</code> by the path where hdf5 files will be stored, <code>path_imnet</code> by the path where the ImageNet dataset is downloaded, and <code>path_swav</code> by the path where the SwAV weights are stored. </li>
<li> Execute <code>./data_utils/prepare_data.sh imagenet [resolution]</code>, where <code>[resolution]</code> can be an integer in {64,128,256}. This script will create several hdf5 files:
<ul>
<li> <code>ILSVRC[resolution]_xy.hdf5</code> and <code>ILSVRC[resolution]_val_xy.hdf5</code>, where images and labels are stored for the training and validation set respectively. </li>
<li> <code>ILSVRC[resolution]_feats_[feature_extractor]_resnet50.hdf5</code> that contains the instance features for each image. </li>
<li> <code>ILSVRC[resolution]_feats_[feature_extractor]_resnet50_nn_k[k_nn].hdf5</code> that contains the list of [k_nn] neighbors for each of the instance features. </li>
</ul>
</li>
</ol>
</br>
</details>

<details>
<summary>ImageNet-LT</summary>
<br>
<ol>
<li>Download the ImageNet dataset from <a href="https://image-net.org/download.php"> here </a>.
Following <a href="https://github.com/zhmiao/OpenLongTailRecognition-OLTR"> ImageNet-LT </a>, the file <code>ImageNet_LT_train.txt</code> can be downloaded from <a href="https://drive.google.com/drive/u/1/folders/1j7Nkfe6ZhzKFXePHdsseeeGI877Xu1yf" > this link </a> and later stored in the folder <code>./BigGAN_PyTorch/imagenet_lt</code>. </li>
<li>Download the pre-trained weights of the ResNet on ImageNet-LT from <a href="https://dl.fbaipublicfiles.com/classifier-balancing/ImageNet_LT/models/resnet50_uniform_e90.pth"> this link</a>, provided by the <a href="https://github.com/facebookresearch/classifier-balancing"> classifier-balancing repository </a>. </li>
<li> Replace the paths in data_utils/prepare_data.sh: <code>out_path</code> by the path where hdf5 files will be stored, <code>path_imnet</code> by the path where the ImageNet dataset is downloaded, and <code>path_classifier_lt</code> by the path where the pre-trained ResNet50 weights are stored. </li>
<li> Execute <code>./data_utils/prepare_data.sh imagenet_lt [resolution]</code>, where <code>[resolution]</code> can be an integer in {64,128,256}. This script will create several hdf5 files:
<ul>
<li> <code>ILSVRC[resolution]longtail_xy.hdf5</code>, where images and labels are stored for the training set. </li>
<li> <code>ILSVRC[resolution]longtail_feats_[feature_extractor]_resnet50.hdf5</code> that contains the instance features for each image. </li>
<li> <code>ILSVRC[resolution]longtail_feats_[feature_extractor]_resnet50_nn_k[k_nn].hdf5</code> that contains the list of [k_nn] neighbors for each of the instance features. </li>
</ul>
</li>
</ol>
</br>
</details>

<details>
<summary>COCO-Stuff</summary>
<br>
<ol>
<li>Download the dataset following the <a href="https://github.com/WillSuen/LostGANs/blob/master/INSTALL.md"> LostGANs' repository instructions </a>. </li>
<li>Download <a href="https://github.com/facebookresearch/swav"> SwAV </a> feature extractor weights from <a href="https://dl.fbaipublicfiles.com/deepcluster/swav_800ep_pretrain.pth.tar"> here </a>. </li>
<li> Replace the paths in data_utils/prepare_data.sh: <code>out_path</code> by the path where hdf5 files will be stored, <code>path_imnet</code> by the path where the ImageNet dataset is downloaded, and <code>path_swav</code> by the path where the SwAV weights are stored. </li>
<li> Execute <code>./data_utils/prepare_data.sh coco [resolution]</code>, where <code>[resolution]</code> can be an integer in {128,256}. This script will create several hdf5 files:
<ul>
<li> <code>COCO[resolution]_xy.hdf5</code> and <code>COCO[resolution]_val_test_xy.hdf5</code>, where images and labels are stored for the training and evaluation set respectively. </li>
<li> <code>COCO[resolution]_feats_[feature_extractor]_resnet50.hdf5</code> that contains the instance features for each image. </li>
<li> <code>COCO[resolution]_feats_[feature_extractor]_resnet50_nn_k[k_nn].hdf5</code> that contains the list of [k_nn] neighbors for each of the instance features. </li>
</ul>
</li>
</ol>
</br>
</details>

<details>
<summary>Other datasets</summary>
<br>
<ol>
<li>Download the corresponding dataset and store it in a folder <code>dataset_path</code>. </li>
<li>Download <a href="https://github.com/facebookresearch/swav"> SwAV </a> feature extractor weights from <a href="https://dl.fbaipublicfiles.com/deepcluster/swav_800ep_pretrain.pth.tar"> here </a>.
</li>
<li> Replace the paths in data_utils/prepare_data.sh: <code>out_path</code> by the path where hdf5 files will be stored and <code>path_swav</code> by the path where the SwAV weights are stored. </li>
<li> Execute <code>./data_utils/prepare_data.sh [dataset_name] [resolution] [dataset_path]</code>, where <code>[dataset_name]</code> will be the dataset name, <code>[resolution]</code> can be an integer, for example 128 or 256, and <code>dataset_path</code> contains the dataset images. This script will create several hdf5 files:
<ul>
<li> <code>[dataset_name][resolution]_xy.hdf5</code>, where images and labels are stored for the training set. </li>
<li> <code>[dataset_name][resolution]_feats_[feature_extractor]_resnet50.hdf5</code> that contains the instance features for each image. </li>
<li> <code>[dataset_name][resolution]_feats_[feature_extractor]_resnet50_nn_k[k_nn].hdf5</code> that contains the list of <code>k_nn</code> neighbors for each of the instance features. </li>
</ul>
</li>
</ol>
</br>
</details>

<details>
<summary>How to subsample an instance feature dataset with k-means</summary>
<br>
To downsample the instance feature vector dataset, after we have prepared the data, we can use the k-means algorithm:
<code> python data_utils/store_kmeans_indexes.py --resolution [resolution] --which_dataset [dataset_name] --data_root [data_path] </code>
<ul>
<li> Adding <code>--gpu</code> allows the faiss library to compute k-means leveraging GPUs, resulting in faster execution. </li>
<li> Adding the parameter <code>--feature_extractor [feature_extractor]</code> chooses which feature extractor to use, with <code>feature_extractor</code> in <code>['selfsupervised', 'classification']</code>, to use SwAV as the feature extractor or the ResNet pretrained on the classification task on ImageNet, respectively. </li>
<li> The number of k-means clusters can be set with <code>--kmeans_subsampled [centers]</code>, where <code>centers</code> is an integer. </li>
</ul>
</br>
</details>
</div>

## How to train the models

#### BigGAN or StyleGAN2 backbone

Training parameters are stored in JSON files in `[backbone_folder]/config_files/[dataset]/*.json`, where `[backbone_folder]` is either BigGAN_PyTorch or stylegan2_ada_pytorch and `[dataset]` can either be ImageNet, ImageNet-LT or COCO_Stuff.

```
cd BigGAN_PyTorch
python run.py --json_config config_files/<dataset>/<selected_config>.json --data_root [data_root] --base_root [base_root]
```

or

```
cd stylegan2_ada_pytorch
python run.py --json_config config_files/<dataset>/<selected_config>.json --data_root [data_root] --base_root [base_root]
```

where:

* `data_root`: path where the data has been prepared and stored, following the previous section (<a href="./README.md#data-preparation">Data preparation</a>).
* `base_root`: path where to store the model weights and logs.

Note that one can create other JSON files to modify the training parameters.

#### Other backbones

To be able to run IC-GAN with other backbones, we provide some guiding steps:

* Place the new backbone code in a new folder under `ic_gan` (`ic_gan/new_backbone`).
* Modify the relevant piece of code in the GAN architecture to allow instance features as conditionings (for both the generator and discriminator); a minimal sketch of this change is given after this list.
* Create a `trainer.py` file with the training loop to train an IC-GAN with the new backbone. The `data_utils` folder provides the tools to prepare the dataset, load the data, and sample conditionings to train an IC-GAN.
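To make the second bullet concrete, here is a minimal, hypothetical sketch of what feeding instance features into a generator can look like. This is not IC-GAN's actual code: the module layout, layer sizes, and the 2048-dimensional SwAV feature size are illustrative assumptions.

```python
# Hypothetical sketch: conditioning a toy generator on instance features.
import torch
import torch.nn as nn

class InstanceConditionedGenerator(nn.Module):
    def __init__(self, z_dim=128, feat_dim=2048, embed_dim=512, img_channels=3):
        super().__init__()
        # Project the (pre-extracted) instance feature into the conditioning space.
        self.feat_embed = nn.Linear(feat_dim, embed_dim)
        # Toy synthesis stack; a real backbone (BigGAN/StyleGAN2) goes here.
        self.fc = nn.Sequential(nn.Linear(z_dim + embed_dim, 4 * 4 * 256), nn.ReLU())
        self.to_img = nn.Sequential(
            nn.ConvTranspose2d(256, img_channels, kernel_size=4, stride=4),
            nn.Tanh(),
        )

    def forward(self, z, instance_feats):
        # Concatenate the noise vector with the embedded instance feature,
        # so every sample is generated "around" the conditioning instance.
        h = self.feat_embed(instance_feats)
        x = self.fc(torch.cat([z, h], dim=1)).view(-1, 256, 4, 4)
        return self.to_img(x)

g = InstanceConditionedGenerator()
imgs = g(torch.randn(2, 128), torch.randn(2, 2048))  # -> (2, 3, 16, 16)
```

The discriminator needs the symmetric change: it receives the same instance feature (e.g., via a projection term) so that real and generated images are judged relative to the conditioning.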
The IC-GAN with BigGAN backbone [`trainer.py`](BigGAN_PyTorch/trainer.py) file can be used as inspiration.

## How to test the models

<b>To obtain the FID and IS metrics on ImageNet and ImageNet-LT</b>:

1) Execute:

```
python inference/test.py --json_config [BigGAN-PyTorch or stylegan-ada-pytorch]/config_files/<dataset>/<selected_config>.json --num_inception_images [num_imgs] --sample_num_npz [num_imgs] --eval_reference_set [ref_set] --sample_npz --base_root [base_root] --data_root [data_root] --kmeans_subsampled [kmeans_centers] --model_backbone [backbone]
```

To obtain the TensorFlow IS and FID metrics, use an environment with Python < 3.7 and TensorFlow 1.15. Then:

2) Obtain Inception Scores and pre-computed FID moments:

```
python ../data_utils/inception_tf13.py --experiment_name [exp_name] --experiment_root [base_root] --kmeans_subsampled [kmeans_centers]
```

For stratified FIDs in the ImageNet-LT dataset, the following parameters can be added: `--which_dataset 'imagenet_lt' --split 'val' --strat_name [stratified_split]`, where `stratified_split` can be in `[few, low, many]`.

3) (Only needed once) Pre-compute reference moments with TensorFlow code:

```
python ../data_utils/inception_tf13.py --use_ground_truth_data --data_root [data_root] --split [ref_set] --resolution [res] --which_dataset [dataset]
```

4) (Using this [repository](https://github.com/bioinf-jku/TTUR)) FID can be computed using the pre-computed statistics obtained in 2) and the pre-computed ground-truth statistics obtained in 3). For example, to compute the FID with the ImageNet validation set as reference:

```
python TTUR/fid.py [base_root]/[exp_name]/TF_pool_.npz [data_root]/imagenet_val_res[res]_tf_inception_moments_ground_truth.npz
```

<b>To obtain the FID metric on COCO-Stuff</b>:

1) Obtain ground-truth jpeg images:

```
python data_utils/store_coco_jpeg_images.py --resolution [res] --split [ref_set] --data_root [data_root] --out_path [gt_coco_images] --filter_hd [filter_hd]
```

2) Store generated images as jpeg images:

```
python sample.py --json_config ../[BigGAN-PyTorch or stylegan-ada-pytorch]/config_files/<dataset>/<selected_config>.json --data_root [data_root] --base_root [base_root] --sample_num_npz [num_imgs] --which_dataset 'coco' --eval_instance_set [ref_set] --eval_reference_set [ref_set] --filter_hd [filter_hd] --model_backbone [backbone]
```

3) Using this [repository](https://github.com/bioinf-jku/TTUR), compute FID on the two folders of ground-truth and generated images.

where:

* `dataset`: option to select the dataset in `['imagenet', 'imagenet_lt', 'coco']`.
* `exp_name`: name of the experiment folder.
* `data_root`: path where the data has been prepared and stored, following the previous section ["Data preparation"](#data-preparation).
* `base_root`: path where to find the model (for example, where the pretrained models have been downloaded).
* `num_imgs`: needs to be set to 50000 for ImageNet and ImageNet-LT (with validation set as reference) and set to 11500 for ImageNet-LT (with training set as reference). For COCO-Stuff, set to 75777, 2050, 675 or 1375 if using the training, evaluation, evaluation seen or evaluation unseen set as reference, respectively.
* `ref_set`: set to `'val'` for ImageNet, ImageNet-LT (and COCO) to obtain metrics with the validation (evaluation) set as reference, or set to `'train'` for ImageNet-LT or COCO to obtain metrics with the training set as reference.
* `kmeans_centers`: set to 1000 for ImageNet and to -1 for ImageNet-LT.
* `backbone`: model backbone architecture in `['biggan','stylegan2']`.
* `res`: integer indicating the resolution of the images (64, 128, 256).
* `gt_coco_images`: folder to store the ground-truth JPEG images of that specific split.
* `filter_hd`: only valid for `ref_set=val`. If -1, use the entire evaluation set; if 0, use only conditionings and their ground-truth images with seen class combinations during training (eval seen); if 1, use only conditionings and their ground-truth images with unseen class combinations during training (eval unseen).

## Utilities for GAN backbones

We make changes and provide extra utilities to facilitate training, for both the BigGAN and StyleGAN2 base repositories.

### BigGAN change log

The following changes were made:

* BigGAN architecture:
  * In `train_fns.py`: option to either have the optimizers inside the generator and discriminator class, or directly in the `G_D` wrapper module. Additionally, added an option to augment both generated and real images with augmentations from [DiffAugment](https://github.com/mit-han-lab/data-efficient-gans).
  * In `BigGAN.py`: added a function `get_condition_embeddings` to handle the conditioning separately.
  * Small modifications to `layers.py` to adapt the batchnorm function calls to PyTorch 1.8.
* Training utilities:
  * Added `trainer.py` file (replacing train.py):
    * Training now allows the usage of DDP for faster single-node and multi-node training.
    * Training is performed by epochs instead of by iterations.
    * Option to stop the training by using early stopping or when experiments diverge.
  * In `utils.py`:
    * Replaced `MultiEpochSampler` with `CheckpointedSampler` to allow experiments to be resumable when training by epochs, and to fix a bug where `MultiEpochSampler` would require a long time to fetch data permutations when the number of epochs increased.
    * ImageNet-LT: Added option to use different class distributions when sampling a class label for the generator.
    * ImageNet-LT: Added class balancing (uniform and temperature annealed).
    * Added data augmentations from [DiffAugment](https://github.com/mit-han-lab/data-efficient-gans).
* Testing utilities:
  * In `calculate_inception_moments.py`: added option to obtain moments for the ImageNet-LT dataset, as well as stratified moments for many, medium and few-shot classes (stratified FID computation).
  * In `inception_utils.py`: added option to compute [Precision, Recall, Density, Coverage](https://github.com/clovaai/generative-evaluation-prdc) and stratified FID.
* Data utilities:
  * In `datasets.py`, added option to load the ImageNet-LT dataset.
  * Added ImageNet-LT.txt files with image indexes for the training and validation split.
  * In `utils.py`:
    * Separate functions to obtain the data from hdf5 files (`get_dataset_hdf5`) or from a directory (`get_dataset_images`), as well as a function to obtain only the data loader (`get_dataloader`).
    * Added the function `sample_conditionings` to handle the possible different conditionings to train G with.
* Experiment utilities:
  * Added JSON files to launch experiments with the proposed hyper-parameter configuration.
  * Script to launch experiments with either the [submitit tool](https://github.com/facebookincubator/submitit) or locally on the same machine (run.py).

### StyleGAN2 change log

<div id="stylegan-changelog">
<ul>
<li> Multi-node DistributedDataParallel training. </li>
<li> Added early stopping based on the training FID metric. </li>
<li> Automatic checkpointing when jobs are automatically rescheduled on a cluster. </li>
<li> Option to load the dataset from an hdf5 file.
## Utilities for GAN backbones

We introduce changes and extra utilities in both the BigGAN and StyleGAN2 base repositories to facilitate training.

### BigGAN change log

The following changes were made:

* BigGAN architecture:
	+ In `train_fns.py`: option to either have the optimizers inside the generator and discriminator classes, or directly in the `G_D` wrapper module. Additionally, added an option to augment both generated and real images with augmentations from [DiffAugment](https://github.com/mit-han-lab/data-efficient-gans).
	+ In `BigGAN.py`: added a function `get_condition_embeddings` to handle the conditioning separately.
	+ Small modifications to `layers.py` to adapt the batchnorm function calls to PyTorch 1.8.
* Training utilities:
	+ Added a `trainer.py` file (replacing `train.py`):
		- Training now allows the usage of DDP for faster single-node and multi-node training.
		- Training is performed by epochs instead of by iterations.
		- Option to stop training via early stopping or when experiments diverge.
	+ In `utils.py`:
		- Replaced `MultiEpochSampler` with `CheckpointedSampler`, which makes experiments resumable when training by epochs and fixes a bug where `MultiEpochSampler` required a long time to fetch data permutations as the number of epochs increased (see the sketch after this change log).
		- ImageNet-LT: added an option to use different class distributions when sampling a class label for the generator.
		- ImageNet-LT: added class balancing (uniform and temperature annealed).
		- Added data augmentations from [DiffAugment](https://github.com/mit-han-lab/data-efficient-gans).
* Testing utilities:
	+ In `calculate_inception_moments.py`: added an option to obtain moments for the ImageNet-LT dataset, as well as stratified moments for many, medium and few-shot classes (stratified FID computation).
	+ In `inception_utils.py`: added an option to compute [Precision, Recall, Density, Coverage](https://github.com/clovaai/generative-evaluation-prdc) and stratified FID.
* Data utilities:
	+ In `datasets.py`: added an option to load the ImageNet-LT dataset.
	+ Added ImageNet-LT `.txt` files with the image indexes for the training and validation splits.
	+ In `utils.py`:
		- Separate functions to obtain the data from hdf5 files (`get_dataset_hdf5`) or from a directory (`get_dataset_images`), as well as a function to obtain only the data loader (`get_dataloader`).
		- Added the function `sample_conditionings` to handle the different possible conditionings used to train G.
* Experiment utilities:
	+ Added JSON files to launch experiments with the proposed hyper-parameter configurations.
	+ Script (`run.py`) to launch experiments either with the [submitit tool](https://github.com/facebookincubator/submitit) or locally on the same machine.
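`CheckpointedSampler` itself is not reproduced here, but the idea it implements can be sketched as follows: derive each epoch's permutation from a seed and an epoch counter, so the sampler's state reduces to two integers that can be saved and restored, instead of materializing the permutations of many epochs up front. All names and details below are ours, for illustration only:

```python
import torch
from torch.utils.data import Sampler


class ResumableEpochSampler(Sampler):
    """Illustrative epoch-based sampler whose state can be checkpointed."""

    def __init__(self, dataset_size: int, seed: int = 0):
        self.dataset_size = dataset_size
        self.seed = seed
        self.epoch = 0  # restored from a checkpoint when resuming

    def __iter__(self):
        # One fresh permutation per epoch, derived from (seed, epoch),
        # so no permutations need to be precomputed for future epochs.
        g = torch.Generator()
        g.manual_seed(self.seed + self.epoch)
        yield from torch.randperm(self.dataset_size, generator=g).tolist()
        self.epoch += 1

    def __len__(self):
        return self.dataset_size

    def state_dict(self):
        return {"seed": self.seed, "epoch": self.epoch}

    def load_state_dict(self, state):
        self.seed = state["seed"]
        self.epoch = state["epoch"]
```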
### StyleGAN2 change log

<div id="stylegan-changelog">
<ul>
<li> Multi-node DistributedDataParallel training. </li>
<li> Added early stopping based on the training FID metric. </li>
<li> Automatic checkpointing when jobs are automatically rescheduled on a cluster. </li>
<li> Option to load the dataset from an hdf5 file. </li>
<li> Replaced the usage of the Click python package with an `ArgumentParser`. </li>
<li> Only saving the best and last model weights. </li>
</ul>
</div>

## Acknowledgements

We would like to thank the authors of the [Pytorch BigGAN repository](https://github.com/ajbrock/BigGAN-PyTorch) and of [StyleGAN2 Pytorch](https://github.com/NVlabs/stylegan2-ada-pytorch), as our model requires their repositories to train IC-GAN with a BigGAN or StyleGAN2 backbone, respectively. Moreover, we would like to further thank the authors of [generative-evaluation-prdc](https://github.com/clovaai/generative-evaluation-prdc), [data-efficient-gans](https://github.com/mit-han-lab/data-efficient-gans), [faiss](https://github.com/facebookresearch/faiss) and [sg2im](https://github.com/google/sg2im), as some components were borrowed and modified from their code bases. Finally, we thank the author of [WanderCLIP](https://colab.research.google.com/github/eyaler/clip_biggan/blob/main/WanderCLIP.ipynb), as well as the authors of the following repositories that we use in our Colab notebook: [pytorch-pretrained-BigGAN](https://github.com/huggingface/pytorch-pretrained-BigGAN) and [CLIP](https://github.com/openai/CLIP).

## License

The majority of IC-GAN is licensed under CC-BY-NC; however, portions of the project are available under separate license terms: BigGAN and [PRDC](https://github.com/facebookresearch/ic_gan/blob/main/data_utils/compute_pdrc.py) are licensed under the MIT license; the [COCO-Stuff loader](https://github.com/facebookresearch/ic_gan/blob/main/data_utils/cocostuff_dataset.py) is licensed under the Apache License 2.0; [DiffAugment](https://github.com/facebookresearch/ic_gan/blob/main/BigGAN_PyTorch/diffaugment_utils.py) is licensed under the BSD 2-Clause Simplified license; StyleGAN2 is licensed under an NVIDIA license, available here: https://github.com/NVlabs/stylegan2-ada-pytorch/blob/main/LICENSE.txt. In the Colab notebook, [CLIP](https://github.com/openai/CLIP) and [pytorch-pretrained-BigGAN](https://github.com/huggingface/pytorch-pretrained-BigGAN) code is used, both licensed under the MIT license.

## Disclaimers

THE DIFFAUGMENT SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

THE CLIP SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

THE PYTORCH-PRETRAINED-BIGGAN SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

## Cite the paper

If this repository, the paper or any of its content is useful for your research, please cite:

```
@inproceedings{casanova2021instanceconditioned,
    title={Instance-Conditioned GAN},
    author={Arantxa Casanova and Marlène Careil and Jakob Verbeek and Michal Drozdzal and Adriana Romero-Soriano},
    booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
    year={2021}
}
```
{"license": "cc-by-nc-4.0", "tags": ["image-generation", "conditional-image-generation", "generative-model"], "library": "pytorch"}
null
facebook/ic_gan
[ "image-generation", "conditional-image-generation", "generative-model", "arxiv:2109.05070", "license:cc-by-nc-4.0", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2109.05070" ]
[]
TAGS #image-generation #conditional-image-generation #generative-model #arxiv-2109.05070 #license-cc-by-nc-4.0 #region-us
[ "#### Other backbones\n\n\nTo be able to run IC-GAN with other backbones, we provide some orientative steps:\n\n* Place the new backbone code in a new folder under 'ic\\_gan' ('ic\\_gan/new\\_backbone').\n* Modify the relevant piece of code in the GAN architecture to allow instance features as conditionings (for both generator and discriminator).\n* Create a 'URL' file with the training loop to train an IC-GAN with the new backbone. The 'data\\_utils' folder provides the tools to prepare the dataset, load the data and conditioning sampling to train an IC-GAN. The IC-GAN with BigGAN backbone 'URL' file can be used as an inspiration.\n\nHow to test the models\n----------------------\n\n\n**To obtain the FID and IS metrics on ImageNet and ImageNet-LT**:\n\n1. Execute:\n\nTo obtain the tensorflow IS and FID metrics, use an environment with the Python <3.7 and Tensorflow 1.15. Then:\n\n2. Obtain Inception Scores and pre-computed FID moments:\n\nFor stratified FIDs in the ImageNet-LT dataset, the following parameters can be added '--which\\_dataset 'imagenet\\_lt' --split 'val' --strat\\_name [stratified\\_split]', where 'stratified\\_split' can be in '[few,low, many]'.\n\n3. (Only needed once) Pre-compute reference moments with tensorflow code:\n4. (Using this repository) FID can be computed using the pre-computed statistics obtained in 2) and the pre-computed ground-truth statistics obtain in 3). For example, to compute the FID with reference ImageNet validation set:\n\n**To obtain the FID metric on COCO-Stuff**:\n\n1. Obtain ground-truth jpeg images:\n2. Store generated images as jpeg images:\n3. Using this repository, compute FID on the two folders of ground-truth and generated images.\n\nwhere:\n\n* 'dataset': option to select the dataset in '['imagenet', 'imagenet\\_lt', 'coco']\n* 'exp\\_name': name of the experiment folder.\n* 'data\\_root': path where the data has been prepared and stored, following the previous section \"Data preparation\".\n* 'base\\_root': path where to find the model (for example, where the pretrained models have been downloaded).\n* 'num\\_imgs': needs to be set to 50000 for ImageNet and ImageNet-LT (with validation set as reference) and set to 11500 for ImageNet-LT (with training set as reference). For COCO-Stuff, set to 75777, 2050, 675, 1375 if using the training, evaluation, evaluation seen or evaluation unseen set as reference.\n* 'ref\\_set': set to ''val'' for ImageNet, ImageNet-LT (and COCO) to obtain metrics with the validation (evaluation) set as reference, or set to ''train'' for ImageNet-LT or COCO to obtain metrics with the training set as reference.\n* 'kmeans\\_centers': set to 1000 for ImageNet and to -1 for ImageNet-LT.\n* 'backbone': model backbone architecture in '['biggan','stylegan2']'.\n* 'res': integer indicating the resolution of the images (64,128,256).\n* 'gt\\_coco\\_images': folder to store the ground-truth JPEG images of that specific split.\n* 'filter\\_hd': only valid for 'ref\\_set=val'. 
If -1, use the entire evaluation set; if 0, use only conditionings and their ground-truth images with seen class combinations during training (eval seen); if 1, use only conditionings and their ground-truth images with unseen class combinations during training (eval unseen).\n\nUtilities for GAN backbones\n---------------------------\n\n\nWe change and provide extra utilities to facilitate the training, for both BigGAN and StyleGAN2 base repositories.", "### BigGAN change log\n\n\nThe following changes were made:\n\n* BigGAN architecture:\n\n\n\t+ In 'train\\_fns.py': option to either have the optimizers inside the generator and discriminator class, or directly in the 'G\\_D' wrapper module. Additionally, added an option to augment both generated and real images with augmentations from DiffAugment.\n\t+ In 'URL': added a function 'get\\_condition\\_embeddings' to handle the conditioning separately.\n\t+ Small modifications to 'URL' to adapt the batchnorm function calls to the pytorch 1.8 version.\n* Training utilities:\n\n\n\t+ Added 'URL' file (replacing URL):\n\t\t- Training now allows the usage of DDP for faster single-node and multi-node training.\n\t\t- Training is performed by epochs instead of by iterations.\n\t\t- Option to stop the training by using early stopping or when experiments diverge.\n\t+ In 'URL':\n\t\t- Replaced 'MultiEpochSampler' for 'CheckpointedSampler' to allow experiments to be resumable when using epochs and fixing a bug where 'MultiEpochSampler' would require a long time to fetch data permutations when the number of epochs increased.\n\t\t- ImageNet-LT: Added option to use different class distributions when sampling a class label for the generator.\n\t\t- ImageNet-LT: Added class balancing (uniform and temperature annealed).\n\t\t- Added data augmentations from DiffAugment.\n* Testing utilities:\n\n\n\t+ In 'calculate\\_inception\\_moments.py': added option to obtain moments for ImageNet-LT dataset, as well as stratified moments for many, medium and few-shot classes (stratified FID computation).\n\t+ In 'inception\\_utils.py': added option to compute Precision, Recall, Density, Coverage and stratified FID.\n* Data utilities:\n\n\n\t+ In 'URL', added option to load ImageNet-LT dataset.\n\t+ Added URL files with image indexes for training and validation split.\n\t+ In 'URL':\n\t\t- Separate functions to obtain the data from hdf5 files ('get\\_dataset\\_hdf5') or from directory ('get\\_dataset\\_images'), as well as a function to obtain only the data loader ('get\\_dataloader').\n\t\t- Added the function 'sample\\_conditionings' to handle possible different conditionings to train G with.\n* Experiment utilities:\n\n\n\t+ Added JSON files to launch experiments with the proposed hyper-parameter configuration.\n\t+ Script to launch experiments with either the submitit tool or locally in the same machine (URL).", "### StyleGAN2 change log\n\n\n\n* Multi-node DistributedDataParallel training.\n* Added early stopping based on the training FID metric.\n* Automatic checkpointing when jobs are automatically rescheduled on a cluster.\n* Option to load dataset from hdf5 file.\n* Replaced the usage of Click python package by an 'ArgumentParser'.\n* Only saving best and last model weights.\n\n\n\nAcknowledgements\n----------------\n\n\nWe would like to thanks the authors of the Pytorch BigGAN repository and StyleGAN2 Pytorch, as our model requires their repositories to train IC-GAN with BigGAN or StyleGAN2 bakcbone respectively.\nMoreover, we would like to further thank the 
authors of generative-evaluation-prdc, data-efficient-gans, faiss and sg2im as some components were borrowed and modified from their code bases. Finally, we thank the author of WanderCLIP as well as the following repositories, that we use in our Colab notebook: pytorch-pretrained-BigGAN and CLIP.\n\n\nLicense\n-------\n\n\nThe majority of IC-GAN is licensed under CC-BY-NC, however portions of the project are available under separate license terms: BigGAN and PRDC are licensed under the MIT license; COCO-Stuff loader is licensed under Apache License 2.0; DiffAugment is licensed under BSD 2-Clause Simplified license; StyleGAN2 is licensed under a NVIDIA license, available here: URL In the Colab notebook, CLIP and pytorch-pretrained-BigGAN code is used, both licensed under the MIT license.\n\n\nDisclaimers\n-----------\n\n\nTHE DIFFAUGMENT SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n\nTHE CLIP SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\nTHE PYTORCH-PRETRAINED-BIGGAN SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\nCite the paper\n--------------\n\n\nIf this repository, the paper or any of its content is useful for your research, please cite:" ]
[ "TAGS\n#image-generation #conditional-image-generation #generative-model #arxiv-2109.05070 #license-cc-by-nc-4.0 #region-us \n", "#### Other backbones\n\n\nTo be able to run IC-GAN with other backbones, we provide some orientative steps:\n\n* Place the new backbone code in a new folder under 'ic\\_gan' ('ic\\_gan/new\\_backbone').\n* Modify the relevant piece of code in the GAN architecture to allow instance features as conditionings (for both generator and discriminator).\n* Create a 'URL' file with the training loop to train an IC-GAN with the new backbone. The 'data\\_utils' folder provides the tools to prepare the dataset, load the data and conditioning sampling to train an IC-GAN. The IC-GAN with BigGAN backbone 'URL' file can be used as an inspiration.\n\nHow to test the models\n----------------------\n\n\n**To obtain the FID and IS metrics on ImageNet and ImageNet-LT**:\n\n1. Execute:\n\nTo obtain the tensorflow IS and FID metrics, use an environment with the Python <3.7 and Tensorflow 1.15. Then:\n\n2. Obtain Inception Scores and pre-computed FID moments:\n\nFor stratified FIDs in the ImageNet-LT dataset, the following parameters can be added '--which\\_dataset 'imagenet\\_lt' --split 'val' --strat\\_name [stratified\\_split]', where 'stratified\\_split' can be in '[few,low, many]'.\n\n3. (Only needed once) Pre-compute reference moments with tensorflow code:\n4. (Using this repository) FID can be computed using the pre-computed statistics obtained in 2) and the pre-computed ground-truth statistics obtain in 3). For example, to compute the FID with reference ImageNet validation set:\n\n**To obtain the FID metric on COCO-Stuff**:\n\n1. Obtain ground-truth jpeg images:\n2. Store generated images as jpeg images:\n3. Using this repository, compute FID on the two folders of ground-truth and generated images.\n\nwhere:\n\n* 'dataset': option to select the dataset in '['imagenet', 'imagenet\\_lt', 'coco']\n* 'exp\\_name': name of the experiment folder.\n* 'data\\_root': path where the data has been prepared and stored, following the previous section \"Data preparation\".\n* 'base\\_root': path where to find the model (for example, where the pretrained models have been downloaded).\n* 'num\\_imgs': needs to be set to 50000 for ImageNet and ImageNet-LT (with validation set as reference) and set to 11500 for ImageNet-LT (with training set as reference). For COCO-Stuff, set to 75777, 2050, 675, 1375 if using the training, evaluation, evaluation seen or evaluation unseen set as reference.\n* 'ref\\_set': set to ''val'' for ImageNet, ImageNet-LT (and COCO) to obtain metrics with the validation (evaluation) set as reference, or set to ''train'' for ImageNet-LT or COCO to obtain metrics with the training set as reference.\n* 'kmeans\\_centers': set to 1000 for ImageNet and to -1 for ImageNet-LT.\n* 'backbone': model backbone architecture in '['biggan','stylegan2']'.\n* 'res': integer indicating the resolution of the images (64,128,256).\n* 'gt\\_coco\\_images': folder to store the ground-truth JPEG images of that specific split.\n* 'filter\\_hd': only valid for 'ref\\_set=val'. 
If -1, use the entire evaluation set; if 0, use only conditionings and their ground-truth images with seen class combinations during training (eval seen); if 1, use only conditionings and their ground-truth images with unseen class combinations during training (eval unseen).\n\nUtilities for GAN backbones\n---------------------------\n\n\nWe change and provide extra utilities to facilitate the training, for both BigGAN and StyleGAN2 base repositories.", "### BigGAN change log\n\n\nThe following changes were made:\n\n* BigGAN architecture:\n\n\n\t+ In 'train\\_fns.py': option to either have the optimizers inside the generator and discriminator class, or directly in the 'G\\_D' wrapper module. Additionally, added an option to augment both generated and real images with augmentations from DiffAugment.\n\t+ In 'URL': added a function 'get\\_condition\\_embeddings' to handle the conditioning separately.\n\t+ Small modifications to 'URL' to adapt the batchnorm function calls to the pytorch 1.8 version.\n* Training utilities:\n\n\n\t+ Added 'URL' file (replacing URL):\n\t\t- Training now allows the usage of DDP for faster single-node and multi-node training.\n\t\t- Training is performed by epochs instead of by iterations.\n\t\t- Option to stop the training by using early stopping or when experiments diverge.\n\t+ In 'URL':\n\t\t- Replaced 'MultiEpochSampler' for 'CheckpointedSampler' to allow experiments to be resumable when using epochs and fixing a bug where 'MultiEpochSampler' would require a long time to fetch data permutations when the number of epochs increased.\n\t\t- ImageNet-LT: Added option to use different class distributions when sampling a class label for the generator.\n\t\t- ImageNet-LT: Added class balancing (uniform and temperature annealed).\n\t\t- Added data augmentations from DiffAugment.\n* Testing utilities:\n\n\n\t+ In 'calculate\\_inception\\_moments.py': added option to obtain moments for ImageNet-LT dataset, as well as stratified moments for many, medium and few-shot classes (stratified FID computation).\n\t+ In 'inception\\_utils.py': added option to compute Precision, Recall, Density, Coverage and stratified FID.\n* Data utilities:\n\n\n\t+ In 'URL', added option to load ImageNet-LT dataset.\n\t+ Added URL files with image indexes for training and validation split.\n\t+ In 'URL':\n\t\t- Separate functions to obtain the data from hdf5 files ('get\\_dataset\\_hdf5') or from directory ('get\\_dataset\\_images'), as well as a function to obtain only the data loader ('get\\_dataloader').\n\t\t- Added the function 'sample\\_conditionings' to handle possible different conditionings to train G with.\n* Experiment utilities:\n\n\n\t+ Added JSON files to launch experiments with the proposed hyper-parameter configuration.\n\t+ Script to launch experiments with either the submitit tool or locally in the same machine (URL).", "### StyleGAN2 change log\n\n\n\n* Multi-node DistributedDataParallel training.\n* Added early stopping based on the training FID metric.\n* Automatic checkpointing when jobs are automatically rescheduled on a cluster.\n* Option to load dataset from hdf5 file.\n* Replaced the usage of Click python package by an 'ArgumentParser'.\n* Only saving best and last model weights.\n\n\n\nAcknowledgements\n----------------\n\n\nWe would like to thanks the authors of the Pytorch BigGAN repository and StyleGAN2 Pytorch, as our model requires their repositories to train IC-GAN with BigGAN or StyleGAN2 bakcbone respectively.\nMoreover, we would like to further thank the 
authors of generative-evaluation-prdc, data-efficient-gans, faiss and sg2im as some components were borrowed and modified from their code bases. Finally, we thank the author of WanderCLIP as well as the following repositories, that we use in our Colab notebook: pytorch-pretrained-BigGAN and CLIP.\n\n\nLicense\n-------\n\n\nThe majority of IC-GAN is licensed under CC-BY-NC, however portions of the project are available under separate license terms: BigGAN and PRDC are licensed under the MIT license; COCO-Stuff loader is licensed under Apache License 2.0; DiffAugment is licensed under BSD 2-Clause Simplified license; StyleGAN2 is licensed under a NVIDIA license, available here: URL In the Colab notebook, CLIP and pytorch-pretrained-BigGAN code is used, both licensed under the MIT license.\n\n\nDisclaimers\n-----------\n\n\nTHE DIFFAUGMENT SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n\nTHE CLIP SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\nTHE PYTORCH-PRETRAINED-BIGGAN SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\nCite the paper\n--------------\n\n\nIf this repository, the paper or any of its content is useful for your research, please cite:" ]
[ 43, 955, 632, 1047 ]
[ "passage: TAGS\n#image-generation #conditional-image-generation #generative-model #arxiv-2109.05070 #license-cc-by-nc-4.0 #region-us \n", "passage: #### Other backbones\n\n\nTo be able to run IC-GAN with other backbones, we provide some orientative steps:\n\n* Place the new backbone code in a new folder under 'ic\\_gan' ('ic\\_gan/new\\_backbone').\n* Modify the relevant piece of code in the GAN architecture to allow instance features as conditionings (for both generator and discriminator).\n* Create a 'URL' file with the training loop to train an IC-GAN with the new backbone. The 'data\\_utils' folder provides the tools to prepare the dataset, load the data and conditioning sampling to train an IC-GAN. The IC-GAN with BigGAN backbone 'URL' file can be used as an inspiration.\n\nHow to test the models\n----------------------\n\n\n**To obtain the FID and IS metrics on ImageNet and ImageNet-LT**:\n\n1. Execute:\n\nTo obtain the tensorflow IS and FID metrics, use an environment with the Python <3.7 and Tensorflow 1.15. Then:\n\n2. Obtain Inception Scores and pre-computed FID moments:\n\nFor stratified FIDs in the ImageNet-LT dataset, the following parameters can be added '--which\\_dataset 'imagenet\\_lt' --split 'val' --strat\\_name [stratified\\_split]', where 'stratified\\_split' can be in '[few,low, many]'.\n\n3. (Only needed once) Pre-compute reference moments with tensorflow code:\n4. (Using this repository) FID can be computed using the pre-computed statistics obtained in 2) and the pre-computed ground-truth statistics obtain in 3). For example, to compute the FID with reference ImageNet validation set:\n\n**To obtain the FID metric on COCO-Stuff**:\n\n1. Obtain ground-truth jpeg images:\n2. Store generated images as jpeg images:\n3. Using this repository, compute FID on the two folders of ground-truth and generated images.\n\nwhere:\n\n* 'dataset': option to select the dataset in '['imagenet', 'imagenet\\_lt', 'coco']\n* 'exp\\_name': name of the experiment folder.\n* 'data\\_root': path where the data has been prepared and stored, following the previous section \"Data preparation\".\n* 'base\\_root': path where to find the model (for example, where the pretrained models have been downloaded).\n* 'num\\_imgs': needs to be set to 50000 for ImageNet and ImageNet-LT (with validation set as reference) and set to 11500 for ImageNet-LT (with training set as reference). For COCO-Stuff, set to 75777, 2050, 675, 1375 if using the training, evaluation, evaluation seen or evaluation unseen set as reference.\n* 'ref\\_set': set to ''val'' for ImageNet, ImageNet-LT (and COCO) to obtain metrics with the validation (evaluation) set as reference, or set to ''train'' for ImageNet-LT or COCO to obtain metrics with the training set as reference.\n* 'kmeans\\_centers': set to 1000 for ImageNet and to -1 for ImageNet-LT.\n* 'backbone': model backbone architecture in '['biggan','stylegan2']'.\n* 'res': integer indicating the resolution of the images (64,128,256).\n* 'gt\\_coco\\_images': folder to store the ground-truth JPEG images of that specific split.\n* 'filter\\_hd': only valid for 'ref\\_set=val'. 
If -1, use the entire evaluation set; if 0, use only conditionings and their ground-truth images with seen class combinations during training (eval seen); if 1, use only conditionings and their ground-truth images with unseen class combinations during training (eval unseen).\n\nUtilities for GAN backbones\n---------------------------\n\n\nWe change and provide extra utilities to facilitate the training, for both BigGAN and StyleGAN2 base repositories.", "passage: ### BigGAN change log\n\n\nThe following changes were made:\n\n* BigGAN architecture:\n\n\n\t+ In 'train\\_fns.py': option to either have the optimizers inside the generator and discriminator class, or directly in the 'G\\_D' wrapper module. Additionally, added an option to augment both generated and real images with augmentations from DiffAugment.\n\t+ In 'URL': added a function 'get\\_condition\\_embeddings' to handle the conditioning separately.\n\t+ Small modifications to 'URL' to adapt the batchnorm function calls to the pytorch 1.8 version.\n* Training utilities:\n\n\n\t+ Added 'URL' file (replacing URL):\n\t\t- Training now allows the usage of DDP for faster single-node and multi-node training.\n\t\t- Training is performed by epochs instead of by iterations.\n\t\t- Option to stop the training by using early stopping or when experiments diverge.\n\t+ In 'URL':\n\t\t- Replaced 'MultiEpochSampler' for 'CheckpointedSampler' to allow experiments to be resumable when using epochs and fixing a bug where 'MultiEpochSampler' would require a long time to fetch data permutations when the number of epochs increased.\n\t\t- ImageNet-LT: Added option to use different class distributions when sampling a class label for the generator.\n\t\t- ImageNet-LT: Added class balancing (uniform and temperature annealed).\n\t\t- Added data augmentations from DiffAugment.\n* Testing utilities:\n\n\n\t+ In 'calculate\\_inception\\_moments.py': added option to obtain moments for ImageNet-LT dataset, as well as stratified moments for many, medium and few-shot classes (stratified FID computation).\n\t+ In 'inception\\_utils.py': added option to compute Precision, Recall, Density, Coverage and stratified FID.\n* Data utilities:\n\n\n\t+ In 'URL', added option to load ImageNet-LT dataset.\n\t+ Added URL files with image indexes for training and validation split.\n\t+ In 'URL':\n\t\t- Separate functions to obtain the data from hdf5 files ('get\\_dataset\\_hdf5') or from directory ('get\\_dataset\\_images'), as well as a function to obtain only the data loader ('get\\_dataloader').\n\t\t- Added the function 'sample\\_conditionings' to handle possible different conditionings to train G with.\n* Experiment utilities:\n\n\n\t+ Added JSON files to launch experiments with the proposed hyper-parameter configuration.\n\t+ Script to launch experiments with either the submitit tool or locally in the same machine (URL)." ]
[ -0.060307200998067856, 0.13276679813861847, -0.006576476618647575, 0.029909634962677956, 0.0478556863963604, 0.025373121723532677, 0.07410930842161179, 0.09360402077436447, -0.02847212366759777, 0.10458598285913467, 0.08657017350196838, 0.026607224717736244, 0.06492289155721664, 0.03881783410906792, 0.029093945398926735, -0.16691569983959198, 0.02691950649023056, -0.02971179224550724, -0.04578618332743645, 0.05019034817814827, 0.07082871347665787, -0.07027768343687057, 0.07332860678434372, -0.0184010062366724, -0.12526141107082367, -0.023388320580124855, -0.012638342566788197, -0.005837688688188791, 0.06355031579732895, 0.054086800664663315, 0.022372139617800713, 0.05048006772994995, 0.025032401084899902, -0.15870346128940582, 0.011655476875603199, 0.06383869051933289, -0.03334688022732735, 0.06490524858236313, 0.10067860037088394, 0.05094711109995842, 0.0903172716498375, -0.022190436720848083, -0.046757522970438004, 0.021154234185814857, -0.07561451196670532, -0.12134889513254166, -0.0818549394607544, 0.06372202932834625, 0.04926194250583649, 0.0432756133377552, 0.014809469692409039, 0.05955247953534126, -0.07024770230054855, 0.03689774498343468, 0.06592637300491333, -0.1813901662826538, -0.011156144551932812, 0.12565888464450836, 0.023659000173211098, 0.0450664721429348, -0.05783252790570259, 0.025292793288826942, 0.02978447824716568, -0.00369807961396873, 0.007290916983038187, -0.027728820219635963, -0.016938595101237297, 0.0013786660274490714, -0.09070213884115219, -0.041948240250349045, 0.12024283409118652, 0.03568261116743088, -0.04370030760765076, -0.05440911650657654, -0.03670470044016838, -0.04268727824091911, -0.024181785061955452, -0.0161445289850235, 0.03949414938688278, 0.04200774431228638, 0.05657197907567024, -0.11944792419672012, -0.1367853581905365, 0.005852783564478159, -0.07175202667713165, 0.012898922897875309, 0.035014357417821884, 0.05273718759417534, -0.046378329396247864, 0.07715971022844315, -0.06436348706483841, -0.0642298087477684, -0.058385659009218216, -0.052658990025520325, -0.04185980185866356, -0.008078351616859436, -0.04252338409423828, -0.03401925787329674, 0.026248672977089882, 0.16029329597949982, -0.03576340898871422, -0.006264634430408478, -0.06389272958040237, 0.07141857594251633, 0.0031749370973557234, 0.03802021965384483, -0.10692330449819565, 0.04864189028739929, 0.03970956802368164, 0.01787094585597515, 0.020941751077771187, -0.02259323187172413, -0.06287278980016708, -0.024559386074543, -0.020125940442085266, 0.023142926394939423, 0.003022957593202591, 0.006498474627733231, -0.03293215483427048, -0.02742387354373932, 0.13446535170078278, -0.05975992977619171, 0.013473940081894398, 0.04059605300426483, -0.02429387718439102, 0.05396503210067749, 0.04064265638589859, -0.007822849787771702, -0.01690862514078617, 0.011325971223413944, -0.06102396547794342, 0.013426847755908966, -0.06067202612757683, -0.02817550301551819, 0.037551552057266235, -0.04829216003417969, -0.01721026934683323, -0.08388001471757889, -0.10902900248765945, -0.009721207432448864, 0.05751408264040947, -0.04054667428135872, 0.018014440312981606, 0.04597747325897217, -0.06347885727882385, -0.01617008075118065, 0.00408281059935689, 0.04551278054714203, -0.02849273569881916, 0.05173979699611664, -0.03712798282504082, 0.04559781029820442, -0.028167380020022392, 0.013672350905835629, -0.06395985931158066, 0.04062226414680481, -0.11811307817697525, 0.08171651512384415, -0.02584150992333889, -0.019084956496953964, -0.0786852166056633, -0.04864857718348503, -0.03405146300792694, 
-0.0034372610971331596, 0.01107915211468935, 0.1070011630654335, -0.1706668883562088, -0.0037600614596158266, 0.05927355960011482, -0.09081587940454483, -0.06869388371706009, 0.10545947402715683, -0.022543050348758698, 0.018121814355254173, 0.021065479144454002, 0.09633899480104446, 0.12362370640039444, -0.08009645342826843, -0.047614965587854385, 0.025590986013412476, -0.06493347138166428, 0.022310808300971985, 0.047663357108831406, 0.020032236352562904, 0.00965330470353365, 0.015118232928216457, -0.14229273796081543, 0.058104898780584335, -0.01136802602559328, -0.06290721148252487, -0.0033060878049582243, -0.016309887170791626, -0.059194669127464294, -0.023125940933823586, 0.032511916011571884, 0.029946716502308846, -0.043122678995132446, -0.04948251321911812, 0.08752447366714478, -0.039716556668281555, 0.05268390104174614, -0.06599833071231842, 0.08820345252752304, -0.0696781575679779, 0.014860942959785461, -0.1061420813202858, -0.008769967593252659, 0.03563276305794716, -0.0845932736992836, 0.05227040871977806, 0.006096836179494858, 0.03259674459695816, 0.022002676501870155, 0.01043727621436119, 0.007641365751624107, -0.0071461680345237255, -0.06279212981462479, -0.01521912869066, -0.09232160449028015, -0.037625208497047424, -0.015561900101602077, 0.15868055820465088, -0.15742136538028717, -0.018495606258511543, 0.06961534917354584, 0.05373714491724968, 0.009331654757261276, -0.02119596302509308, 0.0030909988563507795, -0.05429993197321892, -0.054206833243370056, -0.037717193365097046, 0.012544575147330761, 0.030280062928795815, -0.045215804129838943, 0.10989075899124146, -0.10505885630846024, -0.04137057438492775, 0.0838610902428627, -0.021322935819625854, -0.040373723953962326, -0.0762111246585846, -0.018815191462635994, -0.021803410723805428, -0.0027526903431862593, -0.02867249585688114, 0.08356857299804688, 0.02249768190085888, 0.07543009519577026, -0.07383207231760025, 0.014574830420315266, 0.01871109940111637, 0.014026515185832977, -0.045397888869047165, 0.025720736011862755, 0.1329357624053955, -0.06680731475353241, 0.06533727049827576, 0.12013223767280579, -0.007625720929354429, 0.06474290788173676, 0.0006649072165600955, -0.0963917151093483, -0.05334741994738579, 0.09229027479887009, 0.013280883431434631, 0.07234150916337967, 0.03018653392791748, 0.029995614662766457, 0.06075328588485718, -0.021340064704418182, 0.030918747186660767, -0.1165350154042244, 0.011804725043475628, 0.02424176037311554, -0.03917521610856056, 0.02218502014875412, -0.025221293792128563, -0.0033373087644577026, 0.06556386500597, -0.009551550261676311, -0.033407412469387054, 0.049160685390233994, -0.0238200630992651, -0.06286811083555222, 0.1430657058954239, -0.0948563814163208, -0.22101648151874542, -0.15212596952915192, 0.00009720058733364567, -0.09195489436388016, 0.00036367401480674744, -0.007229115813970566, -0.01485587377101183, -0.05225859954953194, -0.08113332837820053, 0.0038459375500679016, -0.023016372695565224, -0.04101738706231117, -0.05167463794350624, 0.03540843725204468, 0.009826612658798695, -0.08921141177415848, 0.01707325503230095, -0.00398782454431057, -0.05151548981666565, 0.08927176147699356, 0.05128249526023865, 0.055495645850896835, 0.05413827672600746, 0.02118772268295288, -0.004806349519640207, 0.014538194052875042, 0.1535794883966446, -0.09560155123472214, 0.07201946526765823, 0.16583482921123505, -0.01091600302606821, 0.06512755155563354, 0.09371638298034668, 0.035482000559568405, -0.07047324627637863, -0.008296915329992771, 0.009776327759027481, -0.0441497266292572, 
-0.14806033670902252, -0.04697664454579353, -0.06061200425028801, -0.020507892593741417, 0.09558185189962387, 0.04306059703230858, 0.0573846660554409, 0.08856955915689468, -0.048963140696287155, 0.02132720686495304, 0.010273636318743229, 0.04448238015174866, 0.08224105089902878, -0.008171751163899899, 0.04138968512415886, -0.048842448741197586, -0.01638324186205864, 0.1029730811715126, 0.08728250861167908, 0.21971429884433746, 0.0009455631370656192, 0.12934674322605133, 0.04041379317641258, 0.1308455467224121, 0.029355710372328758, 0.057927560061216354, -0.04205237701535225, -0.012995707802474499, -0.012313414365053177, -0.09012124687433243, 0.020803548395633698, 0.08237502723932266, 0.03013748675584793, -0.06729934364557266, 0.0246786680072546, -0.0046821460127830505, 0.006504938006401062, 0.1379399448633194, 0.04790400341153145, -0.20098423957824707, 0.02457631379365921, 0.008414971642196178, 0.06562530994415283, -0.07822170853614807, 0.024165624752640724, 0.1232907772064209, -0.04447763040661812, 0.021562064066529274, -0.06794340163469315, 0.03165428712964058, -0.05718480423092842, -0.006228474900126457, 0.022379616275429726, 0.05396804213523865, 0.0404764823615551, 0.07699143141508102, -0.13100270926952362, 0.09900958091020584, 0.01960671879351139, 0.002195102395489812, -0.07891851663589478, 0.04709603264927864, 0.012441604398190975, 0.043061304837465286, 0.15210488438606262, 0.004679981619119644, -0.04927216097712517, -0.05984170362353325, -0.09620407968759537, 0.007221973035484552, 0.129723459482193, -0.04813550412654877, 0.04497082903981209, 0.004533205647021532, -0.00612438702955842, -0.01903550885617733, 0.026714662089943886, -0.08241057395935059, -0.18081068992614746, 0.0690138041973114, 0.008454442024230957, -0.027243077754974365, -0.03246340900659561, -0.011514517478644848, -0.11167949438095093, 0.17281264066696167, -0.11253046989440918, -0.031779754906892776, -0.0837709978222847, -0.013204716145992279, 0.14034424722194672, -0.0433402843773365, 0.014237683266401291, -0.03235230967402458, 0.10031389445066452, -0.053739577531814575, -0.1331314891576767, 0.04887883737683296, -0.057884734123945236, -0.06986788660287857, -0.020083963871002197, 0.08337376266717911, -0.013149142265319824, 0.02054869569838047, 0.004223478492349386, 0.03110549785196781, -0.038617972284555435, -0.08405902981758118, 0.0010338947176933289, -0.017834985628724098, 0.04747181013226509, 0.03038567118346691, -0.09024935960769653, 0.04231816157698631, -0.020342132076621056, 0.010508003644645214, 0.11064545065164566, 0.2753015458583832, -0.07901283353567123, 0.08886510133743286, 0.13563282787799835, -0.06368014961481094, -0.2201184183359146, 0.026312490925192833, -0.0007213701610453427, 0.003563113510608673, 0.04759875312447548, -0.21812750399112701, 0.08970990777015686, 0.13828803598880768, -0.011963277123868465, 0.14593206346035004, -0.236345112323761, -0.06577616930007935, -0.024363374337553978, 0.017193492501974106, 0.031884077936410904, -0.12573102116584778, -0.02711566351354122, -0.019805345684289932, -0.05224110558629036, 0.1566603183746338, 0.030693531036376953, 0.06883204728364944, -0.00804834347218275, -0.004202773328870535, 0.044569749385118484, -0.06145074963569641, 0.13504429161548615, -0.021087681874632835, 0.09242191910743713, -0.0652252808213234, 0.03310433402657509, 0.1290808767080307, -0.04931590333580971, 0.10223603248596191, -0.05998412147164345, 0.012559174560010433, -0.09466410428285599, 0.013245675712823868, -0.06379816681146622, 0.04683643579483032, -0.02784748189151287, 
-0.03800166770815849, -0.09564919024705887, 0.051451150327920914, 0.036609914153814316, 0.007531327661126852, 0.09064041823148727, 0.020768892019987106, -0.029952839016914368, 0.13351796567440033, 0.037172313779592514, -0.026098886504769325, -0.12343331426382065, -0.032316479831933975, -0.005421824753284454, 0.0734265148639679, -0.11344793438911438, 0.03200641646981239, 0.09147634357213974, 0.014555112458765507, 0.08362952619791031, 0.02715705893933773, -0.08603478223085403, 0.014076494611799717, 0.082809679210186, -0.07905950397253036, -0.12408232688903809, -0.010431925766170025, 0.03377600386738777, -0.02775188721716404, -0.038988735526800156, 0.1326364129781723, -0.03131769225001335, 0.010671828873455524, 0.01187905203551054, 0.06095569208264351, -0.011842907406389713, 0.0476997010409832, 0.03110358864068985, -0.014904595911502838, -0.08918198198080063, 0.107278972864151, 0.0769289955496788, -0.11567598581314087, -0.005275269504636526, 0.03593713417649269, -0.06406708806753159, -0.07645920664072037, -0.067436583340168, -0.02639375813305378, -0.09604852646589279, -0.005622871220111847, 0.014506024308502674, -0.0469115786254406, 0.031297314912080765, -0.020941248163580894, 0.031653519719839096, 0.034698937088251114, -0.007428605109453201, -0.024911776185035706, -0.08982809633016586, 0.044858288019895554, -0.06872900575399399, 0.08354502171278, -0.08235975354909897, 0.059357356280088425, -0.003053602995350957, 0.04534987732768059, -0.02352035976946354, -0.023447921499609947, -0.0372217632830143, 0.011411010287702084, -0.07832620292901993, 0.014806372113525867, -0.04397333785891533, -0.02534513734281063, 0.01857389137148857, 0.04642732813954353, -0.028288275003433228, 0.022295838221907616, -0.08707445114850998, -0.04801921918988228, -0.029384097084403038, 0.043548833578825, -0.086431123316288, -0.02445463091135025, 0.012277905829250813, -0.08422397822141647, 0.11175897717475891, 0.007554542273283005, -0.025832505896687508, -0.017548421397805214, -0.05174238979816437, -0.05947086587548256, 0.051133424043655396, 0.06305196136236191, -0.018023286014795303, -0.07049652189016342, 0.03681585192680359, 0.026074761524796486, -0.06298699229955673, -0.01444945763796568, 0.02099279873073101, -0.09795892238616943, 0.031061416491866112, -0.06340952217578888, 0.005699194967746735, -0.031816672533750534, 0.01106245443224907, 0.01932578533887863, 0.048528581857681274, 0.13380329310894012, -0.0014053067425265908, 0.023012027144432068, -0.10096049308776855, -0.026315955445170403, -0.02126535028219223, -0.03199024498462677, -0.03938668593764305, -0.022819971665740013, 0.029506288468837738, -0.03260189667344093, 0.14173458516597748, -0.0532676987349987, -0.04100794717669487, 0.00554998591542244, 0.013488978147506714, -0.023158004507422447, 0.022745436057448387, 0.10707905888557434, 0.0674716904759407, -0.008032386191189289, -0.009137342683970928, 0.02652728743851185, 0.033484723418951035, 0.002322559477761388, 0.04357327148318291, 0.11371753364801407, 0.013942350633442402, 0.06547994166612625, 0.08625563979148865, -0.08271823823451996, -0.03513866290450096, 0.1006450429558754, -0.07980397343635559, 0.09228209406137466, -0.04033133387565613, 0.07912702113389969, 0.1413712352514267, -0.07300993800163269, 0.06106755509972572, 0.017199739813804626, -0.026868373155593872, -0.045606791973114014, -0.1771780401468277, -0.0615726113319397, -0.04899738356471062, 0.020664187148213387, -0.07246500998735428, 0.04838121309876442, 0.12121393531560898, 0.04667964577674866, -0.021813014522194862, 0.04800540581345558, 
-0.06845150142908096, -0.08581461757421494, 0.0759955421090126, 0.00957655068486929, -0.001475513563491404, 0.06131072714924812, -0.02895733155310154, 0.08347845077514648, -0.003427491756156087, 0.07377292960882187, 0.06034420430660248, 0.14475645124912262, 0.048201967030763626, -0.05605137348175049, -0.06661731004714966, -0.02674844115972519, -0.024637527763843536, -0.015452968887984753, 0.1005670353770256, 0.0068838391453027725, -0.0055803884752094746, -0.025280991569161415, 0.08263573795557022, -0.03297671303153038, -0.015127323567867279, -0.09356149286031723, 0.08014533668756485, -0.010381761007010937, 0.06274326890707016, -0.025406768545508385, -0.07700338959693909, -0.021014800295233727, 0.1854225993156433, 0.143607035279274, -0.12755994498729706, -0.01669994369149208, 0.0725112333893776, -0.008470091037452221, -0.052413325756788254, 0.13308994472026825, 0.039178941398859024, 0.16690050065517426, -0.035758715122938156, 0.007952024228870869, -0.047867465764284134, 0.0011209665099158883, -0.0995752215385437, 0.0676320418715477, -0.04181484505534172, -0.0007741264998912811, -0.03993421420454979, 0.03073432296514511, -0.0027659039478749037, -0.065764419734478, 0.05908507481217384, -0.04291839897632599, -0.09129951149225235, 0.0010192102054134011, 0.028237320482730865, -0.032687705010175705, 0.026323368772864342, -0.03716803342103958, -0.032400522381067276, 0.07666473835706711, -0.014948527328670025, -0.1057928204536438, -0.0545756034553051, 0.039138708263635635, 0.01417966466397047, 0.18728964030742645, 0.03332244232296944, 0.060901742428541183, 0.04497100040316582, -0.010875784792006016, -0.14837366342544556, 0.017056504264473915, 0.029282301664352417, -0.08316301554441452, 0.009149782359600067, 0.09636759757995605, -0.008974644355475903, 0.045405205339193344, 0.06434024125337601, 0.011037684977054596, 0.013563907705247402, 0.013598927296698093, -0.007825245149433613, -0.08426783233880997, -0.008483488112688065, -0.07630085200071335, 0.0905415415763855, 0.09127404540777206, 0.01591654121875763, -0.03054649569094181, -0.0739530622959137, 0.017747635021805763, 0.06005626916885376, 0.08410360664129257, -0.010793034918606281, -0.05712760612368584, 0.00539959454908967, -0.026909777894616127, 0.06114370748400688, -0.10131052881479263, -0.048597510904073715, 0.022454330697655678, -0.011879428289830685, -0.044753819704055786, 0.08875835686922073, 0.08818736672401428, 0.025118475779891014, -0.02004515938460827, -0.07576694339513779, 0.02166636474430561, 0.01962604932487011, -0.06669335812330246, -0.0591922290623188 ]
null
null
transformers
# M2M100 1.2B

M2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation. It was introduced in this [paper](https://arxiv.org/abs/2010.11125) and first released in [this](https://github.com/pytorch/fairseq/tree/master/examples/m2m_100) repository.

The model can directly translate between any of the 9,900 directions spanned by its 100 languages. To translate into a target language, the target language id must be the first generated token; to force this, pass the `forced_bos_token_id` parameter to the `generate` method.

*Note: `M2M100Tokenizer` depends on `sentencepiece`, so make sure to install it before running the example.*

To install `sentencepiece`, run `pip install sentencepiece`.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

hi_text = "जीवन एक चॉकलेट बॉक्स की तरह है।"
chinese_text = "生活就像一盒巧克力。"

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_1.2B")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_1.2B")

# translate Hindi to French
tokenizer.src_lang = "hi"
encoded_hi = tokenizer(hi_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi, forced_bos_token_id=tokenizer.get_lang_id("fr"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "La vie est comme une boîte de chocolat."

# translate Chinese to English
tokenizer.src_lang = "zh"
encoded_zh = tokenizer(chinese_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_zh, forced_bos_token_id=tokenizer.get_lang_id("en"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Life is like a box of chocolate."
```

See the [model hub](https://huggingface.co/models?filter=m2m_100) to look for more fine-tuned versions.
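The pattern above generalizes to any supported direction. Below is a minimal sketch, not part of the original card, assuming the same `facebook/m2m100_1.2B` checkpoint and the `get_lang_id` helper shown above: one encoded source sentence is reused to decode into several target languages, and the list of target codes is purely illustrative.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_1.2B")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_1.2B")

src_text = "जीवन एक चॉकलेट बॉक्स की तरह है।"
tokenizer.src_lang = "hi"  # the source language must be set before encoding
encoded = tokenizer(src_text, return_tensors="pt")

# Illustrative target codes; any of the 100 supported language ids would work.
for target in ["fr", "en", "de"]:
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.get_lang_id(target),  # force the target language id as the first token
    )
    print(target, "=>", tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Because the source encoding does not depend on the target language, setting `src_lang` once and varying only `forced_bos_token_id` is enough to fan one sentence out to many targets.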
## Languages covered

Afrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)

## BibTeX entry and citation info

```
@misc{fan2020englishcentric,
      title={Beyond English-Centric Multilingual Machine Translation},
      author={Angela Fan and Shruti Bhosale and Holger Schwenk and Zhiyi Ma and Ahmed El-Kishky and Siddharth Goyal and Mandeep Baines and Onur Celebi and Guillaume Wenzek and Vishrav Chaudhary and Naman Goyal and Tom Birch and Vitaliy Liptchinsky and Sergey Edunov and Edouard Grave and Michael Auli and Armand Joulin},
      year={2020},
      eprint={2010.11125},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
{"language": ["multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", false, "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu"], "license": "mit"}
text2text-generation
facebook/m2m100_1.2B
[ "transformers", "pytorch", "rust", "m2m_100", "text2text-generation", "multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu", "arxiv:2010.11125", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.11125" ]
[ "multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu" ]
TAGS #transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
# M2M100 1.2B M2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation. It was introduced in this paper and first released in this repository. The model that can directly translate between the 9,900 directions of 100 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method. *Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.* To install 'sentencepiece' run 'pip install sentencepiece' See the model hub to look for more fine-tuned versions. ## Languages covered Afrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu) ## BibTeX entry and citation info
[ "# M2M100 1.2B\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nAfrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)", "## BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# M2M100 1.2B\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nAfrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)", "## BibTeX entry and citation info" ]
[ 277, 202, 531, 10 ]
[ "passage: TAGS\n#transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# M2M100 1.2B\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions." ]
[ -0.049534182995557785, -0.05939195677638054, -0.007935303263366222, 0.008215540088713169, 0.06320477277040482, 0.01878035068511963, 0.1228826642036438, 0.10017450153827667, 0.01507277600467205, 0.14542266726493835, 0.06044948846101761, 0.09725268185138702, 0.08512286096811295, 0.09086515754461288, 0.04448439180850983, -0.2149176448583603, 0.04820125177502632, -0.08879338949918747, 0.005158722400665283, 0.08111332356929779, 0.07545612007379532, -0.04271940141916275, 0.08771838247776031, -0.042954616248607635, -0.01724947988986969, 0.015725530683994293, -0.045703768730163574, -0.01638294942677021, 0.038586899638175964, 0.08128590136766434, 0.008809820748865604, 0.04150966554880142, 0.06155432015657425, -0.24394787847995758, 0.004798805341124535, 0.05448833107948303, -0.025135328993201256, -0.007662404328584671, 0.07659227401018143, -0.03366457298398018, 0.1918749064207077, -0.11934584379196167, -0.04753473028540611, 0.031633660197257996, -0.11853660643100739, -0.08955960720777512, -0.06658972054719925, 0.10137473791837692, 0.0709296315908432, 0.04272754490375519, -0.05689613148570061, 0.037322722375392914, -0.014209810644388199, 0.08279021084308624, 0.204960435628891, -0.2926751375198364, -0.053947847336530685, 0.11615187674760818, 0.07216189801692963, 0.10367515683174133, -0.056483250111341476, 0.04924459382891655, 0.024919399991631508, 0.024717533960938454, -0.047808386385440826, -0.05321911722421646, 0.12195403128862381, 0.015775667503476143, -0.16667522490024567, -0.04986177384853363, 0.12652795016765594, 0.005193409975618124, -0.017030589282512665, -0.04777213931083679, -0.05045812204480171, -0.056248296052217484, -0.060790155082941055, -0.009851466864347458, 0.017948513850569725, 0.04449232667684555, 0.046401046216487885, -0.04252190142869949, -0.09601151198148727, -0.03064947947859764, -0.04929789528250694, 0.11480526626110077, 0.022203169763088226, -0.008596688508987427, -0.0023063807748258114, 0.056396301835775375, -0.04533206298947334, -0.10370145738124847, -0.04146307334303856, -0.0226572398096323, -0.059630267322063446, 0.006229548715054989, -0.0033294057939201593, -0.017083948478102684, 0.11573341488838196, 0.0994136780500412, -0.05314386636018753, 0.06559322029352188, -0.007428355515003204, 0.06078159436583519, 0.04715675488114357, 0.12477748095989227, -0.09041508287191391, -0.052005667239427567, -0.0478871650993824, 0.014398805797100067, 0.0045144883915781975, -0.014462240971624851, -0.09334757179021835, -0.022100193426012993, 0.04357847198843956, 0.07434998452663422, 0.048381876200437546, 0.069816455245018, -0.0297408290207386, 0.011310703121125698, 0.03338978812098503, -0.14212483167648315, 0.0511648915708065, 0.0435987263917923, 0.007462233770638704, 0.07532837241888046, -0.022148244082927704, -0.01354560162872076, -0.08366160839796066, 0.02977345883846283, -0.048422057181596756, 0.03556162491440773, -0.060481999069452286, -0.10193536430597305, 0.01722724363207817, -0.07307951152324677, -0.04619010165333748, -0.11930043250322342, -0.06402270495891571, -0.0869530588388443, 0.013346980325877666, -0.06582123786211014, 0.044837288558483124, -0.09570366144180298, -0.08783534169197083, 0.03907323256134987, 0.013662334531545639, 0.044123560190200806, -0.05291198194026947, 0.04503561183810234, -0.020717183127999306, 0.06898892670869827, 0.0712231770157814, 0.0025831381790339947, -0.03425322473049164, 0.08941760659217834, -0.13969473540782928, 0.12144935876131058, -0.09191987663507462, 0.025828564539551735, -0.13376301527023315, -0.0510619580745697, -0.03582053631544113, 
0.05632440000772476, 0.039102304726839066, 0.15323056280612946, -0.16331981122493744, -0.030384857207536697, 0.23949898779392242, -0.06217575818300247, -0.03253718465566635, 0.11577111482620239, 0.06249360367655754, 0.0553848072886467, 0.0368545800447464, 0.11717568337917328, 0.051144249737262726, -0.14445552229881287, -0.03793404623866081, 0.0944187194108963, -0.04569689556956291, 0.13501721620559692, 0.09118235856294632, -0.02631702460348606, 0.04441586509346962, 0.017066849395632744, -0.017688190564513206, 0.0228143148124218, -0.026697048917412758, -0.052440810948610306, 0.02618812583386898, -0.005865488201379776, -0.03675760328769684, -0.01294204406440258, -0.018115324899554253, -0.0024581216275691986, -0.06787683069705963, 0.006448582746088505, 0.10666601359844208, -0.013972671702504158, 0.05096380412578583, -0.12735912203788757, 0.06826964020729065, -0.041590187698602676, 0.05150402709841728, -0.1512693613767624, 0.050051748752593994, 0.0014350848505273461, 0.012576635926961899, 0.09414573013782501, 0.08890015631914139, 0.07006102055311203, 0.05601765960454941, 0.003871308173984289, -0.028619466349482536, 0.07405053824186325, -0.014303392730653286, -0.040861956775188446, -0.20016510784626007, -0.021929794922471046, -0.052503667771816254, 0.07353281229734421, -0.1048903539776802, 0.03910021111369133, 0.09188005328178406, 0.06451194733381271, -0.04942355677485466, 0.01735113002359867, -0.007083349861204624, 0.05015351623296738, -0.027111349627375603, -0.0361890085041523, 0.053251948207616806, -0.009297423996031284, -0.0384942851960659, 0.057133208960294724, -0.13818562030792236, -0.09454885870218277, 0.08380616456270218, -0.035347193479537964, -0.06665054708719254, 0.010187107138335705, -0.010466381907463074, -0.038804806768894196, 0.043895743787288666, -0.05882784351706505, 0.10276176780462265, 0.06368038058280945, 0.03682301193475723, -0.07892525941133499, -0.05087636411190033, 0.015011323615908623, -0.0740121528506279, -0.06590145081281662, 0.13554798066616058, 0.058879900723695755, -0.1985267996788025, 0.10999756306409836, 0.10711649805307388, 0.015243975445628166, 0.15499745309352875, -0.00470357621088624, -0.09466856718063354, -0.082054503262043, 0.10018038004636765, 0.03373512625694275, -0.0030947886407375336, -0.08328171819448471, -0.043556783348321915, -0.004951129667460918, 0.020052455365657806, 0.022211924195289612, -0.08709622174501419, 0.04299874231219292, -0.010396908968687057, -0.06227058172225952, 0.05618322640657425, 0.031249739229679108, -0.03319292888045311, 0.08294912427663803, 0.024066362529993057, 0.00015790134784765542, -0.05242454633116722, -0.03942228853702545, -0.09332993626594543, 0.1640755534172058, -0.15019693970680237, -0.1831549108028412, -0.10491319745779037, -0.07245049625635147, -0.06251595914363861, 0.016401223838329315, 0.042779341340065, -0.0943625345826149, -0.011848262511193752, -0.051792584359645844, 0.09107401221990585, -0.062162842601537704, -0.07147212326526642, -0.023934274911880493, 0.03319856896996498, -0.01152728870511055, -0.09805471450090408, -0.03261519595980644, 0.0384751595556736, -0.11160211265087128, 0.04420124366879463, -0.04472780600190163, 0.082871213555336, 0.11584526300430298, 0.07367156445980072, 0.01709509640932083, 0.007768571842461824, 0.21076753735542297, -0.0729580968618393, 0.04161312058568001, 0.07726272195577621, 0.03240139037370682, 0.07142610102891922, 0.1319771707057953, 0.026460111141204834, -0.05366668477654457, -0.019276730716228485, -0.011888190172612667, -0.0146310580894351, -0.14773236215114594, 
-0.06061471998691559, -0.058505039662122726, 0.04788171499967575, 0.04587729647755623, 0.08786778897047043, -0.05269995704293251, 0.02723780833184719, -0.04013349488377571, 0.030864126980304718, 0.05425649508833885, 0.06961727142333984, 0.06423734128475189, -0.04530242830514908, 0.03979319706559181, -0.02779463864862919, -0.02277660183608532, 0.08835603296756744, 0.042809709906578064, 0.06200525537133217, 0.021408572793006897, 0.15410412847995758, 0.09017831832170486, 0.09059598296880722, 0.019196398556232452, 0.06769418716430664, -0.013674605637788773, 0.024542804807424545, -0.009751268662512302, -0.08713989704847336, -0.033952344208955765, 0.0848798081278801, 0.11325956135988235, -0.05884579196572304, 0.02699197083711624, 0.01360100507736206, 0.10575244575738907, 0.17347216606140137, 0.030400147661566734, -0.1373721808195114, -0.010444719344377518, 0.06332185864448547, 0.027632733806967735, -0.08674337714910507, 0.015024581924080849, 0.09103461354970932, -0.08885841816663742, 0.06537619233131409, 0.03470882028341293, 0.07965631037950516, -0.08458271622657776, 0.02057960256934166, -0.022482968866825104, 0.10222992300987244, 0.0147535540163517, 0.10083911567926407, -0.2699204385280609, 0.11107534170150757, 0.014247863553464413, 0.03413432091474533, -0.01993640698492527, 0.03412103280425072, 0.014007072895765305, 0.07845406234264374, 0.1640511006116867, 0.046371303498744965, -0.13532964885234833, -0.17286647856235504, -0.04740903899073601, -0.021911723539233208, 0.144374281167984, -0.05817614868283272, 0.07924899458885193, -0.011354634538292885, -0.04503283649682999, -0.058577269315719604, 0.0517568476498127, -0.09030836075544357, -0.11813315004110336, 0.07617989182472229, -0.02770979329943657, 0.02782384864985943, -0.02845829725265503, -0.0013457058230414987, -0.1092444509267807, 0.12413626909255981, -0.12426958233118057, -0.05626802518963814, -0.07635964453220367, -0.0017018263461068273, 0.1211656853556633, -0.10868431627750397, -0.04182935506105423, -0.02610243298113346, 0.05207320675253868, -0.041408367455005646, -0.04351864382624626, 0.04217606037855148, -0.073491670191288, -0.13229627907276154, -0.0015605436637997627, 0.10884295403957367, 0.017472226172685623, 0.041472427546978, -0.011519582942128181, 0.027140220627188683, -0.011671609245240688, -0.13426321744918823, 0.03177215903997421, 0.10803298652172089, 0.0003453791723586619, 0.04483424872159958, -0.11708054691553116, -0.0761823058128357, -0.09737710654735565, -0.04772338271141052, 0.10610400885343552, 0.2690548300743103, -0.056267693638801575, 0.07826048135757446, 0.1732858121395111, -0.11643647402524948, -0.2243562638759613, -0.166349396109581, 0.017223695293068886, 0.04125010222196579, -0.057815536856651306, -0.1759910136461258, -0.0006703204708173871, 0.010256927460432053, 0.01036170031875372, 0.03600628674030304, -0.2923087477684021, -0.0974358320236206, 0.0797630101442337, 0.007759107742458582, -0.012232211418449879, -0.18501777946949005, -0.08908411115407944, -0.06434296816587448, -0.020703203976154327, -0.0061998614110052586, -0.036658063530921936, 0.08091442286968231, 0.021071523427963257, 0.008357184007763863, 0.03729107603430748, -0.015137060545384884, 0.16242261230945587, 0.02133854292333126, 0.00019562071247491986, -0.10934124141931534, -0.06306580454111099, 0.0588003508746624, -0.019450092688202858, 0.069617860019207, -0.05707213655114174, 0.010022332891821861, -0.031775183975696564, -0.008346772752702236, -0.07381036132574081, 0.054756537079811096, -0.06071799248456955, -0.031597793102264404, -0.053041789680719376, 
0.04368111863732338, 0.04656723886728287, -0.0016671086195856333, 0.054679013788700104, -0.09013428539037704, 0.04275905713438988, 0.1566094160079956, 0.08150114119052887, 0.037189874798059464, -0.010528990998864174, -0.022108109667897224, -0.0021978795994073153, 0.012145578861236572, -0.060259513556957245, -0.0013153098989278078, 0.10013075172901154, -0.00008617227285867557, 0.12387795746326447, 0.03499584272503853, -0.13545100390911102, -0.007057784590870142, 0.09223134070634842, -0.08661839365959167, -0.16519372165203094, -0.008410434238612652, -0.01403407659381628, 0.01722029410302639, -0.02268006093800068, 0.14450517296791077, -0.019940651953220367, -0.0049104634672403336, -0.007785134017467499, 0.07037266343832016, -0.08546246588230133, 0.10691675543785095, 0.009181218221783638, 0.04130329191684723, -0.06544346362352371, 0.04867413267493248, 0.06832888722419739, -0.0425453819334507, 0.013686216436326504, 0.15562205016613007, -0.0879831537604332, -0.09572087973356247, -0.022936437278985977, 0.10097402334213257, 0.003112498205155134, -0.043161675333976746, 0.02336948551237583, -0.07164940237998962, 0.057675063610076904, 0.10883436352014542, 0.03585421293973923, 0.041551802307367325, 0.015889804810285568, 0.010765446349978447, 0.003917145077139139, 0.06943493336439133, 0.008851057849824429, -0.004284232389181852, -0.08349715173244476, 0.08492803573608398, 0.009962611831724644, 0.10982074588537216, -0.022269925102591515, -0.03312130644917488, -0.12944166362285614, 0.005480057559907436, -0.08889611065387726, 0.030844485387206078, -0.14553005993366241, -0.018170569092035294, 0.02721647173166275, -0.0431504063308239, -0.013024035841226578, -0.02025209367275238, -0.08693358302116394, -0.04207010939717293, -0.08916892111301422, 0.12422656267881393, -0.09696915000677109, -0.025219367817044258, 0.03987541049718857, -0.0929027795791626, 0.06722653657197952, 0.07850680500268936, -0.02975008822977543, 0.06109081581234932, -0.0757807195186615, -0.00038798354216851294, 0.006874267943203449, 0.05585244297981262, 0.005912924185395241, -0.09620776772499084, -0.001232592505402863, -0.012849176302552223, -0.0032702647149562836, -0.0025546506512910128, 0.028554707765579224, -0.07144042104482651, 0.11143793910741806, -0.026962995529174805, -0.06102709472179413, -0.07052718847990036, 0.06639391928911209, 0.04844958707690239, 0.04899795353412628, 0.06880396604537964, -0.0837254673242569, 0.05524798482656479, -0.09811395406723022, 0.006750370841473341, 0.011587663553655148, -0.052174609154462814, 0.03317965939640999, -0.09283823519945145, 0.06275784969329834, -0.03161217272281647, 0.0731862410902977, -0.006406077183783054, -0.01194336824119091, -0.00636648153886199, -0.05232169106602669, -0.06599687784910202, 0.03582312539219856, 0.06677135825157166, 0.04662531614303589, -0.0026347143575549126, -0.08039025962352753, 0.005975911393761635, -0.013735276646912098, 0.02802581526339054, 0.036974258720874786, 0.0883457362651825, 0.14408841729164124, 0.08472494035959244, 0.03643856570124626, -0.08372487872838974, -0.048278454691171646, 0.06467502564191818, -0.1263042837381363, 0.0025698491372168064, -0.11013806611299515, 0.17410564422607422, 0.16657327115535736, -0.14558085799217224, 0.0650584027171135, -0.03618602827191353, -0.05844749137759209, -0.10965105891227722, -0.21165283024311066, -0.05291641503572464, -0.04500240832567215, 0.021974418312311172, -0.10321491956710815, 0.08007192611694336, 0.010878030210733414, 0.06687426567077637, 0.031942445784807205, 0.09605769068002701, -0.0586540661752224, 
-0.08760742098093033, 0.04882901906967163, -0.016539236530661583, 0.03432856500148773, -0.029351992532610893, 0.010382345877587795, 0.03623581305146217, -0.03863029554486275, 0.03796821087598801, 0.07659374922513962, 0.00693880207836628, 0.017647814005613327, -0.0732930600643158, -0.06899313628673553, -0.01005993876606226, 0.005453329998999834, -0.026519814506173134, 0.10146903246641159, 0.05632583051919937, -0.07314469665288925, 0.010180520825088024, 0.10893500596284866, -0.059493161737918854, -0.19404926896095276, -0.09211920201778412, 0.192830428481102, -0.014103040099143982, 0.055982932448387146, -0.03708261251449585, -0.05965651944279671, -0.04749119654297829, 0.21743635833263397, 0.1846182644367218, -0.051480405032634735, 0.022317226976156235, 0.05273616686463356, 0.028970276936888695, 0.00023158921976573765, 0.041360240429639816, 0.060952045023441315, 0.24172376096248627, -0.07283644378185272, 0.0869777649641037, -0.0645325779914856, -0.006906591355800629, -0.091585174202919, 0.03271253779530525, -0.026672204956412315, -0.002629975089803338, -0.029294749721884727, 0.102747343480587, -0.13812434673309326, -0.13458885252475739, -0.004360491409897804, -0.04900345206260681, -0.05425220727920532, 0.0037558614276349545, 0.0815359354019165, 0.06284209340810776, 0.07055308669805527, -0.0107819102704525, -0.026789367198944092, 0.04958247020840645, 0.005494150333106518, -0.06033610180020332, 0.02667825296521187, 0.04425312578678131, -0.06373851746320724, 0.05977831780910492, -0.02145494893193245, 0.06768611818552017, 0.1094774678349495, -0.0053015900775790215, -0.08972479403018951, 0.08821677416563034, 0.044186994433403015, -0.08782651275396347, 0.030312031507492065, 0.09615754336118698, -0.016291934996843338, 0.03660091385245323, 0.06393133103847504, -0.051652729511260986, 0.08291608095169067, 0.13348732888698578, 0.031891610473394394, -0.058023326098918915, 0.07143396139144897, -0.10454214364290237, 0.12936760485172272, 0.13364644348621368, 0.007871107198297977, -0.034911397844552994, -0.0542709156870842, 0.041168827563524246, -0.04039742425084114, 0.0848056823015213, -0.03635065630078316, -0.15948455035686493, 0.00025262695271521807, -0.02439906634390354, 0.0942818745970726, -0.10206474363803864, -0.059229571372270584, 0.03815164044499397, 0.036325544118881226, -0.08835672587156296, 0.13932333886623383, 0.05685891583561897, 0.014335243962705135, -0.043840888887643814, -0.2470865249633789, 0.024864930659532547, 0.13584816455841064, -0.11698777973651886, -0.0672525018453598 ]
null
null
transformers
# M2M100 418M

M2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation. It was introduced in this [paper](https://arxiv.org/abs/2010.11125) and first released in [this](https://github.com/pytorch/fairseq/tree/master/examples/m2m_100) repository.

The model can directly translate between any of the 9,900 directions spanned by its 100 languages. To translate into a target language, the target language id must be the first generated token; to force this, pass the `forced_bos_token_id` parameter to the `generate` method.

*Note: `M2M100Tokenizer` depends on `sentencepiece`, so make sure to install it before running the example.*

To install `sentencepiece`, run `pip install sentencepiece`.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

hi_text = "जीवन एक चॉकलेट बॉक्स की तरह है।"
chinese_text = "生活就像一盒巧克力。"

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

# translate Hindi to French
tokenizer.src_lang = "hi"
encoded_hi = tokenizer(hi_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi, forced_bos_token_id=tokenizer.get_lang_id("fr"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "La vie est comme une boîte de chocolat."

# translate Chinese to English
tokenizer.src_lang = "zh"
encoded_zh = tokenizer(chinese_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_zh, forced_bos_token_id=tokenizer.get_lang_id("en"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Life is like a box of chocolate."
```

See the [model hub](https://huggingface.co/models?filter=m2m_100) to look for more fine-tuned versions.
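For reuse across many sentence pairs, the steps above can be wrapped in a small helper. The `translate` function below is an illustrative sketch (not part of the library or the original card), assuming the same `facebook/m2m100_418M` checkpoint:

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

def translate(text: str, src_lang: str, tgt_lang: str) -> str:
    """Illustrative helper: translate `text` from `src_lang` into `tgt_lang`."""
    tokenizer.src_lang = src_lang                  # set the source language before encoding
    encoded = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.get_lang_id(tgt_lang),  # force the target language id
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

print(translate("जीवन एक चॉकलेट बॉक्स की तरह है।", "hi", "fr"))  # Hindi -> French
print(translate("生活就像一盒巧克力。", "zh", "en"))               # Chinese -> English
```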
## Languages covered

Afrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)

## BibTeX entry and citation info

```
@misc{fan2020englishcentric,
      title={Beyond English-Centric Multilingual Machine Translation},
      author={Angela Fan and Shruti Bhosale and Holger Schwenk and Zhiyi Ma and Ahmed El-Kishky and Siddharth Goyal and Mandeep Baines and Onur Celebi and Guillaume Wenzek and Vishrav Chaudhary and Naman Goyal and Tom Birch and Vitaliy Liptchinsky and Sergey Edunov and Edouard Grave and Michael Auli and Armand Joulin},
      year={2020},
      eprint={2010.11125},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
{"language": ["multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", false, "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu"], "license": "mit"}
text2text-generation
facebook/m2m100_418M
[ "transformers", "pytorch", "rust", "m2m_100", "text2text-generation", "multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu", "arxiv:2010.11125", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.11125" ]
[ "multilingual", "af", "am", "ar", "ast", "az", "ba", "be", "bg", "bn", "br", "bs", "ca", "ceb", "cs", "cy", "da", "de", "el", "en", "es", "et", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "ht", "hu", "hy", "id", "ig", "ilo", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "lb", "lg", "ln", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "ns", "oc", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv", "sw", "ta", "th", "tl", "tn", "tr", "uk", "ur", "uz", "vi", "wo", "xh", "yi", "yo", "zh", "zu" ]
TAGS #transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
# M2M100 418M M2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation. It was introduced in this paper and first released in this repository. The model that can directly translate between the 9,900 directions of 100 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method. *Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.* To install 'sentencepiece' run 'pip install sentencepiece' See the model hub to look for more fine-tuned versions. ## Languages covered Afrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu) ## BibTeX entry and citation info
[ "# M2M100 418M\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nAfrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)", "## BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# M2M100 418M\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nAfrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greeek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)", "## BibTeX entry and citation info" ]
[ 277, 203, 531, 10 ]
[ "passage: TAGS\n#transformers #pytorch #rust #m2m_100 #text2text-generation #multilingual #af #am #ar #ast #az #ba #be #bg #bn #br #bs #ca #ceb #cs #cy #da #de #el #en #es #et #fa #ff #fi #fr #fy #ga #gd #gl #gu #ha #he #hi #hr #ht #hu #hy #id #ig #ilo #is #it #ja #jv #ka #kk #km #kn #ko #lb #lg #ln #lo #lt #lv #mg #mk #ml #mn #mr #ms #my #ne #nl #no #ns #oc #or #pa #pl #ps #pt #ro #ru #sd #si #sk #sl #so #sq #sr #ss #su #sv #sw #ta #th #tl #tn #tr #uk #ur #uz #vi #wo #xh #yi #yo #zh #zu #arxiv-2010.11125 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# M2M100 418M\n\nM2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for Many-to-Many multilingual translation.\nIt was introduced in this paper and first released in this repository.\n\nThe model that can directly translate between the 9,900 directions of 100 languages.\nTo translate into a target language, the target language id is forced as the first generated token.\nTo force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n*Note: 'M2M100Tokenizer' depends on 'sentencepiece', so make sure to install it before running the example.*\n\nTo install 'sentencepiece' run 'pip install sentencepiece'\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions." ]
[ -0.0441865548491478, -0.06236843392252922, -0.0080857640132308, 0.00853306520730257, 0.061135828495025635, 0.01658567227423191, 0.12281297892332077, 0.09926015883684158, 0.008667554706335068, 0.14418508112430573, 0.07352975755929947, 0.10266076773405075, 0.08280905336141586, 0.09210904687643051, 0.04559127986431122, -0.22100113332271576, 0.04677163437008858, -0.08424637466669083, 0.007532170973718166, 0.08493820577859879, 0.07942395657300949, -0.037179239094257355, 0.0907699316740036, -0.04971606656908989, -0.012832988053560257, 0.015020497143268585, -0.04541761055588722, -0.0222181286662817, 0.03694961220026016, 0.08319280296564102, 0.00302896904759109, 0.03719284012913704, 0.05493485555052757, -0.23666292428970337, 0.005158006213605404, 0.05534693971276283, -0.031169120222330093, -0.010654401034116745, 0.07114773243665695, -0.04079582914710045, 0.20229054987430573, -0.12673144042491913, -0.04743603616952896, 0.02366926707327366, -0.12287282198667526, -0.09186459332704544, -0.0695764571428299, 0.10502802580595016, 0.06951656937599182, 0.047994330525398254, -0.058737728744745255, 0.04315057024359703, -0.011346797458827496, 0.08880207687616348, 0.20294983685016632, -0.2906986474990845, -0.057663824409246445, 0.11444354057312012, 0.07145380228757858, 0.10115228593349457, -0.04945440590381622, 0.05149048566818237, 0.024399951100349426, 0.026384901255369186, -0.0432872548699379, -0.05739395692944527, 0.12184768915176392, 0.014072504825890064, -0.17240524291992188, -0.0517549067735672, 0.13171708583831787, -0.002690128982067108, -0.014113018289208412, -0.048432644456624985, -0.05057348310947418, -0.06140168011188507, -0.06062445417046547, -0.009112980216741562, 0.015728292986750603, 0.044340018182992935, 0.045890696346759796, -0.030816860496997833, -0.09347111731767654, -0.028748134151101112, -0.05330950394272804, 0.10779691487550735, 0.020024903118610382, -0.011619656346738338, -0.005749901290982962, 0.050252821296453476, -0.056004516780376434, -0.10764399915933609, -0.042439091950654984, -0.024325454607605934, -0.05224915221333504, 0.005973203107714653, -0.005846008192747831, -0.0199146568775177, 0.11093026399612427, 0.10504084080457687, -0.045812755823135376, 0.06798697263002396, -0.004366680048406124, 0.06341923773288727, 0.04518601670861244, 0.12472853809595108, -0.08938144147396088, -0.049165137112140656, -0.04932941868901253, 0.01573791727423668, 0.010538595728576183, -0.016956087201833725, -0.09071583300828934, -0.02170662023127079, 0.04808748885989189, 0.07151024043560028, 0.0512448325753212, 0.0769830122590065, -0.0219163428992033, 0.014748742803931236, 0.01528680045157671, -0.14058351516723633, 0.04325014725327492, 0.04436051845550537, 0.007117033936083317, 0.0724487155675888, -0.017890451475977898, -0.01711220107972622, -0.08624359220266342, 0.03745955601334572, -0.04900725185871124, 0.03423964977264404, -0.05655999481678009, -0.10376564413309097, 0.021487638354301453, -0.07182323187589645, -0.04606974497437477, -0.12451756745576859, -0.053283799439668655, -0.08654852956533432, 0.013738383539021015, -0.06293045729398727, 0.04331611469388008, -0.09219396114349365, -0.08783147484064102, 0.03865072876214981, 0.016728797927498817, 0.046425316482782364, -0.05345689132809639, 0.04928649216890335, -0.025471096858382225, 0.06746404618024826, 0.07496606558561325, 0.00459299236536026, -0.03655196353793144, 0.08685965090990067, -0.13648897409439087, 0.11977831274271011, -0.08822961151599884, 0.02823159471154213, -0.13619568943977356, -0.05674128606915474, -0.044364362955093384, 
0.05856839939951897, 0.04113755375146866, 0.15987475216388702, -0.17344248294830322, -0.030817439779639244, 0.24012815952301025, -0.06118899956345558, -0.03910445421934128, 0.117357037961483, 0.06440655887126923, 0.05972367897629738, 0.039897266775369644, 0.12712588906288147, 0.050540972501039505, -0.14264826476573944, -0.034957271069288254, 0.09651829302310944, -0.042920853942632675, 0.1414058804512024, 0.09262952208518982, -0.025820450857281685, 0.042935293167829514, 0.014438274316489697, -0.021373102441430092, 0.018605509772896767, -0.02441449649631977, -0.052297063171863556, 0.027438068762421608, 0.0006455146358348429, -0.03318151459097862, -0.013116534799337387, -0.021516162902116776, -0.0022246157750487328, -0.0685700848698616, 0.00007572745380457491, 0.10264889895915985, -0.01374942809343338, 0.05072643980383873, -0.12324029952287674, 0.07405119389295578, -0.02570156939327717, 0.05641103535890579, -0.14942586421966553, 0.05906166881322861, -0.002315359655767679, 0.022409427911043167, 0.09764987975358963, 0.08016205579042435, 0.07059019058942795, 0.04985841363668442, 0.005837464239448309, -0.020936550572514534, 0.07761085778474808, -0.015522906556725502, -0.047557532787323, -0.19929225742816925, -0.018003858625888824, -0.04797498509287834, 0.07542525231838226, -0.11552761495113373, 0.03849981352686882, 0.10328821837902069, 0.05965768173336983, -0.054516784846782684, 0.02039833925664425, -0.00796094536781311, 0.053054168820381165, -0.02268749102950096, -0.03654620796442032, 0.05593036860227585, -0.008622213266789913, -0.04373716935515404, 0.06704050302505493, -0.1582125723361969, -0.09255653619766235, 0.08564265072345734, -0.03729083389043808, -0.0745542123913765, 0.0006134109571576118, -0.011123524978756905, -0.03717086836695671, 0.049100618809461594, -0.0621456578373909, 0.10863415151834488, 0.06332869082689285, 0.039927784353494644, -0.08154056966304779, -0.05375846475362778, 0.015889648348093033, -0.07575282454490662, -0.06547220051288605, 0.14365892112255096, 0.055003173649311066, -0.19341012835502625, 0.11286768317222595, 0.11546884477138519, 0.01502426527440548, 0.1525334119796753, -0.005398507229983807, -0.0926746353507042, -0.08534880727529526, 0.10075550526380539, 0.036791399121284485, -0.0077136484906077385, -0.07875001430511475, -0.046257272362709045, -0.0034252661280333996, 0.018979787826538086, 0.019365953281521797, -0.08599268645048141, 0.042165607213974, -0.006014523096382618, -0.057880718261003494, 0.05275999754667282, 0.03964623063802719, -0.03604818135499954, 0.08629883825778961, 0.027331853285431862, 0.0012711163144558668, -0.05465538427233696, -0.035797372460365295, -0.09665586799383163, 0.1703018695116043, -0.15131519734859467, -0.1859254240989685, -0.09976571798324585, -0.0659051239490509, -0.055837735533714294, 0.01570240408182144, 0.04405663162469864, -0.09641694277524948, -0.00903295911848545, -0.05395748093724251, 0.08740480244159698, -0.0653575211763382, -0.07023388892412186, -0.02976168505847454, 0.02894069254398346, -0.01797180250287056, -0.09628410637378693, -0.0367514044046402, 0.03598542883992195, -0.11565690487623215, 0.04859355092048645, -0.05108942836523056, 0.07901520282030106, 0.11609944701194763, 0.07495433837175369, 0.01856287755072117, -0.0005579664721153677, 0.21092642843723297, -0.07549355179071426, 0.04137616977095604, 0.0827292799949646, 0.027938054874539375, 0.07381021976470947, 0.13121238350868225, 0.02287091501057148, -0.053551509976387024, -0.017184404656291008, -0.014720884151756763, -0.0193343348801136, -0.1477341651916504, 
-0.059359487146139145, -0.06264641135931015, 0.05848197266459465, 0.05243009701371193, 0.0840478241443634, -0.044399578124284744, 0.029606960713863373, -0.04324454069137573, 0.03651151433587074, 0.05901453644037247, 0.0731961727142334, 0.08011981099843979, -0.04617716744542122, 0.04430171102285385, -0.02722797729074955, -0.02431025542318821, 0.0869002714753151, 0.03155307099223137, 0.06528342515230179, 0.019332636147737503, 0.15385843813419342, 0.09332673251628876, 0.09358593821525574, 0.015629252418875694, 0.06457450985908508, -0.01637645810842514, 0.02736980840563774, -0.01335546001791954, -0.09928344935178757, -0.04240451380610466, 0.08505889028310776, 0.10918411612510681, -0.06285601854324341, 0.028844283893704414, 0.019393326714634895, 0.10594423115253448, 0.1694953590631485, 0.025168247520923615, -0.1413334608078003, -0.012134776450693607, 0.05859315022826195, 0.03907231241464615, -0.08238669484853745, 0.015015202574431896, 0.08792046457529068, -0.08764521777629852, 0.07053039222955704, 0.034042004495859146, 0.08044292032718658, -0.08069433271884918, 0.024377405643463135, -0.021482937037944794, 0.09795839339494705, 0.013069065287709236, 0.09837842732667923, -0.27891892194747925, 0.12154341489076614, 0.01860562525689602, 0.030313728377223015, -0.02773081697523594, 0.03200707584619522, 0.01823422871530056, 0.07699056714773178, 0.15898539125919342, 0.049108050763607025, -0.13789130747318268, -0.167299285531044, -0.04609305039048195, -0.02243170328438282, 0.146359384059906, -0.05754992738366127, 0.07548417896032333, -0.014509416185319424, -0.041318006813526154, -0.05820367485284805, 0.06565183401107788, -0.08186233043670654, -0.1187039464712143, 0.08377420902252197, -0.027214057743549347, 0.029195187613368034, -0.026747632771730423, -0.006470068357884884, -0.0966779887676239, 0.11438354104757309, -0.1292918622493744, -0.053728606551885605, -0.07421787083148956, -0.00432512816041708, 0.11990918964147568, -0.11196806281805038, -0.04471905902028084, -0.028215786442160606, 0.04645048826932907, -0.04176563769578934, -0.043735865503549576, 0.043793052434921265, -0.07150585204362869, -0.12885791063308716, 0.002473583910614252, 0.11740709841251373, 0.011294451542198658, 0.04460744559764862, -0.011410082690417767, 0.03292832896113396, -0.010830376297235489, -0.1323584020137787, 0.026901734992861748, 0.09920674562454224, 0.004064860753715038, 0.035184502601623535, -0.12173835933208466, -0.07050701975822449, -0.1060161292552948, -0.05093412846326828, 0.10592500120401382, 0.2609952986240387, -0.0564899742603302, 0.0769299641251564, 0.1701885312795639, -0.11828873306512833, -0.2310871034860611, -0.16396403312683105, 0.023572195321321487, 0.03665975481271744, -0.056765343993902206, -0.1785486340522766, -0.009333972819149494, 0.009669953025877476, 0.017092779278755188, 0.03957176208496094, -0.30061182379722595, -0.10063793510198593, 0.07738976925611496, 0.0062177227810025215, -0.012259199284017086, -0.18627046048641205, -0.09066063165664673, -0.057379089295864105, -0.01644514501094818, -0.0024360064417123795, -0.0478600338101387, 0.08196892589330673, 0.02881925366818905, 0.010487755760550499, 0.03814954310655594, -0.012241275049746037, 0.15901093184947968, 0.021839449182152748, -0.0000011807630926341517, -0.11713142693042755, -0.06935705244541168, 0.050946835428476334, -0.018113138154149055, 0.06684813648462296, -0.06275293976068497, 0.008593257516622543, -0.03989953547716141, -0.004735580179840326, -0.07286816090345383, 0.054973650723695755, -0.06735432893037796, -0.033122289925813675, 
-0.051109421998262405, 0.04084213450551033, 0.050056032836437225, -0.006217130459845066, 0.05187604948878288, -0.08942554146051407, 0.04763978719711304, 0.15373463928699493, 0.08203427493572235, 0.040564294904470444, -0.016971569508314133, -0.020205527544021606, 0.0012947417562827468, 0.013412385247647762, -0.06260497868061066, -0.006280301138758659, 0.10163909196853638, 0.000143141791340895, 0.12455934286117554, 0.034003108739852905, -0.1293862760066986, -0.018091268837451935, 0.08941824734210968, -0.08753272145986557, -0.15644697844982147, -0.0057472954504191875, 0.0012731518363580108, 0.01645466312766075, -0.03126303479075432, 0.14802484214305878, -0.026053274050354958, -0.0033576658461242914, -0.0062638260424137115, 0.06800380349159241, -0.07913641631603241, 0.10927945375442505, 0.007709990721195936, 0.04153351113200188, -0.06443356722593307, 0.04484157636761665, 0.06489475071430206, -0.04692462831735611, 0.02148425206542015, 0.15939047932624817, -0.08489764481782913, -0.09619301557540894, -0.023980937898159027, 0.09442852437496185, 0.007524093147367239, -0.037610068917274475, 0.028652401641011238, -0.07422032207250595, 0.05737045034766197, 0.12569701671600342, 0.035513073205947876, 0.04109632596373558, 0.015113737434148788, 0.0075612557120621204, 0.008591147139668465, 0.06636135280132294, 0.011064679361879826, -0.010433714836835861, -0.07179117947816849, 0.10058522969484329, 0.009086386300623417, 0.1101548969745636, -0.023108039051294327, -0.034965917468070984, -0.13135956227779388, 0.0026987954042851925, -0.0691021978855133, 0.03414415568113327, -0.14048117399215698, -0.019969940185546875, 0.023641321808099747, -0.039018165320158005, -0.01673082448542118, -0.023897472769021988, -0.08630955964326859, -0.03593163564801216, -0.09102053195238113, 0.12383227795362473, -0.09469708055257797, -0.02966560795903206, 0.039610281586647034, -0.09275218099355698, 0.06522674858570099, 0.08068369328975677, -0.03515279293060303, 0.05906219035387039, -0.07737143337726593, 0.004819396883249283, 0.003951517399400473, 0.05769166722893715, -0.001106204348616302, -0.09752927720546722, 0.00117028399836272, -0.02086597867310047, 0.00033457615063525736, 0.0022065856028348207, 0.029378890991210938, -0.07398127764463425, 0.10893366485834122, -0.026200249791145325, -0.06137928366661072, -0.0690242201089859, 0.06813979148864746, 0.03572386875748634, 0.05054374411702156, 0.06997163593769073, -0.0875127837061882, 0.057102907449007034, -0.09620838612318039, 0.006836645305156708, 0.00995125062763691, -0.05860556662082672, 0.03224412724375725, -0.09381544589996338, 0.0653134137392044, -0.029217660427093506, 0.0672977864742279, -0.003782220184803009, -0.018033679574728012, -0.0011029911693185568, -0.04450326785445213, -0.06270843744277954, 0.030699551105499268, 0.06515508890151978, 0.04693762585520744, -0.0044125840067863464, -0.08989114314317703, 0.011204482987523079, -0.012860781513154507, 0.023095224052667618, 0.033606722950935364, 0.08780837804079056, 0.14944976568222046, 0.08049993216991425, 0.03202872723340988, -0.08080974221229553, -0.03685913234949112, 0.052988145500421524, -0.12150813639163971, 0.0069108628667891026, -0.11159810423851013, 0.1800956279039383, 0.16652487218379974, -0.14179787039756775, 0.0654720887541771, -0.04565064609050751, -0.06080272048711777, -0.107294000685215, -0.21291404962539673, -0.04832613840699196, -0.0373220220208168, 0.01630866341292858, -0.10135683417320251, 0.07878532260656357, 0.007816217839717865, 0.06956292688846588, 0.029676314443349838, 0.09536373615264893, 
-0.06675033271312714, -0.0942663848400116, 0.05496671423316002, -0.01651047356426716, 0.038659580051898956, -0.0328763872385025, 0.008567821234464645, 0.036277495324611664, -0.04561888799071312, 0.039968639612197876, 0.07897668331861496, 0.017234530299901962, 0.012466615997254848, -0.07421838492155075, -0.06812082976102829, -0.010676378384232521, 0.005933361127972603, -0.026786794885993004, 0.10311057418584824, 0.052597999572753906, -0.0745292380452156, 0.012548545375466347, 0.10977595299482346, -0.054136164486408234, -0.19621384143829346, -0.09149548411369324, 0.1939864456653595, -0.013649312779307365, 0.051745638251304626, -0.04416108876466751, -0.05343577638268471, -0.044974185526371, 0.2254399210214615, 0.1846202313899994, -0.046542491763830185, 0.023266976699233055, 0.0505935437977314, 0.03478165343403816, 0.011033127084374428, 0.04098726063966751, 0.06237497553229332, 0.23922713100910187, -0.07156766206026077, 0.09004911035299301, -0.07426659017801285, -0.005637055728584528, -0.09687858074903488, 0.031048869714140892, -0.023375090211629868, -0.007079927250742912, -0.022185197100043297, 0.10210762172937393, -0.13769784569740295, -0.12914691865444183, -0.011484510265290737, -0.06407091021537781, -0.053937558084726334, 0.0038156684022396803, 0.08184075355529785, 0.0710664764046669, 0.07623612880706787, -0.005801255814731121, -0.02774636633694172, 0.039013974368572235, 0.009266361594200134, -0.06001933664083481, 0.021964453160762787, 0.045007623732089996, -0.05772098898887634, 0.0565032958984375, -0.025340838357806206, 0.07160000503063202, 0.11005682498216629, -0.0036430256441235542, -0.07979397475719452, 0.08867553621530533, 0.047890909016132355, -0.08444236218929291, 0.02556694485247135, 0.08421612530946732, -0.0180834848433733, 0.027011940255761147, 0.06467559188604355, -0.05008658766746521, 0.08196137100458145, 0.13366541266441345, 0.035363830626010895, -0.05224836245179176, 0.07409630715847015, -0.10288923978805542, 0.12571723759174347, 0.14105089008808136, 0.004596840124577284, -0.04223264008760452, -0.061447229236364365, 0.028994914144277573, -0.03987015038728714, 0.06635712087154388, -0.03510456159710884, -0.15826305747032166, -0.0025768710765987635, -0.025641264393925667, 0.09215559810400009, -0.1035897433757782, -0.057264458388090134, 0.03660477325320244, 0.036483559757471085, -0.0886722132563591, 0.14129029214382172, 0.046185076236724854, 0.012914730235934258, -0.04317982494831085, -0.24656754732131958, 0.028647271916270256, 0.13719940185546875, -0.11532063037157059, -0.06515330076217651 ]
null
null
transformers
# MaskFormer

MaskFormer model trained on ADE20k semantic segmentation (base-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169).

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

url = "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-base-ade")
inputs = feature_extractor(images=image, return_tensors="pt")

model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-base-ade")
outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to feature_extractor for postprocessing
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_semantic_map = feature_extractor.post_process_semantic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
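As a hedged follow-up to the snippet above, the predicted semantic map can be summarized per class. This sketch reuses `predicted_semantic_map` and `model` from the snippet and assumes the standard `model.config.id2label` mapping shipped with the checkpoint:

```python
import numpy as np

# predicted_semantic_map is a (height, width) tensor of ADE20k class indices
seg = predicted_semantic_map.cpu().numpy()
for label_id in np.unique(seg):
    share = (seg == label_id).mean()
    print(f"{model.config.id2label[int(label_id)]}: {share:.1%} of pixels")
```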
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["scene_parse_150"], "widget": [{"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg", "example_title": "House"}, {"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-base-ade
[ "transformers", "pytorch", "maskformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on ADE20k semantic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 61, 97, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06014617159962654, 0.09670212864875793, -0.003704712726175785, 0.03385220468044281, 0.14323687553405762, 0.004243438597768545, 0.12617166340351105, 0.07061433047056198, -0.0003247229615226388, 0.032330241054296494, 0.05679159611463547, 0.024595245718955994, 0.07507160305976868, 0.13525184988975525, 0.0599241703748703, -0.2718023657798767, 0.0409366674721241, -0.011051993817090988, 0.04895666986703873, 0.10377079993486404, 0.1090722382068634, -0.10362177342176437, 0.12826164066791534, 0.10333941876888275, -0.19276323914527893, 0.03620864450931549, -0.006092226132750511, -0.05652408301830292, 0.09829295426607132, 0.034743744879961014, 0.13373933732509613, -0.04530801624059677, 0.07447955012321472, -0.09639862179756165, 0.03162052854895592, 0.09984429180622101, -0.006606715265661478, 0.047396909445524216, 0.06542228162288666, -0.04903169348835945, 0.15616507828235626, -0.032569125294685364, 0.05190706625580788, 0.06053696572780609, -0.11811932921409607, -0.11693494766950607, -0.045956313610076904, 0.14972564578056335, 0.062051255255937576, 0.07434626668691635, -0.006336771883070469, 0.07604557275772095, -0.0035140945110470057, 0.0636882409453392, 0.12149805575609207, -0.10336963087320328, -0.04838864505290985, 0.09601086378097534, 0.02540360577404499, -0.11239716410636902, -0.05539068952202797, 0.07029091566801071, 0.008508139289915562, 0.04153665155172348, 0.1519089639186859, -0.04272525757551193, 0.051190946251153946, -0.051599517464637756, -0.10061686486005783, -0.10141494870185852, 0.05050339177250862, -0.008858919143676758, -0.056252531707286835, -0.2045043408870697, -0.12134047597646713, 0.1455075442790985, -0.0217133779078722, -0.0006097044097259641, 0.023098371922969818, 0.002903448650613427, 0.068864606320858, -0.09859758615493774, -0.09094243496656418, -0.049173105508089066, -0.01587591879069805, 0.0845981016755104, 0.0062768300995230675, 0.07514819502830505, -0.06300894170999527, 0.09083940088748932, -0.15579909086227417, -0.11365832388401031, -0.061335042119026184, -0.14097052812576294, -0.06542989611625671, 0.014320643618702888, -0.04279974102973938, -0.14161355793476105, -0.03552919253706932, 0.1492459625005722, 0.013260471634566784, 0.035443469882011414, 0.01943870447576046, 0.034045200794935226, 0.07478508353233337, 0.16185595095157623, -0.06005279719829559, 0.05950351431965828, 0.02352605015039444, -0.0040916334837675095, 0.057302627712488174, -0.05564739182591438, -0.09434497356414795, -0.0030149922240525484, -0.0445304811000824, 0.03170423209667206, 0.015479683876037598, 0.07937897741794586, 0.0029122952837496996, -0.05797047168016434, 0.19970053434371948, -0.1283915638923645, 0.009575961157679558, 0.01167275384068489, -0.014640424400568008, -0.028513869270682335, 0.08897285163402557, -0.03923291712999344, -0.0893617495894432, 0.04283870384097099, -0.07818860560655594, 0.022736603394150734, -0.14400888979434967, -0.11070089787244797, 0.0037182793021202087, -0.14701540768146515, -0.0365278422832489, -0.12580472230911255, -0.19060522317886353, -0.013806973583996296, 0.05686567351222038, 0.001057509332895279, 0.05230548605322838, 0.05089106783270836, -0.028677714988589287, -0.01504965964704752, 0.031407542526721954, -0.021830128505825996, -0.003892526961863041, -0.008295977488160133, 0.0049916100688278675, 0.05515889450907707, -0.019476106390357018, 0.03267708048224449, -0.09700825810432434, 0.07259204983711243, -0.24982386827468872, 0.09780574589967728, -0.007392068859189749, -0.03818522393703461, -0.0223715640604496, -0.06564398854970932, -0.04239065200090408, 
0.06542519479990005, 0.016445094719529152, 0.1170259341597557, -0.1600436419248581, -0.04568890109658241, 0.24391314387321472, -0.1684434711933136, -0.04730946943163872, 0.07709099352359772, -0.06709825247526169, -0.016905255615711212, 0.05157613381743431, 0.15075816214084625, 0.11745206266641617, -0.11843587458133698, -0.0028913861606270075, -0.0066827950067818165, -0.11628605425357819, 0.017533041536808014, 0.03359503671526909, -0.04122085124254227, 0.10745853930711746, 0.03574861213564873, -0.06643172353506088, -0.03696776181459427, -0.0025156266056001186, -0.05204915255308151, -0.018866337835788727, 0.0026389663107693195, 0.05201531946659088, 0.009711865335702896, -0.054283175617456436, -0.02060469426214695, -0.10661433637142181, 0.062057241797447205, 0.04979250580072403, -0.089148610830307, -0.004632085561752319, -0.07245275378227234, 0.11012249439954758, -0.09621044993400574, 0.0004540104418992996, -0.21815721690654755, -0.06190047040581703, -0.005088211502879858, -0.0668330043554306, 0.05683428794145584, 0.03020355850458145, 0.03153340891003609, 0.05619380995631218, -0.010914993472397327, 0.012641634792089462, -0.08793486654758453, -0.018603095784783363, -0.02026333659887314, -0.03421242907643318, -0.11424321681261063, -0.06228574737906456, 0.09725768119096756, -0.10851425677537918, 0.06543020159006119, -0.009371474385261536, 0.09702818095684052, 0.003665345022454858, -0.06186200678348541, 0.04798095300793648, -0.008982334285974503, -0.02840781956911087, -0.07060352712869644, 0.06857403367757797, -0.007604529615491629, 0.042631346732378006, 0.0769774466753006, -0.20814171433448792, -0.10712113976478577, 0.09621275961399078, -0.0744025707244873, -0.06739416718482971, 0.041536595672369, -0.03375979885458946, 0.008450567722320557, -0.1014048159122467, -0.08547277003526688, 0.1882203221321106, 0.02302459627389908, 0.128005251288414, -0.10643535107374191, -0.00229251547716558, 0.054870400577783585, -0.030883129686117172, -0.033596787601709366, -0.019115159288048744, 0.09305533766746521, -0.11311399936676025, 0.0919610932469368, 0.05834699422121048, 0.0532817579805851, 0.15350669622421265, 0.03370494395494461, -0.043875157833099365, 0.007015663664788008, -0.05576018616557121, -0.017128892242908478, 0.15428978204727173, -0.09568427503108978, -0.06405306607484818, 0.06671836972236633, -0.03163716197013855, 0.053153347223997116, -0.12153509259223938, 0.05105774477124214, 0.0615120604634285, -0.014699318446218967, -0.03379075229167938, 0.0026840067002922297, 0.023408589884638786, 0.03421519324183464, 0.04822194203734398, -0.0050104702822864056, -0.002153507899492979, -0.03568379580974579, -0.11255177855491638, 0.1558285802602768, -0.06603871285915375, -0.38879626989364624, -0.17361456155776978, 0.04152744263410568, -0.1050219014286995, 0.04920556768774986, -0.0024694199673831463, 0.025985952466726303, -0.07716699689626694, -0.11045393347740173, -0.010200876742601395, -0.02536509744822979, -0.11010470986366272, -0.018385907635092735, -0.009707223623991013, -0.004439631476998329, -0.14601050317287445, -0.0357540100812912, -0.024791039526462555, -0.0040094731375575066, 0.038979724049568176, 0.02065443806350231, 0.10523548722267151, 0.14491085708141327, -0.0733332633972168, 0.020298007875680923, -0.04032818228006363, 0.16577674448490143, -0.055242378264665604, 0.09547393023967743, 0.25851956009864807, -0.07499396800994873, 0.09219732880592346, 0.08903776109218597, -0.005427589174360037, -0.051186416298151016, -0.016921238973736763, 0.028985360637307167, -0.12537863850593567, -0.09770600497722626, 
-0.08146372437477112, -0.07874038815498352, -0.014936711639165878, 0.07179750502109528, 0.02460528165102005, 0.08041918277740479, 0.07140420377254486, -0.05982963368296623, -0.06207926943898201, -0.0013449431862682104, 0.11877144873142242, 0.09329832345247269, -0.02652369625866413, 0.1001877710223198, -0.08017516881227493, -0.02806723117828369, 0.06133599579334259, -0.00861135683953762, 0.17190402746200562, 0.013768746517598629, 0.02811479941010475, 0.09658069908618927, -0.037616848945617676, 0.08974754065275192, 0.10175997018814087, -0.04104842618107796, 0.018638018518686295, -0.04330078139901161, -0.08521853387355804, -0.1031622514128685, 0.062394849956035614, 0.045032799243927, -0.021346837282180786, -0.053444407880306244, 0.003296326380223036, 0.05548112094402313, 0.18232208490371704, 0.07084426283836365, -0.24234001338481903, -0.057530999183654785, 0.004710546229034662, 0.012232908979058266, -0.12520238757133484, -0.023810170590877533, 0.12829482555389404, -0.1655060201883316, 0.09152337163686752, -0.05259750783443451, 0.1145656481385231, -0.01575523242354393, 0.01475544273853302, -0.06897164136171341, 0.13573502004146576, 0.010140445083379745, 0.09501121938228607, -0.08049233257770538, 0.10392250120639801, 0.009774742648005486, 0.07490067929029465, -0.11973541229963303, 0.06424326449632645, 0.07768917083740234, 0.09205469489097595, 0.1426527202129364, 0.01986009255051613, -0.03773173317313194, -0.010264281183481216, -0.03965887427330017, 0.008202518336474895, 0.06941931694746017, -0.03492392972111702, 0.05126599967479706, -0.00906167458742857, -0.02809540368616581, -0.012213445268571377, 0.08211997151374817, -0.05716170370578766, -0.14855992794036865, 0.02452755905687809, -0.081296406686306, -0.06770023703575134, -0.03999187424778938, 0.009906346909701824, -0.06243898347020149, 0.1842653751373291, -0.002716952934861183, -0.11895976960659027, -0.1415233016014099, -0.05798350274562836, 0.05366692319512367, -0.048668861389160156, 0.09728389978408813, -0.06396415084600449, 0.22591853141784668, -0.06955748051404953, -0.14083963632583618, 0.014628512784838676, -0.07575695961713791, -0.053170911967754364, -0.017406485974788666, 0.11935480684041977, -0.013742457143962383, -0.008320174179971218, 0.06542709469795227, 0.05791563540697098, -0.04683651402592659, -0.1037781685590744, 0.024352343752980232, 0.14272460341453552, 0.10306400060653687, 0.11342860758304596, -0.04780504107475281, -0.1683368682861328, -0.012524804100394249, 0.09415379166603088, 0.09316345304250717, 0.07934607565402985, -0.07844045758247375, 0.09020912647247314, 0.14480920135974884, -0.08625628799200058, -0.2806859016418457, 0.013627598993480206, -0.018893864005804062, 0.051945220679044724, 0.038092419505119324, -0.13129328191280365, 0.029286563396453857, 0.03471396118402481, -0.014807839877903461, 0.03649696335196495, -0.20192725956439972, -0.1060670018196106, 0.060155417770147324, 0.08362384885549545, -0.0372413769364357, -0.07697358727455139, -0.01950986124575138, -0.04945036396384239, -0.16681437194347382, 0.10624150186777115, -0.0746789500117302, 0.07939407974481583, 0.00024821318220347166, 0.015776731073856354, 0.028866959735751152, -0.07650459557771683, 0.09590426087379456, -0.03803256154060364, 0.09424266219139099, -0.047980036586523056, -0.027929382398724556, 0.2016855627298355, -0.08504790812730789, 0.1488667130470276, 0.018364064395427704, 0.05815736576914787, -0.11025885492563248, -0.05028624087572098, -0.08010000735521317, 0.11011675000190735, -0.029643304646015167, -0.08604281395673752, -0.05820297449827194, 
0.02147761732339859, 0.044371604919433594, -0.0019042636267840862, -0.005404118914157152, -0.12790706753730774, 0.03185587748885155, 0.17224089801311493, 0.1108255386352539, -0.048696890473365784, -0.14296047389507294, -0.06366610527038574, -0.002154519548639655, 0.05418403074145317, -0.15286390483379364, 0.05925777181982994, 0.05854608118534088, 0.05379306897521019, 0.08113209903240204, 0.045095428824424744, -0.08500048518180847, 0.015010218136012554, 0.047716815024614334, -0.1063503548502922, -0.15851864218711853, -0.025896456092596054, -0.08616422116756439, -0.0761769711971283, 0.018904412165284157, 0.09995955228805542, -0.05447923764586449, -0.012357843108475208, -0.009281414560973644, 0.0190596841275692, -0.01107296347618103, -0.003938067238777876, 0.06096328794956207, 0.04200075566768646, -0.0747217983007431, 0.086727574467659, 0.017603928223252296, -0.06220799684524536, 0.043590933084487915, 0.04440104961395264, -0.09669755399227142, -0.07509686797857285, -0.09171351790428162, 0.17669989168643951, -0.0537281297147274, -0.07486312836408615, -0.027741335332393646, -0.08649507164955139, 0.014292465522885323, 0.15950949490070343, 0.046498626470565796, 0.0022449649404734373, -0.0333859920501709, 0.011715995147824287, -0.11937835812568665, 0.050820425152778625, -0.03336292505264282, 0.0406256802380085, -0.08364038914442062, 0.12297296524047852, 0.05296460911631584, 0.0967872142791748, -0.06611628085374832, -0.06979241967201233, -0.05913858115673065, 0.009253369644284248, -0.13550734519958496, 0.03208313137292862, -0.09601296484470367, -0.02498880960047245, -0.0181521475315094, 0.06807487457990646, -0.019647028297185898, 0.05576011911034584, -0.05033273249864578, -0.03223840147256851, -0.01041794940829277, 0.006214533932507038, -0.15648818016052246, 0.00664420984685421, 0.0207542572170496, -0.07367219030857086, 0.07436986267566681, 0.008601327426731586, -0.07934674620628357, -0.02669718861579895, -0.06638239324092865, -0.0821017250418663, 0.025375783443450928, 0.03850836306810379, -0.0047751422971487045, -0.053112760186195374, 0.04288555681705475, 0.009901479817926884, -0.0316518098115921, -0.024554729461669922, 0.056092407554388046, -0.08862647414207458, 0.07960613816976547, 0.06000275909900665, -0.04787067696452141, -0.023951396346092224, 0.08426383882761002, 0.06747916340827942, 0.05539245903491974, 0.07556632161140442, -0.00396234355866909, 0.08843091130256653, -0.15609581768512726, -0.018154151737689972, -0.010187379084527493, -0.038907840847969055, 0.03849612921476364, -0.06101022660732269, 0.030254773795604706, -0.02442105859518051, 0.2025955319404602, 0.11453191190958023, 0.0707269161939621, 0.01891823671758175, 0.08124378323554993, -0.007113509811460972, 0.033094700425863266, 0.07148537784814835, -0.044145796447992325, -0.037149619311094284, 0.075769804418087, 0.03322089463472366, -0.0014214962720870972, 0.00625170674175024, 0.159418985247612, 0.12560439109802246, 0.09199480712413788, 0.06048181280493736, 0.10598082095384598, 0.026904460042715073, -0.17040888965129852, -0.1310160905122757, 0.02979489415884018, 0.0766022801399231, -0.057954296469688416, -0.047823838889598846, 0.15187443792819977, -0.10762026906013489, 0.1527893990278244, 0.011956347152590752, -0.053396619856357574, -0.05780977010726929, -0.17921757698059082, -0.02255706675350666, 0.03359276428818703, 0.005754118785262108, -0.06744589656591415, 0.011235835961997509, 0.1691228449344635, -0.030664822086691856, -0.08125174045562744, 0.1830553114414215, -0.15234966576099396, -0.07193689048290253, 0.05629324913024902, 
0.03573394566774368, 0.04593122377991676, 0.027104713022708893, 0.042142681777477264, 0.07037227600812912, 0.04996604844927788, 0.0802871510386467, 0.02564346417784691, 0.10238447040319443, 0.02201988734304905, -0.06992121785879135, -0.05337556079030037, 0.0017750851111486554, 0.02010362222790718, -0.01009237952530384, 0.06219708174467087, 0.013489114120602608, -0.04752914234995842, -0.013957344926893711, 0.09945832937955856, -0.06351980566978455, -0.114084891974926, -0.15059301257133484, 0.26169952750205994, -0.032156214118003845, 0.0003934494452551007, 0.0023082513362169266, -0.097613126039505, 0.004856356885284185, 0.21742825210094452, 0.2146786004304886, -0.03566785529255867, 0.0241796113550663, -0.026270680129528046, 0.010617234744131565, -0.0003745071589946747, 0.13296368718147278, -0.021762358024716377, 0.2607443630695343, -0.02814677357673645, 0.0868564024567604, -0.00421726331114769, -0.06604520231485367, -0.10324421525001526, 0.0746588334441185, -0.004103816580027342, -0.0517888069152832, -0.03144729137420654, 0.03926481679081917, -0.04461473226547241, -0.12142893671989441, 0.0996137484908104, -0.06783511489629745, -0.05233541131019592, 0.02504383958876133, -0.028353337198495865, 0.007782170549035072, 0.06441431492567062, -0.017389891669154167, 0.009130459278821945, 0.17927195131778717, 0.0151786208152771, -0.07912427932024002, -0.030709346756339073, 0.038492366671562195, -0.14473053812980652, 0.19669903814792633, -0.009193796664476395, 0.02820487506687641, 0.09586979448795319, 0.04218775033950806, -0.08116549253463745, 0.02188078500330448, -0.0011279380414634943, -0.005677227396517992, 0.009618083946406841, 0.09899406880140305, 0.0023428474087268114, -0.10016264021396637, -0.010950242169201374, -0.07849781960248947, 0.02463872730731964, 0.050306711345911026, 0.006216227076947689, -0.032946281135082245, 0.0584297701716423, -0.10538224875926971, 0.12937547266483307, 0.10070939362049103, 0.03648868948221207, -0.03477524220943451, -0.05523492768406868, 0.036732904613018036, 0.029837463051080704, 0.013651790097355843, -0.051067814230918884, -0.09700050204992294, -0.010452117770910263, -0.012736091390252113, -0.013962303288280964, -0.2672266960144043, -0.05856822058558464, -0.003521543461829424, -0.01784536801278591, 0.015180845744907856, 0.025753574445843697, 0.07959014177322388, 0.05195857584476471, -0.026542603969573975, -0.00014487240696325898, 0.004414010792970657, 0.12305963039398193, -0.08952309936285019, -0.06156925484538078 ]
null
null
transformers
# MaskFormer

MaskFormer model trained on COCO panoptic segmentation (base-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169).

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

# load MaskFormer fine-tuned on COCO panoptic segmentation
feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-base-coco")
model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-base-coco")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = feature_extractor(images=image, return_tensors="pt")

outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to feature_extractor for postprocessing
result = feature_extractor.post_process_panoptic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_panoptic_map = result["segmentation"]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
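As a hedged follow-up to the snippet above, the panoptic result can be inspected segment by segment. This assumes each entry of `result["segments_info"]` carries `id`, `label_id`, and `score` keys (as in recent transformers releases) and reuses `result` and `model` from the snippet:

```python
# one entry per detected "thing" or "stuff" segment (keys assumed: id, label_id, score)
for segment in result["segments_info"]:
    label = model.config.id2label[segment["label_id"]]
    print(f"segment {segment['id']}: {label} (score {segment['score']:.3f})")
```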
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["coco"], "widget": [{"src": "http://images.cocodataset.org/val2017/000000039769.jpg", "example_title": "Cats"}, {"src": "http://images.cocodataset.org/val2017/000000039770.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-base-coco
[ "transformers", "pytorch", "maskformer", "vision", "image-segmentation", "dataset:coco", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on COCO panoptic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for panoptic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 56, 95, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (base-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.0439973883330822, 0.03898816928267479, -0.0032883817330002785, 0.03165317326784134, 0.11375166475772858, -0.002510255668312311, 0.12730535864830017, 0.0679897889494896, -0.006998567376285791, 0.02701156958937645, 0.04059535637497902, 0.0278104767203331, 0.07831700891256332, 0.11846950650215149, 0.040191616863012314, -0.24876414239406586, 0.034418124705553055, -0.013806398026645184, 0.04843435063958168, 0.0884944275021553, 0.10328526049852371, -0.09790325909852982, 0.1260555237531662, 0.09231376647949219, -0.20384855568408966, 0.02473928965628147, 0.023210417479276657, -0.035372957587242126, 0.10167734324932098, 0.046964578330516815, 0.15621978044509888, -0.04386107996106148, 0.04291943833231926, -0.12245795875787735, 0.03330070152878761, 0.09091182053089142, -0.014925799332559109, 0.04002771154046059, 0.07706823945045471, -0.050202276557683945, 0.1368357390165329, -0.024220775812864304, 0.03338569775223732, 0.06753398478031158, -0.11413077265024185, -0.08973746746778488, -0.06476495414972305, 0.16372500360012054, 0.06217862293124199, 0.06656935811042786, -0.005028776824474335, 0.08703460544347763, -0.039901699870824814, 0.07555817067623138, 0.11506061255931854, -0.12644566595554352, -0.05963616818189621, 0.1422429382801056, 0.03055759146809578, -0.10351253300905228, -0.06211351230740547, 0.09331557154655457, 0.0078884307295084, 0.04258941859006882, 0.14495539665222168, -0.07458894699811935, -0.013416638597846031, -0.04127116873860359, -0.0989597961306572, -0.10706161707639694, 0.060831084847450256, -0.030342241749167442, -0.06522984057664871, -0.19442987442016602, -0.10364170372486115, 0.15954448282718658, -0.02776668407022953, 0.0008412451134063303, 0.020479977130889893, 0.020164573565125465, 0.06970509141683578, -0.07928478717803955, -0.10138244181871414, -0.06131058931350708, -0.020177366212010384, 0.09515132009983063, -0.006720643024891615, 0.060325607657432556, -0.04643000289797783, 0.08194718509912491, -0.11343646794557571, -0.09941626340150833, -0.05464862659573555, -0.11999049782752991, -0.08775226026773453, -0.0035873595625162125, -0.06882436573505402, -0.11598828434944153, 0.002636301564052701, 0.17601053416728973, 0.02211688831448555, 0.030565666034817696, -0.01274636760354042, 0.04531053453683853, 0.055726777762174606, 0.16427361965179443, -0.0292738638818264, 0.07938553392887115, 0.03520363196730614, -0.04967910051345825, 0.06230197474360466, -0.06521989405155182, -0.1043667197227478, 0.009919555857777596, -0.035590820014476776, 0.051544301211833954, 0.041682664304971695, 0.06515777856111526, -0.02246086299419403, -0.055722612887620926, 0.18810249865055084, -0.115115687251091, 0.008930939249694347, 0.023369664326310158, 0.007724732160568237, -0.04326114431023598, 0.06109572947025299, -0.027718907222151756, -0.08475635200738907, 0.04293452575802803, -0.08036083728075027, 0.03813475742936134, -0.1272323578596115, -0.10820475965738297, 0.022979311645030975, -0.15768678486347198, -0.0477091483771801, -0.13731834292411804, -0.172148659825325, -0.030412426218390465, 0.04983820021152496, -0.01300746388733387, 0.05880209058523178, 0.05696072801947594, -0.0344969667494297, -0.01889197900891304, 0.0279898289591074, 0.010261588729918003, -0.02407556213438511, -0.014615913853049278, -0.015845147892832756, 0.04254089668393135, -0.025668131187558174, 0.014220837503671646, -0.08463919907808304, 0.07268597930669785, -0.21168844401836395, 0.07990401983261108, -0.02928793989121914, -0.002832963829860091, -0.01533121895045042, -0.030784152448177338, -0.04520200937986374, 0.0652032420039177, 
-0.014912002719938755, 0.11212776601314545, -0.15784111618995667, -0.046028055250644684, 0.2321563959121704, -0.16888059675693512, -0.062231987714767456, 0.09786088764667511, -0.04451584815979004, -0.0337289422750473, 0.04653298109769821, 0.1927236169576645, 0.11227019876241684, -0.1276385635137558, -0.008966123685240746, 0.006456793285906315, -0.12444034218788147, 0.004267311654984951, 0.04595385119318962, -0.03643520548939705, 0.09808266162872314, 0.028799664229154587, -0.05231984332203865, -0.038923729211091995, 0.008962357416749, -0.04876437783241272, -0.017175186425447464, 0.014650565572082996, -0.0008363237720914185, 0.013664421625435352, -0.03663630038499832, -0.012246118858456612, -0.06959383934736252, 0.009869790636003017, 0.036216143518686295, -0.07873384654521942, 0.01915777288377285, -0.06815487891435623, 0.13770709931850433, -0.09170801937580109, 0.004830610938370228, -0.23422135412693024, -0.08812599629163742, 0.013758691027760506, -0.0182269848883152, 0.04894455522298813, 0.06524129211902618, 0.009085515514016151, 0.030689643695950508, -0.01608676090836525, 0.03667626529932022, -0.06042609363794327, -0.02597808837890625, -0.018498975783586502, -0.047404542565345764, -0.1058577224612236, -0.0724240243434906, 0.10810360312461853, -0.11156123131513596, 0.05778177082538605, 0.056797366589307785, 0.09933385252952576, 0.02591206692159176, -0.05385361239314079, 0.04675774276256561, 0.02320738323032856, -0.04650005325675011, -0.0589686781167984, 0.06923060119152069, -0.002705162391066551, 0.021574929356575012, 0.07823453098535538, -0.1747387945652008, -0.0777159035205841, 0.08757596462965012, -0.08481617271900177, -0.06147545576095581, -0.01083303987979889, -0.033586710691452026, 0.0015104776248335838, -0.0982825830578804, -0.05602840706706047, 0.16673071682453156, 0.015589730814099312, 0.1095660924911499, -0.09072048217058182, 0.003263661405071616, 0.03613385185599327, -0.03478052094578743, -0.026722276583313942, -0.01506089698523283, 0.04305936023592949, -0.10322176665067673, 0.09832244366407394, 0.019445132464170456, 0.04119674116373062, 0.13133768737316132, 0.021797800436615944, -0.03480726107954979, 0.01678256131708622, -0.034830622375011444, 0.020245540887117386, 0.1599894016981125, -0.07050516456365585, -0.0723814144730568, 0.06736885756254196, -0.022573668509721756, 0.04710850119590759, -0.12724734842777252, 0.048049889504909515, 0.04006313532590866, -0.016382478177547455, -0.024694425985217094, -0.001569682965055108, 0.0036630495451390743, 0.02792842872440815, 0.06440354883670807, 0.01090259850025177, -0.012489019893109798, -0.04539322480559349, -0.11952166259288788, 0.1679808795452118, -0.05144816264510155, -0.40700286626815796, -0.1774337738752365, -0.004151646047830582, -0.10931088030338287, 0.06832632422447205, 0.006774047389626503, 0.0518568754196167, -0.06880266964435577, -0.10734306275844574, -0.019752630963921547, 0.00871371477842331, -0.09183736890554428, -0.05729479715228081, -0.0006296957144513726, -0.021087490022182465, -0.13290557265281677, -0.03076431155204773, -0.03964245691895485, 0.019257761538028717, 0.04499058797955513, -0.013365307822823524, 0.12349479645490646, 0.15044517815113068, -0.028721123933792114, 0.012638228014111519, -0.056401196867227554, 0.1348937451839447, -0.06017286330461502, 0.07799635827541351, 0.2233777791261673, -0.0669466182589531, 0.07545320689678192, 0.10558352619409561, -0.0021449581254273653, -0.05760492384433746, -0.018143748864531517, 0.02550826035439968, -0.11643221974372864, -0.1040089949965477, -0.10279091447591782, 
-0.10143600404262543, 0.011832509189844131, 0.04877668246626854, 0.04376642405986786, 0.09467650204896927, 0.064254529774189, -0.05069015547633171, -0.03857583552598953, 0.0005137057160027325, 0.13206300139427185, 0.09807950258255005, -0.02571389637887478, 0.09749534726142883, -0.07849230617284775, -0.011507997289299965, 0.07039938867092133, 0.01703675091266632, 0.15842671692371368, 0.023698702454566956, 0.062122344970703125, 0.10150287300348282, -0.03532556816935539, 0.08507472276687622, 0.0858413502573967, -0.030440669506788254, -0.0013207135489210486, -0.03579229488968849, -0.0846787616610527, -0.11304119229316711, 0.06322740018367767, 0.05623859539628029, -0.038806457072496414, -0.04495319724082947, -0.032864760607481, 0.06939752399921417, 0.208349347114563, 0.06334995478391647, -0.22149652242660522, -0.05507166311144829, 0.02244037203490734, 0.018027421087026596, -0.10494991391897202, -0.017708227038383484, 0.13726310431957245, -0.16326865553855896, 0.07018616050481796, -0.05118951201438904, 0.12554582953453064, -0.01658092811703682, 0.00973975658416748, -0.10551253706216812, 0.08992493897676468, 0.009738821536302567, 0.09583836048841476, -0.08994146436452866, 0.10253237932920456, 0.021537533029913902, 0.0610358752310276, -0.11852754652500153, 0.061160579323768616, 0.06982716917991638, 0.12363055348396301, 0.13527649641036987, 0.01917487010359764, -0.06199989467859268, -0.025048144161701202, -0.03275584429502487, 0.006817273795604706, 0.07435372471809387, -0.047504592686891556, 0.044355444610118866, -0.008008783683180809, -0.01800740323960781, -0.021310633048415184, 0.04667157679796219, -0.04691309854388237, -0.13498999178409576, 0.031611792743206024, -0.0633901059627533, -0.0586412250995636, -0.04032628983259201, 0.013309893198311329, -0.02485009841620922, 0.21459665894508362, -0.027480319142341614, -0.08510007709264755, -0.1501488983631134, -0.07222750782966614, 0.025876136496663094, -0.04212522134184837, 0.08943722397089005, -0.059413693845272064, 0.2258450984954834, -0.07152733951807022, -0.11366691440343857, 0.02995169907808304, -0.08985717594623566, -0.06311753392219543, -0.021658601239323616, 0.12351840734481812, -0.026970382779836655, 0.002645385218784213, 0.07955992221832275, 0.06366390734910965, -0.05526704713702202, -0.10184962302446365, 0.016546182334423065, 0.12707717716693878, 0.13384507596492767, 0.1383100152015686, -0.03982870653271675, -0.16420643031597137, -0.02322249487042427, 0.08249957859516144, 0.07891307771205902, 0.0991315096616745, -0.06361515820026398, 0.10691159963607788, 0.13282015919685364, -0.09010054916143417, -0.29436352849006653, 0.0009113195701502264, -0.06236151605844498, 0.043322257697582245, 0.022502431645989418, -0.10501369833946228, 0.04059452936053276, 0.03508259356021881, -0.030942371115088463, 0.05966305360198021, -0.1844308078289032, -0.10664476454257965, 0.04572143033146858, 0.09405265003442764, -0.03686501085758209, -0.10741101205348969, -0.037358324974775314, -0.06310836970806122, -0.15832526981830597, 0.07554331421852112, -0.09553726017475128, 0.07729007303714752, -0.01152785588055849, -0.03912819176912308, 0.013327449560165405, -0.08041282743215561, 0.10635734349489212, -0.020004568621516228, 0.09531430900096893, -0.04677464812994003, -0.01952560432255268, 0.19333136081695557, -0.07401329278945923, 0.10172027349472046, 0.018730727955698967, 0.05156385898590088, -0.10690154880285263, -0.04395976662635803, -0.0686081051826477, 0.12537100911140442, -0.029828058555722237, -0.08338341861963272, -0.08653570711612701, 0.02737962268292904, 
0.02971460483968258, 0.011224036104977131, 0.03663438931107521, -0.14903804659843445, 0.05789835378527641, 0.19617636501789093, 0.10581925511360168, -0.0803685262799263, -0.1114811822772026, -0.061861779540777206, -0.019056355580687523, 0.051086995750665665, -0.12123676389455795, 0.06355775147676468, 0.05701715499162674, 0.0506591796875, 0.0784602090716362, 0.05661788955330849, -0.07565395534038544, 0.02005627565085888, 0.06469257175922394, -0.10846203565597534, -0.1255614012479782, -0.020271772518754005, -0.12311091274023056, -0.061943501234054565, 0.018696697428822517, 0.09479697048664093, -0.03434035927057266, -0.010553267784416676, 0.004228650126606226, 0.022682415321469307, -0.01587141677737236, -0.0022616067435592413, 0.054583534598350525, 0.048367083072662354, -0.0733923465013504, 0.09187927842140198, 0.018864773213863373, -0.07919743657112122, 0.011218901723623276, 0.029780354350805283, -0.11775118112564087, -0.0903773382306099, -0.08146701753139496, 0.15708725154399872, -0.04138105735182762, -0.07839950919151306, -0.04737841337919235, -0.1084919422864914, 0.00841042585670948, 0.1825455278158188, 0.07156606763601303, 0.035520244389772415, -0.02965005300939083, 0.003841063240543008, -0.11825249344110489, 0.042027611285448074, -0.0344666987657547, 0.04824037104845047, -0.1013258621096611, 0.10645215958356857, 0.05439331382513046, 0.08160213381052017, -0.07024428993463516, -0.070085808634758, -0.09428700059652328, 0.007956953719258308, -0.12401825189590454, 0.056314773857593536, -0.1078244149684906, -0.018024658784270287, -0.017029644921422005, 0.06278647482395172, -0.024623090401291847, 0.04848930239677429, -0.05222407355904579, -0.01849752850830555, -0.020212996751070023, 0.02540201134979725, -0.15999871492385864, -0.0011706507066264749, 0.03459193557500839, -0.0738057866692543, 0.07939988374710083, 0.019575240090489388, -0.068024180829525, -0.010680584236979485, -0.06611381471157074, -0.07082308828830719, 0.03438589349389076, 0.03436049446463585, -0.008650711737573147, -0.08279409259557724, 0.021812276914715767, 0.021081190556287766, -0.01908973604440689, -0.0015450220089405775, 0.062439896166324615, -0.08948604762554169, 0.08191204816102982, 0.027242230251431465, -0.07205245643854141, -0.033047839999198914, 0.07235118746757507, 0.09001844376325607, 0.06489335745573044, 0.062341369688510895, -0.012201515026390553, 0.08659694343805313, -0.12659747898578644, -0.013482335023581982, -0.006113467738032341, -0.048764489591121674, 0.01798962615430355, -0.07511851191520691, 0.021428337320685387, -0.015414879657328129, 0.2160031944513321, 0.10372982174158096, 0.07367560267448425, 0.01670895144343376, 0.06109187379479408, 0.03420468047261238, 0.03829517215490341, 0.09417619556188583, -0.02837643399834633, -0.04218728840351105, 0.08980861306190491, 0.03184514120221138, 0.008704018779098988, 0.034533191472291946, 0.17798711359500885, 0.13408444821834564, 0.07971340417861938, 0.07254653424024582, 0.07705846428871155, 0.021701209247112274, -0.1532505303621292, -0.13899457454681396, 0.02077433094382286, 0.06018654257059097, -0.05963777005672455, -0.07030899077653885, 0.16902996599674225, -0.12511290609836578, 0.12381813675165176, 0.02055431343615055, -0.05539867654442787, -0.05720784142613411, -0.17088399827480316, -0.02296534739434719, 0.04057491198182106, -0.009128686971962452, -0.07369203120470047, -0.0053134700283408165, 0.2105335146188736, -0.03216218575835228, -0.051195453852415085, 0.18739692866802216, -0.1858799159526825, -0.0478501096367836, 0.03982885554432869, 0.023146621882915497, 
0.03086130879819393, 0.01936977356672287, 0.034898560494184494, 0.05070023238658905, 0.019716592505574226, 0.0683598741889, 0.03691818565130234, 0.1309455931186676, 0.015991536900401115, -0.08462345600128174, -0.04981553554534912, 0.003755490528419614, 0.022824781015515327, 0.00014963735884521157, 0.08418912440538406, 0.030183160677552223, -0.052279409021139145, 0.005521805491298437, 0.14039739966392517, -0.07339397072792053, -0.0966828390955925, -0.1442912369966507, 0.2858809530735016, -0.04133743792772293, -0.007232782430946827, -0.010751609690487385, -0.07940653711557388, 0.010683899745345116, 0.24658246338367462, 0.19609181582927704, -0.019371777772903442, 0.028744209557771683, -0.006557217799127102, 0.009349470026791096, 0.0001937320048455149, 0.14166341722011566, -0.028130613267421722, 0.26947495341300964, -0.021039074286818504, 0.048655614256858826, 0.007233249023556709, -0.07598353177309036, -0.12940526008605957, 0.0533740408718586, -0.00525488518178463, -0.06077655032277107, -0.04783708229660988, 0.05677439272403717, -0.05879516899585724, -0.09607259929180145, 0.1390269249677658, -0.0525885671377182, -0.04272963106632233, 0.028197549283504486, -0.03460700809955597, -0.012153133749961853, 0.08099689334630966, -0.0313168503344059, 0.00523865781724453, 0.16383984684944153, 0.015935318544507027, -0.08296552300453186, -0.018832311034202576, 0.044365495443344116, -0.12121476233005524, 0.20113211870193481, -0.03179849684238434, 0.07388825714588165, 0.10666508227586746, 0.046765901148319244, -0.08389059454202652, 0.009871106594800949, 0.00706241512671113, -0.049085184931755066, 0.02126125805079937, 0.09779118001461029, -0.001960274064913392, -0.09165877848863602, 0.01083152275532484, -0.08118954300880432, 0.03681684657931328, 0.053267110139131546, 0.03603819012641907, -0.03329160064458847, 0.024319522082805634, -0.12345470488071442, 0.12126477062702179, 0.08346614986658096, 0.04065553843975067, -0.0227948147803545, -0.04467199370265007, 0.03606333211064339, 0.03608092665672302, 0.012830561026930809, -0.06574855744838715, -0.09222269803285599, -0.0036758643109351397, -0.012708996422588825, -0.04522504657506943, -0.26996910572052, -0.06156427413225174, -0.010268105193972588, -0.01095279399305582, 0.020477015525102615, 0.024755258113145828, 0.0654582530260086, 0.04018394276499748, -0.036326006054878235, -0.07623186707496643, 0.012720605358481407, 0.11647869646549225, -0.07629173994064331, -0.07546672970056534 ]
null
null
transformers
# MaskFormer

MaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). 

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerImageProcessor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

url = "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg"
image = Image.open(requests.get(url, stream=True).raw)

processor = MaskFormerImageProcessor.from_pretrained("facebook/maskformer-swin-large-ade")
inputs = processor(images=image, return_tensors="pt")

model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-large-ade")
outputs = model(**inputs)

# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to processor for postprocessing
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_semantic_map = processor.post_process_semantic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
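As an additional, hedged example (not part of the original model card): the minimal sketch below shows one way to visualize the semantic map produced by the snippet above. It assumes `matplotlib` is installed and reuses the `predicted_semantic_map` and `model` objects defined there.

```python
import matplotlib.pyplot as plt  # assumption: matplotlib is installed
import torch

# post_process_semantic_segmentation returns a (height, width) tensor of class ids
segmentation = predicted_semantic_map.cpu().numpy()

plt.imshow(segmentation)
plt.axis("off")
plt.title("Predicted ADE20k semantic map (class ids)")
plt.show()

# map the predicted class ids back to human-readable ADE20k labels
ids = torch.unique(predicted_semantic_map).tolist()
print([model.config.id2label[i] for i in ids[:5]])
```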
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["scene_parse_150"], "widget": [{"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg", "example_title": "House"}, {"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-large-ade
[ "transformers", "pytorch", "maskformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer

MaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. 

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

!model image

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 61, 98, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.059995029121637344, 0.10083241760730743, -0.003985382150858641, 0.03532823920249939, 0.14398141205310822, 0.0010444503277540207, 0.1301293969154358, 0.07383040338754654, 0.005133199505507946, 0.037718046456575394, 0.057635270059108734, 0.0277164988219738, 0.07791205495595932, 0.13277512788772583, 0.06046527624130249, -0.2739391326904297, 0.040782585740089417, -0.0214222501963377, 0.046948764473199844, 0.09872173517942429, 0.1111176535487175, -0.10674985498189926, 0.1259736269712448, 0.10185500979423523, -0.19446662068367004, 0.032189421355724335, -0.015196003019809723, -0.05870012566447258, 0.09833545982837677, 0.03202608600258827, 0.12571021914482117, -0.04539942368865013, 0.07145532965660095, -0.09847652912139893, 0.03170657902956009, 0.09945648908615112, -0.0009244289831258357, 0.04526263847947121, 0.06523124128580093, -0.03384861350059509, 0.15913942456245422, -0.03829311579465866, 0.04462591931223869, 0.059492357075214386, -0.12250439822673798, -0.10615357756614685, -0.05250521004199982, 0.15554875135421753, 0.05763981118798256, 0.06973038613796234, -0.0033156108111143112, 0.07943255454301834, -0.008527304045855999, 0.06061669439077377, 0.12245744466781616, -0.10035580396652222, -0.04717914015054703, 0.0952090471982956, 0.03213522210717201, -0.11284546554088593, -0.055537477135658264, 0.06979405134916306, 0.0037898134905844927, 0.038513895124197006, 0.14848917722702026, -0.03628695011138916, 0.0581892728805542, -0.048815298825502396, -0.10491127520799637, -0.101514533162117, 0.03640180081129074, -0.004406508523970842, -0.061195965856313705, -0.21401506662368774, -0.12106382846832275, 0.15218278765678406, -0.019308507442474365, -0.003897482994943857, 0.023762186989188194, 0.0020479655358940363, 0.0743267685174942, -0.10682053864002228, -0.08949843049049377, -0.043977200984954834, -0.011325652711093426, 0.08418166637420654, 0.0037183030508458614, 0.07948026806116104, -0.06199750304222107, 0.08569662272930145, -0.1580597311258316, -0.11257077753543854, -0.05762924626469612, -0.1456993967294693, -0.06369499117136002, 0.011170427314937115, -0.032835453748703, -0.14809471368789673, -0.03062623180449009, 0.13391852378845215, 0.01540381833910942, 0.029462570324540138, 0.029638366773724556, 0.034573864191770554, 0.08015139400959015, 0.16431236267089844, -0.06410084664821625, 0.058967314660549164, 0.025582205504179, 0.007660632487386465, 0.05475964769721031, -0.05593365803360939, -0.09344404190778732, -0.0006651445291936398, -0.049967192113399506, 0.035069189965724945, 0.016760408878326416, 0.07258852571249008, 0.005393423605710268, -0.061200231313705444, 0.19744046032428741, -0.12497899681329727, 0.016815856099128723, 0.016445117071270943, -0.01852378062903881, -0.028232283890247345, 0.07757527381181717, -0.0425548329949379, -0.09098269790410995, 0.03929176926612854, -0.07630264759063721, 0.0207932498306036, -0.14889311790466309, -0.11392152309417725, 0.009000621736049652, -0.15662789344787598, -0.041683170944452286, -0.12228980660438538, -0.18703413009643555, -0.00842554122209549, 0.05273279920220375, 0.003449419280514121, 0.04552438482642174, 0.052578043192625046, -0.02700081281363964, -0.01845082826912403, 0.034601811319589615, -0.022542815655469894, -0.0005151567165739834, -0.008052343502640724, 0.0058144573122262955, 0.061844803392887115, -0.018909530714154243, 0.029800234362483025, -0.0920354351401329, 0.07414325326681137, -0.24405229091644287, 0.0982695072889328, -0.006318818777799606, -0.031788419932127, -0.02040994167327881, -0.062233611941337585, -0.03564048931002617, 
0.06171607971191406, 0.019418789073824883, 0.11897291243076324, -0.16919782757759094, -0.04166053235530853, 0.23702633380889893, -0.16465161740779877, -0.05124180018901825, 0.08148103952407837, -0.06770047545433044, -0.016906438395380974, 0.05441577732563019, 0.15706917643547058, 0.11592748761177063, -0.11282456666231155, -0.009847935289144516, -0.00787952821701765, -0.12104000151157379, 0.012208144180476665, 0.033105120062828064, -0.039120640605688095, 0.11507008969783783, 0.033951286226511, -0.05996476486325264, -0.03150713071227074, -0.008250693790614605, -0.05308819189667702, -0.01417283620685339, -0.0006630526040680707, 0.04362077638506889, 0.00946206133812666, -0.05692441761493683, -0.02528892084956169, -0.10163696855306625, 0.05534043163061142, 0.04846333712339401, -0.08457339555025101, -0.010594556108117104, -0.06740771234035492, 0.12258365005254745, -0.09583412855863571, 0.004094913136214018, -0.2119758427143097, -0.060019541531801224, 0.0025125539395958185, -0.07342220842838287, 0.05882273241877556, 0.01832161657512188, 0.03033040650188923, 0.061811432242393494, -0.0028139189817011356, 0.01317588146775961, -0.08375684171915054, -0.021934067830443382, -0.020650632679462433, -0.03090430051088333, -0.11894311755895615, -0.06471904367208481, 0.09471248090267181, -0.10893277078866959, 0.06416982412338257, -0.015222541056573391, 0.09391387552022934, 0.005172200500965118, -0.06187019124627113, 0.05199655517935753, -0.01408769004046917, -0.026363534852862358, -0.07300993800163269, 0.06841009110212326, -0.007733670063316822, 0.0426422543823719, 0.07903795689344406, -0.2099360078573227, -0.11483578383922577, 0.09088493138551712, -0.07545943558216095, -0.06002358719706535, 0.039025742560625076, -0.03433479741215706, 0.010429957881569862, -0.10279224812984467, -0.08183697611093521, 0.17710015177726746, 0.02420160174369812, 0.1285339742898941, -0.10832460224628448, 0.00046541172196157277, 0.0536949560046196, -0.029751427471637726, -0.03463605418801308, -0.017587922513484955, 0.09636236727237701, -0.12152739614248276, 0.09015950560569763, 0.05888323858380318, 0.06491828709840775, 0.16723912954330444, 0.029605839401483536, -0.0447816476225853, 0.002721957629546523, -0.04913035407662392, -0.016985027119517326, 0.15155814588069916, -0.09324396401643753, -0.061362944543361664, 0.06600078195333481, -0.035543542355298996, 0.05125616863369942, -0.12313798069953918, 0.052333008497953415, 0.06582287698984146, -0.0156038673594594, -0.03097703494131565, 0.0026623476296663284, 0.0244157612323761, 0.03410293906927109, 0.05232709273695946, -0.012098027393221855, -0.0050994278863072395, -0.03666801005601883, -0.10628760606050491, 0.15232013165950775, -0.06072282791137695, -0.3900723457336426, -0.17298762500286102, 0.0356164276599884, -0.10437367111444473, 0.05062481388449669, -0.003043568693101406, 0.028899846598505974, -0.07401411235332489, -0.11167702823877335, -0.0045331185683608055, -0.017637647688388824, -0.11215846240520477, -0.014808024279773235, -0.008816276676952839, -0.008728998713195324, -0.1466733068227768, -0.03777533769607544, -0.027062760666012764, -0.0001816008152673021, 0.0349857322871685, 0.03420647606253624, 0.1103995218873024, 0.13663744926452637, -0.0765489861369133, 0.01948431506752968, -0.036143526434898376, 0.17120090126991272, -0.055252112448215485, 0.0991569235920906, 0.25642848014831543, -0.0785951316356659, 0.09698212891817093, 0.08431146293878555, -0.003714129561558366, -0.046495817601680756, -0.019177351146936417, 0.027302633970975876, -0.12322330474853516, -0.09858924150466919, 
-0.077253058552742, -0.07781650871038437, -0.010981375351548195, 0.06604670733213425, 0.024086665362119675, 0.08099447935819626, 0.06925004720687866, -0.05405028536915779, -0.06193913519382477, -0.007789131719619036, 0.11791098862886429, 0.08568122237920761, -0.026090258732438087, 0.09604006260633469, -0.08498790115118027, -0.03402283415198326, 0.0606599897146225, -0.0184109378606081, 0.18323911726474762, 0.021252624690532684, 0.04023302346467972, 0.0962931215763092, -0.04921337217092514, 0.08799975365400314, 0.10608579963445663, -0.04369786009192467, 0.01614861749112606, -0.04720268025994301, -0.0875111073255539, -0.09953845292329788, 0.05919948220252991, 0.05471908673644066, -0.014870300889015198, -0.06052538752555847, 0.019602151587605476, 0.06061764433979988, 0.19178739190101624, 0.07123497128486633, -0.23352420330047607, -0.053881216794252396, 0.0058802589774131775, 0.006671535782516003, -0.12339773774147034, -0.026934679597616196, 0.1351967751979828, -0.1625780165195465, 0.07925082743167877, -0.04855446517467499, 0.11441072076559067, -0.021194182336330414, 0.018399769440293312, -0.06184706091880798, 0.1425156593322754, 0.008555700071156025, 0.09353049099445343, -0.07139299064874649, 0.09739390015602112, 0.011980588547885418, 0.07580620050430298, -0.12396233528852463, 0.06672947108745575, 0.07939399033784866, 0.07536011189222336, 0.1362544745206833, 0.022005731239914894, -0.03148438781499863, -0.005513813812285662, -0.04048353433609009, 0.0028650755994021893, 0.07123922556638718, -0.029171640053391457, 0.051031529903411865, -0.006883083377033472, -0.027005374431610107, -0.014941136352717876, 0.09221760183572769, -0.055849939584732056, -0.14501900970935822, 0.023702671751379967, -0.08014509826898575, -0.08032318949699402, -0.041501134634017944, 0.012978934682905674, -0.0718483254313469, 0.18476954102516174, 0.013896572403609753, -0.11553122103214264, -0.1377556473016739, -0.05034066364169121, 0.052781276404857635, -0.048970501869916916, 0.09375375509262085, -0.06821000576019287, 0.21976277232170105, -0.07335416972637177, -0.13716930150985718, 0.012328187935054302, -0.07368136942386627, -0.0494820699095726, -0.017345724627375603, 0.11866103112697601, -0.01743418537080288, -0.010152971372008324, 0.06382101774215698, 0.054692819714546204, -0.045354925096035004, -0.10395048558712006, 0.026183323934674263, 0.13925525546073914, 0.08953937143087387, 0.1218685433268547, -0.04947539418935776, -0.16511321067810059, -0.011656145565211773, 0.09170253574848175, 0.11187586933374405, 0.10070913285017014, -0.07820433378219604, 0.08700855821371078, 0.14166127145290375, -0.08572852611541748, -0.2788311243057251, 0.011331028304994106, -0.022642333060503006, 0.05655936151742935, 0.03990919515490532, -0.12771034240722656, 0.023273929953575134, 0.041133005172014236, -0.010990379378199577, 0.04603913053870201, -0.20320701599121094, -0.09945005178451538, 0.054798055440187454, 0.07822068780660629, -0.047647394239902496, -0.08013936877250671, -0.023516321554780006, -0.04272393137216568, -0.1544536054134369, 0.11175999045372009, -0.0681963711977005, 0.0823773443698883, 0.0008574571693316102, 0.014048426412045956, 0.034852899610996246, -0.0767442137002945, 0.09936758875846863, -0.04492233321070671, 0.09152580052614212, -0.04794040322303772, -0.0383179634809494, 0.20507031679153442, -0.08737628906965256, 0.14864520728588104, 0.018204646185040474, 0.058814454823732376, -0.11024287343025208, -0.047742076218128204, -0.08167140930891037, 0.10341593623161316, -0.03259928897023201, -0.08593348413705826, 
-0.059123240411281586, 0.026202382519841194, 0.04713642597198486, 0.005898863077163696, -0.011057679541409016, -0.12116015702486038, 0.028181184083223343, 0.16897907853126526, 0.10597366839647293, -0.04726530984044075, -0.13635729253292084, -0.06867999583482742, 0.002310433192178607, 0.04853711277246475, -0.14982223510742188, 0.058241453021764755, 0.060023996978998184, 0.05893843248486519, 0.08595724403858185, 0.04426104575395584, -0.08976320922374725, 0.011885790154337883, 0.04901430755853653, -0.10992269217967987, -0.1607438623905182, -0.028658224269747734, -0.07503307610750198, -0.07146699726581573, 0.010550388135015965, 0.09317736327648163, -0.0499054454267025, -0.012928352691233158, -0.012224762700498104, 0.0211048386991024, -0.011136024259030819, -0.0076248846016824245, 0.062062978744506836, 0.043441515415906906, -0.07320743799209595, 0.08868421614170074, 0.020967938005924225, -0.06586968898773193, 0.035331886261701584, 0.0558197982609272, -0.08919795602560043, -0.0746602937579155, -0.0883377343416214, 0.1834699958562851, -0.04383470118045807, -0.0735376700758934, -0.021266816183924675, -0.0860748142004013, 0.019424205645918846, 0.14989496767520905, 0.0449567474424839, -0.003551691537722945, -0.03020722232758999, 0.014764714986085892, -0.12170448154211044, 0.054358478635549545, -0.03436192497611046, 0.039319708943367004, -0.08243877440690994, 0.11733710765838623, 0.0450516901910305, 0.0987866148352623, -0.06262459605932236, -0.07271949201822281, -0.06068427115678787, 0.006469705607742071, -0.1348927617073059, 0.02716779336333275, -0.09622957557439804, -0.026598963886499405, -0.01665264181792736, 0.07521121203899384, -0.016356445848941803, 0.05198507010936737, -0.051542770117521286, -0.03631993383169174, -0.009421173483133316, 0.007191124372184277, -0.15308140218257904, 0.0030973476823419333, 0.019347982481122017, -0.07180328667163849, 0.08037693053483963, 0.004632166121155024, -0.07383445650339127, -0.03472324088215828, -0.06153234839439392, -0.08392342925071716, 0.0267435722053051, 0.037354227155447006, 0.0052255489863455296, -0.05227798596024513, 0.04710464924573898, 0.006480468902736902, -0.038797710090875626, -0.028831778094172478, 0.04974525794386864, -0.08938159048557281, 0.08522488921880722, 0.06282458454370499, -0.05125194415450096, -0.024434935301542282, 0.08634307235479355, 0.06297721713781357, 0.05325615033507347, 0.07793879508972168, 0.0009572014096193016, 0.08994375914335251, -0.15399236977100372, -0.015577791258692741, -0.011880972422659397, -0.03942227363586426, 0.03920939937233925, -0.05653580650687218, 0.028094938024878502, -0.027353959158062935, 0.19514644145965576, 0.1183960884809494, 0.06531678140163422, 0.01503803301602602, 0.07803045213222504, -0.010563275776803493, 0.03220229223370552, 0.07174038141965866, -0.04927326366305351, -0.0370219424366951, 0.0776667520403862, 0.031717449426651, -0.000602563435677439, 0.009053857065737247, 0.15805931389331818, 0.12860560417175293, 0.09504072368144989, 0.061140336096286774, 0.11376934498548508, 0.0280293095856905, -0.17124438285827637, -0.13232454657554626, 0.037867702543735504, 0.07884298264980316, -0.062418993562459946, -0.05208200961351395, 0.1449487805366516, -0.10613629221916199, 0.1571473926305771, 0.004478543531149626, -0.051436085253953934, -0.05442677065730095, -0.1711014211177826, -0.01905706338584423, 0.04706242308020592, 0.005175942555069923, -0.06751982122659683, 0.013062615878880024, 0.1587943732738495, -0.028643587604165077, -0.08804895728826523, 0.18108342587947845, -0.15428610146045685, 
-0.06904006749391556, 0.06330040842294693, 0.03454291820526123, 0.0422462522983551, 0.028719235211610794, 0.041001420468091965, 0.05977412313222885, 0.04828157648444176, 0.07754698395729065, 0.02701854705810547, 0.09817580133676529, 0.023041581735014915, -0.07627789676189423, -0.05783172696828842, 0.002330869436264038, 0.011254276148974895, -0.01136980764567852, 0.06160242483019829, 0.012638234533369541, -0.048430223017930984, -0.015382042154669762, 0.09380297362804413, -0.058636173605918884, -0.11442831158638, -0.1477341204881668, 0.2559189200401306, -0.037433721125125885, 0.0013604086125269532, -0.00252407300285995, -0.10133767127990723, -0.002291122917085886, 0.2108691781759262, 0.22558556497097015, -0.04907841607928276, 0.020643964409828186, -0.027256857603788376, 0.009150201454758644, -0.0044153970666229725, 0.13629364967346191, -0.01997755467891693, 0.2594327926635742, -0.026784589514136314, 0.10119902342557907, -0.011535693891346455, -0.06808418035507202, -0.10959778726100922, 0.08301941305398941, 0.007132920436561108, -0.039328448474407196, -0.03163943067193031, 0.03973051533102989, -0.04706505313515663, -0.1157306581735611, 0.09085126221179962, -0.07041727006435394, -0.0556485578417778, 0.027496276423335075, -0.030054913833737373, 0.008432978764176369, 0.06928884238004684, -0.018481526523828506, 0.008466617204248905, 0.16613361239433289, 0.013916599564254284, -0.07932904362678528, -0.025946905836462975, 0.04053165018558502, -0.14551547169685364, 0.19932302832603455, -0.006146900821477175, 0.020210521295666695, 0.09915408492088318, 0.03848424181342125, -0.09187097102403641, 0.020161231979727745, -0.001525361672975123, -0.013121116906404495, 0.004905973561108112, 0.10068673640489578, 0.004404560197144747, -0.10580892115831375, -0.015017092227935791, -0.08256460726261139, 0.020109735429286957, 0.05640380457043648, 0.010603235103189945, -0.03520696237683296, 0.0647180825471878, -0.10033465921878815, 0.1278255581855774, 0.1018248125910759, 0.0405929796397686, -0.031829364597797394, -0.05766202509403229, 0.0412382148206234, 0.03133194148540497, 0.01386670209467411, -0.058044783771038055, -0.09306200593709946, -0.010107312351465225, -0.015805236995220184, -0.011424136348068714, -0.25966396927833557, -0.061112381517887115, -0.006466709543019533, -0.01594030298292637, 0.011448469012975693, 0.024890771135687828, 0.09430816024541855, 0.04906688258051872, -0.023467838764190674, -0.002501236042007804, 0.0006196132744662464, 0.12242860347032547, -0.08813861012458801, -0.06166217848658562 ]
null
null
transformers
# MaskFormer

MaskFormer model trained on COCO panoptic segmentation (large-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). 

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerImageProcessor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

# load MaskFormer fine-tuned on COCO panoptic segmentation
processor = MaskFormerImageProcessor.from_pretrained("facebook/maskformer-swin-large-coco")
model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-large-coco")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")

outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to processor for postprocessing
result = processor.post_process_panoptic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_panoptic_map = result["segmentation"]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
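As an additional, hedged example (not part of the original model card): a minimal sketch that lists the segments found by the panoptic postprocessing above. It assumes each entry of `result["segments_info"]` exposes `id`, `label_id` and `score` keys (as in recent transformers versions), and reuses `result` and `model` from the snippet.

```python
# a minimal sketch, assuming result["segments_info"] entries expose
# "id", "label_id" and "score"
for segment in result["segments_info"]:
    # map the predicted label id to a human-readable COCO panoptic class name
    label = model.config.id2label[segment["label_id"]]
    print(f"segment {segment['id']}: {label} (score {segment['score']:.2f})")
```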
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["coco"], "widget": [{"src": "http://images.cocodataset.org/val2017/000000039769.jpg", "example_title": "Cats"}, {"src": "http://images.cocodataset.org/val2017/000000039770.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-large-coco
[ "transformers", "pytorch", "safetensors", "maskformer", "vision", "image-segmentation", "dataset:coco", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer

MaskFormer model trained on COCO panoptic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. 

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

!model image

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 61, 96, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (large-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06840242445468903, 0.08461162447929382, -0.003222570987418294, 0.02980940043926239, 0.12789686024188995, -0.0076076495461165905, 0.1191827729344368, 0.060841429978609085, -0.0178646482527256, 0.04307668283581734, 0.044213272631168365, 0.03267525136470795, 0.07555415481328964, 0.1397724151611328, 0.03666972368955612, -0.23682807385921478, 0.04830780252814293, -0.012393803335726261, 0.061904460191726685, 0.10902770608663559, 0.10893484205007553, -0.12084406614303589, 0.11646199226379395, 0.08361140638589859, -0.17313317954540253, 0.015957258641719818, 0.00734299048781395, -0.054804034531116486, 0.09520319849252701, 0.024821944534778595, 0.15259331464767456, -0.023773489519953728, 0.06910916417837143, -0.10400614142417908, 0.033489637076854706, 0.09558071196079254, -0.004561145789921284, 0.047312550246715546, 0.07844916731119156, -0.05788044631481171, 0.1195899099111557, -0.0441533587872982, 0.052936021238565445, 0.06310931593179703, -0.1085214614868164, -0.11756603419780731, -0.05617128312587738, 0.1600506603717804, 0.07173029333353043, 0.06969913840293884, -0.008604714646935463, 0.1136791780591011, -0.019239230081439018, 0.06538024544715881, 0.1208859384059906, -0.12125726044178009, -0.054876312613487244, 0.11197073757648468, 0.034515731036663055, -0.09718747437000275, -0.06551649421453476, 0.06611129641532898, 0.015373948961496353, 0.0313655287027359, 0.1636495590209961, -0.057065192610025406, 0.0014076529769226909, -0.062197256833314896, -0.10666574537754059, -0.10476129502058029, 0.04074478894472122, -0.02102551981806755, -0.06720471382141113, -0.21855320036411285, -0.11663813143968582, 0.14905542135238647, -0.02240755409002304, -0.0005856318166479468, 0.027323663234710693, -0.008173256181180477, 0.06845103204250336, -0.07739146053791046, -0.11168141663074493, -0.04681820049881935, -0.013481135480105877, 0.09080763161182404, 0.008760744705796242, 0.06598696857690811, -0.041761886328458786, 0.0951986163854599, -0.11535067856311798, -0.1132010743021965, -0.05747516080737114, -0.13387519121170044, -0.05544723570346832, 0.0026969993487000465, -0.0441778264939785, -0.12892255187034607, -0.007782765198498964, 0.14212869107723236, -0.02105521224439144, 0.02110113948583603, 0.003000160912051797, 0.03874603658914566, 0.0707748755812645, 0.15783032774925232, -0.05406174063682556, 0.0768587589263916, 0.04243135452270508, 0.004552363883703947, 0.07493836432695389, -0.04869586229324341, -0.07916536182165146, -0.007165318820625544, -0.01435439009219408, 0.05706615746021271, 0.023655390366911888, 0.06696897000074387, -0.009729250334203243, -0.05158601328730583, 0.21524956822395325, -0.12465769052505493, 0.008899277076125145, 0.024148734286427498, -0.0016830943059176207, -0.04092063382267952, 0.08475537598133087, -0.029880303889513016, -0.08961929380893707, 0.04652274772524834, -0.0746411681175232, 0.022193634882569313, -0.13078880310058594, -0.10486621409654617, 0.019482459872961044, -0.14368896186351776, -0.040942080318927765, -0.1413966566324234, -0.16446855664253235, -0.020547833293676376, 0.03426893800497055, 0.01049328874796629, 0.06217855587601662, 0.04858352243900299, -0.026812994852662086, -0.03114921972155571, 0.021108543500304222, -0.009257747791707516, -0.011105356737971306, 0.008232819847762585, -0.013988775201141834, 0.04464957118034363, -0.01310123410075903, 0.018587103113532066, -0.10944510996341705, 0.07649707049131393, -0.23934414982795715, 0.06691831350326538, -0.005752864759415388, -0.03522191569209099, -0.020633792504668236, -0.06212376058101654, -0.022392278537154198, 
0.05706115812063217, 0.0024610343389213085, 0.12540528178215027, -0.12915252149105072, -0.028694462031126022, 0.26648232340812683, -0.17841121554374695, -0.06952039897441864, 0.09094011038541794, -0.05810976028442383, -0.023956837132573128, 0.05259076505899429, 0.13928593695163727, 0.11480728536844254, -0.1217838004231453, -0.022594289854168892, -0.012827841565012932, -0.10154786705970764, 0.0070172562263906, 0.03538057208061218, -0.03617092967033386, 0.09451409429311752, 0.025648698210716248, -0.05946749448776245, -0.04296334087848663, -0.004250459372997284, -0.05131819099187851, -0.019755853340029716, -0.016694309189915657, 0.04513975605368614, 0.017696166411042213, -0.05363699793815613, -0.03068172000348568, -0.09167106449604034, 0.01091723795980215, 0.05401686951518059, -0.0925629511475563, -0.007149955723434687, -0.07181115448474884, 0.12901602685451508, -0.09599112719297409, -0.0036458969116210938, -0.2169642448425293, -0.09342608600854874, 0.006655691657215357, -0.05118285492062569, 0.03467924892902374, 0.017795505002141, 0.02260296605527401, 0.06873425841331482, -0.016692737117409706, 0.0031728276517242193, -0.04999122396111488, -0.020519375801086426, -0.025574816390872, -0.035122405737638474, -0.1032465323805809, -0.06615626811981201, 0.12999796867370605, -0.12559764087200165, 0.06608380377292633, 0.00013886428496334702, 0.1008666455745697, 0.013061551377177238, -0.05584762617945671, 0.062280137091875076, -0.0009230186115019023, -0.030875833705067635, -0.07007218897342682, 0.06724460422992706, -0.0027396916411817074, 0.024171877652406693, 0.08112316578626633, -0.22242705523967743, -0.08273810893297195, 0.09814425557851791, -0.06172672659158707, -0.07040999084711075, 0.0024430868215858936, -0.03460942581295967, 0.005628017708659172, -0.10772544145584106, -0.06559552252292633, 0.16573180258274078, 0.015798306092619896, 0.1295793652534485, -0.11443836241960526, -0.00912188645452261, 0.05543176084756851, -0.040267035365104675, -0.027761075645685196, -0.01068051252514124, 0.05497490614652634, -0.12388770282268524, 0.09479325264692307, 0.06393454223871231, 0.04615112394094467, 0.13152702152729034, 0.023731758818030357, -0.04280373826622963, 0.018000658601522446, -0.022857386618852615, 0.007546071894466877, 0.1531612128019333, -0.06174280866980553, -0.05304592475295067, 0.0687103271484375, -0.03769342601299286, 0.04133594036102295, -0.12785132229328156, 0.04782084375619888, 0.04657692462205887, -0.011193451471626759, -0.023833677172660828, 0.0021550729870796204, 0.02914252318441868, 0.040552545338869095, 0.03413645550608635, -0.009071426466107368, -0.0032366265077143908, -0.040512315928936005, -0.11429522931575775, 0.168989896774292, -0.056899573653936386, -0.3838682174682617, -0.15755189955234528, 0.04157469421625137, -0.09716849029064178, 0.05382866784930229, 0.003109755925834179, 0.0445084273815155, -0.07988709956407547, -0.12595133483409882, -0.039132773876190186, 0.006004473194479942, -0.09825236350297928, -0.0533536933362484, -0.0013471314450725913, 0.0010309926001355052, -0.1377326250076294, -0.034542784094810486, -0.02289380319416523, -0.028341928496956825, 0.05980692431330681, 0.017365837469697, 0.1085825115442276, 0.15073218941688538, -0.06790881603956223, -0.0007111267186701298, -0.0479230098426342, 0.12340892851352692, -0.05364237725734711, 0.07766406238079071, 0.2641483247280121, -0.07930698990821838, 0.08576148748397827, 0.08539494127035141, -0.007751069497317076, -0.05180603638291359, -0.014588050544261932, 0.007015264593064785, -0.11303283274173737, -0.10427151620388031, 
-0.095712810754776, -0.07084845006465912, 0.009531383402645588, 0.0674617812037468, 0.024752769619226456, 0.08305305987596512, 0.08355265855789185, -0.07591933012008667, -0.07212665677070618, 0.014715498313307762, 0.12283128499984741, 0.0654602125287056, -0.019283272325992584, 0.09723947197198868, -0.0781707838177681, -0.024252714589238167, 0.06437687575817108, -0.007170419674366713, 0.15768077969551086, -0.003359790425747633, 0.0246069747954607, 0.09566795825958252, -0.007882761768996716, 0.07995825260877609, 0.11731258779764175, -0.03898624703288078, 0.013766823336482048, -0.03447346016764641, -0.09299026429653168, -0.11901404708623886, 0.049063604325056076, 0.044238217175006866, -0.008413437753915787, -0.07329150289297104, -0.009898800402879715, 0.05733290687203407, 0.18599960207939148, 0.07525613158941269, -0.2512582242488861, -0.06288232654333115, 0.01326797530055046, 0.02355317398905754, -0.122517429292202, -0.01890958659350872, 0.1463143527507782, -0.16454197466373444, 0.07799644023180008, -0.05443635210394859, 0.10537552833557129, -0.027029389515519142, 0.011680162511765957, -0.09327106177806854, 0.10947605222463608, 0.011175678111612797, 0.09951592981815338, -0.06530115753412247, 0.10051601380109787, 0.015247504226863384, 0.07134797424077988, -0.12709520757198334, 0.05749724432826042, 0.0710371732711792, 0.10275763273239136, 0.15260440111160278, 0.008710495196282864, -0.01813998818397522, -0.029377836734056473, -0.06656914204359055, 0.012541192583739758, 0.07102569192647934, -0.040927011519670486, 0.057810794562101364, -0.008623783476650715, -0.03007197566330433, -0.01655641198158264, 0.031409863382577896, -0.046204403042793274, -0.13656629621982574, 0.01531742513179779, -0.06599599123001099, -0.06260038167238235, -0.06690366566181183, 0.006854402367025614, -0.04935755208134651, 0.19450028240680695, -0.014248269610106945, -0.11241979151964188, -0.1362168788909912, -0.06517164409160614, 0.048509303480386734, -0.041327137500047684, 0.09845507889986038, -0.06128007546067238, 0.2257884293794632, -0.07446538656949997, -0.11491073668003082, 0.021024301648139954, -0.08997170627117157, -0.08324652910232544, -0.0253764595836401, 0.13482190668582916, -0.020482484251260757, -0.004990559536963701, 0.08034629374742508, 0.0544903464615345, -0.04438094422221184, -0.0986451655626297, 0.0039875623770058155, 0.14484404027462006, 0.1495305299758911, 0.13221298158168793, -0.043363992124795914, -0.19412453472614288, -0.01853402517735958, 0.08827527612447739, 0.08569904416799545, 0.10699453949928284, -0.07170642167329788, 0.08880139887332916, 0.1536344587802887, -0.06799111515283585, -0.29405876994132996, -0.0073277694173157215, -0.03813262656331062, 0.043042536824941635, 0.036849647760391235, -0.0867798924446106, 0.04201158881187439, 0.03986810892820358, -0.02773096412420273, 0.056208495050668716, -0.19930338859558105, -0.10171575844287872, 0.061131153255701065, 0.10867421329021454, -0.03075391612946987, -0.0950000137090683, -0.02134268544614315, -0.04198272526264191, -0.14915578067302704, 0.09568984806537628, -0.11970017850399017, 0.07502023875713348, -0.00018277006165590137, -0.01685965806245804, 0.033980365842580795, -0.09122811257839203, 0.09729906916618347, -0.0642082542181015, 0.09330292046070099, -0.05198926478624344, -0.013577482663094997, 0.20096877217292786, -0.08461009711027145, 0.13384374976158142, 0.0046808416955173016, 0.0629027932882309, -0.080854631960392, -0.04944181442260742, -0.07507007569074631, 0.11283720284700394, -0.03422308713197708, -0.08396641165018082, -0.061196763068437576, 
0.03065662831068039, 0.043514497578144073, 0.004881179891526699, 0.0034923702478408813, -0.12405534833669662, 0.05378718674182892, 0.1956789195537567, 0.1378287672996521, -0.046525754034519196, -0.11488264054059982, -0.06895049661397934, -0.01747766323387623, 0.05295122042298317, -0.1340203434228897, 0.06299305707216263, 0.05486268922686577, 0.04709820821881294, 0.06383124738931656, 0.047753751277923584, -0.07243434339761734, 0.017540713772177696, 0.05405840650200844, -0.12490589171648026, -0.15367434918880463, -0.009689690545201302, -0.05762780085206032, -0.09203827381134033, 0.026647325605154037, 0.10071511566638947, -0.05160967633128166, -0.018195899203419685, -0.014366239309310913, 0.026877539232373238, -0.003034289926290512, 0.006700551602989435, 0.05709264799952507, 0.05014296993613243, -0.07557092607021332, 0.0910208597779274, 0.03223734349012375, -0.06983000040054321, 0.016813509166240692, 0.030358755961060524, -0.1073504090309143, -0.08470604568719864, -0.07320491969585419, 0.15343761444091797, -0.03852790966629982, -0.08471614867448807, -0.033861152827739716, -0.09699485450983047, -0.0025301489513367414, 0.17140275239944458, 0.05617937818169594, 0.021892771124839783, -0.009212009608745575, 0.008237832225859165, -0.13582605123519897, 0.05876754969358444, -0.02750089019536972, 0.05221845209598541, -0.09247216582298279, 0.12952551245689392, 0.05238460749387741, 0.08482769876718521, -0.06543878465890884, -0.057615235447883606, -0.06650368124246597, 0.0073715453036129475, -0.1391584873199463, 0.05269447714090347, -0.09877952188253403, -0.026902416720986366, -0.020079130306839943, 0.07951215654611588, -0.0093830032274127, 0.05622593313455582, -0.04108597710728645, -0.02136952616274357, -0.015367986634373665, 0.017225034534931183, -0.16544052958488464, -0.010670211166143417, 0.016845719888806343, -0.0802045613527298, 0.08849366754293442, 0.004158075898885727, -0.07357791066169739, -0.022287506610155106, -0.09700299054384232, -0.07689693570137024, 0.03470991179347038, 0.0360804907977581, -0.009709234349429607, -0.0689004436135292, 0.0355558767914772, 0.020724544301629066, -0.05139850080013275, -0.01386253722012043, 0.0593932680785656, -0.08692089468240738, 0.08778407424688339, 0.055105604231357574, -0.04596317932009697, -0.030026333406567574, 0.0707532986998558, 0.06601016968488693, 0.04220679774880409, 0.07896601408720016, -0.01966337487101555, 0.08715029805898666, -0.15037491917610168, -0.01807980425655842, -0.016505831852555275, -0.05167599767446518, 0.00984366424381733, -0.04564176872372627, 0.035577189177274704, -0.020468473434448242, 0.19288763403892517, 0.08715315163135529, 0.08644536882638931, 0.012936762534081936, 0.05919753015041351, 0.03581695631146431, 0.0444725900888443, 0.08966637402772903, -0.04413428530097008, -0.04661400616168976, 0.07006217539310455, 0.02823750488460064, 0.009415085427463055, -0.008666480891406536, 0.1656547486782074, 0.13615508377552032, 0.07425638288259506, 0.05930626019835472, 0.08946926146745682, 0.02074751816689968, -0.15873132646083832, -0.1443500816822052, 0.024840885773301125, 0.055303264409303665, -0.05524671822786331, -0.04449472203850746, 0.15807034075260162, -0.10930636525154114, 0.11904917657375336, 0.004343338776379824, -0.046458229422569275, -0.05481298267841339, -0.15911324322223663, -0.031384654343128204, 0.05134790390729904, 0.00005394233448896557, -0.0659823939204216, -0.002692603040486574, 0.17222808301448822, -0.032065581530332565, -0.07574976235628128, 0.20090177655220032, -0.14805404841899872, -0.04807773604989052, 
0.04981035739183426, 0.04066219553351402, 0.0489269495010376, 0.04008276388049126, 0.026980288326740265, 0.05959369242191315, 0.03869663178920746, 0.08261283487081528, 0.029939092695713043, 0.11926023662090302, 0.02904791198670864, -0.06569619476795197, -0.05975373089313507, 0.006126810796558857, 0.02038021758198738, 0.0060301716439425945, 0.07073000073432922, 0.021962085738778114, -0.036489520221948624, -0.008520493283867836, 0.11510772258043289, -0.06464475393295288, -0.12001025676727295, -0.13237658143043518, 0.2748907506465912, -0.03523971140384674, -0.010441615246236324, 0.01565326377749443, -0.09742511808872223, 0.02066550962626934, 0.2144424468278885, 0.2050221860408783, -0.033830977976322174, 0.035854775458574295, -0.030592482537031174, 0.0067746578715741634, -0.008464095182716846, 0.13963323831558228, -0.012846267782151699, 0.2732185423374176, -0.03327689319849014, 0.07353159785270691, 0.006575288716703653, -0.04987674951553345, -0.12522540986537933, 0.05998562276363373, -0.01731494441628456, -0.039824388921260834, -0.035978399217128754, 0.03718595206737518, -0.036907292902469635, -0.10517134517431259, 0.11357986181974411, -0.0686703622341156, -0.051656462252140045, 0.026319071650505066, -0.026095261797308922, -0.005592959467321634, 0.0675206258893013, -0.022379035130143166, 0.010930297896265984, 0.15963098406791687, 0.010373761877417564, -0.07070799916982651, -0.008548588491976261, 0.026488108560442924, -0.12695270776748657, 0.24505004286766052, -0.008310986682772636, 0.04831590875983238, 0.0940796360373497, 0.027318214997649193, -0.09115813672542572, 0.026645418256521225, 0.009202552028000355, -0.028467081487178802, 0.03120039589703083, 0.1012265607714653, -0.0035349729005247355, -0.09063519537448883, -0.0021084430627524853, -0.08292856067419052, 0.0283161923289299, 0.04551347345113754, 0.01110205426812172, -0.03819718211889267, 0.051207080483436584, -0.09772166609764099, 0.13282713294029236, 0.0863732099533081, 0.029276957735419273, -0.027547361329197884, -0.04484453797340393, 0.04248792678117752, 0.0356031097471714, 0.02469908446073532, -0.03820647671818733, -0.10293309390544891, 0.0015772057231515646, -0.010578231886029243, -0.007537617348134518, -0.2815037965774536, -0.04812343046069145, -0.015426349826157093, -0.02021702378988266, -0.008650957606732845, 0.022102050483226776, 0.07127371430397034, 0.045911435037851334, -0.03046959452331066, -0.01035313494503498, -0.006504944991320372, 0.12276653945446014, -0.07856247574090958, -0.07476204633712769 ]
null
null
transformers
# MaskFormer

MaskFormer model trained on ADE20k semantic segmentation (small-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). 

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

url = "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-small-ade")
inputs = feature_extractor(images=image, return_tensors="pt")

model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-small-ade")
outputs = model(**inputs)

# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to feature_extractor for postprocessing
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_semantic_map = feature_extractor.post_process_semantic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
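As an additional, hedged example (not part of the original model card): a minimal sketch that counts how many pixels each predicted ADE20k class covers, reusing `predicted_semantic_map` and `model` from the snippet above.

```python
import torch

# per-class pixel counts from the (height, width) map of class ids
ids, counts = torch.unique(predicted_semantic_map, return_counts=True)
for class_id, count in zip(ids.tolist(), counts.tolist()):
    print(f"{model.config.id2label[class_id]}: {count} pixels")
```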
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["scene_parse_150"], "widget": [{"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg", "example_title": "House"}, {"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-small-ade
[ "transformers", "pytorch", "safetensors", "maskformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on ADE20k semantic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 66, 98, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06975061446428299, 0.10949511080980301, -0.0043597049079835415, 0.030357396230101585, 0.14802038669586182, -0.014016514644026756, 0.1406008005142212, 0.07603069394826889, -0.023243429139256477, 0.06925026327371597, 0.048829108476638794, 0.006119974423199892, 0.0849381759762764, 0.12641434371471405, 0.03423324227333069, -0.24158929288387299, 0.051313817501068115, -0.06253235042095184, 0.05223990976810455, 0.08949493616819382, 0.1084861159324646, -0.1203618049621582, 0.11642267554998398, 0.08780815452337265, -0.14755384624004364, 0.011292104609310627, -0.014177409932017326, -0.07466277480125427, 0.10084331780672073, 0.026984620839357376, 0.124736487865448, -0.047209229320287704, 0.08414103835821152, -0.09894556552171707, 0.03475375846028328, 0.1192295029759407, 0.009040978737175465, 0.052045244723558426, 0.05677424371242523, -0.021612806245684624, 0.11367715895175934, -0.060144711285829544, 0.03798036649823189, 0.05834726616740227, -0.145109623670578, -0.13422048091888428, -0.07323011755943298, 0.153496652841568, 0.06891335546970367, 0.07022668421268463, 0.001087466604076326, 0.0866813138127327, 0.011008404195308685, 0.06985973566770554, 0.1480676829814911, -0.08864214271306992, -0.04161302372813225, 0.08925606310367584, 0.04092853143811226, -0.07934705168008804, -0.0671931579709053, 0.05710091069340706, 0.0038584289140999317, 0.024096164852380753, 0.15228484570980072, -0.03183586895465851, 0.06771000474691391, -0.04811234027147293, -0.11531558632850647, -0.10359887033700943, 0.03285166621208191, 0.0021924744360148907, -0.08392628282308578, -0.18586334586143494, -0.12821635603904724, 0.1417553424835205, -0.003329048864543438, -0.015509238466620445, 0.02253437601029873, 0.008080733940005302, 0.07711280137300491, -0.1148180440068245, -0.08738318830728531, -0.03610406070947647, 0.008279997855424881, 0.0852225050330162, -0.001560979406349361, 0.0945848822593689, -0.07504445314407349, 0.07844122499227524, -0.16971254348754883, -0.09827610105276108, -0.05844546854496002, -0.1357085406780243, -0.05006304383277893, -0.010379236191511154, 0.012117530219256878, -0.1824958324432373, -0.026480376720428467, 0.10474111884832382, -0.02473408915102482, 0.02049541473388672, 0.025160418823361397, 0.03152330964803696, 0.07702331244945526, 0.16579078137874603, -0.06268983334302902, 0.03713557869195938, 0.03988673910498619, 0.02778046764433384, 0.06267639249563217, -0.04674643650650978, -0.06507675349712372, 0.012213252484798431, -0.026978719979524612, 0.05937426909804344, 0.02968967892229557, 0.06171255186200142, 0.014246426522731781, -0.04400995001196861, 0.15961632132530212, -0.12139974534511566, 0.021489908918738365, 0.03129636496305466, -0.03489452227950096, -0.061255745589733124, 0.08474059402942657, -0.055964164435863495, -0.091136634349823, 0.044330984354019165, -0.06198874115943909, 0.008655799552798271, -0.1481088101863861, -0.1288701295852661, 0.010221995413303375, -0.12481530010700226, -0.06000784784555435, -0.11935186386108398, -0.18648871779441833, -0.009395021013915539, 0.04857027903199196, 0.022671379148960114, 0.07033096253871918, 0.04432329535484314, -0.03509163856506348, -0.011440839618444443, 0.04404451325535774, -0.06289881467819214, -0.001951666665263474, -0.0068327998742461205, -0.0033828297164291143, 0.06806059926748276, -0.02343406155705452, 0.03328375890851021, -0.09373759478330612, 0.07444585859775543, -0.259059876203537, 0.07013919204473495, 0.009888040833175182, -0.03490534424781799, -0.03691544383764267, -0.050398800522089005, -0.002422062214463949, 0.036041852086782455, 
0.03349440544843674, 0.1376473307609558, -0.1636112630367279, -0.02267427369952202, 0.23397967219352722, -0.16257831454277039, -0.058637429028749466, 0.08951503783464432, -0.06924931704998016, 0.0018447207985445857, 0.07555900514125824, 0.12035966664552689, 0.07641143351793289, -0.12032174319028854, -0.016750987619161606, -0.023534156382083893, -0.13604877889156342, 0.02940729446709156, 0.025561435148119926, -0.050883784890174866, 0.11711034923791885, 0.008951311931014061, -0.05984940752387047, -0.04659610241651535, -0.020609861239790916, -0.042370084673166275, -0.0015735354973003268, -0.0036026043817400932, 0.030601533129811287, -0.011830463074147701, -0.08640658110380173, -0.040000807493925095, -0.09625667333602905, 0.05212235450744629, 0.048204414546489716, -0.07391344755887985, -0.018199998885393143, -0.05475698783993721, 0.1398477554321289, -0.09338334947824478, -0.0025059983599931, -0.1807824820280075, -0.09007447212934494, 0.0033424156717956066, -0.09224984794855118, 0.052000485360622406, -0.018394509330391884, 0.0328659787774086, 0.09210925549268723, 0.0048592546954751015, -0.010109654627740383, -0.07938806712627411, -0.013319391757249832, -0.029696805402636528, -0.044419676065444946, -0.09983806312084198, -0.07531347870826721, 0.07589863240718842, -0.12577924132347107, 0.06939145922660828, -0.0074667371809482574, 0.07379252463579178, 0.026791393756866455, -0.0741761326789856, 0.05448541045188904, -0.02880805917084217, -0.0213655773550272, -0.08835197240114212, 0.05801934003829956, 0.007283441256731749, 0.04351195693016052, 0.07022285461425781, -0.25456517934799194, -0.1439104974269867, 0.07284235209226608, -0.06255953013896942, -0.07123181968927383, 0.031078845262527466, -0.012793325819075108, 0.015860501676797867, -0.12525440752506256, -0.08778346329927444, 0.1367766261100769, 0.02345401979982853, 0.14764513075351715, -0.10302591323852539, 0.007468144875019789, 0.04947708174586296, -0.04816749319434166, -0.03824624791741371, -0.010542143136262894, 0.037791140377521515, -0.1464044451713562, 0.11027311533689499, 0.08538148552179337, 0.07790237665176392, 0.16611529886722565, 0.03705613315105438, -0.0398784838616848, -0.01680312119424343, 0.003042235504835844, -0.010420956648886204, 0.14467662572860718, -0.08395584672689438, -0.040525734424591064, 0.04792289808392525, -0.02838558331131935, 0.02635369822382927, -0.12699483335018158, 0.06235342100262642, 0.07671709358692169, -0.019092988222837448, -0.015961971133947372, -0.018071424216032028, 0.02709342911839485, 0.04436873644590378, 0.06896704435348511, -0.02134476602077484, 0.0018875006353482604, -0.042973071336746216, -0.10985885560512543, 0.15096649527549744, -0.07209744304418564, -0.3644750416278839, -0.1754656881093979, 0.01909816823899746, -0.07679635286331177, 0.050043072551488876, 0.013482051901519299, 0.016695508733391762, -0.07051806151866913, -0.1098291203379631, 0.01485755667090416, -0.002747942926362157, -0.11169218271970749, -0.008111122995615005, -0.0025132845621556044, -0.013564224354922771, -0.13740384578704834, -0.04339412599802017, -0.014427955262362957, -0.010703702457249165, 0.03306743875145912, 0.06970064342021942, 0.09591846913099289, 0.11385221034288406, -0.08487706631422043, 0.008691665716469288, -0.028612833470106125, 0.1737765669822693, -0.050361569970846176, 0.12629170715808868, 0.2523018419742584, -0.09809982031583786, 0.11398189514875412, 0.09915608912706375, -0.0019064756343141198, -0.032910071313381195, -0.0016378202708438039, 0.01776065304875374, -0.11964869499206543, -0.131165012717247, 
-0.061122044920921326, -0.053403083235025406, -0.0031119873747229576, 0.039900943636894226, 0.019938400015234947, 0.10166225582361221, 0.058770328760147095, -0.054297059774398804, -0.07278209179639816, 0.01872202195227146, 0.12677009403705597, 0.04161596670746803, -0.03129368647933006, 0.07979713380336761, -0.08037100732326508, -0.03102920763194561, 0.06473129987716675, -0.05565466359257698, 0.1957664042711258, 0.03211183473467827, 0.045239705592393875, 0.07792017608880997, -0.057552311569452286, 0.0862509161233902, 0.11579404771327972, -0.030928034335374832, 0.014357632026076317, -0.043809711933135986, -0.10408006608486176, -0.07931637018918991, 0.06346219778060913, 0.017248237505555153, 0.02583266608417034, -0.08072042465209961, 0.06871102750301361, 0.061328623443841934, 0.2411143183708191, 0.08744427561759949, -0.23940876126289368, -0.07238607853651047, 0.006843249779194593, 0.007291276473551989, -0.13566121459007263, -0.03711877018213272, 0.15375371277332306, -0.1721879243850708, 0.042772747576236725, -0.05748185142874718, 0.10580958425998688, -0.036684807389974594, 0.015792710706591606, -0.05249893665313721, 0.1514897346496582, -0.01663232408463955, 0.06393834948539734, -0.050313714891672134, 0.10146158933639526, 0.029974645003676414, 0.09787219017744064, -0.0998707041144371, 0.06489615887403488, 0.08663741499185562, 0.07076171040534973, 0.13767215609550476, 0.028833480551838875, -0.055425871163606644, -0.009374504908919334, -0.0436224602162838, -0.01538908388465643, 0.0929950550198555, 0.0021593847777694464, 0.06842480599880219, -0.013492811471223831, -0.020729728043079376, -0.005252548027783632, 0.0741628110408783, -0.07460390776395798, -0.0960770919919014, 0.0065436409786343575, -0.0417737178504467, -0.03733983263373375, -0.06625164300203323, 0.021568909287452698, -0.06811331957578659, 0.1881038099527359, 0.0013796711573377252, -0.10330020636320114, -0.13878537714481354, -0.031447071582078934, 0.015658749267458916, -0.05557173117995262, 0.09612956643104553, -0.0964028462767601, 0.1932712197303772, -0.0631529837846756, -0.11547768861055374, 0.021749628707766533, -0.08722685277462006, -0.06991865485906601, -0.030850516632199287, 0.10459015518426895, 0.02466980367898941, -0.027583979070186615, 0.05699102580547333, 0.06002229079604149, -0.03412846103310585, -0.09636475145816803, 0.036659423261880875, 0.166023388504982, 0.08437198400497437, 0.15660473704338074, -0.058686237782239914, -0.20959889888763428, -0.009475470520555973, 0.07558907568454742, 0.14498135447502136, 0.13924995064735413, -0.07447993010282516, 0.06347472965717316, 0.17804284393787384, -0.0874754786491394, -0.2818450629711151, 0.04028094932436943, -0.03862851485610008, 0.047590501606464386, 0.045518334954977036, -0.1026439443230629, 0.025157257914543152, 0.07069732248783112, -0.006898542866110802, 0.04355529323220253, -0.18526113033294678, -0.08148156106472015, 0.0643661767244339, 0.07178948819637299, -0.07735222578048706, -0.09937179833650589, -0.03266986832022667, -0.025188978761434555, -0.11552661657333374, 0.0861625000834465, -0.09976468980312347, 0.08074624836444855, 0.0080834049731493, -0.0003723512345459312, 0.03731054440140724, -0.08426573872566223, 0.09544902294874191, -0.08281701803207397, 0.09361155331134796, -0.05990345776081085, -0.06303852796554565, 0.2271951138973236, -0.11310259252786636, 0.148783877491951, 0.030029503628611565, 0.06364692002534866, -0.08542345464229584, -0.019444813951849937, -0.07004065811634064, 0.10527491569519043, -0.046080585569143295, -0.08999250829219818, -0.07398442178964615, 
0.03273330628871918, 0.05486813932657242, 0.005243148189038038, 0.009711132384836674, -0.08992558717727661, 0.05153706669807434, 0.20618975162506104, 0.0730099305510521, -0.0055516562424600124, -0.1195051521062851, -0.0576210543513298, 0.0061607058160007, 0.03660391643643379, -0.13282880187034607, 0.06002184748649597, 0.060996267944574356, 0.06812889873981476, 0.12032586336135864, 0.0427229180932045, -0.11168266832828522, 0.01723966747522354, 0.032718826085329056, -0.12225805968046188, -0.16937734186649323, -0.02798687294125557, -0.04157843068242073, -0.07612723857164383, 0.01841176114976406, 0.11020266264677048, -0.059120114892721176, 0.0015178765170276165, -0.03774230182170868, 0.04566812515258789, -0.013767124153673649, -0.0030710326973348856, 0.07246004790067673, 0.0462532676756382, -0.05001305416226387, 0.11255848407745361, 0.024674734100699425, -0.06963901221752167, 0.02429777942597866, 0.07745916396379471, -0.08183667808771133, -0.07148449867963791, -0.037025753408670425, 0.18345265090465546, -0.029102584347128868, -0.08800844103097916, -0.011918974108994007, -0.0760202631354332, 0.0200326070189476, 0.17706532776355743, 0.02399594895541668, -0.015910087153315544, -0.00991761963814497, 0.03475965932011604, -0.10629013925790787, 0.08183972537517548, -0.03971049562096596, 0.03587324917316437, -0.07854706048965454, 0.12621107697486877, 0.028211204335093498, 0.08060451596975327, -0.05205678567290306, -0.09161549806594849, -0.07668434083461761, 0.010834192857146263, -0.15688754618167877, 0.031541962176561356, -0.09222232550382614, -0.0381833016872406, -0.020021522417664528, 0.08199413120746613, 0.01064229104667902, 0.05454997718334198, -0.03152016922831535, -0.04600469395518303, -0.018170863389968872, 0.004789774306118488, -0.17666171491146088, -0.012139691971242428, 0.01151141058653593, -0.05860764533281326, 0.0837157592177391, -0.0008333693258464336, -0.06507313251495361, -0.05588379129767418, -0.09045304358005524, -0.06602843850851059, 0.023166701197624207, 0.014154814183712006, 0.030483083799481392, -0.09150380641222, 0.03307183086872101, -0.0204215906560421, -0.0710550919175148, -0.030482016503810883, 0.07176267355680466, -0.07506823539733887, 0.08113843202590942, 0.06701906770467758, -0.031254831701517105, -0.03151240199804306, 0.08976689726114273, 0.05485706403851509, 0.03957977145910263, 0.07453741878271103, 0.0025112635921686888, 0.0978735163807869, -0.144396111369133, -0.013382505625486374, -0.008782913908362389, -0.03839972987771034, 0.026926686987280846, -0.02375694364309311, 0.02649593912065029, -0.041219327598810196, 0.12198575586080551, 0.11713042110204697, 0.03678920120000839, 0.023902585729956627, 0.026533231139183044, -0.05077284574508667, 0.027750080451369286, 0.03684911131858826, -0.06098482385277748, -0.058323170989751816, 0.05911567062139511, 0.02476208657026291, 0.006387454457581043, -0.014163070358335972, 0.16247481107711792, 0.13972778618335724, 0.11044871062040329, 0.07447821646928787, 0.09554296731948853, 0.02307310327887535, -0.18036238849163055, -0.17112517356872559, 0.06289689987897873, 0.07639269530773163, -0.06752854585647583, -0.05485305190086365, 0.16187776625156403, -0.0979626476764679, 0.1469355672597885, -0.02399609610438347, -0.04743647947907448, -0.07206070423126221, -0.13587255775928497, -0.01375898253172636, 0.0692068487405777, -0.012475169263780117, -0.0701848566532135, 0.03104265034198761, 0.11994630098342896, -0.01888258196413517, -0.08327368646860123, 0.17885933816432953, -0.15630726516246796, -0.07144813984632492, 0.07287541776895523, 
0.037840649485588074, 0.04714297503232956, 0.03483619913458824, 0.032901667058467865, 0.04924865812063217, 0.03886716440320015, 0.07335753738880157, 0.055268801748752594, 0.10345882177352905, 0.04531378298997879, -0.08568762242794037, -0.07480709254741669, 0.016038889065384865, 0.021183203905820847, -0.023407498374581337, 0.06093864515423775, 0.01712149754166603, -0.04527060687541962, -0.0014247898943722248, 0.12194056063890457, -0.07355710864067078, -0.09949154406785965, -0.1385151743888855, 0.2694178521633148, -0.056652165949344635, 0.012128657661378384, -0.014959529973566532, -0.11899617314338684, 0.007839293219149113, 0.16767343878746033, 0.22800830006599426, -0.03973591327667236, 0.030813174322247505, -0.048673365265131, 0.0011504336725920439, -0.014564630575478077, 0.11600445210933685, -0.0030094757676124573, 0.23811693489551544, -0.02979080006480217, 0.16340650618076324, -0.029704220592975616, -0.07238608598709106, -0.11539524048566818, 0.074656181037426, 0.002663214923813939, -0.0028985498938709497, -0.04008451849222183, 0.06260358542203903, -0.027408000081777573, -0.1694229394197464, 0.08513356745243073, -0.05052116885781288, -0.06063045561313629, 0.03949468210339546, -0.03531987965106964, 0.020221006125211716, 0.0667504370212555, -0.008788389153778553, 0.004826078191399574, 0.14656634628772736, 0.018867071717977524, -0.053577352315187454, -0.0642160028219223, 0.037173766642808914, -0.16581960022449493, 0.21419979631900787, 0.00896427221596241, -0.012793323956429958, 0.11156638711690903, 0.01629588194191456, -0.12075839191675186, 0.04218830168247223, 0.011031555011868477, -0.02064616233110428, -0.0030372128821909428, 0.10915312170982361, -0.004491873551160097, -0.060156017541885376, -0.0178338885307312, -0.08784082531929016, 0.01317786704748869, 0.07142268121242523, -0.01492999866604805, -0.05858756601810455, 0.06733173877000809, -0.08295105397701263, 0.12270687520503998, 0.10216327756643295, 0.0326666496694088, -0.0009761067340150476, -0.05705306679010391, 0.037249889224767685, 0.05104658007621765, 0.0034016857389360666, -0.03009866550564766, -0.0947243720293045, 0.01685195416212082, 0.009025514125823975, 0.017903175204992294, -0.2505945563316345, -0.06275410950183868, -0.021238550543785095, -0.01395929604768753, 0.011850074864923954, 0.021670296788215637, 0.12501946091651917, 0.03358861804008484, -0.02347196452319622, 0.020954012870788574, -0.016342585906386375, 0.1396486461162567, -0.07465128600597382, -0.06773388385772705 ]
null
null
transformers
# MaskFormer MaskFormer model trained on COCO panoptic segmentation (small-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png) ## Intended uses & limitations You can use this particular checkpoint for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ```python from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation from PIL import Image import requests # load MaskFormer fine-tuned on COCO panoptic segmentation feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-small-coco") model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-small-coco") url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) inputs = feature_extractor(images=image, return_tensors="pt") outputs = model(**inputs) # model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)` # and masks_queries_logits of shape `(batch_size, num_queries, height, width)` class_queries_logits = outputs.class_queries_logits masks_queries_logits = outputs.masks_queries_logits # you can pass them to feature_extractor for postprocessing result = feature_extractor.post_process_panoptic_segmentation(outputs, target_sizes=[image.size[::-1]])[0] # we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs) predicted_panoptic_map = result["segmentation"] ``` For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
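Building on the snippet above, here is a minimal sketch of how the panoptic result could be unpacked. It assumes `result`, `predicted_panoptic_map`, and `model` from the previous block are in scope, and that, as in recent transformers versions, `post_process_panoptic_segmentation` also returns a `segments_info` list whose entries carry an `id` (matching the values in the segmentation map) and a `label_id`.

```python
# each entry in `segments_info` describes one segment in `predicted_panoptic_map`
for segment in result["segments_info"]:
    mask = predicted_panoptic_map == segment["id"]      # boolean mask of this segment
    label = model.config.id2label[segment["label_id"]]  # human-readable class name
    print(f"segment {segment['id']}: {label} ({int(mask.sum())} pixels)")
```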
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["coco"], "widget": [{"src": "http://images.cocodataset.org/val2017/000000039769.jpg", "example_title": "Cats"}, {"src": "http://images.cocodataset.org/val2017/000000039770.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-small-coco
[ "transformers", "pytorch", "safetensors", "maskformer", "vision", "image-segmentation", "dataset:coco", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on COCO panoptic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 61, 96, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (small-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06787470728158951, 0.08055440336465836, -0.0031936734449118376, 0.02969186194241047, 0.1266271024942398, -0.008133294060826302, 0.12046124041080475, 0.06241276115179062, -0.018892493098974228, 0.042726900428533554, 0.04346415027976036, 0.031049152836203575, 0.07624026387929916, 0.13978619873523712, 0.03659550100564957, -0.23752248287200928, 0.04692303016781807, -0.011098111979663372, 0.06237397715449333, 0.108110211789608, 0.11017443984746933, -0.12177514284849167, 0.11653174459934235, 0.08382751047611237, -0.17192260921001434, 0.016135970130562782, 0.008352544158697128, -0.05477660521864891, 0.0941646620631218, 0.024569286033511162, 0.15245553851127625, -0.0239859726279974, 0.06993833929300308, -0.10301990061998367, 0.033201444894075394, 0.0951910987496376, -0.0030797540675848722, 0.04649658873677254, 0.0782795250415802, -0.05692305043339729, 0.11990255117416382, -0.04490913078188896, 0.05191248655319214, 0.06296386569738388, -0.10728614777326584, -0.11907823383808136, -0.056060899049043655, 0.16022859513759613, 0.07189849764108658, 0.06869206577539444, -0.007487384136766195, 0.11593292653560638, -0.021503036841750145, 0.06528764218091965, 0.12046072632074356, -0.11856092512607574, -0.055389631539583206, 0.111718088388443, 0.033540546894073486, -0.09689058363437653, -0.06509353965520859, 0.06606652587652206, 0.015534218400716782, 0.0320967398583889, 0.16114301979541779, -0.056349705904722214, -0.000018837145034922287, -0.06123879551887512, -0.10584789514541626, -0.10497568547725677, 0.04203450679779053, -0.02221025340259075, -0.0671883150935173, -0.21871086955070496, -0.1161251962184906, 0.1470218449831009, -0.023971866816282272, -0.0018743252148851752, 0.027394959703087807, -0.007404129486531019, 0.06762363761663437, -0.07536830008029938, -0.11232069134712219, -0.04721197485923767, -0.013132736086845398, 0.0900585725903511, 0.008433316834270954, 0.06662551313638687, -0.043378494679927826, 0.0937734916806221, -0.11570324748754501, -0.11228550970554352, -0.05805482715368271, -0.13378754258155823, -0.051271844655275345, 0.00205329991877079, -0.04353277385234833, -0.12800775468349457, -0.007048937492072582, 0.1432831734418869, -0.020204707980155945, 0.02034086361527443, 0.004247444216161966, 0.038188330829143524, 0.07088807225227356, 0.15581579506397247, -0.051453713327646255, 0.07588651776313782, 0.04352596402168274, 0.0043254150077700615, 0.07622408121824265, -0.04804486408829689, -0.07883056253194809, -0.006144076120108366, -0.014563457109034061, 0.05673852935433388, 0.023751966655254364, 0.06647226959466934, -0.008974747732281685, -0.05117885395884514, 0.21529598534107208, -0.12438814342021942, 0.008961143903434277, 0.02411281131207943, -0.0022243033163249493, -0.03785021975636482, 0.08399613946676254, -0.030494151636958122, -0.09008067101240158, 0.046533845365047455, -0.07303416728973389, 0.021602287888526917, -0.13048413395881653, -0.10559383779764175, 0.019994881004095078, -0.14261621236801147, -0.0400056391954422, -0.14292505383491516, -0.1635834276676178, -0.018902339041233063, 0.033796317875385284, 0.010056938044726849, 0.062397755682468414, 0.04840382933616638, -0.028548749163746834, -0.029579445719718933, 0.021696144714951515, -0.009970801882445812, -0.011584831401705742, 0.007132493890821934, -0.012660624459385872, 0.04399366304278374, -0.012749582529067993, 0.01833278127014637, -0.10932685434818268, 0.07482381165027618, -0.24003471434116364, 0.06514712423086166, -0.007137188222259283, -0.03536460921168327, -0.020895497873425484, -0.06248747929930687, -0.022684244439005852, 
0.05768584460020065, 0.0023830889258533716, 0.12631548941135406, -0.12949615716934204, -0.028686467558145523, 0.2644081711769104, -0.17972275614738464, -0.06818173825740814, 0.090946726500988, -0.05691072344779968, -0.02292165532708168, 0.052506886422634125, 0.13948285579681396, 0.11526212841272354, -0.1225878894329071, -0.02213515341281891, -0.009905104525387287, -0.10125655680894852, 0.005819111131131649, 0.03505765646696091, -0.035389699041843414, 0.0933743491768837, 0.025189679116010666, -0.06089034304022789, -0.042960211634635925, -0.00525924377143383, -0.05106080323457718, -0.019672786816954613, -0.016400979831814766, 0.04410376772284508, 0.018844304606318474, -0.05240212380886078, -0.03159366548061371, -0.09137975424528122, 0.011743579991161823, 0.05497589334845543, -0.0917818620800972, -0.008190583437681198, -0.07422497868537903, 0.1289198100566864, -0.09662193804979324, -0.004888965282589197, -0.21481992304325104, -0.09065206348896027, 0.0072189755737781525, -0.04825422540307045, 0.033286210149526596, 0.01987546496093273, 0.02365494705736637, 0.06766168028116226, -0.015699855983257294, 0.0049268584698438644, -0.050506364554166794, -0.01963580772280693, -0.02614450640976429, -0.03639282286167145, -0.10363821685314178, -0.06697572022676468, 0.12970013916492462, -0.12497909367084503, 0.06523354351520538, 0.002058034995570779, 0.10013929754495621, 0.0137184401974082, -0.05475474148988724, 0.06107594817876816, -0.0005058388924226165, -0.030856255441904068, -0.06969831138849258, 0.06896399706602097, -0.0013227243907749653, 0.02702159434556961, 0.07960765808820724, -0.2206566035747528, -0.08584044873714447, 0.09838951379060745, -0.06152278557419777, -0.07064647227525711, 0.0032640646677464247, -0.03433094918727875, 0.0061920215375721455, -0.1070312112569809, -0.06575305759906769, 0.1648494303226471, 0.014888951554894447, 0.12971140444278717, -0.11428651958703995, -0.009389818646013737, 0.05520254373550415, -0.04104815796017647, -0.027039995416998863, -0.009558124467730522, 0.05445394292473793, -0.1228807270526886, 0.09430242329835892, 0.06082748621702194, 0.046659234911203384, 0.12997028231620789, 0.022428181022405624, -0.042635686695575714, 0.018669836223125458, -0.022462734952569008, 0.00925952848047018, 0.15325453877449036, -0.0624745674431324, -0.0534338541328907, 0.06874856352806091, -0.037561625242233276, 0.041561417281627655, -0.1284787952899933, 0.04843645542860031, 0.04789614677429199, -0.01155462022870779, -0.02118958905339241, 0.0016489590052515268, 0.028764165937900543, 0.04118838906288147, 0.03461473062634468, -0.009139260277152061, -0.0038187154568731785, -0.0402354970574379, -0.11502707749605179, 0.16861189901828766, -0.05655040964484215, -0.38267624378204346, -0.15638969838619232, 0.04230235517024994, -0.09346462041139603, 0.05442317575216293, 0.00362370815128088, 0.043899573385715485, -0.08096402883529663, -0.12636327743530273, -0.0400807186961174, 0.005494807846844196, -0.09787846356630325, -0.05119936540722847, -0.0008733721333555877, 0.0003445837937761098, -0.13929803669452667, -0.03442642465233803, -0.02280285581946373, -0.02751028910279274, 0.060573503375053406, 0.017043016850948334, 0.10614508390426636, 0.15180960297584534, -0.06836048513650894, -0.00014869490405544639, -0.04806174710392952, 0.1240454763174057, -0.05097142979502678, 0.07939165085554123, 0.2656100392341614, -0.07748398184776306, 0.0846649631857872, 0.08440181612968445, -0.007996561005711555, -0.05193176865577698, -0.014635751023888588, 0.00610476266592741, -0.11313460022211075, -0.10337882488965988, 
-0.09560856223106384, -0.0703829973936081, 0.010516521520912647, 0.06772616505622864, 0.02465311996638775, 0.08173681050539017, 0.08332560211420059, -0.0754856988787651, -0.06989482045173645, 0.013898322358727455, 0.1224878802895546, 0.06481363624334335, -0.01882232539355755, 0.09698513895273209, -0.07910497486591339, -0.023738209158182144, 0.0656287744641304, -0.009436984546482563, 0.15422286093235016, -0.0044843400828540325, 0.02374434471130371, 0.09609529376029968, -0.007690147962421179, 0.08254554867744446, 0.11738397926092148, -0.039435554295778275, 0.013517406769096851, -0.03462129458785057, -0.09305348247289658, -0.12024448066949844, 0.04795403406023979, 0.04119705408811569, -0.008217492140829563, -0.07459317147731781, -0.006902349181473255, 0.055919334292411804, 0.1881285011768341, 0.07570605725049973, -0.25014427304267883, -0.06628502160310745, 0.013283922336995602, 0.024027062579989433, -0.12174037098884583, -0.018566908314824104, 0.14753460884094238, -0.1640806496143341, 0.07616066187620163, -0.054077040404081345, 0.10440690815448761, -0.02687998116016388, 0.012036272324621677, -0.09239497035741806, 0.10941919684410095, 0.013387824408710003, 0.09812524169683456, -0.06441568583250046, 0.09911122173070908, 0.015030101872980595, 0.07334110140800476, -0.1274094432592392, 0.05784014239907265, 0.07087749242782593, 0.10218409448862076, 0.15200336277484894, 0.009800422936677933, -0.022388605400919914, -0.030411843210458755, -0.06711044162511826, 0.012230620719492435, 0.07157275080680847, -0.03922273591160774, 0.0576886348426342, -0.010343614965677261, -0.030086852610111237, -0.01505531556904316, 0.029215095564723015, -0.047551508992910385, -0.1359291672706604, 0.015448080375790596, -0.06626846641302109, -0.06597770005464554, -0.0672253668308258, 0.006870782934129238, -0.048773378133773804, 0.19448749721050262, -0.01511923223733902, -0.11245810985565186, -0.13585937023162842, -0.06473980844020844, 0.04644767940044403, -0.04108433052897453, 0.09943660348653793, -0.061021387577056885, 0.225270614027977, -0.07303377240896225, -0.11601733416318893, 0.020353710278868675, -0.09150295704603195, -0.08449526876211166, -0.02432035654783249, 0.13349616527557373, -0.01946639083325863, -0.004661035258322954, 0.0808958038687706, 0.05366924777626991, -0.043804824352264404, -0.09796459972858429, 0.0035247623454779387, 0.14751750230789185, 0.14753231406211853, 0.13081389665603638, -0.042519163340330124, -0.1944691240787506, -0.017400342971086502, 0.08826325833797455, 0.08645448833703995, 0.10780048370361328, -0.07237020134925842, 0.08869118988513947, 0.1535743921995163, -0.06845267862081528, -0.2960660755634308, -0.007046559825539589, -0.03801553696393967, 0.042265940457582474, 0.035240743309259415, -0.08462242037057877, 0.04469350725412369, 0.03849745914340019, -0.027295567095279694, 0.05351469665765762, -0.2013162523508072, -0.10110550373792648, 0.060807205736637115, 0.10870807617902756, -0.030003560706973076, -0.09540169686079025, -0.021934937685728073, -0.040814489126205444, -0.1509743332862854, 0.09506997466087341, -0.11877341568470001, 0.07477515190839767, -0.0010790885426104069, -0.01788845844566822, 0.033696096390485764, -0.09106916934251785, 0.09677727520465851, -0.06569238007068634, 0.09303703904151917, -0.05292133241891861, -0.014677321538329124, 0.2007157951593399, -0.08423794060945511, 0.1319257616996765, 0.006705335807055235, 0.062482282519340515, -0.08229301869869232, -0.04859849065542221, -0.07448136061429977, 0.11314297467470169, -0.033999983221292496, -0.08397887647151947, 
-0.0628688633441925, 0.03012452833354473, 0.04329436272382736, 0.004253005143254995, 0.0032986565493047237, -0.12458139657974243, 0.05490769073367119, 0.19573208689689636, 0.13839782774448395, -0.048333149403333664, -0.11224056780338287, -0.06805279105901718, -0.01710466854274273, 0.052041903138160706, -0.13398800790309906, 0.06272272765636444, 0.05458484962582588, 0.04791727662086487, 0.061799995601177216, 0.047620225697755814, -0.07194167375564575, 0.017206240445375443, 0.05360948666930199, -0.12366073578596115, -0.1529824286699295, -0.010006627067923546, -0.05647420138120651, -0.09118936210870743, 0.02609507367014885, 0.10042165219783783, -0.05174124613404274, -0.018357185646891594, -0.01488457527011633, 0.025990363210439682, -0.003621619427576661, 0.00659368559718132, 0.057167958468198776, 0.05032262206077576, -0.07527682185173035, 0.09174204617738724, 0.031934820115566254, -0.07077476382255554, 0.017396433278918266, 0.031960777938365936, -0.10768500715494156, -0.08489213138818741, -0.0685553252696991, 0.15332083404064178, -0.037390969693660736, -0.08453120291233063, -0.034708067774772644, -0.0988733097910881, -0.00166728172916919, 0.17046718299388885, 0.056111812591552734, 0.02122947946190834, -0.009111141785979271, 0.010027225129306316, -0.1365564912557602, 0.058541733771562576, -0.026131711900234222, 0.051460832357406616, -0.0920993834733963, 0.13176722824573517, 0.05112322047352791, 0.08442573994398117, -0.06577574461698532, -0.05734255537390709, -0.06695036590099335, 0.007454134523868561, -0.13867461681365967, 0.05650261789560318, -0.0993310883641243, -0.026774290949106216, -0.019109733402729034, 0.08020265400409698, -0.010679798200726509, 0.05680912360548973, -0.0407114140689373, -0.02031770348548889, -0.01604180969297886, 0.016942491754889488, -0.16236871480941772, -0.009888029657304287, 0.016127226874232292, -0.08000655472278595, 0.08738873898983002, 0.004502176772803068, -0.07410867512226105, -0.02275020070374012, -0.099429190158844, -0.07820470631122589, 0.0348152257502079, 0.036287907510995865, -0.009595636278390884, -0.06860432028770447, 0.03581671044230461, 0.02128414623439312, -0.05107302963733673, -0.013774391263723373, 0.063148133456707, -0.08601634949445724, 0.08715562522411346, 0.05443360656499863, -0.04474452883005142, -0.030977318063378334, 0.07024598866701126, 0.06873117387294769, 0.0418216772377491, 0.07819385826587677, -0.02120021916925907, 0.08697100728750229, -0.15036620199680328, -0.01829674281179905, -0.01669168658554554, -0.05159439519047737, 0.008180296048521996, -0.044448837637901306, 0.035583700984716415, -0.020416954532265663, 0.19172899425029755, 0.08721976727247238, 0.08589433133602142, 0.013684242963790894, 0.056388869881629944, 0.036858536303043365, 0.043914083391427994, 0.09098749607801437, -0.04403199255466461, -0.046575773507356644, 0.0688297301530838, 0.028106892481446266, 0.010249221697449684, -0.008969718590378761, 0.16793006658554077, 0.13784709572792053, 0.07362750172615051, 0.06131807342171669, 0.08752905577421188, 0.020265202969312668, -0.158965066075325, -0.14411310851573944, 0.02514093555510044, 0.05580756440758705, -0.056485045701265335, -0.04500499367713928, 0.15945087373256683, -0.1086808443069458, 0.11859595775604248, 0.0012444533640518785, -0.04615774750709534, -0.056202180683612823, -0.1613929718732834, -0.03202855587005615, 0.05049706622958183, -0.00040697341319173574, -0.06546074897050858, -0.0029612863436341286, 0.17107313871383667, -0.03131505846977234, -0.07569229602813721, 0.19866813719272614, -0.15086211264133453, 
-0.04669591411948204, 0.049960482865571976, 0.03985423594713211, 0.04797041416168213, 0.038811638951301575, 0.0258944034576416, 0.059082742780447006, 0.0388992577791214, 0.08261215686798096, 0.029938489198684692, 0.11916720867156982, 0.028842341154813766, -0.06494774669408798, -0.060183387249708176, 0.006838284432888031, 0.0194312185049057, 0.005777264479547739, 0.07122005522251129, 0.021864838898181915, -0.036552801728248596, -0.007394094485789537, 0.11595381051301956, -0.06492564082145691, -0.12379313260316849, -0.1321330964565277, 0.2743074595928192, -0.037171371281147, -0.009978930465877056, 0.015713488683104515, -0.09659601747989655, 0.01984976977109909, 0.21643324196338654, 0.20641356706619263, -0.03126472234725952, 0.03696274012327194, -0.030904151499271393, 0.006135584786534309, -0.010734555311501026, 0.14061754941940308, -0.012728998437523842, 0.27173614501953125, -0.033313099294900894, 0.07273222506046295, 0.006593227386474609, -0.049271076917648315, -0.1273714154958725, 0.059893060475587845, -0.016673587262630463, -0.04019307345151901, -0.03490934148430824, 0.03626948222517967, -0.036500971764326096, -0.10465409606695175, 0.11521216481924057, -0.06796574592590332, -0.050759755074977875, 0.02646082453429699, -0.026183467358350754, -0.004874764941632748, 0.06754109263420105, -0.022870326414704323, 0.011615470983088017, 0.15828727185726166, 0.010371142067015171, -0.07135706394910812, -0.006896952632814646, 0.026824140921235085, -0.12845376133918762, 0.24500694870948792, -0.007901218719780445, 0.04812011495232582, 0.09488588571548462, 0.02771225944161415, -0.09305760264396667, 0.026780186221003532, 0.009557388722896576, -0.028656039386987686, 0.03221513703465462, 0.10095500946044922, -0.004397368058562279, -0.08748482912778854, -0.0023706017527729273, -0.08448117971420288, 0.028438199311494827, 0.048140861093997955, 0.011578348465263844, -0.03855597972869873, 0.04961021617054939, -0.09835875034332275, 0.13265332579612732, 0.08502195030450821, 0.029581159353256226, -0.028948592022061348, -0.044561803340911865, 0.043276943266391754, 0.03524019569158554, 0.02562246471643448, -0.0374840646982193, -0.10379296541213989, 0.0016099753556773067, -0.010376364924013615, -0.005616934038698673, -0.2795261740684509, -0.0480484776198864, -0.017513694241642952, -0.020759034901857376, -0.009398045018315315, 0.022531025111675262, 0.07218582928180695, 0.045667171478271484, -0.03118637390434742, -0.009683328680694103, -0.007013519294559956, 0.12402734160423279, -0.07813414186239243, -0.07465799897909164 ]
null
null
transformers
# MaskFormer MaskFormer model trained on ADE20k semantic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png) ## Intended uses & limitations You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ```python from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation from PIL import Image import requests url = "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg" image = Image.open(requests.get(url, stream=True).raw) feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-tiny-ade") inputs = feature_extractor(images=image, return_tensors="pt") model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-tiny-ade") outputs = model(**inputs) # model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)` # and masks_queries_logits of shape `(batch_size, num_queries, height, width)` class_queries_logits = outputs.class_queries_logits masks_queries_logits = outputs.masks_queries_logits # you can pass them to feature_extractor for postprocessing # we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs) predicted_semantic_map = feature_extractor.post_process_semantic_segmentation(outputs, target_sizes=[image.size[::-1]])[0] ``` For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
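To make the output above easier to eyeball, here is a minimal sketch that overlays the predicted map on the input image. The random palette is an arbitrary choice for illustration (not the official ADE20k color scheme), and `image` and `predicted_semantic_map` are assumed from the previous block.

```python
import numpy as np
from PIL import Image

seg = predicted_semantic_map.numpy()  # (height, width) label ids in [0, 149]
palette = np.random.RandomState(0).randint(0, 255, size=(150, 3), dtype=np.uint8)
color_seg = palette[seg]  # (height, width, 3) color image
overlay = (0.5 * np.asarray(image) + 0.5 * color_seg).astype(np.uint8)
Image.fromarray(overlay).save("overlay.png")
```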
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["scene_parse_150"], "widget": [{"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg", "example_title": "House"}, {"src": "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-tiny-ade
[ "transformers", "pytorch", "safetensors", "maskformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on ADE20k semantic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for semantic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 66, 97, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-scene_parse_150 #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on ADE20k semantic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06384747475385666, 0.10556399077177048, -0.0043688383884727955, 0.028315884992480278, 0.14291459321975708, -0.010210519656538963, 0.14020968973636627, 0.07217652350664139, -0.012955115176737309, 0.06760744005441666, 0.05617155879735947, 0.0007200576947070658, 0.08257751911878586, 0.13784350454807281, 0.03718039020895958, -0.24664035439491272, 0.05560225248336792, -0.057297274470329285, 0.060396648943424225, 0.08929286152124405, 0.10591687262058258, -0.11748816817998886, 0.11691198498010635, 0.08460671454668045, -0.14273159205913544, 0.00994829647243023, -0.010333920828998089, -0.07196404039859772, 0.09878585487604141, 0.029308414086699486, 0.1306494176387787, -0.05031871050596237, 0.08053219318389893, -0.10278896987438202, 0.034813787788152695, 0.11792206019163132, 0.00386661640368402, 0.04867672920227051, 0.053139105439186096, -0.029291192069649696, 0.12061531096696854, -0.05262339115142822, 0.043231431394815445, 0.056516360491514206, -0.1431286633014679, -0.12679050862789154, -0.06211381033062935, 0.14892618358135223, 0.07306768745183945, 0.0689505785703659, -0.0002537165128160268, 0.07999759167432785, 0.005523784551769495, 0.0671926811337471, 0.1419825255870819, -0.09581377357244492, -0.043030135333538055, 0.09462830424308777, 0.04010138660669327, -0.07356975972652435, -0.06396672874689102, 0.05972352623939514, 0.0061225248500704765, 0.028518270701169968, 0.16532765328884125, -0.031377341598272324, 0.0756731852889061, -0.05226733908057213, -0.11436150968074799, -0.10432267189025879, 0.028110386803746223, -0.004162553232163191, -0.07853802293539047, -0.19280828535556793, -0.12722085416316986, 0.14015842974185944, -0.01071991492062807, -0.011728929355740547, 0.022006453946232796, 0.015383469872176647, 0.06668969243764877, -0.1170186996459961, -0.0849071592092514, -0.04252767562866211, 0.008464573882520199, 0.090079165995121, -0.002079681260511279, 0.08967158943414688, -0.07502304762601852, 0.08287987112998962, -0.18532755970954895, -0.10174048691987991, -0.061302147805690765, -0.13801157474517822, -0.04819749668240547, 0.00009064269397640601, -0.002154780086129904, -0.17453017830848694, -0.027314329519867897, 0.11226172000169754, -0.023086462169885635, 0.02211010828614235, 0.023763714358210564, 0.034940071403980255, 0.07187673449516296, 0.17048577964305878, -0.06073834002017975, 0.039748068898916245, 0.03519099950790405, 0.02952706441283226, 0.07204223424196243, -0.05415241792798042, -0.06919533759355545, 0.013494241051375866, -0.030048003420233727, 0.05620800331234932, 0.026743071153759956, 0.06221400946378708, 0.01005939394235611, -0.040477097034454346, 0.15656858682632446, -0.11831508576869965, 0.021947013214230537, 0.030627872794866562, -0.029991403222084045, -0.057847533375024796, 0.0868103951215744, -0.05436720699071884, -0.08918511867523193, 0.050097472965717316, -0.06476711481809616, 0.009396018460392952, -0.15088172256946564, -0.12757907807826996, 0.009352615103125572, -0.13581566512584686, -0.057721950113773346, -0.12249007076025009, -0.18041512370109558, -0.005891287699341774, 0.04720921069383621, 0.02215307392179966, 0.06813265383243561, 0.04438434913754463, -0.037009187042713165, -0.012469043023884296, 0.04353559389710426, -0.07220181822776794, -0.002117630559951067, -0.004788652993738651, -0.008492663502693176, 0.06922445446252823, -0.028108196333050728, 0.029807135462760925, -0.09948014467954636, 0.07821682095527649, -0.25980305671691895, 0.07529517263174057, 0.006338991690427065, -0.03624631091952324, -0.0271504707634449, -0.049734652042388916, -0.018803521990776062, 
0.03960149362683296, 0.03464532643556595, 0.13706953823566437, -0.16706804931163788, -0.026395857334136963, 0.24739277362823486, -0.1677519530057907, -0.05795199051499367, 0.09082860499620438, -0.06771651655435562, -0.005114639177918434, 0.07836344093084335, 0.13415586948394775, 0.08328506350517273, -0.11842572689056396, -0.015130321495234966, -0.013469317927956581, -0.1317942589521408, 0.026976054534316063, 0.028357166796922684, -0.0501190684735775, 0.10995808988809586, 0.014669724740087986, -0.06499619036912918, -0.03957698866724968, -0.015812691301107407, -0.045583583414554596, -0.0057167354971170425, -0.005930979270488024, 0.03872063383460045, -0.009525824338197708, -0.08684135228395462, -0.04047056660056114, -0.09731344133615494, 0.05165182799100876, 0.04601629078388214, -0.08121688663959503, -0.013753337785601616, -0.05633332580327988, 0.13624736666679382, -0.09188777208328247, 0.002099098637700081, -0.17974144220352173, -0.09043733775615692, 0.0015994751593098044, -0.07587993144989014, 0.051676057279109955, -0.017208900302648544, 0.035709675401449203, 0.08754298835992813, -0.0004388496745377779, -0.011291137896478176, -0.08019410073757172, -0.012312062084674835, -0.031439829617738724, -0.040647733956575394, -0.1015859991312027, -0.07958544045686722, 0.08260759711265564, -0.12126229703426361, 0.0727701410651207, -0.01907975599169731, 0.07422614842653275, 0.027436377480626106, -0.07059073448181152, 0.05140354111790657, -0.023418596014380455, -0.02133217453956604, -0.08475232124328613, 0.05910506844520569, 0.008192974142730236, 0.03627251461148262, 0.07463143020868301, -0.2536069452762604, -0.1301012486219406, 0.07938900589942932, -0.06455546617507935, -0.06968145072460175, 0.0336042083799839, -0.012183663435280323, 0.018385034054517746, -0.12463579326868057, -0.08912423998117447, 0.1343318074941635, 0.02279684878885746, 0.14361727237701416, -0.10797067731618881, 0.010000523179769516, 0.04807889834046364, -0.05174947530031204, -0.03784770518541336, -0.011393524706363678, 0.04455976560711861, -0.12812381982803345, 0.10595453530550003, 0.09372740238904953, 0.0671098455786705, 0.16543909907341003, 0.034774407744407654, -0.04282962903380394, -0.011569916270673275, -0.002168514532968402, -0.011695480905473232, 0.14082355797290802, -0.07831994444131851, -0.045571211725473404, 0.04809707775712013, -0.03232216462492943, 0.026702582836151123, -0.130046084523201, 0.057323116809129715, 0.07727763056755066, -0.021130848675966263, -0.016064291819930077, -0.021016037091612816, 0.02216271124780178, 0.04334428161382675, 0.06705965846776962, -0.027114134281873703, 0.003766761627048254, -0.039996836334466934, -0.11003191024065018, 0.15374945104122162, -0.06932401657104492, -0.3606067895889282, -0.17341788113117218, 0.023103322833776474, -0.07883252948522568, 0.05698881670832634, 0.012092124670743942, 0.01766902767121792, -0.0688234269618988, -0.12000027298927307, 0.006441670004278421, 0.0008580347057431936, -0.10684625804424286, -0.012238696217536926, -0.006446585524827242, -0.015427659265697002, -0.13731317222118378, -0.043359559029340744, -0.0175488144159317, -0.006008405238389969, 0.03682535141706467, 0.062095336616039276, 0.09778372198343277, 0.1285082995891571, -0.08599860221147537, 0.010241744108498096, -0.031053613871335983, 0.1798207312822342, -0.05094144865870476, 0.12537617981433868, 0.24819685518741608, -0.09866157919168472, 0.11324942111968994, 0.10246140509843826, -0.00014438848302233964, -0.03378298133611679, -0.003269307781010866, 0.017214758321642876, -0.1285555064678192, 
-0.1192302480340004, -0.06498877704143524, -0.05298096314072609, 0.006711880210787058, 0.04011918231844902, 0.018794648349285126, 0.10372617840766907, 0.06373737007379532, -0.05645361915230751, -0.072203628718853, 0.020852183923125267, 0.12475031614303589, 0.046204887330532074, -0.03348477929830551, 0.08871123194694519, -0.0816931203007698, -0.03552582859992981, 0.06061335280537605, -0.056386400014162064, 0.19079308211803436, 0.03112678788602352, 0.03783662989735603, 0.0776074007153511, -0.054033368825912476, 0.08575861901044846, 0.11091519892215729, -0.03258133679628372, 0.010228549130260944, -0.045892383903265, -0.10330364853143692, -0.08212228864431381, 0.06630093604326248, 0.01905982382595539, 0.02039315737783909, -0.07678000628948212, 0.06271452456712723, 0.060794245451688766, 0.22857707738876343, 0.0896867960691452, -0.24115216732025146, -0.06798399239778519, 0.010921094566583633, 0.01393758226186037, -0.1311834752559662, -0.030888017266988754, 0.15639930963516235, -0.16593722999095917, 0.04432480037212372, -0.05187347158789635, 0.10663669556379318, -0.024010876193642616, 0.017784176394343376, -0.05948568508028984, 0.1469094604253769, -0.014583299867808819, 0.0638895109295845, -0.05665784329175949, 0.09677944332361221, 0.028934666886925697, 0.09925985336303711, -0.0990804061293602, 0.060489822179079056, 0.08821087330579758, 0.08381308615207672, 0.13866470754146576, 0.025100121274590492, -0.04836202412843704, -0.01963789202272892, -0.04388388246297836, -0.00991881638765335, 0.09242882579565048, -0.0021173767745494843, 0.06791096180677414, -0.014077474363148212, -0.021089840680360794, -0.0007469325209967792, 0.07878424227237701, -0.06583038717508316, -0.10244923830032349, 0.004713188856840134, -0.03591114282608032, -0.039726193994283676, -0.06408637017011642, 0.021696457639336586, -0.0626247376203537, 0.18382354080677032, 0.0067960419692099094, -0.10781113803386688, -0.13415572047233582, -0.049114927649497986, 0.01930667832493782, -0.057421956211328506, 0.09578043222427368, -0.0891142189502716, 0.19979412853717804, -0.0671975165605545, -0.1224089190363884, 0.022938953712582588, -0.0870317667722702, -0.06490468233823776, -0.023825764656066895, 0.10873974859714508, 0.011710567399859428, -0.022701667621731758, 0.05864523723721504, 0.05966612324118614, -0.03313460201025009, -0.09960639476776123, 0.0378606840968132, 0.16165298223495483, 0.07149402797222137, 0.1497168391942978, -0.054623521864414215, -0.20286418497562408, -0.009288329631090164, 0.08579860627651215, 0.13650862872600555, 0.1328677237033844, -0.07590802013874054, 0.06371944397687912, 0.18505825102329254, -0.08417164534330368, -0.2886391580104828, 0.02712382934987545, -0.04494024068117142, 0.04520423710346222, 0.05321290343999863, -0.10569234937429428, 0.025451792404055595, 0.06306225806474686, -0.010370409116148949, 0.042234547436237335, -0.20185619592666626, -0.08353972434997559, 0.0708000436425209, 0.07594482600688934, -0.05867570638656616, -0.10118257999420166, -0.03487667441368103, -0.02761375717818737, -0.10903051495552063, 0.09252610057592392, -0.11393572390079498, 0.08253481984138489, 0.01198657788336277, -0.0021878161933273077, 0.038022078573703766, -0.08027078956365585, 0.09660610556602478, -0.08722065389156342, 0.09525970369577408, -0.06026284769177437, -0.053407903760671616, 0.214838445186615, -0.11235116422176361, 0.1515020728111267, 0.017517024651169777, 0.06089070439338684, -0.0785461887717247, -0.02597891539335251, -0.07122784107923508, 0.10428796708583832, -0.04102948307991028, -0.09286878257989883, 
-0.0762743353843689, 0.028099870309233665, 0.052710775285959244, 0.008146793581545353, 0.008021730929613113, -0.0967491865158081, 0.0557558573782444, 0.20241385698318481, 0.0783037468791008, -0.017856910824775696, -0.11607807129621506, -0.05487850308418274, 0.0008743969956412911, 0.0424712635576725, -0.1388431042432785, 0.05763233080506325, 0.05241652578115463, 0.0659494623541832, 0.11430864036083221, 0.04675424471497536, -0.10925938934087753, 0.01578769087791443, 0.03507782146334648, -0.12182807177305222, -0.17677192389965057, -0.029877932742238045, -0.04723804071545601, -0.07572630047798157, 0.019075898453593254, 0.10815133899450302, -0.0639030784368515, 0.00544006610289216, -0.03824243322014809, 0.041553664952516556, -0.014633575454354286, -0.007884728722274303, 0.07640738040208817, 0.041740696877241135, -0.053347837179899216, 0.10491213202476501, 0.021814163774251938, -0.06885819137096405, 0.030452119186520576, 0.07429544627666473, -0.08163082599639893, -0.0718829557299614, -0.05106760933995247, 0.1817702353000641, -0.029182691127061844, -0.09065227955579758, -0.015012571588158607, -0.07995852828025818, 0.019614223390817642, 0.17738094925880432, 0.02898426540195942, -0.011096947826445103, -0.014736977405846119, 0.031238412484526634, -0.10552623867988586, 0.08061409741640091, -0.04082262143492699, 0.03927690535783768, -0.08030695468187332, 0.13366645574569702, 0.033301759511232376, 0.08183993399143219, -0.05699103698134422, -0.08800902217626572, -0.07525991648435593, 0.011688733473420143, -0.1510889232158661, 0.0313308909535408, -0.09327343106269836, -0.03801937401294708, -0.019268935546278954, 0.08314814418554306, 0.0051836250349879265, 0.051667407155036926, -0.031513046473264694, -0.042801253497600555, -0.017495172098279, 0.004106062930077314, -0.17871737480163574, -0.008847574703395367, 0.009825664572417736, -0.059009820222854614, 0.07927703112363815, -0.005569349974393845, -0.06806085258722305, -0.052263595163822174, -0.08993513137102127, -0.0646437555551529, 0.029075687751173973, 0.011138408444821835, 0.019948698580265045, -0.07275006175041199, 0.03167873993515968, -0.01703498698771, -0.069263756275177, -0.02601690962910652, 0.06441686302423477, -0.08001970499753952, 0.08676458895206451, 0.06806144118309021, -0.039139844477176666, -0.02892375737428665, 0.08691219240427017, 0.05721476301550865, 0.03860173746943474, 0.07648277282714844, -0.0011232285760343075, 0.09604693204164505, -0.14453864097595215, -0.012485913001000881, -0.007567529566586018, -0.03580285236239433, 0.021415172144770622, -0.02778579294681549, 0.022790713235735893, -0.04227282106876373, 0.12929357588291168, 0.11927873641252518, 0.03462297469377518, 0.021182255819439888, 0.035243093967437744, -0.029986752197146416, 0.029822401702404022, 0.05087447911500931, -0.05290517956018448, -0.06349821388721466, 0.05199523642659187, 0.024456383660435677, 0.008751525543630123, -0.00690044229850173, 0.15793228149414062, 0.13535498082637787, 0.11149102449417114, 0.07739389687776566, 0.09147760272026062, 0.029075294733047485, -0.17511866986751556, -0.1691252738237381, 0.0597708560526371, 0.07072577625513077, -0.05948559567332268, -0.04156772419810295, 0.16435033082962036, -0.09467843174934387, 0.14563307166099548, -0.012942872010171413, -0.04709654673933983, -0.07478632032871246, -0.14700010418891907, -0.018294882029294968, 0.06391748785972595, -0.012857676483690739, -0.06918801367282867, 0.024672623723745346, 0.1348930299282074, -0.021818241104483604, -0.08693034201860428, 0.1872386783361435, -0.15905144810676575, 
-0.06868154555559158, 0.07427351176738739, 0.03656235337257385, 0.05008270964026451, 0.03616149723529816, 0.02574743516743183, 0.05481167137622833, 0.03759278357028961, 0.07125885039567947, 0.05072750151157379, 0.10247119516134262, 0.04383762553334236, -0.0819953978061676, -0.07206063717603683, 0.01601720042526722, 0.02157735638320446, -0.01868237927556038, 0.06360266357660294, 0.015618491917848587, -0.050393082201480865, -0.0017071266192942858, 0.1214909479022026, -0.07490161061286926, -0.10296495258808136, -0.14212679862976074, 0.2709590792655945, -0.05375608056783676, 0.016310222446918488, -0.020358290523290634, -0.12027959525585175, 0.002652684925124049, 0.17384429275989532, 0.22092051804065704, -0.0450061671435833, 0.0308344978839159, -0.04988190159201622, 0.004783210344612598, -0.012573481537401676, 0.12291256338357925, -0.010418337769806385, 0.24804505705833435, -0.03194276988506317, 0.1585264950990677, -0.029418792575597763, -0.07184520363807678, -0.12576702237129211, 0.07148009538650513, 0.0031395454425364733, -0.0120179932564497, -0.042893681675195694, 0.06178038939833641, -0.036288224160671234, -0.13885687291622162, 0.08267378062009811, -0.051196955144405365, -0.05305579677224159, 0.03745921328663826, -0.031415216624736786, 0.01654326543211937, 0.06727532297372818, -0.01144202146679163, 0.008657210506498814, 0.14597086608409882, 0.01577019691467285, -0.05518938601016998, -0.05553821474313736, 0.030811967328190804, -0.15552560985088348, 0.21678471565246582, 0.00982680730521679, -0.008706377819180489, 0.1056932583451271, 0.018627436831593513, -0.11001303046941757, 0.04923953860998154, 0.013248992152512074, -0.01661001332104206, -0.0037334198132157326, 0.10372210294008255, -0.007007411681115627, -0.06761997193098068, -0.011677517555654049, -0.08937657624483109, 0.016435371711850166, 0.06758935004472733, -0.014816111885011196, -0.05541460961103439, 0.058415260165929794, -0.08714079111814499, 0.12054099887609482, 0.1003587543964386, 0.03604299947619438, -0.00905567780137062, -0.04857911542057991, 0.03400444611907005, 0.04826054349541664, -0.0022809109650552273, -0.030290042981505394, -0.08998541533946991, 0.011903025209903717, 0.015193029306828976, 0.0112891411408782, -0.2609070837497711, -0.0600457563996315, -0.0245790034532547, -0.008550142869353294, 0.011958157643675804, 0.018630772829055786, 0.12651099264621735, 0.031660180538892746, -0.023326698690652847, 0.011746336705982685, -0.01051483303308487, 0.13445641100406647, -0.07153353840112686, -0.0693325623869896 ]
null
null
transformers
# MaskFormer MaskFormer model trained on COCO panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169). Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png) ## Intended uses & limitations You can use this particular checkpoint for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ```python 
from transformers import MaskFormerFeatureExtractor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

# load MaskFormer fine-tuned on COCO panoptic segmentation
feature_extractor = MaskFormerFeatureExtractor.from_pretrained("facebook/maskformer-swin-tiny-coco")
model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-tiny-coco")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = feature_extractor(images=image, return_tensors="pt")

outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to feature_extractor for postprocessing
result = feature_extractor.post_process_panoptic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]

# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_panoptic_map = result["segmentation"]
``` For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
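The panoptic post-processing above also returns per-segment metadata alongside the segmentation map. As a minimal sketch (assuming the `segments_info` entries carry `id`, `label_id`, and `score` keys, as in recent `transformers` releases), the predicted labels can be read back like this:

```python
# map each predicted segment id back to a human-readable COCO label;
# "result" and "model" come from the snippet above
for segment in result["segments_info"]:
    label = model.config.id2label[segment["label_id"]]
    print(f"segment {segment['id']}: {label} (score {segment['score']:.2f})")
```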
{"license": "other", "tags": ["vision", "image-segmentation"], "datasets": ["coco"], "widget": [{"src": "http://images.cocodataset.org/val2017/000000039769.jpg", "example_title": "Cats"}, {"src": "http://images.cocodataset.org/val2017/000000039770.jpg", "example_title": "Castle"}]}
image-segmentation
facebook/maskformer-swin-tiny-coco
[ "transformers", "pytorch", "safetensors", "maskformer", "vision", "image-segmentation", "dataset:coco", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2107.06278" ]
[]
TAGS #transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us
# MaskFormer MaskFormer model trained on COCO panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. !model image ## Intended uses & limitations You can use this particular checkpoint for panoptic segmentation. See the model hub to look for other fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: For more code examples, we refer to the documentation.
[ "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ "TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n", "# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image", "## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ 61, 95, 56, 44, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #maskformer #vision #image-segmentation #dataset-coco #arxiv-2107.06278 #license-other #endpoints_compatible #has_space #region-us \n# MaskFormer\n\nMaskFormer model trained on COCO panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper Per-Pixel Classification is Not All You Need for Semantic Segmentation and first released in this repository. \n\nDisclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nMaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.\n\n!model image## Intended uses & limitations\n\nYou can use this particular checkpoint for semantic segmentation. See the model hub to look for other\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:\n\n\n\nFor more code examples, we refer to the documentation." ]
[ -0.06278284639120102, 0.0732455924153328, -0.003274020506069064, 0.029520869255065918, 0.11789001524448395, -0.013374492526054382, 0.11749519407749176, 0.060695335268974304, -0.02185933291912079, 0.04962999373674393, 0.042187489569187164, 0.03522671386599541, 0.0732070654630661, 0.1347944438457489, 0.029891561716794968, -0.23373037576675415, 0.04848362132906914, -0.012523497454822063, 0.05859902501106262, 0.10815738886594772, 0.10717184096574783, -0.11841895431280136, 0.11572793871164322, 0.08531913161277771, -0.1714286208152771, 0.011403343640267849, 0.011451552622020245, -0.05393478274345398, 0.09606844931840897, 0.027814412489533424, 0.15906335413455963, -0.02389884740114212, 0.0670885518193245, -0.11479781568050385, 0.03136923909187317, 0.09659167379140854, -0.0060579474084079266, 0.045556116849184036, 0.08471956104040146, -0.058064986020326614, 0.11759128421545029, -0.048785097897052765, 0.05630858615040779, 0.06565737724304199, -0.1093481108546257, -0.11223143339157104, -0.06142564117908478, 0.1548195779323578, 0.06940890103578568, 0.07194463908672333, -0.011215750128030777, 0.11845884472131729, -0.03184439241886139, 0.06845145672559738, 0.11849020421504974, -0.1269586980342865, -0.05540744587779045, 0.11854568868875504, 0.03582898527383804, -0.10020418465137482, -0.06515072286128998, 0.0721280425786972, 0.01843508519232273, 0.031620096415281296, 0.157435804605484, -0.05767754837870598, -0.0024188472889363766, -0.06344857066869736, -0.1085723489522934, -0.1114552915096283, 0.04870001971721649, -0.022493600845336914, -0.06782418489456177, -0.21840891242027283, -0.11413431167602539, 0.14560623466968536, -0.024483423680067062, -0.0030865247827023268, 0.026066793128848076, 0.0002939273836091161, 0.06189366802573204, -0.06975550204515457, -0.11331198364496231, -0.04797685891389847, -0.01344626396894455, 0.09932531416416168, 0.01001958642154932, 0.05799354240298271, -0.040258169174194336, 0.09113376587629318, -0.11102307587862015, -0.10485890507698059, -0.0585135892033577, -0.12748432159423828, -0.060660697519779205, -0.0009586536907590926, -0.04934300109744072, -0.12812554836273193, -0.0023129729088395834, 0.15191201865673065, -0.018002009019255638, 0.0183707308024168, -0.0035927824210375547, 0.03696780279278755, 0.06473930180072784, 0.15762372314929962, -0.0531904436647892, 0.08557842671871185, 0.04507742077112198, -0.006883535068482161, 0.08029725402593613, -0.05169651284813881, -0.08223196119070053, 0.00023974728537723422, -0.009024243801832199, 0.06587855517864227, 0.033921487629413605, 0.06962762027978897, -0.020013025030493736, -0.04638417065143585, 0.2097577005624771, -0.12206081300973892, 0.010523641481995583, 0.024241933599114418, 0.004405423533171415, -0.0446564145386219, 0.08043171465396881, -0.02587777003645897, -0.08972927182912827, 0.05197787657380104, -0.07547346502542496, 0.021115511655807495, -0.1250186711549759, -0.1052960604429245, 0.024532882496714592, -0.13331276178359985, -0.0426168367266655, -0.141480952501297, -0.1529727280139923, -0.01842631958425045, 0.03130987659096718, 0.008498001843690872, 0.06255709379911423, 0.04907441511750221, -0.03217295557260513, -0.027868114411830902, 0.02297314815223217, -0.010722770355641842, -0.015700437128543854, 0.009038643911480904, -0.014703535474836826, 0.03914591297507286, -0.007066528312861919, 0.018461374565958977, -0.10879739373922348, 0.0804247334599495, -0.22807562351226807, 0.07068288326263428, -0.014002588577568531, -0.035150252282619476, -0.022735925391316414, -0.05967572331428528, -0.024259157478809357, 
0.05796373263001442, -0.004719664808362722, 0.12257427722215652, -0.12175384163856506, -0.029913337901234627, 0.26869168877601624, -0.17541253566741943, -0.06636686623096466, 0.0980515107512474, -0.05419082194566727, -0.035984478890895844, 0.05327940732240677, 0.14729511737823486, 0.10256102681159973, -0.1302875131368637, -0.02495265007019043, -0.005508595146238804, -0.10098078101873398, 0.01951901987195015, 0.04003669694066048, -0.04334050789475441, 0.09639067202806473, 0.0224296934902668, -0.06313212215900421, -0.044298041611909866, -0.002600608393549919, -0.05100390315055847, -0.022775260731577873, -0.010923533700406551, 0.047065962105989456, 0.014700287953019142, -0.0542604923248291, -0.031819943338632584, -0.09378417581319809, 0.004422017838805914, 0.05420643463730812, -0.09010876715183258, -0.002952696057036519, -0.07841768115758896, 0.12822233140468597, -0.09841931611299515, -0.0051390742883086205, -0.21950244903564453, -0.10461588948965073, 0.014251681976020336, -0.05208154767751694, 0.033506959676742554, 0.023789633065462112, 0.01916704885661602, 0.06581269204616547, -0.015317119657993317, 0.005271201021969318, -0.05597745627164841, -0.0208431463688612, -0.026526963338255882, -0.03809038922190666, -0.10292264819145203, -0.06978835165500641, 0.12675245106220245, -0.12427777051925659, 0.06371501833200455, 0.021025819703936577, 0.10306980460882187, 0.017845969647169113, -0.057354219257831573, 0.06330527365207672, 0.008064488880336285, -0.03699524328112602, -0.07178632915019989, 0.0671548843383789, -0.004871444310992956, 0.017787745222449303, 0.07916048914194107, -0.22002315521240234, -0.08190987259149551, 0.09760430455207825, -0.06631189584732056, -0.06961286813020706, 0.00298686814494431, -0.03694111481308937, 0.007344077341258526, -0.11006227135658264, -0.06511424481868744, 0.1685219556093216, 0.015876300632953644, 0.1255737841129303, -0.11509085446596146, -0.008550433441996574, 0.05160323902964592, -0.04089794307947159, -0.033070098608732224, -0.012918253429234028, 0.047723717987537384, -0.12038025259971619, 0.09813246876001358, 0.05541288107633591, 0.048816777765750885, 0.12095225602388382, 0.021962802857160568, -0.041991427540779114, 0.019870588555932045, -0.010470286943018436, 0.01855209283530712, 0.1464572697877884, -0.06522393226623535, -0.0520550012588501, 0.06734687834978104, -0.031834810972213745, 0.040597278624773026, -0.1346166580915451, 0.04545343667268753, 0.04763074219226837, -0.013911549933254719, -0.02116130106151104, -0.0014242730103433132, 0.023959733545780182, 0.040100276470184326, 0.03870223090052605, -0.004567893221974373, -0.0015150250401347876, -0.03897901624441147, -0.11486208438873291, 0.1716768890619278, -0.05717119574546814, -0.3917543590068817, -0.1566348373889923, 0.03446005657315254, -0.09904638677835464, 0.05803757905960083, 0.006773048080503941, 0.040494829416275024, -0.08451467007398605, -0.12270419299602509, -0.035519085824489594, 0.003974921070039272, -0.09519518166780472, -0.05471440404653549, 0.002050525974482298, 0.0016332714585587382, -0.1402416080236435, -0.03190947696566582, -0.02207573689520359, -0.02764430269598961, 0.0553373247385025, 0.011526862159371376, 0.11662327498197556, 0.1564965397119522, -0.05612720921635628, -0.0018999929307028651, -0.049910593777894974, 0.12007296830415726, -0.05723894387483597, 0.07263477146625519, 0.2552565336227417, -0.07359915971755981, 0.08339130878448486, 0.08787643909454346, -0.006411200854927301, -0.05997083708643913, -0.010678019374608994, 0.012477939017117023, -0.11041084676980972, 
-0.11119593679904938, -0.10484597831964493, -0.07560696452856064, 0.01038733683526516, 0.06730403751134872, 0.030184293165802956, 0.08811774849891663, 0.07974386215209961, -0.08198501169681549, -0.06002530828118324, 0.018165554851293564, 0.12428171932697296, 0.07247260957956314, -0.01821848750114441, 0.09626887738704681, -0.07560797780752182, -0.02754366211593151, 0.06620213389396667, -0.002136093331500888, 0.1543346792459488, -0.002448070328682661, 0.04543331637978554, 0.09786416590213776, -0.0017892135074362159, 0.07945127785205841, 0.10854876041412354, -0.032593246549367905, 0.012539876624941826, -0.033082328736782074, -0.09305831044912338, -0.11484736949205399, 0.047515880316495895, 0.05241083353757858, -0.017496151849627495, -0.06804607808589935, -0.013649570755660534, 0.05810900405049324, 0.18551285564899445, 0.07552766799926758, -0.24722564220428467, -0.06498932838439941, 0.01584240421652794, 0.02812783420085907, -0.1162693202495575, -0.015211071819067001, 0.1494692713022232, -0.16502542793750763, 0.07954542338848114, -0.05602985993027687, 0.10813148319721222, -0.027773825451731682, 0.008592162281274796, -0.10271715372800827, 0.107160784304142, 0.012278854846954346, 0.10404780507087708, -0.07372382283210754, 0.10323609411716461, 0.016283931210637093, 0.07466147840023041, -0.12670157849788666, 0.06192236766219139, 0.07071732729673386, 0.11312290281057358, 0.15735039114952087, 0.008340143598616123, -0.022002313286066055, -0.03719104081392288, -0.0718379095196724, 0.008741732686758041, 0.07615789771080017, -0.03759827837347984, 0.05790433660149574, -0.010849114507436752, -0.030108142644166946, -0.01281324215233326, 0.02647409401834011, -0.053042639046907425, -0.13919658958911896, 0.019485387951135635, -0.06393962353467941, -0.058460675179958344, -0.06951267272233963, 0.003749595722183585, -0.044099461287260056, 0.20225964486598969, -0.01297208946198225, -0.1057625338435173, -0.13860613107681274, -0.06706038862466812, 0.046788107603788376, -0.041121955960989, 0.09744148701429367, -0.06236313283443451, 0.23198841512203217, -0.07698936760425568, -0.10814587026834488, 0.02705926075577736, -0.09184280782938004, -0.09568090736865997, -0.026541590690612793, 0.1335364431142807, -0.025263752788305283, -0.001768020330928266, 0.08368312567472458, 0.052974700927734375, -0.03886907175183296, -0.10099908709526062, 0.0077525353990495205, 0.1442919820547104, 0.14611399173736572, 0.13825950026512146, -0.03959266468882561, -0.1931963562965393, -0.022941896691918373, 0.08907783031463623, 0.0837789922952652, 0.11202199757099152, -0.07109008729457855, 0.09075409919023514, 0.1485494077205658, -0.06902524828910828, -0.30003902316093445, -0.009376694448292255, -0.043334949761629105, 0.036692824214696884, 0.03360072895884514, -0.08548849076032639, 0.040476880967617035, 0.03952036052942276, -0.0311850905418396, 0.052837178111076355, -0.20882178843021393, -0.10443153977394104, 0.05586086958646774, 0.112152099609375, -0.04010477662086487, -0.0981212705373764, -0.02669631876051426, -0.04354892298579216, -0.14581450819969177, 0.08934617042541504, -0.11790687590837479, 0.07565314322710037, 0.0022449155803769827, -0.029072722420096397, 0.030858267098665237, -0.08979899436235428, 0.10180876404047012, -0.05972660332918167, 0.09354964643716812, -0.05528298765420914, -0.013901402242481709, 0.2071266621351242, -0.07716868072748184, 0.12527769804000854, 0.006772999186068773, 0.060785721987485886, -0.07546809315681458, -0.04947259649634361, -0.07066436856985092, 0.1208748072385788, -0.03777027875185013, 
-0.08585751056671143, -0.07202385365962982, 0.026259202510118484, 0.04445340484380722, 0.006156689487397671, 0.017698897048830986, -0.12301100790500641, 0.053116776049137115, 0.1949213147163391, 0.1358635574579239, -0.044385384768247604, -0.11841326951980591, -0.06830862909555435, -0.02024398185312748, 0.05093841999769211, -0.1308542639017105, 0.06129012629389763, 0.05282196030020714, 0.046319279819726944, 0.06517492979764938, 0.0484275296330452, -0.07768653333187103, 0.025183308869600296, 0.05235860124230385, -0.11327976733446121, -0.1578187793493271, -0.009865286760032177, -0.0642581656575203, -0.08891956508159637, 0.027079032734036446, 0.09886655956506729, -0.050756458193063736, -0.016284098848700523, -0.014544229954481125, 0.02719094045460224, -0.0026619096752256155, 0.008985236287117004, 0.05525684729218483, 0.05092557519674301, -0.07664172351360321, 0.09271746873855591, 0.03785372152924538, -0.07986412197351456, 0.014865864999592304, 0.02212984673678875, -0.11162928491830826, -0.08503379672765732, -0.061664946377277374, 0.14911139011383057, -0.03283858299255371, -0.08204673230648041, -0.03570721670985222, -0.1016024500131607, 0.001333494670689106, 0.16891419887542725, 0.05726761743426323, 0.027283519506454468, -0.011450683698058128, 0.005811159498989582, -0.13636203110218048, 0.05770055577158928, -0.02770618163049221, 0.05211746692657471, -0.10036958009004593, 0.13163267076015472, 0.05553164333105087, 0.07862834632396698, -0.06500911712646484, -0.05879047513008118, -0.06778911501169205, 0.0064810048788785934, -0.13708430528640747, 0.060517456382513046, -0.10463691502809525, -0.024188535287976265, -0.022944467142224312, 0.07391448318958282, -0.008714298717677593, 0.056758832186460495, -0.04260022193193436, -0.024246960878372192, -0.01732332445681095, 0.019056661054491997, -0.16678069531917572, -0.010857156477868557, 0.01978401280939579, -0.08251197636127472, 0.08728979527950287, 0.0017604419263079762, -0.07349313795566559, -0.019264904782176018, -0.10294892638921738, -0.07466321438550949, 0.03837892413139343, 0.03735430911183357, -0.011377141810953617, -0.0696236789226532, 0.030262652784585953, 0.022319601848721504, -0.049634918570518494, -0.009600155986845493, 0.06423857808113098, -0.08992020040750504, 0.0783783569931984, 0.050782736390829086, -0.05152277648448944, -0.028809906914830208, 0.07281994074583054, 0.0732063576579094, 0.035752203315496445, 0.07899575680494308, -0.022033486515283585, 0.08765148371458054, -0.14681090414524078, -0.019205966964364052, -0.017951177433133125, -0.04987693950533867, 0.005006404127925634, -0.044595226645469666, 0.03522530198097229, -0.017053261399269104, 0.19779790937900543, 0.08093994855880737, 0.08686166256666183, 0.01426700595766306, 0.059783849865198135, 0.04343568906188011, 0.045850157737731934, 0.09726113080978394, -0.03825113922357559, -0.04650013521313667, 0.06834351271390915, 0.023857789114117622, 0.016327301040291786, -0.0065473043359816074, 0.16746367514133453, 0.1410321593284607, 0.0763164758682251, 0.06222638115286827, 0.08377912640571594, 0.013797786086797714, -0.1687130481004715, -0.1483946144580841, 0.01865849457681179, 0.058285191655159, -0.05204204097390175, -0.06071116030216217, 0.15854743123054504, -0.11200328916311264, 0.11706183850765228, 0.00765684200450778, -0.04900151863694191, -0.0549011155962944, -0.16020475327968597, -0.033873386681079865, 0.04872419685125351, -0.002773284912109375, -0.0685340091586113, -0.005287783220410347, 0.17376822233200073, -0.034075576812028885, -0.07106196880340576, 0.20852074027061462, 
-0.15808692574501038, -0.046948861330747604, 0.04431506618857384, 0.036542486399412155, 0.04677463695406914, 0.04048299044370651, 0.02063794992864132, 0.05573990195989609, 0.03183124214410782, 0.0850929543375969, 0.03435608744621277, 0.12139362841844559, 0.0330163910984993, -0.06733300536870956, -0.05951780080795288, 0.010389256291091442, 0.020720822736620903, 0.005503555294126272, 0.07423096895217896, 0.023662813007831573, -0.03882656246423721, -0.004575342405587435, 0.12091398239135742, -0.06498986482620239, -0.11710124462842941, -0.1332702338695526, 0.27475062012672424, -0.037011705338954926, -0.008892246522009373, 0.017146484926342964, -0.09304407984018326, 0.022678207606077194, 0.22000783681869507, 0.196807861328125, -0.018942002207040787, 0.037055183202028275, -0.03062288649380207, 0.00613724859431386, -0.007772447541356087, 0.14299727976322174, -0.016525574028491974, 0.2733260989189148, -0.03447142615914345, 0.06341326981782913, 0.00913806352764368, -0.04962102696299553, -0.1348172277212143, 0.05437980964779854, -0.0158773735165596, -0.045110009610652924, -0.040814097970724106, 0.04511519521474838, -0.03627133369445801, -0.09736968576908112, 0.12492682784795761, -0.06893368810415268, -0.05055473744869232, 0.02349270135164261, -0.025218000635504723, -0.005216402001678944, 0.07340625673532486, -0.02083934098482132, 0.01009579747915268, 0.1596525013446808, 0.011792870238423347, -0.07598698884248734, -0.002798552392050624, 0.025304438546299934, -0.11719714105129242, 0.24598728120326996, -0.00698277959600091, 0.056396484375, 0.09952529519796371, 0.029204968363046646, -0.09017424285411835, 0.030008919537067413, 0.01385550107806921, -0.02689979039132595, 0.03578333929181099, 0.09791963547468185, -0.00820864550769329, -0.08222421258687973, 0.0010110537987202406, -0.08816386759281158, 0.03517911210656166, 0.04705485329031944, 0.016870729625225067, -0.04239945486187935, 0.043080829083919525, -0.10459312796592712, 0.12913021445274353, 0.08479403704404831, 0.029372448101639748, -0.028290865942835808, -0.039940908551216125, 0.04428793117403984, 0.032974954694509506, 0.031170645728707314, -0.039107777178287506, -0.10837677121162415, 0.004059300757944584, -0.011758659034967422, -0.007993476465344429, -0.2783745229244232, -0.048048265278339386, -0.012595508247613907, -0.022133272141218185, -0.010680338367819786, 0.022528117522597313, 0.07153160870075226, 0.04073509946465492, -0.03186945989727974, -0.022041434422135353, -0.005345584359019995, 0.12192682176828384, -0.07326934486627579, -0.07148510962724686 ]
null
null
transformers
# mBART-50 many to many multilingual machine translation This model is a fine-tuned checkpoint of [mBART-large-50](https://huggingface.co/facebook/mbart-large-50). `mbart-large-50-many-to-many-mmt` is fine-tuned for multilingual machine translation. It was introduced in the [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401) paper. The model can translate directly between any pair of 50 languages. To translate into a target language, the target language id must be forced as the first generated token; pass it to the `generate` method via the `forced_bos_token_id` parameter. ```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
article_ar = "الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")

# translate Hindi to French
tokenizer.src_lang = "hi_IN"
encoded_hi = tokenizer(article_hi, return_tensors="pt")
generated_tokens = model.generate(
    **encoded_hi,
    forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Le chef de l 'ONU affirme qu 'il n 'y a pas de solution militaire dans la Syrie."

# translate Arabic to English
tokenizer.src_lang = "ar_AR"
encoded_ar = tokenizer(article_ar, return_tensors="pt")
generated_tokens = model.generate(
    **encoded_ar,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The Secretary-General of the United Nations says there is no military solution in Syria."
``` See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for more fine-tuned versions. ## Languages covered Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI) ## BibTeX entry and citation info ```
@article{tang2020multilingual,
    title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
    author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
    year={2020},
    eprint={2008.00401},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
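Since only the forced BOS (target language) token changes per target language, one encoded source can be decoded into several languages. Here is a minimal sketch that reuses `model`, `tokenizer`, and `encoded_hi` from the snippet above; the target codes are taken from the "Languages covered" list:

```python
# translate the same Hindi source into several target languages
# by swapping the forced BOS (target language) token
for tgt_lang in ["en_XX", "fr_XX", "de_DE"]:
    generated = model.generate(
        **encoded_hi,
        forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang],
    )
    print(tgt_lang, tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```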
{"language": ["multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl"], "tags": ["mbart-50"], "pipeline_tag": "translation"}
translation
facebook/mbart-large-50-many-to-many-mmt
[ "transformers", "pytorch", "tf", "jax", "rust", "safetensors", "mbart", "text2text-generation", "mbart-50", "translation", "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl", "arxiv:2008.00401", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2008.00401" ]
[ "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl" ]
TAGS #transformers #pytorch #tf #jax #rust #safetensors #mbart #text2text-generation #mbart-50 #translation #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us
# mBART-50 many to many multilingual machine translation This model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper. The model can translate directly between any pair of 50 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method. See the model hub to look for more fine-tuned versions. ## Languages covered Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI) ## BibTeX entry and citation info
[ "# mBART-50 many to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate directly between any pair of 50 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #jax #rust #safetensors #mbart #text2text-generation #mbart-50 #translation #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# mBART-50 many to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate directly between any pair of 50 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ 182, 172, 346, 10 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #rust #safetensors #mbart #text2text-generation #mbart-50 #translation #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-50 many to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate directly between any pair of 50 languages. To translate into a target language, the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions." ]
[ -0.058685269206762314, -0.11020675301551819, -0.0059699201956391335, 0.036147817969322205, 0.0720333680510521, 0.016543559730052948, 0.15961645543575287, 0.10281005501747131, 0.003183105494827032, 0.08675023168325424, 0.0023953549098223448, 0.16187149286270142, 0.07257901877164841, 0.1477588713169098, 0.04404178261756897, -0.24251724779605865, 0.05666781961917877, -0.07936593890190125, 0.013261403888463974, 0.10360255092382431, 0.07429477572441101, -0.031372055411338806, 0.07749184221029282, -0.019906125962734222, -0.016004176810383797, 0.04523589089512825, 0.01098454650491476, -0.03132951259613037, 0.0944632738828659, 0.09988774359226227, 0.010942182503640652, 0.02792951837182045, 0.06180214509367943, -0.1899755448102951, 0.007136384956538677, 0.03454580157995224, -0.013467231765389442, -0.00031790087814442813, 0.0949452817440033, -0.04060105234384537, 0.12797655165195465, -0.12389158457517624, -0.02487446554005146, 0.054493967443704605, -0.11141695082187653, -0.12783096730709076, -0.07132013142108917, 0.05675755813717842, 0.0834183320403099, 0.04905500262975693, -0.07068649679422379, 0.13234558701515198, 0.008563147857785225, 0.08842834085226059, 0.15923886001110077, -0.3864175081253052, -0.02907714620232582, 0.13224662840366364, 0.09345497936010361, 0.09195594489574432, -0.024047330021858215, 0.05468777194619179, 0.03106088936328888, -0.002497835084795952, -0.017154760658740997, -0.06412147730588913, 0.08710340410470963, -0.042482588440179825, -0.1564301997423172, -0.004631952848285437, 0.038187481462955475, -0.005314064212143421, -0.039093200117349625, -0.13376228511333466, -0.10502299666404724, 0.01833781599998474, -0.052005745470523834, -0.037040796130895615, 0.033824216574430466, 0.03600069880485535, 0.09747352451086044, -0.045805297791957855, -0.06074926629662514, -0.04052725434303284, -0.12162452936172485, 0.1501113623380661, 0.023448769003152847, 0.007046465761959553, -0.004693531431257725, 0.053596381098032, -0.05752038583159447, -0.059028275310993195, -0.05208617076277733, -0.056077081710100174, -0.10370781272649765, -0.01874576508998871, -0.008368616923689842, -0.08422543853521347, 0.07583413273096085, 0.1245497614145279, -0.04079677164554596, 0.07030154764652252, -0.08919575065374374, 0.03333599865436554, 0.03064042702317238, 0.05857883393764496, -0.13335564732551575, -0.04863998293876648, -0.005873817950487137, -0.0021396069787442684, 0.004226249642670155, -0.0065134428441524506, -0.060335345566272736, -0.08136101067066193, 0.012691234238445759, 0.08869145810604095, 0.018083937466144562, 0.11051054298877716, -0.017800863832235336, -0.026602495461702347, 0.017799390479922295, -0.161935493350029, 0.01793067902326584, 0.01901276223361492, 0.012538198381662369, 0.12464931607246399, -0.040029410272836685, -0.0054273903369903564, -0.07915565371513367, -0.033865708857774734, -0.02703358791768551, 0.04788728058338165, -0.07729239761829376, -0.15843656659126282, 0.01597263477742672, -0.02981099858880043, -0.09841611981391907, -0.11493036895990372, -0.1161520853638649, -0.0627700686454773, 0.00005008962762076408, -0.039121996611356735, 0.07761281728744507, -0.08043425530195236, -0.02696867845952511, 0.027297334745526314, -0.00424188794568181, 0.018628325313329697, -0.05633565038442612, 0.022641319781541824, -0.02318480797111988, 0.1098535880446434, 0.10556642711162567, -0.0075597455725073814, -0.0753524899482727, 0.045031674206256866, -0.16643136739730835, 0.18634790182113647, -0.10866396129131317, -0.0003416238469071686, -0.12077170610427856, -0.04388074204325676, 
-0.011091968044638634, 0.06775754690170288, 0.014489490538835526, 0.18446438014507294, -0.21184757351875305, -0.0357799306511879, 0.3243795931339264, -0.09129160642623901, -0.016258614137768745, 0.07778456807136536, 0.006554315332323313, 0.03580343350768089, 0.06986943632364273, 0.11344879865646362, 0.02945626713335514, -0.06362360715866089, 0.0190491434186697, 0.055245667695999146, -0.07604063302278519, 0.1022421345114708, 0.08963446319103241, -0.024760344997048378, -0.0011562315048649907, 0.0052788532339036465, 0.01189481746405363, 0.025714144110679626, -0.03429283946752548, -0.03743184357881546, 0.019629260525107384, 0.020780982449650764, 0.021381035447120667, -0.02888994850218296, 0.022781552746891975, -0.015118825249373913, -0.050304971635341644, 0.05679460987448692, 0.06911151856184006, -0.02042645774781704, 0.03140408173203468, -0.11244147270917892, -0.01965700089931488, -0.08650849759578705, 0.05791878700256348, -0.18113239109516144, -0.04091983661055565, -0.0033386773429811, -0.03804911673069, 0.11698152869939804, 0.14770925045013428, 0.08587922155857086, 0.040590330958366394, -0.008704698644578457, -0.006130074616521597, 0.111827552318573, -0.009026149287819862, -0.02250327728688717, -0.176463320851326, -0.01838541403412819, -0.06713292002677917, 0.04610631242394447, -0.0720590353012085, 0.03357631713151932, 0.013070809654891491, 0.0764479711651802, -0.02433834969997406, 0.026135610416531563, 0.03656435385346413, 0.040125131607055664, -0.03333587944507599, -0.034895166754722595, 0.06552087515592575, -0.05553573742508888, -0.050938915461301804, 0.1670244038105011, -0.16644829511642456, 0.022251740097999573, 0.1153281033039093, -0.06358975917100906, -0.1007586419582367, 0.004524894990026951, -0.018436657264828682, -0.018697746098041534, 0.009218854829668999, -0.04173840954899788, 0.0617733970284462, 0.07038317620754242, 0.09243541955947876, -0.09522643685340881, -0.0384063720703125, 0.005686176475137472, -0.061317328363657, -0.02275242656469345, 0.11435160040855408, 0.02151481807231903, -0.21109545230865479, 0.09606008976697922, 0.07812133431434631, 0.07652024179697037, 0.15418237447738647, 0.01154034398496151, -0.0661117359995842, -0.05646258592605591, 0.07880737632513046, 0.009849727153778076, -0.07219776511192322, -0.013316141441464424, -0.04044694826006889, 0.03340265527367592, 0.05543197691440582, 0.009714683517813683, -0.06968637555837631, 0.0377056822180748, 0.009668681770563126, -0.0819813683629036, 0.03078811801970005, 0.07788511365652084, 0.018481537699699402, 0.11379605531692505, 0.006393606308847666, -0.006042288616299629, -0.031197628006339073, -0.03643741458654404, -0.11477512866258621, 0.16694699227809906, -0.12364117801189423, -0.24405938386917114, -0.10065405070781708, -0.05186489596962929, -0.09002699702978134, 0.01876789703965187, 0.013214802369475365, -0.0835234597325325, -0.03092506341636181, -0.06973166763782501, 0.09752096980810165, -0.030893763527274132, -0.0692741796374321, -0.027592137455940247, 0.0044456725008785725, 0.0017072721384465694, -0.12167103588581085, -0.023831045255064964, 0.01729830540716648, -0.11638733744621277, 0.04575459659099579, -0.030907833948731422, 0.054037902504205704, 0.10368449240922928, 0.013311869464814663, 0.007081214338541031, -0.019557291641831398, 0.2079148143529892, -0.032078925520181656, 0.057363830506801605, 0.1112012192606926, 0.05238605663180351, 0.05267027020454407, 0.15203645825386047, 0.039952974766492844, -0.0651242733001709, -0.012287117540836334, 0.03167162463068962, -0.01611921936273575, -0.1837523728609085, 
-0.06759515404701233, -0.06568322330713272, 0.019022880122065544, 0.008689322508871555, 0.04766438528895378, -0.017201438546180725, 0.012949314899742603, -0.04946678876876831, 0.017497163265943527, 0.0914478600025177, 0.07646990567445755, 0.16967999935150146, -0.04252057522535324, 0.10144373029470444, -0.04036494344472885, -0.053415894508361816, 0.10017219930887222, 0.010154404677450657, 0.05663769319653511, 0.05631948634982109, 0.11248372495174408, 0.07194986939430237, 0.04152996093034744, 0.03330424427986145, 0.11414285749197006, 0.004729653242975473, 0.020087314769625664, -0.025628261268138885, -0.07909983396530151, -0.03649146854877472, 0.049772102385759354, 0.05157012864947319, -0.0468476377427578, 0.03065880760550499, -0.037123240530490875, 0.10176266729831696, 0.12095582485198975, 0.04737069830298424, -0.1662978231906891, 0.0002805933472700417, 0.03477245569229126, -0.0451042540371418, -0.06137210875749588, 0.02082555741071701, 0.031891535967588425, -0.11447712779045105, 0.1153603047132492, -0.009062055498361588, 0.08509854972362518, -0.0859486386179924, 0.023849796503782272, -0.032305777072906494, 0.08355220407247543, -0.0027144530322402716, 0.07882237434387207, -0.2542758584022522, 0.14074082672595978, 0.041175175458192825, 0.010885917581617832, -0.025958802551031113, 0.01860976405441761, 0.04585935175418854, 0.059084221720695496, 0.15061752498149872, 0.030768735334277153, -0.07178241014480591, -0.1671445667743683, -0.0552387461066246, -0.01960705779492855, 0.12401606142520905, -0.033147528767585754, 0.0851370096206665, -0.02013099193572998, -0.040374111384153366, -0.06407931447029114, 0.0767187848687172, -0.12002954632043839, -0.08093172311782837, 0.08415520191192627, 0.02173670567572117, 0.03328871726989746, -0.003700872650370002, -0.03631811589002609, -0.1263771802186966, 0.19603973627090454, -0.07479630410671234, -0.0934789851307869, -0.10058672726154327, -0.0115857208147645, 0.12907260656356812, -0.08864855021238327, 0.0034026899375021458, -0.028451448306441307, 0.0563899427652359, -0.030232757329940796, -0.12006363272666931, 0.08529684692621231, -0.06742385029792786, -0.08545937389135361, -0.0007792924297973514, 0.09761069715023041, -0.021648898720741272, 0.027697041630744934, 0.014126830734312534, 0.028748922049999237, 0.020267562940716743, -0.12531974911689758, -0.03991899639368057, 0.13703744113445282, -0.08754102885723114, 0.0708879753947258, -0.11822833865880966, -0.17457446455955505, -0.06719957292079926, -0.017610805109143257, 0.11693847924470901, 0.26464584469795227, -0.06696802377700806, 0.07906289398670197, 0.15927384793758392, -0.09143690019845963, -0.17838971316814423, -0.08439666777849197, -0.013760721310973167, 0.047101978212594986, -0.03300437703728676, -0.1325342357158661, 0.024498727172613144, 0.02409994974732399, 0.01727769337594509, 0.020953232422471046, -0.2739066481590271, -0.1263626515865326, 0.050807975232601166, 0.018094714730978012, 0.13284695148468018, -0.10069093853235245, -0.07885804027318954, -0.059326354414224625, -0.07015818357467651, 0.04747810959815979, -0.05771081894636154, 0.10317639261484146, 0.03139381855726242, 0.04597273841500282, 0.03594816103577614, -0.014547099359333515, 0.09792119264602661, 0.04907185956835747, 0.02864985726773739, -0.08531520515680313, 0.020839720964431763, 0.06557262688875198, 0.02045261301100254, 0.11085973680019379, -0.04568549990653992, 0.004371578339487314, -0.05568508431315422, -0.06674924492835999, -0.07796557247638702, 0.04631265252828598, -0.03197992965579033, -0.06656402349472046, -0.01854001171886921, 
0.045103270560503006, 0.03659825772047043, -0.021660955622792244, -0.049495335668325424, -0.11049255728721619, 0.05181388184428215, 0.11073468625545502, 0.15192517638206482, 0.0003170820709783584, 0.017507966607809067, -0.0043150391429662704, -0.008616261184215546, 0.04555148258805275, -0.04434787109494209, 0.011077755130827427, 0.13049006462097168, 0.0034316133242100477, 0.09442990273237228, 0.036099642515182495, -0.09043339639902115, 0.011907505802810192, 0.06331648677587509, -0.08308912813663483, -0.20536506175994873, 0.019403746351599693, -0.06875813752412796, 0.05853240564465523, 0.019544027745723724, 0.1853281855583191, -0.0350167453289032, -0.032173704355955124, 0.02262081205844879, 0.02272759936749935, -0.05637352168560028, 0.11515866965055466, -0.018441975116729736, 0.0293281152844429, -0.08850792795419693, 0.044167593121528625, 0.02835707925260067, -0.0977780669927597, -0.002250568475574255, 0.1770104169845581, -0.11868737637996674, -0.09653730690479279, -0.006042602937668562, 0.12333564460277557, -0.1052928939461708, 0.00961629394441843, 0.028300540521740913, -0.07487165927886963, 0.054486896842718124, 0.15008218586444855, 0.034743692725896835, 0.031010299921035767, -0.0010183457052335143, -0.02252558432519436, 0.045069485902786255, 0.05082090571522713, 0.06254169344902039, 0.006854000501334667, -0.041335076093673706, 0.12905721366405487, 0.0007824223721399903, 0.05168956518173218, -0.04819454252719879, -0.020722046494483948, -0.0850529596209526, -0.007393854204565287, -0.1039966568350792, -0.01310450304299593, -0.14484673738479614, -0.01992923766374588, 0.006093182601034641, -0.038095928728580475, 0.008163297548890114, -0.021988077089190483, -0.08605578541755676, -0.016058722510933876, -0.08150776475667953, 0.08457164466381073, -0.14049404859542847, -0.020755449309945107, 0.000979035277850926, -0.05163758620619774, 0.08280246704816818, 0.06701389700174332, -0.060052793473005295, 0.04382733255624771, -0.1061788946390152, 0.01788083091378212, 0.00856527779251337, 0.045126643031835556, 0.030011603608727455, -0.060140401124954224, 0.03956640884280205, 0.03388618677854538, 0.030389729887247086, 0.01255746092647314, -0.008846101351082325, -0.06452494859695435, 0.07782123237848282, -0.05069038271903992, -0.06041519343852997, -0.06394252926111221, 0.04672439396381378, 0.0850522592663765, 0.04069967567920685, 0.0864076241850853, -0.09431585669517517, 0.08666355162858963, -0.13127021491527557, -0.01249771099537611, 0.0038425836246460676, -0.07582756876945496, 0.004411929287016392, -0.0818304643034935, 0.063059963285923, -0.033492013812065125, 0.12162796407938004, 0.04500934109091759, 0.03368546813726425, -0.001360554713755846, -0.08658574521541595, -0.07517391443252563, 0.03705064579844475, 0.10613510012626648, 0.06358197331428528, 0.007369696628302336, -0.06734053045511246, 0.03060051053762436, -0.021613825112581253, -0.018391162157058716, 0.04926733300089836, 0.13900995254516602, 0.09716444462537766, 0.06851142644882202, 0.05307445675134659, -0.04895913973450661, -0.12140700221061707, -0.001053985906764865, -0.11156865209341049, 0.03247108682990074, -0.09795615822076797, 0.17979256808757782, 0.13421858847141266, -0.10668295621871948, 0.09119025617837906, 0.00012780501856468618, -0.044413745403289795, -0.1300918459892273, -0.1831447035074234, -0.06166858598589897, -0.0520622618496418, 0.012003456242382526, -0.09979812055826187, 0.08706825226545334, -0.01469878014177084, 0.10487192869186401, 0.030666891485452652, 0.13667041063308716, -0.09259115159511566, -0.10175660252571106, 
0.05134230852127075, -0.00880176480859518, 0.03704981133341789, 0.0031091291457414627, -0.016133593395352364, 0.01161171868443489, -0.02171977050602436, 0.011270717717707157, 0.06091032549738884, -0.020624911412596703, 0.0014988055918365717, -0.09039393067359924, -0.07284435629844666, -0.020850172266364098, 0.010615379549562931, 0.001731791766360402, 0.0986301451921463, 0.07308388501405716, -0.09494206309318542, -0.01906461827456951, 0.07174840569496155, -0.049976445734500885, -0.19025897979736328, -0.09336739033460617, 0.2323140799999237, -0.031618986278772354, 0.08967437595129013, -0.051666393876075745, -0.04046149179339409, -0.03644074127078056, 0.20302462577819824, 0.20830655097961426, -0.050796594470739365, 0.0200856514275074, 0.030051033943891525, 0.018009603023529053, 0.037072017788887024, 0.0705987960100174, 0.04458969086408615, 0.19964386522769928, -0.05258581414818764, 0.0930987074971199, -0.0324823372066021, -0.025179734453558922, -0.03986725956201553, 0.04091571643948555, -0.03820684552192688, 0.00538762379437685, -0.023556549102067947, 0.10377120971679688, -0.0803956612944603, -0.05943159759044647, 0.02958378754556179, -0.015198004432022572, -0.04405605420470238, -0.017305929213762283, 0.009631654247641563, 0.050307631492614746, 0.02946597896516323, 0.0038021302316337824, -0.012216707691550255, 0.06699712574481964, 0.00228060781955719, -0.07216468453407288, -0.033002182841300964, 0.040158387273550034, -0.009889468550682068, 0.020416604354977608, -0.009076894260942936, 0.07685815542936325, 0.10864901542663574, 0.0074066962115466595, -0.08355339616537094, 0.07166923582553864, 0.019582003355026245, -0.05212389677762985, 0.02623717114329338, 0.05855252221226692, 0.014152911491692066, 0.04489913582801819, 0.027749082073569298, -0.1471489816904068, 0.087880939245224, 0.1167328879237175, -0.02357768826186657, -0.07286572456359863, 0.10917676240205765, -0.07642248272895813, 0.1273658722639084, 0.1678270697593689, 0.03803536295890808, -0.012160082347691059, -0.06142909452319145, 0.022585676982998848, -0.043648529797792435, 0.03922788053750992, -0.06867919862270355, -0.17425456643104553, -0.023578673601150513, -0.018933124840259552, 0.06943679600954056, -0.14515875279903412, -0.08553817868232727, -0.015969576314091682, 0.030249111354351044, -0.07493294030427933, 0.1400797814130783, 0.04984402284026146, 0.02439211681485176, 0.00609589833766222, -0.27689775824546814, 0.06503291428089142, 0.15734709799289703, -0.11270029842853546, -0.04916150122880936 ]
null
null
transformers
# mBART-50 many to one multilingual machine translation

This model is a fine-tuned checkpoint of [mBART-large-50](https://huggingface.co/facebook/mbart-large-50). `mbart-large-50-many-to-one-mmt` is fine-tuned for multilingual machine translation. It was introduced in the [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401) paper.
The model can translate directly from any of the 49 other languages listed below to English.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
article_ar = "الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")

# translate Hindi to English
tokenizer.src_lang = "hi_IN"
encoded_hi = tokenizer(article_hi, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The head of the UN says there is no military solution in Syria."

# translate Arabic to English
tokenizer.src_lang = "ar_AR"
encoded_ar = tokenizer(article_ar, return_tensors="pt")
generated_tokens = model.generate(**encoded_ar)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The Secretary-General of the United Nations says there is no military solution in Syria."
```

See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for more fine-tuned versions.

## Languages covered
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)

## BibTeX entry and citation info

```
@article{tang2020multilingual,
    title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
    author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
    year={2020},
    eprint={2008.00401},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
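Since `src_lang` is a setting on the tokenizer rather than a per-sentence argument, batched translation with this checkpoint has to group inputs by source language. Below is a minimal sketch of that pattern, reusing the two example articles above; the grouping loop and `padding=True` are illustrative additions, not part of the original card:

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")

# inputs grouped by source language code; every output is English
batches = {
    "hi_IN": ["संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"],
    "ar_AR": ["الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."],
}

for lang, sentences in batches.items():
    tokenizer.src_lang = lang  # prepend the correct language id token to each source
    encoded = tokenizer(sentences, return_tensors="pt", padding=True)
    translated = model.generate(**encoded)
    print(lang, "->", tokenizer.batch_decode(translated, skip_special_tokens=True))
```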
{"language": ["multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl"], "tags": ["mbart-50"]}
text2text-generation
facebook/mbart-large-50-many-to-one-mmt
[ "transformers", "pytorch", "tf", "jax", "mbart", "text2text-generation", "mbart-50", "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl", "arxiv:2008.00401", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2008.00401" ]
[ "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl" ]
TAGS #transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us
# mBART-50 many to one multilingual machine translation This model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-one-mmt' is fine-tuned for multilingual machine translation. It was introduced in the Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper. The model can translate directly from any of the 49 other languages listed below to English. See the model hub to look for more fine-tuned versions. ## Languages covered Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI) ## BibTeX entry and citation info
[ "# mBART-50 many to one multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\nThe model can translate directly between any pair of 50 languages.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# mBART-50 many to one multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\nThe model can translate directly between any pair of 50 languages.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ 172, 114, 346, 10 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-50 many to one multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-many-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\nThe model can translate directly between any pair of 50 languages.\n\n\n\n\n\nSee the model hub to look for more fine-tuned versions." ]
[ -0.07135295122861862, -0.14283490180969238, -0.0037341006100177765, 0.04528535157442093, 0.05108267813920975, 0.04141588136553764, 0.14930637180805206, 0.11098260432481766, 0.0063977488316595554, 0.07190436124801636, 0.016307223588228226, 0.09911621361970901, 0.04244460538029671, 0.13129092752933502, 0.018572140485048294, -0.29115191102027893, 0.02503366954624653, -0.019804170355200768, -0.009976791217923164, 0.10631275922060013, 0.11312684416770935, -0.013524003326892853, 0.07529853284358978, -0.026540441438555717, -0.05057045444846153, 0.043704692274332047, 0.00962027721107006, -0.02833396941423416, 0.1170768216252327, 0.12775161862373352, 0.011607048101723194, 0.03636489808559418, 0.0482298918068409, -0.16562950611114502, 0.02618248015642166, 0.00423117820173502, -0.023111898452043533, 0.010340669192373753, 0.04212750494480133, -0.04170927777886391, 0.2001294642686844, -0.0581744946539402, -0.017352372407913208, 0.057348888367414474, -0.07806029170751572, -0.0700446143746376, -0.06519822776317596, 0.07196159660816193, 0.004138132557272911, 0.054584991186857224, -0.04569428041577339, 0.1830664426088333, -0.051714058965444565, 0.07580822706222534, 0.17094150185585022, -0.35070765018463135, -0.04034474864602089, 0.17781420052051544, 0.1050168052315712, 0.08287617564201355, -0.05126800015568733, 0.08499851822853088, 0.04080384224653244, 0.046537987887859344, -0.04736216738820076, -0.08834011852741241, 0.0663258284330368, -0.03171917796134949, -0.1318199634552002, 0.020811915397644043, 0.061001650989055634, -0.00542901735752821, -0.014156004413962364, -0.13502025604248047, -0.07492814213037491, 0.04586371034383774, -0.07098405063152313, -0.04235637187957764, -0.031728796660900116, 0.04049064218997955, 0.07820800691843033, -0.11836371570825577, -0.07071870565414429, -0.015849176794290543, -0.16296839714050293, 0.1468200832605362, 0.03308667987585068, -0.02148127183318138, -0.04262656718492508, 0.05988458916544914, -0.12417222559452057, -0.07003221660852432, -0.0043630655854940414, -0.04792902246117592, -0.04825865104794502, 0.007103509735316038, 0.008342409506440163, -0.029559984803199768, 0.08700662851333618, 0.1385798305273056, -0.06772828847169876, 0.08180999755859375, -0.07920949161052704, 0.04470102861523628, -0.009991775266826153, 0.047596417367458344, -0.0958835631608963, -0.09136073291301727, 0.021003982052206993, 0.004127881955355406, 0.004992317873984575, -0.013908104971051216, -0.12181077897548676, -0.06856933981180191, 0.020419206470251083, 0.07648465037345886, 0.03126145899295807, 0.08585398644208908, 0.01714394986629486, -0.019412651658058167, 0.07756282389163971, -0.14082476496696472, 0.03472400829195976, 0.003926410339772701, 0.03265273571014404, 0.10948965698480606, -0.08318586647510529, 0.0010914580198004842, -0.102910615503788, 0.019465813413262367, -0.048307690769433975, 0.03950640186667442, -0.03569245710968971, -0.1400168538093567, 0.03302482143044472, -0.05952020362019539, -0.05942871421575546, -0.13752028346061707, -0.04731221869587898, -0.04325461760163307, -0.0303484033793211, -0.07669159770011902, 0.04513927176594734, -0.03257976099848747, -0.017528284341096878, 0.056928008794784546, -0.016427362337708473, 0.0788511410355568, -0.049251217395067215, 0.02186681143939495, -0.005993626546114683, 0.07747310400009155, -0.027591748163104057, 0.006465654820203781, -0.05063945800065994, 0.03345663473010063, -0.14229649305343628, 0.1438671350479126, -0.0963122695684433, -0.05310620367527008, -0.14394810795783997, -0.05183783546090126, -0.07774689048528671, 
0.09127331525087357, -0.014587737619876862, 0.13545556366443634, -0.17994436621665955, -0.06169166415929794, 0.2616010308265686, -0.08015551418066025, -0.052222348749637604, 0.10708745568990707, 0.03475544974207878, 0.0735359787940979, 0.05655546486377716, 0.12970279157161713, 0.08296839892864227, -0.07624214142560959, 0.04466446861624718, 0.06260489672422409, -0.0032134142238646746, 0.12429089844226837, 0.08284001797437668, -0.021331580355763435, -0.023023949936032295, 0.011015952564775944, -0.0012467114720493555, 0.00480300048366189, -0.021047446876764297, -0.04417341202497482, 0.02897961623966694, 0.032301001250743866, 0.010349397547543049, -0.011875029653310776, 0.062189944088459015, -0.05215582996606827, -0.06906766444444656, 0.0325898639857769, 0.06797656416893005, -0.007108551915735006, 0.06694231182336807, -0.1197158694267273, 0.014533852227032185, -0.018769826740026474, 0.04088563099503517, -0.19818566739559174, -0.0129074826836586, -0.006164650432765484, -0.01729275844991207, 0.15070749819278717, 0.16414086520671844, 0.07060422003269196, 0.04587437957525253, -0.0895535945892334, 0.025224315002560616, 0.08710787445306778, -0.003896434558555484, -0.011842254549264908, -0.16521382331848145, 0.024338942021131516, -0.08023393899202347, 0.07294376939535141, -0.07952547818422318, 0.017696792259812355, 0.03599179908633232, 0.13044403493404388, -0.008326997049152851, 0.042812276631593704, 0.0003497008583508432, 0.06936925649642944, -0.0442318394780159, -0.01429001521319151, 0.07156873494386673, -0.0562003031373024, -0.10272015631198883, 0.16238799691200256, -0.16600336134433746, 0.06704077869653702, 0.14628374576568604, -0.007632332853972912, -0.0757884681224823, -0.0816662460565567, -0.00008006137068150565, -0.014572628773748875, 0.015217955224215984, -0.05644959956407547, 0.06694597750902176, 0.02129030041396618, 0.08950824290513992, -0.10172601789236069, -0.027114320546388626, 0.005034883972257376, -0.04204786941409111, -0.06367017328739166, 0.14467765390872955, 0.06989293545484543, -0.2465108186006546, 0.10111537575721741, 0.11359039694070816, 0.03853524476289749, 0.20078957080841064, 0.01299530640244484, -0.04569965600967407, -0.02816612645983696, 0.047339629381895065, 0.022709712386131287, -0.03239840641617775, -0.09610132873058319, -0.022540194913744926, 0.029606174677610397, 0.07031835615634918, 0.06857318431138992, -0.09890716522932053, 0.018796121701598167, -0.01615150272846222, -0.06995361298322678, 0.0690903291106224, 0.11798270791769028, -0.007308193016797304, 0.12275035679340363, -0.004651760682463646, -0.005698936525732279, -0.03901584446430206, 0.000042290626879548654, -0.11198396235704422, 0.15707795321941376, -0.14652281999588013, -0.2539671063423157, -0.10262145847082138, -0.05099206790328026, -0.09647520631551743, -0.005743782501667738, 0.048519548028707504, -0.08115154504776001, -0.02663949690759182, -0.08838564157485962, 0.1574082374572754, -0.01698927953839302, -0.04098258912563324, -0.03493594378232956, 0.027691202238202095, -0.01566319167613983, -0.113711878657341, -0.025705017149448395, 0.0442655123770237, -0.09806898981332779, 0.05440480634570122, -0.08087867498397827, 0.05735788121819496, 0.11095912754535675, 0.02739509381353855, 0.021049411967396736, -0.02601824514567852, 0.21386559307575226, -0.05105806887149811, 0.05554613098502159, 0.1377277970314026, 0.08656679838895798, 0.023410523310303688, 0.14830705523490906, 0.06090535223484039, -0.05512046441435814, -0.010029810480773449, 0.039866551756858826, -0.031168369576334953, -0.16730685532093048, 
-0.1009378731250763, -0.07894134521484375, -0.005763558205217123, -0.0024993172846734524, 0.07755595445632935, -0.04696572199463844, 0.003001216799020767, -0.08394462615251541, 0.0035753061529248953, 0.06422791630029678, 0.04810323938727379, 0.1707106977701187, -0.04610813781619072, 0.08947017788887024, -0.053186822682619095, -0.05369136855006218, 0.1322319507598877, 0.057098690420389175, -0.0029652216471731663, 0.04335802420973778, 0.08894500881433487, 0.0665847510099411, 0.014492659829556942, 0.059305232018232346, 0.08986254781484604, -0.026107028126716614, -0.020661287009716034, -0.03127136826515198, -0.07457882165908813, -0.004601250868290663, 0.033174678683280945, 0.11630118638277054, -0.08154736459255219, 0.010303141549229622, -0.04016571864485741, 0.08181992918252945, -0.04047878086566925, 0.08984950184822083, -0.10500416904687881, -0.020881012082099915, 0.04686853662133217, -0.04076218605041504, -0.05516990274190903, 0.025605810806155205, 0.07760238647460938, -0.11269286274909973, 0.11345356702804565, 0.02641821652650833, 0.1056780070066452, -0.044097740203142166, 0.0213320292532444, -0.061395447701215744, 0.06824193149805069, -0.007868758402764797, 0.08457539230585098, -0.29796260595321655, 0.17261533439159393, 0.0496697798371315, -0.0276064220815897, -0.050818152725696564, -0.010437272489070892, 0.04517221078276634, 0.11038623750209808, 0.08294529467821121, 0.025380942970514297, -0.10635749250650406, -0.15510180592536926, -0.03260638937354088, -0.013899966143071651, 0.12008455395698547, -0.027381062507629395, 0.052929554134607315, -0.023414675146341324, -0.02281561866402626, -0.04731659218668938, 0.06632491946220398, -0.17647813260555267, -0.1291700005531311, 0.08007965981960297, 0.020892683416604996, -0.04660618305206299, -0.02371332049369812, -0.08221427351236343, -0.04932994022965431, 0.09564408659934998, -0.04721946641802788, -0.049208201467990875, -0.11867959797382355, 0.041725143790245056, 0.12341760098934174, -0.12184275686740875, -0.022677747532725334, -0.03303540498018265, 0.044336505234241486, -0.06251193583011627, -0.08407940715551376, 0.07761646062135696, -0.05106744170188904, -0.10197614878416061, 0.006120390258729458, 0.14872069656848907, -0.026474488899111748, 0.08264398574829102, 0.012894068844616413, -0.01957964152097702, 0.028202079236507416, -0.12918713688850403, -0.03581193462014198, 0.10867884755134583, -0.02703278511762619, 0.06666801869869232, -0.10946866869926453, -0.08963721990585327, -0.04878713935613632, -0.06513553857803345, 0.15564334392547607, 0.23634400963783264, -0.046376872807741165, 0.07247600704431534, 0.15679389238357544, -0.058730047196149826, -0.25029245018959045, -0.09478303790092468, -0.020481247454881668, 0.0568375289440155, -0.059951651841402054, -0.10067129135131836, 0.011517309583723545, 0.0041654035449028015, 0.013379240408539772, 0.03205479308962822, -0.30113479495048523, -0.14689283072948456, 0.06934600323438644, -0.0033037939574569464, 0.10744813084602356, -0.12266170978546143, -0.0966520607471466, -0.07194040715694427, -0.1186564490199089, 0.03634857386350632, -0.07562673836946487, 0.11360828578472137, 0.015124499797821045, 0.0574081689119339, 0.0012466594344004989, 0.015591251663863659, 0.15573051571846008, 0.007103745825588703, -0.01569601707160473, -0.08440236747264862, -0.005970679689198732, 0.06841927766799927, 0.03316672891378403, 0.08821641653776169, -0.11983579397201538, 0.006191353779286146, -0.035670239478349686, -0.06992622464895248, -0.09587829560041428, 0.0721425712108612, -0.05814550817012787, -0.06930182874202728, 
-0.052547723054885864, 0.05669743940234184, -0.03927435353398323, -0.05509549006819725, 0.02458709105849266, -0.12056560069322586, 0.09363801777362823, 0.1176353469491005, 0.1756584197282791, -0.08437362313270569, 0.001647881348617375, -0.016285410150885582, -0.024595895782113075, 0.05771784111857414, -0.07401955872774124, 0.008742714300751686, 0.11445772647857666, -0.015712818130850792, 0.1087634414434433, 0.02162085473537445, -0.10974998772144318, 0.021783191710710526, 0.10992459952831268, -0.0610724538564682, -0.21356967091560364, -0.02573307603597641, -0.02724177949130535, 0.07053510844707489, 0.006663061678409576, 0.19517208635807037, -0.024922383949160576, -0.04753736034035683, 0.0157821923494339, 0.025064680725336075, -0.06511647999286652, 0.1712568700313568, 0.014514459297060966, 0.03913898393511772, -0.09473161399364471, 0.02381962537765503, 0.021838631480932236, -0.16017577052116394, -0.018888361752033234, 0.21194010972976685, -0.11784762889146805, -0.11541569232940674, 0.02540830336511135, 0.09632038325071335, -0.10935264825820923, -0.003241594647988677, 0.019637297838926315, -0.10351721197366714, 0.059926100075244904, 0.11840502172708511, 0.048511527478694916, 0.014764201827347279, -0.02002197690308094, -0.029847754165530205, 0.06307625770568848, 0.030980650335550308, 0.06366491317749023, 0.027471071109175682, -0.0753035694360733, 0.08209913969039917, 0.007091888226568699, 0.0612536184489727, -0.049580156803131104, -0.011443826369941235, -0.11573132127523422, -0.02372489683330059, -0.1948685497045517, -0.005010077264159918, -0.13973690569400787, -0.005974302999675274, -0.004617089405655861, -0.09324264526367188, -0.02754008024930954, -0.0283900648355484, -0.09135228395462036, -0.03834618628025055, -0.11042960733175278, 0.10246828943490982, -0.06908967345952988, 0.03901037201285362, 0.05274837464094162, -0.030014030635356903, 0.0990581214427948, 0.05034273862838745, -0.06959404051303864, 0.10978808254003525, -0.0560508668422699, -0.034169841557741165, 0.017379293218255043, 0.057138316333293915, 0.045094359666109085, -0.05114852637052536, 0.024111740291118622, 0.045227307826280594, 0.07624993473291397, 0.04766931012272835, 0.09625288099050522, -0.054457973688840866, 0.0381760448217392, -0.09283309429883957, -0.10795131325721741, -0.04061504825949669, 0.06436194479465485, 0.11667785793542862, 0.045753005892038345, 0.06215382367372513, -0.09833966195583344, 0.04879777878522873, -0.14268407225608826, -0.0030792541801929474, -0.025171739980578423, -0.10066086053848267, 0.02705203928053379, -0.06749194860458374, 0.05868764594197273, -0.013498740270733833, 0.12815994024276733, 0.05499168112874031, 0.0012143678031861782, 0.011877895332872868, -0.03597685694694519, -0.011375761590898037, 0.006647814065217972, 0.15155895054340363, 0.06335221976041794, 0.007618374656885862, -0.10655353218317032, 0.020689919590950012, -0.015619954094290733, 0.062414221465587616, 0.10035344958305359, 0.15051916241645813, 0.08030454814434052, 0.07963623851537704, 0.07980617135763168, -0.05558428168296814, -0.09875079989433289, 0.007516606245189905, -0.04029228538274765, 0.018450375646352768, -0.11723408848047256, 0.10713528841733932, 0.2031085342168808, -0.12011104822158813, 0.07274553179740906, -0.02381020598113537, -0.05038070306181908, -0.1562173217535019, -0.21757717430591583, -0.09059683978557587, -0.08367588371038437, -0.0009568502428010106, -0.10998613387346268, 0.0420418307185173, -0.013498452492058277, 0.11860180646181107, 0.015187633223831654, 0.15671555697917938, -0.10189302265644073, 
-0.13599000871181488, 0.03117203153669834, 0.03294149413704872, 0.03576364368200302, 0.01711430959403515, 0.010314445942640305, -0.01769975945353508, -0.01325672585517168, 0.009293202310800552, 0.04826797917485237, -0.04541350156068802, -0.015248449519276619, -0.11218880116939545, -0.049584612250328064, -0.032386597245931625, 0.007940739393234253, 0.018021337687969208, 0.14799773693084717, 0.0502186119556427, -0.10094176977872849, -0.016218703240156174, 0.14925597608089447, -0.032538190484046936, -0.22213521599769592, -0.09349354356527328, 0.23912174999713898, -0.027058295905590057, 0.09719274938106537, -0.05325740948319435, -0.019690481945872307, -0.04448295012116432, 0.21409468352794647, 0.33434680104255676, -0.07502126693725586, 0.03691398352384567, 0.06932146847248077, 0.030161121860146523, 0.028891580179333687, 0.10932742059230804, 0.06749508529901505, 0.22072038054466248, -0.023487122729420662, 0.02098371833562851, -0.048708345741033554, -0.0365615151822567, -0.05200960859656334, 0.08759211748838425, -0.002606533234938979, -0.02235984429717064, -0.038965485990047455, 0.10857430845499039, -0.020557058975100517, -0.06674621999263763, -0.03693489730358124, -0.09030181914567947, -0.08131618797779083, -0.003001462435349822, 0.05122190713882446, 0.04331580176949501, 0.025007260963320732, 0.01214278768748045, 0.02263876423239708, -0.026335882022976875, 0.008393343538045883, -0.11175715923309326, -0.007262440398335457, 0.06644352525472641, -0.0393795408308506, 0.03232135996222496, -0.006973874755203724, 0.07517632842063904, 0.11816414445638657, 0.02285858429968357, -0.039413005113601685, 0.1188909038901329, 0.0654195100069046, 0.0029056828934699297, 0.03874628618359566, 0.09175367653369904, 0.0015035325195640326, 0.0007637887028977275, 0.08602160960435867, -0.16636686027050018, 0.09025231748819351, 0.09337187558412552, -0.046822257339954376, -0.0760287418961525, 0.1303289532661438, -0.12410574406385422, 0.13525454699993134, 0.1738976687192917, 0.019203275442123413, -0.018454447388648987, -0.0457104928791523, 0.021742502227425575, -0.03559320792555809, 0.0947069451212883, -0.06805653870105743, -0.18798662722110748, -0.034508008509874344, -0.10047771781682968, 0.04081544280052185, -0.1927705705165863, -0.09823822975158691, 0.002847137162461877, 0.04638822004199028, -0.09233690798282623, 0.18276803195476532, 0.0743248388171196, 0.0009382286807522178, 0.00269380584359169, -0.3452315330505371, 0.03600373864173889, 0.12213043868541718, -0.1249900832772255, -0.044843342155218124 ]
null
null
transformers
# mBART-50 one to many multilingual machine translation

This model is a fine-tuned checkpoint of [mBART-large-50](https://huggingface.co/facebook/mbart-large-50). `mbart-large-50-one-to-many-mmt` is fine-tuned for multilingual machine translation. It was introduced in the [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401) paper.
The model can translate English to the other 49 languages listed below. To translate into a target language, the target language id must be forced as the first generated token; to do this, pass the `forced_bos_token_id` parameter to the `generate` method.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_en = "The head of the United Nations says there is no military solution in Syria"

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-one-to-many-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-one-to-many-mmt", src_lang="en_XX")

model_inputs = tokenizer(article_en, return_tensors="pt")

# translate from English to Hindi
generated_tokens = model.generate(
    **model_inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["hi_IN"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => 'संयुक्त राष्ट्र के नेता कहते हैं कि सीरिया में कोई सैन्य समाधान नहीं है'

# translate from English to Chinese
generated_tokens = model.generate(
    **model_inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["zh_CN"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => '联合国首脑说,叙利亚没有军事解决办法'
```

See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for more fine-tuned versions.

## Languages covered
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)

## BibTeX entry and citation info

```
@article{tang2020multilingual,
    title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
    author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
    year={2020},
    eprint={2008.00401},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
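Because the target language is chosen only at generation time, a single encoded English input can be decoded into several languages by varying `forced_bos_token_id`. A minimal sketch of that pattern, reusing the example sentence above (the particular target codes in the loop are illustrative):

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-one-to-many-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-one-to-many-mmt", src_lang="en_XX")

article_en = "The head of the United Nations says there is no military solution in Syria"
model_inputs = tokenizer(article_en, return_tensors="pt")

# decode the same source encoding into several target languages
for tgt in ["fr_XX", "de_DE", "ja_XX"]:
    generated_tokens = model.generate(
        **model_inputs,
        forced_bos_token_id=tokenizer.lang_code_to_id[tgt],
    )
    print(tgt, "->", tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)[0])
```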
{"language": ["multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl"], "tags": ["mbart-50"]}
text2text-generation
facebook/mbart-large-50-one-to-many-mmt
[ "transformers", "pytorch", "tf", "jax", "mbart", "text2text-generation", "mbart-50", "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl", "arxiv:2008.00401", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2008.00401" ]
[ "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl" ]
TAGS #transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us
# mBART-50 one to many multilingual machine translation This model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-one-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in the Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper. The model can translate English to the other 49 languages listed below. To translate into a target language, the target language id must be forced as the first generated token; to do this, pass the 'forced_bos_token_id' parameter to the 'generate' method. See the model hub to look for more fine-tuned versions. ## Languages covered Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI) ## BibTeX entry and citation info
[ "# mBART-50 one to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-one-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate English to other 49 languages mentioned below. \nTo translate into a target language, the target language id is forced as the first generated token. To force the\ntarget language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# mBART-50 one to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-one-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate English to other 49 languages mentioned below. \nTo translate into a target language, the target language id is forced as the first generated token. To force the\ntarget language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\nSee the model hub to look for more fine-tuned versions.", "## Languages covered\nArabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)", "## BibTeX entry and citation info" ]
[ 172, 171, 346, 10 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-50 one to many multilingual machine translation\n\n\nThis model is a fine-tuned checkpoint of mBART-large-50. 'mbart-large-50-one-to-many-mmt' is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nThe model can translate English to other 49 languages mentioned below. \nTo translate into a target language, the target language id is forced as the first generated token. To force the\ntarget language id as the first generated token, pass the 'forced_bos_token_id' parameter to the 'generate' method.\n\n\n\nSee the model hub to look for more fine-tuned versions." ]
[ -0.03981080278754234, -0.15676294267177582, -0.005402104929089546, 0.035035911947488785, 0.08540351688861847, 0.014509551227092743, 0.14548435807228088, 0.10097797214984894, -0.0769064873456955, 0.053682129830121994, 0.036309488117694855, 0.1702890247106552, 0.05554668605327606, 0.15590186417102814, 0.07328235357999802, -0.2980461120605469, 0.06596647202968597, -0.0660107284784317, 0.0490085706114769, 0.09847945719957352, 0.10376172512769699, -0.01657807268202305, 0.0951150432229042, -0.004257307853549719, -0.05099613219499588, 0.03218157961964607, 0.038418613374233246, -0.03633249178528786, 0.08919266611337662, 0.10647013783454895, -0.012242843396961689, 0.001646188902668655, 0.049751441925764084, -0.1957104355096817, 0.005251763388514519, 0.030019955709576607, -0.022780772298574448, -0.011787422932684422, 0.08050800859928131, -0.013623157516121864, 0.21206150949001312, -0.1335577815771103, -0.020597269758582115, 0.050317149609327316, -0.1118515208363533, -0.12610280513763428, -0.0691821277141571, 0.0861014574766159, 0.03660308197140694, 0.047017902135849, -0.062441643327474594, 0.11153864860534668, -0.01931474730372429, 0.12075579166412354, 0.13928258419036865, -0.3709494173526764, -0.04193917661905289, 0.12899631261825562, 0.0754823088645935, 0.0950433611869812, -0.025987083092331886, 0.054837048053741455, 0.015822818502783775, 0.048802729696035385, -0.006780519150197506, -0.07507627457380295, 0.032038092613220215, -0.010168069042265415, -0.1760374903678894, -0.0393402986228466, 0.08835475891828537, -0.036520734429359436, -0.04592718929052353, -0.11249133199453354, -0.08543115854263306, -0.014256073161959648, -0.032772112637758255, -0.07417044788599014, 0.01841878704726696, 0.05997776985168457, 0.08267507702112198, -0.06565538048744202, -0.0596541166305542, -0.021147703751921654, -0.13818280398845673, 0.13324899971485138, 0.012804748490452766, -0.0016795784467831254, -0.0167582668364048, 0.054614558815956116, -0.05327474698424339, -0.06236720085144043, -0.05688564479351044, -0.050173673778772354, -0.06606579571962357, -0.02843601256608963, -0.03285038098692894, -0.10937053710222244, 0.055907391011714935, 0.13206546008586884, -0.03992471098899841, 0.0627175122499466, -0.11069972068071365, 0.047826237976551056, 0.024670114740729332, 0.10251142829656601, -0.1205940693616867, -0.02441103383898735, 0.014658519998192787, -0.025584354996681213, -0.0059609729796648026, -0.02671119198203087, -0.08774841576814651, -0.03694620728492737, 0.05964784696698189, 0.08461999893188477, 0.030510885640978813, 0.1324649602174759, -0.0320584699511528, -0.04426628723740578, 0.006147717125713825, -0.16371722519397736, -0.0057052564807236195, -0.007022398058325052, 0.011360034346580505, 0.10976218432188034, -0.040191613137722015, -0.003963452763855457, -0.09329071640968323, -0.02436813712120056, -0.029049046337604523, 0.05843685194849968, -0.06405528634786606, -0.1383601427078247, 0.026846135035157204, -0.006693693809211254, -0.08771469444036484, -0.1310436725616455, -0.11049610376358032, -0.04585527628660202, 0.01492130197584629, -0.03175598755478859, 0.06631162017583847, -0.08750779181718826, -0.04002856835722923, 0.04951205477118492, -0.022865407168865204, 0.005895302630960941, -0.053260188549757004, 0.04590707644820213, -0.058060504496097565, 0.09876882284879684, 0.044540464878082275, -0.006507405079901218, -0.07032349705696106, 0.031075570732355118, -0.20015327632427216, 0.19029337167739868, -0.07453096657991409, -0.007082975469529629, -0.119384765625, -0.06171855330467224, -0.04834835231304169, 
0.06423745304346085, 0.01169650536030531, 0.17219021916389465, -0.20443235337734222, -0.03345479071140289, 0.2445095032453537, -0.12317585200071335, 0.003869607811793685, 0.10713422298431396, 0.0069769080728292465, 0.10860399901866913, 0.058035288006067276, 0.14654900133609772, -0.016940219327807426, -0.08521198481321335, 0.02123701572418213, 0.05230998620390892, -0.1212700828909874, 0.14183677732944489, 0.07688337564468384, -0.04067612066864967, -0.0012709592701867223, 0.008713101036846638, 0.0160934217274189, 0.02037625014781952, -0.04615747556090355, -0.04101670905947685, 0.023514578118920326, 0.05864974856376648, 0.026989780366420746, -0.030587680637836456, 0.025107938796281815, 0.0029123297426849604, -0.0414775051176548, 0.07844386994838715, 0.042551275342702866, -0.008329696953296661, 0.0732828751206398, -0.1378062516450882, -0.02785593643784523, -0.07804030179977417, 0.06808416545391083, -0.20188164710998535, -0.06970871984958649, -0.011697830632328987, 0.04135787487030029, 0.15056273341178894, 0.13310712575912476, 0.06874784827232361, 0.011591030284762383, -0.015285972505807877, 0.002582224551588297, 0.08664283156394958, -0.008564256131649017, -0.023372337222099304, -0.19166934490203857, -0.02002096362411976, -0.05997003614902496, 0.018509915098547935, -0.05244278535246849, 0.039466265588998795, 0.11288397014141083, 0.05132149159908295, -0.000284171401290223, 0.027161408215761185, 0.024471137672662735, 0.06181379407644272, -0.03312636911869049, -0.036734264343976974, 0.07795967906713486, -0.028087187558412552, -0.05514689162373543, 0.19539961218833923, -0.15863022208213806, 0.039134375751018524, 0.11785545945167542, -0.10520961135625839, -0.09996434301137924, -0.014541268348693848, -0.020280884578824043, -0.02080404944717884, 0.04916321486234665, -0.0570744127035141, 0.10629852861166, 0.03460164740681648, 0.0708802193403244, -0.08921036869287491, -0.03318316861987114, 0.002214722102507949, -0.0622403547167778, -0.05222277715802193, 0.09469803422689438, 0.008356069214642048, -0.25453048944473267, 0.08541971445083618, 0.06622926145792007, 0.0909620150923729, 0.1925601065158844, 0.015186049975454807, -0.05731729790568352, -0.054050616919994354, 0.07462602853775024, 0.02526174485683441, -0.04869689792394638, -0.006207847967743874, -0.04019699990749359, 0.024200191721320152, 0.07951251417398453, 0.01729407161474228, -0.07212777435779572, 0.04916960746049881, 0.002088209381327033, -0.09165932238101959, 0.04538355767726898, 0.08789076656103134, 0.007273949682712555, 0.11242596805095673, 0.02201703004539013, 0.014870903454720974, 0.006564619950950146, -0.016617238521575928, -0.1302490532398224, 0.1546081006526947, -0.16279065608978271, -0.22983185946941376, -0.13227815926074982, -0.08718588203191757, -0.09006203711032867, 0.01258001383394003, 0.032796524465084076, -0.09398417919874191, -0.023953314870595932, -0.06845785677433014, 0.10329236090183258, -0.03722817450761795, -0.046667005866765976, -0.03919578343629837, -0.022068267688155174, -0.0023352571297436953, -0.11893126368522644, -0.02881634421646595, 0.004436239134520292, -0.11463086307048798, 0.06765932589769363, -0.05572403222322464, 0.08808284252882004, 0.1293911188840866, 0.027583515271544456, 0.03184159845113754, -0.054080307483673096, 0.24304930865764618, -0.04168837517499924, 0.059407610446214676, 0.10382172465324402, 0.029369838535785675, 0.04585844278335571, 0.15001076459884644, 0.0208426546305418, -0.08298908174037933, -0.0021988647058606148, 0.015697041526436806, -0.04993857443332672, -0.17750714719295502, 
-0.0755821168422699, -0.07423289865255356, 0.025810062885284424, 0.006712716538459063, 0.07173728197813034, 0.0240112766623497, -0.013891673646867275, -0.01925264298915863, 0.043619390577077866, 0.0863889679312706, 0.09445743262767792, 0.1993246227502823, -0.04172420874238014, 0.09646683186292648, -0.029127977788448334, -0.03423609584569931, 0.10667504370212555, -0.007188728079199791, 0.06356658041477203, 0.04930325970053673, 0.12410382181406021, 0.08521676063537598, 0.054254982620477676, 0.03294077515602112, 0.10379809141159058, -0.035282645374536514, 0.02713516540825367, -0.03458830341696739, -0.0890616700053215, -0.062483854591846466, 0.06806744635105133, 0.047296177595853806, -0.051521170884370804, 0.018752722069621086, -0.006791889667510986, 0.10842982679605484, 0.12120210379362106, -0.01738107204437256, -0.15696975588798523, -0.012251499108970165, 0.017726419493556023, -0.018559295684099197, -0.05290049687027931, 0.013025972060859203, 0.05761236697435379, -0.10732618719339371, 0.10349370539188385, -0.02030227519571781, 0.07579927891492844, -0.06783583015203476, 0.039083633571863174, 0.013607319444417953, 0.0809212401509285, -0.0047738258726894855, 0.06733870506286621, -0.29572078585624695, 0.14446058869361877, 0.06866712123155594, 0.025736020877957344, -0.044457998126745224, 0.0094828512519598, 0.039771582931280136, 0.08837063610553741, 0.147201269865036, 0.04612920433282852, -0.14702348411083221, -0.18908071517944336, -0.032242681831121445, -0.03261407092213631, 0.16695952415466309, 0.006693311035633087, 0.06359895318746567, -0.024218153208494186, -0.01149835530668497, -0.05230865627527237, 0.08204176276922226, -0.10614113509654999, -0.09547356516122818, 0.08758605271577835, 0.04659128189086914, 0.0884215384721756, -0.0006149844848550856, -0.04186493903398514, -0.05942708998918533, 0.18947745859622955, -0.07680565118789673, -0.0707971379160881, -0.1174946203827858, -0.020875437185168266, 0.09612371027469635, -0.097098708152771, 0.013950676657259464, -0.015611868351697922, 0.032299067825078964, -0.030699167400598526, -0.11060319095849991, 0.09702705591917038, -0.06057729199528694, -0.08307848125696182, 0.007260560989379883, 0.10290981084108353, -0.005166363436728716, 0.04757275804877281, 0.007060260511934757, 0.026312490925192833, 0.04401533678174019, -0.1265566200017929, -0.045380476862192154, 0.12608619034290314, -0.048286646604537964, 0.050979964435100555, -0.15695680677890778, -0.1562511920928955, -0.08048204332590103, -0.03848251327872276, 0.14535263180732727, 0.21891535818576813, -0.06882509589195251, 0.10095006227493286, 0.14426696300506592, -0.109401635825634, -0.1957636922597885, -0.05745226889848709, -0.02385825477540493, 0.025063756853342056, -0.029531316831707954, -0.14442382752895355, -0.022087451070547104, -0.01233623642474413, 0.027545876801013947, 0.048854678869247437, -0.26708459854125977, -0.13341686129570007, 0.03228357434272766, 0.005426367744803429, 0.11884178221225739, -0.10700113326311111, -0.09225872904062271, -0.011022970080375671, -0.07405845075845718, 0.019156241789460182, -0.06096328794956207, 0.10609234124422073, 0.05007317662239075, 0.037166159600019455, 0.023371215909719467, 0.00018881635332945734, 0.07387128472328186, 0.07007163763046265, 0.016210511326789856, -0.11131329834461212, -0.013316679745912552, 0.061817120760679245, 0.028785116970539093, 0.1308126598596573, 0.01198464073240757, -0.020058536902070045, -0.11067581176757812, -0.04833796247839928, -0.09397226572036743, 0.046337321400642395, -0.036146316677331924, -0.09116347879171371, 
-0.04000317305326462, 0.04147522151470184, 0.04519028216600418, -0.03356821835041046, -0.04222424700856209, -0.13723766803741455, 0.05022158473730087, 0.13913623988628387, 0.14344564080238342, -0.06931581348180771, 0.02648210898041725, -0.018912097439169884, -0.021572081372141838, 0.04654070362448692, -0.011525181122124195, -0.001600470277480781, 0.11083690077066422, -0.0292439553886652, 0.11977867782115936, 0.04273267090320587, -0.08129682391881943, 0.025899730622768402, 0.06980123370885849, -0.10593824088573456, -0.11964599043130875, 0.008396715857088566, -0.05155627056956291, 0.061469197273254395, 0.008308786898851395, 0.18801477551460266, -0.030317272990942, -0.014443285763263702, 0.016627822071313858, 0.01955346018075943, -0.059820253401994705, 0.08907541632652283, 0.012903669849038124, 0.020422104746103287, -0.07779491692781448, 0.04228396341204643, 0.03728484734892845, -0.12058734148740768, 0.027034226804971695, 0.17425984144210815, -0.12982314825057983, -0.09285009652376175, -0.00885474681854248, 0.10885255038738251, -0.09035614132881165, -0.00032061972888186574, 0.016104809939861298, -0.07895108312368393, 0.0683995708823204, 0.17003732919692993, 0.03145528584718704, 0.03835417702794075, -0.03697150945663452, -0.0021370036993175745, 0.03870280459523201, 0.02961164526641369, 0.011272107250988483, -0.02299555018544197, -0.03184418007731438, 0.16461113095283508, 0.020670175552368164, 0.0900324359536171, -0.05520064756274223, -0.0474260151386261, -0.08539886772632599, -0.008489258587360382, 0.001888469560071826, -0.012674528174102306, -0.11880353838205338, -0.0056465743109583855, 0.009224041365087032, -0.04406043887138367, 0.008551163598895073, -0.008920419029891491, -0.08270350098609924, -0.004903741646558046, -0.0724937915802002, 0.07271310687065125, -0.12319847196340561, -0.01740078255534172, -0.005837395321577787, -0.03520752489566803, 0.09403914958238602, 0.05224239453673363, -0.08932969719171524, 0.06045832484960556, -0.07226148247718811, 0.022106053307652473, 0.027372591197490692, 0.01665210723876953, 0.015603609383106232, -0.04528886079788208, 0.018027659505605698, 0.031729355454444885, 0.03337842598557472, 0.03766907751560211, 0.018946204334497452, -0.0652213841676712, 0.11435156315565109, -0.043824851512908936, -0.0624917671084404, -0.05912567675113678, 0.060508809983730316, 0.06340249627828598, 0.015330938622355461, 0.09035000205039978, -0.10973259061574936, 0.0523667111992836, -0.12156824767589569, -0.013791264034807682, 0.017387481406331062, -0.07240639626979828, -0.023729024454951286, -0.10232704132795334, 0.053720470517873764, -0.017324769869446754, 0.09832097589969635, 0.050517432391643524, -0.0011232455726712942, 0.013428080826997757, -0.06464409828186035, -0.08805117756128311, 0.015859734266996384, 0.06301743537187576, 0.049401506781578064, 0.008517059497535229, -0.0626428872346878, 0.01984667219221592, -0.018104439601302147, 0.018662957474589348, 0.11827383935451508, 0.08760477602481842, 0.1422056406736374, 0.08592899888753891, 0.029519759118556976, -0.030040422454476357, -0.14474520087242126, -0.028898563235998154, -0.07285342365503311, 0.03830336034297943, -0.07723821699619293, 0.16798166930675507, 0.16191641986370087, -0.12868019938468933, 0.07484214752912521, 0.011982440017163754, -0.042662397027015686, -0.130768284201622, -0.192876398563385, -0.047390345484018326, -0.07049089670181274, -0.00589059479534626, -0.10256227850914001, 0.09614666551351547, -0.00210562814027071, 0.09019425511360168, 0.044023435562849045, 0.15554845333099365, -0.13996483385562897, 
-0.0947125107049942, 0.030990784987807274, -0.028172297403216362, 0.044881440699100494, -0.034357450902462006, -0.009205869399011135, -0.006153747905045748, -0.030708301812410355, 0.024965576827526093, 0.07607852667570114, 0.027141714468598366, -0.023428672924637794, -0.11389952898025513, -0.058032698929309845, -0.008978644385933876, 0.03288691118359566, 0.008145986124873161, 0.10352730005979538, 0.06967375427484512, -0.12351471185684204, 0.0029650761280208826, 0.03232446312904358, -0.008261378854513168, -0.22723791003227234, -0.08967731893062592, 0.24798056483268738, -0.00015488396456930786, 0.078915536403656, -0.08390377461910248, -0.03125830739736557, -0.02417673170566559, 0.2080768346786499, 0.21106766164302826, -0.06524287164211273, 0.01143720280379057, 0.009045423939824104, 0.02670520916581154, 0.039241112768650055, 0.10237835347652435, 0.03646602854132652, 0.22121088206768036, -0.04952510818839073, 0.06859149038791656, -0.0736275166273117, -0.009835821576416492, -0.03186819702386856, 0.042174264788627625, -0.023030590265989304, -0.00643998384475708, -0.029058774933218956, 0.0909276157617569, -0.1120373085141182, -0.011074301786720753, 0.028996555134654045, -0.037266526371240616, -0.02775725908577442, -0.004290214739739895, 0.0185890831053257, 0.05554013326764107, 0.03819524496793747, -0.004592609591782093, -0.011213492602109909, 0.02024611085653305, 0.014611240476369858, -0.0959002822637558, -0.03591909632086754, 0.06357812881469727, 0.01132944505661726, 0.043822187930345535, -0.019788039848208427, 0.085493303835392, 0.10505084693431854, 0.013301133178174496, -0.08642719686031342, 0.07411748170852661, 0.0024793276097625494, -0.029219865798950195, 0.00622003898024559, 0.042236436158418655, 0.013238129206001759, 0.05867829918861389, 0.02664903551340103, -0.102383092045784, 0.10831855982542038, 0.10127510875463486, -0.001604994758963585, -0.043849438428878784, 0.06378918886184692, -0.09527523815631866, 0.118827685713768, 0.15293996036052704, 0.05534360557794571, -0.0028498873580247164, -0.08493108302354813, 0.013620978221297264, -0.03355678170919418, 0.029014872387051582, -0.0748523622751236, -0.15109801292419434, -0.015156541019678116, -0.011198937892913818, 0.035746220499277115, -0.16270776093006134, -0.1326281726360321, 0.0006771667976863682, 0.03795216977596283, -0.09494227170944214, 0.14686724543571472, 0.02571745775640011, 0.023064885288476944, 0.0022232423070818186, -0.20852480828762054, 0.05970006063580513, 0.15159763395786285, -0.1182892769575119, -0.033039677888154984 ]
null
null
transformers
# mBART-50

mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).

## Model description

mBART-50 is a multilingual Sequence-to-Sequence model. It was introduced to show that multilingual translation models can be created through multilingual fine-tuning: instead of fine-tuning on one direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 was created by extending the original mBART model with 25 additional languages, supporting multilingual machine translation across 50 languages. The pre-training objective is explained below.

**Multilingual Denoising Pretraining**: The model incorporates N languages by concatenating data: `D = {D1, ..., DN}`, where each `Di` is a collection of monolingual documents in language `i`. The source documents are noised using two schemes: first, randomly shuffling the order of the original sentences; and second, a novel in-filling scheme in which spans of text are replaced with a single mask token. The model is then tasked with reconstructing the original text. 35% of each instance's words are masked by sampling span lengths from a Poisson distribution (`λ = 3.5`). The decoder input is the original text offset by one position. A language id symbol `LID` is used as the initial token to predict the sentence.

## Intended uses & limitations

`mbart-large-50` is a pre-trained model primarily intended to be fine-tuned on translation tasks. It can also be fine-tuned on other multilingual sequence-to-sequence tasks. See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for fine-tuned versions.

## Training

Because the model is multilingual, it expects sequences in a specific format: a special language id token is used as a prefix in both the source and target text. The text format is `[lang_code] X [eos]`, where `X` is the source or target text and `lang_code` is `source_lang_code` for source text and `tgt_lang_code` for target text. `bos` is never used. Once examples are prepared in this format, the model can be trained like any other sequence-to-sequence model; the card's fine-tuning snippet follows the illustration below.
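To make the pretraining objective concrete, here is a toy sketch of the two noising schemes described above (sentence shuffling plus Poisson-length span in-filling). This is an illustration under stated assumptions, not the original fairseq preprocessing: `noise_document` and all of its defaults are invented for this example.

```python
import random
import numpy as np

def noise_document(sentences, mask_token="<mask>", mask_ratio=0.35, poisson_lambda=3.5):
    """Toy version of mBART's denoising: shuffle the sentences, then replace
    word spans (lengths ~ Poisson(3.5)) with a single mask token until roughly
    ``mask_ratio`` of the words have been masked."""
    # Scheme 1: randomly shuffle the order of the original sentences.
    shuffled = random.sample(sentences, k=len(sentences))
    words = " ".join(shuffled).split()

    # Scheme 2: span in-filling -- each masked span becomes ONE mask token.
    budget = int(len(words) * mask_ratio)
    while budget > 0 and len(words) > 1:
        span = min(max(int(np.random.poisson(poisson_lambda)), 1), budget)
        start = random.randrange(max(len(words) - span, 1))
        words[start:start + span] = [mask_token]
        budget -= span
    return " ".join(words)

print(noise_document(["The cat sat on the mat.", "It was warm.", "Nobody moved."]))
```

The model is then trained to reconstruct the original (unshuffled, unmasked) text from such a noised input.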
```python from transformers import MBartForConditionalGeneration, MBart50TokenizerFast model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50") tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50", src_lang="en_XX", tgt_lang="ro_RO") src_text = " UN Chief Says There Is No Military Solution in Syria" tgt_text = "Şeful ONU declară că nu există o soluţie militară în Siria" model_inputs = tokenizer(src_text, return_tensors="pt") with tokenizer.as_target_tokenizer(): labels = tokenizer(tgt_text, return_tensors="pt").input_ids model(**model_inputs, labels=labels) # forward pass ``` ## Languages covered Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI) ## BibTeX entry and citation info ``` @article{tang2020multilingual, title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning}, author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan}, year={2020}, eprint={2008.00401}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
{"language": ["multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl"], "license": "mit", "tags": ["mbart-50"]}
text2text-generation
facebook/mbart-large-50
[ "transformers", "pytorch", "tf", "mbart", "text2text-generation", "mbart-50", "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl", "arxiv:2008.00401", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2008.00401" ]
[ "multilingual", "ar", "cs", "de", "en", "es", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "af", "az", "bn", "fa", "he", "hr", "id", "ka", "km", "mk", "ml", "mn", "mr", "pl", "ps", "pt", "sv", "sw", "ta", "te", "th", "tl", "uk", "ur", "xh", "gl", "sl" ]
TAGS #transformers #pytorch #tf #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
(text_str, text_lists, processed_texts columns: verbatim duplicates of the mBART-50 model card above, elided)
[ 174, 65, 298, 78, 135, 346, 10 ]
[ "passage: TAGS\n#transformers #pytorch #tf #mbart #text2text-generation #mbart-50 #multilingual #ar #cs #de #en #es #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #af #az #bn #fa #he #hr #id #ka #km #mk #ml #mn #mr #pl #ps #pt #sv #sw #ta #te #th #tl #uk #ur #xh #gl #sl #arxiv-2008.00401 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-50\n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.", "passage: ## Model description\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was introduced to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\n\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence.## Intended uses & limitations\n\n'mbart-large-50' is pre-trained model and primarily aimed at being fine-tuned on translation tasks. It can also be fine-tuned on other multilingual sequence-to-sequence tasks. \nSee the model hub to look for fine-tuned versions.## Training\n\nAs the model is multilingual, it expects the sequences in a different format. A special language id token is used as a prefix in both the source and target text. The text format is '[lang_code] X [eos]' with 'X' being the source or target text respectively and 'lang_code' is 'source_lang_code' for source text and 'tgt_lang_code' for target text. 'bos' is never used. Once the examples are prepared in this format, it can be trained as any other sequence-to-sequence model." ]
(embeddings column: 768-dimensional float vector, elided)
null
null
transformers
#### mbart-large-cc25

Pretrained (not fine-tuned) multilingual mBART model.

Original Languages:

```bash
export langs=ar_AR,cs_CZ,de_DE,en_XX,es_XX,et_EE,fi_FI,fr_XX,gu_IN,hi_IN,it_IT,ja_XX,kk_KZ,ko_KR,lt_LT,lv_LV,my_MM,ne_NP,nl_XX,ro_RO,ru_RU,si_LK,tr_TR,vi_VN,zh_CN
```

Original Code: https://github.com/pytorch/fairseq/tree/master/examples/mbart

Docs: https://huggingface.co/transformers/master/model_doc/mbart.html

Finetuning Code: examples/seq2seq/finetune.py (as of Aug 20, 2020)

It can also be fine-tuned for summarization. A minimal usage sketch follows below.
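Since the card itself ships no usage snippet, here is a minimal fine-tuning forward-pass sketch, mirroring the convention used on the mBART-50 card. It assumes the standard `transformers` mBART API of that era; the example sentence pair is borrowed from the mBART-50 card and is illustrative only.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Language codes are passed to the tokenizer so it adds the right
# [lang_code] ... [eos] markers around the source and target text.
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-cc25", src_lang="en_XX", tgt_lang="ro_RO"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

src_text = "UN Chief Says There Is No Military Solution in Syria"
tgt_text = "Şeful ONU declară că nu există o soluţie militară în Siria"

model_inputs = tokenizer(src_text, return_tensors="pt")
with tokenizer.as_target_tokenizer():
    labels = tokenizer(tgt_text, return_tensors="pt").input_ids

loss = model(**model_inputs, labels=labels).loss  # fine-tuning forward pass
```

Note that because this checkpoint is only pre-trained, its generations are not meaningful until the model has been fine-tuned.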
{"language": ["en", "ar", "cs", "de", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "multilingual"], "tags": ["translation"]}
translation
facebook/mbart-large-cc25
[ "transformers", "pytorch", "tf", "mbart", "text2text-generation", "translation", "en", "ar", "cs", "de", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "multilingual", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en", "ar", "cs", "de", "et", "fi", "fr", "gu", "hi", "it", "ja", "kk", "ko", "lt", "lv", "my", "ne", "nl", "ro", "ru", "si", "tr", "vi", "zh", "multilingual" ]
TAGS #transformers #pytorch #tf #mbart #text2text-generation #translation #en #ar #cs #de #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #multilingual #autotrain_compatible #endpoints_compatible #has_space #region-us
(text_str, text_lists, processed_texts columns: verbatim duplicates of the mbart-large-cc25 card above, elided)
[ 101, 71 ]
[ "passage: TAGS\n#transformers #pytorch #tf #mbart #text2text-generation #translation #en #ar #cs #de #et #fi #fr #gu #hi #it #ja #kk #ko #lt #lv #my #ne #nl #ro #ru #si #tr #vi #zh #multilingual #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### mbart-large-cc25\n\nPretrained (not finetuned) multilingual mbart model.\nOriginal Languages\n\n\nOriginal Code: URL\nDocs: URL\nFinetuning Code: examples/seq2seq/URL (as of Aug 20, 2020)\n\nCan also be finetuned for summarization." ]
(embeddings column: 768-dimensional float vector, elided)
null
null
transformers
### mbart-large-en-ro

This is mbart-large-cc25, fine-tuned on wmt_en_ro.

It scores BLEU 28.1 without post-processing and BLEU 38 with post-processing; instructions are in `romanian_postprocessing.md`.

Original Code: https://github.com/pytorch/fairseq/tree/master/examples/mbart

Docs: https://huggingface.co/transformers/master/model_doc/mbart.html

Finetuning Code: examples/seq2seq/finetune.py (as of Aug 20, 2020)

A minimal translation sketch follows below.
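The card reports scores but no usage snippet, so here is a minimal translation sketch. It follows the usual mBART decoding convention of starting generation with the target language id; the exact outputs (and the BLEU numbers above) come from the original evaluation pipeline, not from this snippet.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained("facebook/mbart-large-en-ro")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-en-ro")

inputs = tokenizer("UN Chief Says There Is No Military Solution in Syria",
                   return_tensors="pt")

# mBART decodes with the target language id as the first generated token.
generated = model.generate(
    **inputs, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"]
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```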
{"language": ["en", "ro"], "license": "mit", "tags": ["translation"]}
translation
facebook/mbart-large-en-ro
[ "transformers", "pytorch", "tf", "safetensors", "mbart", "translation", "en", "ro", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en", "ro" ]
TAGS #transformers #pytorch #tf #safetensors #mbart #translation #en #ro #license-mit #endpoints_compatible #has_space #region-us
(text_str, text_lists, processed_texts columns: verbatim duplicates of the mbart-large-en-ro card above, elided)
[ 48, 99 ]
[ "passage: TAGS\n#transformers #pytorch #tf #safetensors #mbart #translation #en #ro #license-mit #endpoints_compatible #has_space #region-us \n### mbart-large-en-ro\nThis is mbart-large-cc25, finetuned on wmt_en_ro.\n\nIt scores BLEU 28.1 without post processing and BLEU 38 with postprocessing. Instructions in 'romanian_postprocessing.md'\n\nOriginal Code: URL\n\nDocs: URL\n\nFinetuning Code: examples/seq2seq/URL (as of Aug 20, 2020)" ]
[768-dimensional embedding vector omitted for readability]
null
null
transformers
# Muppet: Massive Multi-task Representations with Pre-Finetuning

# RoBERTa base model

This is a Massive Multi-task Pre-finetuned version of RoBERTa base. It was introduced in [this paper](https://arxiv.org/abs/2101.11038). The model improves over roberta-base on a wide range of GLUE and QA tasks (details can be found in the paper). The gains on smaller datasets are significant.

Note: This checkpoint does not contain the classification/MRC heads used during pre-finetuning due to compatibility issues, so you might get slightly lower performance than that reported in the paper on some datasets.

## Model description

RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts.

More precisely, it was pretrained with the Masked Language Modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the model as inputs.

## Intended uses & limitations

You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering. For tasks such as text generation you should look at models like GPT-2.

## Evaluation results

When fine-tuned on downstream tasks, this model achieves the following results:

GLUE test results:

| Model | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | SQuAD |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:----:|
| Roberta-base | 87.6 | 91.9 | 92.8 | 94.8 | 63.6 | 91.2 | 90.2 | 78.7 | 82.6 |
| MUPPET Roberta-base | 88.1 | 91.9 | 93.3 | 96.7 | - | - | 91.7 | 87.8 | 86.6 |

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2101-11038,
  author    = {Armen Aghajanyan and
               Anchit Gupta and
               Akshat Shrivastava and
               Xilun Chen and
               Luke Zettlemoyer and
               Sonal Gupta},
  title     = {Muppet: Massive Multi-task Representations with Pre-Finetuning},
  journal   = {CoRR},
  volume    = {abs/2101.11038},
  year      = {2021},
  url       = {https://arxiv.org/abs/2101.11038},
  archivePrefix = {arXiv},
  eprint    = {2101.11038},
  timestamp = {Sun, 31 Jan 2021 17:23:50 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2101-11038.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
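The card does not show a usage snippet; since the checkpoint's pipeline tag is fill-mask, here is a minimal sketch (the prompt sentence is an illustrative assumption, not part of the original card):

```python
from transformers import pipeline

# Load the pre-finetuned checkpoint behind a fill-mask pipeline
unmasker = pipeline("fill-mask", model="facebook/muppet-roberta-base")

# RoBERTa-style tokenizers use <mask> as the mask token; the sentence is illustrative
print(unmasker("The goal of life is <mask>."))
```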
{"language": "en", "license": "mit", "tags": ["exbert"], "datasets": ["bookcorpus", "wikipedia"]}
fill-mask
facebook/muppet-roberta-base
[ "transformers", "pytorch", "roberta", "fill-mask", "exbert", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2101.11038", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.11038" ]
[ "en" ]
TAGS #transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us
Muppet: Massive Multi-task Representations with Pre-Finetuning ============================================================== RoBERTa base model ================== This is a Massive Multi-task Pre-finetuned version of RoBERTa base. It was introduced in this paper. The model improves over roberta-base in a wide range of GLUE, QA tasks (details can be found in the paper). The gains in smaller datasets are significant. Note: This checkpoint does not contain the classification/MRC heads used during pre-finetuning due to compatibility issues and hence you might get slightly lower performance than that reported in the paper on some datasets Model description ----------------- RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the model as inputs. Intended uses & limitations --------------------------- You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2. Evaluation results ------------------ When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: ### BibTeX entry and citation info
[ "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### BibTeX entry and citation info" ]
[ 68, 11 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### BibTeX entry and citation info" ]
[768-dimensional embedding vector omitted for readability]
null
null
transformers
# Muppet: Massive Multi-task Representations with Pre-Finetuning

# RoBERTa large model

This is a Massive Multi-task Pre-finetuned version of RoBERTa large. It was introduced in [this paper](https://arxiv.org/abs/2101.11038). The model improves over roberta-large on a wide range of GLUE and QA tasks (details can be found in the paper). The gains on smaller datasets are significant.

Note: This checkpoint does not contain the classification/MRC heads used during pre-finetuning due to compatibility issues, so you might get slightly lower performance than that reported in the paper on some datasets.

## Model description

RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts.

More precisely, it was pretrained with the Masked Language Modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the model as inputs.

## Intended uses & limitations

You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering. For tasks such as text generation you should look at models like GPT-2.

## Evaluation results

When fine-tuned on downstream tasks, this model achieves the following results:

GLUE test results:

| Model | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | SQuAD |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:----:|
| Roberta-large | 90.2 | 92.2 | 94.7 | 96.4 | 63.6 | 91.2 | 90.9 | 88.1 | 88.7 |
| MUPPET Roberta-large | 90.8 | 92.2 | 94.9 | 97.4 | - | - | 91.4 | 92.8 | 89.4 |

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2101-11038,
  author    = {Armen Aghajanyan and
               Anchit Gupta and
               Akshat Shrivastava and
               Xilun Chen and
               Luke Zettlemoyer and
               Sonal Gupta},
  title     = {Muppet: Massive Multi-task Representations with Pre-Finetuning},
  journal   = {CoRR},
  volume    = {abs/2101.11038},
  year      = {2021},
  url       = {https://arxiv.org/abs/2101.11038},
  archivePrefix = {arXiv},
  eprint    = {2101.11038},
  timestamp = {Sun, 31 Jan 2021 17:23:50 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2101-11038.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
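Since the card describes using the pretrained representation as a feature extractor, here is a minimal sketch with the standard RoBERTa classes (the input sentence is an illustrative assumption, not part of the original card):

```python
import torch
from transformers import RobertaTokenizer, RobertaModel

tokenizer = RobertaTokenizer.from_pretrained("facebook/muppet-roberta-large")
model = RobertaModel.from_pretrained("facebook/muppet-roberta-large")

# Illustrative input; any English text works
inputs = tokenizer("Replace me with any text you'd like.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One hidden vector per input token: shape (batch, seq_len, 1024) for the large model
features = outputs.last_hidden_state
```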
{"language": "en", "license": "mit", "tags": ["exbert"], "datasets": ["bookcorpus", "wikipedia"]}
fill-mask
facebook/muppet-roberta-large
[ "transformers", "pytorch", "roberta", "fill-mask", "exbert", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2101.11038", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.11038" ]
[ "en" ]
TAGS #transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us
Muppet: Massive Multi-task Representations with Pre-Finetuning ============================================================== RoBERTa large model =================== This is a Massive Multi-task Pre-finetuned version of RoBERTa large. It was introduced in this paper. The model improves over roberta-large in a wide range of GLUE, QA tasks (details can be found in the paper). The gains in smaller datasets are significant. Note: This checkpoint does not contain the classification/MRC heads used during pre-finetuning due to compatibility issues and hence you might get slightly lower performance than that reported in the paper on some datasets Model description ----------------- RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the model as inputs. Intended uses & limitations --------------------------- You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2. Evaluation results ------------------ When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: ### BibTeX entry and citation info
[ "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### BibTeX entry and citation info" ]
[ 68, 11 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #exbert #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2101.11038 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### BibTeX entry and citation info" ]
[768-dimensional embedding vector omitted for readability (truncated in the source)]
0.05313551425933838, -0.010861430317163467, -0.11153963208198547, 0.019458606839179993, -0.1154119223356247, 0.000435743888374418, -0.027256567031145096, 0.012268027290701866, 0.008881486020982265, 0.0006826624739915133, -0.05417553335428238, -0.08913665264844894, -0.04123575612902641, 0.03908904641866684, 0.04546458646655083, 0.1700303852558136, 0.0014763918006792665, 0.06590129435062408, 0.07632262259721756, 0.135235995054245, -0.002161417854949832, -0.08549721539020538, -0.0471348837018013, 0.12284126877784729, 0.021677201613783836, 0.031069019809365273, -0.00814750138670206, -0.00421826122328639, 0.03367483988404274, 0.2550617456436157, 0.3459153473377228, -0.05050461366772652, 0.04297281801700592, 0.06871242076158524, 0.019471734762191772, 0.11150432378053665, 0.07239623367786407, 0.07956194877624512, 0.2708892822265625, -0.07347992062568665, -0.04271399974822998, -0.04926924407482147, 0.015096311457455158, -0.019780561327934265, 0.11003462970256805, 0.023920651525259018, -0.04444979131221771, -0.031138649210333824, 0.0649707019329071, -0.07521025091409683, -0.10431995987892151, 0.02995087392628193, -0.23597191274166107, -0.09926942735910416, 0.0010395467979833484, -0.004738012794405222, -0.0030807594303041697, 0.024912044405937195, -0.02558859810233116, -0.0186910480260849, 0.04383012279868126, -0.007151284720748663, -0.1988699585199356, -0.04649725183844566, 0.06988444179296494, 0.017502401024103165, 0.1507587432861328, -0.04910067468881607, 0.06268853694200516, 0.09490171819925308, 0.06293030083179474, -0.06462746858596802, 0.04124947264790535, 0.04945201799273491, 0.0060548847541213036, 0.05348680168390274, -0.011746925301849842, -0.0029036987107247114, -0.09844294935464859, 0.0938619077205658, -0.019926365464925766, 0.050631944090127945, -0.013630030676722527, -0.11712636053562164, -0.03709704801440239, 0.11425577849149704, -0.09400331228971481, 0.08842180669307709, 0.05724945291876793, -0.031480297446250916, -0.045162227004766464, -0.05257708951830864, 0.014572283253073692, 0.10266954451799393, -0.03199223801493645, -0.05714375153183937, -0.0625900998711586, -0.023151880130171776, -0.012255989946424961, 0.02945804037153721, -0.30636340379714966, -0.018227489665150642, -0.13445259630680084, -0.025983460247516632, -0.1690179854631424, 0.05630270391702652, 0.046598806977272034, 0.004585597664117813, -0.026686960831284523, -0.058132968842983246, 0.010357742197811604, 0.004231857135891914, -0.08876820653676987, -0.07725545018911362 ]
null
null
transformers
## RAG

This is a non-finetuned version of the RAG-Sequence model of the paper [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks](https://arxiv.org/pdf/2005.11401.pdf) by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al.

RAG consists of a *question encoder*, a *retriever* and a *generator*. The retriever should be a `RagRetriever` instance. The *question encoder* can be any model that can be loaded with `AutoModel` and the *generator* can be any model that can be loaded with `AutoModelForSeq2SeqLM`.

This model is a non-finetuned RAG-Sequence model and was created as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration, AutoTokenizer

model = RagSequenceForGeneration.from_pretrained_question_encoder_generator("facebook/dpr-question_encoder-single-nq-base", "facebook/bart-large")

question_encoder_tokenizer = AutoTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
generator_tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large")

tokenizer = RagTokenizer(question_encoder_tokenizer, generator_tokenizer)
model.config.use_dummy_dataset = True
model.config.index_name = "exact"
retriever = RagRetriever(model.config, question_encoder_tokenizer, generator_tokenizer)

model.save_pretrained("./")
tokenizer.save_pretrained("./")
retriever.save_pretrained("./")
```

Note that the model is *uncased* so that all capital input letters are converted to lower-case.

## Usage:

*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, by setting `config.index_name="legacy"` and `config.use_dummy_dataset=False`. The model can be fine-tuned as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-base")
retriever = RagRetriever.from_pretrained("facebook/rag-sequence-base")
model = RagSequenceForGeneration.from_pretrained("facebook/rag-sequence-base", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("who holds the record in 100m freestyle", "michael phelps", return_tensors="pt")
outputs = model(input_dict["input_ids"], labels=input_dict["labels"])
loss = outputs.loss  # train on loss
```
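Once fine-tuning has produced usable weights (the snippet above only computes a single loss), generation works the same way as for the finetuned RAG checkpoints. A minimal sketch, assuming the default *dummy* retriever; the question string is an arbitrary example, the `generate`/`batch_decode` calls mirror the other RAG cards, and answers from non-finetuned weights will be rough:

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-base")
# Dummy index keeps memory small; retrieval quality is correspondingly poor.
retriever = RagRetriever.from_pretrained("facebook/rag-sequence-base", index_name="exact", use_dummy_dataset=True)
model = RagSequenceForGeneration.from_pretrained("facebook/rag-sequence-base", retriever=retriever)

# Arbitrary example question; the model is uncased, so input is lower-cased anyway.
input_dict = tokenizer.prepare_seq2seq_batch("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Swapping in the full retriever (see the note above) is the main lever for answer quality.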
{"license": "apache-2.0", "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
null
facebook/rag-sequence-base
[ "transformers", "pytorch", "rag", "arxiv:2005.11401", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.11401" ]
[]
TAGS #transformers #pytorch #rag #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us
## RAG This is a non-finetuned version of the RAG-Sequence model of the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al. RAG consists of a *question encoder*, a *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. This model is a non-finetuned RAG-Sequence model and was created as follows: Note that the model is *uncased* so that all capital input letters are converted to lower-case. ## Usage: *Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, by setting 'config.index_name="legacy"' and 'config.use_dummy_dataset=False'. The model can be fine-tuned as follows:
[ "## RAG\n\nThis is a non-finetuned version of the RAG-Sequence model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Sequence model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.", "## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ "TAGS\n#transformers #pytorch #rag #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us \n", "## RAG\n\nThis is a non-finetuned version of the RAG-Sequence model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Sequence model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.", "## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ 39, 198, 78 ]
[ "passage: TAGS\n#transformers #pytorch #rag #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us \n## RAG\n\nThis is a non-finetuned version of the RAG-Sequence model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Sequence model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ -0.022377610206604004, 0.11611958593130112, -0.007137025706470013, 0.07519477605819702, 0.14541487395763397, 0.06147848442196846, 0.040463317185640335, 0.08979851752519608, 0.005921249743551016, 0.04894102364778519, 0.04288352653384209, 0.1108347624540329, 0.012815197929739952, 0.0786370262503624, -0.013054479844868183, -0.1656736135482788, 0.0355507954955101, -0.05518538877367973, 0.09333889186382294, 0.04855166748166084, 0.08807171881198883, -0.046820707619190216, 0.0449521467089653, -0.020439254119992256, -0.03228157386183739, 0.0289979986846447, 0.01605621911585331, 0.012174369767308235, 0.043169595301151276, 0.04138588160276413, 0.09638363122940063, 0.025399213656783104, 0.059134818613529205, -0.13014666736125946, 0.022215431556105614, 0.06282191723585129, -0.004343412350863218, 0.030578002333641052, 0.012513354420661926, 0.012054507620632648, 0.10672936588525772, 0.005269606597721577, 0.03340282291173935, 0.057890765368938446, -0.07641010731458664, -0.12208987772464752, -0.0628180056810379, 0.05303317308425903, 0.05968631058931351, 0.0547654926776886, -0.00725829740986228, 0.07739858329296112, 0.019999803975224495, 0.047736771404743195, 0.07853482663631439, -0.2001463919878006, -0.004603590816259384, 0.1413882076740265, -0.044787365943193436, 0.035123396664857864, -0.03591137006878853, -0.04430346190929413, -0.051175691187381744, 0.018012333661317825, 0.02757473848760128, -0.04184163361787796, -0.11727867275476456, -0.03744776174426079, -0.09136863797903061, -0.018316423520445824, 0.1632937490940094, -0.032236117869615555, -0.05611832067370415, -0.04616139456629753, -0.07533058524131775, -0.01896650716662407, -0.0098704369738698, -0.08160316199064255, 0.004567316267639399, 0.08269219100475311, 0.04244910925626755, -0.09803149104118347, -0.07471753656864166, -0.028379077091813087, -0.10923795402050018, 0.029421666637063026, 0.01004250068217516, 0.007858404889702797, -0.04139360412955284, 0.09302712976932526, -0.09993217140436172, -0.06543085724115372, -0.1036694273352623, -0.04818689823150635, -0.08023709803819656, -0.05600563436746597, -0.026534711942076683, -0.05393526703119278, 0.05135876685380936, 0.16974425315856934, 0.00978637021034956, -0.011812115088105202, -0.03551265224814415, 0.045450687408447266, 0.04292891174554825, 0.07526520639657974, -0.08823341131210327, 0.04513275623321533, 0.08815263211727142, -0.0627056360244751, 0.05029792711138725, -0.04258017987012863, -0.05528027564287186, 0.0012587120290845633, -0.000046457163989543915, 0.08469630777835846, 0.043930016458034515, 0.06072814390063286, -0.048425592482089996, -0.060702789574861526, 0.1604369580745697, -0.11015870422124863, -0.002006774302572012, 0.03884434327483177, -0.021509546786546707, 0.0760732889175415, 0.08455884456634521, -0.0076692355796694756, -0.04278990253806114, 0.008245808072388172, -0.05580760911107063, -0.03710790351033211, -0.05199507623910904, -0.1260191947221756, -0.0067647346295416355, -0.050905220210552216, -0.010320877656340599, -0.1381428986787796, -0.22016076743602753, -0.03175438195466995, 0.05885033681988716, -0.015750715509057045, -0.00882759876549244, -0.024698903784155846, 0.009529219940304756, -0.02283586747944355, -0.016211548820137978, -0.16764488816261292, -0.023210598155856133, 0.010469009168446064, -0.020428787916898727, 0.03948723524808884, -0.04444197565317154, 0.03586706891655922, -0.11634478718042374, -0.004602702800184488, -0.1144552230834961, 0.10007282346487045, -0.012528643012046814, 0.057227276265621185, -0.11322923004627228, -0.014245626516640186, 
-0.021599065512418747, -0.03210046887397766, 0.03498796001076698, 0.10748819261789322, -0.13816750049591064, -0.012121228501200676, 0.10698306560516357, -0.1051139086484909, -0.057740312069654465, 0.0908968448638916, -0.01977764256298542, 0.13463610410690308, 0.07071421295404434, 0.014620003290474415, 0.16527383029460907, -0.040837083011865616, -0.05551911145448685, 0.017316633835434914, -0.006380972918123007, 0.02431632950901985, 0.04368720203638077, -0.08136246353387833, -0.04630931466817856, 0.029444312676787376, -0.043589744716882706, 0.01215901505202055, 0.011423951014876366, -0.0761290192604065, -0.0444546639919281, -0.056859444826841354, 0.07849669456481934, -0.0004347069188952446, -0.00021976977586746216, 0.06613326817750931, -0.09866191446781158, 0.06127224117517471, 0.10795408487319946, -0.06171875074505806, 0.013625383377075195, -0.07847415655851364, 0.06301285326480865, -0.06882617622613907, 0.03988131135702133, -0.15614193677902222, -0.1406875103712082, 0.028257587924599648, -0.12490882724523544, 0.016051586717367172, 0.06328438967466354, 0.016221463680267334, 0.02245831862092018, 0.0192936509847641, 0.04691113904118538, 0.05084087327122688, -0.0108548104763031, -0.01036234013736248, -0.04059487208724022, -0.057587724179029465, -0.03817354887723923, 0.07735547423362732, -0.1085963100194931, 0.036263033747673035, -0.03167877346277237, -0.016848744824528694, 0.0015787251759320498, -0.04187477380037308, 0.008821453899145126, -0.04482199624180794, -0.04754498973488808, -0.04091033339500427, 0.026936892420053482, 0.050915054976940155, -0.051770228892564774, -0.026309236884117126, -0.17829708755016327, -0.04763243347406387, 0.044756028801202774, 0.07150083035230637, -0.0751478523015976, 0.0560029074549675, -0.030500546097755432, 0.027793461456894875, -0.05241535231471062, 0.03167346864938736, 0.14959700405597687, 0.06831782311201096, 0.07781067490577698, -0.02262042835354805, -0.011001505888998508, 0.007783430628478527, -0.0059060389176011086, 0.033829133957624435, 0.0018656051252037287, 0.0974900871515274, -0.11645565181970596, 0.008693157695233822, 0.035350196063518524, -0.0982428565621376, 0.06943975389003754, 0.056616418063640594, -0.04289747029542923, -0.038186002522706985, 0.029691550880670547, 0.032880861312150955, -0.05808369070291519, -0.028854895383119583, 0.045137010514736176, 0.039062272757291794, -0.012101351283490658, 0.006981437560170889, -0.022889217361807823, 0.057264674454927444, 0.052175045013427734, -0.035401903092861176, -0.05901024490594864, 0.052684854716062546, -0.017956798896193504, 0.030276289209723473, 0.004670348018407822, 0.03446904569864273, 0.0324309878051281, -0.012819156050682068, -0.09340564906597137, 0.1660461127758026, -0.06098008155822754, -0.13232801854610443, -0.10736878216266632, -0.02228112891316414, -0.1015438586473465, 0.024730259552598, 0.07762796431779861, -0.007006437983363867, -0.06815524399280548, -0.069735586643219, -0.024786286056041718, 0.034570347517728806, -0.05802551656961441, -0.039086200296878815, -0.048832111060619354, 0.04912015050649643, -0.12746337056159973, 0.004616940394043922, 0.007406943943351507, -0.18073368072509766, -0.0100932065397501, 0.04765874147415161, 0.050850510597229004, 0.15171094238758087, -0.036846231669187546, 0.004236715845763683, 0.021278616040945053, 0.16305752098560333, 0.012032599188387394, 0.06425963342189789, 0.14463181793689728, -0.05981657654047012, 0.0555492527782917, 0.15590722858905792, 0.031229261308908463, -0.04621606320142746, 0.028797656297683716, 0.021323267370462418, 
-0.05520264059305191, -0.1735537052154541, -0.06743433326482773, -0.09901735186576843, -0.0434001199901104, 0.08485773950815201, 0.05268273502588272, 0.005421115551143885, 0.06963330507278442, -0.06269435584545135, -0.005637871567159891, 0.005463681649416685, 0.09611041843891144, 0.1722797155380249, 0.03032119944691658, 0.07872128486633301, -0.04003729298710823, -0.011691712774336338, 0.06704297661781311, -0.0004409998655319214, 0.13203388452529907, -0.043417226523160934, 0.10641451179981232, 0.061126708984375, 0.0770651251077652, 0.013172214850783348, 0.0941619873046875, -0.09080661833286285, 0.059122275561094284, 0.008095355704426765, -0.05883581563830376, -0.056698285043239594, 0.05636870115995407, -0.0014514402719214559, -0.03841346129775047, -0.01575322449207306, -0.015436309389770031, 0.03452898934483528, 0.10662364959716797, 0.0871632844209671, -0.18396180868148804, -0.10558710247278214, 0.019176779314875603, 0.04088107496500015, -0.0782267302274704, 0.000006429478617064888, -0.022060701623558998, -0.04442188888788223, 0.007915718480944633, -0.020051896572113037, 0.06141127273440361, -0.09629085659980774, 0.043073009699583054, 0.022109055891633034, 0.10143397003412247, 0.03099258802831173, 0.09751924127340317, -0.09181196242570877, 0.021477151662111282, 0.015618833713233471, 0.07318633049726486, -0.03278736770153046, 0.04912830516695976, 0.06688623130321503, 0.04533170908689499, 0.12389297783374786, 0.015109339728951454, -0.037870921194553375, -0.13029435276985168, -0.05901876837015152, 0.04652615636587143, 0.004310077987611294, -0.09257201850414276, 0.07760505378246307, -0.05618784949183464, -0.007920625619590282, 0.014779342338442802, 0.0643383339047432, -0.12049311399459839, -0.1883848011493683, 0.09463423490524292, -0.005199949257075787, 0.049577370285987854, 0.01329273171722889, 0.021507831290364265, 0.014553695917129517, 0.19593802094459534, -0.11663001775741577, -0.08713967353105545, -0.1079932227730751, -0.0060052769258618355, 0.0956607237458229, -0.09819964319467545, 0.04346715658903122, -0.04381522163748741, 0.09437020868062973, -0.01218968816101551, -0.14631007611751556, 0.04216606542468071, -0.07929810136556625, -0.021752554923295975, -0.0058825211599469185, -0.08697746694087982, 0.055194657295942307, 0.015166254714131355, 0.059500932693481445, 0.06881382316350937, -0.03006509505212307, -0.12483996152877808, -0.04236609861254692, 0.12416853755712509, -0.0460766963660717, 0.021617019549012184, -0.12523001432418823, -0.032702770084142685, -0.053433798253536224, 0.035724036395549774, 0.14926369488239288, 0.1842261254787445, -0.09739712625741959, 0.08969901502132416, 0.17031344771385193, -0.09644578397274017, -0.21925213932991028, -0.0928640067577362, -0.004719950258731842, -0.009554415941238403, 0.016129091382026672, -0.22615961730480194, 0.11469826847314835, 0.07522691786289215, -0.04648929461836815, 0.02372368983924389, -0.19985464215278625, -0.08674420416355133, 0.2111445963382721, -0.013155522756278515, 0.16072073578834534, -0.11237944662570953, -0.03242715075612068, -0.06549837440252304, -0.012244915589690208, 0.13364507257938385, -0.056211959570646286, 0.07928314805030823, -0.007263618055731058, 0.0018625154625624418, 0.037575334310531616, -0.0387224480509758, 0.03266006335616112, 0.012941395863890648, 0.015195985324680805, -0.04488035663962364, 0.025767480954527855, 0.03863981366157532, -0.09296302497386932, 0.16928429901599884, 0.04706718400120735, 0.10769180953502655, -0.06092600151896477, -0.055721770972013474, -0.017527157440781593, 0.11832486093044281, 
-0.002473581815138459, -0.08087541908025742, -0.023362472653388977, 0.02041753940284252, 0.1140037551522255, 0.01122698187828064, 0.012126137502491474, -0.014892632141709328, 0.13955768942832947, 0.14065980911254883, 0.0275882538408041, -0.008603361435234547, -0.06732233613729477, 0.02017103135585785, -0.03814674913883209, 0.0753161758184433, -0.043114811182022095, 0.058052558451890945, 0.07803975045681, 0.028107058256864548, 0.0805320292711258, 0.05067024752497673, -0.04182184860110283, 0.008693922311067581, 0.03515476733446121, -0.09847915172576904, -0.07494862377643585, 0.002797418739646673, 0.049547143280506134, -0.13063248991966248, -0.011808954179286957, 0.13512927293777466, -0.020475614815950394, -0.01175745576620102, 0.04395851492881775, 0.06799796223640442, 0.02566288784146309, 0.058396123349666595, 0.018035298213362694, 0.0013272520154714584, -0.06744778901338577, 0.027443895116448402, 0.11715859174728394, -0.060607075691223145, 0.042740870267152786, 0.01034915167838335, -0.0959489718079567, -0.08984380960464478, -0.06251215934753418, 0.06556842476129532, -0.1485784351825714, 0.01852402091026306, -0.017778638750314713, -0.020808715373277664, 0.010088617913424969, 0.004353519529104233, 0.03088732622563839, 0.052768122404813766, -0.0794304683804512, -0.026468392461538315, -0.10223813354969025, 0.09005816280841827, 0.03150329738855362, 0.016733190044760704, -0.0610370859503746, -0.016983503475785255, -0.009826784022152424, 0.029095593839883804, -0.006699071731418371, -0.034999437630176544, -0.06279389560222626, -0.032164592295885086, -0.06584542989730835, 0.008251014165580273, -0.044427454471588135, 0.03530140966176987, 0.04177476838231087, 0.021087024360895157, -0.03212207555770874, -0.0007497142651118338, -0.06370405852794647, -0.05642980337142944, -0.026270365342497826, 0.06118977069854736, -0.14742475748062134, -0.004625951871275902, 0.02957446500658989, -0.06960718333721161, 0.08492919057607651, 0.02993888221681118, -0.036096084862947464, 0.026613343507051468, -0.15791203081607819, -0.057163774967193604, -0.018920857459306717, 0.05626373365521431, 0.024132033810019493, -0.08093249052762985, 0.037981510162353516, 0.040888916701078415, -0.03395555168390274, -0.02352742850780487, 0.0445103757083416, -0.1042003184556961, 0.02914520725607872, -0.03585495799779892, 0.03751937672495842, -0.048764608800411224, -0.00849965214729309, -0.006551454775035381, 0.05791741609573364, 0.16062724590301514, -0.06596149504184723, 0.08584529161453247, -0.10152088105678558, -0.04823228716850281, 0.019535532221198082, -0.0070356326177716255, -0.03544717654585838, -0.06407046318054199, 0.03444140776991844, -0.03282534331083298, 0.13037975132465363, -0.03757528215646744, 0.03470093011856079, 0.0028012278489768505, 0.026567500084638596, -0.02934611402451992, -0.040314000099897385, 0.15268373489379883, 0.1023004874587059, 0.0001326140045421198, 0.0010911166900768876, 0.04544976353645325, -0.008655421435832977, 0.011006402783095837, 0.1618279218673706, -0.0008055634680204093, -0.007018619682639837, -0.003554490627720952, 0.03642180934548378, -0.07177731394767761, 0.01775161549448967, 0.0882948637008667, -0.046535931527614594, 0.08002401888370514, 0.006216948386281729, -0.02420428954064846, 0.14329366385936737, -0.06307120621204376, 0.06683851778507233, 0.011432511731982231, -0.04529901593923569, -0.15758326649665833, -0.13103194534778595, -0.05558546259999275, -0.09843414276838303, -0.007396879140287638, -0.09775198251008987, 0.02047630213201046, 0.041423771530389786, -0.005893833003938198, 
-0.005652120336890221, 0.04392707347869873, -0.02313327230513096, -0.029833268374204636, -0.021891562268137932, -0.053304802626371384, -0.024293694645166397, 0.05823775380849838, -0.013024839572608471, 0.04485630989074707, 0.1043555960059166, 0.08353669196367264, 0.07579196989536285, 0.09464158117771149, 0.05716432258486748, -0.05375701189041138, -0.06293828040361404, -0.04085438326001167, 0.04541916027665138, -0.06432957947254181, 0.11011230945587158, 0.03776679188013077, -0.029295727610588074, 0.004993479698896408, 0.08481575548648834, -0.03495687246322632, -0.11320289224386215, -0.1203421801328659, 0.1857336312532425, 0.08358991146087646, 0.026611346751451492, -0.011571158654987812, -0.1220347061753273, -0.01287000346928835, 0.16479676961898804, 0.174408420920372, 0.030344059690833092, 0.010183329693973064, 0.053368568420410156, 0.009345735423266888, 0.01910512149333954, 0.07281181961297989, 0.04319094493985176, 0.2585797607898712, -0.0457196943461895, 0.07182618975639343, 0.020789284259080887, 0.013897811062633991, -0.07514215260744095, 0.06918652355670929, -0.018899593502283096, 0.011115229688584805, -0.008875382132828236, 0.050254859030246735, -0.11934683471918106, -0.09552513062953949, -0.0033240020275115967, 0.011069778352975845, -0.06193035840988159, -0.03448421508073807, 0.021110190078616142, -0.01984540745615959, 0.05041380599141121, -0.05628783255815506, -0.031509775668382645, 0.19570891559123993, -0.03741059452295303, -0.10956482589244843, -0.0422314889729023, 0.033758826553821564, 0.12078922986984253, 0.15570439398288727, 0.023643629625439644, -0.002189620863646269, 0.08239118754863739, -0.013479305431246758, -0.1337520033121109, 0.07188667356967926, -0.01255515031516552, -0.06871914118528366, 0.06367254257202148, 0.05214140564203262, -0.06474936008453369, 0.06345848739147186, 0.04532686620950699, -0.09330269694328308, 0.026162883266806602, 0.001687923795543611, -0.025939276441931725, -0.06490035355091095, 0.002029702067375183, -0.04984145611524582, 0.13208238780498505, 0.11831729114055634, -0.006965485401451588, -0.0060693928971886635, -0.06018994376063347, 0.07105405628681183, 0.02033476158976555, 0.02021055296063423, -0.04519794136285782, -0.06065432354807854, 0.03450796380639076, -0.0395628996193409, 0.05913558602333069, -0.19133952260017395, -0.02602427266538143, 0.0533129945397377, 0.0025584609247744083, -0.06658019870519638, 0.09911404550075531, 0.05933469533920288, 0.046691011637449265, -0.042400576174259186, -0.05916416645050049, -0.02024242654442787, 0.03693028539419174, -0.09719523042440414, -0.08528444916009903 ]
null
null
transformers
## RAG

This is the RAG-Sequence Model of the paper [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks](https://arxiv.org/pdf/2005.11401.pdf) by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al.

The model is an *uncased* model, which means that capital letters are simply converted to lower-case letters.

The model consists of a *question_encoder*, a *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* `train` dataset, which is linked above. The question_encoder and generator are based on `facebook/dpr-question_encoder-single-nq-base` and `facebook/bart-large` respectively, which were jointly finetuned on the *wiki_dpr* QA dataset in an end-to-end fashion.

## Usage:

**Note**: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *legacy* index requires over 75 GB of RAM. The model can generate answers to any factoid question as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-nq")
retriever = RagRetriever.from_pretrained("facebook/rag-sequence-nq", index_name="exact", use_dummy_dataset=True)
model = RagSequenceForGeneration.from_pretrained("facebook/rag-sequence-nq", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("how many countries are in europe", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])

# should give 54 => google says either 44 or 51
```
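For reference, a sketch of the same pipeline with the full retriever instead of the dummy one. This is an assumption-heavy example: it presumes you can afford the *legacy* index (over 75 GB of RAM plus the download), and the `index_name`/`use_dummy_dataset` arguments simply mirror the config flags mentioned in the sibling RAG cards:

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-nq")

# The "legacy" index holds the full wiki_dpr passages; loading it needs
# over 75 GB of RAM, but retrieval (and therefore answers) improve markedly.
retriever = RagRetriever.from_pretrained("facebook/rag-sequence-nq", index_name="legacy", use_dummy_dataset=False)
model = RagSequenceForGeneration.from_pretrained("facebook/rag-sequence-nq", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("how many countries are in europe", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```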
{"language": "en", "license": "apache-2.0", "datasets": ["wiki_dpr"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
null
facebook/rag-sequence-nq
[ "transformers", "pytorch", "tf", "rag", "en", "dataset:wiki_dpr", "arxiv:2005.11401", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.11401" ]
[ "en" ]
TAGS #transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us
## RAG This is the RAG-Sequence Model of the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al. The model is an *uncased* model, which means that capital letters are simply converted to lower-case letters. The model consists of a *question_encoder*, a *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' dataset, which is linked above. The question_encoder and generator are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large' respectively, which were jointly finetuned on the *wiki_dpr* QA dataset in an end-to-end fashion. ## Usage: Note: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *legacy* index requires over 75 GB of RAM. The model can generate answers to any factoid question as follows:
[ "## RAG\n\nThis is the RAG-Sequence Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consits of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.", "## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ "TAGS\n#transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us \n", "## RAG\n\nThis is the RAG-Sequence Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consits of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.", "## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ 52, 206, 60 ]
[ "passage: TAGS\n#transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #region-us \n## RAG\n\nThis is the RAG-Sequence Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consits of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ 0.009448742493987083, 0.10710862278938293, -0.004978138022124767, 0.09268195182085037, 0.05011377111077309, 0.04783345013856888, 0.010219045914709568, 0.09041952341794968, 0.013815120793879032, 0.06327439844608307, 0.09077447652816772, -0.029896119609475136, 0.03722486272454262, 0.04716191068291664, -0.016896124929189682, -0.15982502698898315, 0.01826094090938568, 0.005026711151003838, -0.02858472242951393, 0.04549090191721916, 0.06399820744991302, -0.04357587918639183, 0.06453977525234222, -0.008450469933450222, -0.07309062778949738, 0.06115221977233887, 0.030431030318140984, -0.015008949674665928, 0.08864623308181763, 0.06926581263542175, 0.10782493650913239, 0.012274306267499924, 0.03674697130918503, -0.15834534168243408, 0.02823662757873535, 0.13166414201259613, 0.01170270424336195, 0.05204537510871887, 0.029586127027869225, 0.025562020018696785, 0.07323141396045685, 0.022970715537667274, 0.04881506785750389, 0.031598858535289764, -0.07030574232339859, -0.21244493126869202, -0.03383330628275871, 0.10961699485778809, 0.02106308937072754, 0.04284617304801941, -0.006422305013984442, 0.0067504108883440495, -0.00014476399519480765, 0.0488397479057312, 0.21078595519065857, -0.22400899231433868, -0.051725469529628754, 0.11272836476564407, 0.021880963817238808, 0.03471798449754715, -0.01573200896382332, 0.004922030493617058, 0.03576009348034859, 0.040245093405246735, 0.02863316424190998, -0.057810064405202866, -0.0033741232473403215, -0.028895089402794838, -0.08239444345235825, -0.019244400784373283, 0.12273550778627396, 0.046971727162599564, -0.028805915266275406, -0.013294760137796402, -0.1433040350675583, 0.1387714147567749, -0.01623586378991604, -0.08786282688379288, -0.010492363944649696, 0.09661610424518585, -0.0031890179961919785, -0.09196775406599045, -0.04717325419187546, -0.03915983438491821, -0.09471520781517029, -0.01643712818622589, 0.009934463538229465, 0.07110091298818588, -0.039196040481328964, 0.12826846539974213, -0.13348589837551117, -0.10587534308433533, 0.008156019262969494, -0.11241686344146729, -0.09743691235780716, 0.020181970670819283, -0.005292134825140238, -0.04649771749973297, 0.0739874318242073, 0.1593419313430786, -0.01828591898083687, 0.03222676366567612, -0.09484824538230896, 0.01929006725549698, 0.014066954143345356, 0.10640086233615875, -0.13252343237400055, -0.08753896504640579, 0.0906650647521019, -0.04038958624005318, -0.01224010530859232, -0.04524741321802139, -0.026716716587543488, -0.01876210793852806, -0.029901903122663498, -0.006212923210114241, 0.061498235911130905, 0.035834696143865585, -0.03877818211913109, -0.05361885204911232, 0.10320578515529633, -0.07432540506124496, -0.026556162163615227, 0.01596124842762947, -0.004578528925776482, 0.054537273943424225, 0.08338343352079391, 0.004268237389624119, -0.030177975073456764, 0.038772717118263245, -0.05341995880007744, -0.07889065891504288, -0.05076546594500542, -0.0972711518406868, 0.01564841903746128, -0.09579218178987503, -0.011888031847774982, -0.13337886333465576, -0.18640637397766113, 0.010384605266153812, 0.09887473285198212, -0.044290803372859955, -0.03235854208469391, 0.05153239145874977, 0.0456966795027256, -0.00360597250983119, -0.005958134308457375, -0.021468745544552803, -0.020903630182147026, 0.07855833321809769, -0.08101770281791687, 0.06660755723714828, -0.043272778391838074, 0.04061580449342728, -0.057848814874887466, 0.08446688950061798, -0.1673763245344162, 0.11409708857536316, -0.036954186856746674, -0.03372170776128769, -0.1017853319644928, -0.015957219526171684, 
-0.11096415668725967, -0.029965603724122047, 0.06297134608030319, 0.06117686629295349, -0.1696803867816925, -0.02641063742339611, 0.0890798419713974, -0.08700626343488693, -0.035643238574266434, 0.15970808267593384, -0.05653401464223862, 0.10696681588888168, 0.08437667787075043, 0.1272633969783783, 0.15049242973327637, -0.1083383858203888, -0.028128350153565407, 0.057771310210227966, -0.022992225363850594, 0.1010967493057251, 0.0035321044269949198, -0.03491292521357536, -0.035173509269952774, 0.045168813318014145, 0.042648009955883026, 0.041681963950395584, -0.012748905457556248, -0.052659470587968826, -0.0663619413971901, -0.06167939677834511, 0.042653053998947144, -0.010276038199663162, -0.027238231152296066, 0.034048017114400864, -0.09193559736013412, -0.004870227072387934, 0.07371781766414642, -0.04445382580161095, 0.0022984668612480164, -0.005590349435806274, 0.03873426094651222, -0.07904720306396484, 0.06644919514656067, -0.14807268977165222, -0.06648046523332596, 0.04173845052719116, -0.09898242354393005, -0.003841761499643326, 0.11905518919229507, 0.014777605421841145, -0.04060634970664978, -0.0378427729010582, 0.062326956540346146, -0.03220345452427864, -0.016476813703775406, -0.029839718714356422, -0.016367511823773384, 0.011802508495748043, -0.028988322243094444, -0.018271051347255707, -0.054797664284706116, 0.008847040124237537, 0.00405465392395854, 0.13102203607559204, -0.001719902502372861, -0.02334834262728691, -0.010166045278310776, -0.01987667940557003, 0.004236688371747732, -0.05657738074660301, 0.023364825174212456, 0.00963935349136591, 0.008039746433496475, -0.04492415487766266, -0.06492182612419128, 0.02494814433157444, 0.062354233115911484, 0.08903536200523376, -0.0604272186756134, 0.09104514122009277, -0.01225561834871769, -0.03472764045000076, -0.09779258817434311, -0.06031769886612892, 0.22220167517662048, 0.06606294214725494, 0.06390096992254257, -0.08105151355266571, -0.024816082790493965, 0.05521346256136894, 0.037095166742801666, 0.014113368466496468, 0.017558041960000992, 0.1255936473608017, -0.12420067191123962, 0.021708795800805092, 0.09318950027227402, -0.003399258479475975, 0.13015300035476685, -0.018343601375818253, -0.05824979394674301, -0.002821275731548667, -0.005344044882804155, -0.050334345549345016, 0.0671200081706047, -0.029926013201475143, 0.08157118409872055, 0.0419071801006794, 0.029891252517700195, 0.009660864248871803, -0.025525473058223724, 0.02787945605814457, 0.0010983169777318835, -0.03852817788720131, -0.06624893099069595, 0.037954505532979965, 0.04803651571273804, 0.061963509768247604, 0.05557507276535034, 0.024713514372706413, 0.014493205584585667, 0.002562324982136488, -0.04188194498419762, 0.13741689920425415, -0.07355101406574249, -0.22509486973285675, -0.04361278936266899, 0.05661626160144806, -0.08296187967061996, -0.04419919103384018, 0.05401844531297684, -0.07461331784725189, -0.021352551877498627, -0.028773227706551552, 0.08113173395395279, -0.014855117537081242, -0.033073540776968, 0.013111595995724201, -0.04335324838757515, 0.056365400552749634, -0.15889646112918854, 0.007798574399203062, -0.017596213147044182, -0.020975474268198013, -0.03432883694767952, -0.005434880964457989, 0.060405127704143524, 0.08146722614765167, -0.034393321722745895, -0.001838922849856317, 0.018631557002663612, 0.24385637044906616, -0.04913286492228508, 0.12547507882118225, 0.149828240275383, -0.030502144247293472, 0.07578915357589722, 0.1479644477367401, 0.00713710393756628, -0.07217788696289062, 0.024014484137296677, 0.09685111045837402, 
-0.08285144716501236, -0.16128529608249664, -0.013247671537101269, -0.04316021129488945, 0.032983940094709396, 0.08628889173269272, 0.019646363332867622, -0.12869291007518768, 0.053988490253686905, -0.05303457751870155, 0.06100112572312355, 0.01607881672680378, 0.02774870954453945, 0.19121527671813965, -0.011240544728934765, 0.0846642404794693, -0.03903510794043541, -0.05108226090669632, 0.10635212063789368, 0.0054152002558112144, 0.17416875064373016, -0.08361958712339401, -0.009690452367067337, 0.016066938638687134, 0.11948550492525101, 0.02756582945585251, 0.11067696660757065, -0.09291951358318329, 0.05093989148736, -0.03510637581348419, -0.012590001337230206, -0.013426722958683968, 0.04187166690826416, -0.03288022801280022, -0.08670677244663239, 0.02370307967066765, -0.0016937983455136418, 0.04379364848136902, 0.27778586745262146, 0.045452821999788284, -0.0704936534166336, -0.05388142913579941, 0.0030895129311829805, -0.01275894045829773, -0.10665863007307053, 0.053720150142908096, 0.08254724740982056, -0.07164240628480911, 0.0372387133538723, -0.014186382293701172, 0.09806466102600098, -0.05018600448966026, 0.040755391120910645, 0.017067106440663338, 0.049536556005477905, -0.0035350557882338762, 0.05505526810884476, -0.10758543759584427, 0.04457121342420578, 0.040684036910533905, 0.01637236401438713, -0.008476478978991508, 0.031247103586792946, 0.03404192626476288, 0.012215778231620789, 0.09947729110717773, 0.0019780772272497416, 0.03771997615695, -0.05304379761219025, -0.08793308585882187, 0.0797872543334961, 0.008067812770605087, -0.07732295989990234, 0.015232399106025696, -0.009833257645368576, 0.04218146950006485, 0.03702216595411301, 0.1120476946234703, -0.10981782525777817, -0.14527402818202972, 0.0020129820331931114, -0.034683309495449066, 0.011200718581676483, -0.011356527917087078, 0.008789467625319958, 0.019768817350268364, 0.15899929404258728, -0.05882078781723976, -0.08619600534439087, -0.10270289331674576, 0.02595500461757183, 0.1370951533317566, -0.05951426550745964, 0.010363705456256866, -0.04760327190160751, 0.05350510776042938, -0.07813939452171326, -0.18876367807388306, 0.04374947398900986, -0.05374183878302574, -0.060107748955488205, -0.012263065204024315, 0.015780430287122726, 0.038841910660266876, -0.010497970506548882, 0.0328151136636734, 0.016008928418159485, -0.05894151329994202, -0.11534681171178818, -0.01670549251139164, 0.016206366941332817, -0.044796716421842575, -0.02037862315773964, -0.053199656307697296, 0.08543267101049423, -0.01737137883901596, 0.04585813358426094, 0.06666084378957748, 0.1225658655166626, -0.1323649138212204, 0.0933137983083725, 0.15128810703754425, -0.05162794888019562, -0.20527179539203644, -0.05774695798754692, 0.03256817162036896, -0.04783905670046806, -0.01519949734210968, -0.18458804488182068, 0.12538471817970276, 0.09878525137901306, -0.005817718338221312, 0.02164517156779766, -0.0920962393283844, -0.08181674033403397, 0.1706729382276535, -0.039802756160497665, 0.24172210693359375, -0.06776633858680725, 0.013346308842301369, -0.0628659576177597, -0.08484721928834915, 0.12339325994253159, -0.18584685027599335, 0.05279715731739998, -0.012145576998591423, -0.0021535451523959637, 0.0225877333432436, -0.019963707774877548, 0.027962693944573402, 0.0031323381699621677, 0.07292994856834412, -0.013720491901040077, 0.05363459512591362, -0.04762923717498779, -0.031201958656311035, 0.13597895205020905, 0.0273087527602911, 0.10280302911996841, -0.0027171089313924313, -0.05285969376564026, -0.04197774827480316, 0.09249940514564514, 
-0.02949073724448681, -0.12198802083730698, -0.09022866934537888, -0.0066126626916229725, 0.05770708620548248, -0.026736723259091377, -0.07193108648061752, 0.022752949967980385, 0.07981526851654053, 0.1556600034236908, -0.0011021762620657682, -0.057639431208372116, -0.1307961642742157, 0.025614865124225616, -0.039847780019044876, 0.12012871354818344, -0.1707761287689209, 0.04496562480926514, 0.08890976756811142, 0.0028719904366880655, 0.07655724138021469, 0.010351940058171749, -0.08214078098535538, 0.013402472250163555, 0.0021609836257994175, -0.15184132754802704, -0.12062644213438034, -0.013333410955965519, -0.16533435881137848, -0.07276958972215652, -0.09333156794309616, 0.1469414085149765, -0.032596852630376816, 0.004763228353112936, 0.022644154727458954, 0.024751488119363785, 0.01374838873744011, 0.09172746539115906, 0.035836197435855865, 0.01922183856368065, -0.06794721633195877, 0.07995209097862244, 0.08385460078716278, -0.10720254480838776, 0.04720207676291466, 0.01157866045832634, -0.11418222635984421, -0.02221955545246601, -0.04838588833808899, 0.12981994450092316, -0.046907465904951096, -0.0002734794106800109, -0.05367571860551834, -0.05162613466382027, 0.019355259835720062, 0.08538497984409332, 0.007369778119027615, 0.042706336826086044, -0.026355938985943794, -0.035787828266620636, -0.03480490297079086, 0.10990335047245026, 0.0353977344930172, 0.007612604647874832, -0.07433010637760162, -0.09381388127803802, 0.030817613005638123, 0.07827107608318329, -0.03638653829693794, 0.015247882343828678, -0.09482406824827194, -0.03458923473954201, -0.22881542146205902, -0.03427688777446747, 0.007259713020175695, 0.03299683332443237, -0.022645624354481697, -0.009395400062203407, -0.004745109938085079, -0.013367786072194576, -0.018966469913721085, -0.022811809554696083, -0.007797242607921362, 0.03809266537427902, -0.13536500930786133, -0.02974860742688179, 0.05191721394658089, -0.03383714333176613, 0.0968477874994278, 0.03260183334350586, -0.03303828835487366, -0.0074561829678714275, -0.06666643172502518, -0.02624564617872238, -0.010982136242091656, 0.08845404535531998, -0.00846171472221613, -0.041938476264476776, 0.05555855482816696, -0.020355956628918648, -0.027057886123657227, -0.006004412192851305, 0.06145493686199188, -0.11743335425853729, 0.02409246750175953, -0.030323579907417297, -0.03323601186275482, -0.03640831634402275, -0.05606452375650406, 0.07191474735736847, 0.14829781651496887, 0.0715717300772667, -0.07851690798997879, 0.07021378725767136, -0.18096095323562622, -0.06784853339195251, 0.028928721323609352, -0.010342895984649658, 0.08328002691268921, -0.058907270431518555, 0.03272801265120506, -0.034415725618600845, 0.18380756676197052, 0.014270240440964699, -0.03339153528213501, 0.003733214223757386, -0.08693467825651169, -0.04784877970814705, -0.022417789325118065, 0.013734616339206696, 0.05044831335544586, -0.030538583174347878, -0.06370358169078827, 0.016043588519096375, 0.03461575135588646, 0.11986515671014786, 0.09705083817243576, 0.08985822647809982, 0.05814816430211067, -0.08496422320604324, 0.07202055305242538, -0.05438914895057678, 0.027268676087260246, 0.09374665468931198, -0.029125621542334557, 0.07306519150733948, -0.013979689218103886, -0.0648927092552185, 0.10451295971870422, -0.06871406733989716, 0.07822012901306152, -0.006084748078137636, -0.03439144045114517, -0.13114915788173676, -0.01800544559955597, -0.052332933992147446, -0.12799586355686188, 0.020305097103118896, -0.10652990639209747, -0.022518593817949295, 0.07141099870204926, -0.015199163928627968, 
-0.05002988502383232, 0.08056148886680603, -0.02528354711830616, -0.08367455005645752, 0.03164392337203026, 0.04187271371483803, -0.06126928701996803, 0.0770072489976883, 0.0009548357920721173, 0.044847048819065094, 0.0585126094520092, 0.023366952314972878, 0.04566432535648346, 0.02216869406402111, 0.010949883610010147, -0.04775625839829445, -0.044188495725393295, -0.04604466259479523, 0.038093000650405884, -0.048261720687150955, 0.13357825577259064, 0.051706403493881226, -0.02176736295223236, -0.02248198911547661, 0.10750983655452728, -0.011925634928047657, -0.07843723893165588, -0.18802031874656677, 0.11305933445692062, -0.05819600448012352, 0.021231284365057945, 0.008772333152592182, -0.08924952894449234, -0.05264110863208771, 0.20038153231143951, 0.1516549289226532, -0.026510747149586678, 0.00012454860552679747, 0.016988378018140793, 0.008162596262991428, 0.03135570138692856, 0.05943552777171135, 0.10330377519130707, 0.22624395787715912, -0.07295248657464981, -0.004836151842027903, -0.008058621548116207, -0.056704092770814896, -0.06584829092025757, 0.15546132624149323, 0.01990455575287342, 0.004285825882107019, -0.05359441787004471, 0.048959631472826004, -0.03534107282757759, -0.1804296225309372, 0.012054798193275928, -0.06848888844251633, -0.04913412034511566, -0.021155424416065216, -0.08239519596099854, -0.09889588505029678, 0.03162533789873123, -0.06897298246622086, -0.007849887944757938, 0.13074719905853271, -0.042687151581048965, -0.09725058078765869, -0.08326225727796555, 0.05962011590600014, 0.05970149487257004, 0.06342139840126038, 0.03058941289782524, 0.010118463076651096, 0.07621795684099197, 0.03155284747481346, -0.036978740245103836, 0.03760562837123871, 0.03594871610403061, -0.029064886271953583, -0.07740911096334457, -0.016723915934562683, -0.05330832675099373, 0.022868912667036057, 0.0871051549911499, -0.03506353497505188, -0.01933342218399048, 0.027092168107628822, -0.01745179109275341, -0.11363046616315842, 0.018242008984088898, -0.07558368891477585, 0.13788476586341858, 0.13097476959228516, 0.01601511985063553, 0.013574988581240177, -0.09875879436731339, 0.028339551761746407, 0.005633711349219084, 0.025195661932229996, -0.008808784186840057, -0.06334605067968369, 0.0059653762727975845, 0.05019707605242729, -0.007320757023990154, -0.1748971790075302, -0.06503622233867645, 0.048374127596616745, 0.0030874190852046013, 0.009983416646718979, 0.08490847796201706, -0.0068395668640732765, 0.023493215441703796, -0.018807120621204376, -0.06032010540366173, -0.03902295231819153, 0.0159514881670475, -0.05718753859400749, -0.05427628755569458 ]
null
null
transformers
## RAG

This is a non-finetuned version of the RAG-Token model of the paper [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks](https://arxiv.org/pdf/2005.11401.pdf) by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al.

RAG consists of a *question encoder*, a *retriever* and a *generator*. The retriever should be a `RagRetriever` instance. The *question encoder* can be any model that can be loaded with `AutoModel` and the *generator* can be any model that can be loaded with `AutoModelForSeq2SeqLM`.

This model is a non-finetuned RAG-Token model and was created as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration, AutoTokenizer

model = RagTokenForGeneration.from_pretrained_question_encoder_generator("facebook/dpr-question_encoder-single-nq-base", "facebook/bart-large")

question_encoder_tokenizer = AutoTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
generator_tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large")

tokenizer = RagTokenizer(question_encoder_tokenizer, generator_tokenizer)
model.config.use_dummy_dataset = True
model.config.index_name = "exact"
retriever = RagRetriever(model.config, question_encoder_tokenizer, generator_tokenizer)

model.save_pretrained("./")
tokenizer.save_pretrained("./")
retriever.save_pretrained("./")
```

Note that the model is *uncased* so that all capital input letters are converted to lower-case.

## Usage:

*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, by setting `config.index_name="legacy"` and `config.use_dummy_dataset=False`. The model can be fine-tuned as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-token-base")
retriever = RagRetriever.from_pretrained("facebook/rag-token-base")
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-base", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("who holds the record in 100m freestyle", "michael phelps", return_tensors="pt")
outputs = model(input_dict["input_ids"], labels=input_dict["labels"])
loss = outputs.loss  # train on loss
```
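As with the RAG-Sequence base checkpoint, the non-finetuned weights can still be run end-to-end as a quick smoke test. A minimal sketch, assuming the default *dummy* retriever; the question string is an arbitrary example and answers will be rough until the model is fine-tuned:

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-token-base")
# Dummy index keeps memory small; use the full retriever for real answers.
retriever = RagRetriever.from_pretrained("facebook/rag-token-base", index_name="exact", use_dummy_dataset=True)
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-base", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```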
{"language": "en", "license": "apache-2.0", "datasets": ["wiki_dpr"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
null
facebook/rag-token-base
[ "transformers", "pytorch", "rag", "en", "dataset:wiki_dpr", "arxiv:2005.11401", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.11401" ]
[ "en" ]
TAGS #transformers #pytorch #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us
## RAG This is a non-finetuned version of the RAG-Token model of the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al. RAG consists of a *question encoder*, a *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. This model is a non-finetuned RAG-Token model and was created as follows: Note that the model is *uncased* so that all capital input letters are converted to lower-case. ## Usage: *Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, by setting 'config.index_name="legacy"' and 'config.use_dummy_dataset=False'. The model can be fine-tuned as follows:
[ "## RAG\n\nThis is a non-finetuned version of the RAG-Token model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Token model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.", "## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ "TAGS\n#transformers #pytorch #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "## RAG\n\nThis is a non-finetuned version of the RAG-Token model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Token model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.", "## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ 53, 196, 78 ]
[ "passage: TAGS\n#transformers #pytorch #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## RAG\n\nThis is a non-finetuned version of the RAG-Token model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nRag consits of a *question encoder*, *retriever* and a *generator*. The retriever should be a 'RagRetriever' instance. The *question encoder* can be any model that can be loaded with 'AutoModel' and the *generator* can be any model that can be loaded with 'AutoModelForSeq2SeqLM'. \n\nThis model is a non-finetuned RAG-Token model and was created as follows:\n\n\n\nNote that the model is *uncased* so that all capital input letters are converted to lower-case.## Usage:\n\n*Note*: the model uses the *dummy* retriever as a default. Better results are obtained by using the full retriever, \nby setting 'config.index_name=\"legacy\"' and 'config.use_dummy_dataset=False'.\nThe model can be fine-tuned as follows:" ]
[ -0.02238382212817669, 0.1485275775194168, -0.008447774685919285, 0.06364286690950394, 0.08844640105962753, 0.0015449377242475748, 0.03334598243236542, 0.08704488724470139, -0.03939009830355644, 0.051128122955560684, 0.0543777160346508, 0.13446734845638275, 0.02717200480401516, 0.08041713386774063, 0.01309212762862444, -0.20997266471385956, 0.0777411088347435, -0.04513208940625191, 0.03797236084938049, 0.06011413037776947, 0.08996490389108658, -0.06163749843835831, 0.07486402243375778, 0.0041512381285429, -0.07727327942848206, 0.036519598215818405, 0.003481883555650711, 0.01244933158159256, 0.03156456723809242, 0.03142332658171654, 0.10953167825937271, 0.015017597004771233, 0.08356844633817673, -0.15282003581523895, 0.017159583047032356, 0.06713344156742096, 0.01534073892980814, 0.047813065350055695, 0.06344430148601532, -0.025005802512168884, 0.0878177359700203, -0.013591103255748749, 0.030287619680166245, 0.048298995941877365, -0.09923871606588364, -0.16352149844169617, -0.047094590961933136, 0.08233071863651276, 0.013662259094417095, 0.039444904774427414, -0.01882082037627697, -0.0029142373241484165, 0.03224097564816475, 0.04517006501555443, 0.11318071186542511, -0.19701656699180603, -0.018421506509184837, 0.1708797961473465, -0.04712878167629242, 0.02887328341603279, -0.03571441397070885, -0.04130689427256584, -0.038162022829055786, 0.0037206325214356184, 0.0881035253405571, -0.011246027424931526, -0.08415385335683823, 0.0028804708272218704, -0.12251601368188858, -0.037824735045433044, 0.16148293018341064, -0.013736911118030548, -0.06296611577272415, -0.057232487946748734, -0.0684141144156456, 0.016315767541527748, 0.03356199339032173, -0.09109282493591309, 0.013367673382163048, 0.08751559257507324, 0.039910949766635895, -0.10719559341669083, -0.07506471872329712, -0.02576061524450779, -0.09883803874254227, 0.042684346437454224, 0.019419917836785316, 0.028155893087387085, -0.05674610286951065, 0.10739199817180634, -0.036853689700365067, -0.06417715549468994, -0.10862038284540176, -0.06785574555397034, -0.07841718941926956, -0.04183109104633331, -0.019622646272182465, -0.08473849296569824, 0.0016156064812093973, 0.17505885660648346, 0.018691401928663254, 0.0226588174700737, -0.04246441647410393, 0.02788197249174118, 0.022074725478887558, 0.12877371907234192, -0.09970014542341232, 0.05504218488931656, 0.04314069077372551, -0.020981138572096825, 0.014965510927140713, -0.029540836811065674, -0.03880840912461281, -0.021824533119797707, 0.0069447108544409275, 0.04583877697587013, 0.029110902920365334, 0.09960941970348358, -0.03376847133040428, -0.055506374686956406, 0.14319193363189697, -0.1300877183675766, -0.012898956425487995, 0.03387845680117607, -0.08032218366861343, 0.03800298646092415, 0.08615873008966446, -0.01259862631559372, -0.05081005394458771, 0.008033180609345436, -0.046338826417922974, -0.05558113753795624, -0.0708124190568924, -0.11163628846406937, -0.0041029192507267, -0.05003063753247261, -0.018466319888830185, -0.12170037627220154, -0.25335487723350525, -0.03684613108634949, 0.0861610397696495, 0.019074248149991035, -0.008721771650016308, -0.03318184241652489, -0.008442319929599762, -0.04199735075235367, -0.033170316368341446, -0.15505379438400269, -0.007857448421418667, 0.014270962215960026, -0.05742054805159569, 0.03709787875413895, -0.015354032628238201, 0.03982490673661232, -0.10744944214820862, 0.01553761214017868, -0.19097945094108582, 0.12269385159015656, -0.0037563967052847147, 0.027009492740035057, -0.13169220089912415, -0.03838393837213516, -0.03357882797718048, 
-0.024329708889126778, 0.015586701221764088, 0.07854904979467392, -0.11201051622629166, -0.003930377308279276, 0.12779875099658966, -0.09175415337085724, 0.0015373991336673498, 0.047676555812358856, -0.02770821750164032, 0.14525459706783295, 0.0908818393945694, 0.08318465203046799, 0.1532188057899475, -0.053392115980386734, -0.0608229823410511, -0.019378118216991425, -0.032360468059778214, 0.08146568387746811, 0.036020681262016296, -0.05889609456062317, -0.02382742054760456, 0.038026437163352966, -0.04721591994166374, -0.012510639615356922, 0.03577262908220291, -0.0664002075791359, -0.026761434972286224, -0.05102638527750969, 0.0542367622256279, -0.01725144311785698, -0.015164043754339218, 0.07631780207157135, -0.09317807108163834, 0.04041924700140953, 0.07395293563604355, -0.06981758028268814, -0.0019016260048374534, -0.09969097375869751, 0.04518615081906319, -0.05955508351325989, 0.02988552115857601, -0.1809372454881668, -0.1595253050327301, 0.014222905971109867, -0.13821634650230408, 0.020767560228705406, 0.041574180126190186, 0.043220773339271545, 0.02510770782828331, 0.0123723354190588, 0.010726569220423698, 0.03069951757788658, -0.050832126289606094, 0.0109107019379735, -0.02696981281042099, -0.08708421140909195, -0.02859095297753811, 0.08336012810468674, -0.14007069170475006, 0.062486741691827774, -0.007105489261448383, 0.02904573082923889, 0.02735322341322899, -0.06470775604248047, 0.020550750195980072, -0.06236114352941513, -0.015054453164339066, -0.05189688876271248, 0.017385689541697502, 0.05233708396553993, -0.0050975969061255455, 0.027031878009438515, -0.19921162724494934, -0.08711621165275574, 0.032715871930122375, 0.07089459896087646, -0.060674771666526794, 0.0006663668318651617, -0.0753951370716095, 0.042377930134534836, -0.07342085987329483, -0.00517023541033268, 0.1463221311569214, 0.048728443682193756, 0.06887492537498474, -0.04886068031191826, -0.049279432743787766, -0.01631338894367218, 0.003934632055461407, 0.006364983040839434, 0.00046545499935746193, 0.0370367057621479, -0.10597708076238632, 0.03317902982234955, -0.002889183582738042, -0.009731330908834934, 0.08041585981845856, 0.03193619102239609, -0.040140558034181595, -0.008669715374708176, 0.05413544923067093, 0.0010426642838865519, -0.050576455891132355, 0.06781095266342163, 0.04102715104818344, 0.042522504925727844, -0.0017146561294794083, 0.0005362770170904696, -0.05634068697690964, 0.04907580465078354, 0.06790481507778168, -0.06559427082538605, -0.06741965562105179, 0.04923734441399574, -0.0007824169006198645, 0.024295954033732414, 0.01942010037600994, 0.017462624236941338, 0.01901889219880104, -0.005903652403503656, -0.0848870649933815, 0.12349720299243927, -0.08271238952875137, -0.12220332026481628, -0.1550457775592804, -0.058366890996694565, -0.0935852974653244, 0.030546147376298904, 0.054460491985082626, -0.050080735236406326, -0.0774446427822113, -0.06803548336029053, 0.09544451534748077, 0.00046044166083447635, -0.06386216729879379, -0.042117998003959656, -0.04438561201095581, 0.06827457994222641, -0.1297248899936676, -0.011291609145700932, 0.00302019901573658, -0.14356113970279694, -0.0009308740845881402, 0.018144814297556877, 0.0534692220389843, 0.1089916005730629, -0.02666516788303852, -0.000850144715514034, 0.012391760013997555, 0.2365298718214035, 0.011679280549287796, 0.10254459828138351, 0.1312227100133896, -0.08821903169155121, 0.0864342525601387, 0.17099051177501678, 0.01545336190611124, -0.06629247218370438, 0.038609009236097336, 0.053193092346191406, -0.03630692511796951, 
-0.22876417636871338, -0.040831033140420914, -0.07806475460529327, -0.04806628078222275, 0.13888418674468994, 0.058151453733444214, 0.03223891928792, 0.08573352545499802, -0.08571351319551468, 0.021235303953289986, 0.0846155434846878, 0.08097438514232635, 0.1432046890258789, 0.03792192414402962, 0.07945819199085236, -0.03981243818998337, -0.051269274204969406, 0.023288842290639877, 0.041339680552482605, 0.1789250373840332, -0.05106266960501671, 0.10812145471572876, 0.07438480108976364, 0.10757942497730255, 0.030341736972332, 0.13764679431915283, -0.06034812703728676, 0.08649444580078125, -0.0012727243592962623, -0.07329116761684418, -0.04217737168073654, 0.04943052679300308, -0.049652744084596634, -0.023100441321730614, 0.007703807204961777, -0.008197343908250332, -0.004110767040401697, 0.16576038300991058, 0.04787341505289078, -0.23359471559524536, -0.08699150383472443, 0.009806780144572258, 0.053149256855249405, -0.08840484172105789, -0.025219392031431198, 0.0003312215849291533, -0.06244484707713127, 0.06182552129030228, -0.01896851509809494, 0.0751904845237732, -0.09664611518383026, 0.0490591786801815, -0.0006161113269627094, 0.1252051740884781, 0.030830567702651024, 0.07677419483661652, -0.06777919083833694, 0.05734771862626076, 0.03158636763691902, 0.08483782410621643, -0.03312263637781143, 0.07125095278024673, 0.04824148863554001, 0.04826205596327782, 0.1240658164024353, 0.01634746789932251, -0.09581653028726578, -0.11490020155906677, -0.0662209764122963, 0.020193103700876236, 0.04665467143058777, -0.1179850846529007, 0.09179814904928207, -0.04345674440264702, -0.012764988467097282, -0.0033505307510495186, 0.0859023854136467, -0.1342199146747589, -0.19281013309955597, 0.0500936396420002, 0.01409884262830019, 0.15137210488319397, 0.0060335309244692326, 0.035863958299160004, -0.007619423326104879, 0.16668149828910828, -0.047654420137405396, -0.09250891208648682, -0.10417726635932922, 0.010488311760127544, 0.10101219266653061, -0.09238064289093018, 0.013365899212658405, -0.05322306975722313, 0.1383688896894455, 0.008214757777750492, -0.17286166548728943, -0.010603535920381546, -0.04589499160647392, -0.005008416716009378, 0.003513145260512829, -0.08899952471256256, 0.09698204696178436, 0.012920260429382324, 0.04929053783416748, 0.06808026134967804, -0.0125225018709898, -0.09691707044839859, 0.008258421905338764, 0.18967348337173462, -0.035297997295856476, 0.014970434829592705, -0.14848217368125916, -0.022829854860901833, -0.07254859805107117, 0.06311124563217163, 0.10525315254926682, 0.08152654021978378, -0.08578964322805405, 0.07119059562683105, 0.17657075822353363, -0.10042573511600494, -0.21055017411708832, -0.051256921142339706, 0.03386843949556351, 0.00873508770018816, -0.0438869446516037, -0.29531586170196533, 0.15391753613948822, 0.07487168908119202, -0.027861077338457108, -0.04674268886446953, -0.17671039700508118, -0.07739243656396866, 0.14747436344623566, -0.004091139882802963, 0.09167098999023438, -0.055891428142786026, -0.04043405130505562, -0.029207363724708557, -0.08801554143428802, 0.14359356462955475, -0.05844356492161751, 0.04586632922291756, 0.0031533155124634504, 0.03723246976733208, 0.06663377583026886, -0.017028912901878357, 0.047861192375421524, 0.03880833834409714, 0.02911863848567009, -0.0666203647851944, 0.09314984828233719, 0.0578479990363121, -0.09212788939476013, 0.21887537837028503, 0.061105210334062576, 0.05609753727912903, -0.0802685022354126, -0.06882957369089127, -0.019720077514648438, 0.06134520471096039, 0.0007283389568328857, -0.08059276640415192, 
-0.024055350571870804, 0.02358848787844181, 0.14341512322425842, 0.015530292876064777, 0.029775327071547508, -0.013893427327275276, 0.15969619154930115, 0.08143576234579086, 0.04576914384961128, -0.012878482230007648, -0.1116943508386612, 0.011591342277824879, -0.013316909782588482, 0.0803481787443161, -0.07136404514312744, 0.03966150060296059, 0.12247872352600098, -0.008788015693426132, 0.09581896662712097, 0.06330052018165588, -0.09487848728895187, -0.025081327185034752, 0.03982112556695938, -0.09222272783517838, -0.0932282879948616, 0.011092993430793285, 0.04976179450750351, -0.14508932828903198, -0.02765575423836708, 0.1263381838798523, -0.01929350197315216, -0.020455503836274147, 0.04725055396556854, 0.03601149469614029, 0.03554924577474594, 0.06805726885795593, 0.0404251404106617, 0.014434278942644596, -0.06728436052799225, 0.07108795642852783, 0.14219503104686737, -0.06773059070110321, 0.027210569009184837, -0.015724048018455505, -0.0937214344739914, -0.060186717659235, -0.034185741096735, 0.06756126135587692, -0.10820882767438889, -0.0075067454017698765, 0.007659540977329016, -0.06372993439435959, 0.04648993909358978, 0.046108778566122055, 0.021693861111998558, 0.046536389738321304, -0.053951285779476166, -0.012659218162298203, -0.07588249444961548, 0.08098609000444412, 0.047142114490270615, 0.026207245886325836, -0.08227118849754333, -0.0035751566756516695, -0.03068399988114834, 0.025428036227822304, 0.012598633766174316, -0.043520282953977585, -0.04235134273767471, -0.0198704581707716, -0.05152875930070877, 0.017934296280145645, -0.05889219418168068, 0.02968912199139595, 0.023963745683431625, -0.0005448412266559899, -0.0065060388296842575, 0.021584102883934975, -0.09885676205158234, -0.06461039185523987, -0.02823222428560257, 0.037291280925273895, -0.16698510944843292, -0.025429874658584595, 0.02980238012969494, -0.0827980637550354, 0.06125299632549286, 0.033065613359212875, -0.06449122726917267, 0.03236094489693642, -0.08453337848186493, -0.07294952124357224, 0.00031719679827801883, 0.06426650285720825, 0.010610471479594707, -0.02880861982703209, 0.01970679871737957, 0.009859403595328331, -0.07746158540248871, -0.018096668645739555, 0.022335322573781013, -0.08816123753786087, 0.01626666635274887, 0.010489940643310547, 0.06542391330003738, -0.05402158200740814, 0.027312491089105606, 0.008789732120931149, 0.058287832885980606, 0.1310138702392578, -0.018694372847676277, 0.04082675278186798, -0.12315048277378082, -0.03511016443371773, 0.019499393180012703, -0.03715885058045387, -0.0015483371680602431, -0.05896100029349327, 0.034676626324653625, -0.020958824083209038, 0.0683332160115242, -0.010116158053278923, -0.02041444182395935, -0.0028235826175659895, 0.06639758497476578, -0.0685814917087555, -0.014097041450440884, 0.08386251330375671, 0.09589271247386932, -0.024712081998586655, 0.06472691148519516, 0.019856883212924004, -0.0042106290347874165, 0.06086905673146248, 0.15978944301605225, 0.016545087099075317, 0.07327596843242645, -0.00824401993304491, 0.02489813044667244, -0.049534786492586136, -0.012524998746812344, 0.07584799081087112, -0.05822426453232765, 0.04591963067650795, 0.005211449693888426, -0.010585319250822067, 0.12490274012088776, -0.06171257048845291, 0.09766461700201035, 0.032647691667079926, -0.03480524569749832, -0.1730191856622696, -0.1982019543647766, -0.0509815439581871, -0.058440111577510834, -0.010572962462902069, -0.09953482449054718, 0.024217108264565468, 0.018925361335277557, -0.0032773248385638, 0.03225806728005409, 0.05822611600160599, 
-0.016896262764930725, -0.0716712474822998, -0.010788503102958202, -0.0505790077149868, -0.023000257089734077, 0.05878232419490814, 0.0014975941739976406, 0.0677022784948349, 0.11572857201099396, 0.07682578265666962, 0.06459280103445053, 0.12134195864200592, 0.04393410310149193, -0.0658840760588646, -0.04729912057518959, -0.03617090731859207, 0.038248591125011444, -0.07473772019147873, 0.057102762162685394, 0.02626548893749714, -0.017193906009197235, -0.0026841943617910147, 0.12586624920368195, 0.016354883089661598, -0.0908554419875145, -0.12269716709852219, 0.15815210342407227, 0.09155014902353287, 0.04060565307736397, 0.01799708604812622, -0.10262329876422882, -0.024410994723439217, 0.18491974472999573, 0.14308515191078186, 0.07050963491201401, 0.00809545535594225, 0.03387070447206497, 0.012851296924054623, 0.029329024255275726, 0.07461364567279816, 0.048193465918302536, 0.2283606082201004, -0.05623265355825424, 0.07603228092193604, 0.002041483763605356, -0.005481216125190258, -0.059156086295843124, 0.02844892628490925, -0.03344341740012169, -0.015042278915643692, -0.029088545590639114, 0.04258502274751663, -0.13695195317268372, -0.08774316310882568, -0.012485472485423088, 0.001330775790847838, -0.050878386944532394, -0.05986788868904114, 0.01732134260237217, -0.01900113932788372, 0.05179664492607117, -0.04615157097578049, -0.03827567771077156, 0.23586583137512207, -0.022596491500735283, -0.12014016509056091, -0.05822277069091797, 0.04981542378664017, 0.08479762822389603, 0.1659853458404541, 0.043924860656261444, -0.026680750772356987, 0.06426189839839935, -0.0412081740796566, -0.15920455753803253, 0.02593318000435829, -0.00713264150545001, -0.024746475741267204, 0.000280499632935971, 0.06928485631942749, -0.045674510300159454, 0.08496315032243729, 0.025409474968910217, -0.06418147683143616, 0.017514703795313835, 0.07104212790727615, -0.03313630819320679, -0.04319191724061966, 0.0011300042970106006, -0.06630248576402664, 0.07083239406347275, 0.13829906284809113, -0.010350465774536133, 0.016605330631136894, -0.06523193418979645, 0.051776956766843796, 0.024778233841061592, 0.0009191619465127587, -0.0029643417801707983, -0.06575632095336914, -0.001454085810109973, -0.051188427954912186, 0.05836363881826401, -0.1397130936384201, -0.06183678284287453, 0.07446828484535217, -0.017846520990133286, -0.050895046442747116, 0.10939735174179077, 0.08356073498725891, 0.04837602749466896, -0.03430875018239021, -0.017021948471665382, -0.03176232799887657, 0.03800682723522186, -0.09774564206600189, -0.06008428707718849 ]
null
null
transformers
## RAG

This is the RAG-Token Model of the paper [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks](https://arxiv.org/pdf/2005.11401.pdf) by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al.

The model is an *uncased* model, which means that capital letters are simply converted to lower-case letters.

The model consists of a *question_encoder*, a *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* `train` dataset, which is linked above. The *question_encoder* is based on `facebook/dpr-question_encoder-single-nq-base` and the *generator* on `facebook/bart-large`; both were jointly fine-tuned on the *wiki_dpr* QA dataset in an end-to-end fashion.

## Usage:

**Note**: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *legacy* index requires over 75 GB of RAM. The model can generate answers to any factoid question as follows:

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration

tokenizer = RagTokenizer.from_pretrained("facebook/rag-token-nq")
retriever = RagRetriever.from_pretrained("facebook/rag-token-nq", index_name="exact", use_dummy_dataset=True)
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-nq", retriever=retriever)

input_dict = tokenizer.prepare_seq2seq_batch("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])

# should give michael phelps => sounds reasonable
```
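The same pipeline also works on a batch of questions; a minimal sketch, assuming `tokenizer`, `retriever` and `model` are loaded as above (the second question is only an illustrative placeholder):

```python
questions = [
    "who holds the record in 100m freestyle",
    "when was the eiffel tower built",  # hypothetical extra question
]
# prepare_seq2seq_batch pads the questions to a common length and
# returns both input_ids and the matching attention_mask.
input_dict = tokenizer.prepare_seq2seq_batch(questions, return_tensors="pt")
generated = model.generate(
    input_ids=input_dict["input_ids"],
    attention_mask=input_dict["attention_mask"],
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```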
{"language": "en", "license": "apache-2.0", "datasets": ["wiki_dpr"], "thumbnail": "https://huggingface.co/front/thumbnails/facebook.png"}
null
facebook/rag-token-nq
[ "transformers", "pytorch", "tf", "rag", "en", "dataset:wiki_dpr", "arxiv:2005.11401", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2005.11401" ]
[ "en" ]
TAGS #transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us
## RAG This is the RAG-Token Model of the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al. The model is an *uncased* model, which means that capital letters are simply converted to lower-case letters. The model consists of a *question_encoder*, a *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' dataset, which is linked above. The *question_encoder* is based on 'facebook/dpr-question_encoder-single-nq-base' and the *generator* on 'facebook/bart-large'; both were jointly fine-tuned on the *wiki_dpr* QA dataset in an end-to-end fashion. ## Usage: Note: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *legacy* index requires over 75 GB of RAM. The model can generate answers to any factoid question as follows:
[ "## RAG\n\nThis is the RAG-Token Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consists of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.", "## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ "TAGS\n#transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "## RAG\n\nThis is the RAG-Token Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consists of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.", "## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ 56, 204, 60 ]
[ "passage: TAGS\n#transformers #pytorch #tf #rag #en #dataset-wiki_dpr #arxiv-2005.11401 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n## RAG\n\nThis is the RAG-Token Model of the the paper Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks \nby Patrick Lewis, Ethan Perez, Aleksandara Piktus et al.\n\nThe model is a *uncased* model, which means that capital letters are simply converted to lower-case letters.\n\nThe model consists of a *question_encoder*, *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* 'train' datasets, which is linked above.\nThe question_encoder and retriever are based on 'facebook/dpr-question_encoder-single-nq-base' and 'facebook/bart-large', which were jointly finetuned on \non the *wiki_dpr* QA dataset in an end-to-end fashion.## Usage:\n\nNote: In the usage example below only the *dummy* retriever of *wiki_dpr* is used because the complete *lecagy* index requires over 75 GB of RAM.\nThe model can generate answers to any factoid question as follows:" ]
[ 0.006208499893546104, 0.06352369487285614, -0.00470673106610775, 0.0874040350317955, 0.03552493080496788, 0.03469192609190941, 0.02351185493171215, 0.09985076636075974, 0.013074792921543121, 0.053588639944791794, 0.1038116067647934, -0.022796114906668663, 0.027579011395573616, 0.04383014887571335, -0.002949318615719676, -0.1854538470506668, 0.02885506860911846, 0.009340980090200901, -0.011291646398603916, 0.053667765110731125, 0.06558018922805786, -0.04551665857434273, 0.06347749382257462, 0.005250702612102032, -0.08059792965650558, 0.07370533794164658, 0.03946051374077797, -0.024578195065259933, 0.10089892894029617, 0.08607880771160126, 0.10188566893339157, 0.009811454452574253, 0.03353755548596382, -0.1639443039894104, 0.03054351732134819, 0.13248446583747864, 0.0010944323148578405, 0.05645725131034851, -0.0042738704942166805, 0.012776794843375683, 0.10788360238075256, 0.03753584995865822, 0.045767344534397125, 0.02783946320414543, -0.07289297133684158, -0.2392861396074295, -0.007462481036782265, 0.1311631053686142, 0.021122334524989128, 0.0482594259083271, -0.01502632163465023, 0.0014006284764036536, -0.00817716121673584, 0.047377873212099075, 0.18150010704994202, -0.22476567327976227, -0.041242968291044235, 0.10641276091337204, 0.03956528753042221, 0.062218476086854935, -0.01345390360802412, 0.005646055098623037, 0.02693331241607666, 0.04533889889717102, 0.03670588508248329, -0.04693133011460304, 0.003800592152401805, 0.006742547731846571, -0.09642421454191208, -0.025649601593613625, 0.141593798995018, 0.04994206503033638, -0.0221912432461977, 0.006167856976389885, -0.12528397142887115, 0.13396167755126953, -0.009203651919960976, -0.08529739081859589, -0.026050690561532974, 0.09461566805839539, 0.0021813204512000084, -0.09479613602161407, -0.0541522391140461, -0.04203678295016289, -0.09211660176515579, -0.0022082177456468344, -0.0009612767607904971, 0.07719264924526215, -0.04247022420167923, 0.11849109083414078, -0.13822844624519348, -0.11300656944513321, 0.016535811126232147, -0.11479244381189346, -0.06148955225944519, 0.04419195279479027, -0.02964099310338497, -0.04507637023925781, 0.06547977030277252, 0.14966218173503876, -0.003997586667537689, 0.02929854206740856, -0.08918032050132751, 0.0351753756403923, 0.006221242249011993, 0.10601263493299484, -0.11852872371673584, -0.07061707973480225, 0.097287118434906, -0.002683195285499096, -0.010445250198245049, -0.06111492961645126, -0.057143550366163254, 0.001378159853629768, -0.030031902715563774, -0.019391076639294624, 0.07409431785345078, 0.03785620629787445, -0.02330857142806053, -0.05797487869858742, 0.09146280586719513, -0.08041280508041382, -0.029071206226944923, 0.030955569818615913, -0.005254156421869993, 0.045301858335733414, 0.0835447683930397, 0.004100304562598467, -0.027968931943178177, 0.006708331406116486, -0.06386538594961166, -0.08266689628362656, -0.0670183002948761, -0.11713064461946487, 0.02157033421099186, -0.06417407095432281, -0.010160102508962154, -0.13623479008674622, -0.16539454460144043, 0.0028598480857908726, 0.1015864908695221, -0.032610539346933365, -0.04508572816848755, 0.034454863518476486, 0.03361721336841583, 0.008437932468950748, -0.006773183587938547, -0.03721219301223755, -0.02153068035840988, 0.07922644913196564, -0.07789886742830276, 0.07022728770971298, -0.06011692434549332, 0.03724643588066101, -0.06280702352523804, 0.06601787358522415, -0.16509680449962616, 0.0970945656299591, -0.035517457872629166, -0.02317865937948227, -0.09299422800540924, -0.01652141474187374, -0.12233837693929672, 
-0.007983803749084473, 0.04934817552566528, 0.06895263493061066, -0.17174872756004333, -0.03886346518993378, 0.061549071222543716, -0.09485677629709244, -0.03151148185133934, 0.14939351379871368, -0.060497526079416275, 0.1077050045132637, 0.07636425644159317, 0.16934047639369965, 0.1508411169052124, -0.09280615299940109, -0.01558862254023552, 0.0840379148721695, -0.03279761224985123, 0.07665901631116867, 0.015403054654598236, -0.02456170879304409, -0.05284200236201286, 0.03844555467367172, 0.036409713327884674, 0.03864188492298126, -0.013115313835442066, -0.06366927921772003, -0.05574173107743263, -0.03351019695401192, 0.04534408077597618, -0.00004620790423359722, -0.03353242576122284, 0.03135157749056816, -0.10709558427333832, 0.036552976816892624, 0.06066540628671646, -0.05754617974162102, 0.012240729294717312, -0.018561875447630882, 0.055494897067546844, -0.07717510312795639, 0.06155316159129143, -0.1443299800157547, -0.06585296243429184, 0.0162862129509449, -0.09821633249521255, -0.0017146308673545718, 0.1281810849905014, 0.01167211402207613, -0.048166316002607346, -0.03469957038760185, 0.05988577753305435, -0.0465969480574131, 0.000016476556993438862, -0.02623005583882332, -0.05787079036235809, -0.0019387819338589907, -0.03733418509364128, -0.04530559852719307, -0.0670742467045784, 0.02041899971663952, 0.010243836790323257, 0.10256455838680267, -0.00904864352196455, -0.005533343646675348, -0.021303607150912285, 0.0015104158082976937, -0.00002852036413969472, -0.04156096652150154, 0.04021593555808067, 0.013727014884352684, 0.001851128414273262, -0.022542372345924377, -0.08119694888591766, 0.026607029139995575, 0.0697731301188469, 0.05749315768480301, -0.042701318860054016, 0.0768171101808548, -0.007360395509749651, -0.021184496581554413, -0.08387263864278793, -0.06814932823181152, 0.218282088637352, 0.05517309904098511, 0.04527261480689049, -0.09018919616937637, -0.01475994847714901, 0.04377463087439537, 0.022915855050086975, 0.014402677305042744, 0.017313078045845032, 0.10846424847841263, -0.11085707694292068, 0.03608490154147148, 0.10060090571641922, -0.017755180597305298, 0.16699443757534027, -0.03358171880245209, -0.053471773862838745, -0.010416395030915737, -0.019140012562274933, -0.05230667069554329, 0.0922531858086586, -0.047298237681388855, 0.07175330072641373, 0.035628244280815125, 0.038751374930143356, 0.011361910961568356, -0.037573013454675674, 0.021251730620861053, 0.009053677320480347, -0.04283278435468674, -0.1028527095913887, 0.02373882569372654, 0.03724927455186844, 0.06346649676561356, 0.059817105531692505, 0.012390884570777416, 0.024548301473259926, 0.004781543742865324, -0.06529220938682556, 0.15121620893478394, -0.08909384906291962, -0.21908196806907654, -0.06837362051010132, 0.08096517622470856, -0.07593938708305359, -0.04044689983129501, 0.06223972514271736, -0.08184327185153961, 0.0016730150673538446, -0.04360351338982582, 0.09970351308584213, -0.013867042027413845, -0.03894774243235588, -0.023403743281960487, -0.04597935080528259, 0.05411108583211899, -0.17297062277793884, 0.007242557592689991, -0.024311158806085587, -0.005028937943279743, -0.040087733417749405, 0.0019595022313296795, 0.0502956323325634, 0.10596957802772522, -0.024734705686569214, -0.0033038875553756952, 0.015217345207929611, 0.25185665488243103, -0.040354933589696884, 0.1234239786863327, 0.1343180239200592, -0.03581085056066513, 0.07663530856370926, 0.15505890548229218, 0.014800637029111385, -0.049473825842142105, 0.011409598402678967, 0.09198779612779617, -0.09107954800128937, 
-0.1449100226163864, -0.012321630492806435, -0.041449058800935745, 0.026730194687843323, 0.06570635735988617, 0.016390126198530197, -0.11147591471672058, 0.04325275868177414, -0.036510489881038666, 0.03354876860976219, -0.0017659860895946622, 0.033158861100673676, 0.23759981989860535, -0.013203051872551441, 0.10102689266204834, -0.03988203778862953, -0.036551572382450104, 0.09080767631530762, -0.00311757973395288, 0.1710159033536911, -0.07605130970478058, -0.029819050803780556, 0.043761394917964935, 0.12540921568870544, 0.04918881505727768, 0.1131669133901596, -0.10866514593362808, 0.04363954812288284, -0.04152198135852814, -0.017015570774674416, -0.0038886135444045067, 0.055510424077510834, -0.04752236604690552, -0.07474886626005173, 0.01905926503241062, -0.02475498989224434, 0.053709354251623154, 0.24608656764030457, 0.06084396317601204, -0.08271650969982147, -0.06456220149993896, -0.005003882106393576, -0.021356578916311264, -0.0873570516705513, 0.06556441634893417, 0.09485975652933121, -0.05824510380625725, 0.026287006214261055, -0.015775253996253014, 0.08867235481739044, -0.033452413976192474, 0.05483315512537956, 0.013759850524365902, 0.04189883917570114, -0.017539044842123985, 0.06411940604448318, -0.11281909793615341, 0.07633235305547714, 0.04011431708931923, -0.0034274766221642494, -0.0035650283098220825, 0.022384444251656532, 0.03403221070766449, 0.028928110376000404, 0.08451318740844727, 0.010766169987618923, 0.015550846233963966, -0.05379423871636391, -0.06062309816479683, 0.10507117956876755, 0.0019292299402877688, -0.06892634183168411, 0.022810233756899834, 0.0035297151189297438, 0.050183769315481186, 0.04973698407411575, 0.1425626575946808, -0.110379159450531, -0.14074988663196564, -0.005426506511867046, -0.05061435326933861, 0.03177417814731598, -0.013648511841893196, 0.0008965415181592107, 0.05808902531862259, 0.10162878036499023, -0.04242658615112305, -0.07353738695383072, -0.10407432913780212, 0.025760462507605553, 0.12194591760635376, -0.05976783111691475, 0.010594635270535946, -0.03829171881079674, 0.039529383182525635, -0.05896880105137825, -0.18696439266204834, 0.05807601287961006, -0.04803738370537758, -0.04808438941836357, -0.012171512469649315, 0.015130887739360332, 0.043332312256097794, -0.0036974952090531588, 0.03577983379364014, 0.0176842138171196, -0.06058754399418831, -0.12498835474252701, -0.007516083307564259, 0.0007516842451877892, -0.05578174442052841, -0.05528673157095909, -0.07094760239124298, 0.09666649252176285, -0.013189380057156086, 0.059451423585414886, 0.07487478107213974, 0.08823801577091217, -0.12800903618335724, 0.09440387785434723, 0.17327530682086945, -0.06774883717298508, -0.19799746572971344, -0.0700988918542862, 0.018270863220095634, -0.02887335605919361, -0.00539854122325778, -0.18946310877799988, 0.12616705894470215, 0.08417905867099762, -0.007421789690852165, 0.00025422609178349376, -0.09364674240350723, -0.07562844455242157, 0.17129361629486084, -0.03455023095011711, 0.240212544798851, -0.06947183609008789, 0.010966513305902481, -0.03616183251142502, -0.055098649114370346, 0.14019766449928284, -0.18720175325870514, 0.050971269607543945, -0.009696452878415585, 0.000249116332270205, 0.022019293159246445, -0.009245466440916061, 0.03446841984987259, 0.001638439018279314, 0.06418454647064209, -0.013603945262730122, 0.02774467132985592, -0.03506704047322273, -0.03363661468029022, 0.11547914892435074, -0.008839455433189869, 0.09494509547948837, -0.03853851929306984, -0.05631620064377785, -0.057573042809963226, 0.08789193630218506, 
-0.03609876334667206, -0.1207762286067009, -0.07960114628076553, -0.004121613688766956, 0.05125144496560097, -0.030348457396030426, -0.05206463113427162, -0.017617395147681236, 0.09383481740951538, 0.19309502840042114, 0.011831245385110378, -0.059602756053209305, -0.12366365641355515, 0.04801833629608154, -0.0348949134349823, 0.13616719841957092, -0.2175324261188507, 0.03933914005756378, 0.08442375808954239, -0.0007906697574071586, 0.07451114058494568, 0.020075779408216476, -0.09556075185537338, 0.017999563366174698, 0.02979116514325142, -0.14244741201400757, -0.11018890142440796, -0.013842918910086155, -0.17837397754192352, -0.08013984560966492, -0.0928947776556015, 0.14475342631340027, -0.040138375014066696, 0.007895782589912415, 0.015213433653116226, 0.012307441793382168, -0.013932904228568077, 0.06563401222229004, 0.06534687429666519, 0.03140389546751976, -0.05034845694899559, 0.05262571945786476, 0.06651459634304047, -0.11247130483388901, 0.05332634598016739, 0.018384292721748352, -0.10177478194236755, -0.04152720049023628, -0.06421023607254028, 0.14147235453128815, -0.061871178448200226, -0.009181113913655281, -0.06012967601418495, -0.06498968601226807, 0.027198316529393196, 0.1198718249797821, 0.011991046369075775, 0.028531819581985474, -0.035430651158094406, -0.031191159039735794, -0.017924001440405846, 0.11780780553817749, 0.02164897881448269, -0.017364712432026863, -0.0677725225687027, -0.08715655654668808, 0.03602650389075279, 0.06639871001243591, -0.03495894744992256, 0.003846472594887018, -0.1151890680193901, -0.0321178063750267, -0.2298995703458786, -0.05004620924592018, 0.005694747902452946, 0.03162793815135956, -0.015880560502409935, -0.016836196184158325, -0.014585142955183983, -0.012329989112913609, -0.036375246942043304, -0.01759054698050022, -0.00874223280698061, 0.04592349752783775, -0.1302216649055481, -0.016139494255185127, 0.05218084901571274, -0.025208065286278725, 0.09051475673913956, 0.034314606338739395, -0.03864193707704544, 0.009825590997934341, -0.04753512144088745, -0.004450102802366018, -0.01391427218914032, 0.08243371546268463, -0.01159600354731083, -0.03477535396814346, 0.06484328955411911, -0.01587422378361225, -0.02319415472447872, -0.008356673642992973, 0.06575120240449905, -0.10847026854753494, 0.04446118324995041, -0.006420552730560303, -0.0437508225440979, -0.042274873703718185, -0.050453778356313705, 0.06404265016317368, 0.13841040432453156, 0.09073677659034729, -0.07977539300918579, 0.06286070495843887, -0.18347975611686707, -0.06389213353395462, 0.03336294740438461, -0.012110654264688492, 0.08182574063539505, -0.07961371541023254, 0.01988542079925537, -0.026852065697312355, 0.1784326285123825, 0.026839710772037506, -0.037946924567222595, 0.013384979218244553, -0.09593521058559418, -0.03776584193110466, -0.021176643669605255, 0.04018130525946617, 0.03994560241699219, -0.024572143331170082, -0.0638822615146637, 0.023115579038858414, 0.025214632973074913, 0.1348681002855301, 0.08764302730560303, 0.10372694581747055, 0.08113566786050797, -0.05411615967750549, 0.09038429707288742, -0.04555249214172363, 0.03228238597512245, 0.0903083011507988, -0.01695280894637108, 0.08554163575172424, -0.018032174557447433, -0.06577830016613007, 0.13541515171527863, -0.06069681793451309, 0.08404024690389633, -0.0015607218956574798, -0.03549604117870331, -0.1422390341758728, -0.05642411485314369, -0.04126355051994324, -0.1742544323205948, 0.004370022565126419, -0.11252705752849579, -0.01973738893866539, 0.09612200409173965, -0.01520586758852005, -0.037431396543979645, 
0.07439270615577698, -0.0018972951220348477, -0.08174470067024231, 0.03138239309191704, 0.025050245225429535, -0.05666795000433922, 0.06447943300008774, 0.01664799079298973, 0.04827946424484253, 0.02169124223291874, 0.0013520577922463417, 0.046498700976371765, 0.014930243603885174, 0.015723079442977905, -0.07136794924736023, -0.042150530964136124, -0.04301423579454422, 0.028335656970739365, -0.040072452276945114, 0.16403092443943024, 0.05168292298913002, -0.03973826766014099, -0.009392759762704372, 0.13523362576961517, -0.012696492485702038, -0.09541773051023483, -0.18688862025737762, 0.13560894131660461, -0.05029207095503807, 0.030397949740290642, -0.017422497272491455, -0.08706232905387878, -0.05335407704114914, 0.19951102137565613, 0.16544003784656525, -0.0167398639023304, 0.005856823176145554, 0.022606123238801956, 0.006382085382938385, 0.030917663127183914, 0.057284485548734665, 0.10251577198505402, 0.2242136150598526, -0.07265026867389679, -0.014298380352556705, -0.021898798644542694, -0.04918577894568443, -0.05335931479930878, 0.14649705588817596, 0.042560067027807236, -0.02617175690829754, -0.051489755511283875, 0.05008477345108986, -0.042206354439258575, -0.1513683944940567, 0.01145600713789463, -0.06440730392932892, -0.03422538936138153, -0.03165873512625694, -0.057051707059144974, -0.09756717085838318, 0.04352984577417374, -0.06857005506753922, -0.020339662209153175, 0.1278427094221115, -0.039130039513111115, -0.10249213874340057, -0.10355842858552933, 0.07578255981206894, 0.03391188383102417, 0.06149830296635628, 0.021576836705207825, 0.01502515934407711, 0.0867457240819931, 0.03639021888375282, -0.03939539194107056, 0.05415087193250656, 0.031487319618463516, -0.033832937479019165, -0.08885621279478073, -0.044551633298397064, -0.03733334317803383, 0.05383170023560524, 0.07812371104955673, -0.049234338104724884, -0.0036914271768182516, -0.0008430550806224346, -0.02269231714308262, -0.09798029810190201, 0.0017245747148990631, -0.07661110162734985, 0.13297776877880096, 0.12313233315944672, 0.006750117056071758, -0.004941596649587154, -0.08444071561098099, 0.02851857617497444, 0.013595170341432095, 0.009296908974647522, -0.013321923092007637, -0.05770936980843544, 0.0028316150419414043, 0.049012359231710434, -0.028292588889598846, -0.17705628275871277, -0.06522420793771744, 0.05794757604598999, 0.00678509334102273, 0.013379792682826519, 0.07633408904075623, 0.013604342006146908, 0.020111504942178726, -0.026255974546074867, -0.04925975948572159, -0.031907957047224045, 0.030647750943899155, -0.06689417362213135, -0.06439866125583649 ]
null
null
transformers
# S2T-LARGE-LIBRISPEECH-ASR

`s2t-large-librispeech-asr` is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively.

## Intended uses & limitations

This model can be used for end-to-end speech recognition (ASR). See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-large-librispeech-asr")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-large-librispeech-asr")

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

input_features = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
).input_features  # Batch size 1
generated_ids = model.generate(input_ids=input_features)

transcription = processor.batch_decode(generated_ids)
```

#### Evaluation on LibriSpeech Test

The following script shows how to evaluate this model on the [LibriSpeech](https://huggingface.co/datasets/librispeech_asr) *"clean"* and *"other"* test dataset.
```python
from datasets import load_dataset, load_metric
from transformers import Speech2TextForConditionalGeneration, Speech2TextProcessor
import soundfile as sf

librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")  # change to "other" for other test dataset
wer = load_metric("wer")

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-large-librispeech-asr").to("cuda")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-large-librispeech-asr", do_upper_case=True)

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

librispeech_eval = librispeech_eval.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=16000, padding=True, return_tensors="pt")
    input_features = features.input_features.to("cuda")
    attention_mask = features.attention_mask.to("cuda")

    gen_tokens = model.generate(input_ids=input_features, attention_mask=attention_mask)
    batch["transcription"] = processor.batch_decode(gen_tokens, skip_special_tokens=True)
    return batch

result = librispeech_eval.map(map_to_pred, batched=True, batch_size=8, remove_columns=["speech"])

print("WER:", wer.compute(predictions=result["transcription"], references=result["text"]))
```

*Result (WER)*:

| "clean" | "other" |
|:-------:|:-------:|
| 3.3 | 7.5 |

## Training data

The S2T-LARGE-LIBRISPEECH-ASR is trained on [LibriSpeech ASR Corpus](https://www.openslr.org/12), a dataset consisting of approximately 1000 hours of 16kHz read English speech.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
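One practical note on inputs: the LibriSpeech audio used above is already sampled at 16 kHz, but your own recordings may not be. A minimal, hypothetical helper (not part of the original card) that uses torchaudio to down-mix and resample before passing the array to the processor:

```python
import torchaudio

def load_16k_mono(path):
    # Load audio, down-mix to mono, and resample to the 16 kHz rate
    # the S2T feature extractor expects.
    speech, sr = torchaudio.load(path)  # (channels, samples)
    speech = speech.mean(dim=0)
    if sr != 16_000:
        speech = torchaudio.transforms.Resample(sr, 16_000)(speech)
    return speech.numpy()
```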
{"language": "en", "license": "mit", "tags": ["audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["librispeech_asr"], "model-index": [{"name": "hubert-large-ls960-ft", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 3.3, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (other)", "type": "librispeech_asr", "config": "other", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 7.5, "name": "Test WER"}]}]}]}
automatic-speech-recognition
facebook/s2t-large-librispeech-asr
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "hf-asr-leaderboard", "en", "dataset:librispeech_asr", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us
S2T-LARGE-LIBRISPEECH-ASR ========================= 's2t-large-librispeech-asr' is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in this paper and released in this repository. Model description ----------------- S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively. Intended uses & limitations --------------------------- This model can be used for end-to-end speech recognition (ASR). See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* You could either install those as extra speech dependencies with 'pip install "transformers[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. #### Evaluation on LibriSpeech Test The following script shows how to evaluate this model on the LibriSpeech *"clean"* and *"other"* test dataset. *Result (WER)*: Training data ------------- The S2T-LARGE-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of approximately 1000 hours of 16kHz read English speech. Training procedure ------------------ ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000. ### Training The model is trained with standard autoregressive cross-entropy loss and using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. ### BibTeX entry and citation info
[ "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-LARGE-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us \n", "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-LARGE-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ 95, 137, 104, 99, 50, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us \n### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-LARGE-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.### BibTeX entry and citation info" ]
[ -0.06815151125192642, 0.12272072583436966, -0.006631114985793829, 0.02401820570230484, 0.05930164083838463, -0.005046993959695101, 0.05641384795308113, 0.0875103697180748, -0.02733197622001171, 0.10505151748657227, 0.02667824737727642, 0.07337884604930878, 0.07570639252662659, 0.13102875649929047, 0.0536196269094944, -0.15756407380104065, 0.00746334670111537, -0.0855383649468422, 0.0955696552991867, 0.09216595441102982, 0.1311793476343155, -0.06216809153556824, 0.06344321370124817, -0.014298168011009693, -0.045959245413541794, 0.008144645020365715, 0.0225589070469141, -0.05595280975103378, 0.03668322414159775, 0.0365377701818943, 0.06968531012535095, -0.009274575859308243, 0.0715179294347763, -0.16671986877918243, 0.007442694157361984, 0.07113523036241531, 0.045342519879341125, 0.049915798008441925, 0.060845427215099335, -0.06772606074810028, 0.04651554673910141, -0.05812986195087433, 0.06186715140938759, 0.04953254386782646, -0.07186246663331985, -0.10632465034723282, -0.08914288878440857, 0.06373215466737747, 0.0965123325586319, 0.06945057213306427, -0.045258622616529465, 0.0010876063024625182, -0.037469442933797836, 0.07565750926733017, 0.12389326095581055, -0.13949179649353027, -0.015574310906231403, -0.1099858358502388, 0.013607441447675228, 0.1229378804564476, -0.05305856466293335, -0.032943349331617355, -0.02184632234275341, -0.00037756195524707437, 0.07146979123353958, -0.03169088438153267, -0.03504154831171036, -0.0180770605802536, -0.1378539353609085, -0.05323483794927597, 0.18040798604488373, -0.03210395947098732, -0.09696327149868011, -0.10820968449115753, -0.0218513160943985, -0.014016591012477875, 0.03371896222233772, 0.0018676567124202847, 0.016381651163101196, 0.002557369414716959, 0.019038241356611252, -0.09206204861402512, -0.08459970355033875, -0.06748316437005997, -0.03316352143883705, 0.079344242811203, 0.02053871378302574, -0.008606071583926678, 0.009992477484047413, 0.1074596717953682, 0.016639476642012596, -0.08078557997941971, -0.03668168932199478, -0.031158531084656715, -0.13412563502788544, 0.008289141580462456, -0.01696864515542984, -0.19390767812728882, -0.025323599576950073, 0.15447364747524261, -0.01808803528547287, 0.05824865400791168, -0.04223436862230301, 0.01204466912895441, 0.006676759570837021, 0.20629504323005676, 0.0006591736455447972, -0.06611079722642899, -0.019127972424030304, 0.029646821320056915, 0.04794931411743164, -0.011884989216923714, -0.028099052608013153, 0.030452674254775047, 0.01646336540579796, 0.06011751666665077, 0.029990224167704582, -0.006986454129219055, -0.07795683294534683, -0.004356893245130777, 0.0750436931848526, -0.1846071481704712, 0.0519079752266407, 0.04890111833810806, -0.047270700335502625, 0.004880258347839117, 0.09243476390838623, -0.039593812078237534, -0.10906960815191269, 0.09419518709182739, -0.03696059435606003, 0.03041549026966095, -0.07626432180404663, -0.085698701441288, 0.00892846379429102, -0.07730475068092346, -0.08021503686904907, -0.07148566097021103, -0.09162306785583496, -0.04332306981086731, 0.006812873296439648, -0.027423840016126633, 0.03166442736983299, -0.019711779430508614, 0.0212840773165226, 0.009802514687180519, -0.02504022605717182, -0.0027256759349256754, -0.010178575292229652, 0.037141114473342896, -0.007344468962401152, 0.03896910697221756, 0.06422142684459686, 0.04837141931056976, -0.0853884145617485, 0.024871714413166046, -0.18911002576351166, 0.16256660223007202, -0.047611795365810394, -0.013586188666522503, -0.10980557650327682, -0.020532431080937386, -0.05625169724225998, 
0.011688574217259884, 0.030114324763417244, 0.0840609148144722, -0.17855559289455414, -0.052843671292066574, 0.1844915747642517, -0.10515772551298141, 0.014592458494007587, 0.1181621104478836, -0.003844976658001542, 0.03549743443727493, 0.1358332633972168, 0.15629510581493378, 0.07737889885902405, -0.1548425704240799, -0.07709517329931259, 0.007003268226981163, -0.04055356979370117, 0.1589963436126709, 0.08204634487628937, -0.08949696272611618, 0.04699438065290451, 0.00693096686154604, -0.11036878824234009, -0.003878726391121745, 0.0426318384706974, -0.0381561815738678, -0.01605452038347721, -0.04498235508799553, 0.013600056059658527, -0.02111111395061016, -0.05496412515640259, -0.021486300975084305, -0.1265413761138916, 0.10064083337783813, 0.060181304812431335, -0.0544842854142189, 0.053427476435899734, -0.10265521705150604, 0.025868814438581467, 0.033761683851480484, -0.011225094087421894, -0.1668795794248581, 0.03187231346964836, 0.04726126044988632, -0.11906469613313675, 0.13080298900604248, -0.0849032923579216, 0.010257064364850521, 0.045780595391988754, -0.005050764884799719, -0.015446599572896957, 0.014578198082745075, -0.013505653478205204, -0.018050700426101685, -0.10162098705768585, -0.037789635360240936, -0.019103549420833588, 0.2301681637763977, -0.0527193546295166, 0.032631296664476395, 0.049351904541254044, 0.07934341579675674, -0.007693698164075613, -0.0757037028670311, 0.020109020173549652, -0.02162262424826622, 0.014697723090648651, -0.04559791460633278, -0.011298105120658875, 0.019419968128204346, -0.032862767577171326, 0.062157467007637024, -0.14808420836925507, -0.22276322543621063, 0.07177131623029709, 0.06272649765014648, -0.08764856308698654, 0.016823843121528625, 0.0006413995288312435, -0.03789535164833069, -0.08512832224369049, -0.16194771230220795, 0.20635388791561127, 0.03218108043074608, 0.0916920155286789, -0.0759182721376419, -0.07273269444704056, -0.03841790184378624, -0.034883372485637665, -0.016298796981573105, 0.06089174747467041, -0.08670099824666977, -0.10159411281347275, 0.033873315900564194, 0.030177375301718712, -0.0654100850224495, 0.11743383854627609, 0.016116205602884293, -0.12239231169223785, -0.038042355328798294, 0.03316086530685425, 0.016766395419836044, 0.01340120006352663, -0.04442545026540756, -0.01489387545734644, 0.031042790040373802, -0.011230459436774254, 0.009489074349403381, -0.0909394770860672, 0.07650864869356155, 0.04697500169277191, -0.05138790234923363, -0.0557427816092968, -0.04813708737492561, -0.004205420147627592, 0.056304339319467545, 0.005513581447303295, 0.1003529280424118, -0.024332331493496895, -0.03521560877561569, -0.12802506983280182, 0.10538863390684128, -0.1635124832391739, -0.24301721155643463, -0.20608587563037872, 0.011492779478430748, 0.015435830689966679, 0.03025885857641697, 0.02959408052265644, -0.07037708163261414, -0.05364622548222542, -0.10518545657396317, 0.07340855896472931, 0.00876983068883419, -0.061493128538131714, -0.0503648966550827, 0.017420785501599312, 0.050704773515462875, -0.11361546814441681, 0.00875803641974926, 0.027443785220384598, -0.018100010231137276, -0.039582524448633194, 0.024732626974582672, 0.04052184149622917, 0.13570362329483032, 0.0010854324791580439, -0.017766138538718224, 0.005256186705082655, 0.185096874833107, -0.09670747071504593, 0.10829359292984009, 0.13102726638317108, -0.10864189267158508, 0.04839455708861351, 0.08729138970375061, 0.004145734943449497, -0.0030579394660890102, 0.024819694459438324, -0.0037409630604088306, -0.041056450456380844, -0.2880782186985016, 
-0.04079222306609154, -0.02234850451350212, 0.0053448863327503204, 0.07831531763076782, 0.032671939581632614, -0.010591888800263405, 0.026599198579788208, -0.05830778554081917, -0.029458988457918167, 0.1045243963599205, 0.06192797049880028, 0.08616876602172852, -0.026209257543087006, 0.08366793394088745, -0.05521262437105179, 0.0073123108595609665, 0.0810958743095398, -0.015567479655146599, 0.15123362839221954, -0.03860069066286087, 0.1812007874250412, 0.05123618245124817, 0.045100051909685135, 0.001385066076181829, 0.04894716665148735, -0.02047213353216648, 0.05878087878227234, 0.007157894782721996, -0.09080873429775238, -0.056309789419174194, 0.08174686133861542, 0.08034208416938782, 0.001926531083881855, 0.04902200773358345, -0.009150889702141285, 0.0661071240901947, 0.17539112269878387, 0.04562278836965561, -0.155861958861351, -0.06190670281648636, 0.04163011163473129, -0.053869158029556274, -0.04835554212331772, -0.012816577218472958, 0.0975019559264183, -0.10490363836288452, 0.070869080722332, 0.010183826088905334, 0.07255811989307404, -0.07744944095611572, -0.04513610154390335, 0.028965072706341743, 0.11529888212680817, 0.013630335219204426, 0.059373460710048676, -0.1379193812608719, 0.049394793808460236, 0.022059572860598564, 0.08761551976203918, -0.019831795245409012, 0.06952778995037079, -0.0059635452926158905, 0.0013453189749270678, 0.13570614159107208, 0.021578127518296242, -0.09862891584634781, -0.014637788757681847, -0.07751864194869995, -0.01890818029642105, 0.10455300658941269, -0.03906836360692978, 0.05818699672818184, -0.034434687346220016, -0.03224113956093788, -0.04834339767694473, -0.13367561995983124, -0.1279425472021103, -0.16522443294525146, 0.03875783830881119, -0.027212895452976227, 0.0544586181640625, -0.04704730212688446, -0.020044630393385887, -0.011972932144999504, 0.1379920244216919, -0.24857695400714874, -0.09507791697978973, -0.07116227596998215, -0.04013126716017723, 0.1524883508682251, -0.049650587141513824, 0.024316983297467232, -0.007617907132953405, 0.14149990677833557, -0.007700997404754162, -0.02513585053384304, -0.008067789487540722, -0.05151139199733734, -0.11171841621398926, -0.0436847098171711, 0.20268350839614868, 0.08276545256376266, 0.08114978671073914, 0.002531646052375436, 0.06514694541692734, -0.008764590136706829, -0.06744038313627243, 0.007511000614613295, 0.10913893580436707, 0.023362120613455772, 0.11451928317546844, -0.09252337366342545, -0.10530020296573639, -0.14193931221961975, -0.03413090854883194, 0.10480059683322906, 0.09300507605075836, -0.0811353400349617, 0.11644483357667923, 0.1337270438671112, -0.13473808765411377, -0.17248424887657166, -0.024274906143546104, 0.07610757648944855, 0.01596076600253582, 0.07376386970281601, -0.2218174785375595, 0.05418134108185768, 0.07214707136154175, 0.002864130539819598, -0.0381438322365284, -0.2749844491481781, -0.1294010579586029, 0.06597039103507996, -0.006117967888712883, -0.21731731295585632, -0.09268489480018616, -0.0946933925151825, -0.0684032216668129, -0.01783466711640358, 0.052170343697071075, -0.11006510257720947, 0.050782669335603714, 0.044097915291786194, 0.012307132594287395, 0.03964152932167053, -0.03449546545743942, 0.10797559469938278, 0.01385774090886116, -0.023179255425930023, -0.04718177393078804, 0.01734381914138794, 0.05444207042455673, -0.04942137748003006, 0.17302432656288147, -0.0248939897865057, 0.034226614981889725, -0.029711827635765076, -0.018756382167339325, -0.05138871446251869, 0.031603146344423294, -0.0445471815764904, 0.0020758514292538166, 
-0.07995747774839401, 0.02545372024178505, 0.04118117317557335, -0.02108265832066536, -0.03367369994521141, -0.09544982016086578, -0.02588573656976223, 0.17477402091026306, 0.12971152365207672, 0.09558534622192383, -0.1485922634601593, 0.004937469493597746, 0.008509106002748013, 0.002927018329501152, -0.06891729682683945, 0.05322755500674248, 0.055076368153095245, 0.04427449405193329, 0.15310466289520264, -0.019349299371242523, -0.1450785994529724, 0.008753438480198383, 0.0528535358607769, -0.0919307991862297, -0.10455135256052017, 0.006501548923552036, 0.02827421762049198, -0.10188823938369751, -0.08665984869003296, 0.1283271610736847, -0.004740923177450895, -0.01935770735144615, 0.01364095602184534, 0.0563201978802681, -0.07123753428459167, 0.18760663270950317, 0.046124111860990524, 0.022164154797792435, -0.07376270741224289, 0.10182986408472061, 0.11832062900066376, -0.04651366174221039, 0.0032934327609837055, 0.13157865405082703, -0.061647310853004456, -0.02952801249921322, -0.12170984596014023, -0.016581164672970772, 0.0336904376745224, -0.05350559577345848, -0.024843556806445122, -0.0770825669169426, 0.025300640612840652, 0.06130971014499664, 0.015025983564555645, 0.06366904079914093, -0.026406610384583473, 0.019333545118570328, -0.05447669327259064, 0.10432905703783035, 0.0583864189684391, 0.047368891537189484, -0.05131380632519722, 0.1388375461101532, 0.03520644083619118, 0.017320668324828148, -0.002788607496768236, -0.03991719335317612, -0.047860659658908844, 0.026873992756009102, -0.03232768923044205, 0.03012852929532528, -0.05388708785176277, -0.03372263163328171, 0.0037352165672928095, -0.00744104478508234, 0.014898223802447319, 0.04652424901723862, -0.01286088302731514, -0.04516783356666565, -0.05396707355976105, 0.07709935307502747, -0.160240039229393, 0.015692414715886116, 0.08276665210723877, -0.08679015189409256, 0.07243050634860992, 0.07106378674507141, -0.04078011214733124, 0.014864242635667324, -0.10152335464954376, -0.028734317049384117, -0.022492092102766037, 0.04567987099289894, -0.015638282522559166, -0.17994721233844757, 0.013747927732765675, -0.0048131439834833145, -0.05229862406849861, -0.011537529528141022, 0.14053571224212646, -0.08736525475978851, 0.08801814913749695, 0.018307257443666458, -0.031771644949913025, -0.07462149858474731, 0.058224208652973175, 0.07850884646177292, 0.03767916187644005, 0.12085780501365662, -0.05871404707431793, 0.04523131996393204, -0.11963728815317154, 0.01964009553194046, 0.013441212475299835, -0.0032121529802680016, -0.03959956020116806, -0.050468482077121735, 0.04747215658426285, 0.00952323991805315, 0.13035708665847778, 0.004135800059884787, -0.0003508211812004447, 0.073014996945858, 0.030693450942635536, -0.029483530670404434, 0.05100006237626076, 0.0003065109485760331, -0.02992619387805462, -0.01904335804283619, -0.04422955587506294, -0.06495695561170578, -0.03903188556432724, -0.017782198265194893, 0.05113893374800682, 0.1333337277173996, 0.11212009936571121, 0.03227819502353668, 0.029693104326725006, -0.006707300432026386, -0.1262740194797516, 0.04431608319282532, -0.0444486103951931, 0.008224994875490665, -0.042569149285554886, 0.1749648004770279, 0.09170965105295181, -0.13849672675132751, 0.10172959417104721, 0.017937885597348213, -0.11041277647018433, -0.08860018104314804, -0.10887619107961655, -0.033279888331890106, -0.007815978489816189, -0.02997824177145958, -0.0734754130244255, 0.09497897326946259, 0.00657501257956028, 0.009602583944797516, -0.021726535633206367, 0.16931259632110596, -0.10505873709917068, 
-0.12591081857681274, 0.03700093552470207, 0.007127951830625534, 0.04683861881494522, 0.06721287220716476, 0.035796381533145905, 0.08042668551206589, 0.038147952407598495, 0.09689482301473618, 0.06638698279857635, 0.054788198322057724, -0.0050480980426073074, -0.02757476642727852, -0.03577549383044243, 0.01580832526087761, 0.007634852547198534, -0.007243235595524311, 0.16208186745643616, 0.062467191368341446, -0.03438645601272583, -0.016994750127196312, 0.13621726632118225, -0.05580030381679535, -0.07570173591375351, -0.14966630935668945, 0.0947059765458107, 0.017340397462248802, 0.011571274138987064, 0.009922782890498638, -0.10020200908184052, -0.0004450994310900569, 0.1640385240316391, 0.09776148200035095, -0.0009019878343679011, 0.010951604694128036, 0.0014481112593784928, 0.022331513464450836, 0.0020835143513977528, 0.048214249312877655, 0.033258773386478424, 0.2016208916902542, -0.006595029029995203, 0.12092407792806625, -0.048136916011571884, -0.04581271484494209, -0.08487843722105026, 0.06741359829902649, -0.09822885692119598, -0.0009813010692596436, -0.032783761620521545, 0.09565120935440063, -0.004374502692371607, -0.29067134857177734, -0.03808963671326637, -0.031867507845163345, -0.08810526132583618, 0.03626314923167229, 0.10045883059501648, -0.004913621582090855, 0.029910892248153687, 0.03845970705151558, -0.0382697656750679, 0.1917794793844223, 0.014778553508222103, -0.05364920571446419, -0.04772178456187248, 0.053291480988264084, -0.1602100431919098, 0.1524205356836319, -0.001313638174906373, 0.12040933221578598, 0.04997338727116585, 0.018188202753663063, -0.09002944827079773, 0.09315082430839539, 0.0397663451731205, -0.0558600015938282, 0.05365380272269249, 0.2081572711467743, -0.011629187501966953, 0.11999621242284775, 0.04307698830962181, 0.00036908473703078926, 0.05569205805659294, -0.012845037505030632, -0.04257715493440628, -0.12333711236715317, 0.032404154539108276, -0.10660135746002197, 0.130109965801239, 0.17709603905677795, -0.028550168499350548, 0.00027231022249907255, -0.05448605865240097, -0.0406477116048336, 0.020213782787322998, 0.11929444968700409, 0.012380411848425865, -0.14956054091453552, 0.003003315767273307, 0.025375649333000183, 0.08834777772426605, -0.19279327988624573, -0.0444311648607254, -0.00676029222086072, -0.055142324417829514, 0.0005406247801147401, 0.09682205319404602, 0.005336299538612366, 0.030439915135502815, -0.025499539449810982, -0.05651655048131943, 0.04521840065717697, 0.07931338250637054, -0.12927858531475067, -0.02793678641319275 ]
null
null
transformers
# S2T-MEDIUM-LIBRISPEECH-ASR

`s2t-medium-librispeech-asr` is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively.

## Intended uses & limitations

This model can be used for end-to-end speech recognition (ASR). See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install transformers"[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-medium-librispeech-asr")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-medium-librispeech-asr")

def map_to_array(batch):
    # decode the audio file into a float waveform array
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# extract log-mel filter bank features for a single utterance
input_features = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
).input_features  # Batch size 1
generated_ids = model.generate(input_features=input_features)

transcription = processor.batch_decode(generated_ids)
```

#### Evaluation on LibriSpeech Test

The following script shows how to evaluate this model on the [LibriSpeech](https://huggingface.co/datasets/librispeech_asr) *"clean"* and *"other"* test dataset.
```python
from datasets import load_dataset
from evaluate import load
from transformers import Speech2TextForConditionalGeneration, Speech2TextProcessor

librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")  # change to "other" for other test dataset
wer = load("wer")

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-medium-librispeech-asr").to("cuda")
# do_upper_case=True makes the decoded transcripts match LibriSpeech's upper-case references
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-medium-librispeech-asr", do_upper_case=True)

def map_to_pred(batch):
    # extract log-mel filter bank features and move them to the GPU
    features = processor(batch["audio"]["array"], sampling_rate=16000, padding=True, return_tensors="pt")
    input_features = features.input_features.to("cuda")
    attention_mask = features.attention_mask.to("cuda")

    gen_tokens = model.generate(input_features=input_features, attention_mask=attention_mask)
    batch["transcription"] = processor.batch_decode(gen_tokens, skip_special_tokens=True)[0]
    return batch

result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])

print("WER:", wer.compute(predictions=result["transcription"], references=result["text"]))
```

*Result (WER)*:

| "clean" | "other" |
|:-------:|:-------:|
| 3.5 | 7.8 |

## Training data

The S2T-MEDIUM-LIBRISPEECH-ASR is trained on [LibriSpeech ASR Corpus](https://www.openslr.org/12), a dataset consisting of approximately 1000 hours of 16kHz read English speech.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
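To make the preprocessing description above more concrete, here is a minimal sketch of how Kaldi-compliant 80-channel log mel-filter bank features and utterance-level CMVN could be reproduced with torchaudio. The file name `sample.flac` is a placeholder, and the `2 ** 15` scaling (matching Kaldi's 16-bit integer convention) is an assumption about the fairseq recipe; treat this as an illustration, not the exact training pipeline.

```python
# Illustrative sketch (not the reference fairseq pipeline): extract
# Kaldi-compliant 80-channel log mel-filter bank features with torchaudio
# and apply utterance-level CMVN, as described in the Preprocessing section.
import torchaudio
import torchaudio.compliance.kaldi as kaldi

waveform, sample_rate = torchaudio.load("sample.flac")  # hypothetical 16kHz mono file

# kaldi.fbank expects a (channels, time) float tensor; scaling by 2**15 is an
# assumed step to match Kaldi's 16-bit integer amplitude convention.
fbank = kaldi.fbank(
    waveform * (2 ** 15),
    num_mel_bins=80,
    sample_frequency=sample_rate,
)  # shape: (num_frames, 80)

# Utterance-level CMVN: normalize each feature dimension to zero mean and
# unit variance over the frames of this single utterance.
mean = fbank.mean(dim=0, keepdim=True)
std = fbank.std(dim=0, keepdim=True)
features = (fbank - mean) / (std + 1e-8)
```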
{"language": "en", "license": "mit", "tags": ["audio", "automatic-speech-recognition"], "datasets": ["librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-medium-librispeech-asr
[ "transformers", "pytorch", "tf", "safetensors", "speech_to_text", "automatic-speech-recognition", "audio", "en", "dataset:librispeech_asr", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en" ]
TAGS #transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #has_space #region-us
S2T-MEDIUM-LIBRISPEECH-ASR
==========================

's2t-medium-librispeech-asr' is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in this paper and released in this repository.

Model description
-----------------

S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively.

Intended uses & limitations
---------------------------

This model can be used for end-to-end speech recognition (ASR). See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

#### Evaluation on LibriSpeech Test

The following script shows how to evaluate this model on the LibriSpeech
*"clean"* and *"other"* test dataset.

*Result (WER)*:

Training data
-------------

The S2T-MEDIUM-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of
approximately 1000 hours of 16kHz read English speech.

Training procedure
------------------

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively.

### BibTeX entry and citation info
[ "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-MEDIUM-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #has_space #region-us \n", "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-MEDIUM-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ 86, 137, 104, 99, 50, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #has_space #region-us \n### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-MEDIUM-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.### BibTeX entry and citation info" ]
[ -0.06700040400028229, 0.1533086746931076, -0.00734571972861886, 0.03499186784029007, 0.09094402194023132, -0.003584857564419508, 0.07554766535758972, 0.08753012120723724, -0.057151295244693756, 0.09103302657604218, 0.02548539824783802, 0.09111656248569489, 0.05393177643418312, 0.08992024511098862, 0.050143320113420486, -0.14508044719696045, -0.014145155437290668, -0.05913037061691284, 0.12047428637742996, 0.0913098156452179, 0.11622337251901627, -0.06586240231990814, 0.050668466836214066, -0.024588318541646004, -0.04978412017226219, 0.022769685834646225, 0.012814649380743504, -0.052898164838552475, 0.029505284503102303, 0.022392841055989265, 0.055650494992733, -0.02497517131268978, 0.050945114344358444, -0.1677795648574829, 0.008166317827999592, 0.05651209130883217, 0.024810384958982468, 0.05186885967850685, 0.06714877486228943, -0.07921251654624939, 0.08800781518220901, -0.08194516599178314, 0.048439688980579376, 0.05272577702999115, -0.08203823119401932, -0.08638297766447067, -0.10234054923057556, 0.05879489704966545, 0.0760577917098999, 0.077263243496418, -0.03594809025526047, 0.020247049629688263, -0.037676069885492325, 0.08533987402915955, 0.1345069259405136, -0.15299028158187866, -0.01819758303463459, -0.0694514736533165, 0.022940605878829956, 0.1317528635263443, -0.06353475898504257, -0.05120058357715607, -0.02033744938671589, 0.005050713196396828, 0.09318190813064575, -0.04626959562301636, -0.009192082099616528, -0.030449047684669495, -0.11240563541650772, -0.044733453541994095, 0.15126733481884003, -0.0313398614525795, -0.08688569813966751, -0.1014842838048935, -0.004337170161306858, -0.012739471159875393, 0.045103494077920914, 0.023017842322587967, 0.01583537645637989, -0.0008045627037063241, 0.013890844769775867, -0.07922586798667908, -0.07263277471065521, -0.06116630509495735, -0.000879808678291738, 0.06615249812602997, 0.015522077679634094, -0.022675009444355965, 0.02713790535926819, 0.1036970242857933, 0.034558024257421494, -0.09522126615047455, -0.02180994115769863, 0.0010697218822315335, -0.10731188952922821, -0.0010081622749567032, -0.011690587736666203, -0.17199844121932983, -0.017878113314509392, 0.13486449420452118, -0.002355733420699835, 0.03606627136468887, -0.04542818292975426, 0.011347788386046886, -0.0004923758679069579, 0.18132562935352325, -0.02075720950961113, -0.06578425318002701, -0.0216788612306118, 0.05412178114056587, 0.023129785433411598, 0.010452834889292717, -0.024132005870342255, 0.00822758860886097, 0.04215247556567192, 0.06391752511262894, 0.004692427814006805, -0.011047285050153732, -0.07376429438591003, 0.0035871502477675676, 0.053599122911691666, -0.18880021572113037, 0.06226455792784691, 0.04458001255989075, -0.05597582459449768, 0.018074141815304756, 0.10188449919223785, -0.05067480728030205, -0.1089114397764206, 0.08505924046039581, -0.027994386851787567, 0.028287289664149284, -0.09863077849149704, -0.08650408685207367, 0.006463019642978907, -0.09855492413043976, -0.060454219579696655, -0.06350880861282349, -0.10332698374986649, -0.05566798523068428, 0.019175130873918533, -0.03568515554070473, 0.04133014380931854, -0.03985300287604332, -0.004417465999722481, -0.004038799088448286, -0.031986385583877563, 0.023561235517263412, 0.0010818130103871226, 0.05387475714087486, 0.023284660652279854, 0.04112004116177559, 0.06169392541050911, 0.06284689158201218, -0.0866256132721901, 0.024241000413894653, -0.1346069723367691, 0.16555576026439667, -0.030597928911447525, -0.007622777950018644, -0.11018825322389603, -0.019227011129260063, 
-0.060878802090883255, 0.0059908009134233, 0.05592895671725273, 0.07277540862560272, -0.19837746024131775, -0.0433516725897789, 0.18147914111614227, -0.07267062366008759, 0.005054200999438763, 0.10467255860567093, -0.025716528296470642, 0.05776716396212578, 0.1482338160276413, 0.11714500933885574, 0.07638540863990784, -0.10460357367992401, -0.08544732630252838, -0.029376337304711342, -0.036136917769908905, 0.12995442748069763, 0.07237356156110764, -0.07107613235712051, 0.05814207345247269, 0.007723972667008638, -0.11476178467273712, 0.010584075003862381, 0.019994936883449554, -0.03282942622900009, -0.01911592297255993, -0.0609438456594944, -0.011242172680795193, -0.0028793744277209044, -0.06056783348321915, -0.023523373529314995, -0.11531858146190643, 0.11474312096834183, 0.05827734246850014, -0.06536220759153366, 0.040709272027015686, -0.0925256758928299, 0.005660130642354488, -0.013018029741942883, -0.0019757533445954323, -0.1706908494234085, 0.05460565537214279, 0.03612706437706947, -0.08531364053487778, 0.13509084284305573, -0.08735357969999313, 0.024149179458618164, 0.041088636964559555, -0.011535844765603542, -0.012579337693750858, 0.0045810891315341, -0.021854732185602188, -0.030792200937867165, -0.09413255006074905, -0.04182010516524315, -0.0284127127379179, 0.1636703610420227, -0.0831235945224762, 0.02406328171491623, 0.021161433309316635, 0.058857206255197525, -0.014630899764597416, -0.06865163892507553, 0.028841139748692513, -0.013941745273768902, 0.008591091260313988, -0.052390579134225845, -0.004482742864638567, 0.01732235960662365, -0.038152966648340225, 0.05882558599114418, -0.15177026391029358, -0.2209840565919876, 0.06831476837396622, 0.08103921264410019, -0.08048820495605469, -0.012235901318490505, 0.015050237998366356, -0.046257924288511276, -0.07529200613498688, -0.12551257014274597, 0.21195818483829498, 0.04243915155529976, 0.10510106384754181, -0.07376701384782791, -0.060544390231370926, -0.04936445131897926, -0.03246331587433815, -0.004525892902165651, 0.0966685563325882, -0.08864796906709671, -0.11131519824266434, 0.037277448922395706, 0.018624208867549896, -0.08238357305526733, 0.1131824404001236, 0.030360350385308266, -0.12646572291851044, -0.02717648260295391, 0.0358489528298378, 0.040679316967725754, -0.009812051430344582, -0.03222604840993881, -0.014963495545089245, 0.03411075100302696, -0.025533391162753105, 0.006682481151074171, -0.08470559865236282, 0.06880190223455429, 0.04427164047956467, -0.04334546625614166, -0.017157988622784615, -0.03776505962014198, 0.001860772492364049, 0.06141090393066406, -0.0037739078979939222, 0.07491853833198547, -0.017501341179013252, -0.03533689305186272, -0.11277768015861511, 0.1013997420668602, -0.13592536747455597, -0.20407001674175262, -0.1919393092393875, 0.01194614265114069, 0.026703424751758575, 0.026936713606119156, 0.016177183017134666, -0.06153520196676254, -0.04341977462172508, -0.08248346298933029, 0.04672262445092201, 0.03291112929582596, -0.08620376139879227, -0.037554364651441574, 0.030076509341597557, 0.03634139150381088, -0.12450762093067169, 0.01058847177773714, 0.02850794792175293, -0.05824030563235283, -0.022967366501688957, 0.015845566987991333, 0.044263143092393875, 0.1184912621974945, -0.018306365236639977, -0.020602386444807053, 0.028297044336795807, 0.1590060591697693, -0.09121597558259964, 0.08235868811607361, 0.09842189401388168, -0.08564411103725433, 0.04891376942396164, 0.06711866706609726, 0.005657368805259466, -0.008254901506006718, 0.04037253186106682, 0.011514155194163322, -0.04423512890934944, 
-0.28271931409835815, -0.019133692607283592, -0.026438279077410698, -0.023039020597934723, 0.0974968895316124, 0.04162639006972313, -0.014043369330465794, 0.04430712014436722, -0.03980468213558197, -0.008039401844143867, 0.11062616109848022, 0.06470683962106705, 0.025253362953662872, -0.017174147069454193, 0.0628497526049614, -0.049656640738248825, 0.0049508423544466496, 0.0596190020442009, 0.008805572055280209, 0.16290539503097534, -0.030972950160503387, 0.20371483266353607, 0.05447977036237717, 0.025821711868047714, -0.02237671986222267, 0.07332803308963776, -0.04551886394619942, 0.06890786439180374, 0.008117655292153358, -0.09374061226844788, -0.06069735437631607, 0.06350472569465637, 0.04982756823301315, 0.046595923602581024, 0.04554831236600876, 0.012292317114770412, 0.07552491873502731, 0.18240918219089508, 0.023341741412878036, -0.1888015866279602, -0.05081455782055855, 0.03188024088740349, -0.05407831445336342, -0.05428323894739151, -0.030104679986834526, 0.0985633060336113, -0.0919785276055336, 0.036614999175071716, -0.004409246612340212, 0.08278679102659225, -0.07705198973417282, -0.048377782106399536, 0.02987346053123474, 0.10469682514667511, 0.014989390969276428, 0.06558279693126678, -0.2063325047492981, 0.04787978157401085, 0.021117661148309708, 0.11192785948514938, -0.011821461841464043, 0.04503532871603966, -0.00043454166734591126, -0.016803251579403877, 0.12641094624996185, 0.011247113347053528, 0.015187257900834084, 0.01114175096154213, -0.09475148469209671, -0.0184238962829113, 0.08343268185853958, -0.050734490156173706, 0.07605790346860886, -0.02142813615500927, -0.02847997471690178, -0.04909151792526245, -0.13830658793449402, -0.14383746683597565, -0.16955004632472992, 0.028728008270263672, 0.00017280217434745282, 0.05164468660950661, -0.030493982136249542, -0.021949265152215958, -0.024729130789637566, 0.16219636797904968, -0.2386111617088318, -0.08205696195363998, -0.0692722275853157, -0.015240116976201534, 0.16927103698253632, -0.04898662865161896, 0.008169484324753284, -0.005009008105844259, 0.12416411191225052, -0.013982574455440044, -0.02220872789621353, -0.02792864665389061, -0.04154219105839729, -0.11092585325241089, -0.03128303214907646, 0.18329888582229614, 0.09337751567363739, 0.0745285376906395, 0.014568556100130081, 0.06042986363172531, -0.013831814751029015, -0.05492285266518593, -0.008436537347733974, 0.08292384445667267, 0.0416584312915802, 0.11079803854227066, -0.11517751961946487, -0.12098487466573715, -0.14766637980937958, -0.031124988570809364, 0.10164468735456467, 0.07816683501005173, -0.0564449280500412, 0.08202508836984634, 0.1490781158208847, -0.14294005930423737, -0.14536476135253906, -0.020969104021787643, 0.06716900318861008, 0.020988453179597855, 0.06989313662052155, -0.22058174014091492, 0.053165655583143234, 0.06677471101284027, 0.011342404410243034, -0.02478789910674095, -0.2801715135574341, -0.11526071280241013, 0.06944596767425537, 0.0170438289642334, -0.22682854533195496, -0.13644830882549286, -0.0880623608827591, -0.08167319744825363, -0.03977780416607857, 0.06932230293750763, -0.09529005736112595, 0.07054411619901657, 0.015888815745711327, 0.03276653587818146, 0.04406880959868431, -0.042474959045648575, 0.08225859701633453, 0.05721426010131836, -0.002303987741470337, -0.05554317310452461, -0.011809847317636013, 0.0345611497759819, -0.05382221192121506, 0.20126934349536896, -0.050658710300922394, 0.02631547674536705, -0.0496426485478878, -0.01621682196855545, -0.05857851728796959, 0.0242990143597126, -0.031043076887726784, 
0.016979334875941277, -0.0565028116106987, 0.04260143265128136, 0.04209631308913231, 0.005469805095344782, -0.01992771215736866, -0.09765701740980148, -0.045904744416475296, 0.15839383006095886, 0.13965237140655518, 0.08297304809093475, -0.12689505517482758, 0.007770066149532795, 0.033814020454883575, 0.01854391023516655, -0.048186857253313065, 0.05287088453769684, 0.06533022969961166, 0.04852212220430374, 0.1542528122663498, -0.009790853597223759, -0.14676432311534882, 0.006371241062879562, 0.04760695993900299, -0.08184543997049332, -0.06544798612594604, 0.030297519639134407, 0.05340805649757385, -0.1009175032377243, -0.08034971356391907, 0.10568954795598984, -0.028917090967297554, -0.02080143801867962, 0.006914418190717697, 0.05573349446058273, -0.07423309236764908, 0.17838265001773834, 0.05451774597167969, 0.014005844481289387, -0.06344237178564072, 0.09989942610263824, 0.12453577667474747, -0.03682495281100273, 0.0030285075772553682, 0.14958544075489044, -0.048930730670690536, -0.04325278103351593, -0.07097053527832031, 0.012762855738401413, 0.026721017435193062, -0.036681126803159714, -0.004786215722560883, -0.046303991228342056, 0.03214694932103157, 0.05728621408343315, 0.023274268954992294, 0.0652238056063652, -0.035265471786260605, -0.0018141823820769787, -0.0907127782702446, 0.10325971990823746, 0.06635148823261261, 0.04073575884103775, -0.022574763745069504, 0.1280158907175064, 0.031390480697155, -0.00031739112455397844, 0.008679045364260674, -0.027985522523522377, -0.05029524117708206, 0.04027017205953598, -0.008471517823636532, 0.03317767754197121, -0.0623069666326046, -0.02877027913928032, -0.0022983707021921873, 0.012287548743188381, 0.023867247626185417, 0.045897457748651505, -0.008166952058672905, -0.05788949877023697, -0.044652968645095825, 0.07737123966217041, -0.15717987716197968, -0.0013795463601127267, 0.061582956463098526, -0.08669696003198624, 0.05304491147398949, 0.05623594671487808, -0.04495769739151001, -0.002139074495062232, -0.15089179575443268, -0.02538241073489189, -0.010998928919434547, 0.026273000985383987, -0.009675255976617336, -0.18728132545948029, 0.02347000688314438, 0.004787905141711235, -0.013336706906557083, -0.017361579462885857, 0.1493532657623291, -0.10315202176570892, 0.06538654863834381, -0.0021270893048495054, -0.030367281287908554, -0.06898309290409088, 0.08045785129070282, 0.08251714706420898, 0.03607518970966339, 0.142208531498909, -0.05089665204286575, 0.04428292065858841, -0.11173146218061447, 0.014232569374144077, 0.008323383517563343, -0.009361768141388893, -0.07393421232700348, -0.024854617193341255, 0.04180527478456497, 0.0003240093355998397, 0.08743277192115784, 0.009955812245607376, -0.0306536927819252, 0.055963341146707535, -0.010560763999819756, -0.05802629142999649, 0.04478985816240311, 0.005751083604991436, -0.03306027874350548, -0.020429478958249092, -0.04040737450122833, -0.04823434725403786, -0.03538554161787033, -0.010745598003268242, 0.034075845032930374, 0.1558735966682434, 0.0799393430352211, 0.03613792359828949, 0.03571990877389908, -0.019850702956318855, -0.08385267108678818, 0.05534229055047035, -0.02287895977497101, 0.0011641675373539329, -0.04772632196545601, 0.18665704131126404, 0.11214017122983932, -0.1321639120578766, 0.11649368703365326, 0.014507682994008064, -0.10465631633996964, -0.06366446614265442, -0.10375040024518967, -0.03226476535201073, 0.032980382442474365, -0.020175136625766754, -0.07799625396728516, 0.08202893286943436, 0.02131810039281845, 0.004988887347280979, -0.02415655180811882, 
0.1601746678352356, -0.08415969461202621, -0.11289595812559128, 0.05587625503540039, -0.006216845475137234, 0.05433765798807144, 0.029813092201948166, 0.06194779649376869, 0.07307325303554535, 0.03742944076657295, 0.10469511896371841, 0.0855075791478157, 0.06483129411935806, -0.008730257861316204, -0.0389040969312191, -0.04175323247909546, 0.012643368914723396, 0.008136599324643612, 0.029302425682544708, 0.1656080186367035, 0.06532681733369827, -0.04532655328512192, -0.020271463319659233, 0.12043339759111404, -0.03868991136550903, -0.06535544246435165, -0.13064776360988617, 0.08762312680482864, 0.04665825143456459, 0.005621044896543026, -0.004385003820061684, -0.09690821170806885, 0.03297409415245056, 0.1567731499671936, 0.07917049527168274, -0.002208857564255595, -0.004603790119290352, -0.015380964614450932, 0.021370092406868935, 0.004100021906197071, 0.038708481937646866, 0.04728994891047478, 0.16340816020965576, -0.023930802941322327, 0.12993481755256653, -0.053872592747211456, -0.05031992122530937, -0.07061076909303665, 0.04530728608369827, -0.09591003507375717, 0.019839461892843246, -0.039508968591690063, 0.0919685810804367, 0.009944356977939606, -0.25866609811782837, -0.026287103071808815, -0.020095763728022575, -0.0759819820523262, 0.015120217576622963, 0.061780501157045364, -0.019710252061486244, 0.035678282380104065, 0.033825792372226715, -0.02518177404999733, 0.1491069197654724, 0.010502549819648266, -0.05702700465917587, -0.033265113830566406, 0.042241133749485016, -0.11418633908033371, 0.15734535455703735, -0.002275709528476, 0.11712510883808136, 0.057751864194869995, 0.009506369940936565, -0.11106936633586884, 0.07852064818143845, 0.015414041467010975, -0.009388061240315437, 0.07584403455257416, 0.1936917006969452, -0.007127064745873213, 0.09505056589841843, 0.03007959946990013, 0.023425927385687828, 0.04933278262615204, -0.04184870794415474, -0.02725962921977043, -0.12874680757522583, 0.04736717417836189, -0.08496905118227005, 0.12401848286390305, 0.18229059875011444, -0.02639608643949032, 0.005697788204997778, -0.06927745044231415, -0.020597660914063454, 0.021069737151265144, 0.12073499709367752, 0.002658834680914879, -0.13409139215946198, 0.002465351950377226, 0.013002180494368076, 0.07097981870174408, -0.2528427541255951, -0.043743573129177094, -0.01095456350594759, -0.07265012711286545, -0.011767970398068428, 0.08008111268281937, 0.009277968667447567, 0.040749117732048035, -0.03338916227221489, -0.0928659737110138, 0.03053244762122631, 0.07928241789340973, -0.15064455568790436, -0.030308974906802177 ]
null
null
transformers
# S2T-MEDIUM-MUSTC-MULTILINGUAL-ST

`s2t-medium-mustc-multilingual-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Multilingual Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end translation of English speech into German, Dutch, Spanish, French, Italian, Portuguese, Romanian, or Russian text. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

For multilingual speech translation models, `eos_token_id` is used as the `decoder_start_token_id` and the target language id is forced as the first generated token. To force the target language id as the first generated token, pass the `forced_bos_token_id` parameter to the `generate()` method. The following example shows how to translate English speech to French and German text using the `facebook/s2t-medium-mustc-multilingual-st` checkpoint.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install transformers"[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-medium-mustc-multilingual-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-medium-mustc-multilingual-st")

def map_to_array(batch):
    # decode the audio file into a float waveform array
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")

# translate English speech to French text
generated_ids = model.generate(
    input_features=inputs["input_features"],
    attention_mask=inputs["attention_mask"],
    forced_bos_token_id=processor.tokenizer.lang_code_to_id["fr"]
)
translation_fr = processor.batch_decode(generated_ids, skip_special_tokens=True)

# translate English speech to German text
generated_ids = model.generate(
    input_features=inputs["input_features"],
    attention_mask=inputs["attention_mask"],
    forced_bos_token_id=processor.tokenizer.lang_code_to_id["de"]
)
translation_de = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-medium-mustc-multilingual-st is trained on [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems for speech translation from English into several languages.
For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained on multilingual ASR. For multilingual models, the target language ID token is used as the target BOS token.

## Evaluation results

MuST-C test results (BLEU score):

| En-De | En-Nl | En-Es | En-Fr | En-It | En-Pt | En-Ro | En-Ru |
|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|
| 24.5 | 28.6 | 28.2 | 34.9 | 24.6 | 31.1 | 23.8 | 16.0 |

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
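The Training section above mentions SpecAugment but does not show what it does to the features. Below is a self-contained sketch using torchaudio's masking transforms; the mask sizes are illustrative assumptions, not the values from the fairseq training recipe.

```python
# Illustrative sketch of SpecAugment-style masking as applied during training:
# random frequency and time masks zero out bands of the log-mel features.
import torch
import torchaudio.transforms as T

torch.manual_seed(0)
features = torch.randn(1, 80, 500)  # (batch, mel_bins, frames) stand-in for real fbank features

freq_mask = T.FrequencyMasking(freq_mask_param=27)  # mask up to 27 mel channels (assumed size)
time_mask = T.TimeMasking(time_mask_param=100)      # mask up to 100 frames (assumed size)

augmented = time_mask(freq_mask(features))
print(augmented.shape)  # torch.Size([1, 80, 500]) -- same shape, with zeroed bands
```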
{"language": ["en", "de", "nl", "es", "fr", "it", "pt", "ro", "ru"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition"}
automatic-speech-recognition
facebook/s2t-medium-mustc-multilingual-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "de", "nl", "es", "fr", "it", "pt", "ro", "ru", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "de", "nl", "es", "fr", "it", "pt", "ro", "ru" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #nl #es #fr #it #pt #ro #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
S2T-MEDIUM-MUSTC-MULTILINGUAL-ST
================================

's2t-medium-mustc-multilingual-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Multilingual Speech Translation (ST). The S2T model was proposed in this paper and released in this repository.

Model description
-----------------

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

Intended uses & limitations
---------------------------

This model can be used for end-to-end translation of English speech into German, Dutch, Spanish, French, Italian, Portuguese, Romanian, or Russian text. See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

For multilingual speech translation models, 'eos\_token\_id' is used as the 'decoder\_start\_token\_id' and
the target language id is forced as the first generated token. To force the target language id as the first
generated token, pass the 'forced\_bos\_token\_id' parameter to the 'generate()' method. The following
example shows how to translate English speech to French and German text using the 'facebook/s2t-medium-mustc-multilingual-st'
checkpoint.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

Training data
-------------

The s2t-medium-mustc-multilingual-st is trained on MuST-C.
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

Training procedure
------------------

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained on multilingual ASR. For multilingual models, the target language ID token
is used as the target BOS token.

Evaluation results
------------------

MuST-C test results (BLEU score):

### BibTeX entry and citation info
[ "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\nFor multilingual speech translation models, 'eos\\_token\\_id' is used as the 'decoder\\_start\\_token\\_id' and\nthe target language id is forced as the first generated token. To force the target language id as the first\ngenerated token, pass the 'forced\\_bos\\_token\\_id' parameter to the 'generate()' method. The following\nexample shows how to transate English speech to French and German text using the 'facebook/s2t-medium-mustc-multilingual-st'\ncheckpoint.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.\n\n\nTraining data\n-------------\n\n\nThe s2t-medium-mustc-multilingual-st is trained on MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for multilingual ASR. For multilingual models, target language ID token\nis used as target BOS.\n\n\nEvaluation results\n------------------\n\n\nMuST-C test results (BLEU score):", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #nl #es #fr #it #pt #ro #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\nFor multilingual speech translation models, 'eos\\_token\\_id' is used as the 'decoder\\_start\\_token\\_id' and\nthe target language id is forced as the first generated token. To force the target language id as the first\ngenerated token, pass the 'forced\\_bos\\_token\\_id' parameter to the 'generate()' method. The following\nexample shows how to transate English speech to French and German text using the 'facebook/s2t-medium-mustc-multilingual-st'\ncheckpoint.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.\n\n\nTraining data\n-------------\n\n\nThe s2t-medium-mustc-multilingual-st is trained on MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for multilingual ASR. For multilingual models, target language ID token\nis used as target BOS.\n\n\nEvaluation results\n------------------\n\n\nMuST-C test results (BLEU score):", "### BibTeX entry and citation info" ]
[ 96, 385, 99, 109, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #nl #es #fr #it #pt #ro #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\nFor multilingual speech translation models, 'eos\\_token\\_id' is used as the 'decoder\\_start\\_token\\_id' and\nthe target language id is forced as the first generated token. To force the target language id as the first\ngenerated token, pass the 'forced\\_bos\\_token\\_id' parameter to the 'generate()' method. The following\nexample shows how to transate English speech to French and German text using the 'facebook/s2t-medium-mustc-multilingual-st'\ncheckpoint.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.\n\n\nTraining data\n-------------\n\n\nThe s2t-medium-mustc-multilingual-st is trained on MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.\n\n\nTraining procedure\n------------------" ]
[ -0.04295043274760246, 0.05272488296031952, -0.007720061112195253, 0.012852896936237812, 0.07416865229606628, -0.023624036461114883, 0.0581262931227684, 0.050796184688806534, -0.036778103560209274, 0.10694548487663269, -0.003510955022647977, 0.06590155512094498, 0.07739079743623734, 0.1319611519575119, 0.05664187669754028, -0.2318686544895172, 0.04315263405442238, -0.09570411592721939, 0.03471827134490013, 0.09796085953712463, 0.11839251965284348, -0.03801354393362999, 0.04608304426074028, -0.021030785515904427, -0.024647492915391922, 0.039048757404088974, 0.03849959373474121, -0.07394116371870041, 0.06412561982870102, 0.1027415320277214, 0.06950389593839645, -0.0016056213062256575, 0.03719823807477951, -0.17215774953365326, 0.008247639052569866, 0.09019017964601517, 0.00019288071780465543, 0.022905414924025536, 0.11660368740558624, -0.04390707612037659, 0.09745179861783981, -0.06299643963575363, 0.011767206713557243, 0.04961129277944565, -0.08781813830137253, -0.11986935138702393, -0.07166468352079391, 0.06178756430745125, 0.05831260234117508, 0.06910819560289383, -0.06261108815670013, 0.02085118182003498, -0.023226868361234665, 0.08590348809957504, 0.10611293464899063, -0.21330809593200684, -0.024619542062282562, -0.05490780249238014, 0.05603605881333351, 0.09988337010145187, -0.049749888479709625, -0.0017977104289457202, -0.011812496930360794, 0.008517852984368801, 0.06046269088983536, -0.06554190069437027, -0.02443137764930725, -0.06899991631507874, -0.1379205584526062, -0.093533955514431, 0.11110126227140427, 0.016605496406555176, -0.09171892702579498, -0.10688372701406479, -0.037562817335128784, 0.02000553160905838, 0.007675866596400738, -0.05916640907526016, 0.015034814365208149, 0.015956109389662743, 0.004810541868209839, -0.08668376505374908, -0.09192898124456406, -0.004635146353393793, -0.08259414881467819, 0.09997545182704926, 0.018359294161200523, -0.012128542177379131, 0.05474408343434334, 0.09208579361438751, -0.047096166759729385, -0.059830937534570694, -0.022001199424266815, -0.03504568710923195, -0.15237627923488617, -0.020549068227410316, -0.010441024787724018, -0.16452626883983612, -0.019614066928625107, 0.14525961875915527, -0.07658194750547409, 0.047629132866859436, -0.0665191039443016, 0.027719369158148766, 0.021936161443591118, 0.16266511380672455, -0.08243897557258606, -0.047966718673706055, -0.013591485098004341, -0.03453836962580681, 0.0014052735641598701, 0.009027468971908092, -0.016714343801140785, -0.01824299991130829, 0.04124682396650314, 0.05164862796664238, 0.016086596995592117, 0.03742361068725586, -0.013922657817602158, -0.033634595572948456, 0.02578769251704216, -0.17340688407421112, 0.043399911373853683, 0.04399770498275757, -0.03535028547048569, 0.09090849757194519, 0.04762226715683937, -0.008772067725658417, -0.13088278472423553, 0.04989597573876381, -0.006853564642369747, 0.03324365243315697, -0.07460933923721313, -0.13045890629291534, -0.0026663150638341904, -0.03258780017495155, -0.08608841896057129, -0.11855769157409668, -0.06717483699321747, -0.07507338374853134, 0.04316508397459984, -0.05024774372577667, 0.012270170263946056, -0.031184976920485497, -0.0052901641465723515, 0.0026795745361596346, -0.01706302911043167, 0.007662720512598753, -0.01974545791745186, 0.03047327883541584, -0.036978378891944885, 0.040330953896045685, 0.09490768611431122, 0.03837966173887253, -0.053682975471019745, 0.014736679382622242, -0.2642255425453186, 0.20284612476825714, -0.04660416394472122, -0.02556886337697506, -0.1145852655172348, -0.029774345457553864, 
-0.05595702305436134, 0.052907947450876236, 0.0018878558184951544, 0.09141353517770767, -0.13080944120883942, -0.05775387957692146, 0.24179278314113617, -0.07043391466140747, 0.0465984120965004, 0.1421143263578415, -0.016878461465239525, 0.057622700929641724, 0.12560983002185822, 0.09473246335983276, 0.10864084959030151, -0.13056157529354095, 0.0012979498133063316, 0.01910226233303547, -0.08195695281028748, 0.17253008484840393, 0.08851815015077591, -0.07670252025127411, 0.022859398275613785, -0.00142276578117162, -0.024008823558688164, 0.005983355455100536, 0.012676913291215897, -0.007650364656001329, 0.03682715818285942, -0.0035357249435037374, -0.011609338223934174, -0.021612977609038353, -0.027055874466896057, -0.013979156501591206, -0.09676798433065414, 0.08748745918273926, 0.020816026255488396, -0.05329149216413498, 0.04843977093696594, -0.0735212191939354, -0.0038241420406848192, -0.0016706710448488593, -0.0017852564342319965, -0.18025906383991241, 0.009695527143776417, 0.027312202379107475, -0.04668745398521423, 0.15024131536483765, -0.0036651461850851774, 0.03160668909549713, 0.050965819507837296, -0.016764985397458076, 0.01101546548306942, 0.021692687645554543, -0.03756076842546463, 0.015333171933889389, -0.062126677483320236, -0.01949712634086609, -0.024729037657380104, 0.11797010898590088, -0.06971685588359833, 0.0033048936165869236, 0.021543748676776886, 0.07914677262306213, 0.0007187717710621655, -0.02470908872783184, -0.003082399256527424, 0.017096372321248055, 0.006340852007269859, -0.03479821979999542, -0.009853771887719631, 0.019100027158856392, 0.02217419631779194, 0.10636777430772781, -0.1563492715358734, -0.12701794505119324, 0.0777030810713768, -0.011352535337209702, -0.13213273882865906, -0.0081578204408288, -0.01358362939208746, -0.03816330432891846, -0.06763936579227448, -0.14087361097335815, 0.1920335739850998, 0.030399413779377937, 0.08784245699644089, -0.07118188589811325, -0.0457775853574276, -0.019560910761356354, -0.03708011656999588, -0.040370479226112366, 0.06369387358427048, -0.022734640166163445, -0.14870789647102356, 0.05776730552315712, -0.04461619630455971, -0.04229922592639923, 0.12027962505817413, 0.013919885270297527, -0.11389654129743576, -0.014296763576567173, 0.10676898807287216, 0.023522984236478806, -0.05201936885714531, 0.03394543379545212, -0.012895350344479084, 0.04204844310879707, 0.009295226074755192, 0.014975638128817081, -0.05344647914171219, 0.07998810708522797, 0.06387419253587723, -0.0645141527056694, 0.008791359141469002, -0.02159143052995205, 0.039937734603881836, 0.093937449157238, 0.0151161327958107, 0.028469812124967575, -0.02836589142680168, -0.03518069162964821, -0.12212511897087097, 0.11256463080644608, -0.13287808001041412, -0.3202998638153076, -0.1945164054632187, -0.027745699509978294, -0.004852275364100933, 0.04986177384853363, 0.03242745250463486, -0.010000564157962799, -0.049129582941532135, -0.09295766055583954, 0.1001187190413475, 0.011128305457532406, -0.09238632023334503, -0.09583716094493866, 0.027127273380756378, 0.02454407326877117, -0.09748351573944092, 0.013725390657782555, 0.039923954755067825, -0.06615611165761948, -0.01589842140674591, 0.008395783603191376, 0.05772702768445015, 0.10283587127923965, -0.013532646000385284, -0.023913126438856125, -0.028854412958025932, 0.15803539752960205, -0.09346875548362732, 0.10836131125688553, 0.08855108171701431, -0.02962261252105236, 0.040263209491968155, 0.15239952504634857, -0.0016260059783235192, 0.003987574018537998, 0.03621600568294525, 0.020390838384628296, 
-0.040611181408166885, -0.23073315620422363, -0.04247375950217247, -0.05531669780611992, 0.004079627338796854, 0.05990590527653694, 0.03461676463484764, 0.06681409478187561, 0.02229517325758934, -0.07351116836071014, 0.045190829783678055, 0.1464947909116745, 0.06726744771003723, 0.05819854885339737, -0.010821075178682804, 0.07116563618183136, -0.04184573143720627, -0.05020691826939583, 0.09538251906633377, 0.00019074456940870732, 0.1311606615781784, -0.02728254534304142, 0.16723278164863586, 0.03983195871114731, 0.0196215882897377, 0.01398975308984518, 0.09003444761037827, -0.015335530042648315, 0.0559043362736702, -0.01828763075172901, -0.07106127589941025, -0.022789061069488525, 0.05780917778611183, 0.1229557916522026, -0.031971752643585205, 0.024592962116003036, -0.03963802382349968, 0.06692628562450409, 0.19728019833564758, 0.0045852079056203365, -0.1534641534090042, -0.03367186710238457, 0.014472379349172115, -0.07720433175563812, -0.07809239625930786, 0.016503002494573593, 0.12710414826869965, -0.13824664056301117, 0.057321079075336456, -0.009942695498466492, 0.07957390695810318, -0.0494387149810791, -0.027625279501080513, 0.040560804307460785, 0.12924765050411224, -0.008440472185611725, 0.07096804678440094, -0.2176954448223114, 0.08615964651107788, -0.0017339991172775626, 0.05641188472509384, -0.01453876867890358, 0.06408523768186569, -0.007794685196131468, -0.012062577530741692, 0.135116308927536, 0.007357448805123568, -0.12037793546915054, -0.04127954691648483, -0.04856089875102043, -0.03265930712223053, 0.10179701447486877, -0.0612625814974308, 0.05631488189101219, -0.01929217018187046, -0.052811119705438614, -0.07017173618078232, -0.03994590416550636, -0.11544869095087051, -0.15595220029354095, -0.000322656036587432, -0.020460840314626694, 0.09659017622470856, -0.007835695520043373, 0.007747710216790438, -0.1027611792087555, 0.169230118393898, -0.15429766476154327, -0.07957702875137329, -0.09683080017566681, -0.035764873027801514, 0.10854427516460419, -0.05760856345295906, 0.04101092368364334, 0.0115421237424016, 0.13540878891944885, -0.02198578044772148, -0.05154973268508911, 0.04341330751776695, -0.05956856906414032, -0.05313614383339882, -0.0226643867790699, 0.19155625998973846, 0.05714993178844452, 0.054464295506477356, 0.006247274577617645, 0.03836549445986748, 0.030104773119091988, -0.06991098076105118, -0.038400813937187195, 0.063721664249897, 0.038195401430130005, 0.134319007396698, -0.0981496274471283, -0.1870325207710266, -0.12609201669692993, -0.015288402326405048, 0.144557923078537, 0.05795690417289734, -0.07983517646789551, 0.10272913426160812, 0.15989938378334045, -0.08925140649080276, -0.16749371588230133, -0.05074511840939522, 0.08654076606035233, 0.04399537667632103, 0.016783546656370163, -0.20286999642848969, -0.015852289274334908, 0.06615225225687027, 0.026405425742268562, 0.00014103614375926554, -0.25234347581863403, -0.1251085102558136, 0.07667078822851181, 0.014426419511437416, -0.17159049212932587, -0.08127908408641815, -0.07753465324640274, -0.07273383438587189, -0.019959112629294395, 0.05718662217259407, -0.11004331707954407, 0.046342477202415466, 0.11189284175634384, 0.018870871514081955, 0.029687166213989258, -0.010347947478294373, 0.08431409299373627, 0.04342999681830406, -0.015515405684709549, -0.06022368744015694, 0.05211038514971733, 0.026857363060116768, -0.009948795661330223, 0.13854718208312988, 0.013218739069998264, -0.008153703995049, -0.03632913902401924, -0.048923272639513016, -0.06895796209573746, 0.056386224925518036, -0.03553890809416771, 
-0.00843917578458786, -0.04874660074710846, 0.037902943789958954, 0.06016591563820839, -0.01324487291276455, -0.0870751217007637, -0.11087627708911896, -0.0004980175872333348, 0.22436843812465668, 0.1336228996515274, 0.10591119527816772, -0.13093367218971252, -0.012703219428658485, -0.0008372793672606349, 0.029831958934664726, 0.06894702464342117, 0.04932351037859917, 0.06270021945238113, 0.0012599261244758964, 0.16604384779930115, -0.020044496282935143, -0.13009902834892273, 0.017224770039319992, 0.03832381218671799, -0.07777966558933258, -0.11788751929998398, -0.012524059042334557, -0.035473838448524475, -0.011348935775458813, -0.09292145818471909, 0.15266695618629456, -0.018437163904309273, -0.023906119167804718, 0.014431999064981937, 0.03934821859002113, -0.052942872047424316, 0.1374216228723526, -0.00874274130910635, 0.026462746784090996, -0.0644013062119484, 0.08293554931879044, 0.11308625340461731, -0.10896244645118713, -0.027290400117635727, 0.15762057900428772, -0.08368697017431259, -0.039462052285671234, -0.06170164793729782, -0.05799844488501549, -0.008178513497114182, -0.04871900752186775, 0.021652214229106903, -0.09351473301649094, 0.05701382830739021, 0.08925638347864151, -0.00472306041046977, 0.03451981022953987, -0.02038044109940529, 0.0034712553024291992, -0.03555510193109512, 0.08223696053028107, 0.07884884625673294, 0.004330149386078119, -0.03784499689936638, 0.16495180130004883, 0.01629331149160862, 0.020396409556269646, -0.02261892706155777, -0.04626547545194626, -0.057064276188611984, 0.016072534024715424, -0.03367098793387413, 0.05029406026005745, -0.0579444020986557, -0.03981853276491165, 0.010025626979768276, -0.003338401671499014, 0.0165412500500679, 0.01771756261587143, -0.04057379439473152, -0.03741971030831337, -0.05205744504928589, 0.07401107251644135, -0.12937995791435242, 0.006839434616267681, 0.05953469127416611, -0.07766234874725342, 0.06698891520500183, 0.0883241519331932, -0.060418788343667984, 0.021752962842583656, -0.07252293825149536, -0.031491827219724655, -0.0034580486826598644, 0.03390798345208168, 0.000943778723012656, -0.1567297875881195, 0.03976196050643921, 0.01740403287112713, -0.02702038362622261, -0.036242660135030746, 0.01409618929028511, -0.0866423174738884, 0.07951762527227402, 0.005619081202894449, -0.048229046165943146, -0.07471375912427902, 0.046499691903591156, 0.03716979920864105, 0.008156867697834969, 0.08341963589191437, -0.09346331655979156, 0.06728126853704453, -0.1023065447807312, -0.00048253018758259714, 0.019042322412133217, 0.013841080479323864, -0.045174431055784225, -0.03223934397101402, 0.05635737627744675, 0.013419405557215214, 0.09007883071899414, 0.031297795474529266, 0.021535933017730713, 0.033243339508771896, -0.11732149124145508, -0.014727155677974224, 0.06523170322179794, 0.020288778468966484, 0.020558951422572136, -0.024415312334895134, -0.06826458126306534, -0.04274376109242439, -0.01805490255355835, -0.03227820619940758, 0.0875418409705162, 0.15609347820281982, 0.12211082130670547, 0.03613138943910599, 0.04614211618900299, -0.023314526304602623, -0.09875527769327164, -0.0034377577248960733, -0.052481040358543396, 0.0005217768484726548, -0.06116960570216179, 0.11698506027460098, 0.13791818916797638, -0.16136883199214935, 0.1182486042380333, 0.03896379470825195, -0.06763403117656708, -0.07420361787080765, -0.1661670207977295, -0.03303312137722969, -0.022062690928578377, -0.02334553375840187, -0.07887829095125198, 0.08106332272291183, -0.013959256932139397, 0.06152942031621933, -0.0012930374359712005, 
0.13908077776432037, -0.18156643211841583, -0.17136773467063904, 0.07409389317035675, 0.006742801982909441, 0.09793692082166672, 0.05839728191494942, 0.011309105902910233, 0.06498327851295471, 0.0509142205119133, 0.08972349762916565, 0.06728820502758026, 0.06990466266870499, 0.024245603010058403, -0.04940134659409523, -0.018725009635090828, 0.009484786540269852, 0.005731549579650164, -0.00484315212816, 0.14083905518054962, 0.0713532418012619, -0.06727749854326248, -0.006989257875829935, 0.10846766084432602, -0.06779021769762039, -0.11989777535200119, -0.17460502684116364, 0.1733238697052002, 0.02308756671845913, 0.05415252968668938, -0.003069293685257435, -0.09536469727754593, -0.043863557279109955, 0.16440194845199585, 0.1538083553314209, 0.0007499026251025498, 0.00808939803391695, 0.0116133326664567, 0.021218443289399147, 0.03893595188856125, 0.07280214875936508, 0.0435103178024292, 0.26090720295906067, -0.024604856967926025, 0.07271544635295868, -0.021991482004523277, -0.031354714184999466, -0.06799774616956711, 0.08602259308099747, -0.08710398524999619, -0.0034355237148702145, -0.012946984730660915, 0.12669266760349274, -0.0503903329372406, -0.28009673953056335, -0.053466975688934326, 0.04724910110235214, -0.057787105441093445, 0.02330101653933525, 0.00804431363940239, 0.051126327365636826, 0.030803849920630455, 0.0444830022752285, -0.03754303231835365, 0.158074289560318, 0.028894536197185516, -0.04628315567970276, -0.011625716462731361, 0.03587432950735092, -0.05078477784991264, 0.10192850977182388, -0.015875548124313354, 0.10926436632871628, 0.061639364808797836, 0.026539208367466927, -0.049693357199430466, 0.059030577540397644, 0.05082641541957855, -0.048849739134311676, 0.059222523123025894, 0.15825286507606506, -0.00426911748945713, 0.08412391692399979, 0.03321779891848564, -0.07458622008562088, 0.06463704258203506, -0.027157796546816826, -0.011122601106762886, -0.11006604880094528, 0.06353505700826645, -0.10695405304431915, 0.09769846498966217, 0.1958131194114685, 0.011418751440942287, 0.013832522556185722, -0.060090240091085434, -0.015731286257505417, 0.0070669641718268394, 0.0846983939409256, 0.006262531504034996, -0.13159814476966858, -0.012490950524806976, -0.0720076635479927, 0.06389334797859192, -0.20296159386634827, -0.05046675354242325, -0.0038175799418240786, -0.02644292637705803, -0.024345707148313522, 0.09414377063512802, -0.003281247103586793, 0.029049977660179138, -0.026723645627498627, -0.10570714622735977, 0.0177974421530962, 0.11832020431756973, -0.1067596971988678, -0.012728319503366947 ]
null
null
transformers
# S2T-SMALL-COVOST2-CA-EN-ST

`s2t-small-covost2-ca-en-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text)


## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end Catalan speech to English text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.


### How to use

As this is a standard sequence to sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.


```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-ca-en-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-ca-en-st")

def map_to_array(batch):
    # read the raw waveform for each example into memory
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# Note: these dummy LibriSpeech clips are sampled at 16 kHz, while this model
# was trained on 48 kHz Common Voice audio; resample your own audio to 48 kHz.
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-ca-en-st is trained on the Catalan-English subset of [CoVoST2](https://github.com/facebookresearch/covost).
CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster
ST research with the largest-ever open dataset.


## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.


### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and for better performance, the encoder is pre-trained for English ASR.
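The utterance-level CMVN step described under Preprocessing is straightforward to write down. Below is a minimal sketch, assuming `features` is a hypothetical `(num_frames, 80)` NumPy array of log mel filter bank features for a single utterance; it illustrates the normalization, not the exact fairseq implementation.

```python
import numpy as np

def utterance_cmvn(features: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance normalization per feature channel,
    computed over the frames of one utterance (utterance-level CMVN)."""
    mean = features.mean(axis=0, keepdims=True)
    std = features.std(axis=0, keepdims=True)
    return (features - mean) / (std + 1e-8)  # epsilon avoids division by zero
```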
## Evaluation results

CoVoST2 test results for ca-en (BLEU score): 17.85



### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
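For context on how a corpus-level BLEU score such as the 17.85 above is typically computed, here is a minimal sketch using sacrebleu; `hypotheses` and `references` are hypothetical placeholder lists standing in for detokenized model outputs and the CoVoST2 test references.

```python
import sacrebleu

hypotheses = ["a cat sits on the mat"]          # model translations (placeholder)
references = [["a cat is sitting on the mat"]]  # one list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```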
{"language": ["ca", "en"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-ca-en-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "ca", "en", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "ca", "en" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #ca #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-CA-EN-ST

's2t-small-covost2-ca-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end Catalan speech to English text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence to sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-covost2-ca-en-st is trained on the Catalan-English subset of CoVoST2.
CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster
ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and for better performance, the encoder is pre-trained for English ASR.

## Evaluation results

CoVoST2 test results for ca-en (BLEU score): 17.85

### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-CA-EN-ST\n\n's2t-small-covost2-ca-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end Catalan speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-ca-en-st is trained on Catalan-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for ca-en (BLEU score): 17.85", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #ca #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-CA-EN-ST\n\n's2t-small-covost2-ca-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end Catalan speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-ca-en-st is trained on Catalan-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for ca-en (BLEU score): 17.85", "### BibTeX entry and citation info" ]
[ 91, 78, 111, 42, 137, 66, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #ca #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-CA-EN-ST\n\n's2t-small-covost2-ca-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end Catalan speech to English text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.08856365829706192, 0.12162001430988312, -0.006377975456416607, 0.008821009658277035, 0.06794111430644989, -0.07096020877361298, 0.012328674085438251, 0.0583714097738266, -0.13205890357494354, 0.05543345957994461, 0.03678899258375168, 0.105362169444561, 0.061496272683143616, 0.022429006174206734, -0.013097044080495834, -0.2541000247001648, 0.027621479704976082, -0.06542643904685974, 0.054205235093832016, 0.0826437920331955, 0.12064840644598007, -0.03125684708356857, 0.07902194559574127, 0.03215329721570015, 0.005944593343883753, 0.012388551607728004, 0.05693832412362099, -0.07705742120742798, 0.06496786326169968, 0.11462347954511642, 0.07646302878856659, 0.02481905184686184, 0.06435678899288177, -0.13501329720020294, 0.005368209443986416, 0.03535857051610947, 0.01840078830718994, 0.05043044313788414, 0.0219193734228611, -0.0009882571175694466, 0.12591448426246643, 0.014286569319665432, 0.035409100353717804, 0.08872728049755096, -0.0756077989935875, -0.031499315053224564, -0.04084381088614464, 0.006585846655070782, 0.16484645009040833, 0.05696184188127518, -0.03633999079465866, -0.02247426100075245, -0.017539002001285553, 0.0415450818836689, 0.025706565007567406, -0.20233047008514404, -0.023447806015610695, -0.04596567898988724, 0.01581873930990696, 0.021676979959011078, -0.01637716218829155, -0.016788728535175323, -0.027416903525590897, 0.006632172502577305, 0.05994429439306259, -0.07063847780227661, 0.031428027898073196, -0.06251083314418793, -0.10115586221218109, -0.04254469648003578, 0.07919112592935562, -0.028760943561792374, -0.0816722884774208, -0.11673075705766678, -0.060267556458711624, 0.005292996298521757, -0.016013938933610916, -0.056153010576963425, -0.0029067895375192165, 0.03656493499875069, 0.0382884219288826, -0.10899300128221512, -0.11638080328702927, -0.016689367592334747, -0.09907685220241547, 0.07819583266973495, 0.039183609187603, -0.00378803675994277, 0.010881238617002964, 0.09716589748859406, -0.08516107499599457, -0.0680551826953888, -0.05395103245973587, -0.07605894654989243, -0.10889220982789993, -0.011798408813774586, -0.006897968240082264, -0.21060094237327576, -0.02710612118244171, 0.14580054581165314, -0.11942833662033081, 0.03588749095797539, 0.035160455852746964, 0.025053756311535835, -0.003271793480962515, 0.18959370255470276, -0.04800581932067871, -0.07566185295581818, 0.05728740245103836, -0.037348970770835876, 0.03941866382956505, -0.0010486283572390676, -0.11465054005384445, -0.013965075835585594, -0.09322229027748108, 0.05178944393992424, 0.07867035269737244, 0.01777375489473343, -0.007868909277021885, -0.07816296815872192, 0.1714702844619751, -0.11133790761232376, 0.014512770809233189, 0.043711673468351364, -0.044412121176719666, 0.11065785586833954, 0.037954527884721756, -0.010851847939193249, -0.12009921669960022, 0.020143430680036545, -0.004052780102938414, 0.014279953204095364, -0.0894199088215828, -0.1072007566690445, 0.004437181632965803, -0.07058779895305634, -0.05540773645043373, -0.11420345306396484, -0.08076491206884384, -0.05143841356039047, 0.019886577501893044, -0.07298596948385239, 0.041693441569805145, -0.10014437884092331, -0.05509123206138611, -0.0005556417163461447, -0.043728236109018326, -0.15926124155521393, 0.008622190915048122, 0.012590289115905762, -0.006424008868634701, 0.043848875910043716, -0.06816232949495316, 0.019360864534974098, -0.06719758361577988, 0.00369619601406157, -0.25100386142730713, 0.19969163835048676, 0.04592394828796387, 0.015656130388379097, -0.10951186716556549, -0.05918760225176811, 
-0.06959740072488785, 0.06430230289697647, 0.04343174397945404, 0.1397811770439148, -0.24748766422271729, -0.02690259739756584, 0.20364859700202942, -0.06920535117387772, 0.02585393749177456, 0.12015841901302338, 0.009961592964828014, 0.0734856054186821, 0.13349445164203644, 0.08475877344608307, 0.1032361313700676, -0.027912704274058342, -0.08542495965957642, 0.01787250116467476, -0.11995130032300949, 0.04148927703499794, 0.06305722147226334, -0.08771521598100662, 0.1368063986301422, 0.015233989804983139, -0.05469627305865288, -0.02600141614675522, 0.07035166025161743, -0.03128774091601372, 0.048632048070430756, -0.008079544641077518, 0.0933256670832634, -0.023211335763335228, -0.009133265353739262, 0.01456372905522585, -0.12962818145751953, 0.14716671407222748, 0.015976186841726303, -0.07804573327302933, 0.06812167912721634, -0.15254421532154083, -0.02367439679801464, -0.031118270009756088, 0.019717222079634666, -0.18234772980213165, 0.06000794470310211, -0.040169090032577515, -0.006207338534295559, 0.1539520025253296, 0.026294799521565437, 0.06776100397109985, 0.03041374683380127, -0.00834246538579464, 0.031413402408361435, 0.08267483115196228, -0.0079309968277812, 0.004844686482101679, -0.07483357936143875, -0.010139274410903454, -0.023706601932644844, 0.09968595206737518, -0.10375428944826126, 0.03619828820228577, 0.049247823655605316, 0.04698371887207031, 0.025179103016853333, -0.025782795622944832, -0.02952689491212368, 0.012167575769126415, -0.0064398315735161304, -0.04644622653722763, 0.0004662392311729491, 0.030704259872436523, -0.06821145117282867, 0.13729320466518402, -0.19953171908855438, -0.1760285198688507, 0.07600712776184082, -0.04446212202310562, -0.05426141247153282, 0.0216875821352005, 0.004271803889423609, 0.0062781041488051414, 0.041241493076086044, -0.1476142853498459, 0.2623315155506134, 0.04763771593570709, 0.14410828053951263, -0.08944369852542877, -0.03835412487387657, -0.04530378058552742, -0.06369839608669281, 0.04373093321919441, 0.052384402602910995, 0.06427051872015, -0.15936386585235596, -0.0016088866395875812, 0.024204693734645844, -0.006705915089696646, 0.11767943203449249, 0.01491051260381937, -0.061072561889886856, -0.01343126967549324, 0.04271392896771431, 0.05275455117225647, -0.008551066741347313, 0.023162739351391792, 0.025400837883353233, 0.026801180094480515, 0.023721899837255478, 0.023633120581507683, -0.07878223061561584, 0.08600785583257675, 0.08114490658044815, -0.05297551304101944, -0.061206698417663574, -0.01566288433969021, 0.0010125439148396254, 0.049670372158288956, 0.033738307654857635, 0.005470651667565107, -0.019072027876973152, -0.021492743864655495, -0.10620857030153275, 0.12782038748264313, -0.17090339958667755, -0.2639632821083069, -0.19844649732112885, -0.00997903011739254, -0.007540344726294279, 0.1019224002957344, 0.047149065881967545, -0.07751908898353577, -0.049888189882040024, -0.07262732833623886, 0.06316740810871124, -0.007385592442005873, -0.06611529737710953, -0.14580748975276947, 0.05658639594912529, 0.007828355766832829, -0.08843803405761719, 0.00968884862959385, 0.005433339625597, -0.09365636110305786, -0.03797697275876999, 0.004763543605804443, -0.028280222788453102, 0.2038053572177887, -0.018880266696214676, -0.044372837990522385, -0.03357827663421631, 0.10244151949882507, -0.052942581474781036, 0.09831974655389786, 0.1640816330909729, -0.09971761703491211, 0.024193719029426575, 0.14107656478881836, 0.03492754325270653, -0.028315288946032524, -0.016667433083057404, -0.03907524421811104, -0.08824031800031662, 
-0.1958642601966858, -0.038710154592990875, -0.04880410060286522, 0.026148857548832893, 0.040628883987665176, 0.009306102059781551, 0.08587817847728729, 0.03758629411458969, -0.06333163380622864, -0.00007425196963595226, 0.08258319646120071, 0.08507034182548523, 0.19500841200351715, -0.0619993694126606, 0.08603331446647644, -0.08111944049596786, -0.027048707008361816, 0.08410122245550156, -0.025535937398672104, 0.15190371870994568, 0.017991406843066216, 0.2103239744901657, 0.06262712180614471, -0.023845432326197624, 0.028685331344604492, 0.03955568000674248, -0.019012589007616043, 0.008856063708662987, -0.019075676798820496, -0.07438191026449203, -0.07303473353385925, 0.07100439071655273, 0.06684818118810654, -0.027352437376976013, 0.008152861148118973, 0.02307099476456642, 0.04248148947954178, 0.072480708360672, 0.05073990672826767, -0.1602386236190796, -0.08196595311164856, 0.022848688066005707, -0.11680325120687485, -0.025503601878881454, 0.00012660303036682308, 0.16818256676197052, -0.0614493228495121, 0.005703664384782314, 0.014203456230461597, 0.10376566648483276, -0.13488414883613586, 0.036993518471717834, 0.015199628658592701, 0.15553317964076996, 0.055730029940605164, 0.010856577195227146, -0.0449373759329319, 0.07135405391454697, 0.03181205689907074, 0.09875717014074326, -0.00647998321801424, 0.07180225104093552, -0.013195956125855446, 0.035912539809942245, 0.07412221282720566, -0.008096607401967049, -0.15601256489753723, -0.0022525161039084196, -0.10480163991451263, -0.03282948583364487, 0.07030521333217621, -0.06237967312335968, 0.06431978195905685, -0.038334496319293976, -0.039825454354286194, -0.06563271582126617, -0.11969099193811417, -0.012791994959115982, -0.18911047279834747, 0.018123261630535126, -0.00010511792788747698, 0.03348906338214874, -0.004885573405772448, 0.06565839797258377, -0.005262780003249645, 0.16403299570083618, -0.2645528316497803, -0.08728336542844772, -0.08811074495315552, -0.07116252928972244, 0.12401623278856277, -0.06350068747997284, 0.07657834887504578, 0.04781355708837509, 0.04011780396103859, 0.016964701935648918, -0.04813610389828682, 0.0638628900051117, -0.09105879813432693, -0.0005824498948641121, -0.030293839052319527, 0.17302006483078003, 0.025077171623706818, 0.07819817215204239, 0.0709296390414238, 0.025646883994340897, 0.006028241012245417, -0.10917685180902481, -0.11121085286140442, 0.13205942511558533, 0.005856761708855629, -0.0000155227226059651, -0.02397547848522663, -0.09568379074335098, -0.05069878324866295, 0.05129331722855568, 0.11893779039382935, 0.07110094279050827, -0.09239166975021362, 0.11777793616056442, 0.11522295325994492, -0.06829480081796646, -0.16885687410831451, -0.07356815040111542, 0.06729735434055328, 0.04347851499915123, 0.032317932695150375, -0.16179302334785461, -0.001820003380998969, 0.024797825142741203, -0.006433616392314434, -0.02257268875837326, -0.2802003026008606, -0.11677461117506027, 0.03461182489991188, -0.03341741859912872, 0.015120603144168854, 0.002730770967900753, -0.1008397787809372, -0.05234656482934952, -0.04897177591919899, 0.02078930288553238, -0.07117956131696701, 0.058417681604623795, 0.08626203238964081, -0.026249241083860397, 0.05176329240202904, 0.012748750858008862, 0.08595199882984161, -0.02714119665324688, -0.055963583290576935, -0.05425683408975601, 0.05694151669740677, 0.02676420845091343, -0.031968794763088226, 0.1446613371372223, -0.0030465831514447927, -0.002094775903970003, -0.09682177752256393, -0.05301601439714432, 0.00645574601367116, -0.01070968434214592, 
-0.007902521640062332, 0.014874420128762722, -0.02163865603506565, -0.02230321615934372, 0.03347824141383171, 0.03013424575328827, -0.07766824960708618, -0.17117510735988617, -0.11289063841104507, 0.20253600180149078, 0.24498510360717773, 0.022967102006077766, -0.05080768093466759, 0.003042643889784813, 0.02685278281569481, 0.026522599160671234, -0.09687282145023346, 0.031161557883024216, 0.042637404054403305, -0.005052150227129459, 0.036604639142751694, -0.0064994352869689465, -0.05499991029500961, 0.05576309561729431, 0.07830867171287537, -0.004102886654436588, -0.1476764976978302, -0.030414175242185593, 0.0005002972320653498, -0.06281878054141998, 0.0155328419059515, 0.188740536570549, -0.0036661713384091854, -0.0075938403606414795, 0.005832806695252657, 0.011066538281738758, -0.058713484555482864, 0.10808037221431732, -0.025195393711328506, 0.0002617636928334832, -0.08361619710922241, -0.0067266677506268024, 0.08207852393388748, -0.017966177314519882, 0.023112554103136063, 0.10261429101228714, -0.07626709342002869, -0.059751953929662704, -0.170430526137352, -0.14381143450737, -0.11433228105306625, -0.05436832457780838, -0.017606081441044807, -0.06061537563800812, 0.010494890622794628, 0.06661759316921234, -0.0134632159024477, 0.06046406179666519, -0.08026944845914841, 0.0067208921536803246, -0.048298630863428116, 0.03084285371005535, 0.03425202891230583, 0.003279207507148385, -0.025733590126037598, 0.1034398227930069, -0.013258999213576317, -0.04470761492848396, -0.012225997634232044, -0.02577744610607624, -0.04014172405004501, 0.02339889667928219, -0.11203675717115402, -0.016008058562874794, -0.05369260907173157, -0.055580344051122665, 0.04569263383746147, 0.017838239669799805, -0.007304822094738483, 0.011012979783117771, -0.030278025195002556, -0.008246834389865398, -0.04044199734926224, 0.019913379102945328, -0.07691939920186996, 0.08152636885643005, 0.02146541327238083, -0.052377644926309586, 0.07127702981233597, 0.05209442228078842, -0.07222332060337067, 0.08026522397994995, -0.008413433097302914, -0.055645596235990524, -0.0021409012842923403, 0.05457865074276924, 0.050723593682050705, -0.033299341797828674, 0.039022766053676605, 0.05488646402955055, -0.03460052236914635, -0.050373900681734085, -0.005645338911563158, -0.04991133511066437, 0.13241417706012726, 0.021615196019411087, -0.021952060982584953, -0.051819127053022385, 0.05242711678147316, 0.06505317240953445, -0.012027986347675323, 0.1359502524137497, -0.0531168095767498, 0.03870308771729469, -0.06406525522470474, 0.02138122171163559, 0.007583135273307562, 0.011448451317846775, -0.0040580760687589645, -0.09167115390300751, 0.026909470558166504, 0.020155422389507294, -0.002645016647875309, -0.02314736135303974, 0.09276086091995239, 0.049746330827474594, -0.03430444747209549, 0.01770511083304882, 0.06015869230031967, 0.07185220718383789, 0.059587832540273666, -0.024351857602596283, -0.11388155817985535, 0.046773530542850494, -0.011237435042858124, -0.1871684342622757, 0.05340512841939926, 0.07367698103189468, 0.0494372583925724, 0.10319837927818298, 0.065419502556324, 0.08957819640636444, -0.14710532128810883, 0.06889715790748596, -0.04589410126209259, -0.03690272942185402, -0.05436805263161659, 0.07242005318403244, 0.1883571892976761, -0.08818355202674866, 0.09333235770463943, 0.07168585062026978, -0.04829777032136917, -0.1380624771118164, -0.1709374338388443, -0.02081318013370037, -0.061538614332675934, -0.040713511407375336, -0.0512838289141655, 0.0297504011541605, -0.020664647221565247, 0.05576741322875023, 
-0.023627175018191338, 0.15478424727916718, -0.08977952599525452, -0.11989355087280273, 0.011797016486525536, 0.005792532581835985, 0.12790659070014954, 0.06791084259748459, 0.04715204983949661, 0.06058457866311073, 0.05947837233543396, 0.10599067807197571, 0.06955761462450027, 0.055311571806669235, -0.007227677386254072, -0.03917800262570381, -0.013763627037405968, -0.012103232555091381, 0.06118886172771454, 0.01425621472299099, 0.21687084436416626, 0.07681798189878464, -0.1123303696513176, -0.0023997013922780752, 0.14519722759723663, -0.06721779704093933, -0.13653676211833954, -0.18778514862060547, 0.15760178864002228, 0.11128034442663193, 0.07423357665538788, -0.018524272367358208, -0.10058044642210007, 0.02553071826696396, 0.15581990778446198, 0.1416972279548645, -0.024199383333325386, 0.0004757933202199638, -0.004761050920933485, 0.02281382493674755, -0.019895320758223534, 0.06038442254066467, -0.001399847911670804, 0.37324076890945435, 0.017331447452306747, 0.002415285911411047, 0.04384296387434006, -0.010902469977736473, -0.10891552269458771, 0.020032154396176338, -0.07533946633338928, -0.021020229905843735, 0.013163385912775993, 0.11223170906305313, -0.022727560251951218, -0.15432485938072205, -0.053545720875263214, 0.016383837908506393, -0.020738836377859116, 0.04106266051530838, 0.06936415284872055, 0.045499760657548904, 0.04880290850996971, 0.0269855298101902, -0.08225712180137634, 0.18984369933605194, 0.00864969752728939, -0.0468117780983448, 0.02842831052839756, 0.050872802734375, -0.11642684042453766, 0.1340310126543045, -0.0341305136680603, 0.06356211006641388, 0.07808063924312592, 0.05660109221935272, -0.07959957420825958, 0.043591830879449844, -0.021640725433826447, -0.10389028489589691, 0.10019279271364212, 0.12327796220779419, 0.000619158148765564, 0.09295529127120972, 0.029977209866046906, -0.08550114184617996, 0.06120087206363678, -0.01915726251900196, -0.024733638390898705, -0.06620136648416519, 0.0961955115199089, -0.086795873939991, 0.1194094866514206, 0.11568697541952133, -0.012424956075847149, 0.04073493182659149, -0.021859806030988693, 0.009687583893537521, 0.00590038625523448, 0.06041865050792694, -0.01989622414112091, -0.10444744676351547, -0.035025909543037415, -0.045858386904001236, 0.025064243003726006, -0.18534183502197266, -0.042180385440588, 0.01055643055588007, 0.005505953449755907, -0.029705703258514404, 0.05546743422746658, 0.06483033299446106, 0.04073449596762657, -0.02490823343396187, 0.02921861596405506, -0.018045935779809952, 0.05138656124472618, -0.07694929838180542, -0.06556198745965958 ]
null
null
transformers
# S2T-SMALL-COVOST2-DE-EN-ST

`s2t-small-covost2-de-en-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text)


## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end German speech to English text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.


### How to use

As this is a standard sequence to sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.


```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-de-en-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-de-en-st")

def map_to_array(batch):
    # read the raw waveform for each example into memory
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# Note: these dummy LibriSpeech clips are sampled at 16 kHz, while this model
# was trained on 48 kHz Common Voice audio; resample your own audio to 48 kHz.
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-de-en-st is trained on the German-English subset of [CoVoST2](https://github.com/facebookresearch/covost).
CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster
ST research with the largest-ever open dataset.


## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.


### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and for better performance, the encoder is pre-trained for English ASR.
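The SpecAugment step mentioned in the Training section masks random time and frequency bands of the filter bank features. Here is a minimal sketch of that idea on a hypothetical `(num_frames, 80)` array; the mask widths and counts below are illustrative assumptions, not the policy used for this checkpoint.

```python
import numpy as np

def spec_augment(features: np.ndarray, max_t: int = 40, max_f: int = 13,
                 num_masks: int = 2) -> np.ndarray:
    """Zero out random time and frequency bands of a (num_frames, num_channels) matrix."""
    out = features.copy()
    num_frames, num_channels = out.shape
    for _ in range(num_masks):
        t = np.random.randint(0, max_t + 1)                  # time-mask width
        t0 = np.random.randint(0, max(1, num_frames - t))    # time-mask start
        out[t0:t0 + t, :] = 0.0
        f = np.random.randint(0, max_f + 1)                  # frequency-mask width
        f0 = np.random.randint(0, max(1, num_channels - f))  # frequency-mask start
        out[:, f0:f0 + f] = 0.0
    return out
```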
## Evaluation results

CoVoST2 test results for de-en (BLEU score): 17.58



### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
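The usage example earlier in this card feeds 16 kHz LibriSpeech dummy audio into a processor configured for 48 kHz input. As a practical complement, here is a minimal sketch of resampling audio to the 48 kHz training rate with torchaudio before feature extraction; the file name is a hypothetical placeholder.

```python
import torchaudio

waveform, orig_sr = torchaudio.load("example.wav")  # hypothetical input file
target_sr = 48_000  # CoVoST2 / Common Voice sampling rate
if orig_sr != target_sr:
    waveform = torchaudio.transforms.Resample(orig_freq=orig_sr, new_freq=target_sr)(waveform)
speech = waveform.squeeze(0).numpy()  # 1-D array, ready for Speech2TextProcessor
```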
{"language": ["de", "en"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-de-en-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "de", "en", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "de", "en" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #de #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-DE-EN-ST

's2t-small-covost2-de-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end German speech to English text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence to sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-covost2-de-en-st is trained on the German-English subset of CoVoST2.
CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster
ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and for better performance, the encoder is pre-trained for English ASR.

## Evaluation results

CoVoST2 test results for de-en (BLEU score): 17.58

### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-DE-EN-ST\n\n's2t-small-covost2-de-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end German speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-de-en-st is trained on German-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for de-en (BLEU score): 17.58", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #de #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-DE-EN-ST\n\n's2t-small-covost2-de-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end German speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-de-en-st is trained on German-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for de-en (BLEU score): 17.58", "### BibTeX entry and citation info" ]
[ 91, 78, 111, 42, 137, 66, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #de #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-DE-EN-ST\n\n's2t-small-covost2-de-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end German speech to English text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.07959844172000885, 0.11093932390213013, -0.005666304379701614, -0.0004316555568948388, 0.05977140739560127, -0.056204233318567276, 0.042479678988456726, 0.05880303308367729, -0.10837632417678833, 0.04533608257770538, 0.03511066734790802, 0.056871965527534485, 0.05244942754507065, 0.06201830506324768, -0.00443286681547761, -0.2697967290878296, 0.024623697623610497, -0.06309037655591965, 0.057551171630620956, 0.08161154389381409, 0.12563134729862213, -0.043595049530267715, 0.07127653807401657, 0.02665255405008793, -0.004747715778648853, 0.013247056864202023, 0.05989260599017143, -0.053286030888557434, 0.05459916219115257, 0.10308681428432465, 0.06996185332536697, 0.015446732752025127, 0.07269063591957092, -0.13773790001869202, 0.002440096577629447, 0.04102310538291931, 0.017735380679368973, 0.03900071606040001, -0.001159056555479765, 0.007926912046968937, 0.12739084661006927, 0.02246907725930214, 0.01895548589527607, 0.09597368538379669, -0.05754261836409569, -0.018645910546183586, -0.04813345521688461, 0.04680498689413071, 0.15525278449058533, 0.07019683718681335, -0.038170985877513885, -0.012505956925451756, -0.02153545431792736, 0.05836646258831024, 0.015180272981524467, -0.20970378816127777, -0.016517842188477516, -0.022948920726776123, -0.00021062714222352952, -0.00556535366922617, -0.04316031560301781, -0.02314223349094391, -0.03305819630622864, 0.010038209147751331, 0.06853844970464706, -0.06797762960195541, 0.06438829004764557, -0.07197891175746918, -0.11718527972698212, -0.02824379689991474, 0.08957846462726593, -0.002735141199082136, -0.0967429131269455, -0.12325581163167953, -0.05518626794219017, 0.003082897048443556, -0.01914147287607193, -0.044462986290454865, -0.015464095398783684, 0.037978943437337875, 0.023230556398630142, -0.12509006261825562, -0.12194753438234329, -0.007230402901768684, -0.08438138663768768, 0.11971044540405273, 0.037024322897195816, 0.0003701134992297739, -0.003266718937084079, 0.07633771002292633, -0.11336709558963776, -0.08126042783260345, -0.05400428920984268, -0.09651851654052734, -0.10499444603919983, -0.014698551036417484, -0.011969029903411865, -0.2282729297876358, -0.04215290769934654, 0.1604790836572647, -0.07517122477293015, 0.04145660996437073, 0.03824445977807045, 0.024781523272395134, 0.0040976922027766705, 0.204897940158844, -0.037805646657943726, -0.08711045235395432, 0.06673182547092438, -0.03314988687634468, 0.04302472248673439, 0.0017314294818788767, -0.10019808262586594, 0.0032142293639481068, -0.0821191668510437, 0.02996375970542431, 0.05378333479166031, 0.03372206911444664, 0.0016889347461983562, -0.06654629856348038, 0.14618249237537384, -0.12159235775470734, 0.005998349282890558, 0.033817168325185776, -0.0544363372027874, 0.13269822299480438, 0.04230531305074692, -0.01576809585094452, -0.11933423578739166, 0.056024860590696335, 0.0007838630117475986, 0.022339044138789177, -0.0975116565823555, -0.1149759590625763, 0.0010292555671185255, -0.06608428806066513, -0.05185198411345482, -0.12067898362874985, -0.08961618691682816, -0.041975297033786774, 0.0207999125123024, -0.037026241421699524, 0.03867074102163315, -0.0817520022392273, -0.07631724327802658, 0.00751333637163043, -0.037730660289525986, -0.17903189361095428, 0.014496399089694023, -0.0020343265496194363, -0.022520270198583603, 0.03683270141482353, -0.05243348330259323, 0.013117482885718346, -0.07895122468471527, -0.00996506866067648, -0.26741501688957214, 0.1810939610004425, 0.010137378238141537, 0.019344469532370567, -0.08891519904136658, -0.06397014856338501, 
-0.06559363752603531, 0.05430813506245613, 0.033418506383895874, 0.14739415049552917, -0.24832428991794586, -0.02665707655251026, 0.21005497872829437, -0.09219473600387573, 0.03056013025343418, 0.12113965302705765, 0.0023653963580727577, 0.053787101060152054, 0.13142763078212738, 0.11386442184448242, 0.12865154445171356, -0.05522743612527847, -0.10308624804019928, 0.0283972155302763, -0.12456338852643967, 0.07655853778123856, 0.04541445150971413, -0.06733087450265884, 0.12836481630802155, 0.010387144982814789, -0.03777064010500908, -0.006237676367163658, 0.0651441365480423, -0.02628018520772457, 0.04190092906355858, -0.017981722950935364, 0.11582767218351364, -0.03095916286110878, -0.012769319117069244, 0.0027350096497684717, -0.1429426521062851, 0.16001546382904053, 0.03754850849509239, -0.08240530639886856, 0.05652307718992233, -0.14296463131904602, -0.06636524200439453, 0.0008393962634727359, 0.02238062210381031, -0.1690520942211151, 0.03122665174305439, -0.02867814153432846, -0.014247655868530273, 0.1690053939819336, 0.03885800018906593, 0.0822557881474495, 0.029408451169729233, -0.014039394445717335, 0.009536822326481342, 0.055465105921030045, -0.004166058264672756, 0.008247976191341877, -0.07108351588249207, -0.029826978221535683, -0.02532905712723732, 0.10493386536836624, -0.08254630118608475, 0.030766749754548073, 0.034302111715078354, 0.07109107822179794, 0.03580864518880844, -0.02964925207197666, -0.01319087389856577, -0.0073226382955908775, -0.0025729660410434008, -0.04048248007893562, -0.027755651623010635, 0.028994495049118996, -0.07292022556066513, 0.13738609850406647, -0.17942172288894653, -0.16482198238372803, 0.0760452151298523, -0.04243432357907295, -0.0539579764008522, 0.04263649135828018, -0.009204464964568615, 0.002783859148621559, 0.039286043494939804, -0.16111518442630768, 0.251547247171402, 0.0400228276848793, 0.12156710028648376, -0.06390298902988434, -0.04384499415755272, -0.05023839324712753, -0.06882309168577194, 0.04475518688559532, 0.02699798159301281, 0.03932339698076248, -0.17488782107830048, 0.006914159748703241, 0.03393922373652458, -0.027278246358036995, 0.1252118945121765, 0.0032434663735330105, -0.0523252971470356, -0.007958230562508106, 0.028920292854309082, 0.03314777463674545, 0.008400154300034046, 0.03532610088586807, 0.02750583551824093, 0.02751866541802883, 0.02832527831196785, 0.018651600927114487, -0.05574990063905716, 0.0927712470293045, 0.07173416763544083, -0.04899633303284645, -0.04565728083252907, -0.017071019858121872, -0.00003996953819296323, 0.04933330789208412, 0.010774842463433743, 0.016497459262609482, -0.020173145458102226, -0.02090129442512989, -0.1052892729640007, 0.12951521575450897, -0.16670989990234375, -0.2556880712509155, -0.18556532263755798, -0.0207517072558403, -0.022949576377868652, 0.09036840498447418, 0.04119604080915451, -0.07028445601463318, -0.0666702464222908, -0.08264213800430298, 0.0688229575753212, -0.01654237136244774, -0.05365639552474022, -0.14123912155628204, 0.04830304905772209, 0.030226880684494972, -0.08995313942432404, 0.002149097388610244, 0.0017141038551926613, -0.06392299383878708, -0.017063897103071213, 0.027259469032287598, -0.04091412574052811, 0.19232428073883057, -0.0268459040671587, -0.04099201411008835, -0.03939728066325188, 0.1036040335893631, -0.0488036647439003, 0.12576696276664734, 0.16587978601455688, -0.12791498005390167, 0.03111320734024048, 0.12539276480674744, 0.029042433947324753, -0.019744951277971268, -0.002545249881222844, -0.025188477709889412, -0.08193264156579971, 
-0.18390102684497833, -0.04335961490869522, -0.052404530346393585, 0.041456788778305054, 0.0376991406083107, -0.00013111942098475993, 0.08143893629312515, 0.026948444545269012, -0.07682189345359802, 0.0006385476444847882, 0.08472134917974472, 0.08351007103919983, 0.17217299342155457, -0.05726497620344162, 0.09545136988162994, -0.07342611253261566, -0.022809702903032303, 0.0742814838886261, -0.054792553186416626, 0.15005075931549072, 0.019617052748799324, 0.17765595018863678, 0.05694378539919853, -0.02841157466173172, 0.03229822963476181, 0.04476515203714371, -0.01978185959160328, -0.00025868750526569784, -0.003020813688635826, -0.08050815016031265, -0.06353095918893814, 0.09073582291603088, 0.052835654467344284, -0.018036318942904472, 0.018368305638432503, 0.05587531626224518, 0.052489519119262695, 0.0732276514172554, 0.07653062045574188, -0.13917972147464752, -0.10085447877645493, 0.018559448421001434, -0.12715643644332886, -0.010825873352587223, 0.0071258763782680035, 0.15994742512702942, -0.05894477665424347, 0.022232985123991966, 0.019525717943906784, 0.10141224414110184, -0.11933708935976028, 0.030566902831196785, 0.0013817514991387725, 0.159152552485466, 0.04598213732242584, 0.017631176859140396, -0.07114145159721375, 0.055611107498407364, 0.026153894141316414, 0.10270577669143677, -0.01663997396826744, 0.07328325510025024, 0.0020310538820922375, 0.026648111641407013, 0.0711340457201004, -0.00099958386272192, -0.15582334995269775, 0.018243681639432907, -0.1087837889790535, -0.04197032004594803, 0.10601288825273514, -0.03414483368396759, 0.06905826181173325, -0.03797964006662369, -0.03866518661379814, -0.053651776164770126, -0.10019981116056442, -0.047270502895116806, -0.15900693833827972, 0.01918874680995941, 0.012425974942743778, 0.02266358956694603, -0.0057547371834516525, 0.07447768747806549, -0.012182124890387058, 0.1837765872478485, -0.24921022355556488, -0.08315298706293106, -0.1013069674372673, -0.048488691449165344, 0.11931489408016205, -0.07420500367879868, 0.08260326087474823, 0.055929116904735565, 0.06165209412574768, 0.00771572720259428, -0.08086230605840683, 0.05251612514257431, -0.09963081777095795, -0.005108477547764778, -0.02467469498515129, 0.19036993384361267, 0.03197500482201576, 0.08668392896652222, 0.06652531772851944, 0.01848335564136505, 0.01015456859022379, -0.11842410266399384, -0.07925356179475784, 0.1265006959438324, 0.008826623670756817, 0.009403565898537636, -0.035861555486917496, -0.09938258677721024, -0.05179668590426445, 0.06915149092674255, 0.12513849139213562, 0.0859760269522667, -0.10034098476171494, 0.11566998809576035, 0.08743386715650558, -0.05292374640703201, -0.1948559433221817, -0.05804677680134773, 0.07722605764865875, 0.0553978830575943, 0.031335316598415375, -0.16001681983470917, 0.03395574167370796, 0.011140794493258, -0.012374203652143478, -0.02226302959024906, -0.23172177374362946, -0.1199263259768486, 0.05765518918633461, -0.05228627845644951, -0.005125298630446196, 0.007375743705779314, -0.07871764153242111, -0.04940032213926315, -0.06136898323893547, 0.013791622593998909, -0.08381742984056473, 0.0550561249256134, 0.10261331498622894, -0.002152875531464815, 0.04894879832863808, -0.003872628090903163, 0.058268193155527115, -0.0020312294363975525, -0.04036654531955719, -0.06372638046741486, 0.03824087604880333, 0.02885160781443119, -0.03220735862851143, 0.17865115404129028, -0.021659694612026215, 0.015660904347896576, -0.09445780515670776, -0.04775567725300789, 0.003209812333807349, 0.015140808187425137, -0.004324689973145723, 
-0.013779998756945133, -0.03190043196082115, -0.007556082680821419, 0.02118859440088272, 0.02789539285004139, -0.05809766799211502, -0.16756805777549744, -0.1235714703798294, 0.19930826127529144, 0.2350553572177887, 0.029290270060300827, -0.051256075501441956, 0.0014585318276658654, 0.009476975537836552, 0.024668652564287186, -0.10625102370977402, 0.0476895272731781, 0.03550291061401367, 0.007937894202768803, 0.059207554906606674, -0.010758946649730206, -0.059409528970718384, 0.04866010323166847, 0.06622105091810226, -0.033409666270017624, -0.1311912089586258, -0.036774713546037674, 0.012677270919084549, -0.06809335201978683, 0.041907720267772675, 0.20773743093013763, -0.03271449729800224, 0.01144811138510704, 0.003960848785936832, 0.01265960093587637, -0.06792455166578293, 0.1014508381485939, -0.0167256947606802, 0.013291054405272007, -0.07249118387699127, 0.01893986016511917, 0.06460389494895935, -0.0022180103696882725, 0.034484997391700745, 0.098015695810318, -0.0884784534573555, -0.06043119356036186, -0.1820903718471527, -0.14963549375534058, -0.07407437264919281, -0.02773682028055191, 0.0026767984963953495, -0.060150302946567535, 0.0033226951491087675, 0.06357468664646149, -0.004358907230198383, 0.058133091777563095, -0.0867830216884613, 0.01618283987045288, -0.05971843749284744, 0.03179578855633736, 0.0428166426718235, 0.012083875015377998, -0.04072418808937073, 0.13231758773326874, -0.019029872491955757, -0.02800009958446026, -0.023584652692079544, -0.02267543599009514, -0.04368910193443298, 0.01163545809686184, -0.09759718924760818, -0.03185157850384712, -0.05128643661737442, -0.05695652961730957, 0.04222653806209564, 0.006923970300704241, -0.008121081627905369, 0.02478414960205555, -0.037464894354343414, -0.013652598485350609, -0.03533032163977623, 0.0026763584464788437, -0.08174595236778259, 0.07000725716352463, 0.022686583921313286, -0.05304071679711342, 0.08378490805625916, 0.09057771414518356, -0.06067921221256256, 0.08828777074813843, 0.034983403980731964, -0.07390744239091873, 0.01504028681665659, 0.051721200346946716, 0.0371294841170311, -0.024647457525134087, 0.027080001309514046, 0.040081411600112915, -0.03742947429418564, -0.05182238295674324, -0.01959359459578991, -0.04498333856463432, 0.13143043220043182, 0.022151194512844086, -0.005175994709134102, -0.06265523284673691, 0.03355226665735245, 0.05940109118819237, -0.007239202037453651, 0.15049007534980774, -0.04637773334980011, 0.050279807299375534, -0.07550948113203049, 0.027753131464123726, 0.009933224879205227, -0.0060864705592393875, -0.007930050604045391, -0.09052485227584839, 0.03008980117738247, 0.004713763017207384, 0.0018816462252289057, -0.03851233795285225, 0.05545750632882118, 0.06171450763940811, -0.05235687643289566, 0.03895201534032822, 0.06471928209066391, 0.09255743026733398, 0.04472725838422775, -0.035031285136938095, -0.11829216033220291, 0.013103297911584377, -0.013251062482595444, -0.19766855239868164, 0.08494605869054794, 0.08136599510908127, 0.006332805380225182, 0.09992265701293945, 0.059903908520936966, 0.08121523261070251, -0.16272825002670288, 0.039222512394189835, -0.05306088179349899, -0.01728602685034275, -0.048298731446266174, 0.09248659014701843, 0.18998324871063232, -0.08285421878099442, 0.09315627813339233, 0.06947354972362518, -0.04567412659525871, -0.12915624678134918, -0.16161681711673737, -0.02518390864133835, -0.058144282549619675, -0.028909659013152122, -0.06261059641838074, 0.024269716814160347, -0.021380245685577393, 0.061191361397504807, -0.019538430497050285, 
0.1738055944442749, -0.11695222556591034, -0.11669199913740158, 0.027141118422150612, 0.0001246317697223276, 0.0768839493393898, 0.08463601768016815, 0.025695206597447395, 0.05005593225359917, 0.04175153002142906, 0.09734140336513519, 0.04772162809967995, 0.04902290552854538, -0.010848925448954105, -0.034922000020742416, -0.02464074082672596, -0.018988560885190964, 0.052296217530965805, 0.01882159151136875, 0.18049363791942596, 0.06435761600732803, -0.10111242532730103, -0.017344621941447258, 0.13461008667945862, -0.08829434216022491, -0.1656576544046402, -0.18564221262931824, 0.15627707540988922, 0.11272044479846954, 0.07610730826854706, -0.03519954904913902, -0.10305779427289963, 0.02923470176756382, 0.14851708710193634, 0.16048288345336914, -0.029063429683446884, 0.010143307968974113, -0.01113063469529152, 0.019508682191371918, -0.019016101956367493, 0.06790252029895782, -0.00927855633199215, 0.35658958554267883, 0.02766106091439724, 0.01805826649069786, 0.03817431628704071, -0.031637050211429596, -0.09278885275125504, 0.03470434248447418, -0.07491959631443024, -0.03044944442808628, 0.006052887067198753, 0.10226678848266602, -0.02289886586368084, -0.12897127866744995, -0.0855851024389267, 0.01614689640700817, -0.021431660279631615, 0.04681350290775299, 0.056179776787757874, 0.05495676025748253, 0.05135229229927063, 0.019404642283916473, -0.07619047164916992, 0.19750399887561798, 0.006915196310728788, -0.04819350317120552, 0.027439873665571213, 0.03782086446881294, -0.1079292744398117, 0.1068045049905777, -0.024483010172843933, 0.07962462306022644, 0.07080370932817459, 0.052161406725645065, -0.07140615582466125, 0.04567509517073631, -0.013443340547382832, -0.14917106926441193, 0.07065816223621368, 0.161879763007164, -0.018466519191861153, 0.11427657306194305, 0.027297180145978928, -0.1003856360912323, 0.04640231281518936, -0.013065215200185776, -0.04082328826189041, -0.059169284999370575, 0.09925724565982819, -0.07960406690835953, 0.1346879005432129, 0.12206393480300903, -0.018226580694317818, 0.02190905623137951, -0.02170124463737011, 0.014467884786427021, 0.031430426985025406, 0.04290240257978439, -0.025446247309446335, -0.1035732552409172, -0.04563263803720474, -0.05710974335670471, 0.015765532851219177, -0.17799867689609528, -0.0472259595990181, 0.003876962000504136, 0.018618790432810783, -0.032399509102106094, 0.06567684561014175, 0.06468917429447174, 0.030272599309682846, -0.019584648311138153, 0.029802978038787842, -0.004649801645427942, 0.05443256348371506, -0.08656150102615356, -0.07727046310901642 ]
null
null
transformers
# S2T-SMALL-COVOST2-EN-CA-ST

`s2t-small-covost2-en-ca-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Catalan text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-en-ca-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-en-ca-st")

def map_to_array(batch):
    # read the raw waveform from disk
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# extract 80-channel log-mel filter bank features
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
# translate autoregressively and decode to text
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-en-ca-st model is trained on the English-Catalan subset of [CoVoST2](https://github.com/facebookresearch/covost).
CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster
ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocab.

### Training

The model is trained with standard autoregressive cross-entropy loss and [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.
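For intuition about the SpecAugment step mentioned in the Training section, the sketch below applies torchaudio's frequency and time masking to a fake feature batch; the mask sizes are illustrative assumptions, not this model's actual training configuration.

```python
# Illustrative SpecAugment-style masking with torchaudio; the mask sizes
# below are assumptions for demonstration, not this model's training recipe.
import torch
import torchaudio.transforms as T

# a fake batch of 80-channel log-mel features: (batch, n_mels, time)
features = torch.randn(1, 80, 500)

freq_mask = T.FrequencyMasking(freq_mask_param=27)  # mask up to 27 mel channels
time_mask = T.TimeMasking(time_mask_param=100)      # mask up to 100 frames

augmented = time_mask(freq_mask(features))
print(augmented.shape)  # torch.Size([1, 80, 500])
```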
## Evaluation results

CoVoST2 test results for en-ca (BLEU score): 21.68

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
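As a hedged illustration of how a corpus-level BLEU figure like the one above can be computed, the sketch below uses sacrebleu on placeholder hypothesis/reference lists; it is not the CoVoST2 evaluation pipeline itself.

```python
# Illustrative BLEU scoring with sacrebleu; the hypotheses and references
# here are placeholders, not actual CoVoST2 test outputs.
import sacrebleu

hypotheses = ["el gat seu a la catifa"]             # model outputs (assumed)
references = [["el gat està assegut a la catifa"]]  # one reference list per hypothesis set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```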
{"language": ["en", "ca"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-en-ca-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "ca", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "en", "ca" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ca #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-EN-CA-ST 's2t-small-covost2-en-ca-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository. ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end English speech to Catalan text translation. See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* You could either install those as extra speech dependencies with 'pip install transformers"[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. ## Training data The s2t-small-covost2-en-ca-st model is trained on the English-Catalan subset of CoVoST2. CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster ST research with the largest-ever open dataset. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using a character-based SentencePiece vocab. ### Training The model is trained with standard autoregressive cross-entropy loss and SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR. ## Evaluation results CoVoST2 test results for en-ca (BLEU score): 21.68 ### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-EN-CA-ST\n\n's2t-small-covost2-en-ca-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catlan text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-ca-st is trained on English-Catlan subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-ca (BLEU score): 21.68", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ca #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-EN-CA-ST\n\n's2t-small-covost2-en-ca-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catlan text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-ca-st is trained on English-Catlan subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-ca (BLEU score): 21.68", "### BibTeX entry and citation info" ]
[ 91, 78, 111, 43, 137, 67, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ca #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-EN-CA-ST\n\n's2t-small-covost2-en-ca-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catlan text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.07028277963399887, 0.1281692236661911, -0.005968815181404352, -0.0073510752990841866, 0.06748400628566742, -0.05478370189666748, 0.02801375836133957, 0.06742685288190842, -0.1318240761756897, 0.04794185981154442, 0.03636312112212181, 0.07215383648872375, 0.05899559706449509, 0.05274057388305664, -0.004994954913854599, -0.263044536113739, 0.019288914278149605, -0.05358525738120079, 0.045381784439086914, 0.09128548949956894, 0.12265074253082275, -0.04318099468946457, 0.0690080001950264, 0.026239631697535515, -0.0035589851904660463, -0.0024776530917733908, 0.05822227895259857, -0.06342749297618866, 0.05306731536984444, 0.10616430640220642, 0.07765098661184311, 0.02762511931359768, 0.07281865179538727, -0.15361042320728302, 0.005989172495901585, 0.05373737961053848, 0.023225780576467514, 0.046952810138463974, -0.0007845905493013561, 0.011189436540007591, 0.12481877952814102, 0.023795610293745995, 0.03648507595062256, 0.0974012240767479, -0.06593585014343262, -0.03118523396551609, -0.0429544597864151, 0.024832705035805702, 0.1623956561088562, 0.07189072668552399, -0.03601080924272537, -0.0012243211967870593, -0.012113520875573158, 0.05965051427483559, 0.021323643624782562, -0.21460971236228943, -0.014241455122828484, -0.02406284213066101, 0.004353144206106663, 0.0008396184421144426, -0.04018332436680794, -0.028473228216171265, -0.03583192452788353, 0.0068587856367230415, 0.08774305880069733, -0.06447377055883408, 0.040801163762807846, -0.06017887964844704, -0.11816822737455368, -0.024871312081813812, 0.0951339527964592, -0.009484576061367989, -0.08932827413082123, -0.11170711368322372, -0.060589149594306946, -0.006699042394757271, -0.018666202202439308, -0.046248190104961395, -0.006734744645655155, 0.04557938128709793, 0.01357604656368494, -0.12592262029647827, -0.12492885440587997, -0.014180419035255909, -0.09466944634914398, 0.08408922702074051, 0.03897568956017494, 0.0009965512435883284, -0.013974348083138466, 0.07555340230464935, -0.09767361730337143, -0.07655235379934311, -0.05317370593547821, -0.09128443896770477, -0.10847394168376923, -0.016906196251511574, -0.00546481367200613, -0.2191973328590393, -0.03990805894136429, 0.14937038719654083, -0.0781787559390068, 0.04070937633514404, 0.031091079115867615, 0.021678904071450233, -0.002487774007022381, 0.20322944223880768, -0.05333248898386955, -0.06749847531318665, 0.0711163654923439, -0.021470433101058006, 0.039249781519174576, -0.004547245800495148, -0.0912812128663063, 0.011957349255681038, -0.08365677297115326, 0.037735238671302795, 0.06524688750505447, 0.03673867508769035, -0.0022288765758275986, -0.06572434306144714, 0.1379580795764923, -0.12076559662818909, 0.005875764414668083, 0.03879881650209427, -0.04791153222322464, 0.1234743669629097, 0.05673413723707199, -0.007736789993941784, -0.11069376766681671, 0.048792313784360886, -0.010236391797661781, 0.022065039724111557, -0.08504947274923325, -0.11084762960672379, 0.002036942867562175, -0.06977781653404236, -0.05927368625998497, -0.11439692974090576, -0.0836246907711029, -0.042933039367198944, 0.012334167025983334, -0.0384783111512661, 0.03568213805556297, -0.08181219547986984, -0.068169504404068, 0.0013950675493106246, -0.034241367131471634, -0.16159693896770477, 0.014220502227544785, 0.010065469890832901, -0.01679951138794422, 0.041161853820085526, -0.040004462003707886, 0.020195409655570984, -0.07161029428243637, 0.0011010960442945361, -0.24010148644447327, 0.17616187036037445, 0.02780190296471119, 0.031156951561570168, -0.09976591914892197, -0.06488534808158875, 
-0.0750679075717926, 0.04887441545724869, 0.03360723331570625, 0.14115381240844727, -0.24391908943653107, -0.01848374865949154, 0.20974166691303253, -0.07890381664037704, 0.024861782789230347, 0.11055295914411545, 0.0033460971899330616, 0.07182331383228302, 0.12868145108222961, 0.09417009353637695, 0.12043195962905884, -0.03828602284193039, -0.09186868369579315, 0.01933491975069046, -0.1264275461435318, 0.07267971336841583, 0.04601151496171951, -0.07593177258968353, 0.1348848193883896, 0.009167888201773167, -0.03190886601805687, -0.019839148968458176, 0.062077026814222336, -0.030674608424305916, 0.038168393075466156, -0.022594831883907318, 0.091493159532547, -0.03357069194316864, -0.010841552168130875, 0.012099855579435825, -0.14556674659252167, 0.14417646825313568, 0.035522494465112686, -0.0842331275343895, 0.06699827313423157, -0.15741324424743652, -0.05666647106409073, -0.001151676056906581, 0.02591380476951599, -0.1766902208328247, 0.03928818181157112, -0.03193547576665878, -0.002770497463643551, 0.16036219894886017, 0.020814426243305206, 0.07201546430587769, 0.01831572689116001, -0.007749444805085659, 0.02319466322660446, 0.07483389228582382, -0.002666093409061432, -0.0034336706157773733, -0.06992720812559128, -0.016567176207900047, -0.026308227330446243, 0.11851253360509872, -0.0917782261967659, 0.038902897387742996, 0.045456625521183014, 0.057548511773347855, 0.030659792944788933, -0.029037676751613617, 0.005461609456688166, -0.013023704290390015, -0.002692573005333543, -0.04821630194783211, -0.019171766936779022, 0.03029652312397957, -0.0890345424413681, 0.1336158961057663, -0.2079470008611679, -0.16977952420711517, 0.07938671857118607, -0.048921287059783936, -0.04826930910348892, 0.03707999736070633, 0.00006358649261528626, 0.0010869980324059725, 0.03686254471540451, -0.14694716036319733, 0.24349191784858704, 0.046580348163843155, 0.1326082944869995, -0.07124398648738861, -0.04335679113864899, -0.05070342868566513, -0.06309422105550766, 0.048261858522892, 0.022775907069444656, 0.05313142016530037, -0.17525134980678558, -0.0029472208116203547, 0.04287118837237358, -0.010344257578253746, 0.09959717839956284, 0.007512138690799475, -0.05340385437011719, -0.010370857082307339, 0.026112711057066917, 0.036860402673482895, -0.014834610745310783, 0.03179852291941643, 0.0288538821041584, 0.027967851608991623, 0.026273302733898163, 0.014783896505832672, -0.0719614028930664, 0.09123221784830093, 0.08036372810602188, -0.04844597354531288, -0.057288624346256256, -0.02077474631369114, 0.0005420000525191426, 0.04884142428636551, 0.01717992126941681, 0.023720411583781242, -0.01796439103782177, -0.022710390388965607, -0.10586723685264587, 0.12967252731323242, -0.17483925819396973, -0.25731781125068665, -0.18054179847240448, -0.019343765452504158, 0.004687469452619553, 0.0866573229432106, 0.04052621126174927, -0.07396509498357773, -0.06761296093463898, -0.09094782918691635, 0.04765772819519043, -0.026642611250281334, -0.05260094255208969, -0.13083669543266296, 0.05521302670240402, 0.030170420184731483, -0.08963628113269806, 0.013330619782209396, 0.004711963701993227, -0.07851862907409668, -0.013353202491998672, 0.020795928314328194, -0.043114759027957916, 0.20822301506996155, -0.014514895156025887, -0.03995980694890022, -0.044501278549432755, 0.10328416526317596, -0.04133603721857071, 0.1152978241443634, 0.16328689455986023, -0.11956678330898285, 0.0323401503264904, 0.1284041851758957, 0.024655483663082123, -0.025712816044688225, -0.00017865511472336948, -0.026348410174250603, -0.08192100375890732, 
-0.19544944167137146, -0.023242812603712082, -0.04814822971820831, 0.03622044250369072, 0.03734586015343666, 0.0029220683500170708, 0.0802498608827591, 0.026319613680243492, -0.07107309252023697, 0.0039553409442305565, 0.09056468307971954, 0.08780975639820099, 0.18216240406036377, -0.06008509173989296, 0.10108340531587601, -0.07681836932897568, -0.011676791124045849, 0.06774811446666718, -0.05222107842564583, 0.16133852303028107, 0.02213340625166893, 0.20243895053863525, 0.06042372062802315, -0.01797730289399624, 0.03807317093014717, 0.03233608230948448, -0.021832168102264404, 0.009797420352697372, -0.0075948480516672134, -0.07833076268434525, -0.06898314505815506, 0.08303001523017883, 0.0566960908472538, -0.024851875379681587, 0.02761409990489483, 0.05363413318991661, 0.040799714624881744, 0.07453534752130508, 0.06503082066774368, -0.1408560872077942, -0.08323534578084946, 0.026919905096292496, -0.11612231284379959, -0.018939776346087456, 0.010262977331876755, 0.1505039632320404, -0.051456935703754425, 0.01550255436450243, 0.01542739849537611, 0.10594576597213745, -0.12110187113285065, 0.03206732124090195, 0.009211958386003971, 0.1507023274898529, 0.045861538499593735, 0.012604612857103348, -0.06600604206323624, 0.05951758101582527, 0.03328259661793709, 0.09475667029619217, -0.005192444194108248, 0.06827536970376968, -0.0033176320139318705, 0.040088556706905365, 0.06742110103368759, -0.008036939427256584, -0.14593009650707245, 0.006352594122290611, -0.10976569354534149, -0.03619704395532608, 0.10369236767292023, -0.035397112369537354, 0.07417777925729752, -0.04598868638277054, -0.043119750916957855, -0.047054603695869446, -0.10764424502849579, -0.03008185140788555, -0.17254100739955902, 0.025217315182089806, 0.016637617722153664, 0.03191254660487175, -0.012349860742688179, 0.07400403916835785, -0.013941336423158646, 0.17396272718906403, -0.25487425923347473, -0.08776123821735382, -0.09780284017324448, -0.04365908354520798, 0.12713788449764252, -0.06424311548471451, 0.07993239164352417, 0.04874664917588234, 0.05439135059714317, 0.024965327233076096, -0.07167241722345352, 0.04820820316672325, -0.08279063552618027, -0.019210178405046463, -0.029978452250361443, 0.1936516910791397, 0.013159659691154957, 0.08779802173376083, 0.05921334773302078, 0.025408200919628143, -0.0009472580277360976, -0.10695613920688629, -0.08275221288204193, 0.14458633959293365, 0.016339777037501335, 0.0011427209246903658, -0.03410491719841957, -0.0781518891453743, -0.07011747360229492, 0.056825682520866394, 0.12848308682441711, 0.08618639409542084, -0.10400031507015228, 0.12001240998506546, 0.08733559399843216, -0.06270144879817963, -0.17128095030784607, -0.06742586195468903, 0.057690706104040146, 0.044580936431884766, 0.034842200577259064, -0.16488584876060486, 0.016994312405586243, 0.008361855521798134, -0.00760225672274828, -0.013922715559601784, -0.24498051404953003, -0.11657179892063141, 0.04601282998919487, -0.042963944375514984, -0.0039860825054347515, -0.012568647973239422, -0.0843975767493248, -0.04347984120249748, -0.06255900114774704, 0.027129236608743668, -0.08604081720113754, 0.06162452697753906, 0.09196769446134567, -0.007305493112653494, 0.04872390627861023, -0.0034800772555172443, 0.05792377516627312, -0.02485528029501438, -0.04408256337046623, -0.06251031160354614, 0.040789294987916946, 0.04020452871918678, -0.0355415903031826, 0.16137154400348663, -0.007699563633650541, 0.014049174264073372, -0.08765849471092224, -0.052617061883211136, 0.007196749094873667, -0.01307413075119257, 
-0.005996210966259241, -0.005490989424288273, -0.0339631512761116, -0.018513990566134453, 0.02675742097198963, 0.018624676391482353, -0.05430469289422035, -0.16851888597011566, -0.1030418798327446, 0.19152382016181946, 0.24016763269901276, 0.022107768803834915, -0.05679062753915787, 0.0005164843751117587, 0.011057102121412754, 0.01940092258155346, -0.1294146180152893, 0.04174865409731865, 0.039670202881097794, 0.006846753880381584, 0.055781468749046326, -0.0017465074779465795, -0.06337232142686844, 0.045750297605991364, 0.06761608272790909, -0.02408459596335888, -0.132376030087471, -0.02643335610628128, 0.023611480370163918, -0.08490799367427826, 0.02475452795624733, 0.19590851664543152, -0.017769450321793556, 0.005196098703891039, 0.012941377237439156, 0.017687439918518066, -0.052853766828775406, 0.10060344636440277, -0.011443977244198322, 0.00812601950019598, -0.0756794884800911, 0.015553041361272335, 0.08303899317979813, -0.0007813361589796841, 0.028486115857958794, 0.10444104671478271, -0.0929507166147232, -0.0630452111363411, -0.18029296398162842, -0.16269879043102264, -0.05666058138012886, -0.02586616575717926, 0.005165742244571447, -0.05804850161075592, 0.007940630428493023, 0.08065307140350342, -0.015588992275297642, 0.07263762503862381, -0.0848306342959404, 0.011238902807235718, -0.06801167875528336, 0.03186146542429924, 0.02535325661301613, 0.00369387143291533, -0.046562228351831436, 0.1353832334280014, -0.007945023477077484, -0.027641450986266136, -0.015550125390291214, -0.031160075217485428, -0.037644729018211365, 0.018149469047784805, -0.08661474287509918, -0.02591259405016899, -0.053981222212314606, -0.05861765146255493, 0.04645591974258423, 0.013738458044826984, -0.012861092574894428, 0.01785162277519703, -0.03199716657400131, -0.01829851232469082, -0.0394006110727787, 0.0014813902089372277, -0.0820721834897995, 0.0654202401638031, 0.019501544535160065, -0.056573692709207535, 0.07857981324195862, 0.07377317547798157, -0.0650552436709404, 0.08181130886077881, 0.02458903379738331, -0.06791920959949493, 0.009737735614180565, 0.05231361836194992, 0.0339924655854702, -0.037167444825172424, 0.024218538776040077, 0.03158124163746834, -0.04224317893385887, -0.05145804211497307, -0.0111205093562603, -0.04823208227753639, 0.11568603664636612, 0.011766530573368073, 0.006581885274499655, -0.05917239561676979, 0.016903838142752647, 0.057426080107688904, -0.0055218953639268875, 0.14900198578834534, -0.034672658890485764, 0.054439861327409744, -0.08609745651483536, 0.024658171460032463, -0.0009515765123069286, 0.0008575369138270617, -0.00684376573190093, -0.0922284722328186, 0.03435187414288521, 0.005242879036813974, -0.006086915731430054, -0.05062349513173103, 0.055119238793849945, 0.06323783844709396, -0.04574829339981079, 0.035050176084041595, 0.06046547740697861, 0.09078584611415863, 0.06138475611805916, -0.040376801043748856, -0.09598936885595322, 0.013718705624341965, -0.010894437320530415, -0.19676807522773743, 0.07053519785404205, 0.06903120130300522, 0.024331115186214447, 0.09786884486675262, 0.053744927048683167, 0.07421543449163437, -0.1403832882642746, 0.05631221830844879, -0.05756825953722, -0.027688298374414444, -0.0467451810836792, 0.09139704704284668, 0.19534170627593994, -0.07435426115989685, 0.08751798421144485, 0.06526780128479004, -0.04910525679588318, -0.14126475155353546, -0.1539994478225708, -0.03021947853267193, -0.04673123359680176, -0.03132868558168411, -0.0599956251680851, 0.030680201947689056, -0.021195193752646446, 0.053305916488170624, -0.010841306298971176, 
0.15954962372779846, -0.09045646339654922, -0.12042087316513062, 0.015545926988124847, 0.00447540869936347, 0.08017584681510925, 0.07875102758407593, 0.03771306574344635, 0.0615796335041523, 0.038036927580833435, 0.09700444340705872, 0.05303086340427399, 0.05622873455286026, 0.0015746795106679201, -0.040547195822000504, -0.027702875435352325, -0.018884772434830666, 0.06315290927886963, 0.013671581633388996, 0.17743195593357086, 0.06847608834505081, -0.098131462931633, -0.011528931558132172, 0.14932821691036224, -0.08234845101833344, -0.15208634734153748, -0.18480850756168365, 0.1593605875968933, 0.11394638568162918, 0.05768517777323723, -0.0314747653901577, -0.10174516588449478, 0.03586871176958084, 0.16796274483203888, 0.15267136693000793, -0.032826926559209824, 0.007563337683677673, -0.006789566949009895, 0.020365983247756958, -0.020950686186552048, 0.06431236118078232, 0.011285582557320595, 0.3357429802417755, 0.022382784634828568, 0.0078045972622931, 0.03885849565267563, -0.024971377104520798, -0.10230596363544464, 0.01983046904206276, -0.07232014834880829, -0.01578611508011818, 0.014673870988190174, 0.09542430192232132, -0.019316373392939568, -0.13223840296268463, -0.07884538173675537, 0.008774993941187859, -0.01435239426791668, 0.044938381761312485, 0.05293484777212143, 0.04091073200106621, 0.037355706095695496, 0.020887255668640137, -0.07804039120674133, 0.19385719299316406, 0.007859660312533379, -0.05877909064292908, 0.03701937571167946, 0.04863086715340614, -0.12073829770088196, 0.10166880488395691, -0.02436676435172558, 0.07348071783781052, 0.07192680239677429, 0.051702700555324554, -0.08110668510198593, 0.056139543652534485, -0.017515789717435837, -0.13114099204540253, 0.07228107005357742, 0.15342167019844055, -0.007776123005896807, 0.10902412980794907, 0.03079109638929367, -0.08510132879018784, 0.049362003803253174, -0.02797936089336872, -0.037957943975925446, -0.05769628286361694, 0.08404134213924408, -0.08104416728019714, 0.1354292929172516, 0.11473693698644638, -0.02667635679244995, 0.022190114483237267, -0.016047867015004158, -0.001227875123731792, 0.023119313642382622, 0.04995376244187355, -0.0277361162006855, -0.09822534024715424, -0.04487493261694908, -0.058905135840177536, 0.02279602736234665, -0.1876066029071808, -0.047753818333148956, 0.00996961910277605, 0.006208777893334627, -0.03438375145196915, 0.06705380231142044, 0.06374264508485794, 0.03579619899392128, -0.028661154210567474, 0.030042869970202446, -0.006987109314650297, 0.06257891654968262, -0.10243455320596695, -0.07425571233034134 ]
null
null
transformers
# S2T-SMALL-COVOST2-EN-DE-ST

`s2t-small-covost2-en-de-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to German text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-en-de-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-en-de-st")

def map_to_array(batch):
    # read the raw waveform from disk
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# extract 80-channel log-mel filter bank features
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
# translate autoregressively and decode to text
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-en-de-st model is trained on the English-German subset of [CoVoST2](https://github.com/facebookresearch/covost).
CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster
ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocab.

### Training

The model is trained with standard autoregressive cross-entropy loss and [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.
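To make the preprocessing description in this card concrete, here is a minimal sketch of Kaldi-compliant 80-channel filter bank extraction followed by utterance-level CMVN via torchaudio; the file name `audio.wav` and the mono-channel assumption are illustrative, and this is not the exact fairseq preprocessing pipeline.

```python
# Illustrative Kaldi-compliant fbank extraction + utterance-level CMVN;
# "audio.wav" and the parameter values are assumptions for demonstration.
import torchaudio
import torchaudio.compliance.kaldi as kaldi

waveform, sample_rate = torchaudio.load("audio.wav")  # (channels, samples), mono assumed

# 80-channel log mel-filter bank features, shape (num_frames, 80)
features = kaldi.fbank(waveform, num_mel_bins=80, sample_frequency=sample_rate)

# utterance-level cepstral mean and variance normalization
features = (features - features.mean(dim=0)) / (features.std(dim=0) + 1e-5)
```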
## Evaluation results

CoVoST2 test results for en-de (BLEU score): 16.29. An illustrative sketch of computing corpus-level BLEU follows the citation below.

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
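For reference, corpus-level BLEU over decoded translations can be computed with `sacrebleu`. This is an illustrative sketch with toy strings, assuming the `sacrebleu` package; it is not the exact scoring setup used to report the number above:

```python
import sacrebleu

# one decoded translation per utterance (toy examples)
hypotheses = ["das ist ein beispiel.", "guten morgen."]
# one reference stream, aligned with the hypotheses
references = [["dies ist ein beispiel.", "guten morgen."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```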
{"language": ["en", "de"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-en-de-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "de", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "en", "de" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-EN-DE-ST

's2t-small-covost2-en-de-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository.

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to German text translation. See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with 'pip install transformers"[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-covost2-en-de-st model is trained on the English-German subset of CoVoST2. CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

CoVoST2 test results for en-de (BLEU score): 16.29

### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-EN-DE-ST\n\n's2t-small-covost2-en-de-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-de-st is trained on English-German subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-de (BLEU score): 16.29", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-EN-DE-ST\n\n's2t-small-covost2-en-de-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-de-st is trained on English-German subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-de (BLEU score): 16.29", "### BibTeX entry and citation info" ]
[ 91, 78, 111, 42, 137, 66, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-EN-DE-ST\n\n's2t-small-covost2-en-de-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.07883130759000778, 0.11369394510984421, -0.005764924455434084, -0.0012064239708706737, 0.06162621080875397, -0.05572497472167015, 0.041034847497940063, 0.06025663763284683, -0.1107657253742218, 0.04485880956053734, 0.03561406955122948, 0.06021258980035782, 0.0533122792840004, 0.06123824417591095, -0.003860217286273837, -0.2693125903606415, 0.02396605722606182, -0.06198817864060402, 0.05726459249854088, 0.08263295143842697, 0.12516970932483673, -0.04277866706252098, 0.07139632105827332, 0.026835951954126358, -0.004408856853842735, 0.010837337002158165, 0.0603957436978817, -0.05456492677330971, 0.05502535030245781, 0.10372556746006012, 0.07067108899354935, 0.01740124262869358, 0.07294096797704697, -0.13955849409103394, 0.0032596837263554335, 0.043333712965250015, 0.019891705363988876, 0.03997383266687393, -0.0010849263053387403, 0.007929726503789425, 0.1281888633966446, 0.02361294999718666, 0.02201983518898487, 0.0962652936577797, -0.058908864855766296, -0.019488545134663582, -0.047197118401527405, 0.043285101652145386, 0.15637269616127014, 0.07039947807788849, -0.03797343373298645, -0.011524712666869164, -0.020548176020383835, 0.05871416628360748, 0.014496511779725552, -0.20871277153491974, -0.01602715067565441, -0.02240237407386303, 0.0005585722392424941, -0.004777151625603437, -0.04165240749716759, -0.024370113387703896, -0.03479286655783653, 0.009593241848051548, 0.0701003223657608, -0.06708909571170807, 0.06203602999448776, -0.07046812772750854, -0.1170068234205246, -0.028084352612495422, 0.08981510996818542, -0.0036616469733417034, -0.09593937546014786, -0.12222982197999954, -0.05612513795495033, 0.0027133533731102943, -0.01973436400294304, -0.045043718069791794, -0.014171711169183254, 0.03957105800509453, 0.022827286273241043, -0.12530343234539032, -0.12193907052278519, -0.009145298972725868, -0.08611521124839783, 0.11321823298931122, 0.03718290105462074, 0.0009175960440188646, -0.0051130992360413074, 0.07613378018140793, -0.11071319878101349, -0.07995764166116714, -0.05318964645266533, -0.09681644290685654, -0.10434578359127045, -0.014705521054565907, -0.011151503771543503, -0.22642025351524353, -0.04098791629076004, 0.15898093581199646, -0.07449863106012344, 0.041818052530288696, 0.03763874992728233, 0.02407178469002247, 0.004136839881539345, 0.20411860942840576, -0.03971618041396141, -0.08436606079339981, 0.06758822500705719, -0.030897485092282295, 0.04194912314414978, 0.0012264827964827418, -0.0985659807920456, 0.0036281875800341368, -0.08223400264978409, 0.031097136437892914, 0.05542251095175743, 0.03428656607866287, 0.0010737867560237646, -0.06642546504735947, 0.14803935587406158, -0.12145361304283142, 0.005356272216886282, 0.03468180447816849, -0.053903546184301376, 0.13225795328617096, 0.04471035674214363, -0.014290828257799149, -0.11779903620481491, 0.05301520600914955, -0.0001616409863345325, 0.02208271436393261, -0.0957302525639534, -0.11411132663488388, 0.00013602126273326576, -0.06680446118116379, -0.052016470581293106, -0.12023955583572388, -0.09120337665081024, -0.042668744921684265, 0.019600754603743553, -0.03757822886109352, 0.03737786039710045, -0.08185667544603348, -0.07432432472705841, 0.0059478264302015305, -0.03744117543101311, -0.176077738404274, 0.014582553878426552, 0.0001899867202155292, -0.02117852307856083, 0.037415776401758194, -0.05077962204813957, 0.014368404634296894, -0.07827313989400864, -0.00845797173678875, -0.26498356461524963, 0.18223868310451508, 0.01278320699930191, 0.02056247554719448, -0.0905705913901329, -0.06286872178316116, 
-0.0660010352730751, 0.054657187312841415, 0.03275502845644951, 0.14683020114898682, -0.24884532392024994, -0.025777006521821022, 0.20916889607906342, -0.0906127542257309, 0.031331416219472885, 0.11916400492191315, 0.0021287479903548956, 0.057385727763175964, 0.13194893300533295, 0.11109508574008942, 0.12827101349830627, -0.051871407777071, -0.10014566779136658, 0.027507521212100983, -0.12631113827228546, 0.0759354904294014, 0.04532451182603836, -0.06898143887519836, 0.13026249408721924, 0.010791930370032787, -0.03801894932985306, -0.006404501385986805, 0.06522496044635773, -0.026405923068523407, 0.04192866012454033, -0.019343001767992973, 0.11320717632770538, -0.030398869886994362, -0.012275484390556812, 0.004511479754000902, -0.1430797576904297, 0.15968738496303558, 0.03727458417415619, -0.08171505481004715, 0.05708467960357666, -0.14560261368751526, -0.06552917510271072, 0.0012920878361910582, 0.022401316091418266, -0.17053794860839844, 0.03302788361907005, -0.030023837462067604, -0.010985788889229298, 0.16779008507728577, 0.03645563870668411, 0.08059180527925491, 0.027308816090226173, -0.01257059071213007, 0.01184159331023693, 0.05930916219949722, -0.0035949349403381348, 0.006349981762468815, -0.06960206478834152, -0.02793484926223755, -0.024794261902570724, 0.10569331049919128, -0.08360861241817474, 0.03232307732105255, 0.03435570001602173, 0.06952989846467972, 0.034788500517606735, -0.02992136776447296, -0.010177146643400192, -0.00841750018298626, -0.0018134587444365025, -0.04188449680805206, -0.025915302336215973, 0.029043102636933327, -0.07379891723394394, 0.1368560791015625, -0.18247970938682556, -0.1666087806224823, 0.07711805403232574, -0.04285575449466705, -0.053267702460289, 0.04352932050824165, -0.007100521586835384, 0.0027479585260152817, 0.039342835545539856, -0.15894761681556702, 0.2494630217552185, 0.04101598635315895, 0.12262888252735138, -0.06445448845624924, -0.043192312121391296, -0.050557058304548264, -0.06800329685211182, 0.04592478647828102, 0.02606036327779293, 0.04175962135195732, -0.17531481385231018, 0.004995677154511213, 0.03413311764597893, -0.024752546101808548, 0.12206344306468964, 0.0032960278913378716, -0.05239966884255409, -0.007565096486359835, 0.027749603614211082, 0.032759733498096466, 0.004311850294470787, 0.03335706517100334, 0.027800239622592926, 0.027532244101166725, 0.028124934062361717, 0.018370188772678375, -0.05735104903578758, 0.0927061215043068, 0.07319857180118561, -0.04895241558551788, -0.04739988222718239, -0.017224494367837906, 0.00031314301304519176, 0.0486132949590683, 0.011439556255936623, 0.017390040680766106, -0.01963195949792862, -0.02081744745373726, -0.10521888732910156, 0.12880773842334747, -0.16776417195796967, -0.25431668758392334, -0.18493448197841644, -0.02028045989573002, -0.018508952111005783, 0.09018444269895554, 0.040491458028554916, -0.0713178887963295, -0.06617793440818787, -0.08307930827140808, 0.06704933196306229, -0.01763267070055008, -0.05505410581827164, -0.14047113060951233, 0.04958700016140938, 0.02903033047914505, -0.08954711258411407, 0.0037355050444602966, 0.0028151338919997215, -0.06502943485975266, -0.01664666272699833, 0.024786921218037605, -0.043141577392816544, 0.1943366676568985, -0.025006387382745743, -0.04126179590821266, -0.039879657328128815, 0.10484138131141663, -0.04738198593258858, 0.12470482289791107, 0.16773411631584167, -0.1263810694217682, 0.030428921803832054, 0.12536954879760742, 0.028186699375510216, -0.020816905423998833, -0.002372868824750185, -0.02532435581088066, -0.08164277672767639, 
-0.18535327911376953, -0.03987309709191322, -0.050543054938316345, 0.041620854288339615, 0.03672957792878151, -0.0001311333035118878, 0.07963722944259644, 0.027469730004668236, -0.07513324916362762, 0.0008077878155745566, 0.08598612993955612, 0.0830959677696228, 0.17474210262298584, -0.05878566578030586, 0.09689266979694366, -0.07367151975631714, -0.020517315715551376, 0.07320325821638107, -0.055113259702920914, 0.14920872449874878, 0.01949605531990528, 0.18060645461082458, 0.056867074221372604, -0.027469802647829056, 0.03372041508555412, 0.04316877946257591, -0.019704492762684822, 0.0016935152234509587, -0.0038038648199290037, -0.07942511886358261, -0.06621171534061432, 0.08884797990322113, 0.053763680160045624, -0.019858544692397118, 0.020119447261095047, 0.057125553488731384, 0.050001226365566254, 0.07269339263439178, 0.07514560967683792, -0.1401257961988449, -0.09901928156614304, 0.019814128056168556, -0.12654618918895721, -0.013228117488324642, 0.0079600615426898, 0.15761743485927582, -0.05915941298007965, 0.020420555025339127, 0.01992437057197094, 0.10154931247234344, -0.12038970738649368, 0.030513182282447815, 0.002129096770659089, 0.15682464838027954, 0.04642495512962341, 0.015983112156391144, -0.06962312757968903, 0.053969014436006546, 0.026593836024403572, 0.10120104253292084, -0.014685221947729588, 0.07334332913160324, 0.0021509318612515926, 0.029064800590276718, 0.06979117542505264, -0.0019365372136235237, -0.1536378711462021, 0.01768539845943451, -0.10853301733732224, -0.04083109647035599, 0.10381817817687988, -0.0350404717028141, 0.06898021697998047, -0.039188235998153687, -0.039188869297504425, -0.053379856050014496, -0.10164374113082886, -0.046149689704179764, -0.1607142835855484, 0.01901855133473873, 0.010757243260741234, 0.02569223754107952, -0.0061089578084647655, 0.07493450492620468, -0.01268811710178852, 0.18176403641700745, -0.2497924268245697, -0.08570846915245056, -0.10053114593029022, -0.047878511250019073, 0.1211077943444252, -0.07243438065052032, 0.08207296580076218, 0.05472029373049736, 0.060667794197797775, 0.0102374954149127, -0.08002616465091705, 0.05204781889915466, -0.09682314842939377, -0.006446250714361668, -0.025671321898698807, 0.19046993553638458, 0.029728949069976807, 0.0865572914481163, 0.06556019932031631, 0.01930786482989788, 0.008760557509958744, -0.11546562612056732, -0.07988081127405167, 0.13011261820793152, 0.009814254008233547, 0.00779729662463069, -0.03505735099315643, -0.09690303355455399, -0.05481838434934616, 0.06662221252918243, 0.12571293115615845, 0.08454003185033798, -0.0999690517783165, 0.11552674323320389, 0.08642194420099258, -0.054391928017139435, -0.19101695716381073, -0.060183968394994736, 0.07447899132966995, 0.05470285564661026, 0.03267852962017059, -0.16103754937648773, 0.031511951237916946, 0.011609802953898907, -0.011238813400268555, -0.020968839526176453, -0.23337028920650482, -0.11948924511671066, 0.05760939046740532, -0.05061096325516701, -0.005796076264232397, 0.004395768046379089, -0.07892878353595734, -0.04920255020260811, -0.062461309134960175, 0.015028429217636585, -0.08328809589147568, 0.05504097789525986, 0.10098033398389816, -0.0011030712630599737, 0.04893102869391441, -0.0030745009426027536, 0.057842105627059937, -0.005987030453979969, -0.040960974991321564, -0.06302196532487869, 0.03750275447964668, 0.02961423434317112, -0.0329742468893528, 0.1756940335035324, -0.0186447836458683, 0.015818461775779724, -0.09361719340085983, -0.049380846321582794, 0.00325002521276474, 0.009591834619641304, -0.004469395615160465, 
-0.0125480517745018, -0.0310157872736454, -0.008934232406318188, 0.020512038841843605, 0.026714079082012177, -0.05902295932173729, -0.16897577047348022, -0.12226644158363342, 0.19779478013515472, 0.23623771965503693, 0.02828098088502884, -0.05146316811442375, 0.0008864109986461699, 0.009831409901380539, 0.024039918556809425, -0.11009696125984192, 0.04696550965309143, 0.03519821912050247, 0.007574249058961868, 0.05815204232931137, -0.009861517697572708, -0.06011413037776947, 0.048802319914102554, 0.06537892669439316, -0.030857935547828674, -0.13205543160438538, -0.03444089740514755, 0.013881955295801163, -0.07041174918413162, 0.03845466300845146, 0.20596757531166077, -0.029872197657823563, 0.009237500838935375, 0.005432424135506153, 0.013531793840229511, -0.06615842133760452, 0.10120725631713867, -0.01701132394373417, 0.012265634723007679, -0.07359641045331955, 0.01808818429708481, 0.06675378978252411, -0.0016838818555697799, 0.03338059410452843, 0.09801226854324341, -0.08958438038825989, -0.061310939490795135, -0.182468444108963, -0.15050393342971802, -0.07178321480751038, -0.027136163786053658, 0.0038163205608725548, -0.060199130326509476, 0.0037012638058513403, 0.06551459431648254, -0.006029557436704636, 0.06048773229122162, -0.08648201823234558, 0.015342573635280132, -0.060379985719919205, 0.03161894530057907, 0.04070938751101494, 0.01034989021718502, -0.04206326976418495, 0.13247977197170258, -0.017925472930073738, -0.028834203258156776, -0.022012479603290558, -0.02355208434164524, -0.042319390922784805, 0.012822798453271389, -0.09817267954349518, -0.03106437437236309, -0.052458904683589935, -0.05678081512451172, 0.04311838001012802, 0.008611169643700123, -0.008017742075026035, 0.02315560355782509, -0.03641073405742645, -0.01437387615442276, -0.0356094166636467, 0.002671359572559595, -0.08107109367847443, 0.06991126388311386, 0.02237868309020996, -0.05358429625630379, 0.08313146978616714, 0.08891784399747849, -0.06045754998922348, 0.08756875991821289, 0.035941626876592636, -0.07328449189662933, 0.013748761266469955, 0.051327336579561234, 0.03700203448534012, -0.02582421712577343, 0.026636460795998573, 0.038671720772981644, -0.0382358655333519, -0.0521024689078331, -0.017559155821800232, -0.04554414749145508, 0.13029111921787262, 0.02187480963766575, -0.002305087633430958, -0.06293521076440811, 0.03091285564005375, 0.05941556394100189, -0.006333937402814627, 0.14994849264621735, -0.04484463483095169, 0.05078161135315895, -0.07674568146467209, 0.027696501463651657, 0.008102444931864738, -0.004103653598576784, -0.006851826794445515, -0.09082488715648651, 0.030815018340945244, 0.005143081769347191, 0.0010027140378952026, -0.03919006139039993, 0.055412400513887405, 0.06167566031217575, -0.05272093415260315, 0.03992874175310135, 0.06412511318922043, 0.09181266278028488, 0.047330230474472046, -0.03553883358836174, -0.11507587134838104, 0.01347329095005989, -0.01309312041848898, -0.1984950453042984, 0.082077257335186, 0.08063450455665588, 0.007232677657157183, 0.09917065501213074, 0.05805350840091705, 0.08091234415769577, -0.15955714881420135, 0.04065129905939102, -0.05360884219408035, -0.020182328298687935, -0.04874259606003761, 0.09260325878858566, 0.19085600972175598, -0.08117501437664032, 0.09229202568531036, 0.0687616840004921, -0.04629112035036087, -0.13109050691127777, -0.16024765372276306, -0.02531726285815239, -0.05723470821976662, -0.029545903205871582, -0.06177061051130295, 0.024737538769841194, -0.020594140514731407, 0.059677574783563614, -0.019125038757920265, 0.170328751206398, 
-0.11340712755918503, -0.1178392693400383, 0.024484263733029366, 0.00009467093332204968, 0.07671217620372772, 0.08398296684026718, 0.027794944122433662, 0.051832832396030426, 0.04114155098795891, 0.09711257368326187, 0.04813455045223236, 0.04853574186563492, -0.008570997044444084, -0.03581961989402771, -0.02436120994389057, -0.019605115056037903, 0.05463755503296852, 0.017007816582918167, 0.18088626861572266, 0.06541356444358826, -0.10074722021818161, -0.016486382111907005, 0.13573932647705078, -0.08647987991571426, -0.1644274741411209, -0.18588677048683167, 0.15725310146808624, 0.11269518733024597, 0.0736725777387619, -0.03494046628475189, -0.10268332809209824, 0.02963043563067913, 0.15109694004058838, 0.15930217504501343, -0.030077464878559113, 0.008979834616184235, -0.009734283201396465, 0.01956259459257126, -0.02093368023633957, 0.06690079718828201, -0.006257679313421249, 0.3525339961051941, 0.026445070281624794, 0.016566384583711624, 0.038541484624147415, -0.031643345952034, -0.09419648349285126, 0.0318714939057827, -0.07455005496740341, -0.028251774609088898, 0.007852931506931782, 0.10154236108064651, -0.02429845742881298, -0.12925732135772705, -0.08440046012401581, 0.016786299645900726, -0.019928397610783577, 0.046448227018117905, 0.05330488458275795, 0.05274645611643791, 0.049553122371435165, 0.01972850225865841, -0.07636194676160812, 0.19744816422462463, 0.00674534123390913, -0.04954104498028755, 0.02929728478193283, 0.03934692591428757, -0.10745269060134888, 0.10491298884153366, -0.02474343776702881, 0.07902954518795013, 0.07048407942056656, 0.05300208181142807, -0.07306046038866043, 0.04731407389044762, -0.014084139838814735, -0.14485931396484375, 0.07108266651630402, 0.15969257056713104, -0.016280611976981163, 0.11348506808280945, 0.027796996757388115, -0.09831652045249939, 0.04677184671163559, -0.014874893240630627, -0.04111815243959427, -0.05936304107308388, 0.0974932536482811, -0.08030086010694504, 0.13436904549598694, 0.1209348738193512, -0.01912638172507286, 0.022417977452278137, -0.020907068625092506, 0.012421229854226112, 0.03004215471446514, 0.0436646044254303, -0.027097897604107857, -0.10194950550794601, -0.045512568205595016, -0.056655216962099075, 0.01600414328277111, -0.17876237630844116, -0.04629255831241608, 0.0039994013495743275, 0.017385249957442284, -0.03210761770606041, 0.06552599370479584, 0.06417636573314667, 0.03187815845012665, -0.02045832946896553, 0.030472703278064728, -0.005028628744184971, 0.056257471442222595, -0.08913954347372055, -0.07702748477458954 ]
null
null
transformers
# S2T-SMALL-COVOST2-EN-ET-ST

`s2t-small-covost2-en-et-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Estonian text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model. A beam-search variant of this call is sketched at the end of this card.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install transformers"[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-en-et-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-en-et-st")

# read each audio file into a float array
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# LibriSpeech audio is sampled at 16 kHz, the rate the model's
# feature extractor expects
inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-en-et-st model is trained on the English-Estonian subset of [CoVoST2](https://github.com/facebookresearch/covost). CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR.
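As mentioned under "How to use" above, `generate` also accepts the standard `transformers` decoding arguments. The following sketch continues the snippet above (it reuses `model`, `processor`, and `inputs` from there); the beam size and length limit are illustrative assumptions, not the settings used to report results:

```python
# beam-search decoding with illustrative hyperparameters
generated_ids = model.generate(
    inputs["input_features"],
    attention_mask=inputs["attention_mask"],
    num_beams=5,      # illustrative value
    max_length=200,   # illustrative value
)
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```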
## Evaluation results

CoVoST2 test results for en-et (BLEU score): 13.01

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
{"language": ["en", "et"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-en-et-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "et", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "en", "et" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #et #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-EN-ET-ST

's2t-small-covost2-en-et-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository.

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Estonian text translation. See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with 'pip install transformers"[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-covost2-en-et-st model is trained on the English-Estonian subset of CoVoST2. CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

CoVoST2 test results for en-et (BLEU score): 13.01

### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-EN-ET-ST\n\n's2t-small-covost2-en-et-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Estonian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-et-st is trained on English-Estonian subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-et (BLEU score): 13.01", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #et #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-EN-ET-ST\n\n's2t-small-covost2-en-et-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Estonian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-en-et-st is trained on English-Estonian subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for en-et (BLEU score): 13.01", "### BibTeX entry and citation info" ]
[ 91, 78, 111, 43, 137, 67, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #et #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-EN-ET-ST\n\n's2t-small-covost2-en-et-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Estonian text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.06688961386680603, 0.09225259721279144, -0.005930713843554258, -0.006021932698786259, 0.06549839675426483, -0.060799740254879, 0.03026505373418331, 0.06635899096727371, -0.11410107463598251, 0.048738304525613785, 0.04011012986302376, 0.05254034698009491, 0.061967045068740845, 0.06775498390197754, 0.002861619694158435, -0.2685752809047699, 0.03672259673476219, -0.06282330304384232, 0.051216479390859604, 0.08101136237382889, 0.13836964964866638, -0.04512757435441017, 0.07030317932367325, 0.029414096847176552, 0.0024635037407279015, 0.01136771310120821, 0.05124349147081375, -0.06329506635665894, 0.056698691099882126, 0.09926014393568039, 0.06676918268203735, 0.012368841096758842, 0.06512704491615295, -0.1509406566619873, 0.0009770472534000874, 0.041910551488399506, 0.019392194226384163, 0.02932705543935299, 0.0012594453291967511, 0.01350285392254591, 0.13177147507667542, 0.010944511741399765, 0.02731945551931858, 0.08508701622486115, -0.051548153162002563, -0.03324010968208313, -0.04451312869787216, 0.019522517919540405, 0.18024344742298126, 0.081597700715065, -0.04703100025653839, 0.0098959906026721, -0.0229149479418993, 0.06275835633277893, 0.021549422293901443, -0.20477914810180664, -0.015457814559340477, -0.03553742915391922, -0.006522633600980043, 0.01107226125895977, -0.03893587365746498, -0.02305728942155838, -0.035315606743097305, 0.015749545767903328, 0.0761631578207016, -0.06324544548988342, 0.027758732438087463, -0.06834523379802704, -0.11785875260829926, -0.011212754994630814, 0.1146993637084961, -0.002557057188823819, -0.08053473383188248, -0.1126127615571022, -0.04877680167555809, -0.00695251626893878, -0.021667160093784332, -0.047692954540252686, -0.012341751717031002, 0.03178640827536583, 0.037092603743076324, -0.13461744785308838, -0.1370834857225418, -0.00032429079874418676, -0.08415474742650986, 0.11386369913816452, 0.04335467517375946, 0.0004654176882468164, -0.0029496708884835243, 0.06741829961538315, -0.1215115636587143, -0.07751820236444473, -0.05322152376174927, -0.09304051101207733, -0.10323385149240494, -0.009064343757927418, -0.005151424091309309, -0.2425171285867691, -0.03186294436454773, 0.15012051165103912, -0.06626155972480774, 0.03747003898024559, 0.04507644847035408, 0.031724683940410614, -0.008189011365175247, 0.22077345848083496, -0.04457492753863335, -0.09080249071121216, 0.06304622441530228, -0.03957375884056091, 0.04624485224485397, 0.0030474180821329355, -0.09493441879749298, -0.006190882530063391, -0.08127183467149734, 0.03573179990053177, 0.05893673002719879, 0.03920385614037514, 0.009898331016302109, -0.059764277189970016, 0.1363750845193863, -0.12368498742580414, -0.0054952045902609825, 0.029506033286452293, -0.057022981345653534, 0.1318102777004242, 0.04786868765950203, -0.006310909520834684, -0.12702366709709167, 0.038035500794649124, 0.007579138968139887, 0.03442775830626488, -0.07844902575016022, -0.1296691745519638, 0.0057765161618590355, -0.08891135454177856, -0.04813661426305771, -0.12572699785232544, -0.06035330891609192, -0.04231622442603111, 0.01031604502350092, -0.03640799969434738, 0.04585811495780945, -0.08472999185323715, -0.0758642852306366, -0.000026124524083570577, -0.033779677003622055, -0.17812056839466095, 0.018220581114292145, -0.006420392543077469, -0.0441603921353817, 0.046541523188352585, -0.03674739971756935, 0.02255362644791603, -0.08482352644205093, -0.018002113327383995, -0.2572537064552307, 0.15970735251903534, 0.007049932144582272, 0.03232375904917717, -0.10086440294981003, -0.06389664113521576, 
-0.06644456088542938, 0.044666219502687454, 0.036988887935876846, 0.14534695446491241, -0.2401348203420639, -0.01818518340587616, 0.1831250935792923, -0.0927090048789978, 0.022472936660051346, 0.1274937391281128, 0.02089659869670868, 0.0776774138212204, 0.12571801245212555, 0.11949720233678818, 0.1202831119298935, -0.04432747885584831, -0.10570019483566284, 0.02816401608288288, -0.14051783084869385, 0.06607206910848618, 0.04686687886714935, -0.07401050627231598, 0.16390584409236908, 0.02190827578306198, -0.03464031592011452, -0.013596797361969948, 0.07463047653436661, -0.03212618827819824, 0.04151567071676254, -0.013235894031822681, 0.08540043234825134, -0.030043447390198708, -0.012098578736186028, -0.004312829114496708, -0.14091360569000244, 0.15133872628211975, 0.041185636073350906, -0.08717173337936401, 0.07610144466161728, -0.14992211759090424, -0.04965992644429207, 0.002880923915654421, 0.023707494139671326, -0.16027940809726715, 0.03653981164097786, -0.038485050201416016, -0.016590522602200508, 0.16064925491809845, 0.03142485022544861, 0.07934976369142532, 0.013865585438907146, -0.007639836519956589, 0.018816279247403145, 0.06632745265960693, 0.003381272777915001, 0.002712497254833579, -0.07328964024782181, -0.002615060191601515, -0.02713458612561226, 0.10205429047346115, -0.07630303502082825, 0.034012362360954285, 0.05769436061382294, 0.0692257434129715, 0.03397703915834427, -0.02199411764740944, -0.009023339487612247, -0.007544014137238264, -0.0019697481766343117, -0.052227675914764404, -0.019448159262537956, 0.02278187684714794, -0.07532647997140884, 0.14881867170333862, -0.19882389903068542, -0.17503488063812256, 0.08073732256889343, -0.04194037243723869, -0.050639763474464417, 0.05911712720990181, -0.0025776904076337814, -0.003595177084207535, 0.05791957303881645, -0.1448637694120407, 0.22550418972969055, 0.04992552474141121, 0.12464188039302826, -0.06158829852938652, -0.041233405470848083, -0.04009053856134415, -0.07904433459043503, 0.02991337887942791, 0.0392385832965374, 0.04811036214232445, -0.18697489798069, 0.00476476177573204, 0.034508951008319855, -0.02310270443558693, 0.13649308681488037, 0.004363010171800852, -0.06156149134039879, -0.007274056319147348, 0.027711013332009315, 0.032546259462833405, 0.008853738196194172, 0.010021563619375229, 0.022865373641252518, 0.03107057884335518, 0.030275488272309303, 0.03037615865468979, -0.05995982140302658, 0.0817890614271164, 0.0673523098230362, -0.04976045340299606, -0.04710826277732849, 0.008529814891517162, -0.002758373972028494, 0.04465983435511589, 0.006890237797051668, 0.016768816858530045, -0.029602190479636192, -0.020200222730636597, -0.09915779531002045, 0.12950806319713593, -0.1881391555070877, -0.2750236690044403, -0.19103437662124634, -0.018476838245987892, -0.014216798357665539, 0.07006841897964478, 0.050293177366256714, -0.09427251666784286, -0.07759880274534225, -0.0937969908118248, 0.07918867468833923, -0.023975200951099396, -0.059489767998456955, -0.1482740342617035, 0.062062621116638184, 0.02350698411464691, -0.0909152626991272, 0.0065728649497032166, -0.00429810443893075, -0.0639255940914154, 0.0011140471324324608, 0.008432477712631226, -0.04559171944856644, 0.19558236002922058, -0.027360672131180763, -0.03787641599774361, -0.03681260719895363, 0.10759066790342331, -0.05834154412150383, 0.13540959358215332, 0.15916892886161804, -0.11389077454805374, 0.031762074679136276, 0.14327208697795868, 0.021664844825863838, -0.015666792169213295, 0.0007622772245667875, -0.024658014997839928, -0.07545427232980728, 
-0.20542462170124054, -0.03638214245438576, -0.05141720920801163, 0.04352415353059769, 0.026717882603406906, 0.0008356368634849787, 0.06384242326021194, 0.02770349569618702, -0.06976475566625595, -0.014332215301692486, 0.09647180885076523, 0.08873381465673447, 0.20525440573692322, -0.06187041103839874, 0.09375666826963425, -0.08443433046340942, -0.00871994812041521, 0.07206648588180542, -0.06279663741588593, 0.17024073004722595, 0.021130798384547234, 0.18883216381072998, 0.05798834562301636, -0.018448462709784508, 0.024316491559147835, 0.03542082756757736, -0.01745571382343769, 0.004292121157050133, 0.0019200603710487485, -0.07424161583185196, -0.04275776073336601, 0.08459547162055969, 0.05413573235273361, -0.04500435292720795, 0.03700074180960655, 0.05987666919827461, 0.05523889139294624, 0.06852366775274277, 0.0695217102766037, -0.11772307008504868, -0.09635408222675323, 0.019289091229438782, -0.12677136063575745, -0.02439168095588684, -0.0030689374543726444, 0.16280819475650787, -0.05735457316040993, 0.02176990732550621, 0.018043898046016693, 0.09502066671848297, -0.08253270387649536, 0.024000609293580055, 0.00048049341421574354, 0.15700407326221466, 0.04764970391988754, 0.021347690373659134, -0.04745147004723549, 0.07474550604820251, 0.02676926553249359, 0.1028563603758812, -0.001452248077839613, 0.06720384955406189, -0.0007606679573655128, 0.05990024283528328, 0.07205693423748016, 0.005250511225312948, -0.16797134280204773, -0.0006822290015406907, -0.1098453626036644, -0.031123057007789612, 0.10366795212030411, -0.03008841723203659, 0.06245431303977966, -0.03979985788464546, -0.030556926503777504, -0.0545097254216671, -0.11567869782447815, -0.02663772739470005, -0.16364333033561707, 0.025241820141673088, 0.013031987473368645, 0.014857078902423382, -0.002152785426005721, 0.06080269441008568, -0.03531860560178757, 0.17706629633903503, -0.24908818304538727, -0.0877116248011589, -0.09462088346481323, -0.05924772098660469, 0.14477168023586273, -0.08063126355409622, 0.08127930015325546, 0.04647340625524521, 0.0666017159819603, 0.01326502114534378, -0.059544775635004044, 0.04016704112291336, -0.0886051282286644, -0.0051680440083146095, -0.0035758288577198982, 0.1892942637205124, 0.021012485027313232, 0.08317676186561584, 0.05856839939951897, 0.03125733509659767, -0.007439606357365847, -0.11310828477144241, -0.08455920964479446, 0.14399400353431702, 0.0017190015641972423, -0.0017411693697795272, -0.0261033046990633, -0.10206101834774017, -0.08506165444850922, 0.04587048664689064, 0.1392495483160019, 0.10555436462163925, -0.10454810410737991, 0.12235123664140701, 0.08122400939464569, -0.06057095527648926, -0.1881241351366043, -0.05725075677037239, 0.0637153834104538, 0.058915432542562485, 0.03555890917778015, -0.1508297175168991, 0.017755327746272087, 0.002572699449956417, -0.007709783967584372, -0.03226169943809509, -0.21524971723556519, -0.1254255622625351, 0.06585254520177841, -0.053938332945108414, 0.00788876973092556, 0.02134700119495392, -0.06123638525605202, -0.03575317934155464, -0.04731335490942001, 0.007239101454615593, -0.07370856404304504, 0.059556297957897186, 0.09362397342920303, 0.005976616404950619, 0.04940671846270561, 0.0007023726939223707, 0.07268929481506348, -0.02388954535126686, -0.04503582417964935, -0.06694349646568298, 0.041214559227228165, 0.03531549498438835, -0.036943841725587845, 0.19109730422496796, -0.006327019073069096, 0.002131144516170025, -0.08497125655412674, -0.05298967659473419, -0.0026736913714557886, 0.015066394582390785, 0.00225297873839736, 
-0.008654025383293629, -0.04409169778227806, -0.004911947995424271, 0.021917305886745453, 0.02075226418673992, -0.044999342411756516, -0.1682496815919876, -0.13650599122047424, 0.17903481423854828, 0.23353302478790283, 0.01696300134062767, -0.05286882072687149, 0.011360006406903267, 0.008374393917620182, 0.0170065276324749, -0.11381890624761581, 0.03685871511697769, 0.03905278816819191, -0.004554759245365858, 0.05227560177445412, -0.02448173426091671, -0.08000604808330536, 0.043809838593006134, 0.06840812414884567, -0.02069483883678913, -0.12684541940689087, -0.026352617889642715, -0.000531352183315903, -0.06648888438940048, 0.026178445667028427, 0.2015930712223053, -0.04026230797171593, 0.015159441158175468, 0.0014102113200351596, 0.019884206354618073, -0.05813046544790268, 0.09813591837882996, -0.0005606168415397406, 0.004859857261180878, -0.06943674385547638, 0.024639705196022987, 0.06417401134967804, -0.0016668412135913968, 0.043174855411052704, 0.09801417589187622, -0.09458786994218826, -0.06888721883296967, -0.16902005672454834, -0.16988541185855865, -0.04874991998076439, -0.04115815833210945, -0.0013284601736813784, -0.05071481317281723, -0.005955061409622431, 0.0752016231417656, -0.0012015699176117778, 0.05988510325551033, -0.08705364167690277, 0.0034512458369135857, -0.06880030781030655, 0.03924422711133957, 0.042785122990608215, -0.010835903696715832, -0.048472095280885696, 0.13676699995994568, -0.0018730363808572292, -0.029713107272982597, -0.016894154250621796, -0.030000535771250725, -0.016808435320854187, 0.00916108675301075, -0.07122434675693512, -0.033917136490345, -0.05564947798848152, -0.05318595841526985, 0.04203205555677414, -0.0011841871310025454, -0.029025720432400703, 0.02143823727965355, -0.03856292739510536, -0.009804148226976395, -0.04696216806769371, 0.0027862221468240023, -0.07411675900220871, 0.05829782038927078, 0.02304050698876381, -0.05350160598754883, 0.08241800218820572, 0.08407063037157059, -0.06184077262878418, 0.09960323572158813, 0.038674335926771164, -0.06284172832965851, 0.014961764216423035, 0.05736234411597252, 0.04202505946159363, -0.04172907769680023, 0.012110323645174503, 0.03377648815512657, -0.028637787327170372, -0.046908486634492874, -0.019126804545521736, -0.04576299339532852, 0.12106536328792572, 0.0020748877432197332, -0.008436965756118298, -0.06116533279418945, 0.031756527721881866, 0.04971374571323395, 0.004068488255143166, 0.1632847934961319, -0.04082665219902992, 0.06057673692703247, -0.06967931240797043, 0.02975586988031864, -0.0028825663030147552, -0.00989789329469204, 0.008949929848313332, -0.08055344223976135, 0.02845391258597374, -0.0044330270029604435, -0.0003151929413434118, -0.030384527519345284, 0.05096312612295151, 0.07085540145635605, -0.06545338779687881, 0.01133725605905056, 0.06374368071556091, 0.07146216183900833, 0.04642084613442421, -0.0304185189306736, -0.1260952651500702, 0.0003627959231380373, -0.01078407559543848, -0.20344330370426178, 0.06429580599069595, 0.0722876489162445, 0.03057478554546833, 0.10682976990938187, 0.055322155356407166, 0.07308837026357651, -0.15468332171440125, 0.0455656573176384, -0.06386169791221619, -0.020693279802799225, -0.038634028285741806, 0.10677111893892288, 0.21238718926906586, -0.08451151847839355, 0.08235864341259003, 0.06118762865662575, -0.059740759432315826, -0.13990966975688934, -0.15814994275569916, -0.034740325063467026, -0.05481236055493355, -0.02370855212211609, -0.060112107545137405, 0.03513122722506523, -0.019581263884902, 0.06843707710504532, -0.019191093742847443, 
0.17146167159080505, -0.09935543686151505, -0.1268799602985382, 0.022629639133810997, 0.0030812141485512257, 0.07164982706308365, 0.08218766748905182, 0.04340701922774315, 0.053831685334444046, 0.03568582609295845, 0.09035512804985046, 0.05255616456270218, 0.05209710821509361, -0.00848862063139677, -0.05331642925739288, -0.028217824175953865, -0.0227810088545084, 0.0689995214343071, 0.020249241963028908, 0.16867206990718842, 0.07524740695953369, -0.10460905730724335, -0.019287802278995514, 0.14602524042129517, -0.06832349300384521, -0.1744345873594284, -0.18794694542884827, 0.13429921865463257, 0.1214979961514473, 0.06582346558570862, -0.0412728488445282, -0.09678889811038971, 0.02421136386692524, 0.15964558720588684, 0.15867000818252563, 0.002366471802815795, 0.005851180292665958, -0.014896389096975327, 0.019022813066840172, -0.035248033702373505, 0.06320220977067947, -0.007351357955485582, 0.3189825117588043, 0.023834582418203354, 0.020287521183490753, 0.017215624451637268, -0.023724554106593132, -0.10624327510595322, 0.020836427807807922, -0.06266764551401138, -0.03113468363881111, 0.002486618934199214, 0.10575447976589203, -0.012329860590398312, -0.12716878950595856, -0.08927175402641296, 0.004526757635176182, -0.02666524611413479, 0.05445673689246178, 0.07549440115690231, 0.04095909744501114, 0.049378398805856705, 0.017830386757850647, -0.0763736292719841, 0.1830405741930008, 0.012985087931156158, -0.04915915057063103, 0.030080651864409447, 0.05255452170968056, -0.12037210911512375, 0.10136160254478455, -0.018826978281140327, 0.08420920372009277, 0.06573627144098282, 0.05131690576672554, -0.07379027456045151, 0.05969425290822983, -0.018036430701613426, -0.10756073147058487, 0.07870666682720184, 0.15636485815048218, -0.01641465537250042, 0.10642990469932556, 0.038208458572626114, -0.08698101341724396, 0.048667531460523605, -0.029280709102749825, -0.04266853630542755, -0.0421002097427845, 0.10198776423931122, -0.08507632464170456, 0.13011164963245392, 0.12547211349010468, -0.02290361188352108, 0.010001970455050468, -0.025405023247003555, 0.0016567468410357833, 0.026634227484464645, 0.042005665600299835, -0.021290326490998268, -0.09601274877786636, -0.04612055793404579, -0.052767615765333176, 0.017591075971722603, -0.15773905813694, -0.05186587944626808, 0.0025292232166975737, 0.012409638613462448, -0.03348740562796593, 0.06770769506692886, 0.051197703927755356, 0.026410702615976334, -0.02251296117901802, 0.051775820553302765, -0.0017894124612212181, 0.04949294775724411, -0.09029226005077362, -0.07898562401533127 ]
null
null
transformers
# S2T-SMALL-COVOST2-EN-FA-ST

`s2t-small-covost2-en-fa-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Farsi text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You can either install those as extra speech dependencies with `pip install "transformers[speech,sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-en-fa-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-en-fa-st")

# read each audio file into an array of raw samples
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# extract filter bank features and generate the translation autoregressively
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-en-fa-st is trained on the English-Farsi subset of [CoVoST2](https://github.com/facebookresearch/covost). CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and for better performance, the encoder is pre-trained for English ASR.
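For intuition, the filter bank + CMVN front end described under Preprocessing above can be approximated with torchaudio's Kaldi compliance module. This is a minimal sketch, not the exact training pipeline: the file path is a placeholder, and the real PyKaldi/torchaudio setup may differ in details such as windowing defaults.

```python
import torchaudio

# placeholder path; any mono WAV/FLAC file works for this sketch
waveform, sample_rate = torchaudio.load("example.wav")

# Kaldi-compliant 80-channel log mel-filter bank features
fbank = torchaudio.compliance.kaldi.fbank(
    waveform,
    num_mel_bins=80,
    sample_frequency=sample_rate,
)  # shape: (num_frames, 80)

# utterance-level CMVN: normalize each channel over the whole utterance
mean = fbank.mean(dim=0, keepdim=True)
std = fbank.std(dim=0, keepdim=True)
fbank = (fbank - mean) / (std + 1e-5)
```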
## Evaluation results

CoVoST2 test results for en-fa (BLEU score): 11.43

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
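If you want to score your own translations the way the BLEU number above was computed, a minimal sketch with [sacreBLEU](https://github.com/mjpost/sacrebleu) could look like the following. sacrebleu is an extra dependency (not shipped with this model), and the sentences below are placeholders.

```python
import sacrebleu

hypotheses = ["a translated sentence"]       # model outputs, one string per example
references = [["a reference translation"]]   # one reference stream, parallel to hypotheses

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {score.score:.2f}")
```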
{"language": ["en", "fa"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-en-fa-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "fa", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "en", "fa" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #fa #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
null
null
transformers
# S2T-SMALL-COVOST2-ES-EN-ST

`s2t-small-covost2-es-en-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end Spanish speech to English text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You can either install those as extra speech dependencies with `pip install "transformers[speech,sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-es-en-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-es-en-st")

# read each audio file into an array of raw samples
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# extract filter bank features and generate the translation autoregressively
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-es-en-st is trained on the Spanish-English subset of [CoVoST2](https://github.com/facebookresearch/covost). CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and for better performance, the encoder is pre-trained for English ASR.
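If your own recordings do not match the sampling rate the feature extractor expects, you can resample them first. A minimal sketch with torchaudio, assuming a local file `my_audio.wav` (a placeholder name):

```python
import torchaudio
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-es-en-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-es-en-st")

# placeholder path; load returns (channels, num_samples) plus the source rate
waveform, sample_rate = torchaudio.load("my_audio.wav")

# resample to the rate the checkpoint's feature extractor was configured with
target_rate = processor.feature_extractor.sampling_rate
if sample_rate != target_rate:
    waveform = torchaudio.functional.resample(waveform, sample_rate, target_rate)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=target_rate, return_tensors="pt")
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
print(processor.batch_decode(generated_ids, skip_special_tokens=True))
```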
## Evaluation results

CoVoST2 test results for es-en (BLEU score): 22.31

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
{"language": ["es", "en"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-es-en-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "es", "en", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "es", "en" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #es #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
null
null
transformers
# S2T-SMALL-COVOST2-FR-EN-ST

`s2t-small-covost2-fr-en-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end French speech to English text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech,sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-covost2-fr-en-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-covost2-fr-en-st")

def map_to_array(batch):
    # Read the raw waveform from disk with soundfile.
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# The model expects 48 kHz input (CoVoST2 is built on Common Voice audio);
# the 16 kHz LibriSpeech dummy clip here only serves to demonstrate the API.
inputs = processor(
    ds["speech"][0],
    sampling_rate=48_000,
    return_tensors="pt"
)
generated_ids = model.generate(
    input_features=inputs["input_features"],
    attention_mask=inputs["attention_mask"]
)

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-covost2-fr-en-st is trained on the French-English subset of [CoVoST2](https://github.com/facebookresearch/covost). CoVoST is a large-scale multilingual ST corpus based on [Common Voice](https://arxiv.org/abs/1912.06670), created to foster ST research with the largest-ever open dataset.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using a character-based SentencePiece vocabulary.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and for better performance, the encoder is pre-trained for English ASR.
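To make the preprocessing and SpecAugment steps above concrete, here is a minimal sketch using torchaudio. It is illustrative only: the file name and mask sizes are assumptions, not the exact fairseq training configuration.

```python
import torch
import torchaudio
import torchaudio.transforms as T

# Load a waveform; torchaudio returns a (channels, samples) tensor.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file

# Kaldi-compliant 80-channel log mel-filter bank features -> (frames, 80).
feats = torchaudio.compliance.kaldi.fbank(
    waveform, num_mel_bins=80, sample_frequency=sample_rate
)

# Utterance-level CMVN: normalize each feature dimension over the utterance.
feats = (feats - feats.mean(dim=0)) / (feats.std(dim=0) + 1e-8)

# SpecAugment-style masking during training: zero out random frequency
# bands and time spans. The transforms expect (..., freq, time), so transpose.
spec = feats.t().unsqueeze(0)                        # (1, 80, frames)
spec = T.FrequencyMasking(freq_mask_param=27)(spec)  # mask up to 27 mel bins
spec = T.TimeMasking(time_mask_param=100)(spec)      # mask up to 100 frames
```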
## Evaluation results

CoVoST2 test results for fr-en (BLEU score): 26.25

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
{"language": ["fr", "en"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["covost2"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-covost2-fr-en-st
[ "transformers", "pytorch", "tf", "safetensors", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "fr", "en", "dataset:covost2", "arxiv:2010.05171", "arxiv:1912.06670", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1912.06670", "1904.08779" ]
[ "fr", "en" ]
TAGS #transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #speech-translation #fr #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-COVOST2-FR-EN-ST 's2t-small-covost2-fr-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository. ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end French speech to English text translation. See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* You could either install those as extra speech dependencies with 'pip install "transformers[speech,sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. ## Training data The s2t-small-covost2-fr-en-st is trained on the French-English subset of CoVoST2. CoVoST is a large-scale multilingual ST corpus based on Common Voice, created to foster ST research with the largest-ever open dataset. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using a character-based SentencePiece vocabulary. ### Training The model is trained with standard autoregressive cross-entropy loss and using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and for better performance, the encoder is pre-trained for English ASR. ## Evaluation results CoVoST2 test results for fr-en (BLEU score): 26.25 ### BibTeX entry and citation info
[ "# S2T-SMALL-COVOST2-FR-EN-ST\n\n's2t-small-covost2-fr-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end French speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-fr-en-st is trained on French-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for fr-en (BLEU score): 26.25", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #speech-translation #fr #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-COVOST2-FR-EN-ST\n\n's2t-small-covost2-fr-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end French speech to English text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-covost2-fr-en-st is trained on French-English subset of CoVoST2.\nCoVoST is a large-scale multilingual ST corpus based on Common Voice, created to to foster \nST research with the largest ever open dataset", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using character based SentencePiece vocab.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nCoVOST2 test results for fr-en (BLEU score): 26.25", "### BibTeX entry and citation info" ]
[ 96, 78, 111, 42, 137, 66, 3, 96, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #audio #speech-translation #fr #en #dataset-covost2 #arxiv-2010.05171 #arxiv-1912.06670 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-COVOST2-FR-EN-ST\n\n's2t-small-covost2-fr-en-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end French speech to English text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.06633919477462769, 0.09872039407491684, -0.007726005278527737, 0.013487807475030422, 0.04435654357075691, -0.05646800249814987, 0.03210258111357689, 0.055508945137262344, -0.11649306118488312, 0.05379977449774742, 0.026512881740927696, 0.06785693019628525, 0.06643448024988174, 0.05989900976419449, 0.007001486606895924, -0.21498319506645203, 0.035754840821027756, -0.08301124721765518, 0.03956585377454758, 0.08367171883583069, 0.12841907143592834, -0.03036728873848915, 0.07789306342601776, 0.03354931250214577, 0.014283142983913422, 0.012051359750330448, 0.06594933569431305, -0.06397067755460739, 0.05726364254951477, 0.10245850682258606, 0.07335267215967178, 0.01842408813536167, 0.057698823511600494, -0.13801570236682892, 0.008834938518702984, 0.0579446405172348, 0.0002335065946681425, 0.03635316342115402, 0.03489605709910393, -0.023701759055256844, 0.06024247035384178, 0.00952543132007122, 0.03382158279418945, 0.08005538582801819, -0.07711681723594666, -0.060748569667339325, -0.062187522649765015, 0.03192424401640892, 0.12794238328933716, 0.060776855796575546, -0.0380365252494812, -0.011318337172269821, -0.011177731677889824, 0.07123367488384247, 0.05070110037922859, -0.20339922606945038, -0.00693401787430048, -0.07672732323408127, -0.0028657265938818455, 0.03256336227059364, -0.0361773818731308, -0.02515142410993576, -0.04203605279326439, -0.012615387327969074, 0.09291038662195206, -0.05370321124792099, 0.01373539213091135, -0.06424085050821304, -0.11929951608181, -0.022248994559049606, 0.08355099707841873, -0.003992428537458181, -0.08522500842809677, -0.0991615429520607, -0.06508512049913406, -0.015259579755365849, -0.014227181673049927, -0.06083793193101883, -0.008532998152077198, 0.038599930703639984, 0.023514725267887115, -0.12320046871900558, -0.10645189881324768, -0.020289048552513123, -0.07889863848686218, 0.051321107894182205, 0.04130663350224495, 0.008354957215487957, 0.002552404999732971, 0.06500400602817535, -0.1226123571395874, -0.06108437478542328, -0.052154723554849625, -0.0790235623717308, -0.11979685723781586, -0.006589680910110474, -0.006562107242643833, -0.23449471592903137, -0.030037008225917816, 0.148113414645195, -0.09480676054954529, 0.04512213543057442, -0.010655790567398071, 0.02279708720743656, -0.011542469263076782, 0.18173232674598694, -0.05009730905294418, -0.06346632540225983, 0.04846356436610222, -0.02017611637711525, 0.02129363641142845, -0.0033775335177779198, -0.062300413846969604, 0.010485587641596794, -0.04366561025381088, 0.0300144050270319, 0.0563327819108963, 0.033931367099285126, -0.01037578471004963, -0.04927963390946388, 0.13100849092006683, -0.14353454113006592, 0.024477796629071236, 0.051123686134815216, -0.044844284653663635, 0.10288675874471664, 0.07150958478450775, -0.006789620034396648, -0.10910215228796005, 0.030545352026820183, -0.0010168845765292645, 0.022591549903154373, -0.07484082132577896, -0.101164810359478, -0.007095148786902428, -0.06383272260427475, -0.07296671718358994, -0.1145818904042244, -0.08629994839429855, -0.04722852259874344, 0.01086396723985672, -0.05532003194093704, 0.05636320263147354, -0.09944594651460648, -0.047173790633678436, 0.014687544666230679, -0.027185142040252686, -0.16816315054893494, 0.02036280371248722, 0.011293575167655945, -0.030919259414076805, 0.04709036275744438, -0.03665923327207565, 0.020723840221762657, -0.07386880367994308, -0.0023427712731063366, -0.22386927902698517, 0.20556379854679108, 0.013242898508906364, -0.0008460776298306882, -0.11469598114490509, -0.03724558651447296, 
-0.08472501486539841, 0.0343974344432354, 0.029006360098719597, 0.14858554303646088, -0.22887085378170013, -0.03533460572361946, 0.24069340527057648, -0.09525520354509354, 0.026043439283967018, 0.1463163048028946, 0.000861687061842531, 0.07810186594724655, 0.13059480488300323, 0.1053616851568222, 0.11141582578420639, -0.04275769740343094, -0.07973232865333557, 0.016577251255512238, -0.11706388741731644, 0.11493247002363205, 0.06605619937181473, -0.07225760817527771, 0.11327415704727173, 0.0013212159974500537, -0.02469570003449917, -0.02272975631058216, 0.05879122018814087, -0.0440581850707531, 0.030755477026104927, -0.0037135060410946608, 0.11367842555046082, -0.04265424981713295, -0.008817548863589764, 0.013542023487389088, -0.13904157280921936, 0.10741084814071655, 0.016620663926005363, -0.08240807801485062, 0.0742967277765274, -0.14756794273853302, -0.04071123152971268, 0.027014991268515587, 0.026296867057681084, -0.1533876657485962, 0.013595329597592354, -0.02804015576839447, -0.01830931007862091, 0.15968620777130127, -0.01967030018568039, 0.0794176235795021, 0.03534378111362457, -0.014231356792151928, 0.0231644194573164, 0.07141590118408203, -0.00412942236289382, 0.012466753832995892, -0.08636172115802765, -0.0072562843561172485, -0.03504079207777977, 0.13257315754890442, -0.09301792830228806, 0.039439182728528976, 0.05873692035675049, 0.06667415797710419, 0.024631768465042114, -0.03728964924812317, -0.02564806491136551, -0.020013032481074333, 0.0020754318684339523, -0.034113891422748566, -0.018589362502098083, 0.019554410129785538, -0.07635492086410522, 0.1411106139421463, -0.1786925494670868, -0.16366566717624664, 0.06691984087228775, -0.027033576741814613, -0.07492426782846451, -0.0014314831933006644, 0.022265532985329628, -0.008267538622021675, 0.03354550525546074, -0.15815293788909912, 0.2150442898273468, 0.04788185656070709, 0.1313719004392624, -0.07091434299945831, -0.03188598155975342, -0.04212497919797897, -0.07225049287080765, 0.031469423323869705, 0.03912744298577309, -0.009387562982738018, -0.20305362343788147, 0.028888266533613205, 0.0444713793694973, -0.0169812124222517, 0.10531409829854965, 0.005825186613947153, -0.062017299234867096, -0.0040649715811014175, 0.05554337427020073, 0.041204746812582016, -0.055295538157224655, 0.050000566989183426, 0.026777591556310654, 0.028290485963225365, 0.02043449506163597, 0.00553997652605176, -0.06305621564388275, 0.0931534618139267, 0.08423448354005814, -0.04121851176023483, -0.04794720560312271, -0.021063368767499924, 0.01463655661791563, 0.04846411198377609, 0.008101035840809345, 0.017560578882694244, -0.009937562979757786, -0.022776402533054352, -0.11073895543813705, 0.11993888020515442, -0.17745432257652283, -0.2718268930912018, -0.20257319509983063, -0.029167331755161285, 0.001894470420666039, 0.07763487100601196, 0.048465367406606674, -0.08198089152574539, -0.050878506153821945, -0.10038526356220245, 0.06622594594955444, 0.001716459752060473, -0.061663419008255005, -0.12140174955129623, 0.0421823114156723, 0.01633993536233902, -0.09627659618854523, 0.01704477146267891, 0.011244853027164936, -0.09237794578075409, -0.0020899956580251455, 0.013135692104697227, -0.037461914122104645, 0.17554213106632233, -0.01804118975996971, -0.04132920876145363, -0.03045259788632393, 0.1214020848274231, -0.052514392882585526, 0.12225494533777237, 0.13408727943897247, -0.12577396631240845, 0.053137507289648056, 0.13014180958271027, 0.021232418715953827, -0.011894157156348228, 0.004019175656139851, -0.025710877031087875, -0.06512020528316498, 
-0.21383120119571686, -0.022524096071720123, -0.037310026586055756, 0.032069962471723557, 0.03926672786474228, 0.013108487240970135, 0.10089973360300064, 0.02816956676542759, -0.05777210742235184, -0.004236378241330385, 0.13369274139404297, 0.07080835849046707, 0.17171715199947357, -0.04755471646785736, 0.09807728230953217, -0.0687345415353775, -0.017153266817331314, 0.0740882009267807, -0.0477885901927948, 0.14396873116493225, 0.017776932567358017, 0.201599583029747, 0.04730193689465523, 0.01537455152720213, 0.030940018594264984, 0.05697184428572655, -0.023351017385721207, 0.03656919673085213, -0.019686507061123848, -0.07507199794054031, -0.05930382013320923, 0.09036005288362503, 0.049125801771879196, -0.03510652855038643, 0.036684390157461166, 0.05907364934682846, 0.06147090345621109, 0.10715953260660172, 0.05732346326112747, -0.16128237545490265, -0.062034036964178085, 0.01810353994369507, -0.09746941924095154, -0.031164510175585747, 0.000696619157679379, 0.13980638980865479, -0.07080060243606567, 0.02384057641029358, 0.0064285192638635635, 0.08280333131551743, -0.09110604971647263, 0.012879682704806328, 0.004202964249998331, 0.1248350515961647, 0.03603838384151459, 0.0059242816641926765, -0.07581420242786407, 0.04962975159287453, 0.02963942103087902, 0.07319925725460052, 0.011722399853169918, 0.06762593239545822, -0.002807853277772665, 0.06913700699806213, 0.10212788730859756, 0.002363091567531228, -0.1452464610338211, 0.009583071805536747, -0.1096154972910881, -0.04249820113182068, 0.11696774512529373, -0.037946779280900955, 0.059491127729415894, -0.054293230175971985, -0.048401668667793274, -0.04597819969058037, -0.09030739963054657, -0.04484985023736954, -0.17650850117206573, 0.01741834729909897, 0.010573497042059898, 0.08342988789081573, 0.001306194462813437, 0.06133132800459862, -0.05181428790092468, 0.171003520488739, -0.2596572935581207, -0.07451455295085907, -0.08904246240854263, -0.07132948935031891, 0.12913073599338531, -0.0758957490324974, 0.08516138046979904, 0.033328089863061905, 0.059708915650844574, 0.008001759648323059, -0.072019562125206, 0.048768945038318634, -0.07892890274524689, -0.028982365503907204, -0.023409385234117508, 0.17668229341506958, 0.029069436714053154, 0.07202112674713135, 0.06421617418527603, 0.04347336292266846, 0.007900119759142399, -0.0822359248995781, -0.06689167022705078, 0.13158655166625977, -0.016658345237374306, 0.031475894153118134, -0.053576063364744186, -0.12557490170001984, -0.09812691062688828, 0.04233342036604881, 0.1511576920747757, 0.09203052520751953, -0.09881949424743652, 0.1262752115726471, 0.10966716706752777, -0.06911482661962509, -0.18321332335472107, -0.08172652125358582, 0.0639193132519722, 0.021527130156755447, 0.04493454098701477, -0.16786985099315643, -0.004128823056817055, 0.02143685519695282, -0.006255531217902899, -0.036596477031707764, -0.20934869349002838, -0.1212603896856308, 0.06485683470964432, -0.05820637196302414, -0.022036679089069366, -0.020052781328558922, -0.0914839580655098, -0.039924003183841705, -0.026718880981206894, 0.042112357914447784, -0.08782784640789032, 0.04445849731564522, 0.10078246146440506, -0.021079162135720253, 0.04445556923747063, 0.012533796951174736, 0.05084926635026932, -0.020499473437666893, -0.03481139987707138, -0.048210177570581436, 0.06885375827550888, 0.020503811538219452, -0.02923813834786415, 0.1588631421327591, 0.018662096932530403, 0.008977143093943596, -0.027921903878450394, -0.03647339344024658, -0.003719739615917206, 0.0027690681163221598, -0.004833269398659468, 
0.018262680619955063, -0.039669882506132126, -0.001736376667395234, 0.037525780498981476, 0.006630110088735819, -0.02557925134897232, -0.15163631737232208, -0.07804297655820847, 0.238749161362648, 0.1923949420452118, 0.056464873254299164, -0.05979521945118904, 0.0014017410576343536, 0.009190667420625687, 0.028930148109793663, -0.07839467376470566, 0.06683103740215302, 0.023081429302692413, -0.0015008172485977411, 0.08866162598133087, -0.004994019400328398, -0.08637391775846481, 0.03457615152001381, 0.06645029783248901, -0.029180677607655525, -0.14881354570388794, -0.020993944257497787, -0.00030657986644655466, -0.05573231354355812, 0.02209319733083248, 0.19394747912883759, -0.02210996113717556, -0.0056829070672392845, -0.001319839502684772, 0.03223010152578354, -0.0633460134267807, 0.10047489404678345, -0.02170497551560402, -0.0004905087407678366, -0.07612323760986328, 0.029124952852725983, 0.09529827535152435, -0.025505317375063896, 0.02795216254889965, 0.11029773205518723, -0.07975909858942032, -0.05930634215474129, -0.1636728048324585, -0.15276196599006653, -0.039133526384830475, -0.040717799216508865, 0.009261952713131905, -0.05644549801945686, 0.008369982242584229, 0.10514947026968002, -0.021767381578683853, 0.07012409716844559, -0.0515180379152298, -0.0015665628015995026, -0.0334012471139431, 0.048171110451221466, 0.04827044531702995, 0.005949269514530897, -0.02423916757106781, 0.11935670673847198, -0.013898664154112339, -0.013321866281330585, -0.019604293629527092, -0.03022364340722561, -0.0317828506231308, 0.015284843742847443, -0.039220958948135376, -0.009434020146727562, -0.04193098098039627, -0.04750815033912659, 0.04572013393044472, 0.015723256394267082, 0.015521951951086521, 0.024224920198321342, -0.025425704196095467, -0.01727219857275486, -0.040791239589452744, 0.01864326000213623, -0.12115573137998581, 0.050614990293979645, 0.04725203663110733, -0.056479163467884064, 0.07389247417449951, 0.08169616758823395, -0.0748627781867981, 0.0908963680267334, -0.016623811796307564, -0.0682569071650505, 0.0036268928088247776, 0.045146506279706955, 0.018472101539373398, -0.052153103053569794, 0.017244311049580574, 0.01752941869199276, -0.04103412851691246, -0.04884541034698486, 0.022953609004616737, -0.047184284776449203, 0.12404708564281464, 0.009716372936964035, -0.015140179544687271, -0.06029946357011795, 0.02046460658311844, 0.026872193440794945, -0.0008000722737051547, 0.152973011136055, -0.04607228562235832, 0.04857724532485008, -0.06137668713927269, 0.025278661400079727, 0.017863433808088303, -0.0031732989009469748, -0.008691677823662758, -0.09124448150396347, 0.02803008444607258, 0.007574048824608326, -0.006442441139370203, -0.019823353737592697, 0.009726392105221748, 0.05674698203802109, -0.07282684743404388, -0.01754431240260601, 0.06258457154035568, 0.024265306070446968, 0.05477116256952286, -0.043159183114767075, -0.11925286799669266, -0.010034237988293171, -0.013941744342446327, -0.14982348680496216, 0.05981354042887688, 0.08326682448387146, 0.02524157054722309, 0.08400242030620575, 0.04460502788424492, 0.07772246748209, -0.11091384291648865, 0.02924840711057186, -0.056659381836652756, -0.04092207923531532, -0.052880484610795975, 0.10541395097970963, 0.18410031497478485, -0.07706131041049957, 0.10981833189725876, 0.05790542811155319, -0.059844970703125, -0.14668725430965424, -0.14920012652873993, -0.04807206615805626, -0.04297681152820587, -0.03395368158817291, -0.04975508898496628, 0.06438295543193817, -0.017560921609401703, 0.062457695603370667, 0.010534217581152916, 
0.15705858170986176, -0.11897209286689758, -0.12166330963373184, 0.023452986031770706, 0.0008495908696204424, 0.08447819203138351, 0.08247239887714386, 0.034239500761032104, 0.07225339859724045, 0.06414607912302017, 0.09062706679105759, 0.06352375447750092, 0.07034136354923248, 0.013381799682974815, -0.05097869783639908, -0.0370350182056427, -0.021061960607767105, 0.05205058306455612, -0.01045459695160389, 0.1429503858089447, 0.07756929099559784, -0.0919383242726326, -0.008488408289849758, 0.14087319374084473, -0.07938089221715927, -0.12508656084537506, -0.18971987068653107, 0.15272323787212372, 0.09502580016851425, 0.07553351670503616, -0.032070763409137726, -0.09840449690818787, 0.029129015281796455, 0.13485872745513916, 0.14934764802455902, -0.02006453089416027, 0.01682133972644806, -0.014791935682296753, 0.030539074912667274, -0.008139623329043388, 0.05135689675807953, 0.008269606158137321, 0.2901462912559509, 0.011567848734557629, 0.05993107333779335, 0.019346626475453377, -0.0350126214325428, -0.09775669872760773, 0.040415022522211075, -0.09791447222232819, 0.004051531665027142, 0.00012246317055542022, 0.1036461889743805, -0.02356071025133133, -0.16370567679405212, -0.08021856844425201, 0.029048200696706772, -0.016734138131141663, 0.030438151210546494, 0.04702404886484146, 0.04749155044555664, 0.037453413009643555, 0.0427662692964077, -0.07882601022720337, 0.1819911152124405, 0.022488107904791832, -0.041854377835989, 0.027265023440122604, 0.04566912353038788, -0.12587812542915344, 0.1292324662208557, -0.00014231607201509178, 0.06549417227506638, 0.05210566148161888, 0.02626563422381878, -0.058304909616708755, 0.04422852024435997, -0.014446395449340343, -0.08049562573432922, 0.06574036926031113, 0.15276560187339783, 0.003940001595765352, 0.11419229209423065, 0.024795018136501312, -0.09060154855251312, 0.05671989917755127, -0.017335401847958565, -0.054819975048303604, -0.05101044476032257, 0.08651863783597946, -0.0814906656742096, 0.11276482790708542, 0.14438049495220184, -0.01403913926333189, 0.0439833365380764, -0.04096287861466408, -0.01766793802380562, 0.018562428653240204, 0.032225657254457474, -0.015806473791599274, -0.09380143880844116, -0.02830067090690136, -0.018042394891381264, 0.0476561076939106, -0.1850459724664688, -0.051980145275592804, -0.0037691574543714523, 0.013980528339743614, -0.03667011111974716, 0.06270179897546768, 0.06431849300861359, 0.019749516621232033, -0.01855754666030407, -0.002406112616881728, 0.009413260035216808, 0.04878661036491394, -0.077412910759449, -0.04357154667377472 ]
null
null
transformers
# S2T-SMALL-LIBRISPEECH-ASR

`s2t-small-librispeech-asr` is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively.

## Intended uses & limitations

This model can be used for end-to-end speech recognition (ASR). See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

*Note: The feature extractor depends on [torchaudio](https://github.com/pytorch/audio) and the tokenizer depends on [sentencepiece](https://github.com/google/sentencepiece), so be sure to install those packages before running the examples.*

You could either install those as extra speech dependencies with `pip install "transformers[speech,sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr")

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)

input_features = processor(
    ds[0]["audio"]["array"],
    sampling_rate=16_000,
    return_tensors="pt"
).input_features  # Batch size 1
generated_ids = model.generate(input_features=input_features)

transcription = processor.batch_decode(generated_ids)
```

#### Evaluation on LibriSpeech Test

The following script shows how to evaluate this model on the [LibriSpeech](https://huggingface.co/datasets/librispeech_asr) *"clean"* and *"other"* test dataset.
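For reference, the metric reported below is the standard word error rate, WER = (S + D + I) / N, where S, D, and I are the numbers of substituted, deleted, and inserted words in the hypothesis relative to a reference of N words; the `wer` metric loaded from the `evaluate` library in the script computes this quantity.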
```python
from datasets import load_dataset
from evaluate import load
from transformers import Speech2TextForConditionalGeneration, Speech2TextProcessor

librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")  # change to "other" for the other test dataset
wer = load("wer")

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr").to("cuda")
# do_upper_case=True matches the upper-cased LibriSpeech reference transcripts.
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr", do_upper_case=True)

def map_to_pred(batch):
    # Extract filter bank features and move them to the GPU.
    features = processor(batch["audio"]["array"], sampling_rate=16000, padding=True, return_tensors="pt")
    input_features = features.input_features.to("cuda")
    attention_mask = features.attention_mask.to("cuda")

    # Transcribe autoregressively and decode the token ids back to text.
    gen_tokens = model.generate(input_features=input_features, attention_mask=attention_mask)
    batch["transcription"] = processor.batch_decode(gen_tokens, skip_special_tokens=True)[0]
    return batch

result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])

print("WER:", wer.compute(predictions=result["transcription"], references=result["text"]))
```

*Result (WER)*:

| "clean" | "other" |
|:-------:|:-------:|
| 4.3 | 9.0 |

## Training data

The S2T-SMALL-LIBRISPEECH-ASR is trained on [LibriSpeech ASR Corpus](https://www.openslr.org/12), a dataset consisting of approximately 1000 hours of 16kHz read English speech.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example.

The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
{"language": "en", "license": "mit", "tags": ["speech", "audio", "automatic-speech-recognition", "hf-asr-leaderboard"], "datasets": ["librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}], "model-index": [{"name": "s2t-small-librispeech-asr", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (clean)", "type": "librispeech_asr", "config": "clean", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 4.3, "name": "Test WER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "LibriSpeech (other)", "type": "librispeech_asr", "config": "other", "split": "test", "args": {"language": "en"}}, "metrics": [{"type": "wer", "value": 9.0, "name": "Test WER"}]}]}]}
automatic-speech-recognition
facebook/s2t-small-librispeech-asr
[ "transformers", "pytorch", "tf", "safetensors", "speech_to_text", "automatic-speech-recognition", "speech", "audio", "hf-asr-leaderboard", "en", "dataset:librispeech_asr", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en" ]
TAGS #transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us
S2T-SMALL-LIBRISPEECH-ASR ========================= 's2t-small-librispeech-asr' is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR). The S2T model was proposed in this paper and released in this repository. Model description ----------------- S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard autoregressive cross-entropy loss and generates the transcripts autoregressively. Intended uses & limitations --------------------------- This model can be used for end-to-end speech recognition (ASR). See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* *Note: The feature extractor depends on torchaudio and the tokenizer depends on sentencepiece, so be sure to install those packages before running the examples.* You could either install those as extra speech dependencies with 'pip install "transformers[speech,sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. #### Evaluation on LibriSpeech Test The following script shows how to evaluate this model on the LibriSpeech *"clean"* and *"other"* test dataset. *Result (WER)*: Training data ------------- The S2T-SMALL-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of approximately 1000 hours of 16kHz read English speech. Training procedure ------------------ ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization) is applied to each example. The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000. ### Training The model is trained with standard autoregressive cross-entropy loss and using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. ### BibTeX entry and citation info
[ "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\n*Note: The feature extractor depends on torchaudio and the tokenizer depends on sentencepiece\nso be sure to install those packages before running the examples.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-SMALL-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us \n", "### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\n*Note: The feature extractor depends on torchaudio and the tokenizer depends on sentencepiece\nso be sure to install those packages before running the examples.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.", "#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-SMALL-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------", "### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.", "### Training\n\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively.", "### BibTeX entry and citation info" ]
[ 103, 175, 104, 99, 50, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #safetensors #speech_to_text #automatic-speech-recognition #speech #audio #hf-asr-leaderboard #en #dataset-librispeech_asr #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #model-index #endpoints_compatible #has_space #region-us \n### How to use\n\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\n\n*Note: The feature extractor depends on torchaudio and the tokenizer depends on sentencepiece\nso be sure to install those packages before running the examples.*\n\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly\nwith 'pip install torchaudio sentencepiece'.#### Evaluation on LibriSpeech Test\n\n\nThe following script shows how to evaluate this model on the LibriSpeech\n*\"clean\"* and *\"other\"* test dataset.\n\n\n*Result (WER)*:\n\n\n\nTraining data\n-------------\n\n\nThe S2T-SMALL-LIBRISPEECH-ASR is trained on LibriSpeech ASR Corpus, a dataset consisting of\napproximately 1000 hours of 16kHz read English speech.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000." ]
[ -0.1287718117237091, 0.11210361123085022, -0.004443460144102573, 0.018533218652009964, 0.10395734012126923, -0.023513182997703552, 0.04967086389660835, 0.0837722197175026, 0.014560669660568237, 0.13091906905174255, 0.01672172173857689, 0.04066118225455284, 0.02792259305715561, 0.07873497158288956, 0.06612342596054077, -0.14311857521533966, -0.0032895621843636036, -0.06404012441635132, 0.13289527595043182, 0.12963514029979706, 0.09304415434598923, -0.07764996588230133, 0.04144739359617233, -0.0051986295729875565, -0.021298827603459358, -0.016934800893068314, 0.004414783790707588, -0.06104281544685364, 0.04593770205974579, 0.011506760492920876, 0.01700451783835888, -0.014066871255636215, 0.018393879756331444, -0.16520968079566956, 0.012972676195204258, 0.07886837422847748, 0.0678800493478775, 0.02990708500146866, 0.07957538962364197, -0.08833342790603638, 0.057254526764154434, -0.03921957314014435, 0.04145820811390877, 0.057579901069402695, -0.06024156138300896, -0.09827763587236404, -0.05034656450152397, -0.026116730645298958, 0.08681991696357727, 0.0924283042550087, -0.05813105031847954, 0.09191984683275223, 0.0021935219410806894, 0.0923585444688797, 0.15620477497577667, -0.1518387645483017, -0.010481438599526882, -0.08901156485080719, 0.042561136186122894, 0.1289743185043335, -0.07412246614694595, -0.003950608894228935, -0.0030861226841807365, -0.0020371475256979465, 0.06946354359388351, -0.02930162101984024, -0.023370934650301933, -0.05518638342618942, -0.15199241042137146, -0.03582170978188515, 0.16322165727615356, -0.062400296330451965, -0.09479779750108719, -0.17137521505355835, -0.03434470295906067, -0.07132909446954727, 0.037480857223272324, 0.022792523726820946, 0.023031016811728477, -0.010622051544487476, 0.012826134450733662, -0.0353194884955883, -0.07244787365198135, -0.037170279771089554, -0.0032896571792662144, 0.16240355372428894, 0.017932230606675148, -0.01987970806658268, -0.0017755139851942658, 0.1247703805565834, -0.011238439939916134, -0.12159592658281326, -0.040635865181684494, -0.007538813166320324, -0.09871093183755875, -0.020505540072917938, -0.03042355738580227, -0.16053952276706696, -0.016890687867999077, 0.14990320801734924, -0.013720286078751087, 0.050875093787908554, -0.033529091626405716, 0.007162683177739382, 0.00951041653752327, 0.19670170545578003, 0.02151629514992237, -0.04073864966630936, 0.010643279179930687, 0.051814913749694824, 0.08870299905538559, 0.018490366637706757, -0.002996281022205949, 0.030354905873537064, 0.04819321259856224, 0.10561417788267136, 0.010360748507082462, -0.0044107586145401, -0.06631891429424286, 0.007990841753780842, 0.0886760950088501, -0.17883998155593872, 0.034081701189279556, 0.047932032495737076, -0.05856063961982727, -0.027824413031339645, 0.09891829639673233, -0.03702462092041969, -0.10402307659387589, 0.12933211028575897, -0.02244666777551174, 0.04328443482518196, -0.07345830649137497, -0.1003679409623146, 0.016050005331635475, -0.052050475031137466, -0.05133236199617386, -0.07813124358654022, -0.06275741755962372, -0.041113562881946564, -0.020457351580262184, -0.0016519692726433277, 0.04991979897022247, -0.01443599071353674, 0.014018832705914974, -0.021731384098529816, -0.012695662677288055, 0.012261966243386269, -0.029285341501235962, 0.05651330202817917, 0.1114465743303299, 0.026308616623282433, 0.15013249218463898, 0.04850918799638748, -0.14922070503234863, 0.0072786626406013966, -0.19216221570968628, 0.110732801258564, -0.023378532379865646, -0.012684791348874569, -0.08856064081192017, -0.02457246743142605, 
[768-dimensional embedding vector omitted]
null
null
transformers
# S2T-SMALL-MUSTC-EN-DE-ST

`s2t-small-mustc-en-de-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to German text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to produce the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-de-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-de-st")

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

`s2t-small-mustc-en-de-st` is trained on the English-German subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example; a minimal sketch of this feature-extraction pipeline follows the card.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and uses [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.
To accelerate model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-de (BLEU score): 22.7

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
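To make the preprocessing described above concrete, here is a minimal sketch of 80-channel log mel filter-bank extraction followed by utterance-level CMVN. `torchaudio.compliance.kaldi.fbank` and its `num_mel_bins`/`sample_frequency` arguments are standard torchaudio API; the helper name and the epsilon value are illustrative assumptions, not this repository's canonical configuration.

```python
import torch
import torchaudio.compliance.kaldi as kaldi

def extract_fbank_with_cmvn(waveform: torch.Tensor, sample_rate: int = 16_000) -> torch.Tensor:
    # waveform: (1, num_samples) float tensor, as kaldi.fbank expects
    features = kaldi.fbank(
        waveform,
        num_mel_bins=80,              # 80-channel log mel filter banks
        sample_frequency=sample_rate,
    )                                 # -> (num_frames, 80)
    # Utterance-level CMVN: zero mean / unit variance per feature dimension,
    # computed over this utterance only
    mean = features.mean(dim=0, keepdim=True)
    std = features.std(dim=0, keepdim=True)
    return (features - mean) / (std + 1e-8)
```

This mirrors what `Speech2TextProcessor` does internally (up to the exact normalization details), which is why the usage example above can pass raw speech arrays directly to the processor.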
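The BLEU figure above can be reproduced with [sacrebleu](https://github.com/mjpost/sacrebleu) once the model's translations and the MuST-C references have been collected. A minimal sketch with made-up sentences; a real evaluation would run `generate` over the full MuST-C en-de test set:

```python
# pip install sacrebleu
import sacrebleu

# Hypothetical model outputs and reference translations, for illustration only
hypotheses = ["Das ist ein Test.", "Maschinelles Lernen ist spannend."]
references = [["Das ist ein Test.", "Maschinelles Lernen ist faszinierend."]]

# corpus_bleu takes the hypotheses and a list of reference streams
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")
```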
{"language": ["en", "de"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-de-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "de", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "de" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #de #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
[768-dimensional embedding vector omitted]
null
null
transformers
# S2T-SMALL-MUSTC-EN-ES-ST

`s2t-small-mustc-en-es-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 before they are fed into the encoder; a sketch of such a downsampler follows the card. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Spanish text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to produce the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-es-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-es-st")

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

`s2t-small-mustc-en-es-st` is trained on the English-Spanish subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and uses [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.
To accelerate model training and improve performance, the encoder is pre-trained for English ASR. A minimal sketch of SpecAugment-style masking follows the citation info below.

## Evaluation results

MuST-C test results for en-es (BLEU score): 27.2

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
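The "reduce the length of speech inputs by 3/4" behaviour of the convolutional downsampler can be illustrated with two strided `Conv1d` layers (stride 2 each gives an overall 4x reduction of the time axis). The layer widths, kernel size, and activations below are assumptions for illustration, not the exact architecture of this checkpoint:

```python
import torch
import torch.nn as nn

# A generic 4x convolutional downsampler over the time axis: each stride-2
# layer halves the number of frames, so 1000 frames become 250 (a 3/4 reduction).
downsampler = nn.Sequential(
    nn.Conv1d(80, 256, kernel_size=5, stride=2, padding=2),
    nn.ReLU(),
    nn.Conv1d(256, 256, kernel_size=5, stride=2, padding=2),
    nn.ReLU(),
)

features = torch.randn(1, 80, 1000)  # (batch, mel_bins, num_frames)
out = downsampler(features)
print(out.shape)                     # torch.Size([1, 256, 250])
```

Shortening the sequence this way keeps the quadratic cost of encoder self-attention manageable for long utterances.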
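SpecAugment, used during training as described above, masks random frequency bands and spans of time steps in the filter-bank features. A minimal sketch using `torchaudio.transforms.FrequencyMasking` and `TimeMasking` (both standard torchaudio transforms); the mask sizes echo the SpecAugment paper's LibriSpeech policy and are illustrative rather than this checkpoint's exact training values:

```python
import torch
import torchaudio.transforms as T

# features: (num_frames, 80) filter-bank features after CMVN
features = torch.randn(500, 80)

# torchaudio's masking transforms expect (..., freq, time)
spec = features.T.unsqueeze(0)                      # -> (1, 80, 500)

freq_mask = T.FrequencyMasking(freq_mask_param=27)  # zero out up to 27 mel channels
time_mask = T.TimeMasking(time_mask_param=100)      # zero out up to 100 frames

augmented = time_mask(freq_mask(spec)).squeeze(0).T # back to (num_frames, 80)
```

Masking is applied on the fly during training only; inference uses the unmasked features.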
{"language": ["en", "es"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-es-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "es", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "es" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #es #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
[768-dimensional embedding vector omitted]
0.1583428680896759, -0.05815548822283745, -0.17147529125213623, 0.03984106704592705, 0.006567533127963543, 0.11223107576370239, 0.09043798595666885, 0.07635907083749771, 0.05296647176146507, 0.05554994195699692, 0.10812416672706604, 0.05017809942364693, 0.01736866496503353, 0.015402969904243946, -0.014966047368943691, -0.023474950343370438, -0.019413849338889122, 0.054307397454977036, 0.01389274187386036, 0.21643196046352386, 0.06929614394903183, -0.09231306612491608, -0.03249470517039299, 0.1293979287147522, -0.09653853625059128, -0.13846077024936676, -0.15859875082969666, 0.16783155500888824, 0.10069048404693604, 0.06695396453142166, -0.0220196433365345, -0.11378509551286697, 0.007552230730652809, 0.18337398767471313, 0.14984393119812012, -0.030771315097808838, -0.0029784636572003365, -0.018367011100053787, 0.0292588509619236, -0.01944490149617195, 0.07479260116815567, 0.005352209787815809, 0.34189099073410034, 0.01662231981754303, -0.0030363043770194054, 0.03767811134457588, 0.008784178644418716, -0.095846988260746, 0.05435236915946007, -0.08306286484003067, -0.027286646887660027, 0.024500008672475815, 0.11571160703897476, -0.020393921062350273, -0.20420394837856293, -0.10145895928144455, 0.032892197370529175, -0.028074830770492554, 0.043240249156951904, 0.057414598762989044, 0.06618667393922806, 0.06652210652828217, 0.032027941197156906, -0.05710099637508392, 0.21277005970478058, 0.004421185702085495, -0.05823139473795891, 0.045724302530288696, 0.05122238025069237, -0.15058133006095886, 0.07383929193019867, -0.017717668786644936, 0.05309362709522247, 0.05654117465019226, 0.040238797664642334, -0.0856969952583313, 0.07309363782405853, -0.030417289584875107, -0.09326202422380447, 0.1282222419977188, 0.12756219506263733, 0.010647662915289402, 0.0907786414027214, 0.016413766890764236, -0.10659053176641464, 0.058802731335163116, 0.04199754819273949, -0.04310518875718117, -0.07493297010660172, 0.09772365540266037, -0.09590745717287064, 0.1211036965250969, 0.11832540482282639, -0.02854362688958645, 0.03445224091410637, -0.02881043590605259, 0.02166725881397724, 0.008168598636984825, 0.08395412564277649, -0.04769735783338547, -0.13316287100315094, -0.047716278582811356, -0.028082557022571564, 0.03228066861629486, -0.17770972847938538, 0.0000227019681915408, 0.001168438931927085, -0.020957574248313904, -0.05696090683341026, 0.06649862229824066, 0.044479697942733765, 0.04770035669207573, -0.014195060357451439, 0.067347452044487, -0.004580870736390352, 0.06912440806627274, -0.09492410719394684, -0.06645802408456802 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-FR-ST

`s2t-small-mustc-en-fr-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to French text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-fr-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-fr-st")

def map_to_array(batch):
    # read the raw waveform from disk and attach it to the example
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(
    inputs["input_features"],
    attention_mask=inputs["attention_mask"]
)

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The `s2t-small-mustc-en-fr-st` model is trained on the English-French subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.
To accelerate model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-fr (BLEU score): 32.9

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
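As a rough companion to the evaluation result above, the snippet below shows how one might compute a corpus-level BLEU score with [sacrebleu](https://github.com/mjpost/sacrebleu) (`pip install sacrebleu`). This is a minimal sketch, not the official MuST-C scoring pipeline; the placeholder sentences, and the assumption that `translations` would come from `processor.batch_decode` over the en-fr test set, are illustrative only.

```python
import sacrebleu

# Placeholder data: in practice, `translations` would be the decoded model
# outputs over the MuST-C en-fr test set, and `references` the gold French
# translations (a single reference stream is used here).
translations = ["un exemple de traduction produit par le modèle"]
references = [["une traduction de référence tirée de MuST-C"]]

bleu = sacrebleu.corpus_bleu(translations, references)
print(f"BLEU: {bleu.score:.1f}")  # the card reports 32.9 for en-fr
```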
{"language": ["en", "fr"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-fr-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "fr", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "fr" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #fr #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-FR-ST 's2t-small-mustc-en-fr-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository. ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end English speech to French text translation. See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* You could either install those as extra speech dependencies with 'pip install "transformers[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. ## Training data The s2t-small-mustc-en-fr-st model is trained on the English-French subset of MuST-C. MuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example. The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000. ### Training The model is trained with standard autoregressive cross-entropy loss, using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR. ## Evaluation results MuST-C test results for en-fr (BLEU score): 32.9 ### BibTeX entry and citation info
[ "# S2T-SMALL-MUSTC-EN-FR-ST\n\n's2t-small-mustc-en-fr-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to French text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-fr-st is trained on English-French subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-fr (BLEU score): 32.9", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #fr #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-FR-ST\n\n's2t-small-mustc-en-fr-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to French text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-fr-st is trained on English-French subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-fr (BLEU score): 32.9", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 42, 137, 118, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #fr #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-FR-ST\n\n's2t-small-mustc-en-fr-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to French text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.08354287594556808, 0.13763757050037384, -0.0055175162851810455, 0.00025734849623404443, 0.05646814778447151, -0.07256048917770386, 0.016632884740829468, 0.06766659766435623, -0.08317175507545471, 0.04140215367078781, 0.016650501638650894, 0.09147385507822037, 0.03847144544124603, 0.07437560707330704, 0.0029774345457553864, -0.2341141402721405, 0.04116290435194969, -0.05966596677899361, 0.019809871912002563, 0.0868556872010231, 0.13005392253398895, -0.034306518733501434, 0.07611782103776932, 0.04615914076566696, -0.0026666992343962193, 0.045687027275562286, 0.04755668714642525, -0.06725773960351944, 0.07248374074697495, 0.11022885143756866, 0.04535869508981705, 0.03625879064202309, 0.08183711022138596, -0.16101326048374176, 0.0015802339185029268, 0.05088405683636665, 0.0235955398529768, 0.04092467576265335, 0.03012435883283615, 0.03225795552134514, 0.09738708287477493, -0.004697443451732397, 0.011319938115775585, 0.10123197734355927, -0.029211292043328285, -0.04376230761408806, -0.032213058322668076, 0.022386696189641953, 0.15181313455104828, 0.09447365999221802, -0.037569135427474976, 0.003167682094499469, -0.032571133226156235, 0.05062150955200195, 0.04086112976074219, -0.17920978367328644, 0.005379086825996637, -0.11607149243354797, -0.02672436088323593, 0.005844632163643837, -0.05694785341620445, -0.009459535591304302, -0.02470662072300911, -0.002262615831568837, 0.06102798879146576, -0.07233265787363052, 0.022700389847159386, -0.06330060213804245, -0.12042175978422165, -0.019176362082362175, 0.11806683987379074, -0.00488609354943037, -0.10240958631038666, -0.12046992778778076, -0.04610225930809975, 0.02080496959388256, -0.05414915084838867, -0.06827033311128616, 0.015747472643852234, 0.029112670570611954, 0.04189872369170189, -0.13727544248104095, -0.1170731782913208, -0.006469039246439934, -0.08407018333673477, 0.10796034336090088, 0.043100252747535706, 0.009878532029688358, 0.012458923272788525, 0.05802391842007637, -0.09255459904670715, -0.023694731295108795, -0.024113569408655167, -0.09993762522935867, -0.10659269243478775, -0.007858105935156345, -0.01832614280283451, -0.19591952860355377, -0.027603983879089355, 0.12655717134475708, -0.13633598387241364, 0.034968052059412, 0.03516675531864166, 0.004691078793257475, 0.0034221389796584845, 0.17020165920257568, -0.0493512824177742, -0.10359998792409897, 0.03998937830328941, -0.03117298148572445, 0.0014730500988662243, 0.03704950958490372, -0.071534164249897, 0.0014362707734107971, -0.06412087380886078, 0.02725394070148468, 0.053773365914821625, 0.025708897039294243, 0.013747936114668846, -0.05911942943930626, 0.20918715000152588, -0.11269761621952057, 0.014715741388499737, 0.02506381832063198, -0.048248592764139175, 0.14648430049419403, 0.04420806095004082, 0.003485653782263398, -0.15576905012130737, 0.047987423837184906, -0.0057135228998959064, 0.026148471981287003, -0.06070942431688309, -0.10806620121002197, -0.023511042818427086, -0.04415997490286827, -0.05517347529530525, -0.09982125461101532, -0.09753367304801941, -0.027538182213902473, 0.01615561731159687, -0.03946983814239502, 0.024022312834858894, -0.08657509833574295, -0.05017838999629021, 0.00479558389633894, -0.03520215302705765, -0.15402892231941223, 0.003737127175554633, -0.024111144244670868, -0.01863022893667221, 0.029010623693466187, -0.07311911135911942, 0.028989523649215698, -0.05327702686190605, -0.017926841974258423, -0.22827591001987457, 0.1811130791902542, 0.014046813361346722, -0.0020168053451925516, -0.1137591153383255, -0.09386760741472244, 
-0.08933126926422119, 0.06257832795381546, 0.03163478896021843, 0.15753881633281708, -0.18735583126544952, -0.046856414526700974, 0.1912587285041809, -0.07565091550350189, 0.04647231101989746, 0.1749950349330902, 0.030252406373620033, 0.03332878276705742, 0.12531155347824097, 0.10529612749814987, 0.10716723650693893, -0.0741497352719307, -0.0908951461315155, 0.07216518372297287, -0.11377867311239243, 0.05735946074128151, 0.052298836410045624, -0.0834450051188469, 0.1032407134771347, 0.009005827829241753, -0.012210032902657986, -0.011109980754554272, 0.05964413285255432, -0.012375970371067524, 0.03995481878519058, -0.022553959861397743, 0.1250428557395935, -0.030222540721297264, -0.0013976560439914465, -0.0019964498933404684, -0.1301262080669403, 0.1464909017086029, 0.04542233794927597, -0.06505268067121506, 0.07790148258209229, -0.12254201620817184, -0.05990605801343918, 0.0057750544510781765, 0.0022659054957330227, -0.16845372319221497, 0.06200699135661125, -0.027064070105552673, -0.006561345420777798, 0.1306944191455841, 0.03197983279824257, 0.05664653703570366, 0.01516772247850895, -0.0034062599297612906, 0.019482078030705452, 0.08019589632749557, 0.0014747661771252751, 0.0008986529428511858, -0.05768692493438721, -0.009897731244564056, -0.016011683270335197, 0.13177819550037384, -0.0851033627986908, 0.029704343527555466, 0.009827079251408577, 0.05649023503065109, 0.016778290271759033, -0.022243160754442215, -0.021135473623871803, -0.02275102026760578, 0.008760534226894379, -0.022233346477150917, -0.00943794846534729, 0.016178075224161148, -0.020336253568530083, 0.12391895800828934, -0.12929092347621918, -0.1663569211959839, 0.08687324821949005, -0.046202272176742554, -0.03150280937552452, 0.04933403059840202, -0.029421895742416382, -0.005207146517932415, 0.03982510045170784, -0.16786658763885498, 0.265857458114624, 0.064672090113163, 0.14722996950149536, -0.069303959608078, -0.061059847474098206, -0.0362265519797802, -0.06669197976589203, 0.02033626101911068, 0.009398853406310081, 0.05328180640935898, -0.1728236973285675, -0.004889909643679857, 0.023602843284606934, -0.0028011472895741463, 0.10635468363761902, 0.023876328021287918, -0.06591861695051193, 0.019495533779263496, 0.027602246031165123, 0.005036517977714539, -0.017500896006822586, -0.01268150843679905, 0.0014197996351867914, 0.01797530986368656, 0.030144043266773224, 0.04408758133649826, -0.0606209971010685, 0.10120715200901031, 0.06437987089157104, -0.0635053738951683, -0.04079186171293259, 0.004045977257192135, 0.0030810371972620487, 0.058752622455358505, 0.0089084068313241, 0.013030528090894222, -0.03750613331794739, -0.015026353299617767, -0.10801435261964798, 0.11285218596458435, -0.17335079610347748, -0.3019781708717346, -0.17369817197322845, -0.04766885191202164, -0.04288632422685623, 0.06577224284410477, 0.05740196257829666, -0.10422277450561523, -0.08132478594779968, -0.09681680053472519, 0.05255778506398201, -0.056193239986896515, -0.04438785836100578, -0.09477151930332184, 0.053219955414533615, 0.04526239633560181, -0.08761335909366608, 0.004651248455047607, -0.0001519869256298989, -0.035495318472385406, -0.03849669173359871, 0.011736812070012093, -0.03685690462589264, 0.1472119688987732, -0.02644205279648304, -0.04455145448446274, -0.023623991757631302, 0.14410796761512756, -0.029355784878134727, 0.11552160233259201, 0.17858798801898956, -0.07539310306310654, 0.033635739237070084, 0.12846249341964722, 0.024966895580291748, -0.011990061029791832, 0.0010160207748413086, -0.0006964555359445512, -0.027299415320158005, 
-0.18070659041404724, -0.0708240270614624, -0.05449528992176056, 0.01792106404900551, 0.02896309085190296, 0.02040186896920204, 0.038991186767816544, 0.004103550687432289, -0.07290882617235184, -0.0019954226445406675, 0.05685011297464371, 0.07996392250061035, 0.19855591654777527, -0.05190734192728996, 0.07573775202035904, -0.06606946140527725, -0.04287324845790863, 0.10916325449943542, -0.03405885770916939, 0.10457039624452591, 0.002538635628297925, 0.19674624502658844, 0.03455299139022827, -0.014928540214896202, 0.011706228367984295, 0.06647767126560211, -0.025610432028770447, 0.010393748059868813, -0.011530946008861065, -0.039343494921922684, -0.03274036943912506, 0.0730426236987114, 0.06904522329568863, -0.047439221292734146, 0.007296515163034201, 0.057639241218566895, 0.07442065328359604, 0.10410166531801224, 0.059059832245111465, -0.13635264337062836, -0.08152168244123459, 0.006232311949133873, -0.15546971559524536, -0.03626105934381485, 0.01761181652545929, 0.13084706664085388, -0.06897985190153122, 0.02126745507121086, 0.011684771627187729, 0.08414579927921295, -0.08247394114732742, 0.04601702839136124, 0.030729440972208977, 0.16520409286022186, 0.0373716801404953, 0.022649891674518585, -0.025247246026992798, 0.03849935159087181, 0.015675073489546776, 0.06041787192225456, -0.00902906246483326, 0.07530227303504944, -0.008514455519616604, 0.05421418324112892, 0.12207384407520294, 0.004694199655205011, -0.1285889744758606, 0.02323528192937374, -0.10143256187438965, -0.03875351697206497, 0.10704046487808228, 0.010767998173832893, 0.05302409082651138, -0.055039677768945694, -0.05924515053629875, -0.05910526216030121, -0.10946135222911835, -0.06666692346334457, -0.1901777684688568, 0.04200536757707596, 0.021326681599020958, -0.003254050388932228, 0.0028872007969766855, 0.06347303092479706, -0.06102272868156433, 0.21273525059223175, -0.18375127017498016, -0.09021113812923431, -0.10760907828807831, -0.04410729557275772, 0.14257670938968658, -0.04961289465427399, 0.09479791671037674, 0.05126618593931198, 0.07464903593063354, 0.003973542246967554, -0.04430816322565079, 0.052871569991111755, -0.12534503638744354, 0.01485744770616293, -0.02088545821607113, 0.18656225502490997, 0.00856270082294941, 0.06646924465894699, 0.05617499724030495, -0.004437730181962252, -0.0030345283448696136, -0.11289960891008377, -0.09253633767366409, 0.07381761074066162, 0.002805073745548725, -0.00032719923183321953, -0.05748949944972992, -0.11889041215181351, -0.09342794120311737, 0.05175042524933815, 0.15324059128761292, 0.08998528867959976, -0.10178066790103912, 0.137832909822464, 0.0835174173116684, -0.04601135849952698, -0.17862620949745178, -0.07237746566534042, 0.0759669691324234, 0.007061982527375221, 0.0021356893703341484, -0.1593708097934723, 0.031992677599191666, -0.013081436976790428, -0.008727796375751495, -0.03570501133799553, -0.21641574800014496, -0.11118103563785553, 0.03217998519539833, -0.041156869381666183, 0.02094366028904915, 0.01683107204735279, -0.0499359630048275, -0.05953475460410118, -0.04551885277032852, 0.0032238473650068045, -0.0517570823431015, 0.026355603709816933, 0.0826253667473793, 0.008480353280901909, 0.04914331063628197, 0.009172399528324604, 0.04942004382610321, -0.0145327840000391, -0.052200403064489365, -0.017538713291287422, 0.05980447679758072, 0.008668111637234688, -0.01777905970811844, 0.19677788019180298, -0.0178468506783247, 0.022690515965223312, -0.08383473753929138, -0.05862576887011528, -0.021832706406712532, 0.0016408670926466584, -0.0008815730107016861, 
-0.0003688309225253761, 0.0020042001269757748, -0.014210589230060577, 0.061083439737558365, 0.022379059344530106, -0.11660131812095642, -0.14540937542915344, -0.16114820539951324, 0.21199458837509155, 0.2305595576763153, 0.032675743103027344, -0.06613823771476746, 0.0241071954369545, 0.013892552815377712, -0.007748004049062729, -0.08820497244596481, 0.06542092561721802, 0.026157191023230553, -0.016758091747760773, 0.05914564058184624, -0.018340591341257095, -0.03474094346165657, 0.010731340385973454, 0.06587988883256912, -0.03720902279019356, -0.1291283518075943, -0.04572509974241257, 0.014608269557356834, -0.07557329535484314, 0.023364519700407982, 0.1878824383020401, -0.020880797877907753, -0.015848614275455475, -0.01783105544745922, 0.02321501076221466, -0.08277765661478043, 0.1025398001074791, -0.04020882397890091, 0.01167347002774477, -0.0718323364853859, 0.03847791999578476, 0.061397626996040344, -0.017671732231974602, 0.017563756555318832, 0.1272885501384735, -0.1041446402668953, -0.06731193512678146, -0.16964350640773773, -0.1609591543674469, -0.05245823413133621, -0.046681489795446396, -0.026839274913072586, -0.04687384143471718, 0.01005506794899702, 0.029738714918494225, -0.022112412378191948, 0.017513757571578026, -0.06996544450521469, -0.01226549781858921, -0.07765363901853561, 0.03805183991789818, 0.06914134323596954, -0.01159488782286644, -0.04163511469960213, 0.08919315040111542, -0.03284105658531189, 0.011880303733050823, -0.02290479652583599, 0.008444353938102722, -0.0016174643533304334, -0.00007561968959635124, -0.09034748375415802, -0.02761603146791458, -0.052905481308698654, -0.05341731011867523, 0.038737718015909195, 0.005926183424890041, -0.0006284060073085129, 0.020280158147215843, -0.05459625646471977, -0.024782570078969002, -0.014694924466311932, 0.016249917447566986, -0.07308291643857956, 0.08214713633060455, 0.03367622569203377, -0.05534454062581062, 0.09129345417022705, 0.08533833175897598, -0.06464473903179169, 0.08859305083751678, 0.0316748283803463, -0.1013500839471817, -0.008479605428874493, 0.07402447611093521, 0.040140800178050995, -0.040974099189043045, 0.022212687879800797, 0.05461675301194191, -0.019624190405011177, -0.0793878436088562, -0.0032312539406120777, -0.03431455045938492, 0.1032269224524498, -0.005329547915607691, -0.006953537929803133, -0.05673891678452492, 0.009613771922886372, 0.049345146864652634, -0.004559248220175505, 0.14326035976409912, -0.05453867465257645, 0.035852331668138504, -0.06495440006256104, 0.03932355344295502, 0.007252346724271774, 0.002599482424557209, 0.01177877839654684, -0.09935764223337173, 0.02665829472243786, 0.010355842299759388, 0.05443315580487251, -0.015593702904880047, 0.07644554227590561, 0.06651903688907623, -0.1058635339140892, -0.003224555402994156, 0.04727088287472725, 0.08840034902095795, 0.033448923379182816, -0.017389534041285515, -0.11625248938798904, -0.004980756901204586, -0.004641440697014332, -0.1036270335316658, 0.05819776654243469, 0.10859928280115128, 0.03921763598918915, 0.11311300098896027, 0.013147076591849327, 0.04736020416021347, -0.13548141717910767, 0.05052797868847847, -0.07710374891757965, -0.0025186275597661734, -0.06966862082481384, 0.09025353938341141, 0.1684250831604004, -0.09258197993040085, 0.0918465331196785, 0.08805544674396515, -0.05630861967802048, -0.14047297835350037, -0.10040242969989777, -0.019010065123438835, -0.0780264139175415, -0.01696360856294632, -0.07143829017877579, 0.041814036667346954, -0.08168264478445053, 0.07604537159204483, 0.001093603204935789, 
0.16092438995838165, -0.10513380169868469, -0.14516213536262512, 0.055008359253406525, -0.009601100347936153, 0.07724637538194656, 0.09350863099098206, 0.06537565588951111, 0.05398985370993614, 0.017949890345335007, 0.1075984388589859, 0.03948838263750076, 0.02151774801313877, 0.022419525310397148, -0.012389817275106907, -0.02352108806371689, -0.01941288448870182, 0.045900288969278336, 0.008737935684621334, 0.2058376967906952, 0.07694177329540253, -0.11087921261787415, -0.02842125855386257, 0.10532774776220322, -0.1018684133887291, -0.14850760996341705, -0.17686901986598969, 0.1660955548286438, 0.10819821059703827, 0.059852708131074905, -0.02500031888484955, -0.09584704041481018, 0.0057174284011125565, 0.17765949666500092, 0.1693613976240158, -0.030122723430395126, 0.004975501447916031, -0.003159546758979559, 0.023189906030893326, -0.003943021409213543, 0.04339442029595375, 0.02632562257349491, 0.3140505254268646, 0.010128329508006573, -0.001163802808150649, 0.03362897038459778, -0.009533882141113281, -0.07995843887329102, 0.055668044835329056, -0.06254927814006805, -0.0224052295088768, 0.015710880979895592, 0.09130805730819702, -0.01843683421611786, -0.19555649161338806, -0.10364126414060593, 0.034401390701532364, -0.008344143629074097, 0.05450901761651039, 0.00935911014676094, 0.06706280261278152, 0.055092185735702515, 0.021822599694132805, -0.06811924278736115, 0.2315930426120758, -0.006605873350054026, -0.05369165539741516, 0.049044858664274216, 0.04141668975353241, -0.14957088232040405, 0.09101884812116623, -0.00934048555791378, 0.0743662565946579, 0.054420165717601776, 0.026571158319711685, -0.07253387570381165, 0.05083475261926651, -0.02356216125190258, -0.09460356831550598, 0.08120628446340561, 0.1648644059896469, 0.0029982354026287794, 0.10616946965456009, -0.007400404661893845, -0.08274845033884048, 0.06367465853691101, 0.004151877015829086, -0.06900473684072495, -0.07246595621109009, 0.10387735068798065, -0.09390883892774582, 0.1346285492181778, 0.1316467523574829, -0.02121478132903576, 0.040681142359972, -0.041637688875198364, 0.0011069192551076412, 0.0014553143410012126, 0.040255870670080185, -0.045345909893512726, -0.11085492372512817, -0.011343250051140785, -0.037929654121398926, 0.03563636913895607, -0.17999069392681122, -0.010767756029963493, -0.008624935522675514, 0.00021112646209076047, -0.04447308927774429, 0.07127596437931061, 0.053493205457925797, 0.034473713487386703, -0.020607249811291695, 0.06268102675676346, -0.0018940168665722013, 0.05932946503162384, -0.07998606562614441, -0.06575468182563782 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-IT-ST

`s2t-small-mustc-en-it-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Italian text translation. See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with `pip install "transformers[speech, sentencepiece]"` or install the packages separately with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-it-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-it-st")

def map_to_array(batch):
    # read the raw waveform from disk and attach it to the example
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(
    inputs["input_features"],
    attention_mask=inputs["attention_mask"]
)

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The `s2t-small-mustc-en-it-st` model is trained on the English-Italian subset of [MuST-C](https://ict.fbk.eu/must-c/). MuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779). The encoder receives speech features, and the decoder generates the transcripts autoregressively.
To accelerate model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-it (BLEU score): 22.7

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
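To make the preprocessing and SpecAugment steps above concrete, here is a minimal sketch of what 80-channel log mel filter bank extraction, utterance-level CMVN, and SpecAugment-style masking can look like with torchaudio. The input file name, the mask widths, and the normalization epsilon are illustrative assumptions, not values taken from the released fairseq pipeline.

```python
import torchaudio
import torchaudio.compliance.kaldi as kaldi

# Hypothetical 16 kHz mono input file
waveform, sample_rate = torchaudio.load("audio.wav")

# Kaldi-compliant 80-channel log mel filter bank features: (num_frames, 80)
features = kaldi.fbank(waveform, num_mel_bins=80, sample_frequency=sample_rate)

# Utterance-level CMVN: zero mean, unit variance per feature dimension
features = (features - features.mean(dim=0)) / (features.std(dim=0) + 1e-8)

# SpecAugment-style frequency and time masking (training-time only);
# the mask parameters below are assumptions, not the fairseq defaults
freq_mask = torchaudio.transforms.FrequencyMasking(freq_mask_param=27)
time_mask = torchaudio.transforms.TimeMasking(time_mask_param=100)
augmented = time_mask(freq_mask(features.T.unsqueeze(0))).squeeze(0).T

print(features.shape, augmented.shape)  # both (num_frames, 80)
```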
{"language": ["en", "it"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-it-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "it", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "it" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #it #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-IT-ST 's2t-small-mustc-en-it-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST). The S2T model was proposed in this paper and released in this repository. ## Model description S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the transcripts/translations autoregressively. ## Intended uses & limitations This model can be used for end-to-end English speech to Italian text translation. See the model hub to look for other S2T checkpoints. ### How to use As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model. *Note: The 'Speech2TextProcessor' object uses torchaudio to extract the filter bank features. Make sure to install the 'torchaudio' package before running this example.* You could either install those as extra speech dependencies with 'pip install "transformers[speech, sentencepiece]"' or install the packages separately with 'pip install torchaudio sentencepiece'. ## Training data The s2t-small-mustc-en-it-st model is trained on the English-Italian subset of MuST-C. MuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems for speech translation from English into several languages. For each target language, MuST-C comprises several hundred hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual transcriptions and translations. ## Training procedure ### Preprocessing The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization) is then applied to each example. The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000. ### Training The model is trained with standard autoregressive cross-entropy loss, using SpecAugment. The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate model training and improve performance, the encoder is pre-trained for English ASR. ## Evaluation results MuST-C test results for en-it (BLEU score): 22.7 ### BibTeX entry and citation info
[ "# S2T-SMALL-MUSTC-EN-IT-ST\n\n's2t-small-mustc-en-it-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Italian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-it-st is trained on English-Italian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-it (BLEU score): 22.7", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #it #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-IT-ST\n\n's2t-small-mustc-en-it-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Italian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-it-st is trained on English-Italian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-it (BLEU score): 22.7", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 42, 137, 117, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #it #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-IT-ST\n\n's2t-small-mustc-en-it-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Italian text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.10699060559272766, 0.15520843863487244, -0.004556768108159304, 0.003333204658702016, 0.07556705176830292, -0.04177061468362808, 0.03832463547587395, 0.10932627320289612, -0.05168046057224274, 0.062207769602537155, -0.019381573423743248, 0.10358749330043793, 0.06969448179006577, 0.08393065631389618, -0.02773837186396122, -0.2485976219177246, 0.0511484332382679, -0.06471798568964005, 0.03725079074501991, 0.09239630401134491, 0.13173848390579224, -0.056543171405792236, 0.06984465569257736, 0.012214905582368374, -0.0301524568349123, 0.04248622804880142, 0.042117707431316376, -0.06464311480522156, 0.0711185485124588, 0.11516227573156357, 0.04887135699391365, 0.030305510386824608, 0.0968329906463623, -0.14430366456508636, 0.0032854238525032997, 0.06619013100862503, 0.03654775023460388, 0.032175105065107346, 0.045980893075466156, 0.05242529883980751, 0.09911853075027466, 0.01931823417544365, -0.0038271937519311905, 0.0946950912475586, -0.02773752063512802, -0.01672270894050598, -0.028827931731939316, 0.018913650885224342, 0.15697839856147766, 0.09898050874471664, -0.044390659779310226, 0.03164717182517052, -0.037033308297395706, 0.051253337413072586, 0.03833159804344177, -0.15630023181438446, -0.007602619472891092, -0.13497865200042725, -0.04469635337591171, 0.0384225957095623, -0.03617596626281738, -0.0000685959093971178, -0.02904125675559044, -0.01176112424582243, 0.00739112077280879, -0.0418221540749073, 0.04282016679644585, -0.07284224033355713, -0.12430699914693832, -0.0162793081253767, 0.13101686537265778, -0.018340447917580605, -0.09817563742399216, -0.11910435557365417, -0.04547012597322464, 0.0347919836640358, -0.03367287665605545, -0.06665688753128052, -0.009707920253276825, 0.04477398097515106, 0.072169229388237, -0.10990125685930252, -0.12887665629386902, -0.01190357655286789, -0.06344055384397507, 0.1754334270954132, 0.06401169300079346, 0.00667151901870966, 0.014006738550961018, 0.05925530940294266, -0.06773468106985092, -0.03176454082131386, -0.014012791216373444, -0.07441193610429764, -0.09848833829164505, -0.014985600486397743, -0.03100043721497059, -0.23663529753684998, -0.029215388000011444, 0.10058321803808212, -0.06616916507482529, 0.013282254338264465, 0.03980531916022301, 0.02279571257531643, 0.004171833395957947, 0.1620391458272934, -0.07236627489328384, -0.10790305584669113, 0.03909857198596001, -0.03480115532875061, 0.027236485853791237, 0.023304933682084084, -0.0989680215716362, -0.010569515638053417, -0.06876703351736069, 0.048294227570295334, 0.0773586556315422, 0.03391226381063461, -0.004891377408057451, -0.07702583819627762, 0.16928303241729736, -0.1331551969051361, 0.020052870735526085, 0.025495978072285652, -0.06658899784088135, 0.16801738739013672, 0.015144595876336098, -0.010945294983685017, -0.16910554468631744, 0.06308027356863022, -0.013348951004445553, 0.02712858095765114, -0.07965584099292755, -0.10822390764951706, -0.0007969735888764262, -0.08818963915109634, -0.049524612724781036, -0.10663334280252457, -0.09884826093912125, -0.02062070183455944, -0.0008216720307245851, -0.05609983205795288, 0.051521431654691696, -0.07403415441513062, -0.01984667032957077, -0.020090308040380478, -0.04919574782252312, -0.1592138558626175, 0.0087947528809309, 0.002500637201592326, -0.013513405807316303, 0.03481881320476532, -0.07806769013404846, 0.029736772179603577, -0.07803203910589218, -0.03507170081138611, -0.2046424001455307, 0.1598396748304367, 0.0252227745950222, 0.019373010843992233, -0.10386354476213455, -0.072702556848526, -0.08158222585916519, 
0.05651870742440224, 0.046433497220277786, 0.17314939200878143, -0.20493942499160767, -0.031118884682655334, 0.1702422797679901, -0.09725017845630646, 0.03746499866247177, 0.1725320667028427, 0.046322502195835114, 0.08136944472789764, 0.14395035803318024, 0.10811915993690491, 0.08351536095142365, -0.07552531361579895, -0.10358049720525742, 0.038514215499162674, -0.12457748502492905, 0.04891457036137581, 0.036807794123888016, -0.06566408276557922, 0.1699349582195282, 0.0247547198086977, -0.04157005250453949, -0.01606232300400734, 0.053597308695316315, -0.02518419176340103, 0.03452971205115318, -0.048430852591991425, 0.09892682731151581, -0.01950277015566826, 0.008098849095404148, -0.0040558939799666405, -0.13704979419708252, 0.10162397474050522, 0.058021772652864456, -0.08187884092330933, 0.09124770760536194, -0.14767777919769287, -0.005695806350558996, 0.0026669155340641737, 0.010044611059129238, -0.16204501688480377, 0.064996138215065, -0.007572921458631754, 0.014471636153757572, 0.1180897131562233, 0.013613859191536903, 0.055550530552864075, 0.009535356424748898, 0.0006207704427652061, 0.01816765032708645, 0.09090358763933182, 0.0008733487338759005, -0.013828971423208714, -0.061803121119737625, -0.04354984313249588, -0.004165362101048231, 0.1237213984131813, -0.08059236407279968, 0.04482961818575859, 0.03212469443678856, 0.0789911225438118, 0.00012374647485557944, -0.032256558537483215, -0.04042677953839302, -0.03102567233145237, -0.010166454128921032, -0.038733404129743576, -0.008551504462957382, 0.021308083087205887, -0.03493561968207359, 0.09728501737117767, -0.14545023441314697, -0.15594519674777985, 0.0774693638086319, -0.05395196005702019, -0.01913360133767128, 0.0342838354408741, -0.016428669914603233, -0.02022373490035534, 0.04054548591375351, -0.17709195613861084, 0.22670462727546692, 0.05551374331116676, 0.10964568704366684, -0.07226791232824326, -0.02139841578900814, -0.024935314431786537, -0.06945439428091049, 0.001613214029930532, -0.01731174625456333, 0.07250062376260757, -0.194308340549469, 0.007971413433551788, 0.024930329993367195, -0.0178262609988451, 0.12160027772188187, 0.04660577327013016, -0.09027711302042007, 0.03393915295600891, 0.03949179872870445, 0.016158660873770714, -0.026251506060361862, -0.04083552956581116, 0.01721903868019581, 0.01261681318283081, 0.03231823816895485, 0.05846729502081871, -0.060549844056367874, 0.08254221826791763, 0.05835390090942383, -0.05036811903119087, -0.04903170466423035, -0.030306311324238777, -0.022029221057891846, 0.07448454946279526, 0.01199005451053381, 0.008488682098686695, -0.030184030532836914, -0.012282389216125011, -0.08967320621013641, 0.0961846187710762, -0.15988975763320923, -0.30187034606933594, -0.17542633414268494, -0.05433940142393112, -0.03037344478070736, 0.07742222398519516, 0.037898823618888855, -0.09695024788379669, -0.08162690699100494, -0.07496140152215958, 0.06115708500146866, -0.05416630581021309, -0.04684915393590927, -0.0985778272151947, 0.0696704238653183, 0.02415573224425316, -0.06307518482208252, 0.009393923915922642, 0.003016124712303281, -0.05117000266909599, -0.040102679282426834, -0.025164199993014336, -0.0019468121463432908, 0.16381527483463287, -0.02196408435702324, -0.01829831674695015, -0.01795382983982563, 0.08847051858901978, -0.05810815840959549, 0.13502661883831024, 0.2071022391319275, -0.10362999886274338, 0.020340535789728165, 0.13097944855690002, 0.01623953878879547, -0.02733423560857773, -0.011710016056895256, -0.021263426169753075, -0.024778852239251137, -0.18557052314281464, 
-0.06767489016056061, -0.05787132307887077, 0.01921842247247696, 0.020817218348383904, 0.0180845707654953, 0.01742263324558735, 0.023346299305558205, -0.08103382587432861, -0.048478879034519196, 0.06438659131526947, 0.07306607067584991, 0.15470804274082184, -0.05192812904715538, 0.07791011035442352, -0.05606812238693237, -0.02749595418572426, 0.11804582923650742, 0.009471341036260128, 0.13558261096477509, 0.005190160591155291, 0.22803407907485962, 0.017705168575048447, 0.003315101843327284, 0.014795544557273388, 0.07406952232122421, -0.0034904559142887592, -0.006892272271215916, 0.002233406761661172, -0.043722692877054214, -0.04484622925519943, 0.0846702829003334, 0.07069452852010727, -0.05920317396521568, 0.028984321281313896, 0.0725579708814621, 0.07938797771930695, 0.14399884641170502, 0.046413883566856384, -0.15485087037086487, -0.07425868511199951, 0.009796198457479477, -0.12912331521511078, -0.033955905586481094, -0.005479985382407904, 0.128883495926857, -0.04992486909031868, 0.008860750123858452, 0.008786311373114586, 0.08273494243621826, -0.1203175038099289, 0.03026607632637024, 0.02633669599890709, 0.16314902901649475, 0.035315755754709244, 0.05327687785029411, -0.028176775202155113, 0.04432829096913338, 0.01961923949420452, 0.10545937716960907, -0.046718355268239975, 0.07277486473321915, -0.015448519960045815, 0.06568410992622375, 0.12545175850391388, -0.008371494710445404, -0.12342294305562973, -0.017778612673282623, -0.12629255652427673, -0.04737146571278572, 0.13798527419567108, 0.024395376443862915, 0.05728680267930031, -0.06412636488676071, -0.03573421388864517, -0.05119391530752182, -0.07359729707241058, -0.027759937569499016, -0.19124945998191833, 0.06269357353448868, 0.0484553761780262, -0.013457688502967358, 0.007419925648719072, 0.040853746235370636, -0.0315699577331543, 0.22807878255844116, -0.18372496962547302, -0.08152226358652115, -0.11979638040065765, -0.014704389497637749, 0.12017617374658585, -0.04989389330148697, 0.06424380838871002, 0.055264316499233246, 0.07701212912797928, 0.007567611988633871, -0.034052565693855286, 0.07225373387336731, -0.1126248836517334, 0.00231747655197978, -0.037402067333459854, 0.15889644622802734, 0.014628560282289982, 0.056835468858480453, 0.06508097797632217, -0.014600700698792934, 0.008984196931123734, -0.07992415130138397, -0.07246583700180054, 0.08558320999145508, 0.004452741239219904, 0.018788693472743034, -0.07843629270792007, -0.11680972576141357, -0.10592493414878845, 0.028405562043190002, 0.13754846155643463, 0.08104222267866135, -0.09152092784643173, 0.09060132503509521, 0.07696034014225006, -0.06107454001903534, -0.1810780018568039, -0.04160176217556, 0.05914178490638733, 0.025433529168367386, 0.0002210115344496444, -0.15382134914398193, 0.022986527532339096, 0.026957036927342415, -0.019567616283893585, -0.013090859167277813, -0.2324029952287674, -0.09674655646085739, 0.011038524098694324, -0.01940658502280712, 0.013597385957837105, 0.018549112603068352, -0.037543125450611115, -0.060457341372966766, -0.029060276225209236, -0.054260894656181335, -0.013188268058001995, 0.029878666624426842, 0.0903264656662941, -0.006291453260928392, 0.05795734003186226, -0.0023078788071870804, 0.0458279587328434, -0.01974664442241192, -0.06409834325313568, -0.039666835218667984, 0.054992709308862686, 0.010080035775899887, -0.0022792548406869173, 0.1905720978975296, -0.021921178326010704, -0.00798704568296671, -0.05623543635010719, -0.07501932978630066, -0.013636043295264244, 0.006890460848808289, -0.0003258140932302922, 
-0.011052236892282963, 0.021337345242500305, -0.0017375739989802241, 0.04426886886358261, 0.037778519093990326, -0.10091053694486618, -0.17612260580062866, -0.1600082963705063, 0.20710153877735138, 0.23345716297626495, 0.019586613401770592, -0.04742160066962242, -0.008382018655538559, 0.013159054331481457, -0.015250998549163342, -0.034243445843458176, 0.0662706196308136, 0.03362538665533066, -0.02718699909746647, 0.04736443981528282, -0.019838299602270126, -0.05339893698692322, 0.022522585466504097, 0.0825360044836998, -0.0172638688236475, -0.12789250910282135, -0.036275435239076614, 0.031531620770692825, -0.08015623688697815, 0.06923018395900726, 0.18874521553516388, -0.022591713815927505, -0.009446103125810623, -0.028052223846316338, 0.044997043907642365, -0.0814628005027771, 0.13624216616153717, -0.0381692573428154, -0.01263702753931284, -0.0664450004696846, 0.0890444740653038, 0.06217208877205849, -0.017204295843839645, 0.015625152736902237, 0.12749920785427094, -0.10616146773099899, -0.06649652123451233, -0.1662019044160843, -0.18933400511741638, -0.05051526799798012, -0.07205013185739517, -0.05581209808588028, -0.03977157175540924, -0.009124343283474445, -0.01787533238530159, -0.017879236489534378, 0.02251690812408924, -0.10355841368436813, -0.009845554828643799, -0.09631063789129257, 0.026249242946505547, 0.081914983689785, -0.00438185129314661, -0.05857367068529129, 0.11224252730607986, -0.02473554015159607, -0.03831448405981064, -0.021613886579871178, -0.002806334290653467, 0.013408547267317772, 0.02108503133058548, -0.075583316385746, -0.048391103744506836, -0.05685381591320038, -0.06782226264476776, 0.026189984753727913, -0.0012347577139735222, -0.014844726771116257, 0.021162863820791245, -0.06743791699409485, -0.01977655105292797, -0.03170720115303993, -0.006394080352038145, -0.06458746641874313, 0.0910237655043602, 0.013672292232513428, -0.05056639388203621, 0.10087959468364716, 0.09039395302534103, -0.07132967561483383, 0.08425161987543106, 0.02792861871421337, -0.07138092070817947, -0.011228599585592747, 0.06958790123462677, 0.04065626487135887, -0.018903082236647606, 0.026355264708399773, 0.0832299068570137, -0.038320451974868774, -0.07444653660058975, 0.004550482612103224, -0.029297707602381706, 0.1306934654712677, 0.011827092617750168, -0.009491046890616417, -0.04704379290342331, 0.029896067455410957, 0.06863313913345337, -0.012607527896761894, 0.11980818957090378, -0.034549780189991, 0.04734454303979874, -0.059245139360427856, 0.04968433454632759, 0.020870156586170197, -0.01491747610270977, 0.009374776855111122, -0.08658048510551453, 0.040673110634088516, 0.012097380124032497, 0.019509054720401764, -0.017094362527132034, 0.06657884269952774, 0.06302721053361893, -0.07765166461467743, 0.01903729885816574, 0.047537919133901596, 0.11356992274522781, 0.02531629428267479, -0.004193025641143322, -0.10317414999008179, -0.005722952075302601, -0.027613885700702667, -0.09258042275905609, 0.043376583606004715, 0.12784788012504578, 0.049606021493673325, 0.12238311767578125, 0.049146272242069244, 0.04337973892688751, -0.16791664063930511, 0.04049966111779213, -0.10212120413780212, 0.005431120749562979, -0.06158651039004326, 0.10395660996437073, 0.1794268786907196, -0.09469849616289139, 0.055191002786159515, 0.06425925344228745, -0.06730750948190689, -0.12876051664352417, -0.16580241918563843, -0.012295886874198914, -0.07622158527374268, -0.013923615217208862, -0.07903897017240524, 0.04588134214282036, -0.07718878984451294, 0.05925079062581062, -0.04050071910023689, 
0.17343474924564362, -0.08085134625434875, -0.139194518327713, 0.0708826407790184, -0.012819860130548477, 0.08894246071577072, 0.09206318855285645, 0.018280616030097008, 0.038611624389886856, 0.013311592862010002, 0.12122620642185211, 0.028661450371146202, 0.023367250338196754, 0.007406580727547407, -0.018999015912413597, -0.02466011233627796, -0.026898909360170364, 0.03303854912519455, 0.004308648873120546, 0.1997576355934143, 0.04990272969007492, -0.08948515355587006, -0.03816370666027069, 0.10341423004865646, -0.09415197372436523, -0.12033624202013016, -0.16612058877944946, 0.1482187956571579, 0.10527969151735306, 0.059590768069028854, -0.029622754082083702, -0.1052856519818306, 0.01619115099310875, 0.1828785389661789, 0.14770643413066864, -0.05396635830402374, -0.007677861023694277, 0.0014590735081583261, 0.025625573471188545, -0.025628967210650444, 0.07255665212869644, 0.0059502306394279, 0.2845088839530945, 0.01416334230452776, 0.038978852331638336, 0.042579710483551025, -0.0036214711144566536, -0.09070304036140442, 0.05790747329592705, -0.06083980202674866, -0.010318947024643421, -0.022818634286522865, 0.10555808246135712, 0.018230317160487175, -0.17686185240745544, -0.10708733648061752, 0.016992639750242233, -0.03144083917140961, 0.054373737424612045, 0.031167665496468544, 0.04667074233293533, 0.09074848145246506, 0.010829362086951733, -0.05269736796617508, 0.21494026482105255, 0.0029798520263284445, -0.056360743939876556, 0.010557010769844055, 0.0403011292219162, -0.17510369420051575, 0.09957504272460938, -0.019814785569906235, 0.0632825642824173, 0.05473571643233299, 0.05132787674665451, -0.08138822019100189, 0.06206030026078224, -0.030549554154276848, -0.10581762343645096, 0.08383047580718994, 0.15492230653762817, -0.024953430518507957, 0.11468713730573654, 0.022732924669981003, -0.10095833241939545, 0.07081056386232376, -0.017530977725982666, -0.06095745414495468, -0.07163628190755844, 0.09309086203575134, -0.09087236225605011, 0.15122467279434204, 0.13368108868598938, -0.022051909938454628, 0.01251761894673109, -0.03802509978413582, 0.016304200515151024, 0.021782562136650085, 0.07253116369247437, -0.04483112692832947, -0.15707354247570038, -0.04411068558692932, -0.04070844501256943, 0.03891705349087715, -0.13630741834640503, -0.0123234111815691, -0.01249682530760765, -0.016702931374311447, -0.06334055215120316, 0.06216319277882576, 0.07446523010730743, 0.03175298124551773, -0.019658304750919342, 0.017911437898874283, 0.01130856852978468, 0.05818355828523636, -0.08695811778306961, -0.06782513856887817 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-NL-ST

`s2t-small-mustc-en-nl-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by three quarters before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Dutch text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-nl-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-nl-st")

def map_to_array(batch):
    # Read the raw waveform from disk and attach it to the example
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# The processor extracts the filter bank features the model expects
inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-mustc-en-nl-st is trained on the English-Dutch subset of [MuST-C](https://ict.fbk.eu/must-c/).
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization)
is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779) for data augmentation.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-nl (BLEU score): 27.3

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
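For readers who want to see what the feature extraction described under *Preprocessing* looks like in practice, here is a minimal, illustrative sketch using torchaudio's Kaldi-compliance module. It is an assumption-laden approximation, not the authors' pipeline: `audio.wav` is a hypothetical 16 kHz mono file, and the original MuST-C preprocessing (via PyKaldi or torchaudio) may use different frame and normalization settings.

```python
# Illustrative sketch only (not the authors' exact pipeline):
# 80-channel Kaldi-style log mel-filter bank features plus
# utterance-level CMVN. "audio.wav" is a hypothetical input file.
import torchaudio
import torchaudio.compliance.kaldi as kaldi

waveform, sample_rate = torchaudio.load("audio.wav")

# Kaldi-compliant 80-channel log mel-filter bank features: (num_frames, 80)
feats = kaldi.fbank(waveform, num_mel_bins=80, sample_frequency=sample_rate)

# Utterance-level CMVN: zero mean, unit variance per feature dimension
feats = (feats - feats.mean(dim=0)) / (feats.std(dim=0) + 1e-5)
```

In practice the `Speech2TextProcessor` shown above performs this step for you; the sketch is only meant to make the prose description concrete.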
{"language": ["en", "nl"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-nl-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "nl", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "nl" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #nl #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-NL-ST

's2t-small-mustc-en-nl-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository.

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by three quarters before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Dutch text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately 
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-mustc-en-nl-st is trained on the English-Dutch subset of MuST-C.
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization)
is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using SpecAugment for data augmentation.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-nl (BLEU score): 27.3

### BibTeX entry and citation info
[ "# S2T-SMALL-MUSTC-EN-NL-ST\n\n's2t-small-mustc-en-nl-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Dutch text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-nl-st is trained on English-Dutch subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-nl (BLEU score): 27.3", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #nl #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-NL-ST\n\n's2t-small-mustc-en-nl-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Dutch text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-nl-st is trained on English-Dutch subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-nl (BLEU score): 27.3", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 42, 137, 118, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #nl #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-NL-ST\n\n's2t-small-mustc-en-nl-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Dutch text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.07355280965566635, 0.09452871978282928, -0.005272955633699894, -0.007547630462795496, 0.036840181797742844, -0.07811468094587326, 0.012088301591575146, 0.06942691653966904, -0.05873369798064232, 0.04261370748281479, 0.004675534553825855, 0.11113402247428894, 0.0542742945253849, 0.04701036587357521, 0.0060945735312998295, -0.26328304409980774, 0.06596565246582031, -0.06539133936166763, 0.033132776618003845, 0.08392100781202316, 0.13406561315059662, -0.047811247408390045, 0.06955359876155853, 0.035535506904125214, 0.013835125602781773, 0.04202788695693016, 0.025065863505005836, -0.06967193633317947, 0.0696353167295456, 0.09072111546993256, 0.04779300093650818, 0.03778751194477081, 0.07887567579746246, -0.15749502182006836, -0.007951959036290646, 0.027398690581321716, 0.03618483617901802, 0.02337425947189331, 0.02005542442202568, 0.06619030982255936, 0.1333819180727005, -0.022882189601659775, 0.0034751389175653458, 0.07592711597681046, -0.03679848089814186, -0.054322756826877594, -0.022494791075587273, 0.006504065357148647, 0.18002863228321075, 0.0835818499326706, -0.04127214848995209, 0.027307137846946716, -0.01396419107913971, 0.0598473995923996, 0.04650341346859932, -0.19634482264518738, 0.0032912634778767824, -0.09608660638332367, -0.016292382031679153, 0.006530494894832373, -0.06488170474767685, -0.014422484673559666, -0.03262395039200783, 0.017489487305283546, 0.056619416922330856, -0.08602765202522278, 0.01135220192372799, -0.0656174048781395, -0.12428149580955505, -0.025233225896954536, 0.10825609415769577, -0.009737410582602024, -0.09257575869560242, -0.12082484364509583, -0.03816191107034683, 0.042629946023225784, -0.037423644214868546, -0.07305976003408432, -0.001555245486088097, 0.02489849366247654, 0.03159113600850105, -0.1329755187034607, -0.12034149467945099, 0.026479000225663185, -0.07204924523830414, 0.14460134506225586, 0.044167790561914444, 0.0004407225060276687, 0.020129460841417313, 0.05877780169248581, -0.05304583162069321, -0.051210030913352966, -0.047871917486190796, -0.0903526023030281, -0.10028741508722305, 0.00590409804135561, -0.0006080233142711222, -0.2185128629207611, -0.03773780167102814, 0.09863442182540894, -0.08025264739990234, 0.02468937449157238, 0.027273287996649742, 0.008629044517874718, -0.0005792065057903528, 0.17815688252449036, -0.05587462708353996, -0.09783922135829926, 0.03464195504784584, -0.07023567706346512, 0.03902319446206093, 0.03333497792482376, -0.08348460495471954, 0.019584311172366142, -0.04600567743182182, 0.039100855588912964, 0.04484894126653671, 0.03703823685646057, 0.016630684956908226, -0.0649058148264885, 0.19665862619876862, -0.10241252928972244, 0.00935338530689478, 0.0002801493392325938, -0.027489006519317627, 0.14807049930095673, 0.03466542065143585, 0.0013325049076229334, -0.16003002226352692, 0.07304182648658752, 0.013299861922860146, 0.031778767704963684, -0.06567145884037018, -0.13283675909042358, -0.004364355932921171, -0.029962429776787758, -0.050461091101169586, -0.10812161862850189, -0.1165255457162857, -0.020879914984107018, -0.0036138398572802544, -0.035293854773044586, 0.049950048327445984, -0.08553669601678848, -0.060685720294713974, -0.00308736739680171, -0.05261053517460823, -0.14509454369544983, 0.011446130461990833, -0.01999031752347946, -0.022097891196608543, 0.03873874619603157, -0.0725240483880043, 0.018565300852060318, -0.08433150500059128, -0.03126556798815727, -0.22172006964683533, 0.15855732560157776, -0.007125586736947298, -0.003182870103046298, -0.09484812617301941, -0.10406841337680817, 
-0.0739600732922554, 0.05361148715019226, 0.03560490161180496, 0.12992429733276367, -0.19505994021892548, -0.029157675802707672, 0.18301048874855042, -0.08553367853164673, 0.04708413779735565, 0.14693649113178253, 0.039238572120666504, 0.04032986983656883, 0.12212586402893066, 0.1042356789112091, 0.08947852998971939, -0.0811418816447258, -0.09177933633327484, 0.07055968046188354, -0.1033218577504158, 0.048669662326574326, 0.03834175691008568, -0.06645918637514114, 0.1172536164522171, 0.032007522881031036, -0.030140317976474762, -0.0014555127127096057, 0.06360013037919998, -0.0015887852059677243, 0.04576953500509262, -0.014163525775074959, 0.11340942233800888, -0.03288276121020317, -0.017458712682127953, -0.012004799209535122, -0.12149875611066818, 0.18506328761577606, 0.04359564557671547, -0.0716174989938736, 0.08253677189350128, -0.11686877906322479, -0.09121383726596832, -0.009721586480736732, 0.003190728835761547, -0.1566164195537567, 0.05824476107954979, -0.022274155169725418, -0.0017342866631224751, 0.113340362906456, 0.06497374176979065, 0.05909116193652153, 0.013302117586135864, -0.007841316051781178, 0.019030414521694183, 0.07777808606624603, 0.013541322201490402, 0.0052100615575909615, -0.04563840851187706, -0.025600749999284744, -0.008105403743684292, 0.0825846716761589, -0.04505014047026634, 0.00968183670192957, 0.016850434243679047, 0.07630444318056107, 0.01963053084909916, -0.030496135354042053, 0.0020893465261906385, 0.0020798156037926674, 0.012915934436023235, -0.02693876065313816, -0.02465151622891426, 0.028617678210139275, -0.03021334484219551, 0.13624656200408936, -0.13423869013786316, -0.1924871802330017, 0.07473583519458771, -0.04774545878171921, 0.0025676172226667404, 0.08854716271162033, -0.029260341078042984, -0.022263038903474808, 0.033248186111450195, -0.19551874697208405, 0.23574934899806976, 0.06281113624572754, 0.10662341117858887, -0.0629749596118927, -0.07553662359714508, -0.039047449827194214, -0.0732656940817833, 0.023734869435429573, 0.017259176820516586, 0.005829759873449802, -0.15997697412967682, -0.006033984944224358, 0.017639100551605225, -0.005979839246720076, 0.10021887719631195, 0.02337881736457348, -0.05400978401303291, 0.00803934782743454, 0.020851194858551025, 0.008154725655913353, 0.0005229997332207859, -0.026235511526465416, 0.01402898970991373, 0.03664093092083931, 0.04504295438528061, 0.035315629094839096, -0.044146690517663956, 0.08693527430295944, 0.0579417385160923, -0.0648735985159874, -0.00996532291173935, 0.003589110914617777, 0.00794149935245514, 0.04734517261385918, -0.014378557913005352, 0.020311085507273674, -0.04221571236848831, -0.025997649878263474, -0.09815698117017746, 0.10953868180513382, -0.17426419258117676, -0.34432312846183777, -0.18354517221450806, -0.04907682538032532, -0.049819618463516235, 0.053867705166339874, 0.0692620575428009, -0.09105399250984192, -0.10230055451393127, -0.08759158849716187, 0.08102941513061523, -0.03398424759507179, -0.0459974929690361, -0.10376451164484024, 0.045350462198257446, 0.04586176201701164, -0.09492510557174683, -0.006210779771208763, -0.00628340570256114, -0.028081588447093964, 0.0002881318796426058, 0.02094072662293911, -0.020166469737887383, 0.13197992742061615, -0.01472456380724907, -0.04893573001027107, -0.026181712746620178, 0.11940128356218338, -0.02336503006517887, 0.1310681253671646, 0.17279018461704254, -0.09465332329273224, 0.02604150027036667, 0.11452029645442963, 0.01881149224936962, -0.026593143120408058, 0.0000933789269765839, -0.011358973570168018, -0.03539189696311951, 
-0.17862898111343384, -0.07785944640636444, -0.06871556490659714, 0.013101208955049515, 0.013115148060023785, 0.012507143430411816, 0.026515398174524307, -0.00461015198379755, -0.06627988070249557, 0.001005556550808251, 0.043958861380815506, 0.08795636147260666, 0.19490474462509155, -0.04449000954627991, 0.07977678626775742, -0.06212085112929344, -0.04892678186297417, 0.10772284865379333, -0.042485933750867844, 0.10728422552347183, 0.009910543449223042, 0.18123866617679596, 0.0379316546022892, -0.012198017910122871, 0.004864743910729885, 0.08700372278690338, -0.03553146868944168, 0.0025653080083429813, -0.001453074044547975, -0.05907187983393669, -0.03294224664568901, 0.06654318422079086, 0.03200199082493782, -0.03606914356350899, 0.029352709650993347, 0.05570061504840851, 0.0805041491985321, 0.08535858988761902, 0.07714089751243591, -0.1003473699092865, -0.09291499108076096, 0.018065502867102623, -0.15518654882907867, -0.03494301438331604, 0.0013468068791553378, 0.13856300711631775, -0.07279376685619354, 0.03367440402507782, -0.004373386036604643, 0.08125444501638412, -0.1075335144996643, 0.052942633628845215, 0.002386058447882533, 0.1618049442768097, 0.03996548056602478, 0.043254971504211426, -0.0588214136660099, 0.059212978929281235, 0.002850556978955865, 0.08299700915813446, -0.02512531541287899, 0.0582193061709404, -0.02916504628956318, 0.011021465063095093, 0.12259814888238907, 0.021674243733286858, -0.16853775084018707, 0.020550044253468513, -0.11901337653398514, -0.02637709677219391, 0.09415388852357864, 0.02780197560787201, 0.05289309099316597, -0.03356511518359184, -0.042054835706949234, -0.05510284751653671, -0.11112865060567856, -0.054046668112277985, -0.15748916566371918, 0.05531025305390358, 0.027040328830480576, -0.009142456576228142, -0.0012957215076312423, 0.05910956487059593, -0.04992062970995903, 0.22917403280735016, -0.22255514562129974, -0.10026434063911438, -0.1128925085067749, -0.017513172701001167, 0.13568902015686035, -0.056553393602371216, 0.05451461300253868, 0.06576599180698395, 0.0788719579577446, 0.0003467874776106328, -0.042746905237436295, 0.0641828179359436, -0.11487852782011032, 0.0009435740066692233, -0.013367652893066406, 0.17121002078056335, 0.013033900409936905, 0.06502140313386917, 0.0552125945687294, -0.0022182874381542206, 0.01400950737297535, -0.12800656259059906, -0.08620266616344452, 0.09008026123046875, -0.018219701945781708, -0.02132917195558548, -0.04042569175362587, -0.06078418344259262, -0.07719942927360535, 0.036440085619688034, 0.14099150896072388, 0.07962483912706375, -0.08450958132743835, 0.1256902515888214, 0.07297653704881668, -0.05441083759069443, -0.20369581878185272, -0.08795572072267532, 0.06074029207229614, 0.02244713343679905, 0.008113701827824116, -0.12988539040088654, 0.05705643445253372, 0.013422798365354538, -0.015271813608705997, -0.08577024936676025, -0.2121565192937851, -0.12454739958047867, 0.05882818624377251, -0.04424121230840683, 0.02457347698509693, 0.033740852028131485, -0.05339773744344711, -0.0615503266453743, -0.07799477130174637, -0.018758226186037064, -0.034529175609350204, 0.023035092279314995, 0.07751592993736267, 0.0348675511777401, 0.05566281080245972, 0.007739334367215633, 0.060524843633174896, 0.0037121560890227556, -0.046721264719963074, -0.0364767387509346, 0.07823359221220016, 0.01393421832472086, -0.012818663381040096, 0.2200813740491867, -0.03428639844059944, 0.012714513577520847, -0.10219874233007431, -0.04860464483499527, -0.03615008667111397, 0.029835816472768784, 0.001672655576840043, 
-0.025860197842121124, -0.0003980206965934485, -0.00814676471054554, 0.0540962815284729, 0.025236112996935844, -0.06008507311344147, -0.1399901658296585, -0.18264290690422058, 0.19601863622665405, 0.2390083372592926, 0.03203249350190163, -0.04548117518424988, 0.011132740415632725, 0.01297309622168541, -0.015636738389730453, -0.03797000274062157, 0.05457690730690956, 0.030473671853542328, -0.013582981191575527, 0.04619504511356354, -0.03034839779138565, -0.04058854654431343, 0.012899287976324558, 0.06919644773006439, -0.03900442272424698, -0.1335073709487915, -0.03961438685655594, 0.05821474641561508, -0.058516889810562134, 0.02931392751634121, 0.19941656291484833, -0.0317070335149765, -0.004955342039465904, -0.014701333828270435, 0.036631301045417786, -0.07701125741004944, 0.10768650472164154, -0.033775437623262405, 0.02487349882721901, -0.07790309190750122, 0.05392419919371605, 0.06323517113924026, -0.0024908045306801796, 0.027151677757501602, 0.1107678934931755, -0.10693339258432388, -0.0581582747399807, -0.14363163709640503, -0.1496850699186325, -0.05793022736907005, -0.045976486057043076, -0.02004614658653736, -0.055020906031131744, -0.015149567276239395, -0.009929325431585312, -0.006567416712641716, 0.03258715942502022, -0.0747160017490387, -0.0057060145772993565, -0.06678298115730286, 0.04406631365418434, 0.06478356570005417, 0.015822097659111023, -0.03667471930384636, 0.10283640772104263, -0.042648978531360626, 0.007464187685400248, -0.03137079253792763, 0.005395103245973587, 0.000040996092138811946, -0.004677017219364643, -0.10588575899600983, -0.0366828590631485, -0.07621973752975464, -0.07331624627113342, 0.0434279628098011, -0.013291832990944386, 0.003090105252340436, 0.03112640231847763, -0.04539797455072403, -0.018870292231440544, -0.032790396362543106, 0.0034093160647898912, -0.06977205723524094, 0.05367164686322212, 0.03068143129348755, -0.05228462815284729, 0.08995508402585983, 0.08505938202142715, -0.06124269589781761, 0.10755132138729095, 0.047329794615507126, -0.07338100671768188, 0.022718071937561035, 0.0779007226228714, 0.06152505427598953, -0.026345420628786087, -0.0035472987219691277, 0.0707116574048996, -0.012679081410169601, -0.07336708903312683, -0.005763815715909004, -0.04696216806769371, 0.11601483821868896, -0.002500497968867421, -0.017546355724334717, -0.040851712226867676, 0.01798967644572258, 0.04147806018590927, -0.019003091380000114, 0.14318370819091797, -0.06580382585525513, 0.02952289767563343, -0.07092104852199554, 0.036078162491321564, 0.017451869323849678, -0.013690153136849403, 0.019667096436023712, -0.0917891189455986, 0.025533707812428474, -0.0004987940774299204, 0.04223130643367767, -0.0131850466132164, 0.06030327081680298, 0.06731627136468887, -0.10183407366275787, 0.008608447387814522, 0.0464475154876709, 0.09519977122545242, 0.022102544084191322, -0.01868988759815693, -0.1338784545660019, -0.028017966076731682, -0.027733298018574715, -0.10784496366977692, 0.08846026659011841, 0.12070666998624802, 0.06623802334070206, 0.11177444458007812, 0.04223215952515602, 0.07321863621473312, -0.18151544034481049, 0.016802575439214706, -0.06012379005551338, 0.01949743553996086, -0.04855364188551903, 0.08457954227924347, 0.15180747210979462, -0.11729400604963303, 0.08766546845436096, 0.08483445644378662, -0.05746138468384743, -0.12370280921459198, -0.14404232800006866, -0.019606897607445717, -0.08014657348394394, -0.02114054001867771, -0.0959354043006897, 0.031416866928339005, -0.08468877524137497, 0.07523281127214432, -0.010239954106509686, 
0.16647204756736755, -0.08940970152616501, -0.12970784306526184, 0.05693221837282181, -0.009443283081054688, 0.07492395490407944, 0.09287500381469727, 0.07252275198698044, 0.03895961493253708, 0.057502616196870804, 0.10509826242923737, 0.04561052471399307, 0.026108773425221443, -0.007220493163913488, -0.02301868051290512, -0.0451883040368557, -0.020063795149326324, 0.03672996535897255, 0.00835893303155899, 0.1933571994304657, 0.08896324783563614, -0.10010644793510437, -0.03334967792034149, 0.09882979840040207, -0.1067613810300827, -0.18254342675209045, -0.15679539740085602, 0.1763567477464676, 0.09847283363342285, 0.06737961620092392, -0.017339829355478287, -0.09603888541460037, 0.0038655532989650965, 0.18364714086055756, 0.1460321843624115, -0.0016533829038962722, 0.006189990788698196, -0.021369528025388718, 0.018404653295874596, -0.023355303332209587, 0.060018666088581085, 0.010391512885689735, 0.32986634969711304, 0.018339846283197403, -0.005993526428937912, 0.03346720337867737, -0.0267428420484066, -0.09089816361665726, 0.0293442290276289, -0.06338229030370712, -0.04500126838684082, 0.004136286675930023, 0.10326746106147766, -0.026691889390349388, -0.1735883504152298, -0.1229015663266182, 0.050028372555971146, -0.02135602943599224, 0.06478729099035263, 0.040852807462215424, 0.07828077673912048, 0.05731959640979767, 0.018532419577240944, -0.053295813500881195, 0.2439379245042801, 0.005736323073506355, -0.032916951924562454, 0.05466505512595177, 0.043380022048950195, -0.13412544131278992, 0.048267532140016556, -0.0020883537363260984, 0.07827271521091461, 0.0606318823993206, 0.04260585084557533, -0.07787420600652695, 0.06731736660003662, -0.020974373444914818, -0.09880132228136063, 0.09230916947126389, 0.18305891752243042, 0.0013422934571281075, 0.09771664440631866, 0.012375249527394772, -0.09829878807067871, 0.03916984423995018, 0.04134562611579895, -0.04646932706236839, -0.05745343118906021, 0.097775399684906, -0.06990265101194382, 0.12957820296287537, 0.1407763808965683, -0.03249460831284523, 0.030076943337917328, -0.03638077899813652, 0.004691312089562416, -0.015987366437911987, 0.05163940414786339, -0.03409018740057945, -0.11021676659584045, -0.03224169462919235, -0.01697365939617157, 0.04133955389261246, -0.17177754640579224, -0.01608717255294323, 0.0017249251250177622, -0.014302479103207588, -0.03581749275326729, 0.08783814311027527, 0.038430240005254745, 0.026771578937768936, -0.010900965891778469, 0.08314113318920135, 0.007416801527142525, 0.05233566835522652, -0.07483432441949844, -0.07100803405046463 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-PT-ST

`s2t-small-mustc-en-pt-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text).

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by three quarters before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Portuguese text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install transformers"[speech, sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.

```python
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-pt-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-pt-st")

def map_to_array(batch):
    # Read the raw waveform from disk and attach it to the example
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

# The processor extracts the filter bank features the model expects
inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Training data

The s2t-small-mustc-en-pt-st is trained on the English-Portuguese subset of [MuST-C](https://ict.fbk.eu/must-c/).
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization)
is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using [SpecAugment](https://arxiv.org/abs/1904.08779) for data augmentation.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-pt (BLEU score): 28.1

### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
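To make the evaluation section concrete, here is a hedged sketch of how a corpus-level BLEU score like the 28.1 above can be computed with the `sacrebleu` package. The hypothesis and reference strings are hypothetical placeholders; this snippet does not reproduce the authors' MuST-C en-pt test setup.

```python
# Illustrative sketch only: corpus-level BLEU with sacrebleu.
# The strings below are hypothetical placeholders, not MuST-C data.
import sacrebleu

hypotheses = ["esta é uma tradução hipotética do modelo"]        # decoded model outputs
references = [["esta é uma tradução de referência hipotética"]]  # one list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.1f}")
```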
{"language": ["en", "pt"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-pt-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "pt", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "pt" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #pt #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-PT-ST

's2t-small-mustc-en-pt-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository.

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by three quarters before they are
fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the
transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Portuguese text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install transformers"[speech, sentencepiece]"' or install the packages separately 
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-mustc-en-pt-st is trained on the English-Portuguese subset of MuST-C.
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Utterance-level CMVN (cepstral mean and variance normalization)
is then applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss, using SpecAugment for data augmentation.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-pt (BLEU score): 28.1

### BibTeX entry and citation info
[ "# S2T-SMALL-MUSTC-EN-PT-ST\n\n's2t-small-mustc-en-pt-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Portuguese text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-pt-st is trained on English-Portuguese subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-pt (BLEU score): 28.1", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #pt #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-PT-ST\n\n's2t-small-mustc-en-pt-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Portuguese text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-pt-st is trained on English-Portuguese subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-pt (BLEU score): 28.1", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 44, 137, 118, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #pt #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-PT-ST\n\n's2t-small-mustc-en-pt-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Portuguese text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.08554986864328384, 0.13806770741939545, -0.005939864553511143, -0.007371257524937391, 0.07422873377799988, -0.0653613731265068, 0.0007936145993880928, 0.07948113232851028, -0.09918461740016937, 0.04476024582982063, 0.005667729303240776, 0.10306772589683533, 0.05504472926259041, 0.053613774478435516, -0.009620696306228638, -0.2507869601249695, 0.04404471069574356, -0.054254576563835144, 0.007255685515701771, 0.08239477127790451, 0.11821473389863968, -0.0467066764831543, 0.07700429856777191, 0.03604164347052574, 0.027210084721446037, 0.038167666643857956, 0.048044219613075256, -0.08827196061611176, 0.05433061346411705, 0.09931636601686478, 0.05466450750827789, 0.033871572464704514, 0.0784030631184578, -0.14593389630317688, -0.0020396371837705374, 0.05725400522351265, 0.04446015506982803, 0.031376589089632034, 0.03842463716864586, 0.037987012416124344, 0.10685606300830841, 0.010847418569028378, 0.02134004607796669, 0.10549689829349518, -0.06110440939664841, -0.07465502619743347, -0.029467551037669182, 0.0067539517767727375, 0.18564243614673615, 0.10070522129535675, -0.041396159678697586, 0.01331417914479971, -0.0014008826110512018, 0.055990613996982574, 0.04002395272254944, -0.18077155947685242, 0.0026396671310067177, -0.10047376900911331, -0.018409954383969307, 0.014045236632227898, -0.022514503449201584, -0.023365531116724014, -0.015152674168348312, -0.0012367545859888196, 0.05072659254074097, -0.07740495353937149, 0.04678842052817345, -0.06079849228262901, -0.12355859577655792, -0.02865011617541313, 0.12481237202882767, -0.010122926905751228, -0.08915553241968155, -0.106381356716156, -0.05015943944454193, 0.02250901237130165, -0.015069030225276947, -0.05355963110923767, -0.0024315365590155125, 0.045726876705884933, 0.05284130200743675, -0.12425941228866577, -0.11738035827875137, 0.0025776615366339684, -0.07604188472032547, 0.11800044029951096, 0.04176931455731392, 0.012839951552450657, 0.008665627799928188, 0.05240940302610397, -0.0676434189081192, -0.049926698207855225, -0.02282164990901947, -0.08890096098184586, -0.09409582614898682, -0.007452717516571283, -0.0013656913070008159, -0.22294092178344727, -0.03294181451201439, 0.10458634793758392, -0.10818805545568466, 0.024032508954405785, 0.04028351232409477, 0.009890924207866192, 0.002866399008780718, 0.1736966073513031, -0.0683690682053566, -0.09465010464191437, 0.04710736125707626, -0.05905229598283768, 0.032535869628190994, 0.023721423000097275, -0.08440188318490982, 0.00783500075340271, -0.07844860851764679, 0.04827999696135521, 0.05471407622098923, 0.028002187609672546, -0.001078287954442203, -0.07286841422319412, 0.17298725247383118, -0.1091618537902832, 0.005850664339959621, 0.021877672523260117, -0.043357402086257935, 0.15827716886997223, 0.055977609008550644, 0.0024410837795585394, -0.14812685549259186, 0.06535807996988297, 0.001316797104664147, 0.028561526909470558, -0.0754980519413948, -0.12341039627790451, -0.01474169921129942, -0.07379607856273651, -0.05062903091311455, -0.10200124233961105, -0.11864648759365082, -0.03092285431921482, -0.0025523679796606302, -0.03795579448342323, 0.02383793145418167, -0.07265286892652512, -0.044796451926231384, -0.019185898825526237, -0.04809610918164253, -0.15663954615592957, 0.0028984001837670803, -0.0033460226841270924, 0.01836482621729374, 0.03784160315990448, -0.07196296006441116, 0.03384882211685181, -0.06630270183086395, -0.01231146790087223, -0.22978173196315765, 0.1715441197156906, 0.017171012237668037, 0.03915867581963539, -0.10591701418161392, -0.09252425283193588, 
-0.0744020864367485, 0.056471485644578934, 0.030417200177907944, 0.1427634060382843, -0.1851881444454193, -0.03961678966879845, 0.18741756677627563, -0.08516043424606323, 0.02963271364569664, 0.13022145628929138, 0.03839730843901634, 0.0703548938035965, 0.13348162174224854, 0.08260555565357208, 0.10407388210296631, -0.06678018718957901, -0.06637785583734512, 0.03177706524729729, -0.12614275515079498, 0.049187809228897095, 0.0581180714070797, -0.0962458997964859, 0.11981585621833801, 0.022612815722823143, -0.02032019942998886, 0.008574019186198711, 0.07293527573347092, -0.01178601011633873, 0.03823142871260643, -0.032292962074279785, 0.0997321605682373, -0.029849762097001076, -0.0038779964670538902, -0.0013900817139074206, -0.12042828649282455, 0.1685585230588913, 0.038721054792404175, -0.06922061741352081, 0.07323617488145828, -0.13825257122516632, -0.049263596534729004, -0.03470083326101303, -0.002586865099146962, -0.17375317215919495, 0.079521544277668, -0.033361345529556274, 0.016383808106184006, 0.1277206689119339, 0.03733649477362633, 0.061225637793540955, 0.0038122483529150486, 0.0032630006317049265, 0.012460303492844105, 0.09046351909637451, 0.004826254677027464, 0.003914193250238895, -0.038787707686424255, -0.009958390146493912, -0.009090520441532135, 0.09500980377197266, -0.06232810020446777, 0.03863224759697914, 0.020253321155905724, 0.06223202496767044, 0.024190695956349373, -0.027198156341910362, 0.012993065640330315, -0.004166575148701668, 0.007205205503851175, -0.04845209792256355, -0.006360137369483709, 0.035246483981609344, -0.043373893946409225, 0.10197631269693375, -0.15428408980369568, -0.15082032978534698, 0.08163288235664368, -0.04578392580151558, -0.017149489372968674, 0.06482136994600296, -0.009662809781730175, -0.011805296875536442, 0.01351103000342846, -0.16732120513916016, 0.27690503001213074, 0.06112387776374817, 0.1327899694442749, -0.07430464029312134, -0.04868360236287117, -0.042645517736673355, -0.06593973189592361, 0.032413382083177567, 0.00003352490602992475, 0.05422377213835716, -0.16002289950847626, 0.0022757137194275856, 0.01178279984742403, -0.01762937568128109, 0.10430524498224258, 0.022443313151597977, -0.07256145775318146, 0.01991991512477398, 0.02454359643161297, 0.02029380202293396, -0.02757253311574459, -0.050523798912763596, 0.018959738314151764, 0.024363774806261063, 0.028334518894553185, 0.04007551074028015, -0.05045172572135925, 0.08972238004207611, 0.06837602704763412, -0.05823420733213425, -0.060901448130607605, -0.007542443927377462, 0.006649182643741369, 0.03816663846373558, 0.003025498939678073, 0.04608854651451111, -0.03143579512834549, -0.005710766650736332, -0.10211075097322464, 0.08823486417531967, -0.17067360877990723, -0.3129587769508362, -0.2000289410352707, -0.03311274200677872, -0.027544556185603142, 0.07354670017957687, 0.06300510466098785, -0.10346490144729614, -0.08594448119401932, -0.07313404977321625, 0.08609513193368912, -0.06494640558958054, -0.054573219269514084, -0.09419726580381393, 0.07692329585552216, 0.053428955376148224, -0.07608567178249359, 0.008002735674381256, 0.015685033053159714, -0.03605685010552406, -0.024381140246987343, 0.00764643307775259, -0.025328170508146286, 0.1595505028963089, -0.01020557526499033, -0.028723664581775665, -0.027488574385643005, 0.11649829149246216, -0.03365906700491905, 0.1185399740934372, 0.2155952900648117, -0.09148269891738892, 0.026052290573716164, 0.1288544237613678, 0.01947316899895668, -0.026185939088463783, 0.003038860158994794, -0.016144659370183945, -0.037092868238687515, 
-0.20352549850940704, -0.0545426569879055, -0.053323931992053986, 0.029880302026867867, 0.02341618575155735, 0.008645503781735897, 0.023316126316785812, 0.0221989918500185, -0.06609499454498291, -0.0017200919101014733, 0.05452040582895279, 0.08945945650339127, 0.18330152332782745, -0.06501214951276779, 0.08307208865880966, -0.06289028376340866, -0.04556349292397499, 0.11153090745210648, -0.03997582942247391, 0.11766452342271805, 0.008787701837718487, 0.18995046615600586, 0.038120195269584656, -0.01069093868136406, 0.01806499995291233, 0.07277720421552658, -0.023445388302206993, 0.00977636780589819, -0.006832503248006105, -0.046255309134721756, -0.04940515384078026, 0.08072184771299362, 0.0750969797372818, -0.017224809154868126, 0.035716552287340164, 0.05485585331916809, 0.0555013082921505, 0.11874884366989136, 0.05833015963435173, -0.14026907086372375, -0.08328942954540253, 0.02314727008342743, -0.13295601308345795, -0.034045133739709854, 0.007493216078728437, 0.118108369410038, -0.05718226730823517, 0.03877028822898865, 0.004543128423392773, 0.09417445957660675, -0.11091543734073639, 0.031054986640810966, 0.015887806192040443, 0.15107251703739166, 0.0371391735970974, 0.035230591893196106, -0.05069343000650406, 0.03122749552130699, 0.012755170464515686, 0.08321122080087662, -0.008339735679328442, 0.07951201498508453, -0.01122209057211876, 0.033478353172540665, 0.09688060730695724, -0.004831088241189718, -0.13030973076820374, 0.008114120922982693, -0.12509727478027344, -0.0349278450012207, 0.08549563586711884, 0.005729882046580315, 0.06657730042934418, -0.046397510915994644, -0.03865682706236839, -0.06066065654158592, -0.10567335039377213, -0.06180607154965401, -0.16699443757534027, 0.033670250326395035, 0.02355252578854561, 0.02242753840982914, -0.00528005650267005, 0.06322139501571655, -0.04500405490398407, 0.21184928715229034, -0.20500345528125763, -0.11403217166662216, -0.11680480092763901, -0.037008874118328094, 0.13615716993808746, -0.045908380299806595, 0.07008719444274902, 0.06290803849697113, 0.07577388733625412, 0.022440176457166672, -0.027903908863663673, 0.06768569350242615, -0.10299316793680191, 0.002531888196244836, -0.02832278609275818, 0.17696185410022736, -0.005785875953733921, 0.06735777854919434, 0.043146051466464996, -0.004916816018521786, 0.006596898194402456, -0.10503740608692169, -0.07874085754156113, 0.12243395298719406, 0.003087791847065091, -0.002132186433300376, -0.0587860532104969, -0.10094812512397766, -0.09540420770645142, 0.02221311815083027, 0.13828091323375702, 0.06830817461013794, -0.08985123038291931, 0.12047033756971359, 0.06352871656417847, -0.05203481391072273, -0.15943989157676697, -0.06403501331806183, 0.06593258678913116, 0.022206129506230354, 0.009796732105314732, -0.16031351685523987, 0.033314645290374756, 0.013517438434064388, -0.005240749102085829, -0.04358940199017525, -0.21907003223896027, -0.10614332556724548, 0.03561647981405258, -0.021894196048378944, 0.0035374925937503576, 0.002708115614950657, -0.06675500422716141, -0.07506483048200607, -0.06942914426326752, -0.030101485550403595, -0.06669958680868149, 0.023694297298789024, 0.07816997170448303, 0.01652812398970127, 0.049959633499383926, 0.000870542717166245, 0.047303590923547745, -0.03960433602333069, -0.05994196608662605, -0.03198539838194847, 0.05992740020155907, 0.03063558042049408, -0.019327698275446892, 0.19426795840263367, 0.0044713132083415985, 0.016116101294755936, -0.09532500058412552, -0.07092627137899399, -0.00818419549614191, -0.0007942761294543743, 0.007391093764454126, 
-0.010511031374335289, 0.013271045871078968, -0.03473423793911934, 0.03324129432439804, 0.01816478930413723, -0.0762285366654396, -0.1591675728559494, -0.16340336203575134, 0.18388555943965912, 0.2407795786857605, 0.02449727989733219, -0.06602338701486588, 0.0027708453126251698, 0.01270220335572958, -0.012197517789900303, -0.08355649560689926, 0.058882638812065125, 0.024943243712186813, -0.008514128625392914, 0.051063422113657, -0.008804294280707836, -0.05000271648168564, 0.04053987190127373, 0.06761644035577774, -0.011980660259723663, -0.11347926408052444, -0.04364669695496559, 0.024854334071278572, -0.07300024479627609, 0.027450868859887123, 0.19136905670166016, -0.007966339588165283, -0.014169513247907162, -0.006444951053708792, 0.015714596956968307, -0.07953373342752457, 0.1087711974978447, -0.040196701884269714, -0.00034308171598240733, -0.078020840883255, 0.0674058124423027, 0.07264523953199387, -0.012324794195592403, 0.01224329974502325, 0.11610317975282669, -0.10544916987419128, -0.0631183311343193, -0.15897351503372192, -0.1449606865644455, -0.04242807254195213, -0.0517144501209259, -0.027749404311180115, -0.05013082176446915, -0.000185533455805853, 0.02269408293068409, -0.022653676569461823, 0.023904969915747643, -0.08049637824296951, -0.004377869423478842, -0.06794881075620651, 0.027043666690587997, 0.06522592902183533, -0.0034719989635050297, -0.07136531174182892, 0.12080419063568115, -0.02211817353963852, -0.029466910287737846, -0.01715284213423729, 0.0021685990504920483, -0.013971762731671333, 0.02171597257256508, -0.09065121412277222, -0.02291613444685936, -0.0734301507472992, -0.06339537352323532, 0.046041812747716904, 0.003956501372158527, -0.007074092049151659, 0.028059804812073708, -0.04665403068065643, -0.018533902242779732, -0.02863636054098606, 0.012704573571681976, -0.054353516548871994, 0.06219082698225975, 0.01848388835787773, -0.05403691530227661, 0.08556362986564636, 0.07800427824258804, -0.07040993124246597, 0.0760289803147316, 0.04725869745016098, -0.07849515974521637, 0.029118623584508896, 0.06474069505929947, 0.044743143022060394, -0.05310576781630516, 0.024273987859487534, 0.06300816684961319, -0.03224387392401695, -0.07676244527101517, -0.0044007254764437675, -0.04247491806745529, 0.10132790356874466, 0.018906066194176674, -0.005856778007000685, -0.039094045758247375, -0.003911097068339586, 0.07448617368936539, -0.013119720853865147, 0.14640827476978302, -0.048714280128479004, 0.05857501178979874, -0.07096876204013824, 0.04332352429628372, 0.0028158079367130995, 0.01525028608739376, 0.00724495854228735, -0.08712555468082428, 0.03303336724638939, 0.01303143985569477, 0.041323915123939514, -0.020233966410160065, 0.0878649353981018, 0.060141030699014664, -0.0941895991563797, 0.026142433285713196, 0.04503226652741432, 0.11969126015901566, 0.04025859385728836, -0.015796588733792305, -0.10450683534145355, 0.007202823180705309, -0.010194653645157814, -0.1098412275314331, 0.046156205236911774, 0.0932416319847107, 0.04642979055643082, 0.10607954859733582, 0.01978996954858303, 0.0670185387134552, -0.14510077238082886, 0.03720764070749283, -0.050090305507183075, -0.0022637483198195696, -0.038833409547805786, 0.09556630253791809, 0.1759151965379715, -0.09086847305297852, 0.0796499028801918, 0.08775579929351807, -0.04757288470864296, -0.1358404904603958, -0.11941201984882355, -0.017416473478078842, -0.0844760313630104, -0.015499879606068134, -0.08383192121982574, 0.019193194806575775, -0.07742827385663986, 0.055475443601608276, -0.012682933360338211, 
0.15222924947738647, -0.08236958831548691, -0.1538509875535965, 0.046328965574502945, -0.0045061614364385605, 0.08009405434131622, 0.09759446233510971, 0.07165157794952393, 0.048859503120183945, 0.025236377492547035, 0.10491546243429184, 0.04353780671954155, 0.002697946038097143, 0.0030018866527825594, -0.01574053429067135, -0.025849612429738045, -0.02492954395711422, 0.053929653018713, -0.001443280722014606, 0.21808989346027374, 0.07551627606153488, -0.09154168516397476, -0.025167811661958694, 0.10524923354387283, -0.1067114919424057, -0.14243300259113312, -0.18242038786411285, 0.15800482034683228, 0.10825835168361664, 0.056819766759872437, -0.021408841013908386, -0.10439866781234741, 0.012943025678396225, 0.20359613001346588, 0.12106715887784958, -0.011568786576390266, 0.0012169862166047096, -0.002706944476813078, 0.02189890667796135, -0.03017694130539894, 0.04182211309671402, 0.018330078572034836, 0.29602476954460144, 0.01605098508298397, -0.007872512564063072, 0.034135378897190094, -0.017465505748987198, -0.09911549836397171, 0.030526261776685715, -0.0735977441072464, -0.03470415249466896, 0.021090596914291382, 0.10647401213645935, -0.030316442251205444, -0.15791073441505432, -0.10716477781534195, 0.04413912817835808, -0.015978651121258736, 0.04619119316339493, 0.02333936095237732, 0.062067046761512756, 0.059140678495168686, 0.02637961506843567, -0.06648410856723785, 0.2409237027168274, 0.0021196904126554728, -0.07000837475061417, 0.041321273893117905, 0.032615385949611664, -0.15945367515087128, 0.06105586886405945, -0.01196417398750782, 0.05849868059158325, 0.0585966482758522, 0.03578071668744087, -0.0771392434835434, 0.07162615656852722, -0.03944756090641022, -0.09165143221616745, 0.08714868873357773, 0.15235881507396698, -0.006560603156685829, 0.10557190328836441, 0.012908541597425938, -0.10632594674825668, 0.055792730301618576, 0.028171241283416748, -0.05357946828007698, -0.07494205981492996, 0.09783996641635895, -0.0964285209774971, 0.1250254511833191, 0.1195179894566536, -0.034234680235385895, 0.02625276707112789, -0.027918921783566475, 0.02195095270872116, 0.006848474498838186, 0.07474889606237411, -0.042226918041706085, -0.10891979187726974, -0.047716204077005386, -0.0328456275165081, 0.033065393567085266, -0.19022652506828308, -0.00925592239946127, -0.0083005391061306, -0.0251288041472435, -0.05611507222056389, 0.06467573344707489, 0.032028548419475555, 0.043039318174123764, -0.015236001461744308, 0.04933157563209534, 0.006771138869225979, 0.06464038044214249, -0.09827756881713867, -0.06672142446041107 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-RO-ST

`s2t-small-mustc-en-ro-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text)


## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 (i.e., to a quarter of their
original length) before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and
generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Romanian text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.


### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install "transformers[speech,sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.


```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-ro-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-ro-st")

def map_to_array(batch):
    # Read the raw waveform; the processor expects 16 kHz mono audio.
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```


## Training data

The s2t-small-mustc-en-ro-st is trained on the English-Romanian subset of [MuST-C](https://ict.fbk.eu/must-c/).
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.


## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.


### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-ro (BLEU score): 21.9


### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
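The BLEU score above is reported on the MuST-C tst-COMMON set. As a rough illustration of how such a corpus-level score can be computed once you have model outputs and reference translations, here is a minimal sketch using `sacrebleu`; the `translations` and `references` lists below are placeholders, not real MuST-C data (the corpus itself requires a manual download).

```python
# Hedged sketch: scoring placeholder model outputs against references with sacrebleu.
import sacrebleu

translations = ["Acesta este un exemplu."]   # one model output per test segment (placeholder)
references = [["Acesta este un exemplu."]]   # one reference stream, aligned with the outputs

bleu = sacrebleu.corpus_bleu(translations, references)
print(f"BLEU: {bleu.score:.1f}")
```

Note that `references` is a list of reference *streams*, each aligned with the hypotheses, which is why it is nested one level deeper than `translations`.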
{"language": ["en", "ro"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-ro-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "ro", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "ro" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ro #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-RO-ST

's2t-small-mustc-en-ro-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 (i.e., to a quarter of their
original length) before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and
generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Romanian text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install "transformers[speech,sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-mustc-en-ro-st is trained on the English-Romanian subset of MuST-C.
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-ro (BLEU score): 21.9

### BibTeX entry and citation info
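To make the preprocessing description above concrete, here is a minimal sketch of Kaldi-style 80-channel log mel filter-bank extraction followed by utterance-level CMVN, using torchaudio's Kaldi-compliance module; `audio.flac` is a placeholder path, and the exact frame/window parameters used for this checkpoint are not specified by the card.

```python
# Sketch of the described preprocessing, assuming torchaudio is installed.
# "audio.flac" is a placeholder; training-time parameters may differ.
import torchaudio
import torchaudio.compliance.kaldi as kaldi

waveform, sample_rate = torchaudio.load("audio.flac")
# Kaldi-compatible 80-channel log mel filter-bank features: shape (frames, 80)
feats = kaldi.fbank(waveform, num_mel_bins=80, sample_frequency=sample_rate)
# Utterance-level CMVN: normalize each feature dimension over the utterance.
feats = (feats - feats.mean(dim=0)) / (feats.std(dim=0) + 1e-10)
```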
[ "# S2T-SMALL-MUSTC-EN-RO-ST\n\n's2t-small-mustc-en-ro-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Romanian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-ro-st is trained on English-Romanian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-ro (BLEU score): 21.9", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ro #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-RO-ST\n\n's2t-small-mustc-en-ro-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Romanian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-ro-st is trained on English-Romanian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-ro (BLEU score): 21.9", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 43, 137, 117, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ro #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-RO-ST\n\n's2t-small-mustc-en-ro-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Romanian text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.08429563790559769, 0.12228985875844955, -0.005375843495130539, -0.01170217152684927, 0.07699723541736603, -0.06173764169216156, 0.03498663008213043, 0.07922649383544922, -0.060161564499139786, 0.06735061854124069, 0.015166877768933773, 0.06787145882844925, 0.043728120625019073, 0.056794166564941406, 0.0021761388052254915, -0.24579112231731415, 0.07284853607416153, -0.049071088433265686, 0.012905239127576351, 0.06351131200790405, 0.12294203788042068, -0.061532843858003616, 0.06406168639659882, 0.032545316964387894, 0.012605255469679832, 0.044458113610744476, 0.03402300551533699, -0.08019109070301056, 0.08160236477851868, 0.10925915092229843, 0.030799241736531258, 0.03507670387625694, 0.09918686747550964, -0.1248919740319252, -0.0052317180670797825, 0.03281179070472717, 0.01791958138346672, 0.027676736935973167, 0.04610235244035721, 0.027817584574222565, 0.12181579321622849, -0.01704222522675991, 0.0222785547375679, 0.08463732153177261, -0.024325789883732796, -0.026013968512415886, -0.024888716638088226, 0.04727960005402565, 0.1682269424200058, 0.11085229367017746, -0.0540492981672287, 0.047721877694129944, -0.03456493467092514, 0.04545364901423454, 0.010966259054839611, -0.19232258200645447, 0.00540350005030632, -0.11817863583564758, -0.03336859866976738, 0.017612088471651077, -0.03877320885658264, 0.005352802108973265, -0.03043232671916485, 0.0020736707374453545, 0.04274817928671837, -0.05341145768761635, -0.00009949479135684669, -0.07701942324638367, -0.1189427450299263, -0.02572171576321125, 0.10957848280668259, 0.010005074553191662, -0.07876956462860107, -0.10675148665904999, -0.042245443910360336, 0.044426482170820236, -0.042270924896001816, -0.034634459763765335, -0.01755443587899208, 0.03402801603078842, 0.04652443155646324, -0.123283252120018, -0.1272105723619461, 0.015055901370942593, -0.05598215013742447, 0.16067175567150116, 0.048530418425798416, 0.00983026996254921, 0.009506262838840485, 0.04871569946408272, -0.046662457287311554, -0.04828973487019539, 0.008882831782102585, -0.06036105751991272, -0.10361241549253464, -0.021482745185494423, -0.022723590955138206, -0.2195814549922943, -0.016832780092954636, 0.12412362545728683, -0.10315784811973572, 0.028035197407007217, 0.047646764665842056, 0.016260778531432152, -0.014547942206263542, 0.19622012972831726, -0.08224909752607346, -0.11501410603523254, 0.048480719327926636, -0.046702008694410324, 0.02965705096721649, 0.01919584907591343, -0.09789088368415833, -0.03088526800274849, -0.05686287209391594, 0.037676382809877396, 0.06650500744581223, 0.04496157914400101, 0.03842926025390625, -0.059562064707279205, 0.16439345479011536, -0.11704093217849731, -0.0007028304389677942, 0.023185955360531807, -0.0680217519402504, 0.1837746500968933, 0.021259382367134094, -0.001409227610565722, -0.18208882212638855, 0.061345770955085754, 0.00542179262265563, 0.03159443661570549, -0.05889163538813591, -0.12496361136436462, -0.00042515757377259433, -0.06097252294421196, -0.03728128597140312, -0.09800048172473907, -0.04168756306171417, -0.015442037023603916, 0.014378641732037067, -0.038906119763851166, 0.02166275680065155, -0.07795308530330658, -0.037459876388311386, 0.0030400140676647425, -0.03640788048505783, -0.1866907924413681, 0.021664300933480263, -0.011494707316160202, -0.025006862357258797, 0.021238569170236588, -0.08751903474330902, 0.033659253269433975, -0.09460058063268661, -0.029714621603488922, -0.22491811215877533, 0.1391955465078354, 0.01953073963522911, 0.09313222020864487, -0.10094958543777466, -0.06268396228551865, 
-0.05218445509672165, 0.03794415667653084, 0.03865652158856392, 0.14457494020462036, -0.1595553159713745, -0.035233087837696075, 0.17476503551006317, -0.09801597148180008, 0.03193862363696098, 0.17518174648284912, 0.05284067615866661, 0.04940762370824814, 0.14054034650325775, 0.10336615145206451, 0.09677674621343613, -0.06266533583402634, -0.09977225214242935, 0.014870131388306618, -0.1250249296426773, 0.05528347194194794, 0.04581815376877785, -0.09437773376703262, 0.15239594876766205, 0.02743089757859707, -0.03021000511944294, 0.014844760298728943, 0.0768231675028801, -0.008010617457330227, 0.033035434782505035, -0.017515355721116066, 0.1064862385392189, -0.018238220363855362, 0.018130075186491013, -0.0035468272399157286, -0.13287204504013062, 0.14818605780601501, 0.06496423482894897, -0.08281917124986649, 0.08426294475793839, -0.15033504366874695, 0.03658708930015564, -0.014600015245378017, 0.017437469214200974, -0.17372912168502808, 0.09436632692813873, -0.024146180599927902, 0.025002121925354004, 0.14120131731033325, 0.014508448541164398, 0.061417356133461, 0.006568904500454664, 0.010211720131337643, 0.014914113096892834, 0.07303791493177414, 0.0041494653560221195, 0.008131728507578373, -0.06183011084794998, -0.0069265784695744514, -0.008874722756445408, 0.13745403289794922, -0.08185414224863052, 0.030639782547950745, 0.027928682044148445, 0.08617853373289108, 0.007039589807391167, -0.01150144636631012, -0.032924722880125046, -0.012020386755466461, -0.013650164008140564, -0.032303158193826675, -0.0008317981846630573, 0.01423851028084755, -0.03869457170367241, 0.09092670679092407, -0.15089134871959686, -0.18076305091381073, 0.0902947261929512, -0.07393509149551392, -0.033395446836948395, 0.0728069320321083, -0.01001651119440794, 0.00243630213662982, 0.0562194287776947, -0.15048237144947052, 0.24805501103401184, 0.05580021068453789, 0.13279858231544495, -0.05635930597782135, -0.02185204066336155, -0.03276970610022545, -0.07168197631835938, 0.0031447154469788074, 0.006233127322047949, 0.10987637937068939, -0.1973450481891632, 0.008254842832684517, 0.01367928646504879, -0.025163866579532623, 0.12531761825084686, 0.02568291313946247, -0.10628122836351395, 0.037732068449258804, 0.041774604469537735, 0.03164810314774513, 0.0036557535640895367, -0.023810526356101036, 0.02290743961930275, 0.010374311357736588, 0.016255363821983337, 0.06046433374285698, -0.051407720893621445, 0.09702267497777939, 0.04351084306836128, -0.05096050724387169, -0.06615139544010162, -0.0020034476183354855, -0.0043849097564816475, 0.05670183524489403, 0.01823786273598671, 0.030322330072522163, -0.03664067015051842, 0.008213691413402557, -0.10065795481204987, 0.08165739476680756, -0.16169783473014832, -0.2972014248371124, -0.22201170027256012, -0.025952225551009178, -0.04687744379043579, 0.052646201103925705, 0.03725631162524223, -0.11091262102127075, -0.10333369672298431, -0.07510791718959808, 0.11259768903255463, -0.06842049211263657, -0.056141745299100876, -0.08200486749410629, 0.09496676921844482, 0.046871110796928406, -0.04830629378557205, 0.005647572688758373, -0.004355947487056255, -0.021822089329361916, -0.022859815508127213, -0.002835637191310525, -0.018189752474427223, 0.14463448524475098, -0.04280022904276848, -0.03832864388823509, -0.013874044641852379, 0.12500649690628052, -0.0671631470322609, 0.12933026254177094, 0.17356330156326294, -0.11562258750200272, 0.021592797711491585, 0.16476839780807495, 0.027264265343546867, -0.008447762578725815, -0.004095566459000111, -0.017240911722183228, 
-0.022801999002695084, -0.16910715401172638, -0.09289893507957458, -0.058708809316158295, -0.00318792462348938, 0.029124321416020393, 0.019121984019875526, 0.01708400994539261, 0.02233113721013069, -0.1000891700387001, -0.01786545291543007, 0.047284629195928574, 0.09071823954582214, 0.22223789989948273, -0.039077747613191605, 0.057371944189071655, -0.09060131758451462, -0.0439082607626915, 0.08560023456811905, -0.04025885835289955, 0.14937397837638855, 0.016271669417619705, 0.20544512569904327, 0.011289224028587341, -0.004708107095211744, 0.0357474647462368, 0.07112943381071091, -0.023089183494448662, -0.006749640218913555, 0.016926627606153488, -0.03627869114279747, -0.017971398308873177, 0.0612981952726841, 0.07573632895946503, -0.048644308000802994, 0.01900683343410492, 0.0407150574028492, 0.07911402732133865, 0.1596290022134781, 0.052075717598199844, -0.13848301768302917, -0.1113356351852417, 0.008806007914245129, -0.15077126026153564, -0.03030826710164547, -0.026677248999476433, 0.0912695825099945, -0.06435481458902359, 0.03821525350213051, 0.012771663255989552, 0.0771130844950676, -0.08549565821886063, 0.027145417407155037, -0.0190998837351799, 0.1418391913175583, 0.03393586724996567, 0.042570583522319794, -0.0129240732640028, 0.06549112498760223, 0.021555913612246513, 0.08389990776777267, -0.016602102667093277, 0.08146164566278458, -0.00948574859648943, 0.08335418999195099, 0.07474947720766068, -0.003318178001791239, -0.13195398449897766, -0.029730232432484627, -0.12738566100597382, -0.021681681275367737, 0.09079823642969131, 0.009936682879924774, 0.08342635631561279, -0.07300035655498505, -0.029616376385092735, -0.07921799272298813, -0.11742528527975082, -0.040630944073200226, -0.19760821759700775, 0.0468595027923584, 0.01006617583334446, 0.0019938016775995493, 0.02024925872683525, 0.04284586384892464, -0.05997224897146225, 0.2099492996931076, -0.1922905296087265, -0.0876077190041542, -0.10142018646001816, -0.031131023541092873, 0.15672823786735535, -0.06765452027320862, 0.07068274915218353, 0.038573890924453735, 0.07798454165458679, -0.001025150646455586, -0.041452229022979736, 0.032343875616788864, -0.10097992420196533, -0.006666109431535006, 0.005212932825088501, 0.17058530449867249, 0.015040687285363674, 0.07546774297952652, 0.059179000556468964, 0.0013473797589540482, 0.012964812107384205, -0.0910286158323288, -0.10638634115457535, 0.10975588113069534, -0.042510028928518295, -0.04136354848742485, -0.07796221971511841, -0.17320074141025543, -0.11740770190954208, 0.02287210337817669, 0.17560796439647675, 0.09064331650733948, -0.09726978838443756, 0.13081863522529602, 0.11604847759008408, -0.04276062175631523, -0.16122621297836304, -0.04430019110441208, 0.05446862056851387, 0.03857223317027092, 0.009286296553909779, -0.1590442657470703, 0.013892014510929585, -0.026591936126351357, -0.01342015527188778, -0.04175952076911926, -0.18590201437473297, -0.09129994362592697, 0.053888797760009766, -0.024705464020371437, 0.005031598266214132, 0.02226926200091839, -0.033846233040094376, -0.059250421822071075, -0.027699541300535202, -0.03177085146307945, -0.005213252268731594, 0.01795700192451477, 0.08833178877830505, 0.03862781822681427, 0.06675025820732117, 0.008827419020235538, 0.0651206374168396, -0.059605225920677185, -0.0917922705411911, -0.052285388112068176, 0.056706443428993225, 0.059005092829465866, -0.01624089665710926, 0.1995118409395218, -0.013915974646806717, -0.011149665340781212, -0.093559131026268, -0.07539133727550507, 0.004479679279029369, 0.004580160137265921, 
-0.0021954209078103304, 0.006964288651943207, 0.012313345447182655, -0.011704272590577602, 0.03804244101047516, 0.021844252943992615, -0.09022543579339981, -0.1515595018863678, -0.18182727694511414, 0.1525169461965561, 0.21750867366790771, 0.011732654646039009, -0.012386075221002102, 0.028859276324510574, 0.004388374742120504, 0.0046723647974431515, -0.0577082633972168, 0.06727141886949539, 0.04142449051141739, -0.024856962263584137, 0.041448526084423065, -0.03285062313079834, -0.06773464381694794, 0.04757104814052582, 0.08308753371238708, -0.03273544833064079, -0.10653889179229736, -0.023541269823908806, -0.004517813678830862, -0.07063386589288712, 0.03433064743876457, 0.1951635479927063, -0.0333193801343441, -0.008909037336707115, -0.014490139670670033, 0.009280198253691196, -0.07287116348743439, 0.09145553410053253, -0.06710227578878403, -0.006615329068154097, -0.07085947692394257, 0.05026809871196747, 0.09669553488492966, -0.03967677801847458, 0.028297172859311104, 0.11419570446014404, -0.11263719946146011, -0.07228286564350128, -0.16987672448158264, -0.19436204433441162, -0.0623345822095871, -0.07009753584861755, -0.05416419357061386, -0.041923508048057556, -0.011585034430027008, 0.021570414304733276, -0.01786433346569538, -0.008214215748012066, -0.08875621110200882, -0.021369699388742447, -0.10690819472074509, 0.019906288012862206, 0.07961796969175339, -0.032546769827604294, -0.061896827071905136, 0.11326645314693451, 0.004425858613103628, -0.039415132254362106, -0.02098560519516468, -0.01657349243760109, 0.005379193928092718, 0.010909109376370907, -0.1079765260219574, -0.042008258402347565, -0.06932182610034943, -0.050517693161964417, 0.02785627543926239, -0.005098865833133459, -0.015719516202807426, 0.012976654805243015, -0.05619306489825249, -0.021980544552206993, -0.01881505735218525, 0.009360143914818764, -0.05038032308220863, 0.08545500785112381, 0.028413742780685425, -0.056125205010175705, 0.10779367387294769, 0.08497733622789383, -0.05761544778943062, 0.10146870464086533, 0.041203949600458145, -0.10503432154655457, -0.017147120088338852, 0.07111755013465881, 0.05807540565729141, -0.015800602734088898, 0.021100463345646858, 0.056260332465171814, -0.01924172230064869, -0.05663295090198517, 0.01948300562798977, -0.026098905131220818, 0.10432901233434677, 0.002223056508228183, -0.025233376771211624, -0.04742646962404251, 0.025425905361771584, 0.04726056009531021, -0.016013192012906075, 0.14560967683792114, -0.05308808013796806, 0.059146810322999954, -0.04419592022895813, 0.0490693636238575, -0.010372632183134556, 0.027767151594161987, -0.0025293754879385233, -0.08392383903265, 0.027455631643533707, 0.006917744874954224, 0.023430267348885536, -0.028772663325071335, 0.12954241037368774, 0.07655299454927444, -0.07560291886329651, -0.006328233517706394, 0.04735713079571724, 0.08947929739952087, 0.043628938496112823, 0.007912118919193745, -0.1456744521856308, -0.005367576610296965, -0.021146822720766068, -0.11631256341934204, 0.020986754447221756, 0.10564327239990234, 0.026136189699172974, 0.10512664169073105, 0.03409164026379585, 0.0499400831758976, -0.14996153116226196, 0.041101619601249695, -0.0952211320400238, -0.0042438688687980175, -0.04969044402241707, 0.13656525313854218, 0.21985666453838348, -0.06961901485919952, 0.05632438138127327, 0.06820917129516602, -0.052522193640470505, -0.1097978949546814, -0.13298113644123077, -0.014283549971878529, -0.08913839608430862, 0.002634644042700529, -0.04817422106862068, 0.0162721648812294, -0.05409158766269684, 0.06883969902992249, 
-0.03813968598842621, 0.13782747089862823, -0.05170116946101189, -0.1492147594690323, 0.03684758022427559, 0.001360439695417881, 0.07546645402908325, 0.10158888250589371, 0.04571280628442764, 0.034259431064128876, 0.027511680498719215, 0.10203644633293152, 0.0437171570956707, 0.011004205793142319, 0.013267586007714272, -0.03137728571891785, -0.011337491683661938, -0.013565988279879093, 0.04345998540520668, 0.010861777700483799, 0.22633512318134308, 0.07302284240722656, -0.10400395840406418, -0.055490314960479736, 0.10138627141714096, -0.07775238156318665, -0.141755148768425, -0.19409413635730743, 0.12166918069124222, 0.10202572494745255, 0.047648314386606216, -0.02586270496249199, -0.09705329686403275, -0.011667203158140182, 0.20395806431770325, 0.1502976268529892, 0.007758599705994129, 0.0038868053816258907, -0.023807013407349586, 0.019914396107196808, -0.026573913171887398, 0.06563220173120499, -0.008909710682928562, 0.3226275146007538, 0.010111463256180286, -0.018388016149401665, -0.014020279981195927, 0.008838927373290062, -0.11153300106525421, 0.05632386356592178, -0.07001804560422897, -0.028669022023677826, -0.0019754597451537848, 0.11823000013828278, -0.02052445337176323, -0.16313707828521729, -0.12580490112304688, -0.008837185800075531, -0.028975151479244232, 0.05204377323389053, 0.022044984623789787, 0.04772451892495155, 0.059563830494880676, 0.016413046047091484, -0.047970715910196304, 0.19941483438014984, 0.0006724022096022964, -0.043467629700899124, 0.032434944063425064, 0.06585798412561417, -0.17085061967372894, 0.10195142030715942, -0.023756302893161774, 0.05772886052727699, 0.06640138477087021, 0.06287656724452972, -0.07690105587244034, 0.06553216278553009, -0.03379217907786369, -0.10286403447389603, 0.10322904586791992, 0.1489713191986084, -0.008416655473411083, 0.1252995878458023, 0.03410849720239639, -0.0757613405585289, 0.08627304434776306, -0.05061729624867439, -0.031552817672491074, -0.05527396872639656, 0.13130450248718262, -0.08485091477632523, 0.12889190018177032, 0.14434978365898132, -0.03372107073664665, 0.012202546000480652, -0.0516369491815567, 0.0045571764931082726, 0.019032100215554237, 0.016376107931137085, -0.06350525468587875, -0.15163472294807434, -0.021732382476329803, -0.08042395114898682, 0.009054132737219334, -0.18431077897548676, -0.02801135927438736, -0.011336402036249638, -0.0021163139026612043, -0.06080266833305359, 0.06414459645748138, 0.06268088519573212, 0.022548390552401543, -0.01681230030953884, 0.0680297389626503, -0.003101370297372341, 0.05053548514842987, -0.09865482896566391, -0.06390589475631714 ]
null
null
transformers
# S2T-SMALL-MUSTC-EN-RU-ST

`s2t-small-mustc-en-ru-st` is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text)


## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 (i.e., to a quarter of their
original length) before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and
generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Russian text translation.
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.


### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.

*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*

You could either install those as extra speech dependencies with
`pip install "transformers[speech,sentencepiece]"` or install the packages separately
with `pip install torchaudio sentencepiece`.


```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf

model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-mustc-en-ru-st")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-mustc-en-ru-st")

def map_to_array(batch):
    # Read the raw waveform; the processor expects 16 kHz mono audio.
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset(
    "patrickvonplaten/librispeech_asr_dummy",
    "clean",
    split="validation"
)
ds = ds.map(map_to_array)

inputs = processor(
    ds["speech"][0],
    sampling_rate=16_000,
    return_tensors="pt"
)
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```


## Training data

The s2t-small-mustc-en-ru-st is trained on the English-Russian subset of [MuST-C](https://ict.fbk.eu/must-c/).
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.


## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.


### Training

The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-ru (BLEU score): 15.3


### BibTeX entry and citation info

```bibtex
@inproceedings{wang2020fairseqs2t,
  title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
  author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
  booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
  year = {2020},
}
```
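The training section references SpecAugment without showing what the augmentation looks like. The sketch below applies SpecAugment-style frequency and time masking to dummy filter-bank features with `torchaudio.transforms`; the mask widths are illustrative placeholders, not the values used to train this checkpoint.

```python
# Illustrative SpecAugment-style masking; mask sizes are placeholder values.
import torch
import torchaudio.transforms as T

feats = torch.randn(1, 80, 500)  # (batch, mel bins, frames) — dummy features
augment = torch.nn.Sequential(
    T.FrequencyMasking(freq_mask_param=27),  # mask up to 27 consecutive mel bins
    T.TimeMasking(time_mask_param=100),      # mask up to 100 consecutive frames
)
augmented = augment(feats)
```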
{"language": ["en", "ru"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition"], "datasets": ["mustc"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Librispeech sample 1", "src": "https://cdn-media.huggingface.co/speech_samples/sample1.flac"}, {"example_title": "Librispeech sample 2", "src": "https://cdn-media.huggingface.co/speech_samples/sample2.flac"}]}
automatic-speech-recognition
facebook/s2t-small-mustc-en-ru-st
[ "transformers", "pytorch", "tf", "speech_to_text", "automatic-speech-recognition", "audio", "speech-translation", "en", "ru", "dataset:mustc", "arxiv:2010.05171", "arxiv:1904.08779", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2010.05171", "1904.08779" ]
[ "en", "ru" ]
TAGS #transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us
# S2T-SMALL-MUSTC-EN-RU-ST

's2t-small-mustc-en-ru-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).
The S2T model was proposed in this paper and released in
this repository

## Model description

S2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4 (i.e., to a quarter of their
original length) before they are fed into the encoder. The model is trained with standard autoregressive cross-entropy loss and
generates the transcripts/translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Russian text translation.
See the model hub to look for other S2T checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
transcripts by passing the speech features to the model.

*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the
filter bank features. Make sure to install the 'torchaudio' package before running this example.*

You could either install those as extra speech dependencies with
'pip install "transformers[speech,sentencepiece]"' or install the packages separately
with 'pip install torchaudio sentencepiece'.

## Training data

The s2t-small-mustc-en-ru-st is trained on the English-Russian subset of MuST-C.
MuST-C is a multilingual speech translation corpus whose size and quality facilitate the training of end-to-end systems
for speech translation from English into several languages. For each target language, MuST-C comprises several hundred
hours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual
transcriptions and translations.

## Training procedure

### Preprocessing

The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further, utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 8,000.

### Training

The model is trained with standard autoregressive cross-entropy loss and using SpecAugment.
The encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate
model training and improve performance, the encoder is pre-trained for English ASR.

## Evaluation results

MuST-C test results for en-ru (BLEU score): 15.3

### BibTeX entry and citation info
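As an alternative to the manual `generate` call described above, the checkpoint can also be driven through the high-level `pipeline` API. This is a sketch under the assumption that the speech extras (torchaudio, sentencepiece) and ffmpeg are installed; `sample.flac` is a placeholder for a 16 kHz English recording you supply.

```python
# Hedged sketch: high-level pipeline usage; "sample.flac" is a placeholder path.
from transformers import pipeline

translator = pipeline("automatic-speech-recognition", model="facebook/s2t-small-mustc-en-ru-st")
result = translator("sample.flac")  # decodes the audio and returns the Russian translation
print(result["text"])
```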
[ "# S2T-SMALL-MUSTC-EN-RU-ST\n\n's2t-small-mustc-en-ru-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Russian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-ru-st is trained on English-Russian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-ru (BLEU score): 15.3", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n", "# S2T-SMALL-MUSTC-EN-RU-ST\n\n's2t-small-mustc-en-ru-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository", "## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Russian text translation.\nSee the model hub to look for other S2T checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'.", "## Training data\n\nThe s2t-small-mustc-en-ru-st is trained on English-Russian subset of MuST-C.\nMuST-C is a multilingual speech translation corpus whose size and quality facilitates the training of end-to-end systems\nfor speech translation from English into several languages. For each target language, MuST-C comprises several hundred\nhours of audio recordings from English TED Talks, which are automatically aligned at the sentence level with their manual\ntranscriptions and translations.", "## Training procedure", "### Preprocessing\n\nThe speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from\nWAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)\nis applied to each example.\n\nThe texts are lowercased and tokenized using SentencePiece and a vocabulary size of 8,000.", "### Training\n\nThe model is trained with standard autoregressive cross-entropy loss and using SpecAugment.\nThe encoder receives speech features, and the decoder generates the transcripts autoregressively. To accelerate\nmodel training and for better performance the encoder is pre-trained for English ASR.", "## Evaluation results\n\nMuST-C test results for en-ru (BLEU score): 15.3", "### BibTeX entry and citation info" ]
[ 82, 78, 111, 42, 137, 117, 3, 100, 73, 21, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #speech_to_text #automatic-speech-recognition #audio #speech-translation #en #ru #dataset-mustc #arxiv-2010.05171 #arxiv-1904.08779 #license-mit #endpoints_compatible #region-us \n# S2T-SMALL-MUSTC-EN-RU-ST\n\n's2t-small-mustc-en-ru-st' is a Speech to Text Transformer (S2T) model trained for end-to-end Speech Translation (ST).\nThe S2T model was proposed in this paper and released in\nthis repository## Model description\n\nS2T is a transformer-based seq2seq (encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a convolutional downsampler to reduce the length of speech inputs by 3/4th before they are\nfed into the encoder. The model is trained with standard autoregressive cross-entropy loss and generates the\ntranscripts/translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Russian text translation.\nSee the model hub to look for other S2T checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\n*Note: The 'Speech2TextProcessor' object uses torchaudio to extract the\nfilter bank features. Make sure to install the 'torchaudio' package before running this example.*\n\nYou could either install those as extra speech dependancies with\n'pip install transformers\"[speech, sentencepiece]\"' or install the packages seperatly \nwith 'pip install torchaudio sentencepiece'." ]
[ -0.07082251459360123, 0.09052541851997375, -0.006205108016729355, -0.014708840288221836, 0.09705156832933426, -0.05201783403754234, 0.06427200883626938, 0.06603643298149109, -0.05485732480883598, 0.042700111865997314, 0.01915411464869976, 0.055155158042907715, 0.03823142498731613, 0.08704037219285965, 0.010253162123262882, -0.2546181082725525, 0.03556854650378227, -0.08288917690515518, 0.021771052852272987, 0.08081778138875961, 0.1345415562391281, -0.05648018419742584, 0.04956704378128052, 0.024737846106290817, -0.002918556332588196, 0.051497600972652435, 0.06220215559005737, -0.07828650623559952, 0.0526457242667675, 0.10733180493116379, 0.047851309180259705, 0.028905445709824562, 0.0684816911816597, -0.14742927253246307, 0.0005051905754953623, 0.03904591500759125, 0.006742177531123161, 0.028793253004550934, 0.05726991593837738, 0.022736554965376854, 0.15035009384155273, -0.04648108780384064, -0.006902898661792278, 0.09654674679040909, -0.026378976181149483, -0.028237180784344673, -0.027581028640270233, 0.06111987680196762, 0.17100471258163452, 0.12007556855678558, -0.06197991967201233, 0.008796008303761482, -0.00792684592306614, 0.06510856002569199, 0.04114426299929619, -0.2134406864643097, -0.0003047640493605286, -0.09159789234399796, -0.06949584186077118, 0.029564738273620605, -0.03317699208855629, 0.00851836334913969, -0.03369845822453499, -0.024646028876304626, 0.03821900859475136, -0.04672086983919144, 0.02902481146156788, -0.051437169313430786, -0.13314205408096313, -0.02169068157672882, 0.10988830029964447, -0.004848173353821039, -0.0667886957526207, -0.1132969856262207, -0.03998584300279617, -0.010051844641566277, -0.0386110357940197, -0.031748197972774506, -0.021167637780308723, 0.01575305499136448, 0.020853471010923386, -0.15036116540431976, -0.15048007667064667, 0.0011353903682902455, -0.05538563430309296, 0.16539457440376282, 0.043894100934267044, 0.028164878487586975, 0.005819350015372038, 0.049395978450775146, -0.07022694498300552, -0.042361095547676086, -0.0015767925651744008, -0.10290802270174026, -0.1350342035293579, 0.010402697138488293, -0.04528334364295006, -0.24431098997592926, -0.03200201317667961, 0.08978834748268127, -0.09515159577131271, 0.045198194682598114, 0.08258727192878723, 0.014552946202456951, -0.014579536393284798, 0.19167940318584442, -0.03359808772802353, -0.0754319280385971, 0.02745845541357994, -0.058726172894239426, 0.021044574677944183, 0.025842102244496346, -0.10113907605409622, -0.02603474073112011, -0.04477646201848984, 0.0398617684841156, 0.036562226712703705, 0.0431511215865612, 0.031816501170396805, -0.048505086451768875, 0.16171036660671234, -0.11424500495195389, 0.010600727051496506, 0.014332112856209278, -0.03924962878227234, 0.14326608180999756, 0.008056080900132656, -0.01368026901036501, -0.15854217112064362, 0.06713163107633591, 0.029925595968961716, 0.058521758764982224, -0.06827112287282944, -0.11865513026714325, -0.023557865992188454, -0.10485304892063141, -0.026981651782989502, -0.13463109731674194, -0.09038051217794418, -0.024879038333892822, -0.00006176692113513127, -0.03602581471204758, 0.04963156580924988, -0.06572118401527405, -0.06000150740146637, -0.0030953355599194765, -0.04458596929907799, -0.16710084676742554, 0.004228467121720314, -0.022502241656184196, -0.029068157076835632, 0.03609870746731758, -0.024012455716729164, 0.010058820247650146, -0.06806369870901108, -0.05012661591172218, -0.1977413445711136, 0.1836707442998886, 0.0021647263783961535, 0.026203611865639687, -0.08981549739837646, -0.0747038871049881, 
-0.06675860285758972, 0.05659255012869835, 0.03936444967985153, 0.15222182869911194, -0.18615132570266724, -0.036373984068632126, 0.17493493854999542, -0.08311007916927338, 0.06942762434482574, 0.15008623898029327, 0.03498679772019386, 0.024590594694018364, 0.14795660972595215, 0.11031752079725266, 0.09534729272127151, -0.05621685832738876, -0.11336655169725418, 0.016660617664456367, -0.14466337859630585, 0.059412240982055664, 0.03053337335586548, -0.057814180850982666, 0.12284925580024719, 0.023416120558977127, -0.004662148654460907, -0.013247169554233551, 0.06152056157588959, -0.029182568192481995, 0.04301470145583153, -0.026039134711027145, 0.10328434407711029, -0.038619183003902435, 0.020305965095758438, -0.029058167710900307, -0.14283473789691925, 0.1280556619167328, 0.04267971217632294, -0.04184052720665932, 0.0856236144900322, -0.1382838785648346, -0.05870778486132622, -0.0039278012700378895, 0.025049058720469475, -0.1767289936542511, 0.09153192490339279, -0.031574565917253494, 0.03426908701658249, 0.13735461235046387, 0.022648127749562263, 0.07113881409168243, -0.006928164977580309, -0.009191718883812428, 0.006351442541927099, 0.06622743606567383, 0.0019231508485972881, -0.016513746231794357, -0.06486445665359497, -0.011165078729391098, -0.013455156236886978, 0.08602866530418396, -0.05913095921278, 0.03089451603591442, 0.013695847243070602, 0.0646856501698494, 0.016869181767106056, -0.008870737627148628, -0.019245952367782593, -0.013939332216978073, -0.002339910715818405, -0.011987164616584778, -0.011987654492259026, 0.00889204815030098, -0.05957172438502312, 0.12311579287052155, -0.11100128293037415, -0.1843719780445099, 0.08664906024932861, -0.016111403703689575, -0.043689750134944916, 0.05515040457248688, -0.015447234734892845, -0.0001402115449309349, 0.06506308913230896, -0.1506071388721466, 0.2575400769710541, 0.059572767466306686, 0.10178925096988678, -0.05020957440137863, -0.027344949543476105, -0.022958382964134216, -0.09719809889793396, 0.004032190423458815, 0.01736210659146309, 0.03105476312339306, -0.21244975924491882, 0.004546191543340683, -0.00926861260086298, -0.011859816499054432, 0.16714486479759216, 0.024893756955862045, -0.0794689729809761, 0.02797034941613674, 0.0034179100766777992, 0.009380734525620937, 0.02054988034069538, 0.010112464427947998, -0.007888372987508774, 0.01706189662218094, 0.022953473031520844, 0.048307858407497406, -0.04881373047828674, 0.08589786291122437, 0.05258765071630478, -0.06509615480899811, -0.04628283157944679, 0.01273877639323473, -0.0009012609953060746, 0.05559469014406204, -0.004193091299384832, 0.05909917503595352, -0.03984512388706207, -0.02289755642414093, -0.09566556662321091, 0.08235052973031998, -0.18102169036865234, -0.28824514150619507, -0.18610207736492157, -0.0342753529548645, -0.02899603359401226, 0.07028476893901825, 0.06766314804553986, -0.15488103032112122, -0.07816470414400101, -0.07403449714183807, 0.10352868586778641, -0.04071183502674103, -0.03291108086705208, -0.09709560871124268, 0.0845356211066246, 0.04341336712241173, -0.03152962028980255, -0.003547556232661009, -0.008262662217020988, -0.01927783340215683, 0.02317453734576702, 0.02151799388229847, -0.03813371807336807, 0.12117280811071396, -0.026526687666773796, -0.03106503374874592, -0.043891143053770065, 0.09674699604511261, -0.04267130792140961, 0.1261029988527298, 0.17632049322128296, -0.0860925167798996, 0.0181096401065588, 0.12247508764266968, 0.00373362866230309, -0.0007153219194151461, 0.018573230132460594, -0.022858718410134315, -0.03355669230222702, 
-0.20295925438404083, -0.07837099581956863, -0.050735700875520706, 0.030152518302202225, 0.02143966592848301, 0.009544187225401402, 0.00249992567114532, 0.02085108496248722, -0.09486930817365646, -0.05166025832295418, 0.04888366535305977, 0.0844842940568924, 0.1772284209728241, -0.039423245936632156, 0.0868830606341362, -0.07489816844463348, -0.008528579026460648, 0.10258091986179352, -0.06313298642635345, 0.15378423035144806, 0.009917494840919971, 0.17127546668052673, 0.02445058338344097, -0.009440929628908634, 0.03448827192187309, 0.059687837958335876, 0.013914753682911396, 0.009509729221463203, 0.018129359930753708, -0.05254300683736801, -0.04019799828529358, 0.10482218116521835, 0.07097530364990234, -0.0647629052400589, 0.02775772474706173, 0.09404107183218002, 0.08829255402088165, 0.14239837229251862, 0.05726848170161247, -0.11229245364665985, -0.13209007680416107, 0.00982984434813261, -0.14275939762592316, -0.021787073463201523, -0.00578294787555933, 0.09935513883829117, -0.05922636762261391, 0.04220515117049217, 0.00012751274334732443, 0.08100830763578415, -0.04191695898771286, 0.04530200734734535, 0.01008547656238079, 0.15898221731185913, 0.030852362513542175, 0.0454537533223629, -0.06240350753068924, 0.05916878581047058, 0.013330429792404175, 0.08463944494724274, -0.019517339766025543, 0.05968977510929108, -0.012923309579491615, 0.026783742010593414, 0.07719643414020538, 0.003959070425480604, -0.17103509604930878, -0.010928977280855179, -0.12635275721549988, -0.03506326302886009, 0.12015116214752197, 0.018811052665114403, 0.08294549584388733, -0.04805910959839821, -0.030068935826420784, -0.06877615302801132, -0.11397774517536163, -0.06581660360097885, -0.20007479190826416, 0.027178481221199036, 0.02309316396713257, 0.003590606153011322, 0.005031621549278498, 0.04160580039024353, -0.03946781903505325, 0.1746283620595932, -0.1772817075252533, -0.12060874700546265, -0.09501448273658752, -0.013847922906279564, 0.14894969761371613, -0.06997862458229065, 0.08183025568723679, 0.060193683952093124, 0.07134533673524857, 0.003442640881985426, -0.054856009781360626, 0.029109567403793335, -0.10430023074150085, -0.022145308554172516, 0.0070714266039431095, 0.192422553896904, 0.02182878740131855, 0.06832737475633621, 0.029092034325003624, 0.020551931113004684, 0.003058166243135929, -0.07848574221134186, -0.06939565390348434, 0.09955321997404099, 0.002657116623595357, 0.023634059354662895, -0.04453952983021736, -0.13647107779979706, -0.13794565200805664, 0.008935492485761642, 0.14390261471271515, 0.1008291244506836, -0.12406332790851593, 0.15913967788219452, 0.04078546166419983, -0.03927890583872795, -0.16582277417182922, -0.07352711260318756, 0.08822774142026901, 0.05546195060014725, 0.019291579723358154, -0.1701648086309433, -0.005063317716121674, -0.04198852553963661, -0.008290735073387623, -0.03537800908088684, -0.17159417271614075, -0.11762191355228424, 0.04394999146461487, -0.05087883025407791, 0.008108672685921192, 0.018022162839770317, -0.029429299756884575, -0.06484062969684601, 0.002956397132948041, -0.025988154113292694, -0.0054091583006083965, 0.04825325682759285, 0.07690244913101196, 0.04025520384311676, 0.065435990691185, 0.008329708129167557, 0.051287755370140076, -0.05179803818464279, -0.060533102601766586, -0.05126601457595825, 0.06390280276536942, 0.024007724598050117, -0.027524307370185852, 0.18708811700344086, -0.02277515083551407, 0.023759188130497932, -0.06977912783622742, -0.07394247502088547, -0.0035385340452194214, -0.00012138760939706117, 0.01620037667453289, 
-0.01716488040983677, -0.017951076850295067, -0.009804447181522846, 0.04985664784908295, -0.0020464782137423754, -0.03866974636912346, -0.14965207874774933, -0.1816643327474594, 0.17029830813407898, 0.21093210577964783, 0.04229801148176193, -0.035636160522699356, 0.01911633461713791, -0.002034838544204831, 0.003953580744564533, -0.09817373752593994, 0.04429022967815399, 0.025661373510956764, -0.03115753084421158, 0.03552189841866493, -0.041345540434122086, -0.06359825283288956, 0.01988714188337326, 0.0801684707403183, -0.059772592037916183, -0.12230583280324936, -0.03697436675429344, 0.019673453643918037, -0.04373932257294655, 0.029924564063549042, 0.19907739758491516, -0.06224675476551056, -0.002140501281246543, -0.010241535492241383, 0.027547689154744148, -0.06305217742919922, 0.1056080162525177, -0.053741615265607834, 0.009903146885335445, -0.06116996333003044, 0.07125500589609146, 0.05302167683839798, -0.0007698989356867969, 0.04487333074212074, 0.10698940604925156, -0.12381164729595184, -0.06616081297397614, -0.17321793735027313, -0.17921876907348633, -0.05142929404973984, -0.057341285049915314, -0.02374381013214588, -0.046293146908283234, 0.002722834935411811, -0.0037817771080881357, -0.007325360085815191, 0.0013337436830624938, -0.08086482435464859, 0.002923963824287057, -0.07196366041898727, 0.03418640047311783, 0.0869121327996254, -0.027150550857186317, -0.09176750481128693, 0.10242613404989243, -0.03280354291200638, 0.003454344579949975, -0.029833011329174042, 0.01612967997789383, 0.0076520307920873165, 0.01181842666119337, -0.11525151133537292, -0.04116945341229439, -0.07270264625549316, -0.07341279089450836, 0.02956192009150982, 0.0001471320283599198, -0.008576858788728714, 0.040042534470558167, -0.04928359389305115, -0.01784142665565014, -0.05245224013924599, 0.004914248827844858, -0.049166057258844376, 0.0771060511469841, -0.008073866367340088, -0.04526098817586899, 0.10562955588102341, 0.13852718472480774, -0.054714709520339966, 0.12253106385469437, 0.06864923983812332, -0.04462047293782234, 0.011329774744808674, 0.0673777386546135, 0.033275704830884933, -0.019021030515432358, 0.005393035244196653, 0.04558445140719414, -0.019402585923671722, -0.061383601278066635, 0.014062564820051193, -0.036844465881586075, 0.09561145305633545, -0.009329765103757381, 0.030546728521585464, -0.05513754114508629, 0.013602793216705322, 0.04472768306732178, 0.01925826072692871, 0.1852618157863617, -0.0870671346783638, 0.07015145570039749, -0.055034711956977844, 0.04548938572406769, -0.008188280276954174, -0.008198205381631851, -0.006185648497194052, -0.10046425461769104, 0.04677818343043327, -0.014933237805962563, 0.0495128333568573, 0.00006561803456861526, 0.06831551343202591, 0.07248319685459137, -0.10009737312793732, 0.01124751940369606, 0.05449811369180679, 0.1021152213215828, 0.03608163818717003, -0.02066241391003132, -0.1260765641927719, -0.007394827902317047, -0.015197059139609337, -0.12970301508903503, 0.05130380764603615, 0.11794281750917435, 0.05344221368432045, 0.12457746267318726, 0.02814507484436035, 0.03012058511376381, -0.1142098680138588, 0.017830198630690575, -0.1289210170507431, 0.0002022765838773921, -0.06807703524827957, 0.1108136847615242, 0.2090691179037094, -0.08918467909097672, 0.08045928925275803, 0.06070882827043533, -0.055852677673101425, -0.11270283162593842, -0.1342480331659317, -0.025134118273854256, -0.08679772913455963, -0.010359004139900208, -0.06374464929103851, 0.01260383240878582, -0.08344783633947372, 0.06151874363422394, -0.03030914068222046, 
0.17322969436645508, -0.054863572120666504, -0.14931204915046692, 0.0746615082025528, 0.0010522177908569574, 0.05687369778752327, 0.13372014462947845, 0.03787458688020706, 0.05970473960042, -0.007001898251473904, 0.08094070851802826, 0.040954943746328354, 0.00043334922520443797, 0.016810502856969833, -0.006765460129827261, -0.03642004355788231, -0.008673283271491528, 0.07104277610778809, 0.03937352076172829, 0.19581560790538788, 0.07498912513256073, -0.09078963845968246, -0.05095404013991356, 0.14146801829338074, -0.08077451586723328, -0.1505890190601349, -0.14500832557678223, 0.15477918088436127, 0.11412659287452698, 0.05935223773121834, -0.03161235153675079, -0.10619944334030151, -0.011046327650547028, 0.16592873632907867, 0.20158742368221283, -0.001581059186719358, 0.016483325511217117, -0.040347978472709656, 0.029180992394685745, -0.026311207562685013, 0.0814252570271492, -0.009637811221182346, 0.26057907938957214, 0.036960091441869736, 0.015875378623604774, 0.00215887650847435, -0.0028423755429685116, -0.10288360714912415, 0.034794215112924576, -0.05428801849484444, -0.04362504929304123, 0.023911360651254654, 0.1113419160246849, -0.02727418765425682, -0.1847829967737198, -0.11538390070199966, 0.011573233641684055, -0.01425738725811243, 0.08719620108604431, 0.07385867834091187, 0.07755886763334274, 0.07856649905443192, 0.02151019312441349, -0.036855150014162064, 0.1858263909816742, 0.013128984719514847, -0.045388467609882355, 0.06476784497499466, 0.045328084379434586, -0.15883004665374756, 0.03242352604866028, -0.01536476518958807, 0.08683861047029495, 0.04887858033180237, 0.052314311265945435, -0.08138470351696014, 0.07605911791324615, -0.025971578434109688, -0.12321363389492035, 0.09513144195079803, 0.16345496475696564, -0.002783233532682061, 0.07908670604228973, 0.039057258516550064, -0.09244344383478165, 0.04204086586833, 0.0013338429853320122, -0.027426348999142647, -0.05255766585469246, 0.14147111773490906, -0.10061995685100555, 0.11652419716119766, 0.15255789458751678, -0.0371529795229435, 0.008027148433029652, -0.025486929342150688, 0.004240368027240038, 0.014336843974888325, 0.037199217826128006, -0.04339990392327309, -0.1224999725818634, -0.044347330927848816, -0.028296686708927155, 0.042620521038770676, -0.14799483120441437, -0.00993183720856905, -0.014013608917593956, -0.007665091194212437, -0.047659385949373245, 0.08678188920021057, 0.048436153680086136, 0.03341764584183693, -0.009129778482019901, 0.04888099804520607, 0.02412590943276882, 0.06086845323443413, -0.11285021156072617, -0.0879315584897995 ]
null
null
transformers
# S2T2-Wav2Vec2-CoVoST2-EN-AR-ST

`s2t-wav2vec2-large-en-ar` is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST). The S2T2 model was proposed in [Large-Scale Self- and Semi-Supervised Learning for Speech Translation](https://arxiv.org/pdf/2104.06678.pdf) and officially released in [Fairseq](https://github.com/pytorch/fairseq/blob/6f847c8654d56b4d1b1fbacec027f47419426ddb/fairseq/models/wav2vec/wav2vec2_asr.py#L266).

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained [Wav2Vec2](https://huggingface.co/transformers/model_doc/wav2vec2.html) as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Arabic text translation. See the [model hub](https://huggingface.co/models?filter=speech2text2) to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

You can use the model directly via the ASR pipeline

```python
from datasets import load_dataset
from transformers import pipeline

librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
asr = pipeline("automatic-speech-recognition", model="facebook/s2t-wav2vec2-large-en-ar", feature_extractor="facebook/s2t-wav2vec2-large-en-ar")

translation = asr(librispeech_en[0]["file"])
```

or step-by-step as follows:

```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
import soundfile as sf

model = SpeechEncoderDecoderModel.from_pretrained("facebook/s2t-wav2vec2-large-en-ar")
processor = Speech2Text2Processor.from_pretrained("facebook/s2t-wav2vec2-large-en-ar")

# read each audio file into a raw waveform array
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
# pass the features positionally so the call works across `generate` signatures
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```

## Evaluation results

CoVoST-V2 test results for en-ar (BLEU score): **20.2**

For more information, please have a look at the [official paper](https://arxiv.org/pdf/2104.06678.pdf) - especially row 10 of Table 2.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-06678,
  author    = {Changhan Wang and
               Anne Wu and
               Juan Miguel Pino and
               Alexei Baevski and
               Michael Auli and
               Alexis Conneau},
  title     = {Large-Scale Self- and Semi-Supervised Learning for Speech Translation},
  journal   = {CoRR},
  volume    = {abs/2104.06678},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.06678},
  archivePrefix = {arXiv},
  eprint    = {2104.06678},
  timestamp = {Thu, 12 Aug 2021 15:37:06 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-06678.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
{"language": ["en", "ar"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition", "speech2text2"], "datasets": ["covost2", "librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Common Voice 1", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Common Voice 2", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99987.mp3"}, {"example_title": "Common Voice 3", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99988.mp3"}]}
automatic-speech-recognition
facebook/s2t-wav2vec2-large-en-ar
[ "transformers", "pytorch", "speech-encoder-decoder", "automatic-speech-recognition", "audio", "speech-translation", "speech2text2", "en", "ar", "dataset:covost2", "dataset:librispeech_asr", "arxiv:2104.06678", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.06678" ]
[ "en", "ar" ]
TAGS #transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ar #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us
# S2T2-Wav2Vec2-CoVoST2-EN-AR-ST

's2t-wav2vec2-large-en-ar' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST). The S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in Fairseq.

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Arabic text translation. See the model hub to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model.

You can use the model directly via the ASR pipeline (see the sketch below) or step-by-step as shown in the full card above.

## Evaluation results

CoVoST-V2 test results for en-ar (BLEU score): 20.2

For more information, please have a look at the official paper - especially row 10 of Table 2.

### BibTeX entry and citation info
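A minimal sketch of the one-call pipeline usage, mirroring the code in the full card above (the dummy LibriSpeech split is used only as a convenient audio source):

```python
# Sketch: English speech -> Arabic text via the ASR pipeline.
from datasets import load_dataset
from transformers import pipeline

librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
asr = pipeline(
    "automatic-speech-recognition",
    model="facebook/s2t-wav2vec2-large-en-ar",
    feature_extractor="facebook/s2t-wav2vec2-large-en-ar",
)
translation = asr(librispeech_en[0]["file"])  # returns the Arabic translation text
```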
[ "# S2T2-Wav2Vec2-CoVoST2-EN-AR-ST\n\n's2t-wav2vec2-large-en-ar' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Arabic text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-ar (BLEU score): 20.2\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ar #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n", "# S2T2-Wav2Vec2-CoVoST2-EN-AR-ST\n\n's2t-wav2vec2-large-en-ar' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Arabic text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-ar (BLEU score): 20.2\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ 93, 93, 107, 43, 67, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ar #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n# S2T2-Wav2Vec2-CoVoST2-EN-AR-ST\n\n's2t-wav2vec2-large-en-ar' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Arabic text translation.\nSee the model hub to look for other S2T2 checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:## Evaluation results\n\nCoVoST-V2 test results for en-ar (BLEU score): 20.2\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.### BibTeX entry and citation info" ]
[ -0.05044690892100334, 0.07356016337871552, -0.006880472414195538, -0.006773841101676226, 0.06524987518787384, -0.01923084445297718, 0.10728432238101959, 0.03764132782816887, -0.04895508289337158, 0.04524092376232147, 0.038799822330474854, 0.03690558299422264, 0.0985456332564354, 0.09463736414909363, 0.03879981487989426, -0.23952211439609528, 0.028651172295212746, -0.04895544424653053, 0.08192642033100128, 0.056851569563150406, 0.13407227396965027, -0.04530651122331619, 0.09293319284915924, 0.04931975156068802, -0.00887201726436615, 0.026164377108216286, 0.03475615754723549, -0.07743802666664124, 0.055913761258125305, 0.08904958516359329, 0.11526359617710114, 0.010287141427397728, 0.10012863576412201, -0.1756056249141693, 0.004227072931826115, 0.04569164291024208, 0.010819475166499615, 0.035747136920690536, 0.05692513659596443, -0.043530289083719254, 0.0965714082121849, -0.02848041243851185, 0.049769479781389236, 0.042284365743398666, -0.0639076828956604, -0.09995298832654953, -0.03580910712480545, 0.059595439583063126, 0.10464812815189362, 0.08569139242172241, -0.03168352320790291, -0.07756171375513077, -0.016311200335621834, 0.05139368772506714, 0.038488518446683884, -0.20146632194519043, -0.005191158503293991, -0.07570542395114899, -0.007127182092517614, 0.03162049502134323, -0.018870238214731216, -0.019749924540519714, -0.02025200054049492, 0.006079485174268484, 0.05869603157043457, -0.05934089422225952, 0.000526290968991816, -0.07052174210548401, -0.1136641874909401, -0.04977812245488167, 0.09409110993146896, -0.037300482392311096, -0.0756043866276741, -0.0747828483581543, -0.05154252424836159, 0.012376591563224792, -0.007345783989876509, -0.04082966968417168, 0.007608323823660612, 0.03612097352743149, 0.04676676169037819, -0.1046544685959816, -0.11792440712451935, -0.039400938898324966, -0.08748090267181396, 0.05445679649710655, 0.036739472299814224, 0.022998038679361343, -0.032277967780828476, 0.09795376658439636, -0.11271771043539047, -0.0718625858426094, -0.050717003643512726, -0.031050346791744232, -0.13291239738464355, -0.007216676604002714, -0.03126373514533043, -0.21073970198631287, -0.05558690056204796, 0.12036938220262527, -0.00491910008713603, 0.07231307029724121, 0.0006867895135655999, 0.0456000417470932, -0.0205044187605381, 0.2054944485425949, -0.03796827793121338, -0.043262869119644165, -0.007725595962256193, -0.026585357263684273, 0.00034515862353146076, 0.003916469402611256, -0.09011365473270416, -0.022755635902285576, -0.025233197957277298, 0.02350836619734764, -0.012268771417438984, 0.030219152569770813, -0.007089426275342703, -0.011364620178937912, 0.1134643405675888, -0.1371156871318817, 0.0341973602771759, 0.04521137475967407, -0.02748345211148262, 0.07180047780275345, 0.07902760058641434, 0.01659911312162876, -0.10822043567895889, 0.07249386608600616, 0.015133796259760857, 0.05030452832579613, -0.08048416674137115, -0.06388778984546661, -0.004854355938732624, -0.04750525578856468, -0.026204142719507217, -0.1291346400976181, -0.1509857177734375, -0.05515125021338463, 0.0199056975543499, -0.04835117980837822, 0.053953442722558975, -0.10117162019014359, 0.02798382192850113, 0.03714856877923012, -0.04613124579191208, -0.057140741497278214, 0.00744013674557209, -0.01631869561970234, -0.032778602093458176, 0.03076637163758278, -0.04029732942581177, 0.041587479412555695, -0.05517120659351349, 0.0008117349934764206, -0.1318751871585846, 0.20557016134262085, 0.010771866887807846, -0.03969675675034523, -0.12872251868247986, -0.03882138803601265, -0.05463169887661934, 
0.05277689918875694, 0.020383529365062714, 0.09084559231996536, -0.21607425808906555, -0.0345793254673481, 0.21513564884662628, -0.08733103424310684, 0.0439920537173748, 0.12950186431407928, -0.026353703811764717, 0.08014722168445587, 0.09963119775056839, 0.05917570739984512, 0.09922128170728683, -0.029298188164830208, -0.04714108631014824, 0.02418624982237816, -0.04410064220428467, 0.10960522294044495, 0.07012932747602463, -0.07932202517986298, 0.0714002475142479, 0.004286514595150948, -0.045768141746520996, -0.007377914618700743, 0.043781816959381104, -0.05569576844573021, 0.02161669358611107, -0.022293061017990112, 0.042701806873083115, -0.034676358103752136, 0.036360450088977814, 0.03897074982523918, -0.10867425799369812, 0.06526495516300201, 0.0369369201362133, -0.0669018104672432, 0.07165224850177765, -0.16216257214546204, -0.04031331092119217, -0.02923673577606678, 0.004135963972657919, -0.16584157943725586, 0.03916915878653526, -0.014943515881896019, -0.022675251588225365, 0.1446954309940338, 0.03134942054748535, 0.054992061108350754, -0.005723249167203903, -0.02040860429406166, -0.009207875467836857, 0.0769028589129448, -0.01808072440326214, -0.04502762854099274, -0.05527295544743538, -0.009449729695916176, -0.04034433886408806, 0.07442741841077805, -0.10550370812416077, 0.010355226695537567, 0.005705863703042269, 0.08622627705335617, 0.008792500011622906, -0.013177153654396534, -0.021260656416416168, 0.03813700005412102, 0.005217841360718012, -0.02263469621539116, -0.020282894372940063, -0.00877609383314848, -0.10592696070671082, 0.11119394749403, -0.18205326795578003, -0.10753867030143738, 0.04358851537108421, 0.0012179577024653554, -0.08697883039712906, 0.019197257235646248, 0.04565219208598137, -0.01635454222559929, 0.0075760455802083015, -0.08068452030420303, 0.34899863600730896, 0.01585959643125534, 0.1386486142873764, -0.06711579114198685, -0.01369608473032713, -0.03910660743713379, -0.05800286680459976, 0.021732453256845474, 0.0456816703081131, -0.011678459122776985, -0.19142843782901764, 0.0268116295337677, 0.03206723555922508, -0.05579403415322304, 0.10875612497329712, 0.03394639119505882, -0.06785173714160919, 0.022459400817751884, -0.008992296643555164, 0.012423668056726456, -0.053676579147577286, 0.00635936763137579, -0.015494718216359615, 0.0265366043895483, 0.034987032413482666, 0.02032146044075489, -0.07245808094739914, 0.09521695971488953, 0.07444415241479874, -0.04789978265762329, -0.06646271049976349, -0.02141113579273224, 0.006660588085651398, 0.0437251441180706, 0.01620626263320446, -0.019675642251968384, 0.002388501074165106, -0.02574826590716839, -0.13354143500328064, 0.1365320235490799, -0.13404153287410736, -0.23319882154464722, -0.20950323343276978, -0.023040559142827988, 0.04389436915516853, 0.06172801926732063, 0.026037152856588364, -0.06379815936088562, -0.05965252220630646, -0.09656922519207001, 0.08690522611141205, -0.047936346381902695, -0.0538087859749794, -0.04373038932681084, -0.012307567521929741, 0.02215077541768551, -0.08175420761108398, 0.016201045364141464, -0.011792028322815895, -0.08489135652780533, -0.0028245856519788504, 0.0012349829776212573, -0.03958292677998543, 0.14617405831813812, 0.0067538307048380375, -0.03649887442588806, -0.03455592691898346, 0.1326730102300644, -0.08124461024999619, 0.0984896868467331, 0.1391127109527588, -0.12887653708457947, 0.04407138004899025, 0.1487443596124649, 0.011122853495180607, -0.021492375060915947, -0.010600334033370018, -0.04046954587101936, -0.04087354242801666, -0.22750303149223328, 
-0.052965279668569565, -0.04019307345151901, -0.07173220813274384, 0.06310601532459259, 0.021438565105199814, 0.021394869312644005, 0.05984393134713173, -0.05540613457560539, 0.043187301605939865, 0.08512473851442337, 0.050512127578258514, 0.11181381344795227, -0.005148028954863548, 0.06060931831598282, -0.06902533769607544, -0.016161195933818817, 0.04802394285798073, 0.008661746978759766, 0.10826059430837631, 0.01922052539885044, 0.1432218849658966, 0.01387778390198946, 0.008145633153617382, 0.05773090198636055, 0.07186377793550491, -0.039576392620801926, 0.041388869285583496, -0.017486749216914177, -0.034369103610515594, -0.08066617697477341, 0.11809616535902023, 0.07183963060379028, -0.05224660411477089, 0.03138626739382744, 0.04199467971920967, 0.020442306995391846, 0.15315930545330048, 0.05183083564043045, -0.2044663429260254, -0.05269617959856987, 0.057344935834407806, -0.08787932991981506, -0.0873832181096077, -0.0027293204329907894, 0.08331924676895142, -0.11990831792354584, 0.08473224192857742, 0.014613492414355278, 0.07622764259576797, -0.11747747659683228, -0.007567408494651318, -0.02577188052237034, 0.09403638541698456, 0.04875655099749565, 0.04367546737194061, -0.21757163107395172, 0.0691308081150055, 0.024166392162442207, 0.09999026358127594, -0.02179769240319729, 0.0661899670958519, -0.005964827723801136, 0.07665038853883743, 0.12021197378635406, 0.009640608914196491, -0.18436299264431, -0.011251088231801987, -0.08222775161266327, 0.003867503721266985, 0.10496367514133453, -0.041212089359760284, 0.04880563169717789, -0.0294421948492527, -0.030328936874866486, -0.03333926573395729, -0.08000453561544418, -0.061763107776641846, -0.19031572341918945, 0.03255466744303703, 0.03482472524046898, 0.14793656766414642, -0.010555167682468891, 0.034339651465415955, -0.04156530648469925, 0.1389693170785904, -0.28782621026039124, -0.09611529111862183, -0.09167993813753128, -0.07321541756391525, 0.1409284919500351, -0.09855374693870544, 0.06896761804819107, 0.03227158635854721, 0.08834967762231827, -0.0014903565170243382, -0.07946042716503143, 0.036382660269737244, -0.08316002041101456, -0.0968046635389328, -0.04053818807005882, 0.1138596460223198, 0.03899699077010155, 0.07262451946735382, 0.05277575924992561, 0.033612824976444244, -0.0303553007543087, -0.04936801269650459, -0.033238958567380905, 0.20786096155643463, -0.021559249609708786, 0.046076610684394836, -0.03235911205410957, -0.1375381052494049, -0.08280332386493683, -0.007536462973803282, 0.14871609210968018, 0.003714608959853649, -0.06727778166532516, 0.12617157399654388, 0.11512628197669983, -0.11905495077371597, -0.18489237129688263, -0.09432820230722427, 0.09045643359422684, 0.050551872700452805, -0.036065053194761276, -0.22812166810035706, -0.002707388484850526, -0.02064386196434498, -0.0005692677805200219, -0.0462287999689579, -0.18127012252807617, -0.13403309881687164, 0.14177654683589935, -0.08700130879878998, 0.03311590477824211, -0.011940318159759045, -0.07419747859239578, -0.08731234818696976, -0.03107229806482792, 0.02283141389489174, -0.0016434708377346396, 0.05422987416386604, 0.06419134140014648, 0.08203878998756409, 0.02946680225431919, -0.001558691612444818, 0.07254549860954285, 0.032942481338977814, -0.017381764948368073, -0.05719801411032677, 0.0732889398932457, -0.07054602354764938, -0.026310516521334648, 0.13731977343559265, 0.020060379058122635, -0.0026259031146764755, -0.07054490596055984, -0.07831345498561859, -0.02790641412138939, -0.0005623531760647893, 0.014229228720068932, 0.008602176792919636, 
-0.03597624972462654, -0.014173444360494614, 0.041734836995601654, 0.0015272750752046704, 0.02869318053126335, -0.14486531913280487, -0.09656534343957901, 0.12961407005786896, 0.2691439092159271, 0.04660680517554283, -0.07022947072982788, -0.003946978598833084, -0.005087982397526503, 0.07814574241638184, -0.07674667984247208, 0.060532137751579285, 0.061873603612184525, -0.045012105256319046, 0.11886026710271835, -0.009549214504659176, -0.11514180153608322, 0.0746263638138771, 0.06552286446094513, -0.00010112715244758874, -0.15762776136398315, 0.005469960626214743, 0.0686485543847084, -0.04992809519171715, 0.006529141217470169, 0.16426654160022736, -0.04201183468103409, -0.01713484525680542, -0.02451344020664692, 0.04469459876418114, -0.07464494556188583, 0.09496236592531204, -0.018851038068532944, 0.01028794702142477, -0.062236182391643524, 0.07000819593667984, 0.09785431623458862, -0.029395053163170815, 0.04725931957364082, 0.07941519469022751, -0.07971617579460144, -0.05930136889219284, -0.12885378301143646, -0.021046051755547523, -0.10385175049304962, -0.1072191521525383, 0.022594360634684563, -0.11359402537345886, -0.000016472491552121937, 0.10960297286510468, -0.023395732045173645, 0.04952488839626312, -0.05778851732611656, -0.02277272753417492, -0.028877580538392067, -0.010710364207625389, 0.017140435054898262, 0.015889804810285568, -0.029575273394584656, 0.14126290380954742, -0.004733090754598379, 0.03461262583732605, -0.019485898315906525, 0.0005917728994973004, -0.013518241234123707, 0.03557493910193443, -0.10059702396392822, 0.012630299665033817, -0.07767178118228912, -0.03286958485841751, 0.045368731021881104, -0.020204609259963036, 0.022685933858156204, 0.026446349918842316, -0.046370476484298706, 0.011449919082224369, -0.05303943529725075, 0.04190303757786751, -0.09159309417009354, 0.04240192845463753, 0.029394879937171936, -0.03205685317516327, 0.08764173835515976, 0.05534861609339714, -0.07239589095115662, 0.08050209283828735, -0.1071181371808052, -0.019373565912246704, -0.00412960909307003, 0.045146893709897995, -0.000763487711083144, -0.011699734255671501, 0.017024314031004906, 0.03186553344130516, -0.030294662341475487, -0.0380549393594265, 0.0432220883667469, -0.04529519006609917, 0.10579557716846466, 0.019095610827207565, 0.013855009339749813, -0.06132573261857033, 0.033096231520175934, 0.01085724588483572, -0.001760739367455244, 0.11274590343236923, -0.05308062955737114, 0.04897544905543327, -0.044809870421886444, 0.03165234252810478, 0.024701690301299095, 0.02228562720119953, -0.008657078258693218, -0.11919789761304855, 0.049880702048540115, -0.002960829995572567, 0.08417493849992752, -0.008041714318096638, -0.026473823934793472, 0.034262239933013916, -0.07488121092319489, -0.005052161868661642, 0.05540090054273605, 0.030758222565054893, 0.05352219194173813, -0.04184996336698532, -0.10156071931123734, -0.028896884992718697, -0.025499112904071808, 0.03194471076130867, 0.06577765196561813, 0.09919694811105728, 0.09520722925662994, 0.1043199896812439, 0.013456332497298717, 0.011550002731382847, -0.08887151628732681, 0.002232677536085248, -0.10185018926858902, -0.01656801998615265, -0.040211327373981476, 0.0629279837012291, 0.14251703023910522, -0.09149360656738281, 0.08285275101661682, 0.024740071967244148, -0.05806092917919159, -0.1255725622177124, -0.17739269137382507, -0.07264472544193268, -0.021138133481144905, -0.00231507932767272, -0.04876217246055603, 0.05519307032227516, 0.02613709308207035, 0.038474347442388535, 0.005858732853084803, 0.13397040963172913, 
-0.09760750085115433, -0.09985746443271637, 0.05851311609148979, -0.003324427641928196, 0.09249484539031982, 0.06707856059074402, 0.01781793124973774, 0.06288573145866394, 0.03897339850664139, 0.07916518300771713, 0.049478136003017426, 0.03722938895225525, 0.01919591799378395, 0.0027109442744404078, -0.019716795533895493, -0.019943706691265106, 0.04150906205177307, 0.013980322517454624, 0.11278536170721054, 0.05779029801487923, -0.04073120653629303, -0.01247820258140564, 0.09883469343185425, -0.0646313726902008, -0.0935925543308258, -0.16837945580482483, 0.20528826117515564, 0.05700968578457832, 0.06090955063700676, -0.009827648289501667, -0.07536489516496658, 0.006819142960011959, 0.1804446429014206, 0.13897809386253357, -0.01790584996342659, 0.011928517371416092, -0.020266592502593994, 0.025009868666529655, -0.01245800033211708, 0.06467724591493607, 0.01431494951248169, 0.2979470193386078, -0.02566072903573513, 0.047495003789663315, 0.02970067597925663, -0.035174574702978134, -0.03728843852877617, 0.08594004064798355, -0.13001267611980438, 0.005403287708759308, -0.005238212179392576, 0.07734358310699463, -0.09728880971670151, -0.1771310418844223, -0.012178231030702591, 0.033969245851039886, -0.029150689020752907, 0.052169881761074066, 0.06430429965257645, 0.07197637856006622, 0.013634806498885155, 0.02048511989414692, -0.0785268098115921, 0.18595942854881287, 0.011343508027493954, -0.050115518271923065, 0.03902637958526611, 0.0445718988776207, -0.10740271955728531, 0.11227963864803314, 0.03149130940437317, 0.10505973547697067, 0.03428654372692108, 0.05552120879292488, -0.047206465154886246, 0.08675383031368256, -0.00004734285903396085, -0.013699417933821678, 0.09693235903978348, 0.16325907409191132, -0.014548261649906635, 0.11665861308574677, 0.04447922110557556, -0.05827101320028305, 0.0903996080160141, 0.0012040409492328763, -0.07503959536552429, -0.04151288419961929, 0.05208329111337662, -0.09449294209480286, 0.12602496147155762, 0.15136189758777618, -0.010509866289794445, -0.00266831717453897, -0.02326669730246067, 0.008401853032410145, -0.03050512634217739, 0.059339821338653564, -0.042858097702264786, -0.08548101037740707, -0.00966548640280962, -0.0024583919439464808, 0.06394313275814056, -0.16794420778751373, -0.027443714439868927, -0.01098207663744688, -0.022941596806049347, -0.023617872968316078, 0.05563853308558464, 0.05775132402777672, 0.026563161984086037, -0.03201405704021454, -0.05802631005644798, 0.03622827306389809, 0.054511673748493195, -0.10045640170574188, -0.0624115876853466 ]
null
null
transformers
# S2T2-Wav2Vec2-CoVoST2-EN-CA-ST

`s2t-wav2vec2-large-en-ca` is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST). The S2T2 model was proposed in [Large-Scale Self- and Semi-Supervised Learning for Speech Translation](https://arxiv.org/pdf/2104.06678.pdf) and officially released in [Fairseq](https://github.com/pytorch/fairseq/blob/6f847c8654d56b4d1b1fbacec027f47419426ddb/fairseq/models/wav2vec/wav2vec2_asr.py#L266).

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained [Wav2Vec2](https://huggingface.co/transformers/model_doc/wav2vec2.html) as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Catalan text translation. See the [model hub](https://huggingface.co/models?filter=speech2text2) to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the transcripts by passing the speech features to the model.

You can use the model directly via the ASR pipeline

```python
from datasets import load_dataset
from transformers import pipeline

librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
asr = pipeline("automatic-speech-recognition", model="facebook/s2t-wav2vec2-large-en-ca", feature_extractor="facebook/s2t-wav2vec2-large-en-ca")

translation = asr(librispeech_en[0]["file"])
```

or step-by-step as follows:

```python
import torch
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
import soundfile as sf

model = SpeechEncoderDecoderModel.from_pretrained("facebook/s2t-wav2vec2-large-en-ca")
processor = Speech2Text2Processor.from_pretrained("facebook/s2t-wav2vec2-large-en-ca")

# read each audio file into a raw waveform array
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
# pass the features positionally so the call works across `generate` signatures
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```

## Evaluation results

CoVoST-V2 test results for en-ca (BLEU score): **34.1**

For more information, please have a look at the [official paper](https://arxiv.org/pdf/2104.06678.pdf) - especially row 10 of Table 2.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-06678,
  author    = {Changhan Wang and
               Anne Wu and
               Juan Miguel Pino and
               Alexei Baevski and
               Michael Auli and
               Alexis Conneau},
  title     = {Large-Scale Self- and Semi-Supervised Learning for Speech Translation},
  journal   = {CoRR},
  volume    = {abs/2104.06678},
  year      = {2021},
  url       = {https://arxiv.org/abs/2104.06678},
  archivePrefix = {arXiv},
  eprint    = {2104.06678},
  timestamp = {Thu, 12 Aug 2021 15:37:06 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2104-06678.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
{"language": ["en", "ca"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition", "speech2text2"], "datasets": ["covost2", "librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Common Voice 1", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Common Voice 2", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99989.mp3"}, {"example_title": "Common Voice 3", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_9999.mp3"}]}
automatic-speech-recognition
facebook/s2t-wav2vec2-large-en-ca
[ "transformers", "pytorch", "speech-encoder-decoder", "automatic-speech-recognition", "audio", "speech-translation", "speech2text2", "en", "ca", "dataset:covost2", "dataset:librispeech_asr", "arxiv:2104.06678", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.06678" ]
[ "en", "ca" ]
TAGS #transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ca #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #region-us
# S2T2-Wav2Vec2-CoVoST2-EN-CA-ST

's2t-wav2vec2-large-en-ca' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST). The S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in Fairseq.

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Catalan text translation. See the model hub to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the transcripts by passing the speech features to the model.

You can use the model directly via the ASR pipeline, as shown in the full card above, or step-by-step (see the sketch below).

## Evaluation results

CoVoST-V2 test results for en-ca (BLEU score): 34.1

For more information, please have a look at the official paper - especially row 10 of Table 2.

### BibTeX entry and citation info
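A minimal step-by-step sketch, mirroring the code in the full card above; note that the `input_features` key follows the original card and, depending on the installed feature-extractor version, the processor may expose `input_values` instead:

```python
# Sketch: step-by-step English speech -> Catalan text translation.
import soundfile as sf
from datasets import load_dataset
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel

model = SpeechEncoderDecoderModel.from_pretrained("facebook/s2t-wav2vec2-large-en-ca")
processor = Speech2Text2Processor.from_pretrained("facebook/s2t-wav2vec2-large-en-ca")

# Read each audio file into a raw waveform array.
def map_to_array(batch):
    batch["speech"], _ = sf.read(batch["file"])
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
# Key name follows the original card; some processor versions use "input_values".
generated_ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
```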
[ "# S2T2-Wav2Vec2-CoVoST2-EN-CA-ST\n\n's2t-wav2vec2-large-en-ca' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catalan text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-ca (BLEU score): 34.1\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ca #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #region-us \n", "# S2T2-Wav2Vec2-CoVoST2-EN-CA-ST\n\n's2t-wav2vec2-large-en-ca' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catalan text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-ca (BLEU score): 34.1\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ 89, 93, 107, 43, 67, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #ca #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #region-us \n# S2T2-Wav2Vec2-CoVoST2-EN-CA-ST\n\n's2t-wav2vec2-large-en-ca' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Catalan text translation.\nSee the model hub to look for other S2T2 checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:## Evaluation results\n\nCoVoST-V2 test results for en-ca (BLEU score): 34.1\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.### BibTeX entry and citation info" ]
[ -0.05824388563632965, 0.06472521275281906, -0.007312943693250418, 0.00573496101424098, 0.06135258451104164, -0.03667011857032776, 0.06281670182943344, 0.03937447816133499, -0.03200950846076012, 0.06948860734701157, 0.024142377078533173, 0.07766728103160858, 0.09517378360033035, 0.0635685846209526, 0.014349345117807388, -0.21353545784950256, 0.025520283728837967, -0.06670880317687988, 0.07163393497467041, 0.0391424223780632, 0.12774783372879028, -0.052526891231536865, 0.0933082327246666, 0.05007077008485794, 0.027289606630802155, 0.034772034734487534, 0.04579247906804085, -0.08602255582809448, 0.06699053943157196, 0.08421342074871063, 0.07458116114139557, 0.02015724591910839, 0.08802907168865204, -0.1560848504304886, -0.00231361691839993, 0.016447946429252625, 0.017381196841597557, 0.03672404959797859, 0.07538660615682602, -0.04404863342642784, 0.04185087978839874, -0.04015716537833214, 0.029548434540629387, 0.062434691935777664, -0.06514643132686615, -0.1269792914390564, -0.043105486780405045, 0.01200034562498331, 0.07447059452533722, 0.06118394434452057, -0.03642404079437256, -0.07840843498706818, -0.024124059826135635, 0.03143388405442238, 0.04668504744768143, -0.20057104527950287, 0.0216448325663805, -0.10532119870185852, 0.0019258706597611308, 0.043513376265764236, 0.008575499057769775, -0.012663104571402073, 0.0007904465892352164, 0.0022561966907233, 0.03075127862393856, -0.05409533157944679, -0.018226152285933495, -0.07238533347845078, -0.10727735608816147, -0.046945177018642426, 0.09538015723228455, -0.0458085834980011, -0.06966890394687653, -0.08625918626785278, -0.04530341923236847, 0.018720010295510292, -0.010745067149400711, -0.05206649377942085, 0.013134073466062546, 0.030466342344880104, 0.05224796384572983, -0.0939815416932106, -0.11258412897586823, -0.02295578084886074, -0.07928208261728287, 0.03663698583841324, 0.03909805417060852, 0.01968265511095524, -0.009725537151098251, 0.09715453535318375, -0.10046306997537613, -0.060363635420799255, -0.045392002910375595, -0.01968955248594284, -0.12318265438079834, 0.003029314801096916, -0.024169547483325005, -0.20762576162815094, -0.010737464763224125, 0.10252819955348969, -0.0335967019200325, 0.039008136838674545, -0.011143906973302364, 0.04056417569518089, 0.0016051320126280189, 0.18754200637340546, -0.06443360447883606, -0.05317806079983711, -0.00569197116419673, -0.04461619257926941, 0.008597727864980698, -0.005377859342843294, -0.10744216293096542, -0.04042544215917587, -0.02090907283127308, 0.04569397494196892, 0.03395329788327217, 0.003297733375802636, -0.02792932838201523, -0.028558339923620224, 0.13404764235019684, -0.12370461970567703, 0.05576184391975403, 0.05766129866242409, -0.021714037284255028, 0.11158697307109833, 0.04142620787024498, 0.014488509856164455, -0.11593658477067947, 0.06046558544039726, 0.01723374053835869, 0.031448353081941605, -0.06583409011363983, -0.07072598487138748, -0.002778364811092615, -0.028786659240722656, -0.03524243086576462, -0.11240991950035095, -0.11262546479701996, -0.057631973177194595, -0.0006134749273769557, -0.08166380971670151, 0.07080274820327759, -0.10359842330217361, 0.026158975437283516, 0.028285840526223183, -0.06473394483327866, -0.09401869773864746, 0.003042949829250574, -0.014826996251940727, -0.019941261038184166, 0.048448365181684494, -0.09522893279790878, 0.04570675268769264, -0.07011260092258453, 0.0029788613319396973, -0.1146465316414833, 0.2194780707359314, 0.00811498798429966, -0.04594222456216812, -0.12257076799869537, -0.05212538689374924, -0.06384016573429108, 
0.055752143263816833, 0.023029286414384842, 0.08686485141515732, -0.22102053463459015, -0.045646511018276215, 0.20679673552513123, -0.07197615504264832, 0.042410772293806076, 0.16056358814239502, -0.010458802804350853, 0.08076073229312897, 0.10212214291095734, 0.06646892428398132, 0.08431626856327057, -0.014414875768125057, -0.06225356459617615, 0.00014694969286210835, -0.054152216762304306, 0.08913183212280273, 0.06779086589813232, -0.10247620195150375, 0.08576487004756927, 0.015667453408241272, -0.04302925989031792, -0.03481138497591019, 0.04940294101834297, -0.0533180832862854, 0.01681051217019558, -0.00821726955473423, 0.06144660338759422, -0.04388664662837982, 0.029360909014940262, 0.032779157161712646, -0.07666723430156708, 0.07306230068206787, 0.010594341903924942, -0.06475257873535156, 0.08084016293287277, -0.14855743944644928, 0.0034179321955889463, -0.040664173662662506, 0.012287398800253868, -0.15774649381637573, 0.0627855658531189, 0.0016611674800515175, -0.024214670062065125, 0.13687460124492645, 0.05716999992728233, 0.04334728792309761, -0.001289058942347765, -0.009990011341869831, 0.018814584240317345, 0.09572409093379974, -0.01832326501607895, -0.013198754750192165, -0.05949121341109276, -0.013128401711583138, -0.031702641397714615, 0.07789663970470428, -0.09911136329174042, 0.0004467958933673799, 0.040257394313812256, 0.06906270235776901, 0.005231576040387154, -0.014925823546946049, -0.046031683683395386, 0.050642505288124084, -0.009271221235394478, -0.0023759882897138596, -0.011047879233956337, -0.027512306347489357, -0.08878699690103531, 0.1203952431678772, -0.18284663558006287, -0.12878161668777466, 0.05094876512885094, 0.001351938466541469, -0.06788281351327896, 0.01734951138496399, 0.04902896657586098, -0.01122526079416275, 0.0032703096512705088, -0.11508119106292725, 0.33074212074279785, 0.016381729394197464, 0.13942383229732513, -0.08326203376054764, -0.018425853922963142, -0.04102672263979912, -0.05704738199710846, 0.03172164037823677, 0.06359987705945969, -0.024842360988259315, -0.17328907549381256, 0.02702580951154232, 0.017425136640667915, -0.04039306193590164, 0.11232384294271469, 0.04529205709695816, -0.08814920485019684, 0.02014344185590744, 0.013249722309410572, 0.03818649426102638, -0.05676291882991791, 0.011336651630699635, -0.007070688996464014, 0.04010750353336334, 0.033460360020399094, 0.02591828629374504, -0.07495986670255661, 0.0957566425204277, 0.0632123351097107, -0.05314173176884651, -0.07399250566959381, -0.01748710311949253, 0.031172212213277817, 0.04365767911076546, 0.03690331056714058, -0.018870413303375244, 0.000976820127107203, -0.02908139117062092, -0.13313338160514832, 0.10513636469841003, -0.14431257545948029, -0.2610427737236023, -0.2356201857328415, -0.002661294536665082, 0.035047732293605804, 0.07423664629459381, 0.04053007438778877, -0.06876201927661896, -0.06205498054623604, -0.08708825707435608, 0.10086462646722794, -0.023233875632286072, -0.07321144640445709, -0.04666636511683464, 0.009901491925120354, 0.007494502700865269, -0.08655436336994171, 0.016584374010562897, -0.018442625179886818, -0.07239002734422684, -0.015978092327713966, 0.01902136206626892, -0.008831258863210678, 0.15495797991752625, 0.004487126599997282, -0.04678837209939957, -0.014880930073559284, 0.11698668450117111, -0.07838468253612518, 0.07444756478071213, 0.14076228439807892, -0.12752656638622284, 0.034880924969911575, 0.13614928722381592, 0.02547154575586319, -0.027037115767598152, -0.019074318930506706, -0.05367760732769966, -0.04303336143493652, 
-0.23490102589130402, -0.07778313010931015, -0.052078261971473694, -0.04995303601026535, 0.06444266438484192, 0.02079150266945362, 0.04621030017733574, 0.05059768632054329, -0.04048094153404236, 0.032272614538669586, 0.08377286046743393, 0.04713905602693558, 0.16053400933742523, -0.014813932590186596, 0.057418014854192734, -0.0688498467206955, -0.047140538692474365, 0.07246367633342743, 0.027044659480452538, 0.08915985375642776, 0.030298743396997452, 0.1810580939054489, 0.007646358571946621, 0.02711181528866291, 0.03692849352955818, 0.07820714265108109, -0.0363142229616642, 0.04579370841383934, -0.021041886880993843, -0.045287203043699265, -0.06489763408899307, 0.1158379390835762, 0.12060178071260452, -0.04969937354326248, 0.0393233560025692, 0.011923584155738354, 0.03098413534462452, 0.16780371963977814, 0.07066240906715393, -0.1882462501525879, -0.053436484187841415, 0.05498998612165451, -0.07450195401906967, -0.07103560864925385, 0.0012917305575683713, 0.10630233585834503, -0.11140187084674835, 0.06857917457818985, 0.021602924913167953, 0.06931629031896591, -0.12304191291332245, 0.002079307334497571, -0.03953346610069275, 0.10446138679981232, 0.06439387798309326, 0.03867291286587715, -0.17195111513137817, 0.05468854680657387, 0.023958206176757812, 0.0819774940609932, -0.03515542671084404, 0.07074759155511856, -0.02816629968583584, 0.037376560270786285, 0.12253968417644501, 0.005370925646275282, -0.13958534598350525, -0.01874564215540886, -0.0993383452296257, -0.003211274975910783, 0.10620676726102829, -0.052338480949401855, 0.04159863293170929, -0.0451086089015007, -0.04032792150974274, -0.038370806723833084, -0.10588103532791138, -0.0470840260386467, -0.1968827247619629, 0.05272572860121727, 0.06278974562883377, 0.1475411355495453, 0.012066108174622059, 0.024290813133120537, -0.06473463028669357, 0.16461531817913055, -0.2979241907596588, -0.08243373036384583, -0.0820036306977272, -0.10244794934988022, 0.17433495819568634, -0.07258889824151993, 0.07319537550210953, 0.045914653688669205, 0.0822228342294693, -0.009456353262066841, -0.06666627526283264, 0.06017328426241875, -0.08305160701274872, -0.12352646887302399, -0.023337705060839653, 0.12008212506771088, 0.03445197269320488, 0.07397028058767319, 0.05901860073208809, 0.04723234847187996, -0.024338040500879288, -0.057793717831373215, -0.026457762345671654, 0.1906575709581375, -0.0571797639131546, 0.06643430143594742, -0.0421958826482296, -0.17818261682987213, -0.0788755714893341, -0.005040065385401249, 0.16802549362182617, 0.03822750225663185, -0.08054614812135696, 0.14200225472450256, 0.1276269257068634, -0.11233080923557281, -0.1897781789302826, -0.08091218769550323, 0.0635920837521553, 0.03364010900259018, -0.015018119476735592, -0.19862772524356842, -0.015513788908720016, 0.009362606331706047, -0.01318027451634407, -0.025159616023302078, -0.19777169823646545, -0.1441626250743866, 0.10410110652446747, -0.09866179525852203, 0.05583076924085617, -0.006356260273605585, -0.10105641186237335, -0.09741205722093582, -0.022615743800997734, 0.0334308035671711, 0.030598055571317673, 0.0535971075296402, 0.05558566004037857, 0.06774615496397018, 0.041141025722026825, -0.0007513299351558089, 0.08035093545913696, 0.02413240633904934, -0.0478535033762455, -0.03309986740350723, 0.08682218194007874, -0.048751115798950195, -0.012919680215418339, 0.1555456519126892, 0.03749793395400047, -0.01328642200678587, -0.03160378336906433, -0.06424407660961151, -0.034282855689525604, 0.016494065523147583, 0.019097620621323586, 0.020497111603617668, 
-0.0475609116256237, -0.020372068509459496, 0.04337198659777641, 0.0053690397180616856, 0.02370688132941723, -0.14887763559818268, -0.09882669895887375, 0.14247289299964905, 0.2743162512779236, 0.05006592720746994, -0.036029595881700516, 0.006772714201360941, -0.009751654230058193, 0.06000587344169617, -0.007465688977390528, 0.07302692532539368, 0.06677649170160294, -0.03753351792693138, 0.09417285770177841, 0.0035006387624889612, -0.10891028493642807, 0.08492223918437958, 0.07003842294216156, 0.011236719787120819, -0.18069684505462646, 0.006880050990730524, 0.04259049519896507, -0.03375979885458946, 0.02558799274265766, 0.16683366894721985, -0.028542423620820045, -0.030936170369386673, -0.02630019746720791, 0.04993894323706627, -0.07193901389837265, 0.09394603222608566, -0.014230072498321533, 0.003936755936592817, -0.07607292383909225, 0.04564953222870827, 0.09711754322052002, -0.020998617634177208, 0.04470309615135193, 0.07097980380058289, -0.05389394983649254, -0.047610778361558914, -0.1494850218296051, -0.07506511360406876, -0.1196429654955864, -0.11342606693506241, -0.007705185562372208, -0.107529416680336, 0.006685317028313875, 0.09514245390892029, -0.03191796690225601, 0.04242398962378502, -0.04656142741441727, -0.03192565217614174, -0.03489487245678902, -0.00943959504365921, 0.008728994987905025, 0.02886187843978405, -0.011334656737744808, 0.11374363303184509, -0.023441344499588013, 0.010688130743801594, -0.0261569544672966, 0.0010145973647013307, -0.014657776802778244, 0.027648955583572388, -0.11470535397529602, 0.0011193249374628067, -0.08606140315532684, -0.03823546692728996, 0.04408055916428566, -0.021845847368240356, 0.024280760437250137, 0.03240878880023956, -0.05105863884091377, 0.012324189767241478, -0.05753609165549278, 0.05062470585107803, -0.10176856070756912, 0.05031345784664154, 0.03383227065205574, -0.0334370918571949, 0.09603162109851837, 0.03139232471585274, -0.09642043709754944, 0.0738712027668953, -0.13966085016727448, -0.04270347207784653, 0.016968876123428345, 0.071578249335289, 0.017654836177825928, -0.007127690128982067, 0.0010689981281757355, 0.05788598954677582, -0.023343494161963463, -0.03297204151749611, 0.057003237307071686, -0.049242183566093445, 0.12279566377401352, -0.009444121271371841, -0.022937117144465446, -0.04198738560080528, 0.02860105037689209, 0.010893038474023342, -0.013738736510276794, 0.1096934825181961, -0.07277072966098785, 0.05476560816168785, -0.02721007913351059, 0.026790695264935493, 0.01905541680753231, 0.026868265122175217, -0.03409462049603462, -0.09257704764604568, 0.045871295034885406, 0.018744545057415962, 0.10963734984397888, -0.02598837949335575, 0.0003579891344998032, 0.029485439881682396, -0.0798848420381546, -0.03121797926723957, 0.053651947528123856, 0.03157190978527069, 0.06305529177188873, -0.03769220784306526, -0.13513785600662231, -0.00006451174704125151, -0.025331448763608932, 0.010358807630836964, 0.05850907787680626, 0.08581233769655228, 0.1003413274884224, 0.1031511053442955, 0.050445616245269775, 0.006094885524362326, -0.09045280516147614, 0.03000783361494541, -0.11051298677921295, -0.008580880239605904, -0.032667990773916245, 0.05944424867630005, 0.11748264729976654, -0.0815124362707138, 0.08912808448076248, 0.02113054133951664, -0.05465163663029671, -0.13700278103351593, -0.14172416925430298, -0.07391351461410522, -0.04081074893474579, -0.008853569626808167, -0.053078390657901764, 0.06809964776039124, -0.000026404857635498047, 0.04459116980433464, -0.010007472708821297, 0.11751267313957214, -0.10291264951229095, 
-0.09023647755384445, 0.05621951073408127, -0.008413471281528473, 0.12123812735080719, 0.055782608687877655, 0.018578672781586647, 0.062490690499544144, 0.07061817497015, 0.09444382786750793, 0.07326590269804001, 0.021477924659848213, 0.0005439951783046126, 0.004002464935183525, -0.027132397517561913, -0.015042888931930065, 0.042106155306100845, 0.016584962606430054, 0.11812304705381393, 0.07038485258817673, -0.05796762555837631, -0.006563553586602211, 0.0778704285621643, -0.07100217789411545, -0.07062433660030365, -0.15971337258815765, 0.2023138552904129, 0.058910269290208817, 0.0891614556312561, -0.006259439513087273, -0.09501858055591583, -0.0005878357333131135, 0.16202542185783386, 0.14381150901317596, -0.01116890273988247, 0.010961043648421764, -0.02395216003060341, 0.026430824771523476, -0.012633935548365116, 0.049511346966028214, 0.0015157611342146993, 0.3400965631008148, -0.007918525487184525, 0.03422572463750839, 0.03050614707171917, -0.017858654260635376, -0.05302570015192032, 0.08000492304563522, -0.14299806952476501, 0.0036419383250176907, -0.008366171270608902, 0.09864463657140732, -0.1019885316491127, -0.18928498029708862, -0.011313267983496189, 0.012251612730324268, -0.03766854479908943, 0.04348098486661911, 0.068348228931427, 0.06116902828216553, 0.03350946307182312, 0.020937927067279816, -0.07695242017507553, 0.17152521014213562, 0.014284833334386349, -0.05420256778597832, 0.07329850643873215, 0.032767441123723984, -0.13144759833812714, 0.11518342792987823, 0.029243122786283493, 0.09287994354963303, 0.04490743204951286, 0.06093711405992508, -0.0598481148481369, 0.058037955313920975, -0.010027781128883362, -0.025554729625582695, 0.11897990852594376, 0.13732963800430298, -0.0008284181240014732, 0.12724626064300537, 0.05423193424940109, -0.08223550766706467, 0.09570857137441635, 0.024345839396119118, -0.05925016850233078, -0.06282948702573776, 0.039860665798187256, -0.08978431671857834, 0.10272640734910965, 0.12940222024917603, -0.011883813887834549, 0.001865922473371029, -0.028103679418563843, 0.013894869014620781, -0.041568756103515625, 0.07443923503160477, -0.04719381406903267, -0.11513306200504303, -0.0057730842381715775, -0.0023515645880252123, 0.06682313978672028, -0.1838439553976059, -0.015625720843672752, -0.0065852883271873, -0.02325778640806675, -0.02844076231122017, 0.03875360265374184, 0.060247939079999924, 0.019392436370253563, -0.015317928977310658, -0.052613645792007446, 0.039717577397823334, 0.04759600758552551, -0.09122905880212784, -0.04178004711866379 ]
null
null
transformers
# S2T2-Wav2Vec2-CoVoST2-EN-DE-ST

`s2t-wav2vec2-large-en-de` is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).
The S2T2 model was proposed in [Large-Scale Self- and Semi-Supervised Learning for Speech Translation](https://arxiv.org/pdf/2104.06678.pdf) and officially released in
[Fairseq](https://github.com/pytorch/fairseq/blob/6f847c8654d56b4d1b1fbacec027f47419426ddb/fairseq/models/wav2vec/wav2vec2_asr.py#L266).

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained [Wav2Vec2](https://huggingface.co/transformers/model_doc/wav2vec2.html) as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to German text translation.
See the [model hub](https://huggingface.co/models?filter=speech2text2) to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the translations by passing the speech features to the model.

You can use the model directly via the ASR pipeline

```python
from datasets import load_dataset
from transformers import pipeline

librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
asr = pipeline("automatic-speech-recognition", model="facebook/s2t-wav2vec2-large-en-de", feature_extractor="facebook/s2t-wav2vec2-large-en-de")

translation_de = asr(librispeech_en[0]["file"])
```

or step-by-step as follows:

```python
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
import soundfile as sf

model = SpeechEncoderDecoderModel.from_pretrained("facebook/s2t-wav2vec2-large-en-de")
processor = Speech2Text2Processor.from_pretrained("facebook/s2t-wav2vec2-large-en-de")

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"])

translation_de = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Evaluation results

CoVoST-V2 test results for en-de (BLEU score): **26.5**

For more information, please have a look at the [official paper](https://arxiv.org/pdf/2104.06678.pdf) - especially row 10 of Table 2.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-06678,
  author        = {Changhan Wang and Anne Wu and Juan Miguel Pino and Alexei Baevski and Michael Auli and Alexis Conneau},
  title         = {Large-Scale Self- and Semi-Supervised Learning for Speech Translation},
  journal       = {CoRR},
  volume        = {abs/2104.06678},
  year          = {2021},
  url           = {https://arxiv.org/abs/2104.06678},
  archivePrefix = {arXiv},
  eprint        = {2104.06678},
  timestamp     = {Thu, 12 Aug 2021 15:37:06 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-2104-06678.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
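The step-by-step snippet above translates a single utterance; several utterances can also be processed in one batch. Below is a minimal sketch that reuses `processor`, `model`, and `ds` from that example. It assumes the underlying feature extractor supports `padding=True`, and it unpacks the processor output with `**batch` so the exact input key name (`input_values` vs. `input_features`) does not need to be hardcoded.

```python
# Minimal sketch: batched translation, reusing `processor`, `model`, and `ds`
# from the step-by-step example above. `padding=True` pads the utterances in
# the batch to a common length (assumed to be supported by the extractor).
batch = processor(ds["speech"][:4], sampling_rate=16_000, padding=True, return_tensors="pt")

# `**batch` forwards the padded features and attention mask to generate()
# without committing to a specific input key name.
generated_ids = model.generate(**batch)
translations_de = processor.batch_decode(generated_ids, skip_special_tokens=True)
print(translations_de)
```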
{"language": ["en", "de"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition", "speech2text2"], "datasets": ["covost2", "librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Common Voice 1", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_18301577.mp3"}, {"example_title": "Common Voice 2", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99985.mp3"}, {"example_title": "Common Voice 3", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99986.mp3"}]}
automatic-speech-recognition
facebook/s2t-wav2vec2-large-en-de
[ "transformers", "pytorch", "speech-encoder-decoder", "automatic-speech-recognition", "audio", "speech-translation", "speech2text2", "en", "de", "dataset:covost2", "dataset:librispeech_asr", "arxiv:2104.06678", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.06678" ]
[ "en", "de" ]
TAGS #transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #de #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us
# S2T2-Wav2Vec2-CoVoST2-EN-DE-ST

's2t-wav2vec2-large-en-de' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).
The S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in
Fairseq.

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to German text translation.
See the model hub to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
translations by passing the speech features to the model.

You can use the model directly via the ASR pipeline

or step-by-step as follows:

## Evaluation results

CoVoST-V2 test results for en-de (BLEU score): 26.5

For more information, please have a look at the official paper - especially row 10 of Table 2.

### BibTeX entry and citation info
[ "# S2T2-Wav2Vec2-CoVoST2-EN-DE-ST\n\n's2t-wav2vec2-large-en-de' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-de (BLEU score): 26.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #de #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n", "# S2T2-Wav2Vec2-CoVoST2-EN-DE-ST\n\n's2t-wav2vec2-large-en-de' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-de (BLEU score): 26.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ 93, 93, 107, 43, 67, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #de #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n# S2T2-Wav2Vec2-CoVoST2-EN-DE-ST\n\n's2t-wav2vec2-large-en-de' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to German text translation.\nSee the model hub to look for other S2T2 checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:## Evaluation results\n\nCoVoST-V2 test results for en-de (BLEU score): 26.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.### BibTeX entry and citation info" ]
[ -0.05945241078734398, 0.07310990989208221, -0.0067213852889835835, -0.0027213876601308584, 0.06362293660640717, -0.01801951602101326, 0.0942743718624115, 0.03269558027386665, -0.03325382620096207, 0.05882856622338295, 0.025117823854088783, 0.03989866003394127, 0.08250641077756882, 0.0958070158958435, 0.02723221480846405, -0.24805955588817596, 0.03370782360434532, -0.04340212419629097, 0.08301214128732681, 0.046191416680812836, 0.1317020058631897, -0.05923193693161011, 0.08674849569797516, 0.047443687915802, -0.005343404598534107, 0.034184180200099945, 0.0448436476290226, -0.06035172939300537, 0.05539526790380478, 0.07075697183609009, 0.10216571390628815, 0.00791106652468443, 0.09917107224464417, -0.15609847009181976, -0.0032696276903152466, 0.04072151705622673, 0.012026730924844742, 0.033490393310785294, 0.05568668618798256, -0.04800962656736374, 0.08815596252679825, -0.029492830857634544, 0.046160560101270676, 0.049718376249074936, -0.0600932240486145, -0.1005815863609314, -0.042458441108465195, 0.06120060384273529, 0.09321847558021545, 0.08961199223995209, -0.03646710887551308, -0.07775259763002396, -0.018417105078697205, 0.0447935052216053, 0.03183170035481453, -0.20699790120124817, -0.0013237239327281713, -0.06636995077133179, -0.0022167370188981295, 0.015342826023697853, -0.023438267409801483, -0.024262161925435066, -0.01290093269199133, 0.013885272666811943, 0.061372388154268265, -0.054843537509441376, 0.023668812587857246, -0.07668017596006393, -0.12084199488162994, -0.03438669815659523, 0.0912955179810524, -0.036453817039728165, -0.08110594749450684, -0.0848863497376442, -0.0469161756336689, 0.012081054039299488, -0.0044458527117967606, -0.048688855022192, 0.010114389471709728, 0.030993599444627762, 0.041733261197805405, -0.10146931558847427, -0.11516235768795013, -0.031154124066233635, -0.06816490739583969, 0.07151614129543304, 0.03311365470290184, 0.013090556487441063, -0.026971234008669853, 0.10373745858669281, -0.12009520828723907, -0.07129313051700592, -0.052074018865823746, -0.03227625787258148, -0.1311144083738327, -0.007700901012867689, -0.03727995604276657, -0.21791182458400726, -0.05545548349618912, 0.1423526406288147, -0.0018428839975968003, 0.05243247002363205, 0.00233132834546268, 0.0392131507396698, -0.0016642401460558176, 0.21047300100326538, -0.037351541221141815, -0.058623168617486954, -0.0015515777049586177, -0.04229147359728813, 0.004110811743885279, 0.003121831687167287, -0.09793432056903839, -0.03795742616057396, -0.019457757472991943, 0.01956825517117977, -0.005208294838666916, 0.023151777684688568, -0.009373046457767487, -0.020017990842461586, 0.11853035539388657, -0.13081374764442444, 0.043518513441085815, 0.043430060148239136, -0.02139884978532791, 0.09874999523162842, 0.05794437974691391, 0.012507393024861813, -0.10916878283023834, 0.09719221293926239, 0.014698483049869537, 0.03351610526442528, -0.09635366499423981, -0.08135271072387695, -0.005964254029095173, -0.049432117491960526, -0.02333661913871765, -0.11923528462648392, -0.14912821352481842, -0.06599966436624527, 0.012743871659040451, -0.04821955785155296, 0.05122946575284004, -0.10180028527975082, 0.012288396246731281, 0.03456839546561241, -0.04737551137804985, -0.07851948589086533, 0.012610115110874176, -0.03234196454286575, -0.04563666507601738, 0.02053380385041237, -0.06390216201543808, 0.041300538927316666, -0.07077088952064514, -0.006380127742886543, -0.14850924909114838, 0.19500400125980377, -0.004401833284646273, -0.04255940392613411, -0.10904356837272644, -0.05391473323106766, 
-0.05357137322425842, 0.0529576912522316, 0.02150634303689003, 0.08601044118404388, -0.21292521059513092, -0.030572935938835144, 0.2070852518081665, -0.10381217300891876, 0.05813772976398468, 0.1338241845369339, -0.0261320099234581, 0.0699416920542717, 0.10322843492031097, 0.06763651967048645, 0.1263369619846344, -0.04300103336572647, -0.05482024699449539, 0.011492196470499039, -0.05859772115945816, 0.1077040582895279, 0.05890065059065819, -0.0758061334490776, 0.06404316425323486, 0.003298601135611534, -0.05746639892458916, -0.028363430872559547, 0.04385773465037346, -0.04400241747498512, 0.017481081187725067, -0.014343458227813244, 0.05352993309497833, -0.042470090091228485, 0.023742910474538803, 0.030235596001148224, -0.10022742301225662, 0.08876282721757889, 0.03723356872797012, -0.07447148859500885, 0.06495721638202667, -0.1441534459590912, -0.040194395929574966, -0.014747723005712032, 0.0025556881446391344, -0.1626984030008316, 0.0309964157640934, 0.0014094730140641332, -0.029521744698286057, 0.14750222861766815, 0.06605048477649689, 0.05557385832071304, -0.005409226752817631, -0.02129264734685421, -0.016486523672938347, 0.05250370129942894, -0.019485194236040115, -0.029475009068846703, -0.06027767434716225, -0.027741311118006706, -0.044022753834724426, 0.06508361548185349, -0.09791860729455948, 0.009372112341225147, 0.020590102300047874, 0.08227702230215073, 0.014574882574379444, -0.017576448619365692, -0.034464769065380096, 0.040983039885759354, -0.0020334364380687475, -0.014408666640520096, -0.029500771313905716, -0.0008650450618006289, -0.09163995087146759, 0.11844337731599808, -0.16474168002605438, -0.11850690841674805, 0.037143658846616745, 0.0028754561208188534, -0.08861927688121796, 0.018461840227246284, 0.036676373332738876, -0.012346887029707432, -0.011659468524158001, -0.11222147196531296, 0.35215526819229126, 0.01157324854284525, 0.11856209486722946, -0.06281493604183197, -0.02981468290090561, -0.04351654648780823, -0.04615241289138794, 0.026476096361875534, 0.04526713490486145, -0.020439723506569862, -0.17570506036281586, 0.029716435819864273, 0.019768163561820984, -0.06712169200181961, 0.11615157127380371, 0.03212972357869148, -0.07337887585163116, 0.013927621766924858, -0.014302441850304604, 0.012976524420082569, -0.036492351442575455, 0.005246298853307962, -0.012528638355433941, 0.02989768423140049, 0.038832589983940125, 0.02629636600613594, -0.06966648995876312, 0.09533564746379852, 0.06391549110412598, -0.04251028597354889, -0.06483210623264313, -0.01788484677672386, 0.012113790959119797, 0.04236665368080139, 0.018464546650648117, -0.019959427416324615, 0.002595979953184724, -0.02908039279282093, -0.1376246064901352, 0.1298314929008484, -0.13612695038318634, -0.23152978718280792, -0.22126196324825287, 0.004665900953114033, 0.03879609704017639, 0.06357226520776749, 0.025199655443429947, -0.05136687681078911, -0.07356515526771545, -0.09000016748905182, 0.10298310220241547, -0.0442107617855072, -0.057605791836977005, -0.05445307865738869, -0.005023030564188957, 0.022387269884347916, -0.08360560983419418, 0.008376149460673332, -0.01825626753270626, -0.0715840756893158, -0.010789988562464714, 0.020451700314879417, -0.022578099742531776, 0.1537637859582901, -0.0019565466791391373, -0.03751849755644798, -0.030443942174315453, 0.12735514342784882, -0.08253305405378342, 0.1016981452703476, 0.14223924279212952, -0.13004249334335327, 0.039404675364494324, 0.12838079035282135, 0.01305642444640398, -0.020712923258543015, -0.005339803174138069, -0.03689570724964142, 
-0.04041726142168045, -0.23290689289569855, -0.06800949573516846, -0.04394664615392685, -0.05384642630815506, 0.06735769659280777, 0.0111313471570611, 0.0326431542634964, 0.05555378273129463, -0.05073966085910797, 0.039895541965961456, 0.08411288261413574, 0.05533652752637863, 0.11529011279344559, -0.006291886791586876, 0.06483950465917587, -0.057310547679662704, -0.026036009192466736, 0.05296248942613602, 0.0016246887389570475, 0.10193292051553726, 0.020598825067281723, 0.12800095975399017, 0.011968327686190605, 0.0010018907487392426, 0.04472735896706581, 0.08324804157018661, -0.042054884135723114, 0.04008951410651207, -0.013636019080877304, -0.04294195398688316, -0.05772124603390694, 0.12420439720153809, 0.06844174861907959, -0.04816943034529686, 0.019900165498256683, 0.03988002985715866, 0.02126123569905758, 0.172650545835495, 0.07332214713096619, -0.1882593035697937, -0.06881880760192871, 0.048782892525196075, -0.07955491542816162, -0.07339189946651459, 0.0063936454243958, 0.10557878017425537, -0.11322397738695145, 0.08154963701963425, 0.012068655341863632, 0.07328691333532333, -0.11525671929121017, -0.0024093985557556152, -0.028096169233322144, 0.1183638870716095, 0.05108465626835823, 0.047229211777448654, -0.19986994564533234, 0.05403758957982063, 0.012654793448746204, 0.1039198562502861, -0.02813524380326271, 0.06634591519832611, -0.017032109200954437, 0.06844466924667358, 0.12026571482419968, 0.011300642043352127, -0.180620014667511, 0.007950318977236748, -0.08390103280544281, -0.004752594046294689, 0.10903562605381012, -0.055247530341148376, 0.05169255658984184, -0.032280925661325455, -0.03617505356669426, -0.027332941070199013, -0.09742812812328339, -0.07365085184574127, -0.16990025341510773, 0.03714520111680031, 0.043551262468099594, 0.1412942111492157, -0.0031898394227027893, 0.03297175094485283, -0.03916359320282936, 0.15998230874538422, -0.28409871459007263, -0.0757768526673317, -0.0946928933262825, -0.06637450307607651, 0.14371232688426971, -0.09269022941589355, 0.07481642067432404, 0.04853079095482826, 0.11155791580677032, -0.011495701968669891, -0.08806993812322617, 0.046905893832445145, -0.09265250712633133, -0.10907154530286789, -0.03457073122262955, 0.11935944110155106, 0.05436244234442711, 0.07421193271875381, 0.05831163749098778, 0.03769553080201149, -0.0215647853910923, -0.06773088872432709, -0.01914963871240616, 0.19931560754776, -0.021093202754855156, 0.057167671620845795, -0.05530690401792526, -0.15310782194137573, -0.0729113444685936, 0.013279811479151249, 0.1492718905210495, 0.005259695928543806, -0.08078514784574509, 0.11694126576185226, 0.10011040419340134, -0.12036667764186859, -0.20560838282108307, -0.05507458746433258, 0.08724356442689896, 0.058860983699560165, -0.03745613992214203, -0.22385287284851074, 0.015858113765716553, 0.000006842922630312387, -0.006966997403651476, -0.03876980021595955, -0.16996265947818756, -0.14090615510940552, 0.1284279078245163, -0.09304846823215485, 0.023427188396453857, 0.005037135444581509, -0.073642298579216, -0.08812277764081955, -0.03904160112142563, 0.029644301161170006, -0.012429758906364441, 0.04406251013278961, 0.06897162646055222, 0.08387638628482819, 0.0378691740334034, -0.01315190177410841, 0.05588711053133011, 0.056796152144670486, -0.013511689379811287, -0.04084751754999161, 0.06709080189466476, -0.05518299713730812, -0.02513977698981762, 0.16362783312797546, 0.025793394073843956, 0.005868697538971901, -0.04790695011615753, -0.06904713064432144, -0.031205255538225174, 0.02525397017598152, 0.011005409993231297, 
-0.0015820441767573357, -0.045100968331098557, 0.0022167495917528868, 0.03451700508594513, 0.0035571418702602386, 0.027353109791874886, -0.14913129806518555, -0.1075763925909996, 0.11954030394554138, 0.2722143828868866, 0.06565837562084198, -0.06634727865457535, 0.0035088134463876486, -0.013711636886000633, 0.07704731822013855, -0.04231555014848709, 0.07167350500822067, 0.07197877019643784, -0.03274939954280853, 0.11926261335611343, -0.00035852903965860605, -0.11423569172620773, 0.06635377556085587, 0.05502694100141525, -0.020039789378643036, -0.15400950610637665, -0.0054733604192733765, 0.052180029451847076, -0.04231125861406326, 0.03866622596979141, 0.17694243788719177, -0.05135773494839668, -0.01581565849483013, -0.02207002602517605, 0.04340844601392746, -0.07522305846214294, 0.08746461570262909, -0.017385030165314674, 0.013304539956152439, -0.05912309139966965, 0.0576443150639534, 0.08001887053251266, -0.03173559904098511, 0.050403255969285965, 0.0653034895658493, -0.0697970911860466, -0.056839220225811005, -0.1288677453994751, -0.023911127820611, -0.10979779064655304, -0.09706911444664001, 0.010614834725856781, -0.1186278834939003, 0.011131984181702137, 0.11168193072080612, -0.018291065469384193, 0.04492831230163574, -0.05934567376971245, -0.015338304452598095, -0.045604243874549866, -0.011850163340568542, 0.021315421909093857, 0.03535860404372215, -0.03325783833861351, 0.14928889274597168, -0.01421891339123249, 0.016645485535264015, -0.025243287906050682, 0.0025527318939566612, -0.021734090521931648, 0.02252868562936783, -0.0975794717669487, 0.002540776040405035, -0.07401875406503677, -0.03689887002110481, 0.04286340996623039, -0.023368706926703453, 0.025742575526237488, 0.041923169046640396, -0.05094552040100098, 0.011730413883924484, -0.05412328988313675, 0.03112405724823475, -0.0912269875407219, 0.04236016422510147, 0.027190757915377617, -0.04220723360776901, 0.09077047556638718, 0.059638507664203644, -0.06770157814025879, 0.07248358428478241, -0.10299940407276154, -0.022378843277692795, 0.011000150814652443, 0.051974814385175705, 0.006287505384534597, -0.01137184165418148, 0.005705463234335184, 0.034175824373960495, -0.02728397026658058, -0.034861113876104355, 0.03406647965312004, -0.04930468276143074, 0.12178876996040344, 0.033506304025650024, -0.0053920019418001175, -0.06234376132488251, 0.03153452277183533, 0.014516645111143589, -0.0028568110428750515, 0.12286145985126495, -0.06322412937879562, 0.06537891924381256, -0.04545490816235542, 0.03523024171590805, 0.029623614624142647, 0.024898339062929153, -0.01994514651596546, -0.10006512701511383, 0.0478806346654892, -0.0014999012928456068, 0.09220466017723083, -0.012817059643566608, -0.020072219893336296, 0.03356907144188881, -0.07902603596448898, -0.006376715376973152, 0.06585218757390976, 0.03223537281155586, 0.04259435459971428, -0.04563586041331291, -0.11865580826997757, -0.024502262473106384, -0.025085238739848137, 0.027304284274578094, 0.09339430928230286, 0.09031371772289276, 0.09178293496370316, 0.10237565636634827, 0.04301700368523598, 0.020098350942134857, -0.11811386048793793, 0.004279026295989752, -0.10059385746717453, 0.0035883651580661535, -0.03194301202893257, 0.08226832002401352, 0.11733024567365646, -0.09785982966423035, 0.09209072589874268, 0.029165152460336685, -0.05611716955900192, -0.12532581388950348, -0.17449131608009338, -0.07376011461019516, -0.02848604880273342, -0.0007344285841099918, -0.06350672245025635, 0.052247025072574615, -0.005280698649585247, 0.041123516857624054, -0.003563281148672104, 
0.13350091874599457, -0.11163751035928726, -0.08593346178531647, 0.07665017992258072, 0.000748247024603188, 0.0848231390118599, 0.06795217096805573, 0.006805631332099438, 0.054071150720119476, 0.051719438284635544, 0.07912028580904007, 0.043512012809515, 0.037650592625141144, -0.003218777012079954, 0.012887399643659592, -0.019433939829468727, -0.016154643148183823, 0.03545008972287178, 0.015304877422749996, 0.1017102301120758, 0.057726532220840454, -0.048668380826711655, -0.012813475914299488, 0.08997464179992676, -0.0795411616563797, -0.0947568416595459, -0.17196124792099, 0.21712982654571533, 0.06599535793066025, 0.07811147719621658, -0.005279050208628178, -0.08433471620082855, 0.012056313455104828, 0.16406480967998505, 0.14742524921894073, -0.005399401765316725, 0.01204332709312439, -0.01895422302186489, 0.02047298476099968, -0.0027246708050370216, 0.06382530182600021, 0.0018877482507377863, 0.31792131066322327, -0.009849674068391323, 0.043894004076719284, 0.022761031985282898, -0.035081055015325546, -0.027873381972312927, 0.08444417268037796, -0.13558951020240784, -0.009021748788654804, -0.008201015181839466, 0.08492204546928406, -0.09062261134386063, -0.17430083453655243, -0.019442403689026833, 0.022883860394358635, -0.03875705972313881, 0.04724244400858879, 0.05762208253145218, 0.060660313814878464, 0.03959524631500244, 0.01602241024374962, -0.08021383732557297, 0.20445112884044647, 0.0053378622978925705, -0.04252425581216812, 0.0458412766456604, 0.03297366201877594, -0.10910841822624207, 0.11526378244161606, 0.027723077684640884, 0.10069838166236877, 0.03934825584292412, 0.05845237523317337, -0.05472518503665924, 0.0662502571940422, -0.010306332260370255, -0.04323115572333336, 0.09506569057703018, 0.16651421785354614, -0.025192568078637123, 0.13574430346488953, 0.04261445626616478, -0.09021810442209244, 0.08475418388843536, 0.012043021619319916, -0.0830828994512558, -0.05022330582141876, 0.0570179745554924, -0.09165201336145401, 0.12689171731472015, 0.15066556632518768, -0.00882813986390829, -0.011729314923286438, -0.023303283378481865, 0.013523025438189507, -0.027428874745965004, 0.0765659287571907, -0.03629263490438461, -0.09479732066392899, -0.012559524737298489, 0.010477472096681595, 0.060735173523426056, -0.18528443574905396, -0.029338670894503593, -0.003349670907482505, -0.015890363603830338, -0.025939086452126503, 0.05211314186453819, 0.05028589442372322, 0.022258365526795387, -0.025687305256724358, -0.03859071061015129, 0.039696041494607925, 0.057480014860630035, -0.09518752247095108, -0.0604669526219368 ]
null
null
transformers
# S2T2-Wav2Vec2-CoVoST2-EN-TR-ST

`s2t-wav2vec2-large-en-tr` is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).
The S2T2 model was proposed in [Large-Scale Self- and Semi-Supervised Learning for Speech Translation](https://arxiv.org/pdf/2104.06678.pdf) and officially released in
[Fairseq](https://github.com/pytorch/fairseq/blob/6f847c8654d56b4d1b1fbacec027f47419426ddb/fairseq/models/wav2vec/wav2vec2_asr.py#L266).

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech Translation (ST). It uses a pretrained [Wav2Vec2](https://huggingface.co/transformers/model_doc/wav2vec2.html) as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Turkish text translation.
See the [model hub](https://huggingface.co/models?filter=speech2text2) to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the `generate` method to generate the translations by passing the speech features to the model.

You can use the model directly via the ASR pipeline

```python
from datasets import load_dataset
from transformers import pipeline

librispeech_en = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
asr = pipeline("automatic-speech-recognition", model="facebook/s2t-wav2vec2-large-en-tr", feature_extractor="facebook/s2t-wav2vec2-large-en-tr")

translation = asr(librispeech_en[0]["file"])
```

or step-by-step as follows:

```python
from transformers import Speech2Text2Processor, SpeechEncoderDecoderModel
from datasets import load_dataset
import soundfile as sf

model = SpeechEncoderDecoderModel.from_pretrained("facebook/s2t-wav2vec2-large-en-tr")
processor = Speech2Text2Processor.from_pretrained("facebook/s2t-wav2vec2-large-en-tr")

def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch

ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)

inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"])

translation = processor.batch_decode(generated_ids, skip_special_tokens=True)
```

## Evaluation results

CoVoST-V2 test results for en-tr (BLEU score): **17.5**

For more information, please have a look at the [official paper](https://arxiv.org/pdf/2104.06678.pdf) - especially row 10 of Table 2.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2104-06678,
  author        = {Changhan Wang and Anne Wu and Juan Miguel Pino and Alexei Baevski and Michael Auli and Alexis Conneau},
  title         = {Large-Scale Self- and Semi-Supervised Learning for Speech Translation},
  journal       = {CoRR},
  volume        = {abs/2104.06678},
  year          = {2021},
  url           = {https://arxiv.org/abs/2104.06678},
  archivePrefix = {arXiv},
  eprint        = {2104.06678},
  timestamp     = {Thu, 12 Aug 2021 15:37:06 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-2104-06678.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
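Note that the processor expects 16 kHz mono audio, so recordings at other sampling rates should be resampled before being passed to the model. Below is a minimal sketch using torchaudio (an assumed dependency here; any resampler producing 16 kHz mono works) that reuses the `processor` from the step-by-step example above; `my_recording.wav` is a placeholder path.

```python
# Minimal sketch: resampling arbitrary audio to the 16 kHz the model expects.
# torchaudio is an assumed dependency; "my_recording.wav" is a placeholder path.
import torchaudio

waveform, orig_sr = torchaudio.load("my_recording.wav")
if orig_sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, orig_sr, 16_000)

speech = waveform.mean(dim=0).numpy()  # down-mix to mono if the file is stereo
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
```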
{"language": ["en", "tr"], "license": "mit", "tags": ["audio", "speech-translation", "automatic-speech-recognition", "speech2text2"], "datasets": ["covost2", "librispeech_asr"], "pipeline_tag": "automatic-speech-recognition", "widget": [{"example_title": "Common Voice 1", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99989.mp3"}, {"example_title": "Common Voice 2", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99986.mp3"}, {"example_title": "Common Voice 3", "src": "https://cdn-media.huggingface.co/speech_samples/common_voice_en_99987.mp3"}]}
automatic-speech-recognition
facebook/s2t-wav2vec2-large-en-tr
[ "transformers", "pytorch", "speech-encoder-decoder", "automatic-speech-recognition", "audio", "speech-translation", "speech2text2", "en", "tr", "dataset:covost2", "dataset:librispeech_asr", "arxiv:2104.06678", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2104.06678" ]
[ "en", "tr" ]
TAGS #transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #tr #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us
# S2T2-Wav2Vec2-CoVoST2-EN-TR-ST

's2t-wav2vec2-large-en-tr' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).
The S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in
Fairseq.

## Model description

S2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech
Translation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.

## Intended uses & limitations

This model can be used for end-to-end English speech to Turkish text translation.
See the model hub to look for other S2T2 checkpoints.

### How to use

As this is a standard sequence-to-sequence transformer model, you can use the 'generate' method to generate the
translations by passing the speech features to the model.

You can use the model directly via the ASR pipeline

or step-by-step as follows:

## Evaluation results

CoVoST-V2 test results for en-tr (BLEU score): 17.5

For more information, please have a look at the official paper - especially row 10 of Table 2.

### BibTeX entry and citation info
[ "# S2T2-Wav2Vec2-CoVoST2-EN-TR-ST\n\n's2t-wav2vec2-large-en-tr' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Turkish text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-tr (BLEU score): 17.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #tr #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n", "# S2T2-Wav2Vec2-CoVoST2-EN-TR-ST\n\n's2t-wav2vec2-large-en-tr' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.", "## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.", "## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Turkish text translation.\nSee the model hub to look for other S2T2 checkpoints.", "### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:", "## Evaluation results\n\nCoVoST-V2 test results for en-tr (BLEU score): 17.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.", "### BibTeX entry and citation info" ]
[ 93, 93, 107, 44, 67, 43, 11 ]
[ "passage: TAGS\n#transformers #pytorch #speech-encoder-decoder #automatic-speech-recognition #audio #speech-translation #speech2text2 #en #tr #dataset-covost2 #dataset-librispeech_asr #arxiv-2104.06678 #license-mit #endpoints_compatible #has_space #region-us \n# S2T2-Wav2Vec2-CoVoST2-EN-TR-ST\n\n's2t-wav2vec2-large-en-tr' is a Speech to Text Transformer model trained for end-to-end Speech Translation (ST).\nThe S2T2 model was proposed in Large-Scale Self- and Semi-Supervised Learning for Speech Translation and officially released in\nFairseq.## Model description\n\nS2T2 is a transformer-based seq2seq (speech encoder-decoder) model designed for end-to-end Automatic Speech Recognition (ASR) and Speech\nTranslation (ST). It uses a pretrained Wav2Vec2 as the encoder and a transformer-based decoder. The model is trained with standard autoregressive cross-entropy loss and generates the translations autoregressively.## Intended uses & limitations\n\nThis model can be used for end-to-end English speech to Turkish text translation.\nSee the model hub to look for other S2T2 checkpoints.### How to use\n\nAs this a standard sequence to sequence transformer model, you can use the 'generate' method to generate the\ntranscripts by passing the speech features to the model.\n\nYou can use the model directly via the ASR pipeline\n\n\n\nor step-by-step as follows:## Evaluation results\n\nCoVoST-V2 test results for en-tr (BLEU score): 17.5\n\nFor more information, please have a look at the official paper - especially row 10 of Table 2.### BibTeX entry and citation info" ]
[ -0.04585529863834381, 0.042493585497140884, -0.007307990919798613, -0.017139440402388573, 0.07139475643634796, -0.01103972364217043, 0.10865852981805801, 0.03599286451935768, -0.05491404980421066, 0.04557279124855995, 0.028887160122394562, 0.03828618675470352, 0.09442133456468582, 0.08140923082828522, 0.02686505950987339, -0.25153177976608276, 0.030961899086833, -0.05819372832775116, 0.07894717901945114, 0.06028180569410324, 0.1313108205795288, -0.044452838599681854, 0.07852942496538162, 0.05805876478552818, -0.010130336508154869, 0.03539608046412468, 0.040063828229904175, -0.08516701310873032, 0.05869204178452492, 0.06846075505018234, 0.0981561616063118, -0.0015829000622034073, 0.08480443060398102, -0.15787164866924286, 0.004269098863005638, 0.046660564839839935, 0.004726823884993792, 0.03119017742574215, 0.07689106464385986, -0.06916366517543793, 0.1278834044933319, -0.04022635519504547, 0.038501787930727005, 0.036688752472400665, -0.06318091601133347, -0.10127825289964676, -0.06120794638991356, 0.07340666651725769, 0.11098373681306839, 0.09057577699422836, -0.040326111018657684, -0.06985125690698624, -0.014317218214273453, 0.05858170613646507, 0.03440060093998909, -0.23721104860305786, -0.018164893612265587, -0.0977037101984024, -0.024064555764198303, 0.024528853595256805, -0.0022579189389944077, -0.010054931975901127, -0.025290312245488167, 0.006459956988692284, 0.0449831560254097, -0.06007455289363861, -0.006272477563470602, -0.08984997123479843, -0.10206801444292068, -0.03876405581831932, 0.10400985181331635, -0.04251336678862572, -0.062430914491415024, -0.07285439223051071, -0.03984721377491951, 0.007531572598963976, -0.00011651484965113923, -0.040676601231098175, -0.008655184879899025, 0.032058972865343094, 0.04950477555394173, -0.09800562262535095, -0.11736474186182022, -0.02481665462255478, -0.08907432109117508, 0.08082027733325958, 0.04328148812055588, 0.021216198801994324, -0.038010358810424805, 0.11153550446033478, -0.10533305257558823, -0.07758616656064987, -0.046163108199834824, -0.014220181852579117, -0.14713023602962494, -0.004409948829561472, -0.029464321210980415, -0.23006793856620789, -0.04761863127350807, 0.11928576231002808, 0.014238685369491577, 0.0653291866183281, -0.0026726049836724997, 0.04722039774060249, -0.03372006118297577, 0.19727258384227753, -0.028817379847168922, -0.059249211102724075, -0.010667086578905582, -0.04986167699098587, 0.0031153110321611166, 0.0019741691648960114, -0.08720799535512924, -0.041120655834674835, -0.012878314591944218, 0.04211798310279846, -0.006341905333101749, 0.029291508719325066, 0.014348176307976246, -0.0008243431802839041, 0.0949549451470375, -0.13359777629375458, 0.0263077262789011, 0.03640734776854515, -0.028698166832327843, 0.09172903746366501, 0.05305344983935356, 0.011146623641252518, -0.11510498821735382, 0.10038552433252335, 0.037071313709020615, 0.04894740879535675, -0.07049137353897095, -0.06396991014480591, -0.00917909573763609, -0.050462570041418076, -0.02856733277440071, -0.1351461559534073, -0.14439807832241058, -0.05627892166376114, 0.012236077338457108, -0.06321427971124649, 0.0693584606051445, -0.0856577679514885, 0.011964724399149418, 0.036285579204559326, -0.04590072110295296, -0.05315713956952095, 0.007526753004640341, -0.012831308878958225, -0.037176214158535004, 0.02445993758738041, -0.02846897393465042, 0.04079924896359444, -0.06374653428792953, -0.008306855335831642, -0.1237945407629013, 0.20008182525634766, 0.0179975014179945, -0.026471486315131187, -0.12285755574703217, -0.045750319957733154, 
-0.06490040570497513, 0.036013826727867126, 0.02316729538142681, 0.07607060670852661, -0.2083459496498108, -0.0253943782299757, 0.20966170728206635, -0.09428904950618744, 0.04609433561563492, 0.13486510515213013, -0.017836647108197212, 0.08823943138122559, 0.11131369322538376, 0.08004152774810791, 0.1259678602218628, -0.02390487864613533, -0.05833679065108299, 0.0009209899581037462, -0.06300660222768784, 0.11660780012607574, 0.05604028329253197, -0.07619673758745193, 0.07470331341028214, 0.012313119135797024, -0.05048090219497681, -0.003560316748917103, 0.04625847190618515, -0.04666221886873245, 0.022417429834604263, -0.018078844994306564, 0.050818704068660736, -0.05351155623793602, 0.04870966449379921, 0.016014663502573967, -0.10475272685289383, 0.062350593507289886, 0.041479822248220444, -0.061960529536008835, 0.06523511558771133, -0.14954787492752075, -0.015830345451831818, -0.02799699828028679, 0.008685644716024399, -0.17351242899894714, 0.05565492436289787, -0.0195819903165102, -0.008699017576873302, 0.15246328711509705, 0.03957563266158104, 0.05948857218027115, -0.014949791133403778, -0.017936022952198982, -0.008374453522264957, 0.07852140069007874, -0.013649467378854752, -0.03962823748588562, -0.05381892994046211, -0.0003442993329372257, -0.03223239257931709, 0.07609108835458755, -0.09107695519924164, 0.015089417807757854, 0.023894550278782845, 0.09364692121744156, 0.005794899072498083, -0.009381195530295372, -0.03992370888590813, 0.03612489625811577, 0.00647259596735239, -0.024341871961951256, -0.030877292156219482, -0.002242543501779437, -0.09066187590360641, 0.11943581700325012, -0.1931932419538498, -0.1124429777264595, 0.04107223451137543, 0.013284135609865189, -0.10482767969369888, 0.028492316603660583, 0.038732171058654785, -0.020624594762921333, 0.001893959124572575, -0.08953692018985748, 0.35449376702308655, 0.018478546291589737, 0.128255233168602, -0.05864402651786804, -0.03196415305137634, -0.034546200186014175, -0.061765044927597046, 0.022462375462055206, 0.054181259125471115, -0.0327434241771698, -0.20350953936576843, 0.027301006019115448, 0.03327609971165657, -0.05319259315729141, 0.11643359065055847, 0.03957970067858696, -0.07405472546815872, 0.005814492702484131, 0.005915519781410694, 0.02708946168422699, -0.04319493845105171, -0.003582439385354519, -0.01605623960494995, 0.027858447283506393, 0.04055556654930115, 0.030909404158592224, -0.06887039542198181, 0.08986079692840576, 0.06971083581447601, -0.046206019818782806, -0.06362006813287735, -0.0008420501253567636, 0.015038149431347847, 0.05622096359729767, 0.005950139369815588, -0.005370232276618481, -0.003156566061079502, -0.027518361806869507, -0.12900422513484955, 0.12004364281892776, -0.13991540670394897, -0.2359691709280014, -0.2116043120622635, -0.00383944483473897, 0.044626131653785706, 0.040101028978824615, 0.03571966663002968, -0.07623453438282013, -0.050902001559734344, -0.08186132460832596, 0.0875374972820282, -0.05843506008386612, -0.046790845692157745, -0.05691015347838402, -0.003054136410355568, 0.013143585994839668, -0.0692286491394043, 0.007353661116212606, -0.01723732054233551, -0.0819304957985878, 0.008044530637562275, 0.01181792002171278, -0.03441654518246651, 0.14843133091926575, -0.020757699385285378, -0.02700136788189411, -0.03728022798895836, 0.10866373032331467, -0.10052429884672165, 0.10174380242824554, 0.13631868362426758, -0.1253334879875183, 0.0410321019589901, 0.13407643139362335, 0.00689627043902874, -0.014194134622812271, -0.009352625347673893, -0.04090351611375809, 
-0.039857301861047745, -0.24225282669067383, -0.053918592631816864, -0.05022311955690384, -0.06632710993289948, 0.05243508145213127, 0.021400777623057365, 0.031413640826940536, 0.05699656158685684, -0.061580222100019455, 0.03615867346525192, 0.09126953035593033, 0.04616932198405266, 0.13591215014457703, 0.006150729022920132, 0.056806791573762894, -0.0769173875451088, -0.023825496435165405, 0.054151926189661026, -0.005940773990005255, 0.1284082978963852, 0.02041897177696228, 0.124354287981987, 0.011019173078238964, 0.013670097105205059, 0.044610388576984406, 0.07613030076026917, -0.04167744144797325, 0.04841945692896843, -0.0026414894964545965, -0.04973968118429184, -0.05506635829806328, 0.10565370321273804, 0.062140632420778275, -0.059077031910419464, 0.03372647613286972, 0.054685670882463455, 0.04094555601477623, 0.15766936540603638, 0.0460963174700737, -0.17904633283615112, -0.06777278333902359, 0.036961425095796585, -0.09091971069574356, -0.07047251611948013, -0.010667758993804455, 0.09681607782840729, -0.11604248732328415, 0.09109222143888474, 0.00012392281496431679, 0.07346296310424805, -0.10533789545297623, -0.01234445534646511, -0.021510839462280273, 0.09627395868301392, 0.04686245322227478, 0.05993986129760742, -0.19287534058094025, 0.07170092314481735, 0.01956811361014843, 0.1161298081278801, -0.03550490736961365, 0.06592389196157455, 0.004117235075682402, 0.057755790650844574, 0.10741124302148819, 0.012822008691728115, -0.198406383395195, -0.005514161661267281, -0.09954585880041122, -0.0021522706374526024, 0.11249855905771255, -0.044277373701334, 0.04805317148566246, -0.03727737441658974, -0.026319507509469986, -0.04006984457373619, -0.11706371605396271, -0.07116304337978363, -0.19393311440944672, 0.04195885360240936, 0.03842818737030029, 0.12904362380504608, -0.0049286410212516785, 0.030415400862693787, -0.03298182412981987, 0.14299391210079193, -0.2735421061515808, -0.08904030174016953, -0.08891560882329941, -0.07303714007139206, 0.14533835649490356, -0.10378655791282654, 0.04777757450938225, 0.039998576045036316, 0.07989353686571121, -0.005489179864525795, -0.08205876499414444, 0.02221708744764328, -0.08020263910293579, -0.10709436982870102, -0.02591296285390854, 0.12706808745861053, 0.04977024346590042, 0.08021653443574905, 0.06330475956201553, 0.03377619385719299, -0.020271655172109604, -0.05734547972679138, -0.03959229961037636, 0.18321992456912994, -0.026607848703861237, 0.05554625764489174, -0.019302042201161385, -0.15416665375232697, -0.10339879244565964, -0.0013084715465083718, 0.14896981418132782, 0.003780615283176303, -0.07310149818658829, 0.12349589169025421, 0.07453886419534683, -0.12049952894449234, -0.16098006069660187, -0.07007842510938644, 0.08226010203361511, 0.0580202117562294, -0.02913161925971508, -0.20073124766349792, 0.006026409100741148, 0.0035306166391819715, -0.003398950444534421, -0.026438290253281593, -0.20707657933235168, -0.13690438866615295, 0.1360442191362381, -0.09153535962104797, 0.02444332279264927, -0.008126802742481232, -0.07197828590869904, -0.07224928587675095, -0.018934985622763634, 0.0012897492852061987, -0.03254445269703865, 0.05085470527410507, 0.05986427515745163, 0.09176067262887955, 0.03446698188781738, -0.003060233313590288, 0.08117128908634186, 0.03738522529602051, -0.019526351243257523, -0.0630296990275383, 0.0680391788482666, -0.04977650195360184, -0.022768359631299973, 0.14385420083999634, 0.0036455439403653145, 0.0031543602235615253, -0.06670782715082169, -0.06800242513418198, -0.022593706846237183, 0.025123069062829018, 
0.007807902991771698, 0.015057452954351902, -0.04378972202539444, -0.009665947407484055, 0.03816929832100868, -0.0034859974402934313, 0.008788232691586018, -0.14162348210811615, -0.09909530729055405, 0.1035950630903244, 0.25578296184539795, 0.06784899532794952, -0.06384792923927307, -0.008585848845541477, -0.000004371608156361617, 0.06537320464849472, -0.0396038293838501, 0.05001814663410187, 0.07603701949119568, -0.04245747625827789, 0.11558381468057632, -0.0141976960003376, -0.12073508650064468, 0.08507230877876282, 0.06988369673490524, -0.016148872673511505, -0.1538407951593399, 0.0015599187463521957, 0.05589606612920761, -0.032689377665519714, 0.01152500044554472, 0.1705036163330078, -0.06071498617529869, -0.007041032426059246, -0.02141508087515831, 0.03848912566900253, -0.07437507808208466, 0.11068326979875565, -0.03606158867478371, 0.004343807697296143, -0.05511849373579025, 0.06122555583715439, 0.09084083139896393, -0.021959124132990837, 0.05279112607240677, 0.07397528737783432, -0.07703109830617905, -0.05615335702896118, -0.12038787454366684, -0.02980605885386467, -0.07851243764162064, -0.10937388241291046, 0.016353311017155647, -0.11641192436218262, -0.0005773742450401187, 0.10784348845481873, -0.02107488550245762, 0.039276741445064545, -0.06940764933824539, -0.02388736605644226, -0.043068062514066696, -0.0026189996860921383, 0.047547318041324615, 0.017253538593649864, -0.03529958426952362, 0.16738052666187286, -0.0009749918826855719, 0.01888388581573963, -0.018758976832032204, -0.008880802430212498, 0.0047592478804290295, 0.02864036336541176, -0.10748448222875595, 0.004070226103067398, -0.09340248256921768, -0.02889183722436428, 0.05103018879890442, -0.018816551193594933, 0.014115534722805023, 0.030517281964421272, -0.04752859100699425, 0.013219231739640236, -0.05867985263466835, 0.03250598534941673, -0.09844478964805603, 0.04427708312869072, 0.023226678371429443, -0.03953663632273674, 0.09658787399530411, 0.08178359270095825, -0.07474299520254135, 0.08351727575063705, -0.09618445485830307, -0.01968494988977909, 0.0015234309248626232, 0.052146509289741516, 0.010038770735263824, -0.015642788261175156, 0.012118689715862274, 0.03698699176311493, -0.023700956255197525, -0.015132379718124866, 0.04027234762907028, -0.054788585752248764, 0.10918895155191422, 0.00019235075160395354, 0.012694376520812511, -0.06473855674266815, 0.04081571847200394, 0.014471638016402721, -0.003046045545488596, 0.1228206604719162, -0.07283779978752136, 0.07071474939584732, -0.04409800097346306, 0.03569735214114189, 0.021195588633418083, 0.009337377734482288, 0.00008494652865920216, -0.10694923251867294, 0.05347835272550583, -0.003920146729797125, 0.07653902471065521, -0.0033000227995216846, -0.012217685580253601, 0.031881388276815414, -0.08542042225599289, -0.01885402947664261, 0.04632551223039627, 0.029195724055171013, 0.049537766724824905, -0.044654395431280136, -0.12191256880760193, -0.03572452813386917, -0.024958297610282898, -0.010962918400764465, 0.09520576149225235, 0.0935099646449089, 0.09088997542858124, 0.10097268968820572, 0.035316966474056244, 0.007728606462478638, -0.10340970009565353, -0.016102029010653496, -0.11845502257347107, -0.000931261049117893, -0.02557622455060482, 0.07363894581794739, 0.15681998431682587, -0.09691773355007172, 0.07415319979190826, 0.024782245978713036, -0.06186326593160629, -0.11257145553827286, -0.1872086375951767, -0.06085243821144104, -0.022064225748181343, -0.004302922170609236, -0.04813762754201889, 0.05650996044278145, 0.003500844119116664, 0.047875721007585526, 
-0.004557614680379629, 0.1473764032125473, -0.11436010152101517, -0.08910799771547318, 0.08145228028297424, -0.010132642462849617, 0.0909896120429039, 0.06411955505609512, 0.006811628583818674, 0.04590337723493576, 0.03462270647287369, 0.08206711709499359, 0.05753134563565254, 0.033544450998306274, -0.0020493019837886095, -0.007433372084051371, -0.021059514954686165, -0.008764785714447498, 0.0441640205681324, 0.024372050538659096, 0.11026890575885773, 0.07517992705106735, -0.048662248998880386, -0.025209270417690277, 0.10689250379800797, -0.06350874900817871, -0.09476594626903534, -0.16201327741146088, 0.17744259536266327, 0.066425621509552, 0.05057571828365326, -0.0009255427285097539, -0.08248136937618256, 0.009542915970087051, 0.18064869940280914, 0.16145874559879303, 0.012765433639287949, 0.022745594382286072, -0.03462076559662819, 0.025536851957440376, -0.009219796396791935, 0.06286002695560455, 0.006548773031681776, 0.29188811779022217, -0.007058983668684959, 0.05289100110530853, 0.01432719361037016, -0.0329025499522686, -0.0414876826107502, 0.0700611099600792, -0.1235695630311966, -0.014178860932588577, 0.008594067767262459, 0.11428678035736084, -0.09241844713687897, -0.1947244256734848, -0.018005870282649994, 0.03053407557308674, -0.04642382636666298, 0.06300940364599228, 0.07558229565620422, 0.06287913024425507, 0.031118061393499374, 0.01707281917333603, -0.07923423498868942, 0.18288075923919678, 0.018376344814896584, -0.0489821583032608, 0.03308351710438728, 0.04200773313641548, -0.10720396786928177, 0.10237978398799896, 0.029629774391651154, 0.09853897243738174, 0.039108194410800934, 0.06390595436096191, -0.04923994839191437, 0.0739244669675827, -0.0041517349891364574, -0.014192555099725723, 0.11034345626831055, 0.16000406444072723, -0.01658577285706997, 0.10129223018884659, 0.06633062660694122, -0.06558680534362793, 0.0855678915977478, -0.015113637782633305, -0.049902476370334625, -0.0488554947078228, 0.06052905693650246, -0.0780600905418396, 0.11780428141355515, 0.1769932061433792, -0.009529170580208302, 0.003508629510179162, -0.018630288541316986, -0.010978548787534237, -0.019677238538861275, 0.06210515648126602, -0.03667286038398743, -0.0975736454129219, -0.02651570923626423, 0.002257526619359851, 0.0624537318944931, -0.16710051894187927, -0.033601608127355576, -0.010536914691329002, -0.031355880200862885, -0.020359400659799576, 0.06494736671447754, 0.0564420223236084, 0.025336677208542824, -0.02756008692085743, -0.047862619161605835, 0.035622820258140564, 0.0649818629026413, -0.10809260606765747, -0.042798034846782684 ]
null
null
fairseq
# tts_transformer-ar-cv7 [Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)): - Arabic - Single-speaker male voice - Trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets) ## Usage ```python from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub from fairseq.models.text_to_speech.hub_interface import TTSHubInterface import IPython.display as ipd models, cfg, task = load_model_ensemble_and_task_from_hf_hub( "facebook/tts_transformer-ar-cv7", arg_overrides={"vocoder": "hifigan", "fp16": False} ) model = models[0] TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg) generator = task.build_generator(model, cfg) text = "مرحبًا ، هذا اختبار تشغيل." sample = TTSHubInterface.get_model_input(task, text) wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample) ipd.Audio(wav, rate=rate) ``` See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md). ## Citation ```bibtex @inproceedings{wang-etal-2021-fairseq, title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit", author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan", booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations", month = nov, year = "2021", address = "Online and Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.emnlp-demo.17", doi = "10.18653/v1/2021.emnlp-demo.17", pages = "143--152", } ```
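As a small follow-up to the Usage snippet, the generated waveform can also be written to disk instead of only being played inline. This is a sketch under the assumptions (not stated in the card) that soundfile is installed and that `wav` is a CPU tensor or NumPy array; `np.asarray` covers both cases.

```python
import numpy as np
import soundfile as sf

# Reuses `wav` and `rate` from the Usage snippet above.
# np.asarray converts a CPU torch tensor via its __array__ hook; NumPy input passes through.
sf.write("tts_output.wav", np.asarray(wav), rate)
```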
{"language": "ar", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice"], "task": "text-to-speech", "widget": [{"text": "\u0645\u0631\u062d\u0628\u064b\u0627 \u060c \u0647\u0630\u0627 \u0627\u062e\u062a\u0628\u0627\u0631 \u062a\u0634\u063a\u064a\u0644.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-ar-cv7
[ "fairseq", "audio", "text-to-speech", "ar", "dataset:common_voice", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "ar" ]
TAGS #fairseq #audio #text-to-speech #ar #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-ar-cv7 Transformer text-to-speech model from fairseq S^2 (paper/code): - Arabic - Single-speaker male voice - Trained on Common Voice v7 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-ar-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Arabic\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #ar #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-ar-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Arabic\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 52, 51, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #ar #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-ar-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Arabic\n- Single-speaker male voice\n- Trained on Common Voice v7## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.09502340853214264, -0.006236296147108078, -0.002895836951211095, -0.07096204161643982, 0.06232909485697746, -0.05690997466444969, 0.19318023324012756, 0.0801331177353859, -0.06382632255554199, -0.02061825804412365, 0.026450475677847862, 0.07595744729042053, 0.07197660952806473, -0.003376915818080306, -0.03949509561061859, -0.18933223187923431, 0.04193912073969841, -0.004965417552739382, -0.03839922696352005, 0.043294742703437805, 0.1569766104221344, -0.011656749062240124, 0.011717713437974453, 0.03644605726003647, -0.06147899851202965, 0.0612189918756485, 0.07048187404870987, -0.11810312420129776, 0.10468044877052307, 0.10920552164316177, 0.0267587099224329, 0.097416952252388, 0.051300324499607086, -0.13988149166107178, 0.04109472036361694, -0.026483433321118355, 0.05075519159436226, 0.03814270719885826, -0.044402576982975006, 0.013715739361941814, 0.03704161196947098, 0.08095651119947433, -0.021225174888968468, 0.05899648368358612, -0.10544121265411377, -0.19389334321022034, 0.042883120477199554, -0.039811599999666214, 0.042812444269657135, 0.04778263345360756, -0.04483947902917862, 0.024889079853892326, -0.06675364822149277, 0.06814398616552353, 0.10806149244308472, -0.2268766164779663, 0.010354401543736458, -0.051552265882492065, 0.036639098078012466, 0.09953580051660538, -0.05707167461514473, 0.033932946622371674, 0.035574354231357574, -0.025470631197094917, -0.1512421816587448, -0.0917910560965538, -0.18836496770381927, 0.03204217553138733, -0.08830652385950089, 0.03594480827450752, 0.3743746876716614, 0.025765497237443924, -0.037612657994031906, -0.05330929532647133, -0.05797652155160904, 0.12016256898641586, 0.027725093066692352, -0.06848371773958206, -0.043761156499385834, 0.062977135181427, -0.024382872506976128, -0.07185377180576324, -0.1292651742696762, -0.06224711611866951, -0.11950431019067764, 0.13952109217643738, -0.024543263018131256, -0.002164108445867896, -0.06337092816829681, -0.0399676151573658, -0.09841391444206238, -0.04469313099980354, 0.055687904357910156, -0.03470170125365257, -0.053890686482191086, 0.011125716380774975, -0.0015912143280729651, -0.29841992259025574, 0.11343394219875336, -0.1047651618719101, -0.0320761539041996, 0.05771152675151825, -0.06785419583320618, 0.0705125629901886, 0.017370587214827538, -0.01784721575677395, -0.009104330092668533, -0.01048129890114069, -0.012394720688462257, 0.021304503083229065, 0.003682343987748027, -0.058766115456819534, -0.1470590978860855, -0.03490760549902916, -0.0681435614824295, 0.030445851385593414, -0.019053157418966293, 0.016467684879899025, -0.026695484295487404, 0.0008775819442234933, 0.15690428018569946, -0.04908024147152901, -0.05332542583346367, 0.04822273924946785, 0.020704146474599838, 0.0004316282575018704, 0.013709231279790401, 0.06982464343309402, -0.01820247992873192, -0.15569494664669037, -0.006517382804304361, 0.0031298049725592136, 0.06464235484600067, -0.11937092244625092, 0.04255177080631256, 0.0222789216786623, 0.01029868796467781, -0.14030030369758606, -0.02660682611167431, -0.051990050822496414, -0.07930607348680496, 0.07593736797571182, -0.08326265960931778, -0.12946771085262299, -0.05818290263414383, 0.04657745733857155, -0.07587000727653503, -0.0707193911075592, -0.056968625634908676, 0.07094009965658188, -0.03369138389825821, 0.08811342716217041, -0.07465814799070358, 0.05850144848227501, 0.005752568133175373, -0.04468691721558571, -0.09418225288391113, 0.14162881672382355, -0.0025607789866626263, -0.08802090585231781, -0.052123263478279114, 0.012827038764953613, -0.08963993191719055, 
0.08002204447984695, -0.03075946494936943, 0.13634894788265228, -0.21807560324668884, -0.09941753000020981, 0.10454993695020676, -0.07320766896009445, -0.06828291714191437, 0.15236395597457886, 0.03259865567088127, 0.07207702845335007, 0.108125701546669, 0.3287963569164276, 0.07049859315156937, -0.12649540603160858, -0.04588295519351959, 0.17755861580371857, 0.018373269587755203, 0.022538186982274055, 0.09400807321071625, -0.08518676459789276, 0.07367555797100067, -0.0482570119202137, 0.17150790989398956, 0.035737358033657074, -0.08264908939599991, -0.022707846015691757, 0.08459530025720596, -0.058189816772937775, 0.13227154314517975, -0.028016522526741028, 0.020085323601961136, 0.005611712113022804, -0.0573870874941349, 0.023495083674788475, 0.0865936279296875, -0.09153121709823608, 0.08182434737682343, -0.18076246976852417, 0.027250222861766815, -0.1169792041182518, 0.011152182705700397, -0.15158911049365997, 0.06173563003540039, -0.0835840106010437, 0.06060966104269028, 0.15884418785572052, 0.18535850942134857, 0.028365617617964745, -0.02064749225974083, -0.0884457677602768, 0.030770068988204002, 0.15666258335113525, 0.08709783852100372, -0.04714386537671089, -0.20703446865081787, 0.11579219251871109, -0.10310379415750504, 0.07271616905927658, -0.11106175184249878, -0.010677295736968517, 0.12413951754570007, 0.058193255215883255, 0.03605309873819351, 0.029809506610035896, 0.07144028693437576, 0.05408366397023201, 0.011823613196611404, 0.010832538828253746, 0.010049011558294296, 0.01690654829144478, -0.15790903568267822, 0.2210315316915512, -0.19378606975078583, 0.13397130370140076, 0.08012348413467407, -0.14646930992603302, -0.0592963807284832, 0.11450404673814774, 0.03897060826420784, 0.005064846482127905, 0.02805209904909134, -0.10698290914297104, 0.20763826370239258, -0.07197091728448868, 0.090569406747818, -0.05994679406285286, 0.07003194838762283, 0.024929948151111603, -0.11008003354072571, 0.04301045089960098, 0.10362989455461502, -0.15612080693244934, -0.16262935101985931, 0.06546739488840103, 0.12022843956947327, -0.01901349425315857, 0.22058016061782837, -0.045357633382081985, -0.0012106822105124593, 0.02034463733434677, 0.015153014101088047, -0.03139819577336311, 0.05486864224076271, -0.20508408546447754, -0.03321007266640663, 0.01875445991754532, 0.06490368396043777, 0.07304561138153076, -0.08862145245075226, -0.006197473034262657, 0.005613434594124556, -0.08138787001371384, -0.2314245104789734, 0.10289132595062256, -0.02838745340704918, 0.09623537212610245, -0.04161063954234123, -0.07938362658023834, 0.06251594424247742, -0.044626738876104355, -0.14177440106868744, 0.10283777862787247, -0.15560707449913025, -0.09134898334741592, -0.08809883892536163, -0.005406700074672699, 0.02853073924779892, 0.09761500358581543, 0.13495297729969025, -0.1209496334195137, 0.02265106327831745, -0.040556494146585464, 0.07098530232906342, -0.0025011813268065453, 0.03295142203569412, -0.03467771038413048, -0.07816898077726364, 0.05265725031495094, -0.09806433320045471, 0.02052481658756733, 0.0042567262426018715, -0.014299580827355385, 0.03185666352510452, -0.10028843581676483, 0.025675160810351372, 0.21915322542190552, 0.10053602606058121, -0.03746483474969864, -0.057970982044935226, 0.18263158202171326, -0.11678418517112732, -0.04695693030953407, 0.14502806961536407, -0.024431155994534492, 0.007459590211510658, 0.17196005582809448, 0.01276780478656292, 0.0013815501006320119, -0.013716010376811028, -0.03363256901502609, -0.06751151382923126, -0.18229199945926666, -0.035988036543130875, 
-0.09180808067321777, -0.0014819904463365674, -0.21400706470012665, 0.009605412371456623, -0.03346602991223335, -0.061587538570165634, -0.011304900050163269, -0.04672897607088089, 0.154317244887352, -0.024730868637561798, 0.2529522478580475, -0.08695115149021149, 0.06559130549430847, -0.09147703647613525, -0.0810495913028717, 0.09981588274240494, -0.02808421477675438, 0.02229490876197815, 0.11514046043157578, 0.13167628645896912, 0.018547849729657173, -0.04004290699958801, 0.10142561048269272, 0.028614288195967674, 0.05878995358943939, -0.013488588854670525, -0.05111394077539444, -0.04657868668437004, -0.03213871642947197, 0.02520371600985527, 0.23575150966644287, -0.09692196547985077, 0.0546293705701828, 0.0004985859850421548, 0.057198166847229004, -0.028069350868463516, 0.16743424534797668, -0.09559222310781479, -0.0097756776958704, 0.043744154274463654, -0.09684339910745621, -0.0019345155451446772, 0.05799166113138199, 0.1819247156381607, -0.013034092262387276, 0.09749220311641693, 0.13292983174324036, 0.006193231325596571, -0.046699050813913345, 0.06273842602968216, -0.1805240362882614, -0.07195837050676346, -0.044174231588840485, 0.04010365530848503, -0.15902726352214813, 0.16127145290374756, 0.0498364232480526, 0.01087728701531887, 0.012984936125576496, 0.0024325931444764137, 0.028457216918468475, 0.1348094493150711, 0.10869765281677246, 0.0031773275695741177, -0.016126977279782295, -0.12840114533901215, -0.025498108938336372, 0.019024638459086418, 0.12335090339183807, 0.0767749696969986, -0.05287861078977585, 0.04223937168717384, -0.0032007968984544277, 0.056736141443252563, 0.02020629681646824, -0.08713586628437042, -0.06363261491060257, 0.044071927666664124, 0.15475748479366302, 0.13833235204219818, -0.00018378377717453986, -0.036558281630277634, -0.15054626762866974, -0.08082830905914307, -0.11731436103582382, -0.035003919154405594, -0.058335185050964355, -0.11783060431480408, 0.08340676873922348, -0.06081976741552353, -0.07785753905773163, 0.0400281585752964, -0.004679043311625719, -0.007465974893420935, 0.012543934397399426, 0.1142013818025589, -0.03056176006793976, -0.059228431433439255, -0.043833937495946884, 0.15624278783798218, 0.009860750287771225, 0.1071498692035675, 0.021251993253827095, -0.033298101276159286, -0.04003652185201645, -0.03820683807134628, 0.061185773462057114, -0.05795706436038017, -0.13728953897953033, 0.09363844990730286, 0.0952986553311348, -0.10715372860431671, -0.03801175579428673, -0.05458621308207512, 0.1591460257768631, 0.1682155430316925, 0.004863756708800793, 0.07827527821063995, 0.23043473064899445, -0.030187075957655907, -0.23024101555347443, -0.11676736921072006, 0.020325716584920883, 0.041804514825344086, -0.030367590487003326, -0.11943250149488449, 0.03433534875512123, -0.1145368367433548, -0.025774654000997543, -0.09413494914770126, -0.13292339444160461, -0.12219755351543427, 0.18546180427074432, -0.1766907125711441, 0.20900465548038483, -0.04441431909799576, -0.05994975566864014, -0.07055652886629105, 0.0506451241672039, 0.05497681722044945, -0.23101484775543213, 0.10559319704771042, 0.12746469676494598, 0.05225111171603203, 0.019627278670668602, 0.05414997413754463, 0.1235818937420845, 0.03388163447380066, -0.020017487928271294, -0.0019305397290736437, -0.04007330909371376, 0.01248069666326046, 0.07618632167577744, -0.04469877853989601, 0.011116499081254005, 0.001577338669449091, -0.10958351939916611, -0.050095412880182266, -0.04968817159533501, 0.0044647748582065105, 0.09456811100244522, -0.006956415716558695, -0.029473422095179558, 
-0.11833660304546356, 0.002876825863495469, -0.021113930270075798, 0.24991805851459503, -0.18851466476917267, 0.15014800429344177, 0.15499605238437653, 0.2594640254974365, -0.17202313244342804, 0.04524402692914009, 0.0007758801802992821, -0.0828518196940422, 0.09189611673355103, -0.1964453011751175, 0.04436483606696129, 0.04366926848888397, 0.00023039041843730956, 0.15697060525417328, 0.03323829174041748, -0.062355052679777145, 0.16319026052951813, 0.09429824352264404, -0.03962736949324608, -0.1497897505760193, -0.03708496317267418, -0.06029844656586647, -0.020268792286515236, 0.024576770141720772, 0.16902966797351837, 0.0059897033497691154, 0.05435548722743988, -0.008935424499213696, -0.02162802964448929, -0.14110606908798218, 0.19894573092460632, 0.0776384025812149, 0.03626285120844841, -0.07804570347070694, 0.04065616428852081, -0.009066605940461159, -0.0777251273393631, -0.02917340025305748, -0.019651202484965324, -0.0553584061563015, -0.06335946917533875, -0.15965506434440613, 0.10456183552742004, -0.03083467110991478, -0.1180446445941925, -0.052196651697158813, -0.16223536431789398, -0.021965928375720978, 0.19138824939727783, -0.0035986911971122026, 0.05868354067206383, -0.030896689742803574, 0.035790372639894485, 0.06606324762105942, 0.002179335569962859, 0.04863239824771881, -0.027808599174022675, -0.10830744355916977, 0.12899048626422882, -0.0620659738779068, 0.11618439108133316, -0.06653815507888794, -0.015961337834596634, -0.041914328932762146, 0.07295379787683487, -0.0413980633020401, -0.006721111945807934, -0.0667462944984436, 0.005192828364670277, 0.0407758429646492, -0.07971052080392838, -0.007633307948708534, -0.026208413764834404, -0.08845048397779465, 0.05685681477189064, 0.06276771426200867, 0.09993208944797516, -0.05597621202468872, 0.030275437980890274, 0.02031543478369713, -0.017323944717645645, 0.1185806542634964, 0.09334023296833038, -0.11001040041446686, 0.14215087890625, -0.16789022088050842, -0.07337412238121033, 0.1326625943183899, 0.07192079722881317, -0.018285676836967468, 0.03203436732292175, -0.056288763880729675, 0.09611281752586365, 0.031768813729286194, -0.008494816720485687, 0.01596955955028534, 0.056567657738924026, 0.10214947909116745, -0.1756858229637146, -0.007754529360681772, -0.04508361592888832, 0.007248392794281244, 0.15299539268016815, 0.12759266793727875, 0.1133214682340622, -0.0642346516251564, -0.004484373144805431, -0.020499154925346375, 0.050935350358486176, 0.017753135412931442, -0.10255640000104904, 0.021188020706176758, -0.058975670486688614, 0.06981038302183151, -0.09920879453420639, 0.04365832731127739, -0.0767248198390007, -0.026007747277617455, 0.021553639322519302, 0.0470404177904129, -0.02221154421567917, -0.002998013747856021, 0.029945500195026398, 0.10546194761991501, -0.021606728434562683, -0.09681269526481628, -0.0012908575590699911, 0.022282799705863, 0.18771584331989288, -0.08563823997974396, -0.019458763301372528, 0.021057408303022385, 0.07903575152158737, 0.051004886627197266, -0.05614810064435005, -0.011665835045278072, 0.06078183278441429, -0.061877068132162094, -0.05260573327541351, -0.07520828396081924, 0.036269061267375946, 0.17212481796741486, 0.004802855663001537, -0.014010305516421795, -0.02824985235929489, -0.05271756649017334, -0.20833848416805267, -0.016234103590250015, -0.05928909778594971, -0.08428219705820084, -0.01622733660042286, -0.06866562366485596, 0.09228391200304031, 0.16704556345939636, 0.03923673555254936, 0.0789782926440239, 0.10797132551670074, -0.10335458815097809, -0.12595587968826294, 
0.019593795761466026, -0.07346334308385849, 0.030370814725756645, -0.08458100259304047, -0.02287326566874981, 0.19185934960842133, -0.0026531857438385487, 0.019034113734960556, 0.008876815438270569, -0.030931126326322556, -0.018142541870474815, -0.190357968211174, -0.047917935997247696, -0.034991469234228134, 0.07349834591150284, 0.03570510074496269, 0.08021208643913269, 0.09177324920892715, -0.04548877477645874, 0.028020288795232773, 0.1577572524547577, -0.08079323917627335, -0.1972445696592331, -0.1349240094423294, -0.03859344869852066, -0.0500335730612278, 0.11110449582338333, -0.082538902759552, -0.08610744029283524, -0.046055544167757034, 0.05740232393145561, 0.25773653388023376, -0.10176343470811844, 0.07756559550762177, -0.016700703650712967, 0.01084689237177372, -0.012755386531352997, -0.06236247718334198, 0.04343583062291145, 0.14519940316677094, 0.029291309416294098, 0.01196001935750246, -0.05962526798248291, 0.01569933071732521, 0.02523854561150074, 0.08172547072172165, -0.018266260623931885, -0.049096524715423584, -0.005422830581665039, 0.1263311803340912, -0.16673803329467773, -0.20662853121757507, -0.1760883331298828, -0.09636510163545609, 0.021015062928199768, 0.006799045950174332, 0.03523681312799454, 0.1380392462015152, -0.021006319671869278, -0.05674780532717705, -0.050896283239126205, -0.029688799753785133, -0.0011736898450180888, -0.07468466460704803, 0.05222287029027939, 0.021976392716169357, -0.22666777670383453, -0.10516080260276794, 0.00414320407435298, 0.20603825151920319, 0.01586327515542507, 0.07505198568105698, 0.0606074295938015, 0.179803729057312, 0.025703592225909233, -0.05007578432559967, -0.015726417303085327, 0.08963307738304138, -0.03830789029598236, 0.2541838586330414, 0.04369458183646202, -0.08971887826919556, 0.06909888982772827, 0.006742376368492842, -0.12891310453414917, -0.0016308700433000922, -0.010540742427110672, -0.0688972920179367, 0.07102911919355392, -0.021913910284638405, -0.011830690316855907, 0.011811125092208385, 0.046786971390247345, -0.07098698616027832, 0.00636205542832613, -0.08139631897211075, -0.0929371789097786, -0.1746349334716797, -0.029283735901117325, -0.02805948071181774, 0.021219737827777863, -0.043900568038225174, -0.010158753022551537, -0.07492396980524063, 0.034188412129879, 0.04333795607089996, 0.06657573580741882, 0.10819797962903976, -0.022404499351978302, 0.0030189594253897667, -0.008047048933804035, 0.05022052675485611, 0.12973541021347046, -0.03811420872807503, -0.09563422203063965 ]
null
null
fairseq
# tts_transformer-en-200_speaker-cv4 [Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)): - English - 200 male/female voices (random speaker when using the widget) - Trained on [Common Voice v4](https://commonvoice.mozilla.org/en/datasets) ## Usage ```python from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub from fairseq.models.text_to_speech.hub_interface import TTSHubInterface import IPython.display as ipd models, cfg, task = load_model_ensemble_and_task_from_hf_hub( "facebook/tts_transformer-en-200_speaker-cv4", arg_overrides={"vocoder": "hifigan", "fp16": False} ) model = models[0] TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg) generator = task.build_generator(model, cfg) text = "Hello, this is a test run." sample = TTSHubInterface.get_model_input(task, text) wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample) ipd.Audio(wav, rate=rate) ``` See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md). ## Citation ```bibtex @inproceedings{wang-etal-2021-fairseq, title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit", author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan", booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations", month = nov, year = "2021", address = "Online and Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.emnlp-demo.17", doi = "10.18653/v1/2021.emnlp-demo.17", pages = "143--152", } ```
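Because this checkpoint has 200 voices and the widget picks one at random, you may want to pin a specific voice. A minimal sketch, assuming `TTSHubInterface.get_model_input` accepts an optional integer `speaker` argument (as in recent fairseq versions); valid ids here should be 0-199, and 42 is an arbitrary choice for illustration.

```python
# Reuses `task`, `model`, and `generator` from the Usage snippet above.
text = "Hello, this is a test run."
sample = TTSHubInterface.get_model_input(task, text, speaker=42)  # fix the voice
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
```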
{"language": "en", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech", "multi-speaker"], "datasets": ["common_voice"], "task": "text-to-speech", "widget": [{"text": "Hello, this is a test run.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-en-200_speaker-cv4
[ "fairseq", "audio", "text-to-speech", "multi-speaker", "en", "dataset:common_voice", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "en" ]
TAGS #fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-en-200_speaker-cv4 Transformer text-to-speech model from fairseq S^2 (paper/code): - English - 200 male/female voices (random speaker when using the widget) - Trained on Common Voice v4 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-en-200_speaker-cv4\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-en-200_speaker-cv4\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 58, 65, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #multi-speaker #en #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-en-200_speaker-cv4\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- 200 male/female voices (random speaker when using the widget)\n- Trained on Common Voice v4## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.08735895156860352, 0.0383245125412941, -0.0031902920454740524, -0.04828997701406479, 0.0712289959192276, -0.039190538227558136, 0.13497516512870789, 0.052772171795368195, -0.038431961089372635, 0.005817426834255457, 0.03310609236359596, 0.08772297203540802, 0.03666983172297478, 0.01606088876724243, -0.06721638888120651, -0.23441052436828613, 0.05855416879057884, -0.016494499519467354, 0.007630326319485903, 0.05346798896789551, 0.11429980397224426, -0.022594718262553215, 0.0030757179483771324, 0.05556813254952431, -0.08984136581420898, 0.049928002059459686, 0.07463131844997406, -0.10759638994932175, 0.10482653975486755, 0.10186050832271576, 0.042494259774684906, 0.09057654440402985, 0.06289393454790115, -0.16314248740673065, 0.030846774578094482, -0.020697854459285736, 0.07328220456838608, 0.03623413294553757, -0.021116849035024643, 0.01725032739341259, 0.033211592584848404, 0.11198838800191879, -0.0001620558905415237, 0.08984285593032837, -0.08167485147714615, -0.1778305470943451, 0.006399694364517927, -0.0389145128428936, -0.047828614711761475, 0.012206552550196648, -0.05811162292957306, 0.06714210659265518, -0.09608859568834305, 0.07106051594018936, 0.13121750950813293, -0.25751620531082153, 0.005981752648949623, -0.037579260766506195, 0.05602620169520378, 0.06422823667526245, -0.06103082746267319, 0.03571688011288643, 0.06026478484272957, 0.026276912540197372, -0.12858864665031433, -0.09287365525960922, -0.18400397896766663, 0.007412157952785492, -0.11685983091592789, 0.08017604798078537, 0.34952831268310547, 0.06003231555223465, -0.0556054562330246, -0.09082607924938202, -0.05064969137310982, 0.1727878898382187, 0.002630854258313775, -0.0681060254573822, -0.04845398664474487, 0.05091708153486252, -0.04474719986319542, -0.05409625545144081, -0.12274473160505295, -0.08115481585264206, -0.12441863864660263, 0.18164044618606567, -0.015805520117282867, 0.010581819340586662, -0.07356232404708862, -0.02236330695450306, -0.11371739953756332, -0.022689664736390114, 0.038658276200294495, -0.07697559893131256, -0.06901610642671585, 0.0061745657585561275, -0.02101857401430607, -0.23577256500720978, 0.12458023428916931, -0.08616112172603607, -0.07074522972106934, 0.05199313908815384, -0.09562085568904877, 0.059735771268606186, 0.0689665749669075, -0.06475604325532913, -0.042368583381175995, -0.008633499033749104, -0.028985818848013878, -0.029773779213428497, -0.0005435507628135383, -0.061385318636894226, -0.11764726787805557, -0.05736730620265007, -0.0996524915099144, 0.04213234409689903, -0.015410161577165127, 0.04973883926868439, -0.04073312506079674, -0.014470675028860569, 0.11683958023786545, -0.03355931490659714, -0.02543310634791851, 0.05998803675174713, 0.02830609492957592, 0.11844153702259064, 0.0011329262051731348, 0.07908409833908081, -0.02412608452141285, -0.16003625094890594, -0.022178728133440018, -0.013158472254872322, 0.036500196903944016, -0.13725075125694275, 0.023218216374516487, 0.03375490382313728, -0.007625453639775515, -0.14047250151634216, -0.02532367780804634, -0.050762370228767395, -0.09191534668207169, 0.03906971216201782, -0.09248022735118866, -0.14156106114387512, -0.06162387877702713, 0.06297028809785843, -0.05916040018200874, -0.06127113848924637, -0.06316658109426498, 0.0526779405772686, -0.08690369129180908, 0.09088749438524246, -0.07825174927711487, 0.0470292866230011, 0.012584466487169266, -0.01768687181174755, -0.06911318004131317, 0.14478038251399994, -0.012904814444482327, -0.07331830263137817, -0.04292531684041023, -0.03327352553606033, 
-0.08764069527387619, 0.10977891087532043, -0.028338925912976265, 0.15850815176963806, -0.20593078434467316, -0.07002805173397064, 0.14007355272769928, -0.07639991492033005, -0.018207138404250145, 0.12881478667259216, 0.02468673512339592, 0.10009472817182541, 0.11319637298583984, 0.33575674891471863, 0.09754577279090881, -0.11319560557603836, 0.02053341269493103, 0.12595397233963013, -0.06785997748374939, 0.05468587949872017, 0.032218243926763535, -0.09852009266614914, 0.050552308559417725, -0.03426416963338852, 0.18477053940296173, 0.034743260592222214, -0.08068098872900009, -0.004719762597233057, 0.042514391243457794, -0.08396013081073761, 0.1271713674068451, -0.05086563527584076, -0.004550970625132322, -0.01582977920770645, -0.016603348776698112, 0.0464998297393322, 0.110472671687603, -0.1187131330370903, 0.08109774440526962, -0.17468225955963135, 0.013416907750070095, -0.020733101293444633, 0.01694394461810589, -0.1779199242591858, 0.07542562484741211, -0.05231616646051407, 0.09921742975711823, 0.13699699938297272, 0.23895534873008728, 0.031531497836112976, -0.00940379686653614, -0.06588829308748245, 0.026329683139920235, 0.16838151216506958, 0.07662896811962128, -0.021682672202587128, -0.20256027579307556, 0.0839449018239975, -0.09368740767240524, 0.0010271539213135839, -0.11897861957550049, 0.0011018463410437107, 0.15258881449699402, 0.059493448585271835, 0.04003889858722687, 0.0020824484527111053, 0.07236648350954056, 0.047977276146411896, 0.0017860858934000134, 0.01956229843199253, 0.01825646497309208, 0.011676279827952385, -0.10775038599967957, 0.22156524658203125, -0.15797683596611023, 0.05105031281709671, 0.0693892240524292, -0.1287996768951416, -0.08442169427871704, 0.10804957151412964, 0.028043579310178757, 0.010907575488090515, -0.010242382995784283, -0.12322244793176651, 0.19156262278556824, -0.06540876626968384, 0.07965321838855743, -0.03854097053408623, 0.06360255926847458, 0.017703328281641006, -0.0818471685051918, 0.02623744122684002, 0.09096196293830872, -0.12869638204574585, -0.1455610692501068, 0.023699289187788963, 0.025144614279270172, -0.00454444857314229, 0.24033978581428528, -0.025974564254283905, -0.009085969999432564, -0.004896347410976887, -0.013773505575954914, -0.03571752831339836, 0.034800056368112564, -0.18634921312332153, -0.05005572363734245, 0.034489959478378296, 0.070897176861763, 0.05362803116440773, -0.08895190060138702, -0.017276445403695107, -0.01641676016151905, -0.07382027059793472, -0.20758438110351562, 0.0991336852312088, -0.0014426300767809153, 0.08747130632400513, -0.06309736520051956, -0.10286933183670044, 0.05507439002394676, -0.052174244076013565, -0.14774617552757263, 0.08346495777368546, -0.1418745368719101, -0.10509487986564636, -0.10357185453176498, -0.029069194570183754, 0.049762897193431854, 0.0691533237695694, 0.12233342975378036, -0.07214963436126709, -0.025325072929263115, -0.05246219038963318, 0.13264822959899902, -0.02860003337264061, 0.02516827918589115, -0.03239677473902702, -0.05048966407775879, 0.0539156049489975, -0.10725658386945724, 0.029944423586130142, -0.0032022602390497923, 0.04015949368476868, 0.020062487572431564, -0.09145808964967728, 0.056702788919210434, 0.2486073076725006, 0.1117924228310585, -0.0348929725587368, -0.027301538735628128, 0.19088305532932281, -0.10838597267866135, -0.010823330841958523, 0.13527049124240875, -0.028473572805523872, 0.0036240292247384787, 0.15772174298763275, 0.053010739386081696, 0.006600667722523212, 0.008530440740287304, -0.024715760722756386, -0.05284520238637924, 
-0.19743536412715912, -0.06290289014577866, -0.07159531116485596, 0.012865999713540077, -0.1629447191953659, -0.016516709700226784, -0.04227583482861519, -0.02996453456580639, -0.01250428520143032, -0.0631619244813919, 0.1223253533244133, -0.024248173460364342, 0.234352245926857, -0.09449877589941025, 0.09795968234539032, -0.07084857672452927, -0.09055148810148239, 0.09489260613918304, -0.06580428034067154, 0.08218079060316086, 0.09224940836429596, 0.15512561798095703, 0.04789799079298973, -0.00453836377710104, 0.11630875617265701, 0.04208332300186157, 0.06481387466192245, -0.00820631068199873, -0.03653153032064438, -0.04098467156291008, 0.046374302357435226, 0.03896588832139969, 0.2533952295780182, -0.11278129369020462, 0.017653217539191246, -0.06636045128107071, 0.03759961947798729, 0.0656338483095169, 0.17318494617938995, -0.06139085069298744, -0.008621825836598873, 0.057386480271816254, -0.10375574231147766, -0.023806260898709297, 0.10059723258018494, 0.2630224823951721, -0.05879220739006996, 0.10315775126218796, 0.12908101081848145, 0.020998898893594742, -0.042052071541547775, 0.0613618828356266, -0.1866120994091034, -0.0023176297545433044, -0.05681360512971878, 0.03722535818815231, -0.1198263093829155, 0.1326756775379181, 0.03503897786140442, 0.010672316886484623, -0.01884990558028221, -0.021829644218087196, 0.021601637825369835, 0.12384122610092163, 0.08125047385692596, -0.01853414811193943, -0.04033508896827698, -0.10195530205965042, 0.01905190944671631, 0.0006952176336199045, 0.11327916383743286, 0.10421053320169449, -0.01033517811447382, 0.045939452946186066, -0.0012530350359156728, 0.05155829340219498, -0.047393906861543655, -0.14213056862354279, -0.04900045320391655, 0.012330002151429653, 0.20679248869419098, 0.10399598628282547, -0.006257218774408102, -0.0557415597140789, -0.1494760811328888, -0.03998250514268875, -0.11892703175544739, -0.00938927661627531, -0.04914739355444908, -0.11481256783008575, 0.10888577252626419, -0.02496805042028427, -0.05316285416483879, 0.06150170415639877, 0.07178027927875519, -0.03386467695236206, -0.0047949375584721565, 0.09127750992774963, -0.027578389272093773, -0.09149561822414398, -0.04358945041894913, 0.17040212452411652, 0.06353286653757095, 0.10100417584180832, 0.03222331404685974, -0.029694844037294388, -0.009439005516469479, -0.07153058797121048, 0.06355611234903336, -0.027993692085146904, -0.10475512593984604, 0.09174476563930511, 0.059833403676748276, -0.12180278450250626, -0.045590825378894806, -0.049600910395383835, 0.1757708340883255, 0.15248797833919525, -0.0411035381257534, 0.10988102853298187, 0.18829376995563507, -0.02226417139172554, -0.2266082763671875, -0.07096148282289505, 0.023325670510530472, 0.06309139728546143, 0.025022277608513832, -0.14289911091327667, 0.023479988798499107, -0.060354720801115036, -0.007053745910525322, -0.019651783630251884, -0.16965609788894653, -0.11996728926897049, 0.18476130068302155, -0.13351374864578247, 0.21550065279006958, -0.02717171050608158, -0.0773811861872673, -0.06884358823299408, 0.0342404842376709, 0.09216608107089996, -0.2528561055660248, 0.10263249278068542, 0.10748521983623505, 0.1147259920835495, 0.048086512833833694, 0.03097567707300186, 0.12092257291078568, 0.04618542268872261, -0.017643075436353683, -0.01019542571157217, -0.05931815132498741, -0.01215758454054594, 0.07314936071634293, 0.0184968002140522, -0.018628353253006935, 0.00029494913178496063, -0.012851601466536522, -0.04937049001455307, -0.07097357511520386, 0.014068507589399815, 0.06730227172374725, 
0.0011058456730097532, -0.058304984122514725, -0.09276586025953293, -0.007250615861266851, -0.008179853670299053, 0.18085157871246338, -0.18690405786037445, 0.10805662721395493, 0.15366457402706146, 0.24889709055423737, -0.13569431006908417, 0.004569801967591047, 0.002179242903366685, -0.10380154103040695, 0.08792099356651306, -0.15620124340057373, 0.03069789707660675, 0.050491273403167725, 0.027823954820632935, 0.15699893236160278, 0.04409429058432579, -0.09603174775838852, 0.1284571886062622, 0.052408650517463684, -0.06783584505319595, -0.15336191654205322, -0.03298838809132576, -0.039690352976322174, -0.03584416210651398, 0.03713645040988922, 0.20043127238750458, 0.023904243484139442, 0.05101172626018524, -0.009075026959180832, -0.009385445155203342, -0.13236455619335175, 0.19928309321403503, 0.09961536526679993, 0.0516255646944046, -0.08319973945617676, -0.002321518026292324, -0.00933137722313404, -0.009090968407690525, -0.01608741283416748, -0.028433630242943764, -0.052450720220804214, -0.06370970606803894, -0.12336268275976181, 0.07403493672609329, 0.0010276006069034338, -0.11713266372680664, -0.04743436351418495, -0.18336541950702667, 0.008961371146142483, 0.19952182471752167, -0.007024876307696104, 0.09715252369642258, -0.0656493753194809, -0.024215083569288254, 0.03689572215080261, -0.01782734878361225, 0.02268769033253193, -0.030872415751218796, -0.1462787538766861, 0.177371084690094, -0.03867977112531662, 0.07680606096982956, -0.06907453387975693, -0.05164583399891853, -0.06625652313232422, 0.06819020956754684, -0.0940970852971077, 0.005167408846318722, -0.1163259893655777, 0.014070499688386917, 0.04162336140871048, -0.054086294025182724, -0.018933290615677834, 0.006581473629921675, -0.08982190489768982, 0.059924863278865814, 0.06493036448955536, 0.08800575882196426, -0.03996333107352257, 0.02621074765920639, 0.013037274591624737, -0.03171229362487793, 0.0952044427394867, 0.0896860733628273, -0.1262214183807373, 0.11038849502801895, -0.16707560420036316, -0.06516110897064209, 0.12556664645671844, 0.07689850777387619, -0.0337715670466423, 0.012692978605628014, -0.09873336553573608, 0.06512603163719177, 0.05299631878733635, 0.007000568322837353, 0.0005295606097206473, 0.039876509457826614, 0.10407836735248566, -0.11035505682229996, -0.06527521461248398, -0.0858357846736908, -0.0025992945302277803, 0.15449222922325134, 0.10168803483247757, 0.11758588999509811, -0.07458474487066269, 0.02224005199968815, -0.017048588022589684, 0.06502250581979752, 0.007232916075736284, -0.06323403120040894, 0.05600860342383385, -0.03586334362626076, 0.062057990580797195, -0.08862490206956863, 0.08794751018285751, -0.13804103434085846, -0.04462791606783867, -0.008213340304791927, -0.0042914473451673985, -0.012727271765470505, 0.03372931480407715, 0.06546879559755325, 0.07406866550445557, -0.031528063118457794, -0.10659267753362656, 0.018852634355425835, 0.024109655991196632, 0.20154651999473572, -0.0510399155318737, -0.0553034283220768, 0.027064615860581398, 0.0681079626083374, 0.06888140738010406, -0.048445869237184525, 0.032235823571681976, 0.09844398498535156, -0.04214984178543091, -0.03782501816749573, -0.08282185345888138, 0.033028628677129745, 0.07742489129304886, -0.007488448638468981, 0.00009583793143974617, -0.030658146366477013, -0.0696650817990303, -0.1904536485671997, -0.04084402695298195, -0.04943319782614708, -0.10853004455566406, -0.007622826844453812, -0.08375590294599533, 0.08938759565353394, 0.10821511596441269, 0.030592644587159157, 0.04069429263472557, 0.06651557981967926, 
-0.09838118404150009, -0.12903863191604614, 0.04133837670087814, -0.06963831186294556, -0.003139232285320759, -0.07307794690132141, -0.027660531923174858, 0.18595796823501587, 0.007649518083781004, -0.01203184388577938, 0.01562785916030407, -0.04187128320336342, -0.05044430494308472, -0.171207994222641, -0.04296523705124855, -0.01606031320989132, 0.046327996999025345, 0.05736478045582771, 0.07719593495130539, 0.13726972043514252, -0.06308342516422272, 0.011845339089632034, 0.1022871807217598, -0.06112060323357582, -0.15728718042373657, -0.1656314581632614, -0.013227245770394802, -0.057504769414663315, 0.10520925372838974, -0.05274439975619316, -0.11024339497089386, -0.07082977890968323, 0.10159613192081451, 0.24656547605991364, -0.09744767099618912, 0.0743294283747673, -0.021138817071914673, -0.009346064180135727, 0.01187799870967865, -0.0403805635869503, 0.03910739719867706, 0.1441119760274887, 0.0492500402033329, -0.03790969029068947, -0.05253634229302406, 0.01929149404168129, 0.02182656340301037, 0.0808454379439354, -0.04314325004816055, -0.0535280667245388, -0.014512389898300171, 0.15163901448249817, -0.1769128143787384, -0.1376737505197525, -0.1497519314289093, -0.06669497489929199, -0.008686579763889313, 0.012803713791072369, -0.009569603949785233, 0.11461835354566574, 0.018136780709028244, -0.08757708966732025, -0.026455704122781754, 0.005436552222818136, -0.011938230134546757, -0.09608869254589081, 0.03608420491218567, 0.020934469997882843, -0.24561873078346252, -0.08690721541643143, -0.012247506529092789, 0.21317633986473083, 0.027208808809518814, 0.07848730683326721, 0.04625057056546211, 0.1561347395181656, 0.010580643080174923, -0.06887990981340408, -0.011942768469452858, 0.12833236157894135, -0.04077478498220444, 0.25972557067871094, 0.05077613890171051, -0.1635798066854477, 0.06680827587842941, 0.08514698594808578, -0.10373607277870178, -0.029870256781578064, -0.010724883526563644, -0.08167438954114914, 0.06712841987609863, -0.02427218109369278, -0.021975157782435417, -0.012938731350004673, 0.063270702958107, -0.042383432388305664, -0.004945764318108559, -0.07441055029630661, -0.09763775765895844, -0.22952868044376373, -0.03981519863009453, -0.06611406803131104, 0.022089160978794098, -0.10318326205015182, -0.020750341936945915, -0.06012861803174019, 0.020268240943551064, 0.039505913853645325, 0.0644700899720192, 0.08035673201084137, -0.030772970989346504, 0.0041343048214912415, 0.010701204650104046, 0.05865222215652466, 0.1514844447374344, -0.0402158759534359, -0.08424938470125198 ]
null
null
fairseq
# tts_transformer-en-ljspeech [Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)): - English - Single-speaker female voice - Trained on [LJSpeech](https://keithito.com/LJ-Speech-Dataset/) ## Usage ```python from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub from fairseq.models.text_to_speech.hub_interface import TTSHubInterface import IPython.display as ipd models, cfg, task = load_model_ensemble_and_task_from_hf_hub( "facebook/tts_transformer-en-ljspeech", arg_overrides={"vocoder": "hifigan", "fp16": False} ) model = models[0] TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg) generator = task.build_generator([model], cfg) text = "Hello, this is a test run." sample = TTSHubInterface.get_model_input(task, text) wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample) ipd.Audio(wav, rate=rate) ``` See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/ljspeech_example.md). ## Citation ```bibtex @inproceedings{wang-etal-2021-fairseq, title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit", author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan", booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations", month = nov, year = "2021", address = "Online and Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.emnlp-demo.17", doi = "10.18653/v1/2021.emnlp-demo.17", pages = "143--152", } ```
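A minimal follow-up sketch for writing the synthesized waveform to disk instead of playing it inline. It assumes the optional `soundfile` package (not a dependency stated by this card) and that `wav` comes back as a 1-D CPU float tensor, as returned by `TTSHubInterface.get_prediction` above; the output filename is illustrative.

```python
import soundfile as sf  # assumption: extra dependency, install with `pip install soundfile`

from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-en-ljspeech",
    arg_overrides={"vocoder": "hifigan", "fp16": False},
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
generator = task.build_generator([model], cfg)

sample = TTSHubInterface.get_model_input(task, "Hello, this is a test run.")
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)

# wav is assumed to be a 1-D float tensor on CPU; convert to numpy for soundfile.
sf.write("hello_test_run.wav", wav.numpy(), rate)
```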
{"language": "en", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["ljspeech"], "task": "text-to-speech", "widget": [{"text": "Hello, this is a test run.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-en-ljspeech
[ "fairseq", "audio", "text-to-speech", "en", "dataset:ljspeech", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "en" ]
TAGS #fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-en-ljspeech Transformer text-to-speech model from fairseq S^2 (paper/code): - English - Single-speaker female voice - Trained on LJSpeech ## Usage See also fairseq S^2 example.
[ "# tts_transformer-en-ljspeech\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-en-ljspeech\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 50, 51, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #en #dataset-ljspeech #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-en-ljspeech\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- English\n- Single-speaker female voice\n- Trained on LJSpeech## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.08445403724908829, 0.08690456300973892, -0.00546959089115262, -0.015578222461044788, 0.07495453953742981, -0.07677122205495834, 0.11787617951631546, 0.10978339612483978, -0.04557602480053902, -0.03859526291489601, 0.011341670528054237, 0.1580408811569214, 0.0029308488592505455, 0.036387231200933456, -0.10285263508558273, -0.21661004424095154, 0.04040715843439102, 0.02043858915567398, -0.11422105878591537, 0.056412216275930405, 0.13783608376979828, -0.0051231589168310165, 0.01286924909800291, 0.0248272605240345, -0.05615575611591339, 0.064110167324543, 0.03676578029990196, -0.10120804607868195, 0.05644235759973526, 0.10272125154733658, 0.017845597118139267, 0.06984410434961319, 0.05110824853181839, -0.16082394123077393, 0.03140256181359291, -0.03494352847337723, 0.05373906344175339, 0.032590627670288086, -0.10414344817399979, -0.016693612560629845, 0.03875517100095749, -0.04417615756392479, -0.019718710333108902, 0.03346601128578186, -0.07953676581382751, -0.196879044175148, 0.037608515471220016, -0.06818577647209167, -0.007523276377469301, 0.0183535385876894, -0.046074651181697845, 0.059798240661621094, -0.10514387488365173, 0.09491018950939178, 0.19489628076553345, -0.2223479151725769, 0.007871191948652267, -0.08728151023387909, 0.05745435133576393, 0.06310134381055832, -0.04061747342348099, 0.06711192429065704, 0.036012228578329086, -0.035432446748018265, -0.1397632360458374, -0.1140318214893341, -0.18214833736419678, 0.06725101172924042, -0.12999702990055084, 0.06801591068506241, 0.40248119831085205, 0.010091418400406837, -0.06810861825942993, -0.06877677142620087, -0.04118383303284645, 0.18353860080242157, -0.009559051133692265, -0.0922790914773941, 0.005187715403735638, 0.0390172116458416, -0.04022054746747017, -0.09791623800992966, -0.08921067416667938, -0.048077840358018875, -0.08407210558652878, 0.2347518801689148, -0.00429212860763073, 0.010863274335861206, -0.10725183039903641, -0.026735926046967506, -0.057201262563467026, -0.0396483838558197, 0.03434651345014572, -0.0653005987405777, -0.04439395293593407, 0.04207402840256691, 0.014251566492021084, -0.2652033567428589, 0.09569951146841049, -0.11005204170942307, -0.07825382053852081, 0.01740489900112152, -0.025663992390036583, 0.07468437403440475, 0.08816153556108475, -0.04686838388442993, 0.013991027139127254, -0.0223739892244339, -0.006840967107564211, -0.013717741705477238, 0.014256526716053486, -0.0576547309756279, -0.1014072597026825, -0.08623212575912476, -0.15337559580802917, 0.07130380719900131, 0.022610792890191078, 0.0225547906011343, -0.056574542075395584, -0.010381567291915417, 0.12729020416736603, -0.048871152102947235, 0.013224613852798939, 0.014298872090876102, -0.002355956705287099, 0.07779108732938766, 0.074881911277771, 0.08580299466848373, -0.02016228623688221, -0.18857817351818085, -0.03171183913946152, 0.008188445121049881, 0.019704462960362434, -0.11666036397218704, 0.03692243993282318, 0.07364074140787125, 0.029941653832793236, -0.16872592270374298, -0.13186214864253998, -0.04445740580558777, -0.0740913525223732, 0.03042253665626049, -0.0834578424692154, -0.155262753367424, -0.10542922466993332, 0.0817156508564949, -0.05687815696001053, -0.10641365498304367, -0.058509912341833115, 0.07758583873510361, -0.09887408465147018, 0.07716629654169083, -0.1054910197854042, 0.0650302916765213, -0.021582122892141342, -0.03474976122379303, -0.11377432197332382, 0.1637936234474182, -0.007592263165861368, -0.05362427979707718, -0.0020622669253498316, -0.03397807851433754, -0.07980611175298691, 0.11495441943407059, 
-0.0035098365042358637, 0.17258232831954956, -0.24005991220474243, -0.07137774676084518, 0.03981195017695427, -0.08251078426837921, -0.012931961566209793, 0.11302485316991806, 0.036578234285116196, 0.14506205916404724, 0.10656648874282837, 0.36267438530921936, 0.18831241130828857, -0.11781595647335052, 0.006601688917726278, 0.13624052703380585, -0.050990477204322815, 0.009042864665389061, 0.048396650701761246, -0.04953494668006897, 0.07807359099388123, -0.03549285978078842, 0.1013934388756752, 0.04776153340935707, -0.05242933705449104, -0.02916998229920864, 0.01973552443087101, -0.07680998742580414, 0.13895395398139954, 0.0017066417494788766, -0.019338922575116158, -0.00529120909050107, -0.05194595828652382, 0.14423730969429016, 0.104737788438797, -0.10704857856035233, 0.10027016699314117, -0.1580696552991867, 0.010443934239447117, 0.021253515034914017, 0.02421637624502182, -0.1602701097726822, 0.06542395055294037, -0.0886206328868866, 0.011724664829671383, 0.16539040207862854, 0.23636919260025024, 0.02257642149925232, -0.016021201387047768, -0.02530718222260475, 0.054496880620718, 0.11707600951194763, 0.06170724704861641, -0.04447410628199577, -0.20942294597625732, 0.11108110100030899, -0.07726416736841202, 0.022742828354239464, -0.13564704358577728, 0.002773985033854842, 0.12772157788276672, 0.00022937871108297259, 0.06647790968418121, 0.008808178827166557, 0.09060433506965637, 0.061217837035655975, 0.03141064569354057, 0.03183091804385185, 0.012636987492442131, 0.013671210035681725, -0.07967589050531387, 0.22484932839870453, -0.1228250116109848, 0.08724749088287354, 0.035614266991615295, -0.10008459538221359, -0.03708335757255554, 0.1319640427827835, 0.010240158066153526, 0.018705852329730988, -0.00616080267354846, -0.08318418264389038, 0.17809128761291504, -0.09385708719491959, 0.07563792914152145, -0.04786836728453636, 0.004443054087460041, -0.022508330643177032, -0.13356608152389526, 0.00865574087947607, 0.09176670014858246, -0.14289121329784393, -0.1311223953962326, 0.007110231090337038, 0.10718629509210587, -0.07556208968162537, 0.3094157874584198, -0.017567308619618416, 0.0064932238310575485, -0.00907923188060522, -0.027446789667010307, -0.06718793511390686, 0.039167653769254684, -0.17205199599266052, -0.06480920314788818, 0.03734254464507103, 0.060638684779405594, 0.0528164878487587, -0.0785069689154625, -0.037104323506355286, -0.006007998716086149, -0.09225237369537354, -0.24176253378391266, 0.11911267042160034, 0.009550371207296848, 0.08300453424453735, -0.060866281390190125, -0.05562262982130051, 0.0259816013276577, -0.05787906050682068, -0.12240749597549438, 0.07645028829574585, -0.16082338988780975, -0.1107022762298584, -0.11590377241373062, 0.0485420785844326, 0.07995745539665222, 0.05676335468888283, 0.14931590855121613, -0.0738050639629364, -0.04300837218761444, -0.025597339496016502, 0.10241693258285522, 0.021128220483660698, 0.012235293164849281, -0.03292043134570122, -0.022818731144070625, 0.039561677724123, -0.13405980169773102, 0.02569170482456684, 0.015305903740227222, 0.06355703622102737, -0.011404765769839287, -0.06518393754959106, 0.07852063328027725, 0.20525728166103363, 0.07232742011547089, -0.03662830591201782, -0.020083218812942505, 0.17342181503772736, -0.1355205774307251, -0.0013649220345541835, 0.13874699175357819, -0.05214421823620796, 0.012918075546622276, 0.12050678580999374, 0.030162522569298744, 0.0003998545871581882, 0.016926469281315804, -0.02397751808166504, -0.07200063765048981, -0.19514833390712738, -0.06306858360767365, -0.06536881625652313, 
0.06610967963933945, -0.21702948212623596, 0.00401264475658536, -0.10314375907182693, -0.024543754756450653, 0.04187870770692825, -0.0342094860970974, 0.14502866566181183, -0.030634047463536263, 0.2074134200811386, -0.0817018523812294, 0.10203402489423752, -0.09962734580039978, -0.10307049751281738, 0.09267905354499817, -0.05813918635249138, 0.08015748858451843, 0.014650767669081688, 0.10488411784172058, 0.10049185156822205, -0.010412839241325855, 0.10508354008197784, 0.014670904725790024, 0.07161636650562286, 0.017391329631209373, -0.0382535345852375, -0.061504196375608444, 0.01235059555619955, 0.03877737373113632, 0.1960332840681076, -0.07820111513137817, -0.019841331988573074, 0.027942705899477005, 0.04095616936683655, 0.0016371281817555428, 0.15464381873607635, -0.03294622525572777, 0.03897297382354736, 0.11968931555747986, -0.07974900305271149, -0.01746128872036934, 0.12148287147283554, 0.22761034965515137, -0.047083478420972824, 0.06281334161758423, 0.15623056888580322, 0.015591331757605076, -0.09604865312576294, 0.036594223231077194, -0.13477061688899994, -0.04609789326786995, -0.03241686895489693, 0.06592611223459244, -0.13919858634471893, 0.114546038210392, 0.0377328097820282, 0.007317852694541216, 0.009400867857038975, -0.01023801974952221, 0.03340989351272583, 0.08817870169878006, 0.10213445872068405, 0.0075928280130028725, -0.03860793635249138, -0.09705404937267303, 0.032916486263275146, 0.0025119041092693806, 0.10555638372898102, 0.13916856050491333, 0.00189033686183393, -0.008847557008266449, -0.013732803985476494, 0.02766895294189453, -0.0011462249094620347, -0.09896531701087952, -0.04068933054804802, 0.008468306623399258, 0.1751973032951355, 0.09330974519252777, -0.0037734508514404297, -0.04091372340917587, -0.1289609670639038, -0.003961312118917704, -0.04057752713561058, -0.04629625752568245, -0.028314610943198204, -0.13582617044448853, 0.10245001316070557, -0.05126363784074783, -0.030935941264033318, 0.08992777019739151, 0.036438390612602234, 0.015345278196036816, -0.025967063382267952, 0.05373813956975937, -0.033863313496112823, -0.04942431300878525, -0.003998264204710722, 0.10231917351484299, 0.06415081769227982, 0.07632878422737122, 0.04732196405529976, -0.015184318646788597, -0.06798534095287323, -0.06951876729726791, 0.031466491520404816, -0.0295748021453619, -0.1193394809961319, 0.12438957393169403, 0.01846027933061123, -0.1007988303899765, -0.025839360430836678, -0.02988874539732933, 0.15354174375534058, 0.18467234075069427, -0.033267468214035034, 0.10196636617183685, 0.176665261387825, -0.0575965940952301, -0.2676408588886261, -0.05826001986861229, 0.017047109082341194, 0.06936033070087433, 0.01618325337767601, -0.12763583660125732, 0.04400251805782318, -0.05222187936306, -0.012724515981972218, -0.03220149502158165, -0.18769942224025726, -0.11728113144636154, 0.21485644578933716, -0.13225984573364258, 0.19212421774864197, -0.07095428556203842, -0.06392892450094223, -0.04111931100487709, 0.008077342063188553, 0.14889949560165405, -0.3466099202632904, 0.08662055432796478, 0.12197768688201904, 0.090211421251297, 0.06625239551067352, 0.05766546353697777, 0.10649669915437698, 0.06625603884458542, 0.024002503603696823, -0.03748980909585953, -0.03578987717628479, -0.02831338904798031, 0.03486676886677742, 0.07421483844518661, 0.002946489490568638, 0.018992751836776733, -0.11657540500164032, -0.008058865554630756, -0.06795281916856766, -0.007653311360627413, 0.07541447877883911, -0.022850273177027702, -0.08826608955860138, -0.10887525230646133, -0.013663480058312416, 
0.010374863632023335, 0.28261294960975647, -0.1376209408044815, 0.07617418467998505, 0.10923365503549576, 0.24626201391220093, -0.10730352252721786, 0.07099081575870514, 0.007175017148256302, -0.09812132269144058, 0.06294538825750351, -0.2342863380908966, 0.04851333796977997, 0.03095105104148388, 0.030253272503614426, 0.15496602654457092, 0.025604644790291786, -0.045029960572719574, 0.10484730452299118, 0.09543389827013016, -0.09481123089790344, -0.06623909622430801, -0.06392137706279755, -0.08957275748252869, 0.009705645963549614, 0.0032289486844092607, 0.23112738132476807, 0.02622334100306034, 0.047470979392528534, -0.0008368773269467056, 0.008075342513620853, -0.14247199892997742, 0.17151068150997162, 0.13353359699249268, 0.04432009160518646, -0.07328217476606369, 0.021480021998286247, -0.019344212487339973, 0.011087811551988125, 0.012884493917226791, 0.017418449744582176, -0.043450675904750824, -0.08212659507989883, -0.14821170270442963, 0.10215458273887634, -0.028816092759370804, -0.10102170705795288, -0.09801805764436722, -0.15385903418064117, -0.021891718730330467, 0.16242121160030365, 0.007672989275306463, 0.07942180335521698, -0.045052628964185715, 0.014570100232958794, 0.03259517252445221, 0.04730980843305588, 0.06780140101909637, -0.05610766261816025, -0.1617170125246048, 0.2044616937637329, -0.07865654677152634, 0.13264358043670654, -0.06511694937944412, -0.012052470818161964, -0.0800304263830185, 0.09355714917182922, -0.031119640916585922, -0.012836041860282421, -0.07855088263750076, 0.0006035935948602855, 0.03233632817864418, -0.07405722141265869, -0.018100889399647713, -0.0030753447208553553, -0.07824473828077316, 0.04601350054144859, 0.07318031042814255, 0.08428076654672623, -0.0885884240269661, -0.016017884016036987, 0.03989667817950249, -0.02356443926692009, 0.05959931015968323, 0.08044148981571198, -0.10672309249639511, 0.11349214613437653, -0.09645729511976242, -0.02753930538892746, 0.17422617971897125, 0.054298121482133865, -0.008411488495767117, -0.0009584682411514223, -0.08244836330413818, 0.06263566762208939, 0.08626625686883926, 0.01417750958353281, -0.02768309786915779, 0.0365026630461216, 0.1444501429796219, -0.11195343732833862, -0.06763898581266403, -0.07481731474399567, 0.006971701979637146, 0.10614833235740662, 0.1250791996717453, 0.13166148960590363, -0.05392146855592728, -0.014704369008541107, 0.010902847163379192, 0.04814593493938446, 0.031290650367736816, -0.06997629255056381, 0.08504439145326614, -0.07245134562253952, 0.045814018696546555, -0.10789486020803452, 0.03774217516183853, -0.014841631054878235, -0.08711253851652145, -0.008993037976324558, -0.008122257888317108, -0.004956392105668783, 0.020678823813796043, 0.00267366343177855, 0.054740775376558304, -0.03361406922340393, -0.12939181923866272, 0.041786473244428635, 0.029864097014069557, 0.12025661021471024, -0.06117980554699898, -0.03759227320551872, 0.056164324283599854, 0.08457046747207642, 0.0400003157556057, -0.02368866093456745, -0.024879122152924538, 0.07009152323007584, -0.00801189336925745, -0.038367584347724915, -0.10212572664022446, 0.03561058267951012, 0.0930616706609726, -0.021211573854088783, 0.03722131997346878, -0.000791932747233659, -0.07506028562784195, -0.24632397294044495, 0.0003992481797467917, -0.03727361559867859, -0.08754222840070724, -0.027861710637807846, -0.09808386117219925, 0.1007165014743805, 0.06376742571592331, 0.004866816569119692, 0.04110155999660492, 0.053971972316503525, -0.12478207796812057, -0.11063330620527267, 0.028299834579229355, -0.08039752393960953, 
0.015889674425125122, -0.0748041570186615, 0.02827308140695095, 0.13878224790096283, 0.00904915388673544, -0.021615441888570786, 0.03623872250318527, -0.07147516310214996, -0.03587781637907028, -0.16935913264751434, -0.04189348220825195, -0.045765891671180725, 0.08484289795160294, 0.05749797448515892, 0.012737086974084377, 0.13396619260311127, -0.084309883415699, 0.0246133916079998, 0.1966737061738968, -0.07218712568283081, -0.19232088327407837, -0.13264180719852448, -0.08519494533538818, -0.017091166228055954, 0.11489000171422958, -0.06236375495791435, -0.1101779043674469, -0.04708666354417801, 0.06836999207735062, 0.23710060119628906, -0.11781370639801025, 0.0880405530333519, 0.000820130982901901, 0.010055119171738625, -0.003596600843593478, -0.05998227745294571, 0.10724541544914246, 0.2632540166378021, 0.035584233701229095, -0.05275585502386093, -0.0624689944088459, 0.023002851754426956, -0.017300032079219818, 0.09669364243745804, -0.07395616918802261, -0.04896470159292221, -0.002265897113829851, 0.12575195729732513, -0.20199652016162872, -0.09487881511449814, -0.20002298057079315, -0.0927315354347229, 0.002243177965283394, -0.01345547754317522, -0.010121757164597511, 0.10788831859827042, 0.03567894548177719, -0.08343194425106049, -0.05203813686966896, 0.05492543801665306, -0.002906668931245804, -0.1089346632361412, 0.0786370187997818, 0.011059258133172989, -0.20579271018505096, -0.10975311696529388, -0.013899819925427437, 0.18357157707214355, -0.00032641657162457705, 0.02300647459924221, 0.019245533272624016, 0.127940371632576, 0.00009463715105084702, -0.052732378244400024, -0.008166673593223095, 0.1251627802848816, -0.05777599662542343, 0.23411983251571655, 0.09500925987958908, -0.07776443660259247, 0.04394856095314026, 0.06847398728132248, -0.12093281000852585, -0.03518722951412201, -0.03395894169807434, -0.10436048358678818, 0.039867617189884186, -0.008362369611859322, 0.01172381266951561, 0.038714420050382614, 0.03135969489812851, -0.031907446682453156, -0.014678314328193665, -0.10470850020647049, -0.06199345365166664, -0.12133045494556427, -0.04685116186738014, -0.005866356194019318, 0.05708911642432213, -0.11978645622730255, -0.01249243039637804, -0.07825435698032379, 0.011400946415960789, 0.022852065041661263, 0.07016252726316452, 0.12787964940071106, -0.03319729492068291, 0.017163414508104324, -0.012256964109838009, 0.06901351362466812, 0.1401362419128418, -0.05043967440724373, -0.09679298847913742 ]
null
null
fairseq
# tts_transformer-es-css10 [Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)): - Spanish - Single-speaker male voice - Trained on [CSS10](https://github.com/Kyubyong/css10) ## Usage Dependencies ```sh pip install fairseq sentencepiece ``` ```python from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub from fairseq.models.text_to_speech.hub_interface import TTSHubInterface import IPython.display as ipd models, cfg, task = load_model_ensemble_and_task_from_hf_hub( "facebook/tts_transformer-es-css10", arg_overrides={"vocoder": "hifigan", "fp16": False} ) model = models[0] TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg) generator = task.build_generator([model], cfg) text = "Hola, esta es una prueba." sample = TTSHubInterface.get_model_input(task, text) wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample) ipd.Audio(wav, rate=rate) ``` See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md). ## Citation ```bibtex @inproceedings{wang-etal-2021-fairseq, title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit", author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan", booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations", month = nov, year = "2021", address = "Online and Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.emnlp-demo.17", doi = "10.18653/v1/2021.emnlp-demo.17", pages = "143--152", } ```
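When synthesizing several sentences, the model, config, and generator only need to be built once; each `get_model_input`/`get_prediction` call then handles one sentence. A minimal sketch, assuming the optional `soundfile` package for writing WAV files; the sentence list and filenames are illustrative.

```python
import soundfile as sf  # assumption: extra dependency, install with `pip install soundfile`

from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-es-css10",
    arg_overrides={"vocoder": "hifigan", "fp16": False},
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
generator = task.build_generator([model], cfg)

sentences = [
    "Hola, esta es una prueba.",
    "El modelo sintetiza una voz masculina.",  # illustrative second sentence
]
for i, text in enumerate(sentences):
    # Each sentence is preprocessed and synthesized independently.
    sample = TTSHubInterface.get_model_input(task, text)
    wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
    sf.write(f"es_{i}.wav", wav.numpy(), rate)
```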
{"language": "es", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["css10"], "task": "text-to-speech", "widget": [{"text": "Hola, esta es una prueba.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-es-css10
[ "fairseq", "audio", "text-to-speech", "es", "dataset:css10", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "es" ]
TAGS #fairseq #audio #text-to-speech #es #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-es-css10 Transformer text-to-speech model from fairseq S^2 (paper/code): - Spanish - Single-speaker male voice - Trained on CSS10 ## Usage Dependencies See also fairseq S^2 example.
[ "# tts_transformer-es-css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Spanish\n- Single-speaker male voice\n- Trained on CSS10", "## Usage\nDependencies\n\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #es #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-es-css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Spanish\n- Single-speaker male voice\n- Trained on CSS10", "## Usage\nDependencies\n\n\n\n\nSee also fairseq S^2 example." ]
[ 50, 49, 16 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #es #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-es-css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Spanish\n- Single-speaker male voice\n- Trained on CSS10## Usage\nDependencies\n\n\n\n\nSee also fairseq S^2 example." ]
[ -0.12422732263803482, 0.02511546202003956, -0.0015994420973584056, -0.024375667795538902, 0.09009519964456558, -0.06315167248249054, 0.18579141795635223, 0.05342838913202286, -0.01408522017300129, 0.006407159846276045, 0.021107438951730728, 0.11575820297002792, 0.059199508279561996, 0.03401465341448784, -0.083554707467556, -0.15458956360816956, 0.07235978543758392, -0.0021754023618996143, -0.10538814961910248, 0.05047357827425003, 0.11930421739816666, -0.02248211018741131, 0.029168838635087013, 0.014683250337839127, -0.07904811948537827, 0.049838580191135406, 0.0916735976934433, -0.10815861076116562, 0.06048280745744705, 0.08135131001472473, 0.043332722038030624, 0.14212019741535187, 0.069327712059021, -0.08254014700651169, 0.03226986154913902, -0.03852070868015289, 0.04066108539700508, 0.03384967893362045, -0.04982255399227142, 0.006211807020008564, -0.01169493980705738, 0.03519264608621597, 0.011038389056921005, 0.041820839047431946, -0.12999874353408813, -0.13049200177192688, 0.0060194400139153, -0.05712125450372696, 0.04088230058550835, 0.06424061954021454, -0.055626384913921356, 0.04634680598974228, -0.09840062260627747, 0.03456249460577965, 0.13634130358695984, -0.20832650363445282, 0.02691713348031044, -0.08010601997375488, 0.0705910176038742, 0.09962396323680878, -0.038274358958005905, 0.007513168267905712, 0.05451255291700363, -0.046248335391283035, -0.10840603709220886, -0.10597822070121765, -0.19708596169948578, 0.05308046564459801, -0.0930711030960083, 0.034473907202482224, 0.4122900664806366, 0.04317542538046837, -0.038738053292036057, -0.08478977531194687, -0.0792977586388588, 0.08328946679830551, 0.0361308827996254, -0.08783271163702011, -0.015655096620321274, 0.05522980913519859, -0.0451202392578125, -0.06588960438966751, -0.13284964859485626, -0.04659790173172951, -0.11071469634771347, 0.18244677782058716, -0.04688884690403938, -0.023099826648831367, -0.06621875613927841, -0.0336703285574913, -0.022043783217668533, -0.06637641787528992, 0.06541380286216736, -0.03946816548705101, -0.07853490859270096, 0.011611024849116802, -0.015174571424722672, -0.26453930139541626, 0.08458026498556137, -0.048457544296979904, -0.09901359677314758, 0.00771641731262207, -0.013028672896325588, 0.058086130768060684, 0.04321996867656708, 0.028352539986371994, -0.01347544975578785, -0.04524979740381241, -0.02658340148627758, 0.00582121079787612, 0.01908191107213497, -0.10817626863718033, -0.18647746741771698, -0.06540004909038544, -0.12282603234052658, 0.03687490150332451, 0.014350807294249535, 0.031334538012742996, -0.04064546152949333, -0.012203904800117016, 0.2289065271615982, -0.027311226353049278, -0.03297416493296623, 0.03937957435846329, 0.03383037820458412, 0.07333461195230484, 0.0020238456781953573, 0.07375609874725342, 0.0051476736553013325, -0.16603882610797882, -0.03638080134987831, -0.012192219495773315, 0.03425375744700432, -0.12235216796398163, 0.022264668717980385, -0.03495807945728302, 0.01786215417087078, -0.13781149685382843, -0.05690250173211098, -0.08523944765329361, -0.12277539819478989, 0.08208204805850983, -0.06068848818540573, -0.1419462114572525, -0.07540862262248993, 0.06421339511871338, -0.07026631385087967, -0.12400811165571213, -0.06845565140247345, 0.04148823022842407, -0.0674239844083786, 0.08895877003669739, -0.1385628581047058, 0.04773101583123207, -0.043907783925533295, -0.039145443588495255, -0.11361008137464523, 0.1602533906698227, 0.017430318519473076, -0.05033821240067482, 0.007227051071822643, 0.022906865924596786, -0.11520376056432724, 
0.10248192399740219, -0.04916787147521973, 0.1911502182483673, -0.2504415214061737, -0.09547529369592667, 0.125348761677742, -0.08469214290380478, -0.06948409229516983, 0.10440483689308167, 0.0335768386721611, 0.08643552660942078, 0.11599346250295639, 0.3113155663013458, 0.06122438982129097, -0.11336049437522888, -0.0417872779071331, 0.15663237869739532, -0.017292696982622147, -0.0020002219825983047, 0.1044163852930069, -0.10457726567983627, 0.14124368131160736, -0.0376339890062809, 0.0906834602355957, 0.005680824164301157, -0.061019167304039, -0.009389401413500309, 0.06360799074172974, -0.04682285338640213, 0.11906691640615463, -0.027008941397070885, 0.014869360253214836, 0.006258201785385609, -0.04176006466150284, 0.08322463184595108, 0.06703097373247147, -0.10648071020841599, 0.08840658515691757, -0.1838817000389099, 0.04289967939257622, -0.028657305985689163, 0.0009968857048079371, -0.20070238411426544, 0.03299235552549362, -0.06547261774539948, 0.11697875708341599, 0.15045566856861115, 0.18158310651779175, 0.02308320812880993, -0.03277164325118065, -0.04140162467956543, 0.027451327070593834, 0.12528903782367706, 0.08832214772701263, -0.04453517124056816, -0.2071872055530548, 0.1027260273694992, -0.10000301152467728, 0.12194600701332092, -0.1259853094816208, 0.025003699585795403, 0.14878959953784943, 0.014487044885754585, 0.0551690012216568, 0.012050728313624859, 0.0332307443022728, 0.05836357921361923, -0.016175182536244392, -0.00841380562633276, 0.0052885920740664005, 0.04858415201306343, -0.12680178880691528, 0.18744376301765442, -0.2061028778553009, 0.07074636965990067, 0.08560188859701157, -0.10408041626214981, -0.08939795196056366, 0.09091134369373322, 0.047543201595544815, 0.030677851289510727, -0.00010399172606412321, -0.13808001577854156, 0.17998023331165314, -0.07471676915884018, 0.07777108252048492, -0.06943902373313904, 0.06270315498113632, 0.007327704690396786, -0.10654150694608688, 0.051976390182971954, 0.10616682469844818, -0.1397155076265335, -0.1315072625875473, 0.09967271238565445, 0.1331636756658554, -0.011211128905415535, 0.21505829691886902, -0.06527340412139893, -0.03670160844922066, -0.008365101180970669, -0.0017638703575357795, -0.02967366762459278, 0.07231529802083969, -0.1886925995349884, -0.04649387672543526, 0.03029981628060341, 0.04954952746629715, 0.06664865463972092, -0.10305284708738327, -0.03267904371023178, 0.013704363256692886, -0.06876082718372345, -0.24891719222068787, 0.08919736742973328, -0.01396119687706232, 0.09382390975952148, -0.03621234744787216, -0.055351968854665756, 0.053408753126859665, -0.0403716154396534, -0.15641839802265167, 0.11563528329133987, -0.16420574486255646, -0.09264477342367172, -0.1130523756146431, 0.05693871155381203, 0.043728385120630264, 0.13501566648483276, 0.1389719843864441, -0.10370700061321259, 0.03117462620139122, -0.05410659685730934, 0.05719732120633125, 0.053117480129003525, 0.01829209178686142, -0.020112503319978714, -0.04800806939601898, -0.014399022795259953, -0.06510527431964874, 0.012999755330383778, 0.0005680882604792714, 0.02789255417883396, 0.008744621649384499, -0.1165158823132515, 0.08217131346464157, 0.2879910171031952, 0.10278353840112686, -0.03538094460964203, -0.043476175516843796, 0.14484542608261108, -0.11870069056749344, -0.04569167271256447, 0.1920454204082489, -0.029816515743732452, -0.011215433478355408, 0.13702622056007385, 0.004780925344675779, 0.01003651786595583, -0.008254601620137691, -0.06854730099439621, -0.09739459306001663, -0.1888606995344162, -0.029643086716532707, 
-0.07958102226257324, 0.019756745547056198, -0.19927646219730377, -0.024420225992798805, 0.019211947917938232, -0.0026351213455200195, 0.0024233791045844555, -0.06293673813343048, 0.14070090651512146, -0.026936527341604233, 0.1709374040365219, -0.08388037234544754, 0.09021811187267303, -0.10123462229967117, -0.08799389004707336, 0.12085419148206711, -0.04407156631350517, 0.037300143390893936, 0.10617713630199432, 0.17684297263622284, 0.022186147049069405, -0.0341443233191967, 0.07030951976776123, 0.03030187077820301, 0.11091993749141693, -0.025217588990926743, -0.04708995297551155, -0.06332386285066605, -0.02218964509665966, 0.0373779758810997, 0.22244015336036682, -0.0953385978937149, -0.0041623846627771854, -0.009052479639649391, 0.04669882729649544, 0.01107008382678032, 0.17268522083759308, -0.15486444532871246, -0.009207009337842464, 0.07311180979013443, -0.014810552820563316, 0.010326381772756577, 0.05840909108519554, 0.2289765477180481, -0.023241808637976646, 0.06998848915100098, 0.15220658481121063, 0.017935872077941895, -0.01387602649629116, 0.05429145321249962, -0.1450066715478897, -0.06955599784851074, -0.03039051964879036, 0.016990922391414642, -0.08177617937326431, 0.1235632449388504, 0.01884496584534645, 0.0006802486605010927, 0.007034416310489178, 0.0040025534108281136, 0.01141627598553896, 0.16369853913784027, 0.13256047666072845, -0.008173109032213688, -0.07315541058778763, -0.14009246230125427, -0.025444941595196724, -0.011225511319935322, 0.11398308724164963, 0.016859007999300957, -0.01130736991763115, 0.03146538510918617, 0.00537622207775712, 0.07973507046699524, -0.0022086186800152063, -0.06358850002288818, -0.08536531776189804, 0.03776087239384651, 0.1331719309091568, 0.09173461049795151, 0.011780966073274612, -0.031573336571455, -0.1932401806116104, -0.028185002505779266, -0.10579032450914383, -0.04011940583586693, -0.07809209823608398, -0.14534161984920502, 0.06755974888801575, -0.05883494392037392, -0.07239501923322678, 0.06302466988563538, 0.042382754385471344, 0.012542794458568096, -0.007090711500495672, 0.12965133786201477, -0.03640146180987358, -0.06642194092273712, -0.05038751661777496, 0.12936405837535858, 0.030069051310420036, 0.0787167102098465, 0.04073600098490715, -0.040921468287706375, -0.02785584144294262, -0.07616016268730164, 0.02322438918054104, -0.06166551262140274, -0.05141599103808403, 0.11657333374023438, 0.04820602759718895, -0.14673082530498505, -0.01721864379942417, -0.022281205281615257, 0.15298138558864594, 0.17262381315231323, -0.042035337537527084, 0.07080285251140594, 0.17129746079444885, -0.024514757096767426, -0.24365870654582977, -0.03449227660894394, 0.006116656586527824, 0.05129389092326164, 0.03732568398118019, -0.10412226617336273, 0.029576295986771584, -0.06275694817304611, -0.018130598589777946, -0.07902830094099045, -0.16731777787208557, -0.11151336878538132, 0.17217065393924713, -0.14926733076572418, 0.17024537920951843, -0.0816507413983345, -0.08495306968688965, -0.09284727275371552, 0.013666823506355286, 0.16643907129764557, -0.32173946499824524, 0.07017914205789566, 0.14659613370895386, 0.026129579171538353, 0.037733495235443115, 0.0534851960837841, 0.12643134593963623, 0.07368076592683792, -0.002009439980611205, 0.023115552961826324, -0.07139650732278824, 0.09280187636613846, 0.05204255133867264, -0.041378892958164215, 0.061850205063819885, 0.011348458006978035, -0.08079349994659424, -0.03393919765949249, -0.026440316811203957, -0.0034779382403939962, 0.07560581713914871, -0.012950712814927101, -0.06093575432896614, 
-0.12092971056699753, 0.006503805983811617, -0.04179585725069046, 0.24155142903327942, -0.18754886090755463, 0.10528585314750671, 0.11869924515485764, 0.2641095519065857, -0.11661037802696228, 0.04336006939411163, 0.002909866627305746, -0.0894727036356926, 0.062075186520814896, -0.1719927042722702, 0.044466082006692886, 0.02985658496618271, 0.014931018464267254, 0.14628814160823822, 0.060444820672273636, -0.014986728318035603, 0.14148801565170288, 0.09020237624645233, -0.03421285003423691, -0.06372017413377762, -0.04518794268369675, -0.13856635987758636, -0.028838863596320152, 0.03549536317586899, 0.160542294383049, 0.043535538017749786, 0.05136257782578468, -0.009976177476346493, -0.03061259537935257, -0.10288158059120178, 0.1571081578731537, 0.09092769771814346, 0.01759154163300991, -0.08780428022146225, 0.026518067345023155, -0.018730489537119865, -0.07423850148916245, -0.0457877591252327, -0.03962879255414009, -0.08420480042695999, -0.04540037363767624, -0.1414191871881485, 0.0783071219921112, -0.09344930946826935, -0.13358505070209503, -0.08514799922704697, -0.17976967990398407, -0.025254426524043083, 0.20147447288036346, 0.011815267615020275, 0.07381196320056915, -0.02352144382894039, 0.061936214566230774, 0.05218050256371498, -0.025945967063307762, 0.020679747685790062, -0.02290760539472103, -0.11229149252176285, 0.16943053901195526, -0.07119476795196533, 0.09509263932704926, -0.07913228869438171, -0.007471535820513964, -0.07978340238332748, 0.06212540343403816, 0.012594791129231453, 0.006415531039237976, -0.0821537971496582, -0.0409088060259819, 0.027377912774682045, -0.061819128692150116, -0.018986761569976807, -0.03143554925918579, -0.07468626648187637, 0.06874360889196396, 0.058167167007923126, 0.11047392338514328, -0.052306972444057465, 0.031222013756632805, 0.008709752932190895, -0.04990376904606819, 0.08933497220277786, 0.1166239082813263, -0.10025840252637863, 0.12039737403392792, -0.20711499452590942, -0.038158826529979706, 0.14244921505451202, 0.09080732613801956, -0.0013951559085398912, 0.051907770335674286, -0.06921026855707169, 0.08934617042541504, 0.034439753741025925, -0.0007969170692376792, -0.01585601083934307, 0.07562420517206192, 0.14476890861988068, -0.15189658105373383, -0.03403692692518234, -0.056641045957803726, 0.014643339440226555, 0.1952793151140213, 0.12877191603183746, 0.12828610837459564, -0.04671173542737961, 0.010101526975631714, -0.03244514763355255, 0.04641294479370117, 0.033798396587371826, -0.06061800196766853, 0.012896365486085415, -0.04206443578004837, 0.07544127106666565, -0.06879434734582901, 0.08884063363075256, -0.10063531994819641, 0.03807593882083893, 0.0036209835670888424, 0.08159173280000687, 0.0737089291214943, 0.036879345774650574, 0.06164472550153732, 0.12943127751350403, -0.03792784735560417, -0.09298750758171082, 0.04321957379579544, 0.02623307704925537, 0.09852409362792969, 0.004061354789882898, -0.03506671264767647, -0.01055322028696537, 0.1196008026599884, 0.028297187760472298, -0.023962341248989105, -0.029764236882328987, 0.04963289201259613, -0.07658524811267853, -0.0920775756239891, -0.06285292655229568, 0.08836675435304642, 0.13188208639621735, -0.0203299131244421, 0.006297657731920481, 0.00962100364267826, -0.06077350676059723, -0.21056987345218658, -0.0023401787038892508, -0.04716559872031212, -0.09053678065538406, -0.04041735827922821, -0.08894702792167664, 0.09903976321220398, 0.1664183884859085, 0.008169355802237988, 0.056530874222517014, 0.06303313374519348, -0.1504817008972168, -0.12448170781135559, 
0.004221959505230188, -0.0568283349275589, 0.034707967191934586, -0.10063274204730988, -0.040361810475587845, 0.18272440135478973, 0.02982848510146141, 0.03181290626525879, 0.002594550373032689, -0.000915464770514518, -0.029262280091643333, -0.18321098387241364, -0.03539501130580902, -0.023825746029615402, 0.0771891325712204, 0.02079206518828869, 0.04202113673090935, 0.1044786125421524, -0.04762859269976616, 0.03429753705859184, 0.14807240664958954, -0.10289190709590912, -0.1779196709394455, -0.16033178567886353, -0.1330951303243637, -0.030519794672727585, 0.13602906465530396, -0.06066129356622696, -0.12709368765354156, -0.05315938964486122, 0.06693685799837112, 0.2072233110666275, -0.1366950422525406, 0.07577503472566605, -0.02473280392587185, 0.013964841142296791, -0.030981892719864845, -0.04302223026752472, -0.0006660671206191182, 0.19431383907794952, 0.047936081886291504, -0.055533267557621, -0.054952189326286316, 0.02756027691066265, 0.008355147205293179, 0.0939454585313797, -0.04716514050960541, -0.0597798116505146, -0.0038651477079838514, 0.13297288119792938, -0.18330992758274078, -0.1702089160680771, -0.11798175424337387, -0.09362214058637619, 0.023435963317751884, 0.023263873532414436, 0.006184778176248074, 0.0935443565249443, 0.013975158333778381, -0.03852587938308716, -0.06015412136912346, -0.007047678809612989, 0.0023741323966532946, -0.07629147171974182, 0.12045011669397354, 0.0020992723293602467, -0.23357871174812317, -0.07899800688028336, -0.033779896795749664, 0.2282748967409134, 0.015670262277126312, 0.11065508425235748, 0.045388396829366684, 0.1479751318693161, 0.010097505524754524, -0.04843037202954292, 0.010637570172548294, 0.01959526538848877, -0.053550079464912415, 0.18729594349861145, 0.054992400109767914, -0.13988368213176727, 0.09716235846281052, 0.047138456255197525, -0.127494677901268, -0.03923343867063522, -0.05458170548081398, -0.08914587646722794, 0.07871965318918228, -0.058094337582588196, -0.00821929145604372, 0.013462360017001629, 0.0731859877705574, -0.0794791728258133, 0.025044983252882957, -0.024291686713695526, -0.07948771119117737, -0.17357079684734344, -0.043543413281440735, 0.025472218170762062, 0.03517349436879158, -0.07338520884513855, 0.004656636621803045, -0.083633653819561, 0.019671423360705376, 0.044332705438137054, 0.04978359490633011, 0.07983524352312088, -0.01664700172841549, 0.005489273928105831, 0.02458840236067772, 0.0682913288474083, 0.16845093667507172, -0.030987853184342384, -0.11849803477525711 ]
null
null
fairseq
# tts_transformer-fr-cv7_css10 [Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)): - French - Single-speaker male voice - Pre-trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets), fine-tuned on [CSS10](https://github.com/Kyubyong/css10) ## Usage ```python from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub from fairseq.models.text_to_speech.hub_interface import TTSHubInterface import IPython.display as ipd models, cfg, task = load_model_ensemble_and_task_from_hf_hub( "facebook/tts_transformer-fr-cv7_css10", arg_overrides={"vocoder": "hifigan", "fp16": False} ) model = models[0] TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg) generator = task.build_generator([model], cfg) text = "Bonjour, ceci est un test." sample = TTSHubInterface.get_model_input(task, text) wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample) ipd.Audio(wav, rate=rate) ``` See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md). ## Citation ```bibtex @inproceedings{wang-etal-2021-fairseq, title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit", author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan", booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations", month = nov, year = "2021", address = "Online and Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.emnlp-demo.17", doi = "10.18653/v1/2021.emnlp-demo.17", pages = "143--152", } ```
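The usage snippet pins the HiFi-GAN vocoder via `arg_overrides`. The fairseq speech_synthesis docs also describe a Griffin-Lim vocoder, which needs no pretrained vocoder checkpoint at the cost of audio quality; whether the hub loader accepts this exact override value is an assumption taken from those docs. A sketch:

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

# Assumption: "griffin_lim" is accepted as a vocoder override, per the
# fairseq speech_synthesis docs; it trades audio quality for having no
# pretrained vocoder checkpoint to download.
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-fr-cv7_css10",
    arg_overrides={"vocoder": "griffin_lim", "fp16": False},
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
generator = task.build_generator([model], cfg)

sample = TTSHubInterface.get_model_input(task, "Bonjour, ceci est un test.")
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
ipd.Audio(wav, rate=rate)
```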
{"language": "fr", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice", "css10"], "task": "text-to-speech", "widget": [{"text": "Bonjour, ceci est un test.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-fr-cv7_css10
[ "fairseq", "audio", "text-to-speech", "fr", "dataset:common_voice", "dataset:css10", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "fr" ]
TAGS #fairseq #audio #text-to-speech #fr #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-fr-cv7_css10 Transformer text-to-speech model from fairseq S^2 (paper/code): - French - Single-speaker male voice - Pre-trained on Common Voice v7, fine-tuned on CSS10 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-fr-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- French\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #fr #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-fr-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- French\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 59, 65, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #fr #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-fr-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- French\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.13916128873825073, 0.036003924906253815, -0.0008451927569694817, -0.03781190887093544, 0.06421300023794174, -0.06397686898708344, 0.16202528774738312, 0.058069754391908646, -0.04343603178858757, 0.031804408878088, 0.015338328666985035, 0.02880973555147648, 0.05168922245502472, 0.008440015837550163, -0.039859890937805176, -0.17143569886684418, 0.0666724219918251, -0.0028497816529124975, -0.01191597618162632, 0.03352554142475128, 0.13195101916790009, -0.035371363162994385, 0.026329616084694862, 0.053199298679828644, -0.08181419223546982, 0.04705136641860008, 0.08393193781375885, -0.07200019806623459, 0.0922326073050499, 0.08757392317056656, 0.01883827894926071, 0.10635747015476227, 0.08511664718389511, -0.14747926592826843, 0.03449314832687378, -0.024129053577780724, 0.051551491022109985, 0.027928443625569344, -0.01798052340745926, 0.015574384480714798, 0.0349804162979126, 0.09133948385715485, -0.0009097064612433314, 0.0660661831498146, -0.08400093764066696, -0.1504894196987152, -0.012982956133782864, -0.03605956584215164, 0.02817060984671116, 0.05289173871278763, -0.061793677508831024, 0.0011229106457903981, -0.12166745960712433, 0.05916738510131836, 0.08988261222839355, -0.22292102873325348, 0.009369985200464725, -0.0411478728055954, 0.045734964311122894, 0.06500351428985596, -0.06622891873121262, 0.007734197191894054, 0.029587093740701675, -0.0247153639793396, -0.1410670280456543, -0.09223300218582153, -0.2203398197889328, 0.014719536527991295, -0.09874462336301804, 0.06297247111797333, 0.3741368055343628, 0.04534799978137016, -0.051883768290281296, -0.1065787822008133, -0.04649060592055321, 0.045549895614385605, 0.007946631871163845, -0.090638667345047, -0.014414877630770206, 0.05596420541405678, -0.012994560413062572, -0.08607514947652817, -0.12938880920410156, -0.08136952668428421, -0.08139404654502869, 0.1378762274980545, -0.028876807540655136, -0.02555028907954693, -0.030114149674773216, -0.0303488876670599, -0.1509040743112564, -0.027507778257131577, 0.0405656062066555, -0.026461265981197357, -0.07984096556901932, -0.00033189408713951707, -0.04981622472405434, -0.24755993485450745, 0.10881731659173965, -0.024076301604509354, -0.06080666556954384, 0.028183866292238235, -0.0647474080324173, 0.04015899449586868, 0.029078612104058266, 0.029527300968766212, -0.04207909107208252, -0.038043539971113205, -0.0011388107668608427, 0.016486452892422676, 0.022145673632621765, -0.06095165014266968, -0.16050399839878082, -0.0808844193816185, -0.07477693259716034, 0.010531003586947918, -0.02838202938437462, 0.004020527936518192, -0.04431559145450592, -0.004829034674912691, 0.22377540171146393, -0.03824351355433464, -0.030148837715387344, 0.0694112628698349, 0.03300042450428009, 0.04001772403717041, -0.0027411337941884995, 0.08915876597166061, -0.006865605711936951, -0.14394399523735046, -0.01970122568309307, 0.009928420186042786, 0.053715240210294724, -0.1100936233997345, 0.021259261295199394, -0.042709361761808395, -0.004599899984896183, -0.08296581357717514, -0.025382228195667267, -0.09231501072645187, -0.11706055700778961, 0.05826007202267647, -0.08419094979763031, -0.15189099311828613, -0.0704367607831955, 0.05496874451637268, -0.039505720138549805, -0.1013224795460701, -0.06134739890694618, 0.022082600742578506, -0.050885360687971115, 0.0735701471567154, -0.12446023523807526, 0.053192198276519775, -0.02081974409520626, -0.04305816814303398, -0.0848654955625534, 0.13420745730400085, -0.006642630789428949, -0.1249285414814949, -0.02859256975352764, -0.01114964485168457, -0.1313355416059494, 
0.09549345076084137, -0.009411696344614029, 0.1639036387205124, -0.2527959644794464, -0.07387158274650574, 0.12207988649606705, -0.07956530153751373, -0.003361424198374152, 0.1623324602842331, 0.039934203028678894, 0.026235254481434822, 0.0935964435338974, 0.33496400713920593, 0.12047615647315979, -0.15508577227592468, -0.031878069043159485, 0.16770808398723602, 0.009663209319114685, 0.020532695576548576, 0.08945605158805847, -0.13146550953388214, 0.10150972753763199, -0.03242402896285057, 0.13078266382217407, -0.006428127642720938, -0.07661843299865723, -0.0006646256661042571, 0.058522842824459076, -0.06539274007081985, 0.1128280907869339, -0.02182239666581154, -0.014143751002848148, -0.003246462671086192, -0.03659064322710037, 0.010317335836589336, 0.10861057043075562, -0.13364969193935394, 0.08713585138320923, -0.15807576477527618, 0.04819739982485771, -0.06527291238307953, -0.018706830218434334, -0.19124256074428558, 0.06824971735477448, -0.037705618888139725, 0.11157962679862976, 0.13951674103736877, 0.21347537636756897, 0.01288495771586895, -0.016662903130054474, -0.07581693679094315, 0.029648801311850548, 0.12069421261548996, 0.06713158637285233, -0.0345294214785099, -0.21355493366718292, 0.07314416021108627, -0.11026382446289062, 0.16359131038188934, -0.13450172543525696, 0.000358240824425593, 0.15423785150051117, 0.0439659021794796, 0.04395526275038719, -0.017824141308665276, 0.05718188360333443, 0.05547555908560753, 0.011220910586416721, 0.023808777332305908, 0.003509271889925003, 0.049492739140987396, -0.08941899985074997, 0.18689946830272675, -0.17939285933971405, 0.0296179112046957, 0.08412603288888931, -0.07865244150161743, -0.08349160104990005, 0.07740572094917297, 0.0309845432639122, 0.021043719723820686, 0.00020363129442557693, -0.11174992471933365, 0.19447821378707886, -0.06780757755041122, 0.09865520149469376, -0.06879469007253647, 0.060684069991111755, 0.021238189190626144, -0.07819607853889465, 0.03386414051055908, 0.09726910293102264, -0.10838013142347336, -0.09757419675588608, 0.05153300613164902, 0.059127021580934525, -0.013005401007831097, 0.21630734205245972, -0.0573614276945591, -0.02625804953277111, -0.004432986956089735, -0.0036828031297773123, -0.049967873841524124, 0.09157855808734894, -0.19679442048072815, -0.04781394451856613, 0.01983312889933586, 0.05149249732494354, 0.07572131603956223, -0.11864817887544632, 0.007632323540747166, 0.0023101218976080418, -0.10550301522016525, -0.1759161353111267, 0.10996685922145844, -0.027216704562306404, 0.08925097435712814, -0.05636999011039734, -0.0995454490184784, 0.04195936769247055, -0.04371900111436844, -0.16756051778793335, 0.09427051991224289, -0.18444021046161652, -0.07297666370868683, -0.09283774346113205, 0.060989636927843094, -0.003042516065761447, 0.11495727300643921, 0.1055455133318901, -0.09505343437194824, 0.009718305431306362, -0.08454420417547226, 0.03630815073847771, 0.006057228893041611, 0.0147386584430933, -0.0666327103972435, -0.07130401581525803, 0.03065807744860649, -0.09717640280723572, 0.020544810220599174, -0.027856478467583656, 0.015313982963562012, -0.027874529361724854, -0.06736260652542114, 0.061190519481897354, 0.25758615136146545, 0.09941238164901733, -0.034003984183073044, -0.03379998728632927, 0.2002602219581604, -0.09648247808218002, -0.03861122950911522, 0.18202511966228485, -0.0012857905821874738, -0.024140730500221252, 0.1673571914434433, 0.016986289992928505, 0.016217947006225586, -0.021333450451493263, -0.03720724582672119, -0.06256185472011566, -0.14899501204490662, 
-0.07289040088653564, -0.09939135611057281, -0.008862430229783058, -0.19252827763557434, -0.016533879563212395, 0.010888325050473213, -0.04275330901145935, -0.026593880727887154, -0.060229744762182236, 0.13743898272514343, -0.02606179006397724, 0.24370554089546204, -0.08239862322807312, 0.08496751636266708, -0.080208919942379, -0.06387752294540405, 0.10159529745578766, -0.06674908846616745, -0.01091391034424305, 0.09554927796125412, 0.17399510741233826, 0.007496982347220182, 0.016901560127735138, 0.08434242010116577, 0.009997663088142872, 0.06809142976999283, -0.008046010509133339, -0.03208823874592781, -0.04433901607990265, -0.013897771947085857, 0.020604781806468964, 0.25461164116859436, -0.101181760430336, -0.01362378615885973, -0.002686267951503396, 0.028979327529668808, 0.02352365106344223, 0.19108456373214722, -0.1162753477692604, -0.03299098461866379, 0.020756253972649574, -0.032359298318624496, 0.002944897161796689, 0.07367248088121414, 0.2418965846300125, -0.026046762242913246, 0.06802183389663696, 0.13545262813568115, -0.005972589366137981, -0.013742497190833092, 0.06692365556955338, -0.13305656611919403, -0.017251357436180115, -0.020687196403741837, 0.015190050937235355, -0.07396872341632843, 0.11828335374593735, 0.0425884835422039, 0.009775466285645962, 0.013341796584427357, 0.010883745737373829, 0.00978532899171114, 0.15865932404994965, 0.1393013745546341, -0.014295574277639389, -0.013248834758996964, -0.10738199204206467, -0.03540489077568054, -0.005616760812699795, 0.13468250632286072, 0.0343487448990345, -0.02999511919915676, 0.03436518833041191, -0.039125312119722366, 0.07005994766950607, -0.03428485244512558, -0.1323624700307846, -0.09431418776512146, 0.04235294833779335, 0.19150318205356598, 0.07809332013130188, 0.015337379649281502, -0.027773084118962288, -0.18374954164028168, -0.03533487394452095, -0.10447914898395538, 0.012954248115420341, -0.06862033158540726, -0.11277642101049423, 0.1165180504322052, -0.048434317111968994, -0.043678317219018936, 0.047539904713630676, 0.06423011422157288, -0.011562550440430641, 0.025046033784747124, 0.10781913250684738, -0.061172276735305786, -0.09165395051240921, -0.043435197323560715, 0.2063010036945343, 0.03230138123035431, 0.1036788746714592, 0.05485980585217476, -0.022651299834251404, -0.014470286667346954, -0.05319816991686821, 0.03888094425201416, -0.04135976359248161, -0.0944102555513382, 0.10036277770996094, 0.03914973512291908, -0.13486750423908234, -0.022712334990501404, -0.012558767572045326, 0.17308935523033142, 0.1619502604007721, -0.04739856719970703, 0.09242714941501617, 0.1858169436454773, -0.025354912504553795, -0.24084915220737457, -0.04235786944627762, 0.02459043264389038, 0.06615912914276123, -0.01191803254187107, -0.13717958331108093, 0.05049850046634674, -0.08381500840187073, -0.01844140887260437, -0.09230373799800873, -0.14723321795463562, -0.13634951412677765, 0.1796988695859909, -0.17045286297798157, 0.13297489285469055, -0.019945774227380753, -0.07045900821685791, -0.06115187332034111, 0.04017583280801773, 0.09084692597389221, -0.27456074953079224, 0.07929683476686478, 0.1406717300415039, -0.004569709300994873, 0.04652686417102814, 0.04698920622467995, 0.10997666418552399, 0.08874299377202988, -0.011333631351590157, 0.04249358922243118, -0.04900861904025078, 0.023706160485744476, 0.08526797592639923, 0.00157365039922297, 0.05330194905400276, 0.015260428190231323, -0.06569114327430725, -0.041985221207141876, -0.03869640827178955, -0.003723199013620615, 0.057962674647569656, 0.008964357897639275, 
-0.054574254900217056, -0.09106288850307465, 0.004750140011310577, -0.014699066057801247, 0.1768265664577484, -0.2048482596874237, 0.08550886809825897, 0.14842680096626282, 0.2696581482887268, -0.10389255732297897, 0.046275779604911804, 0.026252051815390587, -0.07921852171421051, 0.061985719949007034, -0.10570431500673294, 0.060828033834695816, 0.03766336292028427, 0.02860167622566223, 0.154227152466774, 0.04356568306684494, -0.048702213913202286, 0.11711480468511581, 0.05168639495968819, -0.07035688310861588, -0.10645605623722076, -0.0581132173538208, -0.1399107128381729, -0.04231146350502968, 0.031803276389837265, 0.16722026467323303, 0.01194524485617876, 0.034070972353219986, -0.02390185184776783, -0.018002865836024284, -0.115461565554142, 0.1647271364927292, 0.08982614427804947, 0.021927939727902412, -0.0930708721280098, 0.00606020912528038, -0.016752565279603004, -0.07340963184833527, -0.028773926198482513, -0.049584075808525085, -0.06375796347856522, -0.0617920383810997, -0.14824095368385315, 0.08499711751937866, -0.011695259250700474, -0.13751348853111267, -0.0779399573802948, -0.1601649969816208, 0.0014850097941234708, 0.19809724390506744, 0.012132344767451286, 0.05906440690159798, -0.037662699818611145, 0.010547940619289875, -0.005378812085837126, 0.0045863366685807705, 0.03271764516830444, -0.030088428407907486, -0.09903547167778015, 0.14293688535690308, -0.07245863229036331, 0.07738839089870453, -0.05558076873421669, -0.02857903763651848, -0.02344541810452938, 0.038071565330028534, 0.013237852603197098, -0.0025559847708791494, -0.06855909526348114, -0.010068570263683796, 0.038520023226737976, -0.05510736256837845, -0.0007323379395529628, -0.007789428811520338, -0.08981827646493912, 0.06431407481431961, 0.0378502681851387, 0.09472563862800598, -0.05495713651180267, 0.05539602041244507, 0.0056999423541128635, -0.045323364436626434, 0.10579540580511093, 0.10442696511745453, -0.09005288779735565, 0.11716760694980621, -0.20824237167835236, -0.09158559888601303, 0.09812389314174652, 0.09617194533348083, -0.029512200504541397, 0.029175423085689545, -0.07216699421405792, 0.09418348968029022, 0.025448115542531013, -0.02529139630496502, 0.013192223384976387, 0.05909378454089165, 0.09931797534227371, -0.15981745719909668, -0.012857799418270588, -0.06822578608989716, 0.022795194759964943, 0.1726362407207489, 0.1449863463640213, 0.11609435081481934, -0.06811922043561935, 0.021421605721116066, -0.030038578435778618, 0.05049525946378708, 0.002476930385455489, -0.04865480586886406, -0.051235683262348175, -0.04518154263496399, 0.08229945600032806, -0.04893788322806358, 0.06525717675685883, -0.06677494943141937, 0.03356589004397392, -0.006392254959791899, 0.03855177015066147, 0.020384173840284348, 0.03771323338150978, 0.06269724667072296, 0.10855529457330704, -0.034045230597257614, -0.044478196650743484, 0.021845854818820953, 0.046077921986579895, 0.23145154118537903, -0.025103464722633362, -0.030200079083442688, 0.003457465674728155, 0.10935480147600174, 0.042193397879600525, -0.08847767859697342, 0.03690905123949051, 0.10767245292663574, -0.09200208634138107, -0.0516926534473896, -0.09117024391889572, 0.07870203256607056, 0.08376526832580566, -0.0371701642870903, 0.011097095906734467, 0.0020732723642140627, -0.08027194440364838, -0.21842853724956512, -0.011323603801429272, -0.0665811076760292, -0.07806077599525452, -0.025351306423544884, -0.093579962849617, 0.09941394627094269, 0.12082234025001526, 0.029306132346391678, 0.0665697231888771, 0.09691377729177475, -0.18746694922447205, 
-0.11162209510803223, 0.04030810296535492, -0.06619815528392792, 0.03996125981211662, -0.09197565168142319, -0.04072251170873642, 0.19158509373664856, -0.017389776185154915, 0.04445051774382591, -0.003933504223823547, 0.011262237094342709, -0.031548138707876205, -0.1651407778263092, -0.04203977435827255, -0.021751541644334793, 0.03733745962381363, 0.06774931401014328, 0.10073734819889069, 0.1148437038064003, -0.06188374385237694, 0.02999504841864109, 0.12668758630752563, -0.08160409331321716, -0.1533079594373703, -0.15670986473560333, -0.09078098088502884, 0.0004924954264424741, 0.11553148925304413, -0.05803225561976433, -0.1123165562748909, -0.007498500868678093, 0.07239773869514465, 0.2501072883605957, -0.10028618574142456, 0.0880267396569252, 0.008899803273379803, -0.0008476163493469357, -0.0037074126303195953, -0.0784936472773552, 0.016997212544083595, 0.15039467811584473, 0.04971485957503319, -0.026903502643108368, -0.05839750915765762, -0.007097715977579355, 0.021743500605225563, 0.079106405377388, -0.04047542065382004, -0.07092408835887909, 0.006310227792710066, 0.12450740486383438, -0.13464879989624023, -0.18764010071754456, -0.11002089083194733, -0.1144213080406189, 0.015657668933272362, 0.009466872550547123, -0.03814421594142914, 0.13446854054927826, -0.008466118015348911, -0.0760626271367073, -0.04206061735749245, 0.048136066645383835, -0.012716633267700672, -0.07465442270040512, 0.06043864041566849, 0.0022454988211393356, -0.22172054648399353, -0.04679470509290695, -0.013890041038393974, 0.24269819259643555, 0.011785889975726604, 0.08828648179769516, 0.038913335651159286, 0.1365380883216858, 0.015115304850041866, -0.07839062064886093, -0.0008357695187442005, 0.10295133292675018, -0.04343714192509651, 0.2480882704257965, 0.01754201389849186, -0.13222374022006989, 0.11399466544389725, -0.022392110899090767, -0.14936259388923645, -0.030426129698753357, -0.008320515975356102, -0.09330710023641586, 0.10110944509506226, -0.04754484072327614, -0.013224503956735134, -0.01581670343875885, 0.06662724912166595, -0.0671888217329979, 0.011250088922679424, -0.011363796889781952, -0.07081133127212524, -0.1758497655391693, -0.0038986029103398323, -0.0459786094725132, 0.016397349536418915, -0.0848950669169426, -0.01992647536098957, -0.05598908290266991, 0.02307235635817051, 0.03687679022550583, 0.05580225959420204, 0.07250867038965225, -0.02762484923005104, -0.0031050648540258408, 0.05820750445127487, 0.056922826915979385, 0.14072713255882263, -0.027313344180583954, -0.10491478443145752 ]
null
null
fairseq
# tts_transformer-ru-cv7_css10

[Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- Russian
- Single-speaker male voice
- Pre-trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets), fine-tuned on [CSS10](https://github.com/Kyubyong/css10)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-ru-cv7_css10",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
# the TTS task's build_generator expects a list of models
generator = task.build_generator([model], cfg)

text = "Здравствуйте, это пробный запуск."

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)

ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
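The usage snippet above plays the waveform inline via IPython, which only works in a notebook. A minimal sketch for saving the output to disk instead, assuming `wav` is the 1-D CPU float tensor returned by `get_prediction` and that the `soundfile` package is installed (it is not a fairseq dependency; the file name is illustrative):

```python
import soundfile as sf  # assumed extra dependency, not pulled in by fairseq

# wav is assumed to be a 1-D float tensor on the CPU; soundfile wants a numpy array
sf.write("tts_output_ru.wav", wav.numpy(), rate)
```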
{"language": "ru", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice", "css10"], "task": "text-to-speech", "widget": [{"text": "\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435, \u044d\u0442\u043e \u043f\u0440\u043e\u0431\u043d\u044b\u0439 \u0437\u0430\u043f\u0443\u0441\u043a.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-ru-cv7_css10
[ "fairseq", "audio", "text-to-speech", "ru", "dataset:common_voice", "dataset:css10", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "ru" ]
TAGS #fairseq #audio #text-to-speech #ru #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-ru-cv7_css10 Transformer text-to-speech model from fairseq S^2 (paper/code): - Russian - Single-speaker male voice - Pre-trained on Common Voice v7, fine-tuned on CSS10 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-ru-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Russian\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #ru #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-ru-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Russian\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 59, 65, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #ru #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-ru-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Russian\n- Single-speaker male voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.12238969653844833, 0.0023266442585736513, -0.0011566586326807737, -0.05503162369132042, 0.08466219156980515, -0.04939834401011467, 0.20145152509212494, 0.05019397661089897, -0.027066687121987343, 0.0327608548104763, 0.021810322999954224, -0.011745053343474865, 0.06223277002573013, 0.022782159969210625, -0.022309567779302597, -0.19216524064540863, 0.07235385477542877, -0.02957109734416008, -0.0051548536866903305, 0.031244073063135147, 0.14184172451496124, -0.045749563723802567, 0.008041801862418652, 0.03725102171301842, -0.07896096259355545, 0.06312169879674911, 0.09793512523174286, -0.08117473870515823, 0.07959714531898499, 0.0779130607843399, 0.029683731496334076, 0.0952451080083847, 0.08533265441656113, -0.13585138320922852, 0.034974757581949234, -0.03430372104048729, 0.04275406152009964, 0.018558520823717117, 0.001498839003033936, 0.013278508558869362, 0.08983161300420761, 0.06390081346035004, -0.014560919255018234, 0.05943233147263527, -0.08451253920793533, -0.12636035680770874, -0.01564376801252365, -0.0004569141019601375, 0.043295033276081085, 0.07371699810028076, -0.07981166988611221, 0.008030818775296211, -0.11077472567558289, 0.06003079563379288, 0.10054571181535721, -0.2664100229740143, 0.0035869211424142122, -0.016796192154288292, 0.01822540909051895, 0.10703817009925842, -0.051538389176130295, 0.02469368651509285, 0.018620874732732773, -0.03261781483888626, -0.17381756007671356, -0.08159653842449188, -0.21132367849349976, 0.01805642992258072, -0.10693678259849548, 0.06850732862949371, 0.3748238980770111, 0.03223133087158203, -0.0210629440844059, -0.10211116075515747, -0.04111238569021225, 0.0142658157274127, 0.018669143319129944, -0.06866725534200668, -0.04732658341526985, 0.05448596179485321, -0.024258073419332504, -0.0844474732875824, -0.15105266869068146, -0.0690598338842392, -0.05310773104429245, 0.16504588723182678, -0.023018362000584602, -0.008473485708236694, -0.032328203320503235, -0.03836914896965027, -0.13156627118587494, -0.04016568139195442, 0.058145422488451004, -0.038981981575489044, -0.0971141904592514, 0.022467344999313354, -0.06766722351312637, -0.2831118702888489, 0.11471135914325714, -0.0443187840282917, -0.022520393133163452, 0.04017612710595131, -0.019571347162127495, 0.05884750932455063, 0.005762526765465736, 0.06546691805124283, -0.04748939350247383, -0.029858199879527092, -0.014613074250519276, -0.01632443442940712, 0.037918366491794586, -0.06949267536401749, -0.18139635026454926, -0.10107819736003876, -0.06928360462188721, 0.022095395252108574, -0.047572702169418335, 0.011045889928936958, -0.03702740743756294, -0.00006637879414483905, 0.17848162353038788, -0.03933189809322357, -0.038339972496032715, 0.05732676014304161, 0.03993441164493561, 0.05067085847258568, -0.0359891913831234, 0.08266757428646088, -0.01247475016862154, -0.12983344495296478, 0.005104634910821915, 0.0335274338722229, 0.046342238783836365, -0.12161269783973694, 0.01943669281899929, -0.09807836264371872, 0.008983608335256577, -0.11580002307891846, 0.0003317665832582861, -0.09332899004220963, -0.12776361405849457, 0.06402736902236938, -0.05419264733791351, -0.13490650057792664, -0.07288877665996552, 0.04286148026585579, -0.04633012041449547, -0.12538157403469086, -0.05504683405160904, 0.01595366932451725, -0.06615203619003296, 0.07986179739236832, -0.09311706572771072, 0.045177049934864044, -0.04239638149738312, -0.06756466627120972, -0.061932522803545, 0.13820648193359375, -0.02399825118482113, -0.10222042351961136, -0.01110624149441719, 0.006610541138797998, -0.10486806184053421, 
0.08632877469062805, -0.0016057684551924467, 0.1463913917541504, -0.27233120799064636, -0.06440773606300354, 0.11904341727495193, -0.0915779322385788, 0.016463078558444977, 0.1494569629430771, 0.04186253249645233, 0.024587417021393776, 0.11117665469646454, 0.35014966130256653, 0.11348346620798111, -0.12973226606845856, -0.05972118675708771, 0.11574278026819229, -0.0048805647529661655, 0.02634384296834469, 0.08157368004322052, -0.10980726033449173, 0.1313679814338684, -0.016657348722219467, 0.13177934288978577, -0.00947353895753622, -0.06981975585222244, -0.01352614164352417, 0.051807377487421036, -0.07149957120418549, 0.10693227499723434, -0.03136707842350006, -0.00014398194616660476, -0.02956097573041916, -0.057239752262830734, -0.027763547375798225, 0.11010970175266266, -0.11990202963352203, 0.09816822409629822, -0.17946450412273407, 0.05199591442942619, -0.05112204700708389, -0.0008651179959997535, -0.19463318586349487, 0.08643016219139099, -0.038554850965738297, 0.15693092346191406, 0.1507117599248886, 0.20593374967575073, 0.020879032090306282, -0.03952420875430107, -0.08059094846248627, 0.02218419685959816, 0.10413380712270737, 0.06394514441490173, -0.03896661847829819, -0.20805229246616364, 0.06516990065574646, -0.10944977402687073, 0.11393888294696808, -0.10186102986335754, -0.0031060141045600176, 0.13919228315353394, 0.06055314838886261, 0.04020770639181137, -0.000619148719124496, 0.045340754091739655, 0.07219193130731583, -0.00022767657355871052, 0.02585255168378353, 0.0006890337099321187, 0.035188231617212296, -0.12025970220565796, 0.17647238075733185, -0.1574716567993164, 0.017558451741933823, 0.07545793801546097, -0.055126287043094635, -0.08272362500429153, 0.10571803152561188, 0.038827862590551376, 0.026587851345539093, 0.024686921387910843, -0.08413223922252655, 0.20739994943141937, -0.06672106683254242, 0.05912802368402481, -0.04890620708465576, 0.09214894473552704, 0.028154391795396805, -0.10474084317684174, 0.031578127294778824, 0.10432954877614975, -0.12847581505775452, -0.12566520273685455, 0.055236317217350006, 0.05252818763256073, -0.03300127759575844, 0.24306103587150574, -0.054946210235357285, -0.031487345695495605, 0.007077165879309177, -0.021133581176400185, -0.04832366108894348, 0.10213951021432877, -0.15776708722114563, -0.058047905564308167, 0.01598370261490345, 0.054936330765485764, 0.07750005275011063, -0.1071699857711792, -0.008379705250263214, -0.013568819500505924, -0.10573720932006836, -0.1779632717370987, 0.12698175013065338, -0.03244347870349884, 0.09300774335861206, -0.06582516431808472, -0.07499969750642776, 0.03278423100709915, -0.04846974089741707, -0.1564357727766037, 0.07398973405361176, -0.19323337078094482, -0.08893917500972748, -0.10787170380353928, 0.0670388713479042, 0.0016799757722765207, 0.11574169248342514, 0.11339045315980911, -0.14759710431098938, 0.010186326690018177, -0.06565674394369125, 0.06449414044618607, 0.021767713129520416, 0.013438488356769085, -0.07036895304918289, -0.045879896730184555, 0.02964388206601143, -0.06171657145023346, 0.014033024199306965, -0.030352437868714333, 0.023986849933862686, 0.021553175523877144, -0.0634734109044075, 0.056176722049713135, 0.24394553899765015, 0.08357423543930054, -0.03250358626246452, -0.04380549490451813, 0.1414785087108612, -0.10158807784318924, -0.03440101817250252, 0.1800915151834488, -0.011059513315558434, -0.024192119017243385, 0.17692117393016815, 0.00254276511259377, 0.03057875856757164, -0.008055126294493675, -0.06185353547334671, -0.0679815262556076, -0.17244364321231842, 
-0.08491945266723633, -0.09506108611822128, -0.004054155200719833, -0.2083793431520462, -0.0208616741001606, -0.029903437942266464, -0.030408654361963272, -0.04225099831819534, -0.10601465404033661, 0.13071925938129425, -0.022787541151046753, 0.2140505611896515, -0.07736773788928986, 0.09687501937150955, -0.0954589992761612, -0.04077616333961487, 0.09988349676132202, -0.09372148662805557, 0.03459711745381355, 0.11395920813083649, 0.15618310868740082, 0.005923410411924124, 0.04544446989893913, 0.10498912632465363, -0.005353673826903105, 0.10608022660017014, -0.010247377678751945, -0.014921796508133411, -0.04529164358973503, -0.018545551225543022, 0.04774634167551994, 0.24729444086551666, -0.12098705768585205, 0.008048913441598415, 0.031031467020511627, 0.04380977153778076, 0.06258891522884369, 0.1906730979681015, -0.1048307791352272, -0.07325250655412674, 0.026123838499188423, -0.02204267494380474, 0.01475611049681902, 0.06685899198055267, 0.21723012626171112, -0.019068341702222824, 0.07901553064584732, 0.12839779257774353, -0.0034991211723536253, 0.01914413645863533, 0.06640208512544632, -0.1448143720626831, -0.024232227355241776, -0.019487479701638222, 0.03912394121289253, -0.0943470448255539, 0.14148546755313873, 0.028713064268231392, 0.01649475283920765, 0.007915891706943512, -0.002084275707602501, 0.009987072087824345, 0.13780124485492706, 0.10541512072086334, -0.01619141735136509, -0.06883493810892105, -0.1349448412656784, -0.04640761390328407, -0.00045331011642701924, 0.1468147337436676, 0.036247748881578445, -0.010957116261124611, 0.025449229404330254, -0.017280958592891693, 0.0612783282995224, -0.044405706226825714, -0.13835574686527252, -0.09696285426616669, 0.04025059565901756, 0.2096492201089859, 0.10312409698963165, 0.01681624911725521, -0.03450945019721985, -0.15936608612537384, -0.03691974654793739, -0.08772554248571396, -0.010440575890243053, -0.05871068313717842, -0.09844912588596344, 0.12576887011528015, -0.06776578724384308, -0.05114367604255676, 0.05346933379769325, 0.06401166319847107, -0.011234321631491184, 0.022377870976924896, 0.08787921816110611, -0.048162225633859634, -0.11244763433933258, -0.024129457771778107, 0.20775797963142395, 0.05783404782414436, 0.10322757065296173, 0.039244044572114944, -0.000032231106160907075, -0.017114318907260895, -0.03147989138960838, 0.049222543835639954, -0.010022634640336037, -0.10027848929166794, 0.11551960557699203, 0.04854373633861542, -0.15866467356681824, -0.07100872695446014, -0.039373308420181274, 0.17081639170646667, 0.1663057506084442, -0.07345512509346008, 0.12202586233615875, 0.15560775995254517, -0.020468106493353844, -0.23398171365261078, -0.060517195612192154, 0.026685567572712898, 0.10645242035388947, 0.00530252093449235, -0.1326359510421753, 0.02401391975581646, -0.11132562160491943, -0.02352537028491497, -0.1048220545053482, -0.12722037732601166, -0.1440461128950119, 0.1904284954071045, -0.19433477520942688, 0.11166229844093323, -0.01565365307033062, -0.05101815238595009, -0.07505548000335693, 0.09490406513214111, 0.0366644524037838, -0.2329314798116684, 0.091026671230793, 0.13236381113529205, 0.042333442717790604, 0.05692050978541374, 0.05060001090168953, 0.1120171919465065, 0.05737872049212456, -0.027956843376159668, 0.010826827958226204, -0.0424501895904541, 0.03047199919819832, 0.07927580922842026, -0.010067373514175415, 0.04891852289438248, 0.021183175966143608, -0.053756147623062134, -0.044106174260377884, -0.02805752493441105, -0.007605230435729027, 0.07598965615034103, -0.0009087626240216196, 
-0.06683194637298584, -0.08305168896913528, -0.011392341926693916, -0.023463960736989975, 0.2333396077156067, -0.20239169895648956, 0.07777400314807892, 0.09948433935642242, 0.24518609046936035, -0.09589279443025589, 0.06733591109514236, 0.014894899912178516, -0.085689015686512, 0.06708736717700958, -0.10010839253664017, 0.03746354207396507, 0.04848860949277878, 0.005753049161285162, 0.13062626123428345, 0.02174164354801178, -0.0665103942155838, 0.14050887525081635, 0.07016294449567795, -0.07940007746219635, -0.09410727024078369, -0.05116724967956543, -0.10706944018602371, -0.023258142173290253, 0.044404227286577225, 0.18137311935424805, -0.026342129334807396, 0.03169196844100952, -0.01892412267625332, -0.0024196773301810026, -0.11004851013422012, 0.165165513753891, 0.07797752320766449, 0.018709536641836166, -0.08116605877876282, 0.0350303016602993, -0.018114741891622543, -0.055532146245241165, -0.010404467582702637, -0.05976921319961548, -0.08560041338205338, -0.06361781060695648, -0.14044520258903503, 0.053188394755125046, -0.011728019453585148, -0.1458723545074463, -0.0755419135093689, -0.15858376026153564, -0.01160188764333725, 0.16025717556476593, 0.024181164801120758, 0.033755917102098465, -0.051547493785619736, 0.01260934118181467, 0.017198028042912483, -0.00637992424890399, 0.06503885239362717, -0.04135496914386749, -0.13862167298793793, 0.13496726751327515, -0.06464424729347229, 0.07326975464820862, -0.05941808596253395, -0.021699432283639908, -0.011237261816859245, 0.05243076756596565, -0.026752445846796036, -0.006472352426499128, -0.07653715461492538, -0.022939477115869522, 0.025425637140870094, -0.07189906388521194, -0.009448382072150707, 0.0011832256568595767, -0.08969265967607498, 0.06771688908338547, 0.005552330054342747, 0.09188040345907211, -0.04050594940781593, 0.0530606210231781, -0.007898670621216297, -0.03232681751251221, 0.11635823547840118, 0.14515160024166107, -0.08873098343610764, 0.1475490778684616, -0.17584848403930664, -0.052972011268138885, 0.11344598233699799, 0.09077330678701401, -0.02637539803981781, 0.042223233729600906, -0.0820980966091156, 0.09205106645822525, 0.02236313559114933, -0.01477578654885292, 0.01788850501179695, 0.059842754155397415, 0.10313092917203903, -0.17280550301074982, 0.009970618411898613, -0.06856710463762283, 0.021325161680579185, 0.1753518134355545, 0.16866028308868408, 0.1428140550851822, -0.09311756491661072, 0.04975033551454544, -0.010933667421340942, 0.05623769015073776, -0.01005490031093359, -0.056408412754535675, -0.06622812896966934, -0.05563909932971001, 0.09021344780921936, -0.07654741406440735, 0.0587383471429348, -0.06182774528861046, 0.029668156057596207, -0.004372971132397652, 0.04248705878853798, 0.012306408025324345, 0.04287875443696976, 0.06254816055297852, 0.114021435379982, -0.030231814831495285, -0.05824694782495499, 0.004178824368864298, 0.025422824546694756, 0.2054813802242279, -0.026821650564670563, -0.015184440650045872, 0.019304098561406136, 0.11109495908021927, 0.036361463367938995, -0.09474679827690125, 0.05318140611052513, 0.08438386023044586, -0.13518473505973816, -0.06636811792850494, -0.08754362165927887, 0.1120099201798439, 0.13006986677646637, -0.031768061220645905, -0.005175514612346888, -0.018043627962470055, -0.07644971460103989, -0.19008749723434448, -0.031178370118141174, -0.0777679905295372, -0.08823085576295853, -0.013584534637629986, -0.08530919253826141, 0.08217208832502365, 0.13124701380729675, 0.02195705845952034, 0.0390813983976841, 0.10134753584861755, -0.15720216929912567, 
-0.12160313129425049, 0.04842497780919075, -0.059687964618206024, 0.026208680123090744, -0.054110608994960785, -0.05803336575627327, 0.19180704653263092, -0.028729815036058426, 0.03386064991354942, -0.0020080513786524534, -0.007235324941575527, -0.040876101702451706, -0.16547779738903046, -0.051554419100284576, -0.014336342923343182, 0.054302480071783066, 0.09130195528268814, 0.08488116413354874, 0.11605741083621979, -0.05139053612947464, 0.01350012794137001, 0.14660562574863434, -0.06427370011806488, -0.16022226214408875, -0.13236713409423828, -0.09312697499990463, -0.004559824708849192, 0.10873217135667801, -0.06184239313006401, -0.11644251644611359, -0.029448894783854485, 0.07269946485757828, 0.28105899691581726, -0.07249700278043747, 0.09033329039812088, -0.02629321627318859, 0.006763600278645754, -0.03251788020133972, -0.043475277721881866, -0.006368077360093594, 0.11751072853803635, 0.05837187170982361, -0.012089614756405354, -0.08927499502897263, -0.005040023475885391, -0.00234533054754138, 0.05402655154466629, -0.03561130911111832, -0.08620946854352951, 0.0034216849599033594, 0.15381310880184174, -0.1497918963432312, -0.17316316068172455, -0.11835356056690216, -0.12819822132587433, 0.008160388097167015, 0.035364892333745956, 0.022021695971488953, 0.13279561698436737, 0.014275169931352139, -0.0808083787560463, -0.03502121940255165, 0.023027338087558746, 0.002669320208951831, -0.05574357882142067, 0.06769921630620956, 0.005105853080749512, -0.22646987438201904, -0.09229425340890884, -0.02886410430073738, 0.25212955474853516, 0.01155767496675253, 0.1123058944940567, 0.03844437748193741, 0.15367279946804047, 0.02018583193421364, -0.08533166348934174, 0.0034023134503513575, 0.10067242383956909, -0.052334435284137726, 0.23754574358463287, 0.06141239404678345, -0.12741921842098236, 0.10202022641897202, -0.047570519149303436, -0.10120190680027008, -0.019346600398421288, 0.027254890650510788, -0.09073033928871155, 0.09032023698091507, -0.036586944013834, -0.02831384912133217, -0.037522390484809875, 0.07000422477722168, -0.06560239940881729, 0.021551568061113358, -0.02275868132710457, -0.06562785059213638, -0.18407633900642395, -0.02798338420689106, -0.023605601862072945, 0.023712236434221268, -0.0605069063603878, -0.009311467409133911, -0.06507515162229538, 0.027219098061323166, 0.03750168904662132, 0.07625775039196014, 0.05943313613533974, -0.0328228585422039, 0.006716920528560877, 0.04222244769334793, 0.07558473944664001, 0.13278521597385406, -0.05319265276193619, -0.13099530339241028 ]
null
null
fairseq
# tts_transformer-tr-cv7

[Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- Turkish
- Single-speaker male voice
- Trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-tr-cv7",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
# the TTS task's build_generator expects a list of models
generator = task.build_generator([model], cfg)

text = "Merhaba, bu bir deneme çalışmasıdır."

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)

ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
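Because the task, model, and generator built in the usage example are reusable, only the sample has to be rebuilt per input. A minimal sketch that synthesizes several sentences in a loop, reusing those objects; the second sentence and the file names are illustrative, and `soundfile` is again an assumed extra dependency:

```python
import soundfile as sf  # assumed extra dependency for writing WAV files

sentences = [
    "Merhaba, bu bir deneme çalışmasıdır.",
    "Bugün hava çok güzel.",  # illustrative second input
]
for i, text in enumerate(sentences):
    sample = TTSHubInterface.get_model_input(task, text)
    wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)
    sf.write(f"tts_tr_{i}.wav", wav.numpy(), rate)  # one file per sentence
```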
{"language": "tr", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice"], "task": "text-to-speech", "widget": [{"text": "Merhaba, bu bir deneme \u00e7al\u0131\u015fmas\u0131d\u0131r.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-tr-cv7
[ "fairseq", "audio", "text-to-speech", "tr", "dataset:common_voice", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "tr" ]
TAGS #fairseq #audio #text-to-speech #tr #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-tr-cv7 Transformer text-to-speech model from fairseq S^2 (paper/code): - Turkish - Single-speaker male voice - Trained on Common Voice v7 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-tr-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Turkish\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #tr #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-tr-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Turkish\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 52, 52, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #tr #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-tr-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Turkish\n- Single-speaker male voice\n- Trained on Common Voice v7## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.07793864607810974, -0.07654982060194016, -0.0031126528047025204, -0.06842165440320969, 0.06797532737255096, -0.04631036892533302, 0.18705883622169495, 0.06785203516483307, -0.09683096408843994, -0.014021106995642185, 0.011745716445147991, 0.08242660015821457, 0.059846993535757065, -0.01759994961321354, -0.049703627824783325, -0.22507119178771973, 0.037930648773908615, -0.010014692321419716, -0.041961248964071274, 0.05610257759690285, 0.14074625074863434, -0.0045172483660280704, -0.004559782799333334, 0.04346838593482971, -0.07757935672998428, 0.06989244371652603, 0.0832507312297821, -0.11548957228660583, 0.10416935384273529, 0.08233951777219772, 0.020695723593235016, 0.08576887100934982, 0.02827567234635353, -0.10110600292682648, 0.03927592933177948, -0.02410607784986496, 0.0398903451859951, 0.028231745585799217, -0.009711191058158875, -0.02901184745132923, 0.09498921036720276, 0.06117146462202072, -0.036036912351846695, 0.048644568771123886, -0.10712224990129471, -0.17479507625102997, -0.006519121117889881, -0.012557373382151127, 0.03385389968752861, 0.050825487822294235, -0.054864197969436646, 0.04754572734236717, -0.06678728759288788, 0.08351804316043854, 0.09613987803459167, -0.2896832227706909, -0.017542067915201187, -0.08883444219827652, 0.033427730202674866, 0.08775541931390762, -0.01674746908247471, 0.05252469331026077, 0.030036598443984985, -0.019831568002700806, -0.15527181327342987, -0.09738170355558395, -0.20292340219020844, -0.008295536041259766, -0.0869913101196289, 0.052896350622177124, 0.39347678422927856, 0.007551418151706457, -0.018979521468281746, -0.04291592538356781, -0.05055024102330208, 0.08796750009059906, 0.04561098292469978, -0.07320477068424225, -0.06848331540822983, 0.059956617653369904, -0.024824857711791992, -0.07507355511188507, -0.13247080147266388, -0.04756096750497818, -0.14098800718784332, 0.18545760214328766, -0.007927211932837963, -0.005791386589407921, -0.07470924407243729, -0.012809986248612404, -0.06572321802377701, -0.04796138033270836, 0.07443869858980179, -0.010225560516119003, -0.06888212263584137, 0.008733861148357391, -0.0015688830753788352, -0.3301175832748413, 0.11387200653553009, -0.10374587029218674, 0.007546809036284685, 0.03905438259243965, -0.08914431929588318, 0.0778535008430481, -0.004145754501223564, -0.023274069651961327, -0.010048612020909786, -0.017300521954894066, -0.015898989513516426, -0.02210727334022522, 0.02168641984462738, -0.057146329432725906, -0.14046065509319305, -0.05006203055381775, -0.053390022367239, 0.06154658645391464, -0.004359428770840168, 0.02946213260293007, 0.0032036618795245886, 0.009722924791276455, 0.11682824790477753, -0.04028542712330818, -0.049198225140571594, 0.03234283998608589, 0.0264138076454401, 0.052827153354883194, -0.023415300995111465, 0.060826487839221954, -0.03544550761580467, -0.11857286095619202, 0.020025840029120445, 0.009929836727678776, 0.06965349614620209, -0.1314256489276886, 0.038821276277303696, 0.003539114259183407, 0.0013620560057461262, -0.15386131405830383, -0.024326054379343987, -0.04648731276392937, -0.09091158211231232, 0.05414191633462906, -0.0540628656744957, -0.10861945897340775, -0.08129236847162247, 0.04228511080145836, -0.0657452940940857, -0.07242751866579056, -0.0592210628092289, 0.07625793665647507, -0.05641798675060272, 0.07706215232610703, -0.0721568688750267, 0.060780659317970276, -0.017200760543346405, -0.06011587008833885, -0.06755331158638, 0.1313038021326065, 0.008811814710497856, -0.07322745770215988, -0.0411088801920414, -0.0052449507638812065, 
-0.11456845700740814, 0.07130274176597595, -0.029006555676460266, 0.10615502297878265, -0.21328535676002502, -0.08045424520969391, 0.117652028799057, -0.08406248688697815, -0.04802265018224716, 0.15833503007888794, 0.04828542470932007, 0.08500603586435318, 0.1342092752456665, 0.3577944040298462, 0.12082047760486603, -0.1096891462802887, -0.04686249420046806, 0.12735804915428162, -0.026993906125426292, 0.04732798412442207, 0.0727732703089714, -0.08400470018386841, 0.056554824113845825, -0.04830661416053772, 0.15333691239356995, 0.04073881730437279, -0.08017993718385696, -0.00360386841930449, 0.0826893225312233, -0.04950492084026337, 0.14974568784236908, -0.05827036499977112, 0.04250507429242134, -0.031301822513341904, -0.05494469404220581, 0.03173231706023216, 0.08604469895362854, -0.0968136414885521, 0.07447649538516998, -0.16077305376529694, 0.06452935934066772, -0.10019439458847046, 0.017283925786614418, -0.16309909522533417, 0.09089662879705429, -0.08058197051286697, 0.07960569858551025, 0.1815379410982132, 0.21105289459228516, 0.02702474407851696, -0.029326193034648895, -0.08113136142492294, 0.03824712336063385, 0.15395890176296234, 0.08806078881025314, -0.03722962737083435, -0.1995849609375, 0.12026078999042511, -0.09000066667795181, 0.07732327282428741, -0.07964684814214706, -0.004332426935434341, 0.15387186408042908, 0.06928495317697525, 0.028873207047581673, 0.04409187659621239, 0.04543440043926239, 0.05574791878461838, 0.01371893659234047, 0.004220242612063885, -0.005331720691174269, 0.02248493954539299, -0.12729322910308838, 0.22116903960704803, -0.21336142718791962, 0.11095119267702103, 0.08634284138679504, -0.1289268583059311, -0.08389188349246979, 0.13923880457878113, 0.02766539342701435, 0.0030817484948784113, 0.019532447680830956, -0.1263056993484497, 0.21503207087516785, -0.07784964144229889, 0.08281345665454865, -0.0507519505918026, 0.03720816969871521, 0.030087046325206757, -0.11144036054611206, 0.05450495705008507, 0.11395745724439621, -0.1800423264503479, -0.20115210115909576, 0.0626385435461998, 0.11334685236215591, -0.013825411908328533, 0.2508251368999481, -0.03869905322790146, 0.0013991962186992168, -0.009649170562624931, 0.0399467907845974, -0.0030076075345277786, 0.059063177555799484, -0.20466086268424988, -0.03493877500295639, 0.016587942838668823, 0.06786301732063293, 0.08843451738357544, -0.0985073372721672, -0.008247341960668564, 0.0076863425783813, -0.08598807454109192, -0.21723371744155884, 0.1370069682598114, -0.03132114186882973, 0.11717917770147324, -0.0609276257455349, -0.0626705139875412, 0.054319530725479126, -0.03793121501803398, -0.13758398592472076, 0.07175586372613907, -0.178050696849823, -0.10756192356348038, -0.09721343964338303, 0.04058295488357544, 0.027311911806464195, 0.06984340399503708, 0.15040892362594604, -0.11697310209274292, 0.03663717210292816, -0.02912968583405018, 0.09368028491735458, -0.019571496173739433, 0.03746180981397629, -0.07009706646203995, -0.06714464724063873, 0.026463938876986504, -0.08961677551269531, 0.004434421192854643, -0.004088503308594227, -0.02425455115735531, 0.035754937678575516, -0.09415695816278458, 0.03399448096752167, 0.23529507219791412, 0.05702900141477585, -0.0284584928303957, -0.06766093522310257, 0.11322841793298721, -0.13086147606372833, -0.0285816453397274, 0.1417955607175827, -0.014469711109995842, 0.0002644812921062112, 0.14565446972846985, 0.007192216347903013, 0.021399706602096558, -0.010416844859719276, -0.03947603330016136, -0.07108312845230103, -0.21075403690338135, -0.04698396846652031, 
-0.10330222547054291, 0.019946908578276634, -0.22411687672138214, 0.0017528004245832562, 0.012104175053536892, -0.05934286117553711, -0.028520064428448677, -0.04158473387360573, 0.16291308403015137, -0.03016938641667366, 0.28598296642303467, -0.07826411724090576, 0.06665833294391632, -0.11188959330320358, -0.09070909023284912, 0.1062992736697197, -0.07089132815599442, 0.06412728875875473, 0.12022572755813599, 0.1096748635172844, 0.025672558695077896, -0.02497461810708046, 0.08128887414932251, 0.04139355197548866, 0.06617806851863861, 0.011449587531387806, -0.03126402199268341, -0.06768486648797989, 0.008013870567083359, -0.004509313032031059, 0.21445433795452118, -0.09795381128787994, 0.04970254749059677, 0.016285233199596405, 0.08668464422225952, -0.018706995993852615, 0.1588403582572937, -0.06708179414272308, -0.02526284195482731, 0.01450673583894968, -0.09076255559921265, 0.020380539819598198, 0.047704871743917465, 0.20850501954555511, -0.011041714809834957, 0.10391666740179062, 0.1058063730597496, 0.005094985943287611, -0.027697546407580376, 0.0639791339635849, -0.15520724654197693, -0.056507814675569534, -0.03461098298430443, 0.05566699430346489, -0.1022270992398262, 0.18418556451797485, 0.04077357426285744, 0.03603444993495941, -0.012303547002375126, -0.002337733283638954, 0.040079884231090546, 0.11343581974506378, 0.08513560146093369, 0.009631878696382046, -0.0172420684248209, -0.10740959644317627, -0.054200977087020874, 0.008140900172293186, 0.138588547706604, 0.049161188304424286, -0.04490337893366814, 0.02311667427420616, -0.007969293743371964, 0.044262420386075974, -0.049209125339984894, -0.11096616834402084, -0.06332144141197205, 0.05678907409310341, 0.14518485963344574, 0.11105529963970184, 0.014772171154618263, -0.04588684067130089, -0.13394969701766968, -0.05724059417843819, -0.10359993577003479, -0.027165398001670837, -0.0602458156645298, -0.10162463784217834, 0.08564624935388565, -0.0687875971198082, -0.09740691632032394, 0.05591976270079613, -0.02735980972647667, -0.0089672040194273, -0.007360097952187061, 0.09784433245658875, -0.021046984940767288, -0.07760277390480042, -0.03154534846544266, 0.1912376433610916, 0.022653287276625633, 0.12298084795475006, 0.042093805968761444, -0.03518501669168472, -0.009301305748522282, -0.05119948089122772, 0.03406112268567085, -0.07177801430225372, -0.1373758614063263, 0.10792992264032364, 0.09741618484258652, -0.1268313080072403, -0.06132049858570099, -0.04050514101982117, 0.15843011438846588, 0.15360452234745026, -0.016969775781035423, 0.07321766018867493, 0.15365271270275116, -0.03269781172275543, -0.20103392004966736, -0.08076260983943939, 0.005754712037742138, 0.06186940148472786, -0.01968872919678688, -0.07982814311981201, 0.03468751534819603, -0.06489095836877823, -0.02084786258637905, -0.07570668309926987, -0.18927064538002014, -0.12399989366531372, 0.1603889763355255, -0.18126709759235382, 0.17241273820400238, -0.043394941836595535, -0.061890147626399994, -0.04505710303783417, 0.07221146672964096, 0.030163509771227837, -0.3166002333164215, 0.10129769891500473, 0.12540417909622192, 0.056299518793821335, 0.028707142919301987, 0.05326724424958229, 0.11960567533969879, 0.03165299817919731, -0.019954776391386986, -0.013966909609735012, -0.056898608803749084, 0.05909622460603714, 0.0813584178686142, -0.021695073693990707, -0.005426381248980761, 0.014960888773202896, -0.08141955733299255, -0.032737281173467636, -0.043401215225458145, 0.03672636300325394, 0.07991837710142136, 0.0009361366974189878, -0.043710943311452866, 
-0.1103585734963417, -0.007804105523973703, -0.038097042590379715, 0.19710323214530945, -0.17742761969566345, 0.14596471190452576, 0.10503623634576797, 0.25420480966567993, -0.1247604638338089, 0.05608275160193443, -0.013939086347818375, -0.08241035044193268, 0.07094939053058624, -0.1273076981306076, 0.020111216232180595, 0.07249738276004791, 0.009799528867006302, 0.1509169340133667, 0.02840498834848404, -0.06816177070140839, 0.17792576551437378, 0.0958658829331398, -0.063820481300354, -0.16069656610488892, -0.052452147006988525, -0.07925198972225189, 0.01300947368144989, 0.04628730192780495, 0.1781652420759201, -0.021609019488096237, 0.06015639752149582, -0.005283449776470661, -0.040262412279844284, -0.13783244788646698, 0.20544423162937164, 0.04696204885840416, 0.02430597133934498, -0.07636680454015732, 0.01824898086488247, -0.012976461090147495, -0.06814634054899216, -0.024352824315428734, -0.022302385419607162, -0.059510134160518646, -0.06632652878761292, -0.12132494896650314, 0.08136213570833206, 0.011482663452625275, -0.11914906650781631, -0.052635811269283295, -0.17297136783599854, -0.02517608553171158, 0.21045777201652527, -0.005453219637274742, 0.06468602269887924, -0.05874600633978844, 0.04124641790986061, 0.04313268885016441, -0.00394366355612874, 0.09957576543092728, -0.025392752140760422, -0.12475843727588654, 0.19763770699501038, -0.05060850456357002, 0.08709695190191269, -0.06842685490846634, -0.026538928970694542, -0.015563755296170712, 0.05242330953478813, -0.04085678234696388, -0.02654733881354332, -0.08892764896154404, -0.004546215757727623, 0.055440500378608704, -0.07896820455789566, -0.020435502752661705, -0.019808486104011536, -0.09111004322767258, 0.05965053290128708, 0.05484117940068245, 0.08366410434246063, -0.06333611905574799, 0.029073793441057205, 0.011637603864073753, -0.026156898587942123, 0.12006820738315582, 0.13703928887844086, -0.10767590254545212, 0.15052072703838348, -0.15346182882785797, -0.07380717992782593, 0.1348343938589096, 0.0817553773522377, -0.0013058296171948314, 0.032914526760578156, -0.06481605768203735, 0.10337556153535843, 0.04070023074746132, 0.022282052785158157, 0.0026182965375483036, 0.04367978870868683, 0.11871886998414993, -0.19251388311386108, -0.0035044255200773478, -0.059892427176237106, 0.016387922689318657, 0.1483999788761139, 0.12919041514396667, 0.11733982712030411, -0.08886955678462982, 0.027131564915180206, -0.0248758215457201, 0.056356385350227356, 0.0022505572997033596, -0.11974047124385834, 0.022528504952788353, -0.04567426070570946, 0.07235974818468094, -0.09361531585454941, 0.021549256518483162, -0.05259699746966362, 0.00967351533472538, 0.004654171410948038, 0.037753988057374954, -0.025062909349799156, -0.008742034435272217, 0.031185142695903778, 0.09502940624952316, -0.030278783291578293, -0.13686829805374146, -0.016073208302259445, 0.014082658104598522, 0.11176961660385132, -0.03606187924742699, -0.03613492101430893, -0.0008596729021519423, 0.07040037214756012, 0.09245755523443222, -0.050866175442934036, -0.04195819050073624, -0.009986313059926033, -0.06508314609527588, -0.022490043193101883, -0.05691855028271675, 0.056107427924871445, 0.17093157768249512, -0.0023448385763913393, -0.02668096497654915, -0.012457472272217274, -0.0598595030605793, -0.18782305717468262, -0.0469941645860672, -0.0432434007525444, -0.07929982990026474, -0.028262609615921974, -0.06879331916570663, 0.09841973334550858, 0.11199674755334854, 0.05514814332127571, 0.058211930096149445, 0.12668325006961823, -0.13726800680160522, 
-0.12075737118721008, 0.05581242963671684, -0.0805831179022789, 0.027527643367648125, -0.10059837996959686, -0.04423706606030464, 0.1470380425453186, -0.004723967052996159, 0.018966780975461006, 0.029281752184033394, -0.034307487308979034, -0.050023164600133896, -0.204090878367424, -0.04952177777886391, -0.020005034282803535, 0.06907306611537933, 0.05770168825984001, 0.06510935723781586, 0.11978168040513992, -0.06957095861434937, 0.014508084394037724, 0.15534767508506775, -0.07384160161018372, -0.19811317324638367, -0.12026296555995941, -0.06357985734939575, -0.0330815389752388, 0.09686915576457977, -0.060995422303676605, -0.09247532486915588, -0.03730083629488945, 0.08295193314552307, 0.3089965879917145, -0.03737683221697807, 0.09362413734197617, -0.025815872475504875, 0.011291501112282276, -0.0007539726211689413, -0.05620814859867096, 0.025793680921196938, 0.14665792882442474, 0.05095421150326729, 0.017762476578354836, -0.0800684243440628, 0.02362237684428692, 0.015901679173111916, 0.03892611712217331, -0.014496412128210068, -0.08381514996290207, 0.022434089332818985, 0.17413008213043213, -0.16088631749153137, -0.2024506777524948, -0.1770569235086441, -0.09362256526947021, -0.0008564653689973056, 0.024534963071346283, 0.055389296263456345, 0.1222800761461258, 0.016659224405884743, -0.06605230271816254, -0.056058503687381744, -0.022939832881093025, 0.009738538414239883, -0.09681475907564163, 0.05561584606766701, 0.024851812049746513, -0.21965767443180084, -0.11931894719600677, -0.007592502050101757, 0.20838724076747894, 0.025317322462797165, 0.08699533343315125, 0.06142847612500191, 0.15571996569633484, 0.021349119022488594, -0.05316343903541565, 0.011041969992220402, 0.08127101510763168, -0.03877203166484833, 0.1998654007911682, 0.07062921673059464, -0.10758540779352188, 0.0679236352443695, -0.012063483707606792, -0.08487433195114136, -0.010343214496970177, 0.0019523738883435726, -0.046273380517959595, 0.06786701083183289, 0.019105233252048492, -0.010315854102373123, 0.02313372492790222, 0.05299757048487663, -0.10168416798114777, 0.033213865011930466, -0.05984514579176903, -0.07933679223060608, -0.19139382243156433, -0.06694872677326202, -0.03780195489525795, 0.019685953855514526, -0.03802157938480377, -0.0278621856123209, -0.07649607211351395, 0.021608248353004456, 0.02880256064236164, 0.07129416614770889, 0.1076110452413559, -0.0281970351934433, 0.008364783599972725, 0.013994481414556503, 0.0457325242459774, 0.1407853662967682, -0.056606873869895935, -0.07024975121021271 ]
null
null
fairseq
# tts_transformer-vi-cv7

[Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- Vietnamese
- Single-speaker male voice
- Trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-vi-cv7",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
# the TTS task's build_generator expects a list of models
generator = task.build_generator([model], cfg)

text = "Xin chào, đây là một cuộc chạy thử nghiệm."

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)

ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
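If `torchaudio` happens to be available, the tensor can be written out without a numpy round-trip. A minimal sketch, assuming `wav` is the 1-D CPU float tensor from the usage example; `torchaudio.save` expects a 2-D `(channels, time)` tensor, hence the `unsqueeze` (the file name is illustrative):

```python
import torchaudio  # assumed to be installed; not required by fairseq itself

# add a channel dimension: torchaudio.save expects shape (channels, time)
torchaudio.save("tts_output_vi.wav", wav.unsqueeze(0), sample_rate=rate)
```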
{"language": "vi", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice"], "task": "text-to-speech", "widget": [{"text": "Xin ch\u00e0o, \u0111\u00e2y l\u00e0 m\u1ed9t cu\u1ed9c ch\u1ea1y th\u1eed nghi\u1ec7m.", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-vi-cv7
[ "fairseq", "audio", "text-to-speech", "vi", "dataset:common_voice", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "vi" ]
TAGS #fairseq #audio #text-to-speech #vi #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-vi-cv7 Transformer text-to-speech model from fairseq S^2 (paper/code): - Vietnamese - Single-speaker male voice - Trained on Common Voice v7 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-vi-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Vietnamese\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #vi #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-vi-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Vietnamese\n- Single-speaker male voice\n- Trained on Common Voice v7", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 52, 52, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #vi #dataset-common_voice #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-vi-cv7\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Vietnamese\n- Single-speaker male voice\n- Trained on Common Voice v7## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.07503992319107056, -0.00039664015639573336, -0.0031338133849203587, -0.07221154123544693, 0.07408955693244934, -0.08161860704421997, 0.14184483885765076, 0.09351532906293869, -0.07657095044851303, 0.0009814000222831964, 0.01012384332716465, 0.08322668075561523, 0.06342379748821259, -0.009030149318277836, -0.054816897958517075, -0.22845670580863953, 0.06148502975702286, 0.04396989941596985, -0.016225541010499, 0.036221183836460114, 0.13554564118385315, -0.03775137662887573, 0.03896690905094147, 0.06256645917892456, -0.0624263659119606, 0.06731864809989929, 0.07177077233791351, -0.09650654345750809, 0.09350669384002686, 0.06916384398937225, 0.020183878019452095, 0.10356898605823517, 0.05280015990138054, -0.1020946353673935, 0.030837683007121086, -0.014476635493338108, 0.050186771899461746, 0.025653516873717308, -0.04204178228974342, 0.05431705340743065, 0.06625321507453918, 0.06457493454217911, -0.010611049830913544, 0.05803271755576134, -0.0999760627746582, -0.2062550038099289, 0.015202854759991169, -0.06639206409454346, 0.050522346049547195, 0.0452880933880806, -0.0626535639166832, 0.07443002611398697, -0.08305685967206955, 0.055922579020261765, 0.09657636284828186, -0.26064741611480713, 0.01413043960928917, -0.0013404346536844969, 0.09333407878875732, 0.0664357990026474, -0.06994690001010895, 0.037169456481933594, 0.05551539361476898, -0.02095821686089039, -0.14719223976135254, -0.10211949050426483, -0.17933204770088196, 0.04023383557796478, -0.09798552095890045, 0.08482538163661957, 0.376574844121933, 0.03038451075553894, -0.03628617152571678, -0.05146874487400055, -0.06209190934896469, 0.08814433217048645, 0.028792429715394974, -0.10704309493303299, -0.014366257935762405, 0.06951535493135452, 0.013964405283331871, -0.07903673499822617, -0.13254383206367493, -0.05825774371623993, -0.08193978667259216, 0.04379357025027275, -0.022929197177290916, -0.0178549624979496, -0.0924239456653595, -0.056833427399396896, -0.14305683970451355, -0.03163045644760132, 0.043512023985385895, -0.018662093207240105, -0.06306138634681702, 0.006387772504240274, 0.01977555826306343, -0.3194820284843445, 0.10594804584980011, -0.13884906470775604, -0.03618144989013672, 0.047705475240945816, -0.12009885162115097, 0.06166890636086464, 0.03585230931639671, 0.025091450661420822, -0.0470115952193737, 0.030412981286644936, -0.014004990458488464, 0.006213570944964886, 0.0005523496656678617, -0.05738366022706032, -0.15351185202598572, -0.040601130574941635, -0.06945861876010895, 0.015666741877794266, -0.02831246703863144, 0.01822521910071373, -0.030558113008737564, -0.01833263598382473, 0.18202093243598938, -0.022878702729940414, -0.02904658205807209, 0.034785136580467224, 0.046806659549474716, 0.09413867443799973, 0.009291322901844978, 0.08263736218214035, -0.02297821268439293, -0.13158075511455536, -0.010719701647758484, -0.0004424916987773031, 0.04021107032895088, -0.13313810527324677, 0.03579951450228691, -0.017650321125984192, -0.0013432451523840427, -0.08852817118167877, -0.02251172438263893, -0.058184973895549774, -0.08508172631263733, 0.0624191090464592, -0.09320104867219925, -0.13540928065776825, -0.08781029284000397, 0.01951092667877674, -0.03510644659399986, -0.08602123707532883, -0.03830128535628319, 0.05691779404878616, -0.05220045894384384, 0.09523852169513702, -0.08612029254436493, 0.07024393230676651, -0.0002034870849456638, -0.032746437937021255, -0.10003067553043365, 0.10891622304916382, -0.007470208685845137, -0.09287530928850174, -0.015032998286187649, -0.018326478078961372, -0.06272849440574646, 
0.07396754622459412, -0.048149362206459045, 0.11185794323682785, -0.23001942038536072, -0.07402746379375458, 0.10497330874204636, -0.1048036515712738, -0.05261777341365814, 0.16013289988040924, 0.051172830164432526, 0.06781602650880814, 0.09308892488479614, 0.3363109529018402, 0.08610621094703674, -0.17994360625743866, -0.007231387309730053, 0.14993049204349518, -0.01703605242073536, 0.02867838367819786, 0.0850834995508194, -0.06779400259256363, 0.038285721093416214, -0.045485518872737885, 0.13647235929965973, -0.013154171407222748, -0.09385959804058075, -0.013019866310060024, 0.06215175241231918, -0.05833372101187706, 0.13529334962368011, -0.02938373200595379, 0.027611110359430313, 0.014924990944564342, -0.02197149768471718, 0.06551196426153183, 0.0992574393749237, -0.09317971020936966, 0.07511908560991287, -0.19265116751194, 0.03514087572693825, -0.07833744585514069, 0.017036911100149155, -0.13351388275623322, 0.11343313753604889, -0.0379905104637146, 0.06469684094190598, 0.1394439935684204, 0.22281721234321594, 0.006336668971925974, -0.013598737306892872, -0.08393283933401108, 0.029979929327964783, 0.14205458760261536, 0.089870885014534, -0.021897442638874054, -0.20063267648220062, 0.10861620306968689, -0.10106708109378815, 0.025346994400024414, -0.14664648473262787, -0.015830572694540024, 0.1714303344488144, 0.04400099441409111, 0.04425168037414551, 0.035617418587207794, 0.0851912721991539, 0.06042971462011337, 0.0050782980397343636, 0.022992923855781555, -0.0015534567646682262, 0.021348699927330017, -0.11492006480693817, 0.2373056411743164, -0.15144601464271545, 0.10831774771213531, 0.07832388579845428, -0.13652661442756653, -0.05508682131767273, 0.13600899279117584, 0.0412340983748436, 0.00018173687567468733, -0.01805865578353405, -0.09160471707582474, 0.16167768836021423, -0.08156027644872665, 0.10419867932796478, -0.0617344044148922, 0.07149069756269455, 0.03763365000486374, -0.09916083514690399, 0.038277655839920044, 0.10838722437620163, -0.11248639225959778, -0.16633906960487366, 0.06595931202173233, 0.09830918163061142, -0.014075931161642075, 0.24055856466293335, -0.052732598036527634, 0.01205006055533886, -0.016126461327075958, 0.05136023834347725, -0.014514528214931488, 0.06942856311798096, -0.26721155643463135, -0.022182291373610497, 0.013126421719789505, 0.06732364743947983, 0.0634620264172554, -0.10866817086935043, -0.017173228785395622, -0.005632144398987293, -0.0764530599117279, -0.2169627994298935, 0.13736090064048767, -0.013960529118776321, 0.08410485833883286, -0.039263468235731125, -0.050357285887002945, 0.04752488061785698, -0.046714846044778824, -0.13918174803256989, 0.08985317498445511, -0.18180476129055023, -0.14388984441757202, -0.07918526232242584, 0.0029169409535825253, 0.029596474021673203, 0.08872361481189728, 0.13283689320087433, -0.16165226697921753, -0.009723123162984848, -0.030620027333498, 0.065850630402565, -0.00653542997315526, 0.02441667579114437, -0.028445648029446602, -0.04718176648020744, 0.034647513180971146, -0.08302778005599976, 0.015314505435526371, -0.015165302902460098, -0.02077244222164154, 0.01685960404574871, -0.103819839656353, 0.01564650610089302, 0.23972328007221222, 0.08163762837648392, -0.03164749592542648, -0.045186661183834076, 0.15539959073066711, -0.11039573699235916, -0.036860667169094086, 0.16939875483512878, 0.006808490492403507, 0.007886362262070179, 0.13946250081062317, 0.0005869620945304632, -0.007900607772171497, 0.010240955278277397, -0.020862692967057228, -0.05731453001499176, -0.19503264129161835, -0.03360748291015625, 
-0.10087887197732925, 0.04435741528868675, -0.19986560940742493, 0.005241692066192627, 0.004059078171849251, -0.06577673554420471, 0.024167586117982864, 0.0035476365592330694, 0.1502133011817932, -0.03189946338534355, 0.26456740498542786, -0.1088973805308342, 0.05385594442486763, -0.09567373245954514, -0.0692928284406662, 0.1070285439491272, -0.0401228591799736, 0.03687761723995209, 0.14157085120677948, 0.11840178817510605, 0.04698799550533295, 0.0037230062298476696, 0.11456632614135742, 0.02243160456418991, 0.00879060011357069, -0.009962569922208786, -0.052072495222091675, -0.06541099399328232, 0.039832569658756256, 0.023010805249214172, 0.2264680713415146, -0.11116218566894531, 0.04716876894235611, 0.030862528830766678, 0.04307783767580986, 0.0024214487057179213, 0.18320217728614807, -0.037780918180942535, 0.010481339879333973, 0.04299749806523323, -0.055769097059965134, 0.005723845213651657, 0.08466052263975143, 0.2349475920200348, -0.031370025128126144, 0.10779373347759247, 0.13428039848804474, 0.009167362004518509, -0.033867381513118744, 0.07743736356496811, -0.15767458081245422, -0.06161421164870262, -0.015233124606311321, 0.030231166630983353, -0.12321154028177261, 0.1639137715101242, 0.02222188375890255, -0.014288113452494144, 0.00609172647818923, -0.003336910856887698, 0.006078525446355343, 0.186799094080925, 0.11624215543270111, 0.0186374019831419, 0.03399887681007385, -0.09075277298688889, -0.010086807422339916, 0.002713836496695876, 0.1089019700884819, 0.09869875758886337, -0.07063660025596619, 0.02952529862523079, -0.03491119295358658, 0.046096011996269226, -0.027570145204663277, -0.0897173210978508, -0.03075322136282921, 0.045099325478076935, 0.15077100694179535, 0.1422368884086609, 0.009018137119710445, -0.03561960533261299, -0.13705982267856598, -0.01886732131242752, -0.09572435170412064, 0.01219293661415577, -0.05152623727917671, -0.11107983440160751, 0.09263592958450317, -0.037406448274850845, -0.06192968785762787, 0.03224603086709976, 0.028397729620337486, -0.00769084645435214, 0.03205755725502968, 0.12715032696723938, -0.014110392890870571, -0.09412506222724915, -0.039278555661439896, 0.14803174138069153, 0.005239210557192564, 0.0898442417383194, 0.034342534840106964, -0.027326280251145363, -0.009982679039239883, -0.06286106258630753, 0.03947919234633446, -0.025713307783007622, -0.14288730919361115, 0.08487636595964432, 0.05964534729719162, -0.1492755115032196, -0.028889693319797516, -0.07909002155065536, 0.1377466768026352, 0.17624422907829285, -0.01553184911608696, 0.10252036899328232, 0.20353280007839203, -0.046763449907302856, -0.2255760282278061, -0.06946570426225662, -0.005521259270608425, 0.04654070362448692, -0.02412816882133484, -0.11111770570278168, 0.03291357681155205, -0.10068520903587341, -0.01791176199913025, -0.12883590161800385, -0.12477227300405502, -0.13746890425682068, 0.13819631934165955, -0.15390589833259583, 0.17155995965003967, -0.02384902723133564, -0.07308650761842728, -0.04116705432534218, 0.03220837563276291, 0.09502362459897995, -0.22773748636245728, 0.08214396238327026, 0.1234942153096199, 0.025173980742692947, 0.03162297233939171, 0.05076675862073898, 0.11204688996076584, 0.021321700885891914, -0.018746258690953255, 0.01877136155962944, -0.084974005818367, 0.022509336471557617, 0.08654510974884033, 0.008516632951796055, 0.046339381486177444, 0.011416114866733551, -0.07487411797046661, -0.03020329214632511, -0.060878969728946686, -0.012937097810208797, 0.07766315340995789, -0.009801440872251987, -0.0541292279958725, -0.07330740243196487, 
-0.009245609864592552, -0.02806142345070839, 0.2548051178455353, -0.16491501033306122, 0.11312844604253769, 0.13752521574497223, 0.25014728307724, -0.16324563324451447, 0.08679277449846268, 0.0031308108009397984, -0.08219369500875473, 0.09826284646987915, -0.16810347139835358, 0.04564254730939865, 0.04714161902666092, 0.017802080139517784, 0.18406370282173157, 0.032557278871536255, -0.07189466059207916, 0.1535862684249878, 0.07674247026443481, -0.05932163447141647, -0.1735282987356186, -0.04953845962882042, -0.04872973635792732, 0.03277458995580673, 0.04430515691637993, 0.1422480344772339, -0.011750739067792892, 0.040825024247169495, 0.00311305932700634, -0.012388210743665695, -0.1344452202320099, 0.14655201137065887, 0.05220920220017433, 0.04914487898349762, -0.08354745805263519, 0.014606880024075508, 0.022131631150841713, -0.12412405759096146, -0.026536619290709496, 0.010345984250307083, -0.06328156590461731, -0.06094905734062195, -0.1153077632188797, 0.12109319865703583, 0.02064693719148636, -0.10516025125980377, -0.04203411191701889, -0.16888314485549927, -0.0017189336940646172, 0.2292976677417755, 0.007287690881639719, 0.07358964532613754, -0.03825575113296509, 0.01635138876736164, 0.0217128936201334, -0.014391696080565453, 0.05471019446849823, -0.02616817131638527, -0.15355081856250763, 0.0887284129858017, -0.020028922706842422, 0.13294444978237152, -0.06657861173152924, -0.03755651041865349, -0.07033147662878036, 0.049958065152168274, -0.016201170161366463, 0.007332462817430496, -0.07379653304815292, -0.01592182368040085, 0.032894592732191086, -0.10217151045799255, -0.0111919566988945, -0.000794830615632236, -0.09659459441900253, 0.06810817867517471, 0.05452335625886917, 0.090856172144413, -0.05224161967635155, 0.021237429231405258, 0.04913492128252983, -0.02721356227993965, 0.10867893695831299, 0.05924481153488159, -0.11446896195411682, 0.1315765082836151, -0.1567736268043518, -0.09147213399410248, 0.16214463114738464, 0.0902865007519722, 0.01559317484498024, 0.009927038103342056, -0.06883459538221359, 0.08978252857923508, 0.05382661893963814, -0.009949011728167534, 0.045052070170640945, 0.04349975660443306, 0.07721200585365295, -0.18198560178279877, -0.043180715292692184, -0.04562881216406822, 0.012187352403998375, 0.1485302895307541, 0.11274900287389755, 0.09867016226053238, -0.07967674732208252, 0.0181232038885355, -0.01778598129749298, 0.04157331585884094, -0.016283085569739342, -0.09957367181777954, 0.010403900407254696, -0.03503704443573952, 0.051523059606552124, -0.09127486497163773, 0.02437138371169567, -0.07222903519868851, -0.005795713048428297, 0.02252453938126564, 0.023185059428215027, -0.06301534920930862, 0.020037606358528137, 0.016733039170503616, 0.09539586305618286, -0.021937241777777672, -0.10342209786176682, -0.00029059755615890026, 0.001941013615578413, 0.19897380471229553, -0.09467553347349167, -0.046914663165807724, 0.0004869753320235759, 0.05992956459522247, 0.08292064070701599, -0.02791028656065464, -0.0684613585472107, 0.0799110010266304, -0.11587521433830261, -0.01209814939647913, -0.08558769524097443, 0.05341940000653267, 0.13001559674739838, -0.010209270752966404, 0.02086563967168331, -0.031063465401530266, -0.0704168751835823, -0.21209663152694702, -0.05738240107893944, -0.07568995654582977, -0.11949489265680313, -0.016340656206011772, -0.07409322261810303, 0.1344202160835266, 0.06094672158360481, 0.07308292388916016, 0.059660088270902634, 0.0989348366856575, -0.10821846127510071, -0.09755169600248337, 0.03139369189739227, -0.06684383004903793, 
0.006292310077697039, -0.09270406514406204, 0.011478153057396412, 0.18607363104820251, -0.0019408123334869742, -0.006320884916931391, 0.013994883745908737, -0.03640248253941536, -0.02794637903571129, -0.15928296744823456, -0.04232403635978699, -0.013303740881383419, 0.03476966172456741, 0.031093206256628036, 0.048432856798172, 0.09907332062721252, -0.05382110923528671, 0.031238742172718048, 0.14594502747058868, -0.08454596251249313, -0.19762273132801056, -0.15815851092338562, -0.019172009080648422, -0.04675062373280525, 0.10854055732488632, -0.06699930876493454, -0.06362216919660568, -0.06658443063497543, 0.12328232079744339, 0.23924964666366577, -0.049580421298742294, 0.07036744058132172, -0.012304380536079407, 0.00833048578351736, -0.025278141722083092, -0.07817782461643219, 0.059037309139966965, 0.1530042141675949, 0.038783133029937744, -0.002465569181367755, -0.0917314738035202, 0.007223082240670919, 0.02542853355407715, 0.025944784283638, 0.017439046874642372, -0.05336156487464905, 0.0020253031980246305, 0.12873704731464386, -0.1592751294374466, -0.1688375174999237, -0.1554548740386963, -0.13072334229946136, 0.016901861876249313, -0.0005752844735980034, 0.04861022159457207, 0.1034206673502922, -0.01993149146437645, -0.062098827213048935, -0.04350197687745094, 0.008318341337144375, 0.0013631469337269664, -0.07161205261945724, 0.052369374781847, 0.03133382648229599, -0.24540360271930695, -0.1069144457578659, 0.0004378813609946519, 0.16668346524238586, 0.004387117922306061, 0.08244980126619339, 0.045771144330501556, 0.1527452915906906, -0.0032363077625632286, -0.06738906353712082, -0.04882650822401047, 0.09833289682865143, -0.04304562881588936, 0.2790464162826538, 0.018363362178206444, -0.1245846077799797, 0.05818551406264305, -0.0028053955174982548, -0.1106216162443161, 0.0007490607094950974, 0.012337143532931805, -0.07849473506212234, 0.06371000409126282, -0.01818341761827469, -0.009837282821536064, 0.013742176815867424, 0.033636510372161865, -0.046718910336494446, -0.021196192130446434, -0.0808168426156044, -0.08119605481624603, -0.1794891357421875, -0.016350921243429184, -0.01754472404718399, 0.03583669662475586, -0.0495888814330101, -0.02125304564833641, -0.06907766312360764, 0.044052015990018845, 0.010147622786462307, 0.0552118755877018, 0.09727003425359726, -0.015402615070343018, 0.0014933685306459665, -0.033140406012535095, 0.05088447034358978, 0.1296471804380417, -0.03083661012351513, -0.09572144597768784 ]
null
null
fairseq
# tts_transformer-zh-cv7_css10

[Transformer](https://arxiv.org/abs/1809.08895) text-to-speech model from fairseq S^2 ([paper](https://arxiv.org/abs/2109.06912)/[code](https://github.com/pytorch/fairseq/tree/main/examples/speech_synthesis)):
- Simplified Chinese
- Single-speaker female voice
- Pre-trained on [Common Voice v7](https://commonvoice.mozilla.org/en/datasets), fine-tuned on [CSS10](https://github.com/Kyubyong/css10)

## Usage

```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub
from fairseq.models.text_to_speech.hub_interface import TTSHubInterface
import IPython.display as ipd

# load the checkpoint, its config and the fairseq task from the Hub
models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "facebook/tts_transformer-zh-cv7_css10",
    arg_overrides={"vocoder": "hifigan", "fp16": False}
)
model = models[0]
TTSHubInterface.update_cfg_with_data_cfg(cfg, task.data_cfg)
# the TTS task's build_generator expects a list of models, not a bare model
generator = task.build_generator([model], cfg)

text = "您好,这是试运行。"

sample = TTSHubInterface.get_model_input(task, text)
wav, rate = TTSHubInterface.get_prediction(task, model, generator, sample)

ipd.Audio(wav, rate=rate)
```

See also [fairseq S^2 example](https://github.com/pytorch/fairseq/blob/main/examples/speech_synthesis/docs/common_voice_example.md).

## Citation

```bibtex
@inproceedings{wang-etal-2021-fairseq,
    title = "fairseq S{\^{}}2: A Scalable and Integrable Speech Synthesis Toolkit",
    author = "Wang, Changhan and Hsu, Wei-Ning and Adi, Yossi and Polyak, Adam and Lee, Ann and Chen, Peng-Jen and Gu, Jiatao and Pino, Juan",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-demo.17",
    doi = "10.18653/v1/2021.emnlp-demo.17",
    pages = "143--152",
}
```
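As a small follow-up to the usage snippet above, here is a hedged sketch of saving the synthesized waveform to disk instead of playing it inline. It assumes `soundfile` is installed (an extra dependency not used by the card) and that `wav` is the 1-D float waveform returned by `TTSHubInterface.get_prediction`:

```python
import soundfile as sf

# get_prediction is assumed to return a 1-D float waveform plus its sample rate;
# convert a torch tensor to a numpy array before writing
audio = wav.cpu().numpy() if hasattr(wav, "cpu") else wav
sf.write("tts_output.wav", audio, rate)  # writes a standard PCM WAV file
```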
{"language": "zh", "library_name": "fairseq", "tags": ["fairseq", "audio", "text-to-speech"], "datasets": ["common_voice", "css10"], "task": "text-to-speech", "widget": [{"text": "\u60a8\u597d\uff0c\u8fd9\u662f\u8bd5\u8fd0\u884c\u3002", "example_title": "Hello, this is a test run."}]}
text-to-speech
facebook/tts_transformer-zh-cv7_css10
[ "fairseq", "audio", "text-to-speech", "zh", "dataset:common_voice", "dataset:css10", "arxiv:1809.08895", "arxiv:2109.06912", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "1809.08895", "2109.06912" ]
[ "zh" ]
TAGS #fairseq #audio #text-to-speech #zh #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us
# tts_transformer-zh-cv7_css10 Transformer text-to-speech model from fairseq S^2 (paper/code): - Simplified Chinese - Single-speaker female voice - Pre-trained on Common Voice v7, fine-tuned on CSS10 ## Usage See also fairseq S^2 example.
[ "# tts_transformer-zh-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Simplified Chinese\n- Single-speaker female voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ "TAGS\n#fairseq #audio #text-to-speech #zh #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n", "# tts_transformer-zh-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Simplified Chinese\n- Single-speaker female voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10", "## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ 59, 68, 13 ]
[ "passage: TAGS\n#fairseq #audio #text-to-speech #zh #dataset-common_voice #dataset-css10 #arxiv-1809.08895 #arxiv-2109.06912 #has_space #region-us \n# tts_transformer-zh-cv7_css10\n\nTransformer text-to-speech model from fairseq S^2 (paper/code):\n- Simplified Chinese\n- Single-speaker female voice\n- Pre-trained on Common Voice v7, fine-tuned on CSS10## Usage\n\n\n\nSee also fairseq S^2 example." ]
[ -0.14058271050453186, 0.0004436538729351014, -0.001344132237136364, -0.07507336884737015, 0.08535405248403549, -0.06013890728354454, 0.1524936556816101, 0.08462794870138168, -0.03276018798351288, 0.03217195346951485, -0.036540791392326355, 0.016282791271805763, 0.07373902946710587, -0.00959609542042017, -0.05892170965671539, -0.21721753478050232, 0.06642384827136993, 0.06883988529443741, -0.03606356680393219, 0.03450854867696762, 0.13536779582500458, -0.04209279641509056, 0.023590177297592163, 0.0896197110414505, -0.02447143755853176, 0.01900552213191986, 0.03892794996500015, -0.0719112679362297, 0.08410938829183578, 0.09534458070993423, 0.026079043745994568, 0.10938413441181183, 0.09476854652166367, -0.1763961911201477, 0.03438052162528038, -0.07549681514501572, 0.03448548913002014, 0.018562959507107735, -0.060139529407024384, 0.05126525089144707, 0.06612352281808853, 0.022640101611614227, -0.04834739491343498, 0.08615278452634811, -0.032544150948524475, -0.17353244125843048, -0.014995413832366467, 0.014887434430420399, 0.02602357231080532, 0.013858412392437458, -0.08102856576442719, 0.04395628347992897, -0.15235047042369843, 0.0348971001803875, 0.13887841999530792, -0.2925480604171753, 0.021081333979964256, -0.008393185213208199, 0.08667134493589401, 0.047430504113435745, -0.05229121074080467, 0.05172443762421608, 0.00402685534209013, -0.052409578114748, -0.21176624298095703, -0.10677411407232285, -0.10026542097330093, 0.03575519844889641, -0.1264861524105072, 0.10487160831689835, 0.38736796379089355, 0.05069073662161827, -0.06322284787893295, -0.11879213154315948, -0.04608040675520897, 0.06010080501437187, -0.02509869821369648, -0.099387526512146, -0.02770877629518509, 0.07639027386903763, -0.01400663610547781, -0.08265334367752075, -0.12075097858905792, -0.07517553865909576, -0.061600297689437866, 0.14445151388645172, -0.001969338394701481, -0.019222410395741463, -0.0890597403049469, -0.03859909996390343, -0.07423755526542664, -0.050169944763183594, 0.010534271597862244, -0.051466986536979675, -0.09915322810411453, 0.037782058119773865, -0.02390109933912754, -0.25696268677711487, 0.13305610418319702, -0.018634194508194923, 0.012570750899612904, 0.09070045500993729, -0.01789131388068199, 0.043714966624975204, 0.055015064775943756, 0.03529667481780052, -0.054746683686971664, -0.05376608669757843, 0.008879505097866058, 0.009316005744040012, 0.041182152926921844, -0.07632632553577423, -0.15317602455615997, -0.12262625992298126, -0.04700025916099548, 0.045771706849336624, -0.05122261121869087, 0.028431113809347153, -0.05038043111562729, -0.01845563016831875, 0.16611650586128235, -0.016240905970335007, -0.023317288607358932, 0.07787083089351654, 0.025552280247211456, 0.07431351393461227, 0.001886524260044098, 0.11405801773071289, -0.008454009890556335, -0.16740749776363373, -0.03864102065563202, 0.01580408774316311, 0.07157169282436371, -0.03669329360127449, 0.024241134524345398, -0.03893117234110832, 0.01348499022424221, -0.10742471367120743, -0.03538302332162857, -0.04967956990003586, -0.12837454676628113, 0.039221905171871185, -0.06622011214494705, -0.11734706163406372, -0.07146334648132324, 0.06216280162334442, -0.04664977267384529, -0.10024569928646088, -0.05791954696178436, 0.043950606137514114, -0.024850713089108467, 0.008467339910566807, -0.1274295151233673, 0.040587637573480606, -0.0031634902115911245, -0.008743311278522015, -0.008612902835011482, 0.1439114212989807, 0.024082636460661888, -0.09769755601882935, -0.00726110115647316, -0.025415528565645218, -0.09628653526306152, 
0.0931820198893547, 0.014228069223463535, 0.14813297986984253, -0.25667616724967957, -0.04578877240419388, 0.06845462322235107, -0.09042388945817947, -0.011640122160315514, 0.14098593592643738, 0.04351184144616127, 0.03741884231567383, 0.07722409814596176, 0.3575720489025116, 0.1343011111021042, -0.11729402095079422, -0.016662325710058212, 0.11014530062675476, -0.012647471390664577, 0.023106666281819344, 0.08260831236839294, -0.08684296905994415, 0.12083955854177475, -0.02609262615442276, 0.09755408018827438, 0.003920485731214285, -0.07013972103595734, -0.04238586127758026, 0.02732854336500168, -0.08801112323999405, 0.1365738958120346, 0.010086598806083202, 0.0045780460350215435, 0.014198309741914272, -0.025583500042557716, 0.013064910657703876, 0.12582045793533325, -0.14493241906166077, 0.09671664983034134, -0.20775847136974335, 0.08083996176719666, 0.03223796561360359, 0.013691850006580353, -0.17594435811042786, 0.11979678273200989, -0.049749668687582016, 0.09671647101640701, 0.15755893290042877, 0.21724222600460052, -0.018109844997525215, -0.013490156270563602, -0.057167209684848785, 0.02734368108212948, 0.12863166630268097, 0.07252249866724014, -0.0365157388150692, -0.1793801188468933, 0.04385996609926224, -0.05366495996713638, 0.1715087741613388, -0.17218296229839325, 0.002084790961816907, 0.17765966057777405, 0.04003223031759262, 0.005878088530153036, -0.0000955256909946911, 0.0846695750951767, 0.06528370082378387, 0.00988873839378357, 0.06878522038459778, 0.009914044290781021, 0.050777800381183624, -0.1582222580909729, 0.1902942657470703, -0.17462033033370972, -0.006124037783592939, 0.04608546197414398, -0.04495980963110924, -0.023298243060708046, 0.07057629525661469, 0.021834155544638634, 0.013988664373755455, 0.025315163657069206, -0.07338158041238785, 0.19240190088748932, -0.1172521635890007, 0.09386023133993149, -0.07542381435632706, 0.0844162255525589, 0.0085980873554945, -0.12261868268251419, 0.04575829580426216, 0.0650472342967987, -0.11190181970596313, -0.16845907270908356, 0.014592007733881474, -0.008226779289543629, -0.019512904807925224, 0.2946365177631378, -0.05636850371956825, 0.020828740671277046, -0.02858787775039673, -0.023491015657782555, -0.06113724410533905, 0.12097296863794327, -0.22244183719158173, -0.08907976746559143, 0.02762947790324688, 0.061712250113487244, 0.05929601565003395, -0.11875180155038834, 0.009384004399180412, -0.01933209039270878, -0.11352253705263138, -0.15558050572872162, 0.11705465614795685, -0.030527062714099884, 0.07990522682666779, -0.0683940052986145, -0.0566701740026474, 0.0306470338255167, -0.05481988564133644, -0.19337619841098785, 0.05112419277429581, -0.14528530836105347, -0.058231648057699203, -0.08400677144527435, 0.03704560548067093, 0.02135566435754299, 0.08302897214889526, 0.09655445069074631, -0.07143882662057877, -0.013100439682602882, -0.04874248802661896, 0.04857002571225166, 0.0029391320422291756, 0.027612466365098953, -0.06937441229820251, -0.03549528494477272, 0.032335635274648666, -0.09729230403900146, 0.01642708107829094, -0.055125340819358826, 0.01947598159313202, -0.016420168802142143, -0.0888388603925705, 0.08149109780788422, 0.23946775496006012, 0.09132719039916992, -0.03815348073840141, -0.03745106980204582, 0.10839206725358963, -0.12479321658611298, 0.013750423677265644, 0.19855134189128876, -0.043704695999622345, -0.00802633911371231, 0.16712145507335663, 0.011600970290601254, 0.032209381461143494, -0.017737459391355515, -0.07251168042421341, -0.0743137076497078, -0.15116280317306519, -0.10249511152505875, 
-0.11398136615753174, -0.003763978136703372, -0.24367310106754303, -0.006170955952256918, 0.021163633093237877, -0.006196060683578253, -0.04586390033364296, -0.06257446855306625, 0.12001203000545502, -0.035067494958639145, 0.19122464954853058, -0.023444749414920807, 0.10899977385997772, -0.0873153954744339, -0.05871474742889404, 0.07129126042127609, -0.08340200036764145, 0.07109910249710083, 0.07110624015331268, 0.17399954795837402, 0.06342422217130661, 0.10454482585191727, 0.1669088751077652, -0.02163274772465229, 0.08023757487535477, -0.009410420432686806, -0.024350719526410103, -0.03460435941815376, -0.014922008849680424, 0.03660198673605919, 0.2673306167125702, -0.12258779257535934, -0.025556307286024094, 0.0542852021753788, 0.040633104741573334, 0.009081481024622917, 0.15869565308094025, 0.005677484441548586, -0.005257788579910994, 0.057329557836055756, -0.029453802853822708, 0.012386813759803772, 0.08296529948711395, 0.19668595492839813, -0.09450705349445343, 0.10623116791248322, 0.13851642608642578, -0.003710041055455804, -0.0506676584482193, 0.05925832316279411, -0.17971846461296082, -0.04393954575061798, 0.0014015000779181719, 0.05256110057234764, -0.13720108568668365, 0.09830062836408615, 0.042460955679416656, 0.011099521070718765, -0.04195253551006317, -0.016307810321450233, 0.043415553867816925, 0.09292960911989212, 0.0772300660610199, -0.014751011505723, -0.024253347888588905, -0.07600458711385727, 0.016716552898287773, 0.017908308655023575, 0.1506735235452652, 0.11420102417469025, -0.008501295000314713, -0.004056105390191078, -0.03095938265323639, 0.0317729115486145, 0.015539547428488731, -0.10711634904146194, -0.0603158213198185, 0.029360607266426086, 0.17109353840351105, 0.07194717973470688, -0.005762074142694473, -0.007289446424692869, -0.1515314131975174, -0.06823226064443588, -0.10012929141521454, -0.015148255042731762, -0.042739387601614, -0.0983797088265419, 0.11763455718755722, -0.06057129427790642, -0.001071074279025197, 0.0649317130446434, 0.05630963295698166, 0.00849207118153572, 0.0005297065945342183, 0.054964326322078705, -0.026271434500813484, -0.1568024605512619, -0.027339650318026543, 0.19037634134292603, 0.05704401433467865, 0.12683087587356567, 0.07623226195573807, -0.03348380699753761, -0.01909937895834446, -0.055106014013290405, -0.009608379565179348, -0.03778316080570221, -0.09695321321487427, 0.10196413844823837, 0.016172420233488083, -0.17958882451057434, -0.05595609173178673, -0.07230693846940994, 0.15722456574440002, 0.1526012122631073, -0.03656524792313576, 0.09229539334774017, 0.21598993241786957, 0.00008917286322684959, -0.26734623312950134, -0.1271858513355255, 0.024203825742006302, 0.10903866589069366, -0.0201666671782732, -0.13031022250652313, 0.04371972009539604, -0.05899890884757042, -0.027453148737549782, -0.09407602250576019, -0.15185405313968658, -0.14008109271526337, 0.18038764595985413, -0.15598516166210175, 0.14718420803546906, -0.06798271089792252, -0.0940951481461525, -0.03328850120306015, 0.06257811933755875, 0.06861477345228195, -0.2657305896282196, 0.10751098394393921, 0.11700229346752167, -0.03897421434521675, 0.05119197815656662, 0.05092839524149895, 0.12613525986671448, 0.07740827649831772, -0.029665140435099602, -0.018826371058821678, -0.08107981085777283, 0.03796803206205368, 0.07532785087823868, 0.03725140914320946, -0.02423650398850441, 0.01796409860253334, -0.07720302790403366, -0.029516829177737236, -0.0489191859960556, -0.06613987684249878, 0.06720249354839325, -0.007015452720224857, -0.13078998029232025, 
-0.08441956341266632, -0.005894147325307131, 0.012504622340202332, 0.26897525787353516, -0.1765987128019333, -0.014142768457531929, 0.08393160998821259, 0.28069138526916504, -0.15938539803028107, 0.09922029823064804, -0.03667062148451805, -0.07838867604732513, 0.05143573880195618, -0.11916090548038483, 0.061401885002851486, 0.032108452171087265, 0.037541795521974564, 0.15865884721279144, 0.029855767264962196, -0.0706767588853836, 0.14348754286766052, 0.07490787655115128, -0.05395451933145523, -0.13284915685653687, -0.07864605635404587, -0.08451145142316818, -0.016071243211627007, 0.04591978341341019, 0.16948390007019043, 0.04203968867659569, 0.023071380332112312, -0.022271975874900818, -0.010000078938901424, -0.13055911660194397, 0.1737501323223114, 0.09689690917730331, 0.05937495455145836, -0.10171548277139664, 0.007118981331586838, -0.0066345189698040485, 0.09129492193460464, 0.013986222445964813, -0.017524704337120056, -0.10493069887161255, -0.06467939913272858, -0.17772240936756134, 0.10870981216430664, 0.0012380309635773301, -0.15969111025333405, -0.12183656543493271, -0.18747717142105103, -0.0459732823073864, 0.17876888811588287, 0.03156771883368492, 0.060430821031332016, -0.05039505288004875, -0.031549740582704544, 0.0018385356524959207, -0.014977182261645794, 0.05528390035033226, -0.03387671709060669, -0.16790714859962463, 0.14612728357315063, -0.0364200584590435, 0.12646989524364471, -0.07586506754159927, -0.03657875210046768, -0.03615117818117142, 0.07441524416208267, -0.06890185922384262, 0.011024873703718185, -0.11291981488466263, -0.04888433218002319, 0.03003697469830513, -0.09044045954942703, -0.010998260229825974, 0.017704328522086143, -0.06064692512154579, 0.06372462213039398, 0.015621757134795189, 0.07644522935152054, -0.030604517087340355, 0.03845059126615524, 0.022998973727226257, -0.047025494277477264, 0.09892997145652771, 0.11609505116939545, -0.09245054423809052, 0.133515402674675, -0.1147347167134285, -0.07676934450864792, 0.12114614248275757, 0.09944808483123779, -0.03795009106397629, 0.05387701466679573, -0.08972255885601044, 0.09559769928455353, 0.05549324303865433, -0.010150871239602566, 0.006562598515301943, 0.061316054314374924, 0.07786036282777786, -0.2230840027332306, -0.012921041809022427, -0.0802951231598854, 0.013562279753386974, 0.13163958489894867, 0.1625269502401352, 0.11294767260551453, -0.05786118283867836, -0.0012250798754394054, -0.011439365334808826, 0.04713081195950508, -0.014357766136527061, -0.032617807388305664, 0.003932176157832146, -0.11026401817798615, 0.0477365143597126, -0.060668569058179855, 0.06402170658111572, -0.09337925165891647, -0.03792071342468262, -0.02884357050061226, 0.10237971693277359, 0.057970933616161346, 0.031175803393125534, 0.13847176730632782, 0.09967842698097229, -0.022799097001552582, -0.04131809622049332, 0.03866980969905853, 0.00009823346772463992, 0.21327748894691467, -0.056702516973018646, -0.07430702447891235, 0.021282339468598366, 0.11218005418777466, 0.037584319710731506, -0.077293761074543, 0.02608048915863037, 0.10674886405467987, -0.1255227029323578, -0.10784005373716354, -0.08782515674829483, 0.06931870430707932, 0.12747691571712494, -0.03516261652112007, 0.02365746907889843, -0.003982521127909422, -0.10754325985908508, -0.1920069456100464, -0.06764382123947144, -0.06975390762090683, -0.07267848402261734, -0.026179255917668343, -0.053366195410490036, 0.021407268941402435, 0.14022047817707062, 0.004975157789885998, 0.021541552618145943, 0.08670338243246078, -0.13861647248268127, -0.11624281108379364, 
0.00757182203233242, -0.06203416734933853, 0.017229773104190826, -0.014654702506959438, -0.018843770027160645, 0.1792260855436325, -0.037071000784635544, 0.03890683501958847, 0.017276527360081673, -0.03919408842921257, -0.022923238575458527, -0.11561606824398041, -0.04986029118299484, -0.016982218250632286, 0.006482516415417194, 0.09723087400197983, 0.08651904761791229, 0.11101534962654114, -0.06411706656217575, 0.012181757017970085, 0.10341040045022964, -0.01823616586625576, -0.17296771705150604, -0.13836058974266052, -0.07001026719808578, 0.005172845907509327, 0.06383759528398514, -0.010362488217651844, -0.12592536211013794, 0.022855423390865326, 0.11106084287166595, 0.2384307086467743, -0.10980366915464401, 0.09649902582168579, 0.027111133560538292, -0.00443950854241848, -0.02871599793434143, -0.0716051310300827, 0.042222876101732254, 0.22170931100845337, 0.06170717254281044, -0.023546651005744934, -0.08923320472240448, -0.028580090031027794, -0.016738953068852425, 0.1000930517911911, -0.04139598086476326, -0.06792082637548447, 0.004206653218716383, 0.11871141940355301, -0.18113146722316742, -0.10268064588308334, -0.10154574364423752, -0.1090274304151535, 0.008439264260232449, 0.022257165983319283, -0.047710150480270386, 0.14687180519104004, -0.014144334942102432, -0.0917460024356842, 0.006841056514531374, 0.013629711233079433, 0.028423592448234558, -0.08623041212558746, 0.11797665804624557, 0.03334193304181099, -0.19796454906463623, -0.06838486343622208, -0.03444845229387283, 0.19153137505054474, 0.01657581888139248, 0.10232892632484436, 0.055266451090574265, 0.1840769499540329, 0.02430565468966961, -0.06946270912885666, 0.004106421489268541, 0.1227044090628624, -0.023750215768814087, 0.20383508503437042, 0.06064911559224129, -0.0956769734621048, 0.09333518892526627, 0.028215717524290085, -0.10605297982692719, -0.02704085409641266, 0.027287479490041733, -0.11039935052394867, 0.08402882516384125, -0.09513700753450394, -0.02646995522081852, -0.009035211987793446, 0.06275582313537598, -0.03738660365343094, -0.014484983868896961, -0.06077304109930992, -0.06091700494289398, -0.1962551474571228, -0.006890515796840191, -0.06261373311281204, 0.02771633118391037, -0.039997052401304245, -0.013022167608141899, -0.06947614252567291, -0.00248409784398973, -0.001460897154174745, 0.05950387939810753, 0.10150058567523956, -0.030648013576865196, 0.005821557715535164, -0.00028880007448606193, 0.049994006752967834, 0.1117662787437439, -0.08436021208763123, -0.1272096037864685 ]
null
null
transformers
# Vision Transformer (base-sized model) pre-trained with MAE

Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper [Masked Autoencoders Are Scalable Vision Learners](https://arxiv.org/abs/2111.06377) by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár and Ross Girshick, and first released in [this repository](https://github.com/facebookresearch/mae).

Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.

During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images, for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/vit-mae) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import AutoImageProcessor, ViTMAEForPreTraining
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-base')
model = ViTMAEForPreTraining.from_pretrained('facebook/vit-mae-base')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# the pre-training head returns the reconstruction loss, the boolean patch
# mask (1 for masked patches) and the indices that restore the original
# patch order
loss = outputs.loss
mask = outputs.mask
ids_restore = outputs.ids_restore
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2111-06377,
  author    = {Kaiming He and Xinlei Chen and Saining Xie and Yanghao Li and Piotr Doll{\'{a}}r and Ross B. Girshick},
  title     = {Masked Autoencoders Are Scalable Vision Learners},
  journal   = {CoRR},
  volume    = {abs/2111.06377},
  year      = {2021},
  url       = {https://arxiv.org/abs/2111.06377},
  eprinttype = {arXiv},
  eprint    = {2111.06377},
  timestamp = {Tue, 16 Nov 2021 12:12:31 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2111-06377.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
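Building on the snippet above, here is a hedged sketch of turning the predicted patches back into an image for a quick visual check. It assumes `ViTMAEForPreTraining` exposes an `unpatchify` method (as in the reference implementation) and reads the normalization statistics back from the processor; note that checkpoints trained with per-patch normalized targets will produce reconstructions that look off without extra de-normalization:

```python
import torch
import numpy as np

with torch.no_grad():
    # (batch, num_patches, patch_size**2 * 3) -> (batch, 3, H, W); `unpatchify`
    # is assumed to exist on the pre-training model, per the reference code
    reconstruction = model.unpatchify(outputs.logits)

# undo the processor's normalization so values land roughly in [0, 1]
mean = torch.tensor(processor.image_mean).view(1, 3, 1, 1)
std = torch.tensor(processor.image_std).view(1, 3, 1, 1)
pixels = (reconstruction * std + mean).clamp(0, 1)

# save the (rough) reconstruction for inspection
array = (pixels[0].permute(1, 2, 0).numpy() * 255).astype(np.uint8)
Image.fromarray(array).save("mae_reconstruction.png")
```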
{"license": "apache-2.0", "tags": ["vision"], "datasets": ["imagenet-1k"]}
null
facebook/vit-mae-base
[ "transformers", "pytorch", "tf", "vit_mae", "pretraining", "vision", "dataset:imagenet-1k", "arxiv:2111.06377", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2111.06377" ]
[]
TAGS #transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (base-sized model) pre-trained with MAE Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches. During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. ## Intended uses & limitations You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ### BibTeX entry and citation info
[ "# Vision Transformer (base-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (base-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 63, 122, 209, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (base-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:### BibTeX entry and citation info" ]
[ -0.0782487541437149, 0.04953200742602348, -0.006027778144925833, 0.04347856342792511, 0.09869049489498138, 0.025067951530218124, 0.08113323897123337, 0.10139619559049606, 0.014264069497585297, 0.0315779410302639, 0.06615355610847473, 0.009140807203948498, 0.13579994440078735, 0.16014987230300903, 0.11978048831224442, -0.2845557928085327, 0.00015225453535094857, -0.04124584048986435, 0.07445476949214935, 0.050609417259693146, 0.07569517940282822, -0.09029270708560944, 0.0858558639883995, 0.06829776614904404, -0.05488789081573486, -0.02159040980041027, -0.027729598805308342, -0.015466183423995972, 0.09856556355953217, 0.014074327424168587, 0.08919184654951096, -0.022117672488093376, 0.09152088314294815, -0.1441875845193863, 0.023688368499279022, 0.11900193989276886, -0.005240906961262226, 0.06165827810764313, 0.09248793870210648, 0.058294881135225296, 0.006662364583462477, -0.11052348464727402, 0.0024642697535455227, 0.05909518897533417, -0.09880974143743515, -0.17266321182250977, -0.09773941338062286, 0.0580659881234169, 0.06248932331800461, 0.06479208171367645, -0.004511834587901831, 0.04815306514501572, 0.07399853318929672, 0.047427948564291, 0.1573045551776886, -0.2517373561859131, -0.042187631130218506, 0.053971949964761734, -0.0018684465903788805, -0.013143750838935375, -0.07642994821071625, 0.013197493739426136, -0.005103005096316338, 0.048221200704574585, 0.12903578579425812, -0.01478376891463995, 0.0760473906993866, -0.07092619687318802, -0.13919569551944733, -0.049035195261240005, 0.017679158598184586, -0.026268379762768745, -0.118801049888134, -0.11149552464485168, -0.08466032147407532, 0.03148753196001053, 0.012948610819876194, -0.019269246608018875, 0.02064315415918827, 0.05443693697452545, 0.04655468091368675, -0.12814654409885406, -0.0980033352971077, -0.048272181302309036, -0.015839040279388428, 0.0748552456498146, 0.04195085167884827, 0.08378376066684723, -0.015730993822216988, 0.06696061789989471, -0.04475315660238266, -0.05437694489955902, -0.1059403270483017, -0.050805654376745224, -0.11626631766557693, -0.03864065185189247, 0.003495155368000269, -0.17369964718818665, -0.08930021524429321, 0.15754827857017517, -0.11919239908456802, 0.05120959132909775, 0.00966256856918335, 0.023217305541038513, 0.08436278998851776, 0.13917700946331024, -0.12575341761112213, 0.11620327830314636, 0.02696271240711212, -0.07345427572727203, 0.03453734889626503, -0.04678155109286308, -0.016829345375299454, 0.052260760217905045, -0.04728313162922859, 0.007803079206496477, 0.008523671887814999, 0.03940105065703392, -0.028071565553545952, -0.054577358067035675, 0.19695386290550232, -0.07157963514328003, -0.015379117801785469, 0.0048902989365160465, -0.04361308738589287, 0.0442633256316185, 0.09286966919898987, -0.0577721931040287, -0.06522990763187408, 0.05349327623844147, -0.04894497990608215, -0.02563687600195408, -0.06511412560939789, -0.08980128914117813, -0.02600155957043171, -0.12805619835853577, -0.09043355286121368, -0.1122552677989006, -0.15093421936035156, -0.012163596227765083, 0.06678412854671478, 0.0002487035235390067, 0.07712972164154053, 0.031400177627801895, -0.017736956477165222, -0.0045804050751030445, 0.04903171956539154, -0.04134582355618477, -0.0062477681785821915, 0.0018653637962415814, -0.09244316816329956, 0.09577411413192749, -0.002912685042247176, 0.014706115238368511, -0.08646384626626968, 0.0424288846552372, -0.1251603662967682, 0.0802096426486969, -0.005225706845521927, -0.05726924166083336, -0.10326289385557175, -0.05474233254790306, -0.0732252448797226, 
0.0018196121091023088, 0.0649375468492508, 0.08237351477146149, -0.1269034594297409, -0.023347511887550354, 0.17853565514087677, -0.14968222379684448, -0.006868202704936266, 0.09618276357650757, 0.002311010379344225, 0.05366206169128418, 0.038523197174072266, 0.04066464304924011, 0.13872744143009186, -0.11811087280511856, -0.11125344783067703, 0.059991732239723206, -0.09626862406730652, 0.04593472182750702, 0.034586936235427856, 0.01806759648025036, 0.02657637745141983, -0.00897086039185524, -0.04171525686979294, 0.003187577472999692, -0.027046237140893936, -0.04123249277472496, 0.009431962855160236, -0.031911514699459076, -0.0062544578686356544, 0.017940066754817963, -0.011561640538275242, 0.041447963565588, -0.07780441641807556, -0.1201377660036087, 0.0640248954296112, -0.04231340438127518, 0.05064438283443451, -0.0878133773803711, -0.012943745590746403, -0.05660700052976608, 0.006355787627398968, -0.13485319912433624, -0.023814938962459564, 0.07832523435354233, -0.10979387164115906, 0.017509320750832558, -0.058064404875040054, 0.03417236730456352, 0.08587866276502609, -0.011002717539668083, -0.05630522966384888, -0.007912680506706238, -0.0633995532989502, -0.02393745258450508, -0.09289859980344772, -0.10576429963111877, -0.06592931598424911, -0.0345282256603241, 0.023249471560120583, 0.009093762375414371, 0.03522587940096855, 0.045819252729415894, 0.04455682635307312, -0.07261044532060623, 0.04725086688995361, 0.00005504121872945689, -0.015657495707273483, -0.07641319930553436, 0.056897107511758804, 0.041681546717882156, 0.026404976844787598, 0.06355725973844528, -0.061672333627939224, -0.21115480363368988, 0.0384763665497303, -0.034782372415065765, -0.10372806340456009, 0.09574493020772934, -0.026520824059844017, -0.022716674953699112, -0.10504204779863358, -0.031621795147657394, 0.1790037900209427, 0.01987781934440136, 0.08145197480916977, -0.038055699318647385, 0.03106953576207161, 0.044828690588474274, -0.014414056204259396, -0.06597177684307098, -0.03182593733072281, 0.10205838084220886, -0.12730802595615387, 0.026273727416992188, -0.02347276173532009, 0.04591916874051094, 0.18870031833648682, 0.0538669116795063, -0.06785846501588821, -0.024885646998882294, 0.020166859030723572, 0.040075238794088364, 0.10402460396289825, -0.01959771290421486, -0.038306254893541336, 0.007278280798345804, 0.01360305305570364, 0.01908193528652191, -0.09690680354833603, 0.058954689651727676, 0.021008333191275597, -0.03533116728067398, 0.006424892693758011, -0.020475294440984726, -0.0662120133638382, 0.02619929052889347, 0.09275831282138824, 0.04572978988289833, 0.0018131444230675697, -0.03487558290362358, -0.0958632305264473, 0.14333246648311615, -0.09010535478591919, -0.2858790457248688, -0.1579711139202118, -0.025340138003230095, 0.0006219298811629415, 0.04903234541416168, 0.027333686128258705, -0.08809024095535278, -0.04902661219239235, -0.06576182693243027, 0.0767824649810791, -0.03930513188242912, -0.000651513400953263, 0.002155580557882786, -0.04486817121505737, 0.019478874281048775, -0.10272350162267685, 0.019071592018008232, -0.02614627778530121, -0.11768040806055069, 0.06634706258773804, 0.008537132292985916, 0.06950414925813675, 0.10408084094524384, 0.014611070975661278, 0.005772866774350405, -0.035865407437086105, 0.19842228293418884, -0.05949901044368744, 0.17593565583229065, 0.16919255256652832, -0.025232529267668724, 0.11069923639297485, 0.05520138144493103, 0.02355271577835083, -0.0646238625049591, 0.030554834753274918, 0.017197217792272568, -0.09535716474056244, -0.15857741236686707, 
-0.02526472695171833, -0.04791485145688057, 0.04144013300538063, 0.11490841954946518, 0.0259910449385643, -0.05834950879216194, 0.04700097069144249, -0.06293126195669174, 0.028318071737885475, -0.014900279231369495, 0.08137977123260498, -0.08493862301111221, -0.02790032885968685, 0.05235147848725319, -0.05149473994970322, 0.008157594129443169, 0.08611024916172028, 0.02789340168237686, 0.20972588658332825, -0.02943377196788788, 0.16983181238174438, 0.06299387663602829, 0.0683935210108757, 0.0357576422393322, 0.08334926515817642, -0.06944836676120758, 0.03003327175974846, -0.010012613609433174, -0.044227954000234604, 0.011671219021081924, 0.052309662103652954, 0.02635747753083706, 0.006348844151943922, 0.0015984541969373822, 0.033099859952926636, 0.03607229143381119, 0.3580698072910309, 0.11608675867319107, -0.12921994924545288, -0.1021258756518364, -0.028078917413949966, 0.01380960550159216, -0.09094315767288208, 0.0008543721633031964, 0.10076069831848145, -0.10809671133756638, 0.06201424077153206, -0.08928219228982925, 0.049381323158741, -0.13588039577007294, -0.008892368525266647, 0.08271609991788864, 0.15585245192050934, -0.025133419781923294, 0.027894331142306328, -0.1012813001871109, 0.09268336743116379, 0.020667852833867073, 0.10617295652627945, -0.06809129565954208, 0.05715785175561905, 0.07244914025068283, -0.005592493340373039, 0.14972634613513947, 0.03869163244962692, -0.11239416152238846, -0.10236739367246628, -0.04439988732337952, -0.005600297823548317, 0.13934902846813202, -0.0512954443693161, 0.06749344617128372, 0.0034538351465016603, 0.004124423488974571, -0.021103644743561745, 0.04359399899840355, -0.10643953084945679, -0.08892291784286499, 0.07570226490497589, -0.0487980842590332, -0.017830045893788338, -0.05507506802678108, 0.0025547335390001535, -0.054886918514966965, 0.1523175686597824, -0.10418577492237091, -0.06285306811332703, -0.10543885827064514, 0.009727724827826023, 0.012855412438511848, -0.07641007751226425, 0.06499271839857101, -0.0372639074921608, 0.15372242033481598, -0.08413602411746979, -0.10536514967679977, 0.020689772441983223, -0.07492858916521072, -0.0977526605129242, -0.06162203103303909, 0.07050953060388565, 0.12711843848228455, -0.026121605187654495, 0.010845975950360298, 0.0508190356194973, 0.028503496199846268, -0.07695937156677246, 0.1404908001422882, 0.039698418229818344, -0.07123485952615738, 0.1351495385169983, -0.014543197117745876, -0.05853929743170738, -0.0661616399884224, 0.010952671058475971, 0.03739529103040695, 0.09707070142030716, -0.08108426630496979, 0.09568415582180023, 0.15484941005706787, -0.10225669294595718, -0.2937088906764984, 0.0020226147025823593, 0.03792032226920128, 0.018391570076346397, -0.01619470678269863, -0.2991124093532562, 0.06094490364193916, 0.004503420554101467, -0.03221474215388298, 0.0357142835855484, -0.1962573379278183, -0.06001350283622742, 0.01256537064909935, 0.0742434412240982, 0.0855596736073494, -0.045202869921922684, -0.008216358721256256, 0.010607200674712658, 0.028295600786805153, 0.06255637109279633, -0.009708672761917114, 0.1076822355389595, -0.04226941987872124, -0.0833052471280098, 0.0036255877930670977, -0.04084444046020508, 0.06796287000179291, -0.04142408072948456, 0.04709842801094055, -0.018254781141877174, 0.09841436892747879, 0.10843794047832489, -0.04718198999762535, 0.10424728691577911, 0.08365973085165024, 0.0584021732211113, -0.008188285864889622, -0.019123679026961327, -0.04759785905480385, 0.06581762433052063, -0.016147766262292862, -0.08652530610561371, -0.08045364916324615, 
0.09326064586639404, 0.04291550815105438, 0.02402891218662262, 0.050488777458667755, 0.01876792311668396, -0.0037950107362121344, 0.12534883618354797, 0.024613238871097565, 0.011950185522437096, -0.12199192494153976, -0.040607694536447525, -0.01738317497074604, 0.10341907292604446, -0.11654603481292725, 0.027286261320114136, 0.04451873153448105, 0.05537683144211769, 0.0552920401096344, 0.051134057343006134, -0.1867385059595108, 0.04630283638834953, 0.051409315317869186, -0.08735977858304977, -0.17563112080097198, -0.011163327842950821, -0.034905146807432175, -0.07384822517633438, 0.04150404781103134, 0.15074940025806427, -0.03982551023364067, 0.009878150187432766, -0.007855276577174664, 0.05844882130622864, -0.018082164227962494, 0.06587204337120056, 0.030078718438744545, -0.0012076961575075984, -0.031560562551021576, 0.15803681313991547, 0.08213410526514053, 0.017225373536348343, 0.008090727031230927, 0.08524299412965775, -0.08426334708929062, -0.07488458603620529, -0.07656586915254593, -0.020395686849951744, 0.04204394295811653, -0.01135345734655857, 0.023885812610387802, -0.041517581790685654, 0.03150133043527603, 0.057378120720386505, 0.005370642989873886, 0.017666257917881012, -0.06544037163257599, 0.03097892552614212, -0.07028106600046158, 0.03653563931584358, -0.02879399061203003, 0.04262461140751839, -0.06630592048168182, 0.08052574098110199, 0.04702809825539589, 0.09836999326944351, -0.027928942814469337, -0.07389117777347565, -0.09496928751468658, 0.003793261246755719, -0.030294714495539665, 0.018313517794013023, -0.09463377296924591, -0.03318922221660614, -0.009267674759030342, 0.044005103409290314, 0.014347082935273647, 0.059499844908714294, -0.002215198939666152, -0.05776016786694527, -0.013157302513718605, 0.010953565128147602, -0.09342687577009201, -0.02083452045917511, 0.031685926020145416, -0.07945462316274643, 0.10215144604444504, -0.026849020272493362, -0.048867642879486084, -0.004869123920798302, -0.11827226728200912, -0.007144971285015345, -0.013489925302565098, 0.008518374525010586, -0.03930407017469406, -0.06238400191068649, 0.02159428410232067, -0.05174637213349342, -0.08053614944219589, -0.024621129035949707, 0.14483411610126495, -0.04873166233301163, 0.12224642932415009, -0.03508540987968445, 0.0840262770652771, -0.06533987075090408, 0.07457340508699417, 0.05969209596514702, 0.05414927378296852, 0.06328415870666504, -0.051184363663196564, 0.0999239832162857, -0.14619243144989014, -0.0038619651459157467, 0.0033726755063980818, -0.011453138664364815, 0.02944657765328884, -0.038577254861593246, 0.042642991989851, -0.025861650705337524, 0.06918664276599884, 0.0018621459603309631, 0.059565525501966476, 0.00395861454308033, -0.01625712774693966, -0.13923239707946777, 0.01922747865319252, 0.013355200178921223, -0.0283492524176836, -0.043348655104637146, 0.02796376869082451, 0.033328067511320114, -0.016848308965563774, 0.1385297328233719, 0.13010932505130768, 0.028794266283512115, 0.15635177493095398, 0.07929766923189163, 0.002809362020343542, -0.04671982675790787, -0.1695306897163391, -0.021420586854219437, -0.015356449410319328, 0.052800118923187256, -0.002076780190691352, -0.10306508094072342, 0.1018349677324295, -0.11380649358034134, 0.10208263248205185, 0.02996109426021576, -0.042150143533945084, -0.0714937224984169, -0.20135177671909332, 0.011906207539141178, -0.01105231512337923, 0.01696152798831463, -0.06132916361093521, 0.004973205272108316, 0.11463957279920578, -0.005158524494618177, -0.0021998656447976828, 0.1389290690422058, -0.08911852538585663, 
-0.05184917524456978, 0.08576676994562149, 0.01725202053785324, 0.013181260786950588, 0.056155212223529816, 0.02102258987724781, 0.05262530595064163, 0.015985677018761635, 0.0518161877989769, 0.023455984890460968, 0.060671161860227585, 0.07674437016248703, -0.007658896502107382, -0.0953640565276146, -0.000536163744982332, -0.006338189356029034, -0.035321954637765884, 0.14302749931812286, 0.040890976786613464, -0.03533066064119339, -0.0011466145515441895, 0.0858871340751648, -0.04767880588769913, 0.03233632445335388, -0.154985710978508, 0.22146937251091003, -0.03429800644516945, -0.029236888512969017, -0.0006413068622350693, -0.08513545989990234, -0.019268851727247238, 0.20491547882556915, 0.12698189914226532, 0.00642009824514389, 0.017061976715922356, 0.013435925357043743, -0.008734599687159061, 0.006522222887724638, 0.12678709626197815, 0.036636631935834885, 0.19181951880455017, -0.03908336162567139, 0.06293962150812149, -0.03667254373431206, -0.034405775368213654, -0.04938183352351189, 0.04902854561805725, -0.017414972186088562, 0.04057970270514488, -0.08633074909448624, 0.04016366973519325, -0.060110270977020264, -0.29522982239723206, 0.14038382470607758, -0.014997773803770542, -0.010311324149370193, 0.018483128398656845, -0.07120577991008759, 0.023632977157831192, 0.08281784504652023, 0.000372252514353022, 0.035629305988550186, 0.16332052648067474, 0.039352040737867355, -0.049122828990221024, -0.08198486268520355, 0.011244524270296097, -0.23927772045135498, 0.243594229221344, -0.0068511818535625935, -0.0086906086653471, 0.0721026360988617, 0.04779377207159996, -0.11909975111484528, -0.004285674076527357, -0.05112869292497635, -0.011211207136511803, -0.04602525010704994, 0.16381940245628357, -0.02676115185022354, 0.09537210315465927, 0.02148066833615303, -0.11459411680698395, 0.03718455508351326, 0.11139442771673203, 0.01471130270510912, -0.05586278811097145, 0.06432203203439713, -0.08413397520780563, 0.13385698199272156, 0.13882653415203094, 0.010040306486189365, -0.02520730532705784, -0.025158671662211418, 0.003116876119747758, -0.0021615298464894295, 0.06905379891395569, -0.03495399281382561, -0.10438171774148941, -0.00041809986578300595, -0.1049744039773941, 0.05374256148934364, -0.13008566200733185, -0.0832466334104538, 0.0160692036151886, -0.051995452493429184, -0.0113706449046731, 0.03502684831619263, 0.016724830493330956, 0.04690053313970566, -0.0332888588309288, 0.12557783722877502, -0.0057333651930093765, 0.0431746207177639, -0.09862829744815826, -0.0837254524230957 ]
null
null
transformers
# Vision Transformer (huge-sized model) pre-trained with MAE

Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper [Masked Autoencoders Are Scalable Vision Learners](https://arxiv.org/abs/2111.06377) by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár and Ross Girshick, and first released in [this repository](https://github.com/facebookresearch/mae).

Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.

During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images, for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.

## Intended uses & limitations

You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/vit-mae) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import AutoImageProcessor, ViTMAEForPreTraining
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-huge')
model = ViTMAEForPreTraining.from_pretrained('facebook/vit-mae-huge')

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# the pre-training head returns the reconstruction loss, the boolean patch
# mask (1 for masked patches) and the indices that restore the original
# patch order
loss = outputs.loss
mask = outputs.mask
ids_restore = outputs.ids_restore
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2111-06377,
  author    = {Kaiming He and Xinlei Chen and Saining Xie and Yanghao Li and Piotr Doll{\'{a}}r and Ross B. Girshick},
  title     = {Masked Autoencoders Are Scalable Vision Learners},
  journal   = {CoRR},
  volume    = {abs/2111.06377},
  year      = {2021},
  url       = {https://arxiv.org/abs/2111.06377},
  eprinttype = {arXiv},
  eprint    = {2111.06377},
  timestamp = {Tue, 16 Nov 2021 12:12:31 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2111-06377.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
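Since the model description above suggests training a classifier by placing a linear layer on top of the pre-trained encoder, here is a minimal sketch of that setup under stated assumptions: the `mask_ratio=0.0` override (so the encoder sees every patch at inference) and the 10-class head are illustrative choices, not part of the released model:

```python
import torch
from torch import nn
from transformers import AutoImageProcessor, ViTMAEModel

processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-huge')
# mask_ratio=0.0 disables the random patch masking used during pre-training
# (assumption: the config accepts this override at load time)
encoder = ViTMAEModel.from_pretrained('facebook/vit-mae-huge', mask_ratio=0.0)

num_classes = 10  # hypothetical downstream label count
classifier = nn.Linear(encoder.config.hidden_size, num_classes)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state  # (batch, 1 + num_patches, hidden)
logits = classifier(hidden[:, 0])  # use the [CLS] token as the image embedding
```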
{"license": "apache-2.0", "tags": ["vision"], "datasets": ["imagenet-1k"]}
null
facebook/vit-mae-huge
[ "transformers", "pytorch", "tf", "vit_mae", "pretraining", "vision", "dataset:imagenet-1k", "arxiv:2111.06377", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2111.06377" ]
[]
TAGS #transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #region-us
# Vision Transformer (huge-sized model) pre-trained with MAE Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches. During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. ## Intended uses & limitations You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ### BibTeX entry and citation info
[ "# Vision Transformer (huge-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #region-us \n", "# Vision Transformer (huge-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 59, 123, 209, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #region-us \n# Vision Transformer (huge-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:### BibTeX entry and citation info" ]
[ -0.06969593465328217, 0.02777443267405033, -0.005734664388000965, 0.04598821699619293, 0.09899245202541351, 0.020768264308571815, 0.06993316113948822, 0.0970543771982193, -0.014714290387928486, 0.03177574649453163, 0.0543697290122509, 0.008870313875377178, 0.13956491649150848, 0.15387549996376038, 0.09727491438388824, -0.2603204548358917, -0.005923971068114042, -0.02859489992260933, 0.0694626048207283, 0.06102512776851654, 0.09070945531129837, -0.08328301459550858, 0.08410573750734329, 0.068511463701725, -0.07718689739704132, -0.009088427759706974, -0.021061964333057404, -0.023659169673919678, 0.10806328058242798, 0.027100281789898872, 0.10165795683860779, -0.012716436758637428, 0.07383880764245987, -0.13773943483829498, 0.02768503688275814, 0.12854206562042236, -0.0158573929220438, 0.05859881266951561, 0.09943479299545288, 0.0546833798289299, 0.0031939928885549307, -0.1060837134718895, 0.00495243351906538, 0.06587957590818405, -0.09747771918773651, -0.17325134575366974, -0.09290093183517456, 0.06545935571193695, 0.04145980626344681, 0.05676916241645813, -0.005143413320183754, 0.06902995705604553, 0.04315035790205002, 0.048562515527009964, 0.17374767363071442, -0.25390851497650146, -0.03491224721074104, 0.06870950013399124, -0.010280410759150982, 0.005575125105679035, -0.07341541349887848, 0.013786977156996727, -0.0017214954132214189, 0.047431498765945435, 0.15425896644592285, -0.0068013425916433334, 0.04121111333370209, -0.06398743391036987, -0.12226102501153946, -0.05729545280337334, -0.0005469436291605234, -0.026749279350042343, -0.10612966120243073, -0.11984833329916, -0.09260779619216919, 0.03326278179883957, 0.02828850783407688, -0.03455820679664612, 0.01843409053981304, 0.05873994901776314, 0.04179052263498306, -0.1363215297460556, -0.09266407042741776, -0.049893032759428024, -0.02391551434993744, 0.07507918775081635, 0.05344489589333534, 0.0741569995880127, -0.018656155094504356, 0.09265883266925812, -0.05498432368040085, -0.05102474242448807, -0.10455334186553955, -0.04479791969060898, -0.11972695589065552, -0.024380413815379143, 0.004222684074193239, -0.1598963588476181, -0.09121552854776382, 0.15121811628341675, -0.14066985249519348, 0.04916754737496376, 0.0012487707426771522, 0.02371005155146122, 0.05661928653717041, 0.13005703687667847, -0.1180083379149437, 0.12299104034900665, 0.04144364967942238, -0.08085639029741287, 0.028976593166589737, -0.051520999521017075, -0.020804163068532944, 0.04472542181611061, -0.02460525743663311, 0.017007142305374146, -0.0037143679801374674, 0.03803975507616997, -0.04302256181836128, -0.058568090200424194, 0.191628560423851, -0.0790884792804718, -0.021012354642152786, 0.007779791951179504, -0.03988795727491379, 0.03594537079334259, 0.07952473312616348, -0.06645656377077103, -0.09658241271972656, 0.049762461334466934, -0.05134991183876991, -0.03147825971245766, -0.06584849208593369, -0.09725914150476456, -0.026259804144501686, -0.1503976285457611, -0.08254041522741318, -0.12988100945949554, -0.16277551651000977, -0.012783150188624859, 0.04832617565989494, 0.003011660650372505, 0.05697884038090706, 0.04081527888774872, -0.01545842457562685, -0.020145857706665993, 0.035122308880090714, -0.03421638160943985, -0.002781753893941641, 0.005558914504945278, -0.09452760964632034, 0.07978049665689468, 0.0009188763215206563, 0.02421228401362896, -0.08824247121810913, 0.04840242490172386, -0.10416177660226822, 0.10535534471273422, -0.0011274650460109115, -0.06473476439714432, -0.09869664907455444, -0.057655300945043564, -0.06337237358093262, 
-0.00248688249848783, 0.06714778393507004, 0.07164627313613892, -0.10735936462879181, -0.030654551461338997, 0.17536136507987976, -0.16046562790870667, -0.01438556145876646, 0.10522173345088959, -0.0009439005516469479, 0.045542776584625244, 0.0569351501762867, 0.06400987505912781, 0.14789746701717377, -0.10625122487545013, -0.10220015048980713, 0.05358703061938286, -0.10119624435901642, 0.02908254787325859, 0.039108701050281525, 0.030537381768226624, 0.024926641955971718, -0.009561584331095219, -0.047326892614364624, 0.002552381483837962, -0.027390439063310623, -0.04564696550369263, 0.0012391642667353153, -0.031034080311655998, -0.017406221479177475, 0.02124251425266266, -0.004475940950214863, 0.015833696350455284, -0.09207946062088013, -0.1178799569606781, 0.05731157958507538, -0.048874516040086746, 0.06615360826253891, -0.09559357911348343, -0.004598807077854872, -0.03817254304885864, 0.012048865668475628, -0.14537008106708527, -0.03599638491868973, 0.08057713508605957, -0.12014946341514587, 0.011046850122511387, -0.0647728443145752, 0.026371587067842484, 0.07517766207456589, -0.032331738620996475, -0.06157276779413223, -0.030765831470489502, -0.062405068427324295, -0.024839160963892937, -0.07225805521011353, -0.10337469726800919, -0.06183556094765663, -0.053652793169021606, 0.03694764897227287, 0.023550212383270264, 0.05626622959971428, 0.053288642317056656, 0.0462847501039505, -0.07457610219717026, 0.044643584638834, 0.008715874515473843, -0.0164496973156929, -0.0859856978058815, 0.053691986948251724, 0.04096922650933266, 0.011048604734241962, 0.09278663992881775, -0.06102069467306137, -0.17460061609745026, 0.04447769373655319, -0.023373084142804146, -0.1053963378071785, 0.08263137191534042, -0.01991898939013481, -0.02837355248630047, -0.11191204190254211, -0.0381343699991703, 0.18828922510147095, 0.0039030408952385187, 0.08714286237955093, -0.058299511671066284, 0.023899424821138382, 0.034927502274513245, 0.0008236544672399759, -0.05408675596117973, -0.01720520481467247, 0.07584822922945023, -0.11163589358329773, 0.03809710964560509, -0.006324307527393103, 0.03694786876440048, 0.18765464425086975, 0.04417332634329796, -0.06578784435987473, -0.013163394294679165, 0.015678852796554565, 0.039520736783742905, 0.07866847515106201, -0.017873363569378853, -0.030240200459957123, 0.009304315783083439, 0.022967252880334854, 0.02520735375583172, -0.1093454584479332, 0.055552393198013306, 0.03132735565304756, -0.03796174004673958, 0.008043383248150349, -0.021479368209838867, -0.05062809959053993, 0.028430016711354256, 0.086832195520401, 0.05425604060292244, 0.01835111901164055, -0.03239898383617401, -0.09077675640583038, 0.1565701961517334, -0.09046778827905655, -0.28373774886131287, -0.15453816950321198, -0.010378831997513771, 0.002062845276668668, 0.03301842138171196, 0.041843172162771225, -0.08821532130241394, -0.054154347628355026, -0.07150010019540787, 0.10190413147211075, -0.024891570210456848, -0.003991744481027126, 0.005235269665718079, -0.038758303970098495, 0.01371587160974741, -0.11280989646911621, 0.01836208999156952, -0.021921318024396896, -0.11583535373210907, 0.07189778983592987, -0.002368048531934619, 0.06618879735469818, 0.11314091831445694, 0.0015538674779236317, 0.0070900339633226395, -0.04368138685822487, 0.18605583906173706, -0.05273517221212387, 0.17221465706825256, 0.17662586271762848, -0.01357471663504839, 0.10858172178268433, 0.039826977998018265, 0.022644037380814552, -0.057207562029361725, 0.03448900952935219, 0.01941533014178276, -0.10701196640729904, 
-0.18507611751556396, -0.025578154250979424, -0.04761188477277756, 0.03467794507741928, 0.10167347639799118, 0.024511946365237236, -0.0505036860704422, 0.059066515415906906, -0.06398715078830719, 0.023938819766044617, -0.012465342879295349, 0.07529676705598831, -0.05571340397000313, -0.01940390281379223, 0.055136099457740784, -0.05275358632206917, 0.023455694317817688, 0.0934586226940155, 0.02938825823366642, 0.20016014575958252, -0.058316197246313095, 0.18719705939292908, 0.06038398668169975, 0.08797041326761246, 0.04980623722076416, 0.09359066188335419, -0.07232584804296494, 0.031728167086839676, -0.011674375273287296, -0.03978761285543442, 0.016266729682683945, 0.05154256150126457, 0.019733687862753868, 0.006926706060767174, -0.0041799298487603664, 0.020413044840097427, 0.036482736468315125, 0.3544841706752777, 0.11760059744119644, -0.1321507841348648, -0.10959923267364502, -0.025172607973217964, 0.004671209491789341, -0.08786024153232574, 0.01459164172410965, 0.13356606662273407, -0.11378540098667145, 0.04578749090433121, -0.08085372298955917, 0.06276613473892212, -0.11804049462080002, -0.011272011324763298, 0.0682157501578331, 0.17308646440505981, -0.031185491010546684, 0.03174176439642906, -0.12404955923557281, 0.08279120177030563, 0.016542062163352966, 0.09344504028558731, -0.06479837745428085, 0.050548844039440155, 0.07613328099250793, 0.0012829946354031563, 0.13788679242134094, 0.03440419211983681, -0.08818916976451874, -0.08971050381660461, -0.04387969523668289, 0.00043815476237796247, 0.12263946235179901, -0.05849599465727806, 0.0594177171587944, 0.006353697739541531, 0.01864437758922577, -0.013848001137375832, 0.04456159099936485, -0.09929432719945908, -0.0923299789428711, 0.058920446783304214, -0.03234792500734329, 0.004656753037124872, -0.06498250365257263, -0.014324650168418884, -0.06275198608636856, 0.16437941789627075, -0.10757709294557571, -0.08223847299814224, -0.11036161333322525, 0.01317981630563736, 0.0012763128615915775, -0.08530225604772568, 0.06482043117284775, -0.0324847549200058, 0.13968344032764435, -0.08960489928722382, -0.11320172250270844, 0.02689230628311634, -0.07401770353317261, -0.11723218113183975, -0.04387231543660164, 0.07871339470148087, 0.12055733799934387, -0.01862930692732334, 0.006094467826187611, 0.027825243771076202, 0.02553144283592701, -0.07410042732954025, 0.12825602293014526, 0.03518335148692131, -0.06159990653395653, 0.15603193640708923, -0.001781489816494286, -0.06640055775642395, -0.07243785262107849, -0.00008588148193666711, 0.05669666454195976, 0.09123767167329788, -0.0780360996723175, 0.10270800441503525, 0.14933045208454132, -0.08910471200942993, -0.29754215478897095, -0.00132872408721596, 0.029406903311610222, 0.02858954295516014, 0.003502008505165577, -0.280226469039917, 0.05355255305767059, 0.011779174208641052, -0.020940134301781654, 0.015463773161172867, -0.2019667625427246, -0.07074619829654694, 0.018275119364261627, 0.07729227095842361, 0.09233839064836502, -0.04285227879881859, -0.011427335441112518, 0.030232828110456467, 0.01658041961491108, 0.07534971833229065, -0.023158850148320198, 0.10161510854959488, -0.04125363752245903, -0.09082221984863281, 0.004272371996194124, -0.03812551498413086, 0.069652259349823, -0.04784804582595825, 0.05657150596380234, -0.021745529025793076, 0.10545039176940918, 0.11218705028295517, -0.04860897734761238, 0.08607759326696396, 0.058723077178001404, 0.07303594797849655, 0.004759375937283039, -0.03048587031662464, -0.05484645441174507, 0.07779283821582794, -0.013568410649895668, 
-0.09872464835643768, -0.10000281035900116, 0.09242474287748337, 0.048074617981910706, 0.01308424025774002, 0.07446420192718506, 0.020720355212688446, 0.018222270533442497, 0.15714764595031738, 0.025774063542485237, -0.003007256193086505, -0.12502604722976685, -0.05254295468330383, -0.018164195120334625, 0.11555232107639313, -0.1281069815158844, 0.025726329535245895, 0.050140488892793655, 0.054994624108076096, 0.06015930324792862, 0.055048003792762756, -0.19377322494983673, 0.05763046443462372, 0.043828774243593216, -0.08063588291406631, -0.16788068413734436, -0.029182160273194313, -0.03244446963071823, -0.056433141231536865, 0.037060026079416275, 0.14262701570987701, -0.049363721162080765, 0.003457812825217843, -0.009092763997614384, 0.05323731526732445, -0.01582658663392067, 0.07769252359867096, 0.031449008733034134, 0.009217994287610054, -0.04574831947684288, 0.1479269564151764, 0.0810711681842804, -0.015842869877815247, 0.010578223504126072, 0.08829642832279205, -0.09264836460351944, -0.06528904289007187, -0.05272062495350838, -0.042374320328235626, 0.03946347534656525, -0.026963653042912483, 0.01761613041162491, -0.06981466710567474, 0.04497212916612625, 0.061223678290843964, 0.024180535227060318, 0.027969341725111008, -0.05888574942946434, 0.016066215932369232, -0.062401141971349716, 0.04179072007536888, -0.014673233032226562, 0.04284153878688812, -0.0803365483880043, 0.08687495440244675, 0.06436538696289062, 0.11629942059516907, -0.02836243249475956, -0.07053330540657043, -0.09994727373123169, 0.008745850995182991, -0.058272358030080795, 0.017927322536706924, -0.0923561081290245, -0.02287093736231327, -0.019179942086338997, 0.04564552381634712, 0.008372500538825989, 0.07673297077417374, -0.005007375963032246, -0.06093323603272438, -0.017106276005506516, 0.012294510379433632, -0.08760812133550644, -0.01337486319243908, 0.032196711748838425, -0.07762335240840912, 0.11182421445846558, -0.030798878520727158, -0.06549908220767975, 0.003039520001038909, -0.10857731103897095, -0.0029716722201555967, -0.013247983530163765, 0.007211936637759209, -0.03598707914352417, -0.05936753377318382, 0.024148572236299515, -0.04880618304014206, -0.10334663093090057, -0.02213866263628006, 0.15794764459133148, -0.05675983428955078, 0.12423327565193176, -0.039037689566612244, 0.0740765929222107, -0.06714602559804916, 0.08093828707933426, 0.05105094611644745, 0.049622129648923874, 0.06013241782784462, -0.04665185138583183, 0.09617548435926437, -0.14892558753490448, -0.012450248934328556, 0.009961436502635479, -0.006489556282758713, 0.01896209642291069, -0.03517301380634308, 0.04187973216176033, -0.025832995772361755, 0.07136256992816925, 0.002737184753641486, 0.015186380594968796, 0.010547042824327946, -0.012202247977256775, -0.13084633648395538, 0.016300493851304054, 0.01950775645673275, -0.04182716831564903, -0.046976786106824875, 0.03296174854040146, 0.01703306846320629, -0.010209991596639156, 0.1384645253419876, 0.14780710637569427, 0.05026252195239067, 0.12705448269844055, 0.07679207623004913, -0.009761566296219826, -0.03777812793850899, -0.17100626230239868, -0.012278813868761063, -0.013391264714300632, 0.040917497128248215, 0.0019108904525637627, -0.1148819625377655, 0.10576226562261581, -0.10700517147779465, 0.11623885482549667, 0.043534159660339355, -0.03703884407877922, -0.07366982847452164, -0.19048461318016052, 0.0045862579718232155, -0.029521247372031212, 0.010515766218304634, -0.06270692497491837, 0.0005518845864571631, 0.15899930894374847, 0.004382527433335781, -0.007205276750028133, 
0.1597679853439331, -0.09020916372537613, -0.06526507437229156, 0.08265740424394608, 0.020762430503964424, 0.007125422824174166, 0.05665039271116257, 0.012797144241631031, 0.05217501521110535, 0.02712494321167469, 0.046672966331243515, 0.022855451330542564, 0.06571970880031586, 0.06491434574127197, -0.0160608422011137, -0.08769199252128601, 0.0009358664974570274, 0.00031406100606545806, -0.038644175976514816, 0.1462499499320984, 0.03718520700931549, -0.038245536386966705, -0.003435111604630947, 0.09373179078102112, -0.04022333025932312, 0.03190549090504646, -0.1643267422914505, 0.24452504515647888, -0.030735019594430923, -0.03833063319325447, 0.02978765405714512, -0.08410432189702988, -0.01985730044543743, 0.2089782953262329, 0.12677159905433655, 0.0022038305178284645, 0.01970333233475685, 0.0011479893000796437, -0.008813259191811085, 0.007916350848972797, 0.14541210234165192, 0.04027777165174484, 0.17973987758159637, -0.036300525069236755, 0.0636284351348877, -0.03907523676753044, -0.036903008818626404, -0.05317132547497749, 0.052462801337242126, -0.023918932303786278, 0.02891971729695797, -0.09079615026712418, 0.03733564913272858, -0.05141954496502876, -0.28270667791366577, 0.15988056361675262, -0.03193192556500435, -0.012260176241397858, 0.003784388769418001, -0.07255061715841293, 0.011322560720145702, 0.07712458819150925, 0.007426882162690163, 0.0428488589823246, 0.14661645889282227, 0.04331762716174126, -0.05687817931175232, -0.06894459575414658, 0.028836622834205627, -0.2314245104789734, 0.22678349912166595, -0.006731012370437384, -0.001998533960431814, 0.0750131830573082, 0.04515784606337547, -0.10905438661575317, 0.015067067928612232, -0.04249091073870659, -0.0046367705799639225, -0.05367172136902809, 0.15280917286872864, -0.027848150581121445, 0.08801791071891785, 0.020983094349503517, -0.13152408599853516, 0.023505307734012604, 0.10791218280792236, 0.013613141141831875, -0.06713822484016418, 0.056615009903907776, -0.08253300189971924, 0.12651695311069489, 0.14866112172603607, 0.009346729144454002, -0.022684292867779732, -0.03248799964785576, -0.0004365649656392634, -0.0055605205707252026, 0.07386703789234161, -0.017817987129092216, -0.10606548190116882, 0.008615298196673393, -0.11034270375967026, 0.054769158363342285, -0.14948494732379913, -0.07662700116634369, 0.013653935864567757, -0.05264176055788994, -0.022046605125069618, 0.046536635607481, 0.0308106429874897, 0.041888508945703506, -0.028044916689395905, 0.11165084689855576, -0.013482174836099148, 0.041475072503089905, -0.09037361294031143, -0.07560724765062332 ]
null
null
transformers
# Vision Transformer (large-sized model) pre-trained with MAE Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper [Masked Autoencoders Are Scalable Vision Learners](https://arxiv.org/abs/2111.06377) by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in [this repository](https://github.com/facebookresearch/mae). Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches. During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/vit-mae) to look for fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ```python from transformers import AutoImageProcessor, ViTMAEForPreTraining from PIL import Image import requests url = 'http://images.cocodataset.org/val2017/000000039769.jpg' image = Image.open(requests.get(url, stream=True).raw) processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-large') model = ViTMAEForPreTraining.from_pretrained('facebook/vit-mae-large') inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) loss = outputs.loss mask = outputs.mask ids_restore = outputs.ids_restore ``` ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2111-06377, author = {Kaiming He and Xinlei Chen and Saining Xie and Yanghao Li and Piotr Doll{\'{a}}r and Ross B. Girshick}, title = {Masked Autoencoders Are Scalable Vision Learners}, journal = {CoRR}, volume = {abs/2111.06377}, year = {2021}, url = {https://arxiv.org/abs/2111.06377}, eprinttype = {arXiv}, eprint = {2111.06377}, timestamp = {Tue, 16 Nov 2021 12:12:31 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-2111-06377.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
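The `loss`, `mask`, and `ids_restore` outputs above come from the pre-training objective: the decoder predicts raw pixel values for the 75% of patches that were masked. As an illustration only, the sketch below pastes those per-patch predictions back into the visible patches of the input. It assumes the `patchify`/`unpatchify` helper methods on the ViTMAE implementation, and note that when `norm_pix_loss` is enabled in the config, the predictions live in a per-patch normalized space rather than raw pixel values.

```python
import torch
from transformers import AutoImageProcessor, ViTMAEForPreTraining
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-large')
model = ViTMAEForPreTraining.from_pretrained('facebook/vit-mae-large')

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# patchify/unpatchify (assumed helpers) convert between (B, C, H, W)
# images and (B, num_patches, patch_dim) sequences of flattened patches.
patches = model.patchify(inputs["pixel_values"])
mask = outputs.mask.unsqueeze(-1)  # 1 = masked patch, 0 = visible patch

# keep the visible patches from the input, paste the decoder's
# predictions (outputs.logits) into the masked positions only
reconstruction = model.unpatchify(patches * (1 - mask) + outputs.logits * mask)
```

This makes the shapes concrete: `outputs.mask` has one entry per patch, and `outputs.logits` has one flattened pixel prediction per patch, which `unpatchify` folds back into an image tensor.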
{"license": "apache-2.0", "tags": ["vision"], "datasets": ["imagenet-1k"]}
null
facebook/vit-mae-large
[ "transformers", "pytorch", "tf", "vit_mae", "pretraining", "vision", "dataset:imagenet-1k", "arxiv:2111.06377", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2111.06377" ]
[]
TAGS #transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us
# Vision Transformer (large-sized model) pre-trained with MAE Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches. During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. ## Intended uses & limitations You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you. ### How to use Here is how to use this model: ### BibTeX entry and citation info
[ "# Vision Transformer (large-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ "TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n", "# Vision Transformer (large-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.", "## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.", "## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.", "### How to use\n\nHere is how to use this model:", "### BibTeX entry and citation info" ]
[ 63, 123, 209, 41, 13, 11 ]
[ "passage: TAGS\n#transformers #pytorch #tf #vit_mae #pretraining #vision #dataset-imagenet-1k #arxiv-2111.06377 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Vision Transformer (large-sized model) pre-trained with MAE\n\nVision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in this repository. \n\nDisclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nThe Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches.\n\nDuring pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for\nfine-tuned versions on a task that interests you.### How to use\n\nHere is how to use this model:### BibTeX entry and citation info" ]
[ -0.0752921923995018, 0.05250300094485283, -0.00619417754933238, 0.05170975998044014, 0.10161279886960983, 0.026028677821159363, 0.08563998341560364, 0.10269937664270401, 0.00967155210673809, 0.023988481611013412, 0.051747336983680725, 0.020515572279691696, 0.13314735889434814, 0.15856677293777466, 0.12171817570924759, -0.2903185486793518, -0.008101249113678932, -0.03986109048128128, 0.07274080067873001, 0.05325108394026756, 0.08238057792186737, -0.0855676680803299, 0.08622892200946808, 0.06714259088039398, -0.053303513675928116, -0.018908465281128883, -0.02298637107014656, -0.02191358618438244, 0.1068868488073349, 0.0236364733427763, 0.08059749007225037, -0.021987175568938255, 0.08811227977275848, -0.13866828382015228, 0.025168225169181824, 0.12506785988807678, -0.01323787309229374, 0.05719875544309616, 0.095401830971241, 0.05070124939084053, 0.027177628129720688, -0.10921938717365265, 0.012336778454482555, 0.06068141758441925, -0.10425829142332077, -0.16957277059555054, -0.09125640243291855, 0.0760778859257698, 0.06547527760267258, 0.06682651489973068, -0.004857392981648445, 0.059695083647966385, 0.0704536959528923, 0.055840764194726944, 0.16547124087810516, -0.25657907128334045, -0.0454305037856102, 0.04576520249247551, -0.00746434461325407, -0.006782127544283867, -0.06937950104475021, 0.014837312512099743, -0.006533517036587, 0.049274738878011703, 0.12437083572149277, -0.008694680407643318, 0.08158086985349655, -0.06145593151450157, -0.14290697872638702, -0.05021854490041733, 0.0300965066999197, -0.0255892314016819, -0.11911723762750626, -0.11526259034872055, -0.08021137118339539, 0.04028412699699402, 0.015011905692517757, -0.02440909668803215, 0.01834845542907715, 0.05062618851661682, 0.04612058773636818, -0.12970076501369476, -0.10115145146846771, -0.0420537143945694, -0.013622380793094635, 0.06773266196250916, 0.045805998146533966, 0.07630780339241028, -0.01622937247157097, 0.06965311616659164, -0.03972379118204117, -0.04892871156334877, -0.10341570526361465, -0.05339087173342705, -0.12070507556200027, -0.04480795934796333, 0.002225476084277034, -0.1717466115951538, -0.08986708521842957, 0.16262120008468628, -0.10455285757780075, 0.042098257690668106, 0.004482456482946873, 0.02008420042693615, 0.08148545026779175, 0.13825476169586182, -0.12714417278766632, 0.11689122766256332, 0.029203183948993683, -0.08109559118747711, 0.028491610661149025, -0.04761657118797302, -0.021388527005910873, 0.04792872816324234, -0.04302060976624489, 0.015137113630771637, 0.010073222219944, 0.04433596134185791, -0.024630747735500336, -0.06160438805818558, 0.19419288635253906, -0.07306807488203049, -0.02441317029297352, -0.003388660727068782, -0.046116311103105545, 0.04104384034872055, 0.10137193650007248, -0.05656134709715843, -0.07503212243318558, 0.049111153930425644, -0.047526828944683075, -0.03168921172618866, -0.06250569969415665, -0.08929190039634705, -0.021798821166157722, -0.1267346888780594, -0.0909920334815979, -0.12944217026233673, -0.1448405236005783, -0.006946423090994358, 0.06638378649950027, -0.0019514545565471053, 0.06808522343635559, 0.036893729120492935, -0.021710675209760666, -0.010584943927824497, 0.05022438243031502, -0.039848923683166504, -0.007580074947327375, 0.0030616612639278173, -0.09923293441534042, 0.09438096731901169, 0.004368856083601713, 0.014658808708190918, -0.0832982286810875, 0.03418105095624924, -0.13367582857608795, 0.07633651793003082, -0.002193683525547385, -0.055345308035612106, -0.10548346489667892, -0.05426948517560959, -0.06729841232299805, 
0.0026563466526567936, 0.05678851902484894, 0.08043930679559708, -0.13239288330078125, -0.0202210433781147, 0.1816137284040451, -0.15217941999435425, -0.024900708347558975, 0.10051081329584122, 0.002996268216520548, 0.05830763280391693, 0.04536100849509239, 0.038707759231328964, 0.14401228725910187, -0.12491340935230255, -0.10178100317716599, 0.05836309492588043, -0.09955082833766937, 0.05054232105612755, 0.034664999693632126, 0.024897655472159386, 0.026845969259738922, -0.012794149108231068, -0.044074878096580505, 0.010361275635659695, -0.027442021295428276, -0.037685904651880264, 0.007677275687456131, -0.04062410444021225, -0.0006332742632366717, 0.021558161824941635, -0.012864994816482067, 0.03692653030157089, -0.08235400170087814, -0.11553670465946198, 0.06114574149250984, -0.038882240653038025, 0.04953090846538544, -0.0937080904841423, -0.012241393327713013, -0.039744649082422256, 0.0031884938944131136, -0.1368667036294937, -0.03404255956411362, 0.0751085951924324, -0.10180432349443436, 0.02841923199594021, -0.05171271041035652, 0.03386257588863373, 0.08789512515068054, -0.02234570123255253, -0.054136428982019424, -0.009487410075962543, -0.056904055178165436, -0.019689494743943214, -0.10038191080093384, -0.09955843538045883, -0.06145649030804634, -0.026281479746103287, 0.017887724563479424, 0.008025752380490303, 0.03478730842471123, 0.05910472199320793, 0.05206705629825592, -0.06830065697431564, 0.046722281724214554, -0.0035897765774279833, -0.019069228321313858, -0.07918466627597809, 0.05540738254785538, 0.043217454105615616, 0.02571641094982624, 0.07402239739894867, -0.0673782005906105, -0.19974888861179352, 0.04436729848384857, -0.04239170253276825, -0.11379208415746689, 0.09972193092107773, -0.02966034784913063, -0.02023724466562271, -0.10111416131258011, -0.0387393943965435, 0.17403842508792877, 0.015984520316123962, 0.08345511555671692, -0.04202791303396225, 0.019246162846684456, 0.04287302866578102, -0.01439761370420456, -0.0619477778673172, -0.022761715576052666, 0.10549934953451157, -0.1336732804775238, 0.025894489139318466, -0.030698657035827637, 0.051337841898202896, 0.18934266269207, 0.05180639773607254, -0.07503054291009903, -0.018918275833129883, 0.02014559879899025, 0.05024735629558563, 0.10464508831501007, -0.022780748084187508, -0.03695731237530708, 0.013418166898190975, 0.005361356772482395, 0.016497217118740082, -0.09845457971096039, 0.052252303808927536, 0.023733336478471756, -0.03443146124482155, 0.013752847909927368, -0.01629285141825676, -0.06323844194412231, 0.03833834081888199, 0.09029696136713028, 0.05134672299027443, 0.000604884116910398, -0.02837267890572548, -0.10473646223545074, 0.13591262698173523, -0.086017906665802, -0.2955071032047272, -0.14866220951080322, -0.020892076194286346, 0.0025409955997020006, 0.04797397926449776, 0.03290051594376564, -0.09019777923822403, -0.05095355957746506, -0.06997190415859222, 0.0763932541012764, -0.034345097839832306, -0.0007739694556221366, -0.009284742176532745, -0.035684093832969666, 0.012967399321496487, -0.10340876132249832, 0.01846979558467865, -0.020627008751034737, -0.11110853403806686, 0.06616812944412231, -0.003433512756600976, 0.07258275151252747, 0.10943679511547089, 0.01032385602593422, 0.011990874074399471, -0.04110712558031082, 0.2020808607339859, -0.06312493979930878, 0.18685710430145264, 0.1762906312942505, -0.023292837664484978, 0.10407104343175888, 0.04389967769384384, 0.020563660189509392, -0.06404443830251694, 0.04027928039431572, 0.025572285056114197, -0.09981583058834076, -0.16733399033546448, 
-0.028393680229783058, -0.0472334586083889, 0.03733726590871811, 0.11428432166576385, 0.02813776396214962, -0.03623117133975029, 0.04287748783826828, -0.06851787120103836, 0.028250282630324364, -0.004085849039256573, 0.08486247807741165, -0.08915480971336365, -0.029368717223405838, 0.051384780555963516, -0.05881742760539055, 0.004740433767437935, 0.0948684886097908, 0.019393164664506912, 0.20966054499149323, -0.033487118780612946, 0.15783655643463135, 0.06698712706565857, 0.06460820138454437, 0.03379008546471596, 0.08647435158491135, -0.07016460597515106, 0.03201303631067276, -0.011508331634104252, -0.05034518986940384, 0.015357601456344128, 0.05810225382447243, 0.02553822658956051, -0.005109719000756741, 0.0021829025354236364, 0.031053515151143074, 0.038168929517269135, 0.3534892499446869, 0.10860610753297806, -0.12976902723312378, -0.09947742521762848, -0.03455037996172905, 0.010052903555333614, -0.08762325346469879, 0.0033847743179649115, 0.0899633914232254, -0.1067182794213295, 0.06634119153022766, -0.08054134994745255, 0.049365680664777756, -0.12748610973358154, -0.014045686461031437, 0.09351049363613129, 0.13826383650302887, -0.02547808736562729, 0.027585746720433235, -0.10962826758623123, 0.08866982907056808, 0.016041802242398262, 0.10194497555494308, -0.07878541201353073, 0.052431829273700714, 0.07139778882265091, 0.0010377507423982024, 0.15391165018081665, 0.04209350422024727, -0.09454590827226639, -0.10077382624149323, -0.03249022364616394, -0.014507437124848366, 0.13918450474739075, -0.060507964342832565, 0.0660676583647728, -0.0006921455496922135, -0.002201551804319024, -0.020203368738293648, 0.04840235039591789, -0.11150321364402771, -0.09398061037063599, 0.07996805757284164, -0.06154632940888405, -0.02211972326040268, -0.05575148016214371, -0.0035092285834252834, -0.056309036910533905, 0.14647778868675232, -0.0934344232082367, -0.06982071697711945, -0.10872762650251389, 0.002256231615319848, 0.013187614269554615, -0.07325014472007751, 0.06861173361539841, -0.03049357980489731, 0.15592160820960999, -0.08014540374279022, -0.10898376256227493, 0.020199425518512726, -0.07612435519695282, -0.10258229076862335, -0.05618421733379364, 0.07698869705200195, 0.12586188316345215, -0.02079707197844982, 0.013341707177460194, 0.04807291179895401, 0.03574873507022858, -0.07315567135810852, 0.13605424761772156, 0.04223877936601639, -0.06630861014127731, 0.12693598866462708, -0.016946423798799515, -0.05409007892012596, -0.07533634454011917, 0.00592191144824028, 0.03725369647145271, 0.09102131426334381, -0.08048684149980545, 0.09284636378288269, 0.13950778543949127, -0.10091453045606613, -0.28064385056495667, -0.000536520907189697, 0.03441273421049118, 0.00952130276709795, -0.014405418187379837, -0.2925907373428345, 0.0633544847369194, -0.0014345886884257197, -0.024936502799391747, 0.03405720740556717, -0.19350199401378632, -0.06960387527942657, 0.011427263729274273, 0.08102712780237198, 0.06606505066156387, -0.051052309572696686, -0.0031115501187741756, 0.016675321385264397, 0.03265135735273361, 0.07457789033651352, -0.009978794492781162, 0.10236214846372604, -0.03269239515066147, -0.08823604881763458, 0.009002220816910267, -0.03699859231710434, 0.05977332219481468, -0.03676515072584152, 0.044363971799612045, -0.02866019867360592, 0.09528419375419617, 0.1097000315785408, -0.047284629195928574, 0.10711508989334106, 0.09799327701330185, 0.06338116526603699, -0.012755775824189186, -0.024296758696436882, -0.05061289295554161, 0.057866550981998444, -0.015546494163572788, -0.09452765434980392, 
-0.08021842688322067, 0.10118325054645538, 0.04981331154704094, 0.01810959167778492, 0.04355699568986893, 0.014669833704829216, 0.0017250565579161048, 0.11297092586755753, 0.03237442299723625, 0.010506359860301018, -0.12594957649707794, -0.046110693365335464, -0.021600760519504547, 0.09680792689323425, -0.11972727626562119, 0.03212444856762886, 0.04204701632261276, 0.059515394270420074, 0.059739235788583755, 0.04830315336585045, -0.18700721859931946, 0.03663574904203415, 0.05477192625403404, -0.09386429190635681, -0.17745469510555267, -0.025009339675307274, -0.03366341441869736, -0.06522217392921448, 0.03753872588276863, 0.15238696336746216, -0.04309511557221413, 0.008590370416641235, -0.00787157379090786, 0.05248812958598137, -0.014455897733569145, 0.05786167085170746, 0.03852124884724617, -0.004208660684525967, -0.028157342225313187, 0.1572927087545395, 0.08200841397047043, 0.008326943963766098, 0.007668359205126762, 0.09227010607719421, -0.08463194966316223, -0.07822373509407043, -0.07367481291294098, -0.023803837597370148, 0.02651147171854973, -0.012413723394274712, 0.02617688849568367, -0.03950009122490883, 0.038064274936914444, 0.07881364971399307, 0.0016513009322807193, 0.01986684463918209, -0.06307488679885864, 0.03616304323077202, -0.07028692960739136, 0.04035879299044609, -0.022264519706368446, 0.0445237010717392, -0.0722457766532898, 0.09491673856973648, 0.051001325249671936, 0.09373084455728531, -0.030219830572605133, -0.0706210732460022, -0.09445560723543167, 0.0063092359341681, -0.03018619306385517, 0.021659983322024345, -0.08944756537675858, -0.025648614391684532, -0.009432756341993809, 0.039915814995765686, 0.018633360043168068, 0.059531621634960175, -0.003976669628173113, -0.056098487228155136, -0.007003257982432842, 0.017415428534150124, -0.09349178522825241, -0.024621056392788887, 0.0313173308968544, -0.0838622897863388, 0.10779712349176407, -0.03125838190317154, -0.059609685093164444, 0.002397604286670685, -0.11438721418380737, -0.004418671131134033, -0.016959629952907562, 0.015381737612187862, -0.03377682715654373, -0.0650884211063385, 0.026139164343476295, -0.055020105093717575, -0.08416162431240082, -0.019579993560910225, 0.13195481896400452, -0.050687141716480255, 0.12038400024175644, -0.02458578161895275, 0.06949681788682938, -0.06888111680746078, 0.07531030476093292, 0.05963815003633499, 0.04772043228149414, 0.052436258643865585, -0.047923725098371506, 0.09585848450660706, -0.14955461025238037, -0.007009996101260185, 0.0049666366539895535, -0.006995378527790308, 0.03683103621006012, -0.04039589688181877, 0.040821947157382965, -0.026813212782144547, 0.06402839720249176, 0.008235793560743332, 0.0668812096118927, 0.007293452508747578, -0.007408742792904377, -0.13123604655265808, 0.019992927089333534, 0.019287345930933952, -0.040784433484077454, -0.04028494656085968, 0.014056949876248837, 0.027939800173044205, -0.02081969752907753, 0.13189685344696045, 0.1308983862400055, 0.03431645408272743, 0.14803677797317505, 0.0661168321967125, 0.0031213730107992887, -0.0438874252140522, -0.16829954087734222, -0.02775660529732704, -0.008761638775467873, 0.05357494577765465, -0.00626458739861846, -0.10116314888000488, 0.10779844224452972, -0.11543559283018112, 0.10157488286495209, 0.029887503013014793, -0.03666162118315697, -0.06836098432540894, -0.20575319230556488, 0.012091162614524364, -0.006454985588788986, 0.016870565712451935, -0.06549651175737381, 0.006859317887574434, 0.11414355039596558, 0.0028769150376319885, -0.0029776364099234343, 0.13788087666034698, 
-0.09492622315883636, -0.0559920109808445, 0.08752233535051346, 0.02030285820364952, 0.0148219745606184, 0.04684796929359436, 0.02307986468076706, 0.05117691308259964, 0.01800035685300827, 0.04011353850364685, 0.01993962749838829, 0.0599130280315876, 0.07239285856485367, -0.010815647430717945, -0.08922308683395386, 0.006881235633045435, -0.01123833004385233, -0.033173806965351105, 0.14477239549160004, 0.04063648357987404, -0.03537742793560028, 0.0009815111989155412, 0.10309414565563202, -0.0481472983956337, 0.016540493816137314, -0.1610022634267807, 0.2184707075357437, -0.032595064491033554, -0.03331338241696358, 0.007502977270632982, -0.08486421406269073, -0.018649745732545853, 0.20725072920322418, 0.12399277091026306, 0.011373762041330338, 0.014250260777771473, 0.013263324275612831, -0.005472648423165083, 0.006445482838898897, 0.13133612275123596, 0.04577620327472687, 0.19991649687290192, -0.03595596179366112, 0.06704027205705643, -0.040225375443696976, -0.023434333503246307, -0.046456094831228256, 0.06561996042728424, -0.00826132670044899, 0.03512777015566826, -0.08318748325109482, 0.050530269742012024, -0.060454465448856354, -0.28945377469062805, 0.1306159645318985, -0.020110534504055977, -0.01742786355316639, 0.017963945865631104, -0.07198458909988403, 0.021816497668623924, 0.08290739357471466, 0.002600444480776787, 0.03335434943437576, 0.15225505828857422, 0.049074094742536545, -0.05336920544505119, -0.07640459388494492, 0.017958559095859528, -0.23175781965255737, 0.243118017911911, -0.01181415282189846, -0.006600965745747089, 0.07630016654729843, 0.04732966423034668, -0.12123021483421326, -0.006865848787128925, -0.046366799622774124, -0.01348213478922844, -0.03884837031364441, 0.16554279625415802, -0.027828114107251167, 0.09563189744949341, 0.02379978820681572, -0.12655355036258698, 0.03623659908771515, 0.12288939207792282, 0.017167801037430763, -0.0625934973359108, 0.07780373841524124, -0.09226290136575699, 0.13272587954998016, 0.1395014375448227, 0.004649690352380276, -0.021779457107186317, -0.03232010826468468, -0.0007213546778075397, 0.002417636336758733, 0.051472488790750504, -0.03219865262508392, -0.0954262986779213, -0.0012941441964358091, -0.11629270762205124, 0.05338378623127937, -0.13269585371017456, -0.08316139876842499, 0.017142463475465775, -0.05003580451011658, -0.020669300109148026, 0.04217296093702316, 0.028978846967220306, 0.047482382506132126, -0.03104444034397602, 0.12121889740228653, -0.00654885359108448, 0.04609464854001999, -0.10027755796909332, -0.07904078811407089 ]
null
null
transformers
# Wav2Vec2-Base-100h

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)

The base model pretrained and fine-tuned on 100 hours of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz.

[Paper](https://arxiv.org/abs/2006.11477)

Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli

**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.

The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.

# Usage

To transcribe audio files the model can be used as a standalone acoustic model as follows:

```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch

# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-100h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-100h")

# load dummy dataset; the "audio" column is decoded into a float array automatically
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")

# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values  # Batch size 1

# retrieve logits
logits = model(input_values).logits

# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```

## Evaluation

This code snippet shows how to evaluate **facebook/wav2vec2-base-100h** on LibriSpeech's "clean" and "other" test data.

```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer

librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-100h").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-100h")

def map_to_pred(batch):
    input_values = processor(batch["audio"]["array"], return_tensors="pt", padding="longest").input_values
    with torch.no_grad():
        logits = model(input_values.to("cuda")).logits

    predicted_ids = torch.argmax(logits, dim=-1)
    batch["transcription"] = processor.batch_decode(predicted_ids)[0]
    return batch

result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])

print("WER:", wer(result["text"], result["transcription"]))
```

*Result (WER)*:

| "clean" | "other" |
|---|---|
| 6.1 | 13.5 |
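Since the card stresses that the input must be sampled at 16kHz, here is a hedged sketch of transcribing an arbitrary audio file by resampling it first with torchaudio. `speech.wav` is a placeholder path, and the audio is assumed to be mono (only the first channel is used).

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-100h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-100h")

# "speech.wav" is a placeholder; torchaudio returns (channels, time) and the rate
waveform, sample_rate = torchaudio.load("speech.wav")
if sample_rate != 16_000:
    # resample to the 16kHz rate the model was trained on
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# assume mono audio: take the first channel and hand it to the processor
input_values = processor(waveform[0].numpy(), sampling_rate=16_000,
                         return_tensors="pt").input_values

with torch.no_grad():
    logits = model(input_values).logits
transcription = processor.batch_decode(torch.argmax(logits, dim=-1))
```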
{"language": "en", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition"], "datasets": ["librispeech_asr"]}
automatic-speech-recognition
facebook/wav2vec2-base-100h
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "en", "dataset:librispeech_asr", "arxiv:2006.11477", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2006.11477" ]
[ "en" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
Wav2Vec2-Base-100h ================== Facebook's Wav2Vec2 The base model pretrained and fine-tuned on 100 hours of Librispeech on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz. Paper Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli Abstract We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data. The original model can be found under URL Usage ===== To transcribe audio files the model can be used as a standalone acoustic model as follows: Evaluation ---------- This code snippet shows how to evaluate facebook/wav2vec2-base-100h on LibriSpeech's "clean" and "other" test data. *Result (WER)*: 6.1 ("clean") / 13.5 ("other")
[]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ 73 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ -0.10553258657455444, 0.130898118019104, -0.0032210664357990026, -0.0069449907168745995, 0.04456991329789162, -0.046208616346120834, 0.07104607671499252, 0.09719584137201309, 0.013007433153688908, 0.01296982727944851, 0.09222981333732605, 0.17959539592266083, 0.0030187941156327724, -0.00842480082064867, -0.04955119639635086, -0.13855057954788208, 0.08840401470661163, 0.012503410689532757, 0.07250794768333435, 0.08181643486022949, 0.09971898794174194, -0.05253602936863899, 0.030067499727010727, 0.056679267436265945, -0.05401643365621567, 0.034473419189453125, 0.04372253268957138, -0.13200542330741882, 0.08021045476198196, 0.017922570928931236, 0.008886466734111309, 0.05725982412695885, 0.04201558977365494, -0.15367181599140167, 0.0036659573670476675, 0.0012196027673780918, -0.019065123051404953, 0.0409681499004364, 0.04913613572716713, -0.019390206784009933, -0.0036085534375160933, 0.03938404098153114, -0.049491070210933685, 0.05481933057308197, -0.04292481020092964, -0.25252434611320496, -0.07903379201889038, 0.05558633804321289, 0.03390234336256981, 0.1055200845003128, -0.004184235818684101, 0.1357744038105011, -0.07014869153499603, 0.07564567029476166, 0.15512040257453918, -0.3402085602283478, 0.04942312836647034, 0.007840266451239586, 0.07459427416324615, -0.0007963182288222015, -0.014770575799047947, 0.05158987268805504, 0.059410460293293, 0.01968969590961933, 0.0067363218404352665, -0.0481865331530571, -0.23490853607654572, 0.018609128892421722, -0.08365047723054886, -0.08914119005203247, 0.2763230502605438, 0.01709545962512493, 0.009737185202538967, -0.029975801706314087, -0.030394138768315315, -0.00208685384131968, 0.0035830088891088963, 0.04579317942261696, 0.011192433536052704, 0.07485197484493256, 0.006276512518525124, -0.010196984745562077, -0.12356168776750565, -0.06562907993793488, -0.17517545819282532, 0.0861472338438034, -0.018789222463965416, 0.07595038414001465, -0.10856316238641739, 0.003985750954598188, 0.0055588893592357635, -0.1073039174079895, -0.01251818798482418, 0.008832748048007488, 0.062148984521627426, 0.0549384281039238, -0.0890202671289444, 0.030067842453718185, 0.11984404176473618, 0.10347294807434082, 0.06263213604688644, -0.017943130806088448, -0.058236151933670044, 0.11197682470083237, -0.015561729669570923, 0.09655848145484924, -0.06580095738172531, -0.04237474873661995, 0.06200028583407402, -0.045705072581768036, 0.0872374102473259, -0.036531124264001846, -0.11455228924751282, -0.07593734562397003, 0.005413370672613382, 0.05977078154683113, 0.10201477259397507, -0.010023833252489567, -0.04662850499153137, 0.02936653606593609, 0.06969907879829407, -0.1026461124420166, 0.018017424270510674, 0.019503910094499588, 0.04069804400205612, 0.06350523233413696, 0.06225026026368141, 0.054257288575172424, -0.07724354416131973, 0.023967625573277473, -0.00534059526398778, 0.013780725188553333, 0.04040428251028061, -0.03413857892155647, 0.08639609813690186, -0.09333818405866623, 0.03244217485189438, -0.1576840728521347, -0.05885520577430725, -0.00569072226062417, 0.016366764903068542, 0.010820036754012108, -0.09561426937580109, 0.018917560577392578, -0.04050036519765854, 0.07263784110546112, -0.126414954662323, 0.05499587208032608, -0.08863165974617004, 0.07167955487966537, -0.008761011995375156, 0.10670781135559082, -0.15373651683330536, 0.09085401147603989, -0.08862880617380142, -0.02379925735294819, 0.055160146206617355, 0.03620157390832901, -0.099167600274086, 0.04630241543054581, -0.10348021984100342, -0.037183236330747604, -0.08817009627819061, 
0.028224295005202293, 0.009931059554219246, 0.04270811378955841, -0.14398418366909027, -0.07768739014863968, 0.11411935836076736, -0.10307222604751587, -0.1258954107761383, 0.0906059592962265, 0.06541302055120468, -0.031062107533216476, 0.036072660237550735, 0.27910104393959045, -0.013128956779837608, -0.14770203828811646, -0.03967995196580887, 0.10566240549087524, -0.06082668527960777, -0.15083302557468414, 0.08385676890611649, -0.08748205751180649, 0.011965442448854446, 0.01408698782324791, -0.012296510860323906, 0.07645261287689209, 0.016154587268829346, -0.08376128226518631, -0.07453311234712601, -0.10177339613437653, -0.021080397069454193, -0.03707750886678696, 0.018567649647593498, -0.021932341158390045, -0.0010107764974236488, -0.04372774809598923, 0.05814548581838608, 0.013737856410443783, 0.07135165482759476, -0.07868031412363052, 0.13074877858161926, -0.09697720408439636, 0.007299181539565325, -0.16689161956310272, 0.171956405043602, -0.04545271396636963, -0.007387565448880196, 0.07694359868764877, 0.03380583971738815, 0.04236609488725662, -0.089943528175354, 0.018857711926102638, -0.03533199056982994, 0.09387461096048355, 0.06296900659799576, 0.020133990794420242, -0.13281568884849548, 0.00854481104761362, -0.056071553379297256, -0.015910202637314796, 0.05173464119434357, -0.05182471126317978, 0.05021200701594353, 0.07562640309333801, -0.03414363041520119, 0.03478047624230385, 0.02506065182387829, -0.008699807338416576, -0.024318216368556023, 0.020164651796221733, 0.06529483944177628, 0.0329551063477993, -0.012639208696782589, 0.2608710527420044, -0.10263499617576599, 0.21942198276519775, 0.23810690641403198, -0.14825774729251862, 0.09490755945444107, 0.10501835495233536, 0.0042173112742602825, -0.02398437075316906, 0.024790292605757713, -0.04333411157131195, 0.10180330276489258, -0.04527559503912926, 0.11452284455299377, -0.07136353850364685, 0.021400270983576775, 0.023141104727983475, -0.058160360902547836, -0.029492732137441635, 0.046045225113630295, 0.020257392898201942, -0.08659094572067261, 0.11685719341039658, 0.2639707028865814, -0.11033948510885239, 0.10913971811532974, -0.059389617294073105, -0.06938845664262772, 0.05329904332756996, -0.04328740015625954, -0.04060158133506775, 0.06868652999401093, -0.18649053573608398, -0.020451687276363373, 0.11309658735990524, 0.010944943875074387, 0.05313780531287193, -0.14778806269168854, 0.0018295581685379148, 0.0014550879132002592, -0.04905227944254875, -0.15963372588157654, 0.07122679054737091, -0.015048031695187092, 0.07437968254089355, -0.06138448417186737, -0.14604808390140533, 0.06767101585865021, -0.029231460765004158, -0.08494333922863007, 0.04943833872675896, -0.1273941695690155, -0.19499844312667847, -0.13717390596866608, -0.08844323456287384, -0.02497990056872368, 0.0384749136865139, 0.15096785128116608, -0.06208513304591179, -0.03659309074282646, -0.006901481654495001, -0.04125802591443062, -0.03256578743457794, 0.01825602725148201, 0.06956687569618225, 0.03523300588130951, 0.06612950563430786, -0.1360173225402832, -0.00553650688380003, -0.007801401894539595, 0.030831478536128998, 0.02565259486436844, 0.045470088720321655, 0.08835817128419876, 0.13233067095279694, 0.06605740636587143, 0.009677453897893429, 0.012083767913281918, 0.14244942367076874, -0.051761411130428314, -0.051135167479515076, 0.19370770454406738, -0.033904727548360825, 0.010422100313007832, 0.16998347640037537, 0.02151668630540371, -0.02509351074695587, -0.034925494343042374, -0.05893196538090706, -0.08270371705293655, -0.23722556233406067, 
-0.13327594101428986, -0.12315286695957184, -0.01946079730987549, 0.03249676153063774, 0.06402452290058136, 0.055884744971990585, 0.007174030411988497, 0.001026236335746944, -0.03014427050948143, -0.004808984696865082, 0.001112596713937819, 0.2428148239850998, -0.0475541353225708, 0.11669880896806717, -0.10617652535438538, -0.02251419425010681, 0.06858710199594498, 0.11554747819900513, 0.09516856074333191, 0.1387278288602829, 0.05478128790855408, 0.05034932121634483, 0.16305990517139435, 0.08390197157859802, 0.10281553119421005, 0.048526328057050705, 0.003477555699646473, -0.0011071959743276238, -0.07144153118133545, -0.017537400126457214, 0.08567972481250763, 0.13755297660827637, -0.08278528600931168, -0.008461780846118927, -0.13989698886871338, 0.014602833427488804, 0.18495753407478333, 0.10520230233669281, -0.18115316331386566, 0.016157910227775574, 0.041682351380586624, -0.04312965273857117, -0.022627774626016617, 0.10051585733890533, -0.001219655037857592, -0.012892578728497028, 0.07470382004976273, 0.0647098645567894, 0.08223386108875275, -0.01231531985104084, 0.030562004074454308, -0.06266938894987106, -0.11482807248830795, 0.08159756660461426, 0.06349847465753555, -0.21560755372047424, 0.22825436294078827, 0.009443742223083973, 0.027259882539510727, -0.02224699780344963, 0.018613731488585472, 0.05059058964252472, 0.050911929458379745, 0.15158957242965698, 0.021791379898786545, -0.1253834068775177, -0.009121525101363659, -0.07055676728487015, 0.04293029382824898, 0.038012273609638214, 0.10832592099905014, -0.05921723693609238, -0.03463339805603027, -0.030047286301851273, 0.03232336789369583, 0.056804630905389786, -0.11937764286994934, -0.11106505990028381, 0.01914956048130989, 0.30287572741508484, 0.06441196799278259, -0.02195272222161293, -0.06208842620253563, -0.19923004508018494, 0.0869675725698471, -0.11312504857778549, 0.012986715883016586, -0.03872888535261154, -0.15464921295642853, 0.12560436129570007, -0.0074307541362941265, 0.07708790898323059, -0.030087526887655258, -0.015005347318947315, -0.04702563211321831, -0.13900606334209442, 0.11093595623970032, -0.12478809803724289, -0.02755695767700672, -0.021537229418754578, 0.17201226949691772, -0.07755780965089798, 0.059625569730997086, 0.05749013274908066, 0.075789675116539, -0.10451560467481613, -0.05349566787481308, 0.11411191523075104, 0.06102922186255455, 0.009756186977028847, 0.019745752215385437, -0.016817614436149597, -0.23264074325561523, 0.03012603335082531, -0.04088092967867851, 0.22231651842594147, 0.1339775025844574, -0.08638075739145279, 0.18232716619968414, 0.21496087312698364, -0.04361119121313095, -0.30918097496032715, -0.14794783294200897, -0.06409947574138641, -0.01401350274682045, -0.013975550420582294, -0.12061818689107895, 0.11361110955476761, -0.05663692206144333, -0.12858913838863373, 0.04298190772533417, -0.17978867888450623, -0.08528170734643936, 0.24440136551856995, -0.12553873658180237, 0.2579133212566376, -0.09149366617202759, -0.08791037648916245, -0.07697483897209167, -0.2083970159292221, 0.08934041112661362, -0.12861685454845428, 0.09533693641424179, 0.020095612853765488, 0.04834409058094025, -0.0032761390320956707, -0.05023179575800896, 0.12247822433710098, 0.06964068859815598, -0.044906772673130035, -0.054036129266023636, 0.03761465474963188, 0.051064588129520416, -0.0013577078934758902, 0.09207505732774734, -0.10425111651420593, 0.046614814549684525, -0.07467342168092728, -0.01770988292992115, -0.10582128912210464, 0.11127619445323944, 0.054762184619903564, -0.005041792523115873, 
-0.0004597716615535319, -0.09294067323207855, 0.02075539156794548, 0.0075386459939181805, 0.18497338891029358, -0.039920635521411896, 0.005495375022292137, 0.21799907088279724, 0.08913405239582062, -0.14787627756595612, -0.0857413038611412, -0.043097324669361115, -0.09553927928209305, 0.11717835068702698, -0.11586102843284607, 0.08731591701507568, 0.05556878075003624, 0.031527236104011536, 0.021553244441747665, 0.06515321135520935, -0.0313250795006752, -0.006342106964439154, 0.12088450789451599, -0.09494642168283463, 0.005057735834270716, 0.021784421056509018, 0.0866456925868988, 0.07720641791820526, 0.05301899090409279, 0.13054990768432617, -0.019091904163360596, -0.008595217019319534, 0.00885860063135624, 0.012888466008007526, -0.15684325993061066, 0.11237099766731262, 0.066322922706604, 0.017946872860193253, -0.1617332547903061, 0.13419018685817719, -0.009535997174680233, -0.16045138239860535, 0.028488822281360626, -0.035346753895282745, -0.0770009383559227, -0.12456200271844864, -0.07436610013246536, -0.05795807018876076, -0.06541389971971512, -0.1286797970533371, 0.029880192130804062, -0.10729623585939407, 0.05121079832315445, 0.06720234453678131, 0.0686970129609108, 0.05852325260639191, -0.04622297361493111, -0.07811441272497177, 0.025459539145231247, 0.000011643805919447914, -0.04393397644162178, 0.022547293454408646, -0.14323608577251434, -0.0609799288213253, 0.0049041942693293095, 0.058983832597732544, -0.05969872325658798, -0.018304618075489998, -0.035527244210243225, 0.05525393411517143, -0.11442197114229202, 0.005268616136163473, -0.08427722007036209, -0.006557543762028217, 0.021367160603404045, -0.09194521605968475, -0.04612528160214424, 0.07238994538784027, -0.11176017671823502, -0.03971487656235695, -0.01866694539785385, 0.08856991678476334, -0.15284664928913116, -0.02588054910302162, 0.04093141853809357, -0.01763247884809971, 0.08831114321947098, 0.1667511761188507, -0.10959204286336899, 0.05900387838482857, -0.19305944442749023, -0.21232213079929352, 0.14857356250286102, 0.04487431049346924, -0.00411234563216567, -0.07526002079248428, -0.01284536998718977, 0.15780283510684967, 0.02208596281707287, -0.007283198181539774, 0.10564544051885605, -0.0858166366815567, -0.0646427720785141, -0.11394652724266052, -0.05267186462879181, 0.0002667454828042537, -0.07834916561841965, 0.20607034862041473, 0.07555761188268661, 0.15330466628074646, -0.01674608327448368, -0.02290985733270645, -0.051292531192302704, 0.06201019510626793, -0.08475256711244583, -0.1346537172794342, -0.18266621232032776, -0.010854627005755901, -0.0008765954407863319, -0.05076060816645622, 0.21584820747375488, -0.010438222438097, -0.09894604980945587, 0.04366534575819969, 0.06481567770242691, -0.05863906815648079, 0.016870345920324326, 0.3084961771965027, 0.05215081945061684, -0.016059083864092827, -0.04979681223630905, -0.010718269273638725, 0.02526518888771534, 0.09674394875764847, -0.05696709081530571, 0.16609297692775726, 0.13069912791252136, 0.11963684856891632, 0.11121363192796707, -0.0778949037194252, -0.13920266926288605, -0.035769857466220856, -0.018955543637275696, 0.1124034970998764, -0.029977023601531982, 0.11309010535478592, 0.1314707100391388, 0.005977015942335129, 0.03182263672351837, -0.06869325786828995, 0.02107797935605049, -0.15962888300418854, -0.08474449068307877, -0.06388143450021744, -0.1244516372680664, 0.006807905621826649, -0.023599054664373398, 0.09286303073167801, 0.09161261469125748, 0.006726193707436323, -0.0012807550374418497, 0.022250106558203697, -0.05333257466554642, 
-0.06643092632293701, 0.05710507184267044, -0.04184741526842117, -0.04383650794625282, -0.028828075155615807, -0.02074788697063923, 0.07553410530090332, -0.020120704546570778, -0.012579452246427536, 0.03099837526679039, -0.06999890506267548, 0.057659395039081573, -0.10923458635807037, -0.06497935205698013, -0.04828762263059616, 0.017961908131837845, 0.04987628012895584, 0.1769562065601349, 0.0703502669930458, -0.01841086708009243, 0.07841479033231735, 0.17592471837997437, -0.1167592853307724, -0.14444665610790253, -0.04834729805588722, 0.07056789100170135, -0.024213865399360657, 0.0789770558476448, -0.03786179795861244, -0.02406960539519787, -0.057969506829977036, 0.20962217450141907, 0.2210516482591629, -0.023191357031464577, 0.04511535167694092, -0.04486880451440811, 0.02214992791414261, -0.04110521450638771, -0.011881881393492222, 0.1788846105337143, 0.21833086013793945, -0.04134572669863701, -0.0491926409304142, -0.08656515181064606, -0.013672783970832825, -0.08476120978593826, 0.06531766802072525, -0.07521392405033112, -0.1619744896888733, 0.00009920428419718519, 0.07996059954166412, -0.11901438981294632, 0.04175471514463425, -0.07564549893140793, -0.1176062747836113, -0.03611231595277786, 0.017024710774421692, 0.17818686366081238, 0.11621901392936707, 0.007459801621735096, -0.07456135749816895, -0.029357291758060455, 0.0513065904378891, -0.025648940354585648, -0.19937174022197723, 0.023894203826785088, -0.0001692539663054049, -0.09509439766407013, 0.06519272923469543, 0.005465993657708168, 0.14946505427360535, 0.020932557061314583, 0.15391206741333008, -0.0794263705611229, 0.12185506522655487, -0.001882461947388947, -0.15206871926784515, 0.02240588143467903, 0.05049044266343117, -0.026614127680659294, 0.015433653257787228, 0.04593406245112419, -0.04700611159205437, 0.06037532538175583, 0.020408250391483307, -0.03441973030567169, -0.07844642549753189, -0.02581782452762127, -0.04933638125658035, 0.07223241776227951, -0.06718641519546509, -0.03679826855659485, -0.05522547662258148, -0.01901129260659218, 0.005404320545494556, 0.05544726178050041, -0.15088249742984772, -0.06679877638816833, -0.07014062255620956, -0.005030340049415827, -0.0506955124437809, 0.029244428500533104, -0.09448962658643723, -0.03867999464273453, -0.07312208414077759, -0.026727264747023582, -0.06672850251197815, -0.016113724559545517, 0.09021682292222977, -0.03460538759827614, 0.007240580394864082, -0.0492607057094574, 0.09985403716564178, 0.06446917355060577, -0.11979615688323975, -0.11410555988550186 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli [Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 100k unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390). **Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model **for speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model. **Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)* **Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/) # Fine-Tuning Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
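Since this checkpoint ships without a tokenizer or CTC head, the raw model is typically used as a speech feature encoder. Below is a minimal sketch (not part of the original card, and assuming the repo provides a preprocessor config; otherwise load the feature extractor from `facebook/wav2vec2-base`) of extracting frame-level hidden states with `Wav2Vec2Model`; the zero waveform is a placeholder for real 16 kHz speech.

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-base-100k-voxpopuli")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base-100k-voxpopuli")

# placeholder: one second of silence at 16 kHz; substitute real speech here
waveform = torch.zeros(16000)

inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# frame-level speech representations, shape (batch, n_frames, 768) for the base model
hidden_states = outputs.last_hidden_state
```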
{"language": "multilingual", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-100k-voxpopuli
[ "transformers", "pytorch", "wav2vec2", "pretraining", "audio", "automatic-speech-recognition", "voxpopuli", "multilingual", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "multilingual" ]
TAGS #transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli Facebook's Wav2Vec2 base model pretrained on the 100k unlabeled subset of VoxPopuli corpus. Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model. Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Fine-Tuning Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
[ "# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]
[ 71, 204, 57 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 100k unlabeled subset of VoxPopuli corpus.\n\nNote: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for more in-detail explanation of how to fine-tune the model.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]
[ -0.054081667214632034, 0.07928266376256943, -0.003468954935669899, 0.005413401406258345, 0.11107829958200455, -0.026428915560245514, 0.12575072050094604, 0.016965297982096672, 0.013454627245664597, 0.060215361416339874, -0.008975722827017307, -0.003770316718146205, 0.10353374481201172, 0.12531213462352753, 0.051421087235212326, -0.26812589168548584, 0.06214131787419319, -0.03340379521250725, 0.12715592980384827, 0.043237894773483276, 0.12408608198165894, -0.08482403308153152, 0.060663603246212006, 0.061391014605760574, -0.10142805427312851, 0.003707872238010168, 0.009975763969123363, -0.07136130332946777, 0.07956058531999588, 0.06583596765995026, 0.041415631771087646, 0.04204220697283745, 0.08097705245018005, -0.18088509142398834, 0.03045741096138954, 0.05189235880970955, -0.0114658884704113, 0.008453928865492344, 0.09601940214633942, -0.000026870697183767334, 0.1920233964920044, -0.0423671193420887, 0.0006371844792738557, 0.09188909083604813, -0.09869696944952011, -0.06193458288908005, -0.09883344918489456, 0.18215136229991913, 0.10075543075799942, 0.09341142326593399, -0.07647138088941574, 0.06943871080875397, 0.016807837411761284, 0.05587947368621826, 0.08795681595802307, -0.25409483909606934, -0.029431916773319244, 0.13588492572307587, 0.06707143783569336, -0.025671135634183884, -0.10361329466104507, 0.09595251083374023, 0.02590128406882286, -0.00990353338420391, -0.031738776713609695, -0.04925798252224922, 0.08189660310745239, -0.08230338990688324, -0.09870399534702301, 0.004673100542277098, 0.15358488261699677, 0.04483296722173691, -0.07613132148981094, -0.13745447993278503, -0.02182074636220932, 0.19973979890346527, -0.05355653166770935, -0.1284346580505371, 0.004227188881486654, 0.03881027549505234, 0.053935885429382324, -0.17287294566631317, -0.0671360045671463, -0.0016626364085823298, -0.0408962182700634, 0.08190855383872986, -0.00016577000496909022, -0.006376880221068859, -0.036467619240283966, 0.009335529059171677, -0.1387956440448761, -0.09497115761041641, -0.006470100022852421, -0.08465644717216492, -0.08411580324172974, -0.03723309934139252, -0.042757146060466766, -0.09937028586864471, 0.007729633711278439, 0.0825749933719635, 0.010261430405080318, 0.05861012265086174, -0.08377770334482193, 0.0003348253667354584, 0.05216830596327782, 0.14019373059272766, -0.11019184440374374, -0.00048727355897426605, 0.01755395531654358, -0.02719646319746971, 0.024484559893608093, -0.03931191563606262, -0.06784161180257797, -0.021357698366045952, -0.005841970443725586, 0.026520678773522377, 0.043763961642980576, 0.009132253006100655, -0.046413686126470566, -0.0703548789024353, 0.14154677093029022, -0.0900728851556778, 0.03890641778707504, 0.04469821974635124, 0.007961301133036613, 0.11311264336109161, -0.03084106184542179, 0.10100823640823364, -0.11895515769720078, -0.013005468063056469, -0.022022906690835953, 0.0011392526794224977, -0.001188722439110279, -0.036641448736190796, 0.009231381118297577, -0.04219251126050949, -0.013980110175907612, -0.11513110250234604, -0.07555607706308365, -0.09131850302219391, -0.009454421699047089, -0.055824875831604004, -0.06314271688461304, -0.04280683398246765, 0.011333100497722626, -0.007139467168599367, 0.011590847745537758, -0.031222905963659286, 0.002567801857367158, -0.011970901861786842, -0.023885169997811317, 0.04763340950012207, -0.023393306881189346, 0.058702047914266586, -0.0024897572584450245, -0.025343071669340134, -0.16156472265720367, 0.1456747204065323, -0.04352883622050285, -0.01595189981162548, -0.1349235475063324, 0.004974575247615576, 
-0.0650116577744484, 0.021995289251208305, 0.005465751048177481, 0.13165250420570374, -0.18336617946624756, -0.08963160216808319, 0.19105763733386993, -0.1345663219690323, 0.02017727680504322, 0.17084714770317078, 0.016986196860671043, 0.06065874919295311, 0.13607695698738098, 0.1737920343875885, 0.010401721112430096, -0.14583562314510345, 0.014381928369402885, -0.06410566717386246, -0.01161663793027401, 0.11388613283634186, 0.023723961785435677, -0.04316385090351105, 0.035319019109010696, -0.004553907085210085, -0.049699567258358, -0.07162876427173615, 0.001844701706431806, -0.05364115536212921, 0.009863726794719696, -0.04528636485338211, 0.03665274381637573, -0.011427386663854122, -0.02139389142394066, -0.008762343786656857, -0.10350500047206879, -0.08947476744651794, 0.07261008024215698, -0.05184393376111984, 0.05955265089869499, -0.10901230573654175, 0.03560662642121315, 0.13308411836624146, 0.03359221667051315, -0.1337468922138214, 0.04848495125770569, 0.027821438387036324, 0.04189254343509674, 0.1138007864356041, 0.07815852016210556, -0.035707004368305206, -0.00637661712244153, -0.037494853138923645, 0.006377970799803734, -0.019504498690366745, -0.016439149156212807, -0.0348682701587677, -0.08526836335659027, -0.01775921881198883, -0.052231043577194214, 0.12445077300071716, -0.1900031864643097, 0.020172392949461937, 0.050523292273283005, 0.1048574224114418, 0.005402136594057083, -0.027535591274499893, 0.026654649525880814, 0.05486072227358818, 0.026478657498955727, -0.01957140862941742, 0.05868745967745781, 0.006239981856197119, -0.04461557790637016, 0.09647761285305023, -0.11397350579500198, -0.11723149567842484, 0.1256031095981598, -0.01884150318801403, -0.022170837968587875, -0.001324156066402793, 0.018216023221611977, 0.00285165011882782, -0.03275516629219055, -0.015299073420464993, 0.22695086896419525, 0.014436219818890095, 0.052628856152296066, -0.07118962705135345, 0.007127385586500168, -0.0007348433136940002, -0.0352964885532856, -0.09846854209899902, 0.0508415587246418, -0.006070482078939676, -0.13479389250278473, -0.008753503672778606, 0.061650462448596954, 0.05554734542965889, 0.11586461961269379, 0.02072042040526867, -0.03922085464000702, 0.0045446306467056274, -0.07316893339157104, -0.019967345520853996, 0.0025609603617340326, -0.10351058095693588, -0.04355988651514053, 0.019016733393073082, 0.004319264553487301, 0.04756724089384079, -0.055788058787584305, 0.039913009852170944, 0.009723278693854809, -0.06607946753501892, -0.0005452575860545039, 0.005663097836077213, -0.028763478621840477, 0.05574687942862511, -0.018056271597743034, 0.029309218749403954, -0.044815853238105774, -0.03492536395788193, -0.14790305495262146, 0.09523510932922363, -0.06005009636282921, -0.3290458917617798, -0.10078783333301544, -0.07749956846237183, -0.08116503059864044, 0.034413617104291916, 0.03356962651014328, -0.09316371381282806, -0.08368349820375443, -0.07437178492546082, 0.11101647466421127, 0.004965598229318857, -0.07814173400402069, 0.09427733719348907, -0.008438363671302795, -0.01444915495812893, -0.0883672684431076, 0.0034257504157721996, -0.03126779943704605, -0.12449096143245697, -0.01913290284574032, -0.024922342970967293, 0.05469755828380585, 0.1286407858133316, 0.016869554296135902, -0.01442812755703926, -0.02003829926252365, 0.23543646931648254, -0.12179619818925858, 0.11232932657003403, 0.24258415400981903, -0.021564897149801254, 0.0301677156239748, 0.15521399676799774, -0.02814469113945961, -0.05983157828450203, 0.023958032950758934, 0.01894616149365902, -0.005979497451335192, 
-0.24182462692260742, -0.1132580116391182, -0.032431893050670624, -0.0599609799683094, 0.026177838444709778, 0.031700436025857925, 0.04852691665291786, 0.029788609594106674, -0.08582647889852524, -0.05889347195625305, 0.08587846159934998, 0.045259732753038406, 0.1592676043510437, -0.008333556354045868, 0.10747987031936646, -0.03569730743765831, 0.011318265460431576, 0.07433733344078064, 0.017953284084796906, 0.06324011087417603, 0.0948011502623558, 0.11661802232265472, 0.051385316997766495, 0.05736539512872696, 0.029226252809166908, 0.002669127192348242, -0.02075035311281681, -0.0010404279455542564, -0.02877318300306797, -0.048980712890625, -0.03136734291911125, 0.04614923149347305, 0.12791560590267181, -0.13968457281589508, -0.11071903258562088, 0.06411674618721008, 0.033858709037303925, 0.1212204098701477, 0.09243841469287872, -0.08006332069635391, -0.07361302524805069, 0.03240582346916199, -0.06487084180116653, -0.04847565293312073, 0.03764986991882324, 0.0906025767326355, -0.16595692932605743, 0.11637917160987854, 0.08015235513448715, 0.10723370313644409, -0.03934219107031822, 0.02941625751554966, -0.09263674914836884, 0.017352046445012093, 0.01792212948203087, 0.053539350628852844, -0.1969936043024063, 0.14405865967273712, 0.027891626581549644, 0.09196154028177261, -0.059853337705135345, 0.007841567508876324, 0.03344026207923889, 0.11631648987531662, 0.13139891624450684, -0.012369503267109394, -0.08575643599033356, 0.014102589339017868, -0.07101994752883911, 0.025796623900532722, 0.06260325014591217, -0.033069536089897156, 0.04692632704973221, -0.008854895830154419, 0.0029125476721674204, -0.024846456944942474, 0.06566252559423447, -0.2886105477809906, -0.1584775298833847, 0.02126021310687065, 0.0304139144718647, 0.07013227045536041, -0.026179537177085876, -0.04664138704538345, -0.09932693839073181, 0.11780437082052231, 0.0194587092846632, -0.013860313221812248, -0.11322348564863205, 0.01751028560101986, 0.044266004115343094, -0.08622902631759644, 0.01692347787320614, 0.029507523402571678, 0.17586655914783478, -0.09397779405117035, -0.054059017449617386, 0.017856206744909286, -0.0735517144203186, -0.09483176469802856, 0.023899724707007408, 0.15362614393234253, 0.12716783583164215, 0.04239281266927719, 0.11225193738937378, 0.005287780426442623, 0.031053798273205757, -0.10445984452962875, 0.055133137851953506, 0.054230377078056335, -0.023689357563853264, 0.0066619329154491425, -0.044596366584300995, -0.26997843384742737, -0.1685042381286621, -0.031914133578538895, 0.1484157294034958, 0.1505187302827835, -0.006686170119792223, 0.16658127307891846, 0.26898685097694397, -0.11203384399414062, -0.25384098291397095, -0.04496101662516594, -0.014583076350390911, 0.03226125240325928, 0.018841644749045372, -0.23888172209262848, 0.08386663347482681, 0.011299271136522293, 0.00804231408983469, -0.09191672503948212, -0.17563292384147644, -0.13709628582000732, 0.16197022795677185, 0.002176437759771943, 0.0112027982249856, -0.07584565132856369, -0.07697638124227524, -0.05127378925681114, -0.12492631375789642, 0.05789240077137947, -0.11377577483654022, 0.075319305062294, 0.08036919683218002, 0.030856337398290634, 0.008221213705837727, 0.0405193567276001, 0.09349378943443298, 0.06895921379327774, -0.0044107926078140736, -0.049969274550676346, 0.025784173980355263, 0.049973782151937485, -0.009288448840379715, 0.08285675942897797, 0.047560565173625946, 0.004143192432820797, -0.04030865803360939, -0.08379510790109634, -0.08911482244729996, 0.08484649658203125, -0.07126110047101974, -0.019975094124674797, 
-0.04625339061021805, 0.0893058180809021, 0.02897801250219345, 0.002721735741943121, -0.07516594976186752, -0.10865334421396255, 0.0008468187879770994, 0.13118501007556915, 0.21692359447479248, -0.10851912200450897, 0.03541228547692299, -0.02976224012672901, -0.047455914318561554, 0.040034882724285126, -0.0012672210577875376, 0.056889262050390244, 0.04615021124482155, -0.004704349674284458, 0.09954379498958588, -0.012837469577789307, -0.11917006224393845, 0.006551558617502451, 0.02197428047657013, -0.06348612904548645, -0.21804401278495789, -0.060476429760456085, 0.03972978889942169, 0.004655276890844107, -0.004634345415979624, 0.17685607075691223, -0.014662468805909157, -0.07270220667123795, -0.0206281878054142, 0.05194823071360588, -0.02969120815396309, 0.037791281938552856, 0.018803970888257027, 0.028589870780706406, -0.08592874556779861, 0.05858153477311134, 0.12512801587581635, -0.09712186455726624, 0.05462725833058357, 0.0768178179860115, -0.06876738369464874, -0.06581340730190277, -0.12273053824901581, 0.06490450352430344, 0.0024172994308173656, -0.069181427359581, 0.04453534632921219, -0.13219279050827026, 0.007354696746915579, 0.06647590547800064, 0.01704840362071991, -0.0201403945684433, -0.026163354516029358, -0.019111813977360725, -0.06723564118146896, 0.01867624931037426, 0.06444589048624039, -0.03971084952354431, -0.11843415349721909, 0.09324079006910324, 0.02150539867579937, 0.06939399987459183, -0.030529402196407318, -0.05585755780339241, -0.10518457740545273, -0.008055515587329865, -0.0853835865855217, 0.01730605587363243, -0.1377062201499939, -0.012624607421457767, -0.04184010624885559, -0.04684850946068764, -0.013527491129934788, 0.047770388424396515, -0.04062121734023094, -0.029062222689390182, -0.037622421979904175, 0.06688674539327621, -0.13959987461566925, 0.07532919198274612, 0.06654061377048492, -0.04386529326438904, 0.10505253076553345, 0.016994226723909378, -0.04291985556483269, 0.055070627480745316, -0.20178104937076569, -0.038329850882291794, -0.01173931360244751, 0.030114011839032173, -0.0006057304563000798, -0.1613718718290329, 0.00033805021666921675, 0.017493903636932373, 0.021417612209916115, -0.012658197432756424, 0.09587417542934418, -0.028802314773201942, -0.0028959219343960285, -0.06232375279068947, -0.09304863214492798, -0.03576812148094177, 0.06936237215995789, 0.1090972051024437, 0.01754705049097538, 0.09639167785644531, -0.07809074968099594, 0.05905081704258919, -0.09912563860416412, 0.03739307075738907, -0.024829940870404243, 0.018660394474864006, -0.0026767123490571976, -0.10233152657747269, 0.06703615933656693, 0.011320097371935844, 0.08726707845926285, 0.022113248705863953, -0.04377520829439163, 0.05737092345952988, -0.0694413036108017, -0.1158156543970108, 0.03839302062988281, 0.12873150408267975, 0.07492754608392715, 0.0028984113596379757, 0.04743959382176399, -0.027692632749676704, 0.0056826453655958176, 0.15034544467926025, 0.18170489370822906, 0.17549669742584229, 0.09885536879301071, 0.0793197825551033, 0.052138425409793854, -0.03700428828597069, -0.06326037645339966, 0.05447102338075638, -0.10314682871103287, 0.00006095976277720183, -0.06476668268442154, -0.04036976024508476, 0.12020576745271683, -0.1615627557039261, 0.087430439889431, -0.012770256958901882, -0.10393627732992172, -0.1335952877998352, -0.1695752888917923, -0.07576319575309753, -0.039191339164972305, -0.0056332931853830814, -0.10920480638742447, 0.004562019370496273, -0.010191383771598339, 0.0035347032826393843, -0.09212695807218552, 0.06760764122009277, 
-0.1152120977640152, -0.10993536561727524, 0.16247308254241943, -0.01872607320547104, -0.0011512171477079391, 0.0052683367393910885, 0.025663137435913086, 0.0135013023391366, 0.11748209595680237, 0.062090374529361725, 0.07398325949907303, 0.028564035892486572, 0.06723703444004059, -0.08133912086486816, -0.07119298726320267, 0.023103171959519386, -0.007975165732204914, 0.07922052592039108, 0.1427372843027115, 0.06140829250216484, -0.05726216360926628, 0.005165209993720055, 0.14540711045265198, 0.024784257635474205, -0.11760139465332031, -0.15583060681819916, 0.0719277411699295, -0.014299856498837471, 0.009219761937856674, -0.0244972612708807, -0.09915483742952347, 0.010361372493207455, 0.2765165865421295, 0.18673308193683624, -0.05631660297513008, 0.014092513360083103, -0.011129381135106087, 0.023742439225316048, 0.048666760325431824, 0.08898915350437164, 0.0951792448759079, 0.15695683658123016, -0.026089049875736237, 0.03645656630396843, -0.028509685769677162, -0.09547057747840881, -0.10667447745800018, 0.12008322030305862, -0.01346802432090044, -0.023200461640954018, -0.015293351374566555, 0.15053409337997437, -0.11058050394058228, -0.1452403962612152, -0.053455766290426254, -0.02877955697476864, -0.08262879401445389, 0.03349399194121361, -0.020563576370477676, 0.08911707997322083, 0.08006447553634644, 0.0064037819392979145, 0.009535478428006172, 0.18254925310611725, 0.027975764125585556, 0.013231569901108742, 0.0267792996019125, 0.09703250974416733, -0.08730123937129974, 0.1484895497560501, -0.003694605315104127, 0.0491497702896595, 0.051317863166332245, 0.032061971724033356, -0.08059138059616089, 0.04823371395468712, 0.005166774149984121, 0.035906001925468445, 0.05224202200770378, 0.14868584275245667, 0.009975370950996876, 0.045233890414237976, 0.1027728021144867, -0.12712223827838898, 0.05572064593434334, 0.0358467735350132, 0.004073162097483873, -0.03387229144573212, 0.15512219071388245, -0.18094216287136078, 0.11379595845937729, 0.09553786367177963, -0.03932730481028557, -0.005854323040693998, -0.023118069395422935, 0.019015725702047348, -0.04552806168794632, 0.08381834626197815, -0.01636432111263275, -0.16281156241893768, 0.017695974558591843, -0.07304957509040833, 0.03746020793914795, -0.23906227946281433, -0.014709312468767166, -0.02095535583794117, -0.011881131678819656, -0.030412469059228897, 0.11524702608585358, 0.05341077968478203, -0.06054164096713066, -0.011153118684887886, -0.07624956220388412, 0.026953700929880142, 0.10632489621639252, -0.07849220931529999, -0.04746120423078537 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned [Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in cs (refer to Table 1 of the paper for more information). **Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)* **Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/) # Usage for inference The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets) ```python #!/usr/bin/env python3 from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC from datasets import load_dataset import torchaudio import torch # load model & processor model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-cs") processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-cs") # load dataset ds = load_dataset("common_voice", "cs", split="validation[:1%]") # Common Voice audio (48 kHz) does not match the model's 16 kHz sampling rate common_voice_sample_rate = 48000 target_sample_rate = 16000 resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate) # define mapping fn to read in sound file and resample def map_to_array(batch): speech, _ = torchaudio.load(batch["path"]) speech = resampler(speech) batch["speech"] = speech[0] return batch # load all audio files ds = ds.map(map_to_array) # run inference on the first 5 data samples inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True) # inference logits = model(**inputs).logits predicted_ids = torch.argmax(logits, dim=-1) print(processor.batch_decode(predicted_ids)) ```
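As a follow-up sketch (not part of the original card), the transcriptions above can be scored against the Common Voice references with the `jiwer` package. The lower-casing below is an assumed normalization and may need adjusting to this checkpoint's vocabulary.

```python
# assumes `ds`, `processor`, and `predicted_ids` from the snippet above
from jiwer import wer

predictions = processor.batch_decode(predicted_ids)

# Common Voice stores references in the "sentence" column; casing and
# punctuation normalization here is an assumption, not from the card
references = [s.lower() for s in ds[:5]["sentence"]]
hypotheses = [p.lower() for p in predictions]

print("WER:", wer(references, hypotheses))
```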
{"language": "cs", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-cs
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "cs", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "cs" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #cs #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in cs (refer to Table 1 of the paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference The following shows how the model can be used for inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in cs (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #cs #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in cs (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #cs #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in cs (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09631366282701492, 0.0948907881975174, -0.0022541130892932415, -0.01595556177198887, 0.10802698135375977, -0.08030663430690765, 0.13965362310409546, 0.05005420371890068, 0.08423089236021042, 0.07091820985078812, -0.0027372404001653194, 0.021284205839037895, 0.10653557628393173, 0.08852729201316833, 0.028314974159002304, -0.20170435309410095, 0.04828174039721489, -0.07041948288679123, 0.13475866615772247, 0.042460568249225616, 0.08562071621417999, -0.08541324734687805, 0.036748290061950684, 0.07211386412382126, -0.04649185389280319, 0.04453129693865776, -0.01995766907930374, -0.12864254415035248, 0.10238296538591385, 0.07842972874641418, -0.03618336841464043, 0.010365941561758518, 0.08375381678342819, -0.134815976023674, 0.024780862033367157, 0.03907442092895508, 0.007253353018313646, -0.0018219570629298687, 0.11568633466959, -0.08734266459941864, 0.08088069409132004, 0.01660391502082348, -0.004861493594944477, 0.09125879406929016, -0.08263272792100906, -0.15692923963069916, -0.044601812958717346, 0.08354868739843369, 0.021265452727675438, 0.08451739698648453, -0.10949379205703735, 0.059342555701732635, -0.028498591855168343, 0.04123876243829727, 0.13938191533088684, -0.2385992407798767, -0.014454125426709652, -0.011864584870636463, 0.07874391227960587, -0.00899670459330082, -0.04442631080746651, 0.06882037967443466, 0.04324401170015335, 0.006630433723330498, -0.09586668014526367, -0.030568409711122513, 0.01874111033976078, -0.11917835474014282, -0.09275148063898087, 0.03687046840786934, 0.2508065104484558, 0.08061929047107697, -0.08136139065027237, -0.10712160170078278, 0.03962355852127075, 0.2117529809474945, -0.05736381560564041, -0.09545864909887314, -0.004381853621453047, 0.008353134617209435, 0.027815544977784157, -0.05072886124253273, -0.06857322156429291, 0.004140963312238455, -0.047269612550735474, 0.07800296694040298, 0.002327218186110258, 0.00857456587255001, -0.029584815725684166, -0.01598210632801056, -0.047734569758176804, -0.07317323982715607, 0.029540855437517166, -0.07869517803192139, -0.07508884370326996, 0.002247302094474435, -0.044400013983249664, -0.1319674253463745, 0.05203846096992493, 0.016660340130329132, 0.09737800806760788, 0.020244445651769638, -0.15549162030220032, 0.036409810185432434, 0.10748977214097977, 0.0669855922460556, -0.20338480174541473, -0.07458784431219101, -0.006036919541656971, -0.03630679473280907, 0.01021184679120779, -0.03266342729330063, -0.07506491243839264, 0.013549452647566795, -0.031183242797851562, 0.034324049949645996, 0.008337466977536678, -0.03636264055967331, -0.09439822286367416, -0.10034600645303726, -0.0003041406744159758, -0.06170915812253952, 0.00003380089401616715, 0.07177184522151947, -0.04440457373857498, 0.1822587251663208, -0.018776003271341324, 0.0962233692407608, -0.10374642163515091, -0.08166174590587616, -0.015424486249685287, 0.033843159675598145, 0.041264746338129044, -0.06453771144151688, 0.044112708419561386, 0.026919187977910042, 0.01246209442615509, -0.16362054646015167, -0.04031365364789963, -0.0931784063577652, -0.03614804521203041, -0.08478103578090668, 0.0024975105188786983, -0.06658805906772614, 0.05235808715224266, -0.036012064665555954, -0.03582508862018585, -0.03105916641652584, -0.05624412000179291, 0.011181175708770752, -0.06076483428478241, 0.10269492864608765, 0.01756519079208374, 0.08701164275407791, -0.02150759845972061, -0.044396109879016876, -0.10459355264902115, 0.14287923276424408, -0.10594295710325241, 0.02710849978029728, -0.12869539856910706, -0.043579380959272385, -0.0531664676964283, 
0.049539804458618164, 0.0651857927441597, 0.12677350640296936, -0.22397200763225555, -0.10051017254590988, 0.17030304670333862, -0.07264110445976257, -0.042905330657958984, 0.20886942744255066, 0.024586006999015808, 0.08475778251886368, 0.10992715507745743, 0.2108188271522522, -0.018885554745793343, -0.14812584221363068, -0.058268237859010696, -0.03411132097244263, 0.0441281795501709, 0.07578060775995255, 0.043295614421367645, -0.05719534680247307, 0.04977578669786453, 0.0031826396007090807, 0.07811981439590454, -0.03756292909383774, -0.015005506575107574, -0.04902021214365959, 0.04248160496354103, -0.08721856027841568, 0.0742911547422409, -0.006612736731767654, -0.029190242290496826, -0.014093457721173763, -0.05337285250425339, -0.049142107367515564, 0.07130517065525055, -0.05468399077653885, 0.027584258466959, -0.12786231935024261, 0.11860838532447815, 0.012716772966086864, 0.029616227373480797, -0.14432936906814575, 0.11907385289669037, 0.00791883748024702, -0.021067162975668907, 0.11446614563465118, 0.08814114332199097, -0.028239654377102852, -0.015426086261868477, 0.009910458698868752, 0.02511521242558956, -0.03514578938484192, -0.00031253171619027853, -0.014136038720607758, -0.11564099043607712, 0.024830671027302742, -0.0615110918879509, 0.09566619247198105, -0.09472189098596573, -0.02202843688428402, 0.0614604577422142, 0.1010725274682045, 0.0036181751638650894, -0.011838823556900024, 0.06700388342142105, 0.05787286534905434, 0.037665609270334244, 0.004838208202272654, 0.03910410776734352, -0.012583186849951744, -0.02668493241071701, 0.07269766181707382, -0.08750732243061066, 0.05875062569975853, 0.11721070110797882, -0.08386024087667465, -0.016529414802789688, 0.04445170983672142, -0.0034257902298122644, -0.009144031442701817, -0.04324836656451225, -0.05164407193660736, 0.26682376861572266, 0.011903412640094757, 0.058603137731552124, -0.07091990113258362, 0.01894465461373329, 0.03839993476867676, -0.04692777246236801, -0.07806514948606491, 0.07655337452888489, 0.03048519231379032, -0.06775575131177902, 0.006749476306140423, 0.08897372335195541, 0.06563106924295425, 0.1702527403831482, 0.0054991343058645725, -0.09066053479909897, -0.01400927547365427, -0.07838888466358185, -0.04588983207941055, 0.06431180238723755, -0.25917142629623413, -0.05539296567440033, 0.024881931021809578, -0.014920612797141075, 0.027590110898017883, -0.01570782996714115, 0.023141365498304367, -0.03088180534541607, -0.033953893929719925, -0.024583380669355392, 0.0211455300450325, 0.007759639527648687, 0.08975180983543396, -0.011818330734968185, -0.05853388458490372, -0.04533112794160843, -0.06528890132904053, -0.11286617815494537, 0.07174690067768097, -0.07887215167284012, -0.4057677686214447, -0.025739872828125954, -0.06820295006036758, -0.06569690257310867, 0.019378097727894783, 0.032662685960531235, -0.10257538408041, -0.06962523609399796, -0.03773677721619606, 0.10423800349235535, 0.04360014572739601, -0.04028979688882828, 0.12041221559047699, 0.002494760090485215, 0.054888464510440826, -0.10282396525144577, 0.0001222008722834289, -0.07017596065998077, -0.04672396555542946, -0.06540060043334961, 0.05550418794155121, 0.049055952578783035, 0.10586496442556381, 0.0814313292503357, -0.004566263407468796, -0.010005977004766464, 0.16162027418613434, -0.12034175544977188, 0.005660673137754202, 0.19693295657634735, -0.0766160786151886, -0.02757532149553299, 0.10693242400884628, 0.020563362166285515, -0.10204168409109116, 0.04600149393081665, 0.011051294393837452, -0.0034770192578434944, -0.2372051626443863, 
-0.15073557198047638, -0.03250608965754509, 0.06697921454906464, 0.04158910736441612, -0.023507066071033478, 0.004909162409603596, -0.020209161564707756, -0.04138795658946037, 0.0013345311162993312, 0.03328096494078636, -0.0037273524794727564, 0.1475994884967804, -0.05966142937541008, 0.09884702414274216, -0.04643607512116432, -0.03358845040202141, 0.08841033279895782, 0.007672168780118227, 0.05952119082212448, 0.09972822666168213, 0.028782837092876434, 0.06258247047662735, 0.022567734122276306, 0.023034056648612022, -0.014441022649407387, 0.005693723913282156, 0.000025335557438666, -0.028672082349658012, -0.055011358112096786, -0.005660693626850843, 0.0638141855597496, 0.1844514161348343, -0.12863430380821228, -0.09252907335758209, -0.026149149984121323, 0.06637565046548843, 0.14699745178222656, 0.12784965336322784, -0.051050588488578796, -0.07908013463020325, 0.027163226157426834, -0.09229084849357605, -0.03818429261445999, 0.10141509771347046, 0.09787824749946594, -0.1594744175672531, 0.10299785435199738, 0.08426517993211746, 0.04964131489396095, -0.09757955372333527, 0.04685695841908455, -0.12936830520629883, -0.027374904602766037, 0.010892047546803951, 0.03685161843895912, -0.21003548800945282, 0.13593505322933197, 0.03417043015360832, 0.038599297404289246, -0.09897787123918533, 0.012128397822380066, 0.06160849705338478, -0.06482316553592682, 0.16134130954742432, -0.025445761159062386, 0.03273169323801994, 0.02402215078473091, -0.1173943355679512, 0.016337541863322258, 0.03190106898546219, -0.023076722398400307, 0.010055877268314362, 0.022142445668578148, 0.0007701101712882519, -0.026649905368685722, 0.05344634875655174, -0.22038784623146057, -0.11356505006551743, 0.030106017366051674, 0.023919526487588882, 0.1069551482796669, -0.024663468822836876, -0.10976243019104004, -0.17980676889419556, 0.10338091105222702, -0.046859461814165115, 0.008987043984234333, -0.07428842782974243, 0.08915629237890244, 0.03242730349302292, -0.015845168381929398, 0.028473788872361183, 0.07339873164892197, 0.1379341334104538, -0.058944474905729294, -0.05550486594438553, 0.06868571043014526, -0.10917070508003235, -0.11565202474594116, -0.01469971239566803, 0.20055435597896576, 0.14311771094799042, 0.050200656056404114, 0.07289258390665054, -0.02048366703093052, 0.031563371419906616, -0.06106747314333916, 0.1001996323466301, 0.09237156808376312, -0.07175235450267792, 0.10227695107460022, -0.003036823822185397, -0.34616950154304504, -0.12466273456811905, -0.019544752314686775, 0.18406713008880615, 0.0996265858411789, -0.01505871769040823, 0.14878256618976593, 0.279340535402298, -0.07127907127141953, -0.23411957919597626, 0.026206720620393753, 0.01856401562690735, 0.034462060779333115, 0.029944319278001785, -0.22957345843315125, 0.09735871851444244, 0.05137068033218384, 0.0005541656282730401, -0.010943356901407242, -0.18381009995937347, -0.13024580478668213, 0.24180659651756287, -0.0203948263078928, 0.18629106879234314, -0.003384685143828392, -0.05086643993854523, -0.06191720440983772, 0.039418283849954605, 0.05386827513575554, -0.12030843645334244, 0.07993420958518982, 0.07301235944032669, 0.019806064665317535, 0.02118683233857155, 0.03152410686016083, 0.0632195770740509, 0.09492060542106628, -0.01811150647699833, -0.052829284220933914, 0.06648237258195877, 0.031170248985290527, 0.05567378178238869, 0.08671143651008606, 0.08104756474494934, -0.04797445610165596, -0.013254262506961823, -0.07090385258197784, -0.07386064529418945, 0.09401311725378036, -0.003401687601581216, -0.030041255056858063, 
-0.018064439296722412, 0.06340436637401581, -0.015316705219447613, 0.03309888765215874, -0.07310082018375397, -0.09619222581386566, 0.0375107079744339, 0.11082057654857635, 0.23490899801254272, -0.0453445203602314, 0.024595214053988457, -0.026509763672947884, -0.06925663352012634, 0.04245863854885101, 0.04246596619486809, 0.0328940786421299, 0.07320550829172134, 0.012520433403551579, 0.11040935665369034, -0.009975697845220566, -0.1411789506673813, 0.08382855355739594, 0.03014182113111019, -0.08062181621789932, -0.17299962043762207, -0.04578982666134834, 0.031233036890625954, -0.0011031237663701177, 0.036909230053424835, 0.2107527256011963, -0.03479788079857826, -0.05387941375374794, -0.021409638226032257, 0.06785188615322113, -0.03049888089299202, 0.09801218658685684, 0.04992842301726341, 0.007234231103211641, -0.1351192444562912, 0.08515717834234238, 0.07114874571561813, -0.06528039276599884, 0.09812255203723907, -0.0015627129469066858, -0.02917114645242691, -0.07336784154176712, -0.21337223052978516, 0.051176030188798904, 0.07527747750282288, -0.08889126777648926, -0.017202304676175117, -0.13376480340957642, 0.04340153560042381, 0.11471336334943771, 0.005291752517223358, -0.015603862702846527, -0.06222239509224892, -0.051279228180646896, -0.005656498018652201, 0.08656523376703262, 0.058349646627902985, -0.07424838095903397, -0.0929379016160965, 0.038484007120132446, 0.03184238448739052, 0.07011114805936813, -0.02744293212890625, -0.10133323073387146, -0.09413506835699081, 0.01561973337084055, -0.18908655643463135, -0.03429228067398071, -0.10706617683172226, -0.00040179945062845945, -0.01645900495350361, -0.04386431351304054, -0.02886727824807167, 0.019354989752173424, -0.05797908827662468, 0.0318995825946331, 0.034696534276008606, 0.09992755204439163, -0.14945383369922638, 0.04599226638674736, 0.026690108701586723, -0.031159939244389534, 0.13025487959384918, 0.07949385792016983, -0.03619819134473801, 0.058211877942085266, -0.18766850233078003, -0.048263825476169586, 0.068373903632164, 0.08144573867321014, 0.011929824948310852, -0.14058220386505127, 0.00148573808837682, 0.022211745381355286, 0.0008832218591123819, -0.01233174093067646, 0.08945052325725555, -0.016815368086099625, 0.022233858704566956, -0.06980087608098984, -0.022801531478762627, -0.016798246651887894, 0.029021354392170906, 0.06942443549633026, 0.05083579197525978, 0.05636650323867798, -0.09032280743122101, 0.067170649766922, -0.0741644874215126, 0.03826677054166794, -0.04410108923912048, -0.0206308476626873, 0.0423755943775177, -0.09889928251504898, 0.08630712330341339, -0.005252200178802013, 0.17371992766857147, 0.012975870631635189, 0.02375439926981926, 0.024847280234098434, -0.06264451146125793, -0.12851989269256592, 0.05879095941781998, -0.006892892997711897, 0.0668429359793663, -0.003622370772063732, -0.06169246509671211, -0.014777032658457756, 0.0792924240231514, 0.16957662999629974, 0.058649126440286636, 0.08918841928243637, 0.08168992400169373, 0.023021820932626724, 0.13882456719875336, -0.09549251943826675, -0.07304604351520538, 0.03688090294599533, -0.12555566430091858, 0.07848027348518372, -0.07105637341737747, -0.0013722749426960945, 0.00506649911403656, -0.07946070283651352, 0.045244745910167694, -0.021002445369958878, -0.04541729763150215, -0.09710274636745453, -0.09667012095451355, -0.05077409744262695, -0.07681164145469666, -0.009299936704337597, -0.09143159538507462, 0.003834550501778722, -0.023325271904468536, 0.039612509310245514, -0.10247786343097687, 0.11760900914669037, -0.0998203381896019, 
-0.12753772735595703, 0.20325374603271484, -0.03397819772362709, -0.019596273079514503, -0.09732063114643097, 0.015181470662355423, 0.03571121022105217, 0.12592388689517975, 0.031237652525305748, 0.04472161829471588, -0.05445456504821777, 0.024477267637848854, -0.06984183937311172, -0.07765497267246246, 0.02069752663373947, -0.048817142844200134, -0.0031410513911396265, 0.1438988894224167, 0.11323998868465424, -0.03278356045484543, 0.020056970417499542, 0.10455343872308731, -0.03091449663043022, -0.01127302460372448, -0.15464366972446442, -0.028233502060174942, -0.016371017321944237, 0.0364484004676342, -0.04173464700579643, -0.0961490347981453, -0.015438782051205635, 0.20030908286571503, 0.15843917429447174, -0.0808032751083374, 0.036386892199516296, -0.003937433008104563, 0.006224006414413452, -0.0019879858009517193, 0.03583366051316261, 0.13234655559062958, 0.213056281208992, -0.003896131878718734, 0.016565043479204178, 0.0008507083402946591, -0.11161350458860397, -0.023847924545407295, 0.1026012971997261, -0.0023801284842193127, -0.0238328967243433, -0.007951414212584496, 0.17072942852973938, -0.18623091280460358, -0.22605381906032562, -0.11588186025619507, -0.09839878231287003, -0.136408731341362, 0.003930554259568453, -0.067189060151577, 0.17510464787483215, 0.05355961248278618, -0.010415975004434586, -0.011634859256446362, 0.14961479604244232, 0.03485139459371567, -0.009562687017023563, -0.042829059064388275, 0.045700132846832275, -0.1273944228887558, 0.055898942053318024, -0.024503638967871666, 0.019844012334942818, 0.029371021315455437, 0.09410960227251053, -0.04121540114283562, 0.0227877926081419, 0.011900801211595535, 0.010677933692932129, 0.032324787229299545, 0.13438059389591217, 0.021500861272215843, 0.15542075037956238, 0.12946297228336334, -0.0888310968875885, 0.008432810194790363, 0.002149576786905527, -0.017577221617102623, -0.03996799513697624, 0.12279421836137772, -0.11586951464414597, 0.08747922629117966, 0.051639098674058914, -0.055946409702301025, -0.04063541814684868, -0.017954451963305473, 0.025077955797314644, -0.019815223291516304, -0.001295153982937336, -0.0127760274335742, -0.20911037921905518, 0.014887016266584396, -0.11439812928438187, 0.057875342667102814, -0.1852617710828781, -0.010224228724837303, 0.00892576389014721, -0.017136504873633385, -0.028334952890872955, 0.04628424346446991, 0.04612474888563156, -0.04023858904838562, -0.018340345472097397, 0.047291338443756104, 0.035204820334911346, 0.10582153499126434, -0.07182194292545319, -0.042162034660577774 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in de (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-de")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-de")

# load dataset
ds = load_dataset("common_voice", "de", split="validation[:1%]")

# Common Voice audio is sampled at 48 kHz, but the model expects 16 kHz
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy (argmax) CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
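As a more compact alternative, the `automatic-speech-recognition` pipeline wraps the processor and model and decodes local audio files directly. A minimal sketch, assuming a recent `transformers` version with pipeline support and `ffmpeg` available for audio decoding; `sample.wav` is a hypothetical local recording, not a file from the original card:

```python
from transformers import pipeline

# the pipeline bundles feature extraction, the forward pass, and CTC decoding;
# file inputs are resampled to the model's 16 kHz rate internally
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-base-10k-voxpopuli-ft-de")

print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder path
```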
{"language": "de", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-de
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "de", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "de" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #de #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in de (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in de (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #de #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in de (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #de #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in de (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09261883050203323, 0.0913657695055008, -0.0023520628456026316, -0.015435568057000637, 0.10653623193502426, -0.08108346164226532, 0.14277715981006622, 0.04983903467655182, 0.08404497802257538, 0.06712993234395981, -0.0031770383939146996, 0.02637823484838009, 0.10477767884731293, 0.09073761105537415, 0.02865098975598812, -0.20603863894939423, 0.04916490986943245, -0.07072289288043976, 0.13764438033103943, 0.0412927083671093, 0.08715759962797165, -0.08292446285486221, 0.03570973128080368, 0.07424740493297577, -0.04431439936161041, 0.04882023483514786, -0.018962476402521133, -0.13013920187950134, 0.10355517268180847, 0.07793022692203522, -0.038415856659412384, 0.0077249519526958466, 0.08102624118328094, -0.1344616562128067, 0.023752287030220032, 0.04209192842245102, 0.007443006616085768, -0.0034253369085490704, 0.11624030768871307, -0.08811462670564651, 0.08162783831357956, 0.015105986036360264, -0.00633207056671381, 0.09127277135848999, -0.08390063047409058, -0.16284845769405365, -0.04240017756819725, 0.08052610605955124, 0.02092099003493786, 0.0832730233669281, -0.11095785349607468, 0.05907434970140457, -0.025480620563030243, 0.04118453711271286, 0.1371472030878067, -0.23805581033229828, -0.014100243337452412, -0.016085555776953697, 0.07961924374103546, -0.011416016146540642, -0.04334237053990364, 0.07013996690511703, 0.04515959322452545, 0.009089485742151737, -0.09909278154373169, -0.030681006610393524, 0.027198748663067818, -0.12232854962348938, -0.09258542954921722, 0.039567284286022186, 0.24925656616687775, 0.07935106754302979, -0.08380544930696487, -0.10630910098552704, 0.04049243777990341, 0.22021016478538513, -0.05630876496434212, -0.09832131862640381, -0.003985874820500612, 0.008132867515087128, 0.02764097787439823, -0.04909634217619896, -0.06653497368097305, 0.00429402245208621, -0.04817178472876549, 0.0838858112692833, 0.004508991725742817, 0.007971233688294888, -0.03292592614889145, -0.01713450253009796, -0.05037198215723038, -0.07233180105686188, 0.030791468918323517, -0.08096480369567871, -0.07067837566137314, 0.005412247963249683, -0.04387719929218292, -0.13529102504253387, 0.05320344865322113, 0.016254622489213943, 0.09564510732889175, 0.019795561209321022, -0.15602773427963257, 0.03679567947983742, 0.11167886108160019, 0.06346580386161804, -0.20440487563610077, -0.08084490150213242, -0.007248568348586559, -0.03757680207490921, 0.011163252405822277, -0.029208336025476456, -0.07206487655639648, 0.013360274955630302, -0.030776390805840492, 0.032262541353702545, 0.01171217393130064, -0.03609859198331833, -0.09531056880950928, -0.1027325764298439, -0.007870381698012352, -0.061676397919654846, -0.00008491198968840763, 0.06833814084529877, -0.04651753976941109, 0.18441134691238403, -0.02124992571771145, 0.09805230051279068, -0.10676229000091553, -0.08305517584085464, -0.01345134899020195, 0.03282143175601959, 0.04082423448562622, -0.06678526848554611, 0.04317399859428406, 0.026730192825198174, 0.014965892769396305, -0.1655845195055008, -0.043046142905950546, -0.09307819604873657, -0.03293917328119278, -0.08799780905246735, 0.0029080049134790897, -0.06775181740522385, 0.05578071251511574, -0.037283364683389664, -0.0361546091735363, -0.029689321294426918, -0.056151133030653, 0.012089954689145088, -0.05910063534975052, 0.10401417315006256, 0.016728121787309647, 0.08891774713993073, -0.021224291995167732, -0.04502424597740173, -0.1046377569437027, 0.14498263597488403, -0.10603939741849899, 0.025336362421512604, -0.13141478598117828, -0.04454652592539787, -0.0536995567381382, 
0.05230138823390007, 0.0675470307469368, 0.12472733855247498, -0.2230251133441925, -0.10307108610868454, 0.1667458713054657, -0.0737801045179367, -0.04097834974527359, 0.212286576628685, 0.024260321632027626, 0.09190893918275833, 0.11115552484989166, 0.2154129296541214, -0.019714394584298134, -0.14621534943580627, -0.05691289156675339, -0.034005306661129, 0.041060902178287506, 0.0792446881532669, 0.04181041941046715, -0.05514552816748619, 0.04795478656888008, 0.002633244264870882, 0.0809689536690712, -0.0358433723449707, -0.013446328230202198, -0.04807598516345024, 0.044425200670957565, -0.08601541072130203, 0.07759924232959747, -0.00814814493060112, -0.029289716854691505, -0.015096542425453663, -0.054137878119945526, -0.042883843183517456, 0.07025018334388733, -0.05275300517678261, 0.026674386113882065, -0.12815049290657043, 0.1170894131064415, 0.01269950345158577, 0.030096612870693207, -0.141677588224411, 0.12582701444625854, 0.00630551902577281, -0.02762894332408905, 0.11519069969654083, 0.09169387817382812, -0.027514493092894554, -0.015872672200202942, 0.009610703215003014, 0.024049565196037292, -0.036419179290533066, -0.00031817081617191434, -0.013049150817096233, -0.11596711724996567, 0.027680598199367523, -0.06258866935968399, 0.08766809105873108, -0.09062600880861282, -0.02319994568824768, 0.055231258273124695, 0.10149072855710983, 0.003495363751426339, -0.010148595087230206, 0.06547807902097702, 0.0582573227584362, 0.037304457277059555, 0.005843997001647949, 0.041253481060266495, -0.014313261955976486, -0.026224836707115173, 0.07239121943712234, -0.08576319366693497, 0.06383628398180008, 0.11643562465906143, -0.08950534462928772, -0.01430183183401823, 0.047621458768844604, -0.004401452373713255, -0.010515705682337284, -0.04291530326008797, -0.05246947333216667, 0.27018260955810547, 0.011247910559177399, 0.05701856315135956, -0.07078057527542114, 0.018701428547501564, 0.0381280891597271, -0.048657167702913284, -0.08079373091459274, 0.07579229772090912, 0.02646159753203392, -0.07283543050289154, 0.006630076095461845, 0.09182374179363251, 0.06485653668642044, 0.1756075918674469, 0.005261356942355633, -0.08778643608093262, -0.013902343809604645, -0.07879138737916946, -0.048105236142873764, 0.061341866850852966, -0.26231247186660767, -0.055398572236299515, 0.024158775806427002, -0.013146787881851196, 0.029444487765431404, -0.01401212066411972, 0.022820696234703064, -0.031600140035152435, -0.031271111220121384, -0.0267085749655962, 0.021945394575595856, 0.006948466878384352, 0.08885883539915085, -0.011741949245333672, -0.05949227139353752, -0.04320615530014038, -0.06614893674850464, -0.11122368276119232, 0.07052882760763168, -0.07665660232305527, -0.4081614315509796, -0.02349778078496456, -0.07081515341997147, -0.0670095905661583, 0.01788775436580181, 0.034519318491220474, -0.1041419729590416, -0.06842435896396637, -0.03372390940785408, 0.1112319752573967, 0.04104235768318176, -0.04047679901123047, 0.11981770396232605, 0.003813396440818906, 0.057221803814172745, -0.10402663797140121, -0.0014794707531109452, -0.07010568678379059, -0.04558705538511276, -0.06554342806339264, 0.054025761783123016, 0.0457339845597744, 0.09994587302207947, 0.0809985101222992, -0.0045355395413935184, -0.009994500316679478, 0.1635674685239792, -0.12036900222301483, 0.004617484286427498, 0.19820259511470795, -0.07427262514829636, -0.02602478303015232, 0.10936515778303146, 0.019083328545093536, -0.10293322056531906, 0.04737798497080803, 0.014842823147773743, -0.0012844259617850184, -0.23879340291023254, 
-0.1492318958044052, -0.02986999787390232, 0.06933681666851044, 0.04204392433166504, -0.02477109245955944, 0.004843220580369234, -0.02046152576804161, -0.039996884763240814, 0.0042700814083218575, 0.03435129299759865, -0.003740607760846615, 0.15111000835895538, -0.06205856427550316, 0.10009846836328506, -0.04598817229270935, -0.036049984395504, 0.08813470602035522, 0.0065139103680849075, 0.059191614389419556, 0.10089468210935593, 0.022893231362104416, 0.0632176324725151, 0.015698274597525597, 0.020304322242736816, -0.012830524705350399, 0.003843357553705573, 0.0005184971960261464, -0.03086005710065365, -0.05461474135518074, -0.0028480482287704945, 0.06470637023448944, 0.18101327121257782, -0.12915687263011932, -0.09199182689189911, -0.024004165083169937, 0.06750538945198059, 0.1418720930814743, 0.13000990450382233, -0.047336068004369736, -0.07811731845140457, 0.02943219058215618, -0.09632200002670288, -0.03853125125169754, 0.10219430178403854, 0.09441067278385162, -0.15937799215316772, 0.10578344762325287, 0.08415750414133072, 0.05007384344935417, -0.09826364368200302, 0.04890983924269676, -0.13158228993415833, -0.02537856064736843, 0.01117878407239914, 0.03554151952266693, -0.21255207061767578, 0.13783138990402222, 0.03273497149348259, 0.040676143020391464, -0.09778860956430435, 0.013289553113281727, 0.06158774718642235, -0.06740441918373108, 0.16175276041030884, -0.024647127836942673, 0.03353869542479515, 0.03154653310775757, -0.1145544946193695, 0.017618367448449135, 0.029930811375379562, -0.020038193091750145, 0.00928792729973793, 0.020405666902661324, 0.0041131083853542805, -0.026683561503887177, 0.05499165877699852, -0.22633199393749237, -0.11120526492595673, 0.02973298355937004, 0.019999265670776367, 0.10878951102495193, -0.023907622322440147, -0.11207333207130432, -0.1762729436159134, 0.10502562671899796, -0.042558927088975906, 0.0068108937703073025, -0.07312104851007462, 0.09300924837589264, 0.02933555282652378, -0.015581678599119186, 0.025459300726652145, 0.07398408651351929, 0.1339578628540039, -0.060087233781814575, -0.05581438168883324, 0.06844142824411392, -0.10956431925296783, -0.11043427139520645, -0.014110776595771313, 0.19602298736572266, 0.1447836309671402, 0.0488600954413414, 0.0731201246380806, -0.02221929095685482, 0.03223387524485588, -0.059743788093328476, 0.10204020887613297, 0.09276849776506424, -0.07444946467876434, 0.09822549670934677, -0.0028907519299536943, -0.3484407365322113, -0.12568840384483337, -0.01970551162958145, 0.18221670389175415, 0.09770788997411728, -0.01179188210517168, 0.1477694809436798, 0.2836582362651825, -0.07207266986370087, -0.2334236353635788, 0.02392016537487507, 0.02120056003332138, 0.03347288817167282, 0.02798890322446823, -0.23184873163700104, 0.09745746850967407, 0.05313444882631302, 0.0015123119810596108, -0.01136974897235632, -0.18245230615139008, -0.12980525195598602, 0.24298089742660522, -0.018270015716552734, 0.19454456865787506, -0.00008550405618734658, -0.05025450140237808, -0.06287568807601929, 0.040829792618751526, 0.04993352293968201, -0.1191161647439003, 0.0806145891547203, 0.0737021341919899, 0.021528342738747597, 0.020149821415543556, 0.03280320018529892, 0.06448815017938614, 0.0943179726600647, -0.01799878105521202, -0.05376265197992325, 0.0653187707066536, 0.02728971838951111, 0.0563029982149601, 0.08885065466165543, 0.07947040349245071, -0.04918770492076874, -0.013744073919951916, -0.07210253924131393, -0.07374119758605957, 0.09674832224845886, -0.0032563835848122835, -0.033186376094818115, -0.013968373648822308, 
0.06589996814727783, -0.017109718173742294, 0.03319389373064041, -0.07560724765062332, -0.09465672820806503, 0.036833055317401886, 0.1081436425447464, 0.2342653125524521, -0.04355311393737793, 0.022943025454878807, -0.027744390070438385, -0.0698324516415596, 0.043876953423023224, 0.04213446378707886, 0.03159957006573677, 0.07290317118167877, 0.011715509928762913, 0.11150876432657242, -0.010141851380467415, -0.14302563667297363, 0.08443714678287506, 0.03145445138216019, -0.0778840184211731, -0.17789243161678314, -0.04485661908984184, 0.03142986819148064, 0.0024751187302172184, 0.039217304438352585, 0.21204882860183716, -0.03565075621008873, -0.05539897084236145, -0.022574247792363167, 0.06794200837612152, -0.033295467495918274, 0.09986866265535355, 0.047940559685230255, 0.005758856423199177, -0.13429245352745056, 0.08610685914754868, 0.06937826424837112, -0.06474508345127106, 0.1007009968161583, -0.0021587107330560684, -0.02719292603433132, -0.07361677289009094, -0.21757936477661133, 0.053728505969047546, 0.07494234293699265, -0.08962997049093246, -0.01722395420074463, -0.1333197057247162, 0.04361684247851372, 0.10810375958681107, 0.005361880641430616, -0.016110306605696678, -0.06385064125061035, -0.05048583820462227, -0.0017770605627447367, 0.08530978113412857, 0.06438054889440536, -0.07564570754766464, -0.09364161640405655, 0.03635701909661293, 0.033062856644392014, 0.06919568032026291, -0.027156107127666473, -0.10086525231599808, -0.09601332992315292, 0.014082019217312336, -0.1956315040588379, -0.035802293568849564, -0.10616301000118256, 0.0007360426243394613, -0.015171637758612633, -0.04463890939950943, -0.028889000415802002, 0.019429659470915794, -0.05824052542448044, 0.031213440001010895, 0.03690249100327492, 0.09847165644168854, -0.14945246279239655, 0.04688301309943199, 0.02701437845826149, -0.03046903759241104, 0.13059361279010773, 0.0796487107872963, -0.03542294725775719, 0.05709990859031677, -0.1841251105070114, -0.047724850475788116, 0.06789304316043854, 0.07862110435962677, 0.011901527643203735, -0.14557476341724396, 0.002530851401388645, 0.02098648063838482, 0.0029712540563195944, -0.012861138209700584, 0.08765843510627747, -0.01844766177237034, 0.027703186497092247, -0.06979726999998093, -0.025784458965063095, -0.017563125118613243, 0.028087928891181946, 0.0662083551287651, 0.05078752338886261, 0.05781480297446251, -0.09143014997243881, 0.0665888786315918, -0.07346902042627335, 0.039117176085710526, -0.04372043162584305, -0.02217744290828705, 0.04829585179686546, -0.09965481609106064, 0.0838000550866127, -0.005475820507854223, 0.16956186294555664, 0.018792688846588135, 0.01921791210770607, 0.025180984288454056, -0.06855688989162445, -0.13213291764259338, 0.05807740241289139, -0.00744479987770319, 0.06415888667106628, -0.002418399788439274, -0.06969284266233444, -0.015456832945346832, 0.07876208424568176, 0.17073756456375122, 0.05451507493853569, 0.09415160119533539, 0.08014657348394394, 0.02216586470603943, 0.13907261192798615, -0.09382149577140808, -0.07910826802253723, 0.03841982036828995, -0.12017639726400375, 0.07974899560213089, -0.07113511860370636, -0.004180945921689272, 0.0058899917639791965, -0.07758133113384247, 0.045065294951200485, -0.02046406827867031, -0.044048015028238297, -0.09840504080057144, -0.0985274463891983, -0.050662800669670105, -0.07682435214519501, -0.009122908115386963, -0.08988884091377258, 0.0040983096696436405, -0.02887648344039917, 0.04173271358013153, -0.10560516268014908, 0.11543656140565872, -0.09908665716648102, -0.1299867331981659, 
0.2051309049129486, -0.03519892320036888, -0.020556166768074036, -0.096932053565979, 0.01820569671690464, 0.03308560699224472, 0.12974485754966736, 0.028928469866514206, 0.04516620561480522, -0.05755361542105675, 0.023592213168740273, -0.07064773887395859, -0.07658470422029495, 0.020180976018309593, -0.048116009682416916, -0.004955668468028307, 0.14607402682304382, 0.11367124319076538, -0.03441770002245903, 0.02018951252102852, 0.10465702414512634, -0.029853535816073418, -0.013999799266457558, -0.15467974543571472, -0.019517483189702034, -0.017097102478146553, 0.038311537355184555, -0.043450236320495605, -0.09622110426425934, -0.017663218080997467, 0.19897358119487762, 0.16226036846637726, -0.07668855041265488, 0.03448965400457382, -0.0024459597188979387, 0.0063210753723979, -0.0015296910423785448, 0.03463302180171013, 0.13290466368198395, 0.21147209405899048, -0.005148537922650576, 0.02113659866154194, -0.0012471646768972278, -0.11096746474504471, -0.023749297484755516, 0.10277801752090454, 0.0007283890736289322, -0.02384425885975361, -0.008352339267730713, 0.17215581238269806, -0.18547779321670532, -0.22671587765216827, -0.12017074972391129, -0.09540795534849167, -0.1381540149450302, 0.003685791976749897, -0.06408699601888657, 0.17654503881931305, 0.057766202837228775, -0.00896315835416317, -0.01151534914970398, 0.14869928359985352, 0.03562624379992485, -0.008385471068322659, -0.04297134280204773, 0.04532945156097412, -0.12185999006032944, 0.05327322706580162, -0.023723848164081573, 0.017584607005119324, 0.03032885491847992, 0.09171665459871292, -0.040272437036037445, 0.022912874817848206, 0.010182525962591171, 0.015124486759305, 0.03186226636171341, 0.1339329332113266, 0.022431975230574608, 0.15292368829250336, 0.12919580936431885, -0.09046421945095062, 0.005541080608963966, 0.007678682450205088, -0.019778359681367874, -0.03918595612049103, 0.12742376327514648, -0.11683595180511475, 0.08550549298524857, 0.05391915515065193, -0.054181382060050964, -0.04040950536727905, -0.020701926201581955, 0.02744780294597149, -0.01797705702483654, -0.006262151058763266, -0.012101726606488228, -0.20865462720394135, 0.015299131162464619, -0.11380261927843094, 0.05628005042672157, -0.18600033223628998, -0.009703800082206726, 0.011367264203727245, -0.01579119637608528, -0.027254333719611168, 0.04494667053222656, 0.04615413025021553, -0.03952540084719658, -0.01749281771481037, 0.0414760559797287, 0.03482731804251671, 0.10400144010782242, -0.06717853993177414, -0.04045387730002403 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in en (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-en")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-en")

# load dataset
ds = load_dataset("common_voice", "en", split="validation[:1%]")

# Common Voice audio is sampled at 48 kHz, but the model expects 16 kHz
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy (argmax) CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
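The manual `torchaudio` resampling above can also be delegated to the `datasets` library itself via its `Audio` feature. A minimal sketch, assuming a `datasets` version where `common_voice` exposes an `audio` column (this variant is not part of the original card):

```python
from datasets import load_dataset, Audio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
import torch

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-en")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-en")

ds = load_dataset("common_voice", "en", split="validation[:1%]")
# decode and resample to 16 kHz lazily, whenever a row is accessed
ds = ds.cast_column("audio", Audio(sampling_rate=16000))

sample = ds[0]["audio"]["array"]
inputs = processor(sample, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1)))
```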
{"language": "en", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-en
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "en", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "en" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #en #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in en (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in en (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #en #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in en (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #en #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in en (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09317535161972046, 0.09072143584489822, -0.0023520332761108875, -0.015353977680206299, 0.10651984810829163, -0.08184587210416794, 0.14261162281036377, 0.04930124059319496, 0.08315345644950867, 0.06726337224245071, -0.0036870548501610756, 0.02421690709888935, 0.10500840842723846, 0.08925766497850418, 0.02906256541609764, -0.20517753064632416, 0.048594191670417786, -0.07209651917219162, 0.13815870881080627, 0.04161417484283447, 0.0868060439825058, -0.08397656679153442, 0.03539456054568291, 0.0752967968583107, -0.044694893062114716, 0.04847894236445427, -0.018772626295685768, -0.13011638820171356, 0.10300048440694809, 0.07856955379247665, -0.038083817809820175, 0.006899081636220217, 0.08183592557907104, -0.13473786413669586, 0.023616312071681023, 0.041360411792993546, 0.008526218123733997, -0.0030959874857217073, 0.11685021221637726, -0.08783028274774551, 0.07798992097377777, 0.015814632177352905, -0.006393907126039267, 0.09112713485956192, -0.08251041173934937, -0.16059967875480652, -0.04222320392727852, 0.08046797662973404, 0.020489085465669632, 0.08324599266052246, -0.11125298589468002, 0.060232046991586685, -0.02522406168282032, 0.04203324392437935, 0.13816215097904205, -0.2384054958820343, -0.014274039305746555, -0.018513666465878487, 0.0788985937833786, -0.011522917076945305, -0.04255485162138939, 0.07056186348199844, 0.045184873044490814, 0.00885424017906189, -0.09997526556253433, -0.030788691714406013, 0.024462023749947548, -0.12316609174013138, -0.09326045215129852, 0.038710638880729675, 0.25041037797927856, 0.08054031431674957, -0.08443435281515121, -0.10600695759057999, 0.040890537202358246, 0.22000768780708313, -0.0555725060403347, -0.09770721197128296, -0.004729934968054295, 0.007260063197463751, 0.02768072299659252, -0.05001332238316536, -0.06739027798175812, 0.004313575569540262, -0.0493289977312088, 0.0850653201341629, 0.005209789611399174, 0.009915481321513653, -0.032822322100400925, -0.01770024187862873, -0.050862278789281845, -0.07141593098640442, 0.03108586184680462, -0.07986892759799957, -0.07223526388406754, 0.00446005305275321, -0.0446651466190815, -0.13594184815883636, 0.052554693073034286, 0.014910726808011532, 0.09478697180747986, 0.019674455747008324, -0.15739299356937408, 0.03737037628889084, 0.11149981617927551, 0.06351187825202942, -0.2039824277162552, -0.078229159116745, -0.0074272011406719685, -0.03903336450457573, 0.012533510103821754, -0.029974762350320816, -0.07355506718158722, 0.013212625868618488, -0.03112497739493847, 0.03157072514295578, 0.010366762988269329, -0.03629113361239433, -0.09479553252458572, -0.10267934948205948, -0.012363945133984089, -0.0621737502515316, -0.00022753413941245526, 0.069045290350914, -0.046854596585035324, 0.1844993382692337, -0.020335525274276733, 0.09913937002420425, -0.10712713748216629, -0.08361756056547165, -0.013290300965309143, 0.03470495343208313, 0.042230717837810516, -0.0667090117931366, 0.043400734663009644, 0.028659213334321976, 0.014865357428789139, -0.1658872365951538, -0.04184665530920029, -0.09188465774059296, -0.03165154904127121, -0.0862261950969696, 0.002452318323776126, -0.06722923368215561, 0.05714254453778267, -0.037092797458171844, -0.03703443333506584, -0.027981992810964584, -0.055302221328020096, 0.011912164278328419, -0.059407807886600494, 0.10404885560274124, 0.017608504742383957, 0.08903771638870239, -0.021203288808465004, -0.04522091522812843, -0.10069748014211655, 0.1445772647857666, -0.10487262904644012, 0.025274641811847687, -0.13115933537483215, -0.04609377309679985, -0.05311675742268562, 
0.051780883222818375, 0.06813478469848633, 0.12431658804416656, -0.2216624915599823, -0.10243914276361465, 0.16535592079162598, -0.07375586777925491, -0.04178290069103241, 0.21258895099163055, 0.025045054033398628, 0.09167052805423737, 0.1110040470957756, 0.21642912924289703, -0.018798334524035454, -0.1485644429922104, -0.056741055101156235, -0.03533878177404404, 0.04058799147605896, 0.0788818821310997, 0.04247519001364708, -0.05444572493433952, 0.04645518213510513, 0.002853192389011383, 0.08272866904735565, -0.03618336096405983, -0.013803430832922459, -0.04858459532260895, 0.04401203989982605, -0.08655280619859695, 0.07706174999475479, -0.0076839979737997055, -0.028773358091711998, -0.015009161084890366, -0.053950462490320206, -0.042794838547706604, 0.07001245021820068, -0.052752476185560226, 0.027656838297843933, -0.12921853363513947, 0.11721128225326538, 0.012202348560094833, 0.0298345685005188, -0.14167927205562592, 0.1247931644320488, 0.005526039749383926, -0.026924926787614822, 0.11552086472511292, 0.08931334316730499, -0.02764805220067501, -0.014999610371887684, 0.009805616922676563, 0.02383725717663765, -0.03474196419119835, -0.0009129380923695862, -0.01306847296655178, -0.11596184968948364, 0.026455296203494072, -0.06155088171362877, 0.08822604268789291, -0.089385986328125, -0.024350624531507492, 0.05342121049761772, 0.10028970241546631, 0.002728516235947609, -0.010596510022878647, 0.06643979996442795, 0.05856018513441086, 0.03742039203643799, 0.006669791880995035, 0.04128511995077133, -0.014548078179359436, -0.02641620859503746, 0.0722440555691719, -0.08429548889398575, 0.06231851130723953, 0.11601413786411285, -0.09035507589578629, -0.014324464835226536, 0.04623851925134659, -0.0041309529915452, -0.011479968205094337, -0.04481815919280052, -0.05135245621204376, 0.2711741030216217, 0.01060984656214714, 0.057466812431812286, -0.06950642168521881, 0.020388931035995483, 0.038276128470897675, -0.05009646713733673, -0.08105184137821198, 0.07728926837444305, 0.02518937736749649, -0.0721447691321373, 0.005308887921273708, 0.0906420424580574, 0.0627691000699997, 0.1748373806476593, 0.006000483874231577, -0.08824295550584793, -0.01464785821735859, -0.07895828783512115, -0.04759593680500984, 0.05967888981103897, -0.2623042166233063, -0.053913600742816925, 0.024735674262046814, -0.013057509437203407, 0.02844962850213051, -0.013979227282106876, 0.022740544751286507, -0.03165094554424286, -0.03206711634993553, -0.02786148339509964, 0.021318569779396057, 0.007710879202932119, 0.08839172124862671, -0.012840507552027702, -0.0572204627096653, -0.04283766448497772, -0.06629054248332977, -0.11208245158195496, 0.06908759474754333, -0.0770958662033081, -0.4081128239631653, -0.02504703775048256, -0.07311727106571198, -0.06764262169599533, 0.016679463908076286, 0.035138119012117386, -0.10431592166423798, -0.06831598281860352, -0.03477455675601959, 0.1103304848074913, 0.04103207215666771, -0.039167337119579315, 0.1209145188331604, 0.0028711447957903147, 0.05696908012032509, -0.10254258662462234, -0.0011863429099321365, -0.07193074375391006, -0.046910736709833145, -0.0649680569767952, 0.055600348860025406, 0.046030592173337936, 0.10111179202795029, 0.08059952408075333, -0.004138736054301262, -0.009865867905318737, 0.1633991003036499, -0.11868328601121902, 0.00408447440713644, 0.19877396523952484, -0.0756683424115181, -0.02552609331905842, 0.10918918251991272, 0.019160836935043335, -0.10120046138763428, 0.04804322496056557, 0.014212585054337978, -0.0020393605809658766, -0.23868702352046967, 
-0.1497926115989685, -0.029734058305621147, 0.06872094422578812, 0.04369129240512848, -0.025453105568885803, 0.003908112179487944, -0.02042227052152157, -0.039218779653310776, 0.002880565822124481, 0.032861266285181046, -0.0034766097087413073, 0.1512821763753891, -0.06212151423096657, 0.09978761523962021, -0.04589628428220749, -0.03559393063187599, 0.08779842406511307, 0.007185970898717642, 0.0588081069290638, 0.10029958188533783, 0.024027926847338676, 0.06363226473331451, 0.01864311657845974, 0.02191580832004547, -0.012886620126664639, 0.0036216711159795523, 0.0005472826887853444, -0.02982911467552185, -0.05512315779924393, -0.003822872182354331, 0.06567752361297607, 0.18258143961429596, -0.12709727883338928, -0.09242478013038635, -0.02291318215429783, 0.06639830023050308, 0.14220614731311798, 0.12946830689907074, -0.04626932740211487, -0.07810930907726288, 0.028953734785318375, -0.09558051824569702, -0.03783779218792915, 0.10218335688114166, 0.0973561704158783, -0.15860868990421295, 0.1060851439833641, 0.08365016430616379, 0.049841027706861496, -0.09644591063261032, 0.04857006296515465, -0.13195505738258362, -0.02400634065270424, 0.011255234479904175, 0.0365041084587574, -0.2114773690700531, 0.13798758387565613, 0.032851867377758026, 0.04055393859744072, -0.0977749153971672, 0.013365834951400757, 0.06082294508814812, -0.06837164610624313, 0.16142329573631287, -0.024676108732819557, 0.03178984671831131, 0.030027570202946663, -0.11492425203323364, 0.017459208145737648, 0.02997862733900547, -0.017015021294355392, 0.008834957145154476, 0.02061626687645912, 0.003795535070821643, -0.026432402431964874, 0.05649792402982712, -0.22508646547794342, -0.11104737222194672, 0.02898181602358818, 0.022677429020404816, 0.10696640610694885, -0.023792700842022896, -0.11264079809188843, -0.17768149077892303, 0.1032024472951889, -0.04099194332957268, 0.007080069277435541, -0.07285431027412415, 0.09353028982877731, 0.029787547886371613, -0.016427461057901382, 0.024972181767225266, 0.0743594691157341, 0.13419735431671143, -0.06042052432894707, -0.05563410371541977, 0.06713502109050751, -0.10987504571676254, -0.10957086086273193, -0.014250662177801132, 0.19710688292980194, 0.14439763128757477, 0.04892796650528908, 0.07324783504009247, -0.02224244736135006, 0.03079097718000412, -0.059811852872371674, 0.10285977274179459, 0.09233389049768448, -0.07558104395866394, 0.10008838027715683, -0.0024767485447227955, -0.3489243686199188, -0.1263297200202942, -0.02104048989713192, 0.18486538529396057, 0.09704221785068512, -0.01249010395258665, 0.1478991061449051, 0.2837230861186981, -0.0715947225689888, -0.23291067779064178, 0.023577062413096428, 0.022315219044685364, 0.03372359275817871, 0.02908315323293209, -0.23107872903347015, 0.099183589220047, 0.052122946828603745, 0.0019045999506488442, -0.013273757882416248, -0.1830507218837738, -0.1305830031633377, 0.24320422112941742, -0.01699220761656761, 0.1946982741355896, -0.0000971638728515245, -0.05019271746277809, -0.06183570250868797, 0.042310792952775955, 0.05033165588974953, -0.11796525865793228, 0.08073776960372925, 0.07358625531196594, 0.02359122224152088, 0.020279640331864357, 0.03262905403971672, 0.06312626600265503, 0.09343783557415009, -0.01965833269059658, -0.05446211248636246, 0.06617844104766846, 0.024553073570132256, 0.056332919746637344, 0.0891430601477623, 0.07993925362825394, -0.048903532326221466, -0.011891908943653107, -0.0720096006989479, -0.07432673871517181, 0.09690374881029129, -0.001987047027796507, -0.03295133635401726, -0.013500644825398922, 
0.06505067646503448, -0.01614513248205185, 0.03361291438341141, -0.07540203630924225, -0.09403276443481445, 0.03680512309074402, 0.11177558451890945, 0.23400278389453888, -0.04500803351402283, 0.022292278707027435, -0.028163477778434753, -0.06988056749105453, 0.04378571733832359, 0.043938424438238144, 0.03195234760642052, 0.07309003919363022, 0.013009144924581051, 0.11172239482402802, -0.010568049736320972, -0.14203527569770813, 0.08469204604625702, 0.030978718772530556, -0.07866328209638596, -0.17907986044883728, -0.045713383704423904, 0.032469742000103, 0.0013871145201846957, 0.03769989311695099, 0.21161742508411407, -0.036540307104587555, -0.053550440818071365, -0.0226142518222332, 0.06822936236858368, -0.033622339367866516, 0.09996069222688675, 0.04903361201286316, 0.005780128296464682, -0.1342422515153885, 0.08633700758218765, 0.06982038915157318, -0.06308358907699585, 0.10191686451435089, -0.0024811774492263794, -0.026903172954916954, -0.07405677437782288, -0.21891163289546967, 0.051547348499298096, 0.07600616663694382, -0.09074129164218903, -0.015225045382976532, -0.13187764585018158, 0.043249912559986115, 0.1081419587135315, 0.005251970142126083, -0.017858920618891716, -0.06463548541069031, -0.05149778723716736, -0.003303427016362548, 0.08590036630630493, 0.06239438056945801, -0.0755828246474266, -0.09429643303155899, 0.03696465119719505, 0.03373492881655693, 0.07126733660697937, -0.027063291519880295, -0.10311908274888992, -0.09531161189079285, 0.015351466834545135, -0.19410330057144165, -0.03462827950716019, -0.1055416688323021, 0.0010945872636511922, -0.015303922817111015, -0.045109380036592484, -0.02983793057501316, 0.020062599331140518, -0.05746591463685036, 0.03092263452708721, 0.0374562107026577, 0.09865172952413559, -0.14990191161632538, 0.046448301523923874, 0.02769652009010315, -0.029078060761094093, 0.13048969209194183, 0.08087310940027237, -0.03714980185031891, 0.0575888529419899, -0.18624895811080933, -0.04849368706345558, 0.06815287470817566, 0.07970795780420303, 0.011778444051742554, -0.14688120782375336, 0.0011625575134530663, 0.02114701084792614, 0.002071765251457691, -0.013343547470867634, 0.08865906298160553, -0.018131189048290253, 0.025888757780194283, -0.06890230625867844, -0.025710389018058777, -0.0180179663002491, 0.028165854513645172, 0.06497066468000412, 0.0506930910050869, 0.05833413824439049, -0.09153828769922256, 0.06736819446086884, -0.07283160835504532, 0.03946373239159584, -0.04360827058553696, -0.02134731411933899, 0.047460075467824936, -0.10004474967718124, 0.08313439041376114, -0.006458701100200415, 0.169640451669693, 0.017258087173104286, 0.019095318391919136, 0.024257905781269073, -0.0667613074183464, -0.1333569586277008, 0.057284798473119736, -0.0057749999687075615, 0.06371787190437317, -0.0022182853426784277, -0.06693515926599503, -0.015697240829467773, 0.07957840710878372, 0.17395706474781036, 0.05315738916397095, 0.09264916181564331, 0.08070652186870575, 0.021859776228666306, 0.1400262713432312, -0.09258440881967545, -0.07794424891471863, 0.03757407143712044, -0.12043063342571259, 0.08023016154766083, -0.07105642557144165, -0.003742273896932602, 0.004689322784543037, -0.07689040154218674, 0.045033231377601624, -0.02070559561252594, -0.04434892535209656, -0.09822419285774231, -0.10005378723144531, -0.0510404035449028, -0.07766027748584747, -0.008678007870912552, -0.08996792882680893, 0.004731170367449522, -0.026887455955147743, 0.042123787105083466, -0.10405376553535461, 0.1158251166343689, -0.10025534778833389, -0.1295195072889328, 
0.20386053621768951, -0.03629862144589424, -0.020389260724186897, -0.09667109698057175, 0.018476059660315514, 0.03527265414595604, 0.1292094737291336, 0.03027440793812275, 0.04512001574039459, -0.05621582269668579, 0.024415701627731323, -0.07115717977285385, -0.07661225646734238, 0.019618794322013855, -0.04894925653934479, -0.00355269992724061, 0.14353938400745392, 0.11474526673555374, -0.033334944397211075, 0.019485557451844215, 0.10568948090076447, -0.02997947297990322, -0.01485716924071312, -0.15484194457530975, -0.02164837159216404, -0.018418164923787117, 0.038118019700050354, -0.04386671632528305, -0.09600955992937088, -0.01812884584069252, 0.20019640028476715, 0.16133137047290802, -0.07724276930093765, 0.034480515867471695, -0.003440824570134282, 0.0060243443585932255, -0.0010887396056205034, 0.03441686928272247, 0.13453415036201477, 0.20909081399440765, -0.004474155604839325, 0.022597413510084152, -0.0014058263041079044, -0.11159583181142807, -0.02315777726471424, 0.10272809118032455, -0.0002197756402892992, -0.023613909259438515, -0.007820826023817062, 0.1720149666070938, -0.18563278019428253, -0.2262413501739502, -0.12120731174945831, -0.09582018852233887, -0.13831910490989685, 0.004644130822271109, -0.060085445642471313, 0.17744222283363342, 0.056880317628383636, -0.009699085727334023, -0.010534638538956642, 0.14510680735111237, 0.03526247665286064, -0.008982458151876926, -0.043866027146577835, 0.04599627107381821, -0.12454111874103546, 0.053326815366744995, -0.023020064458251, 0.018637437373399734, 0.02932334691286087, 0.09286917001008987, -0.03934699296951294, 0.023411734029650688, 0.009400985203683376, 0.013111948035657406, 0.03191089257597923, 0.1360214799642563, 0.022216545417904854, 0.15440762042999268, 0.12892775237560272, -0.08955571055412292, 0.004352674353867769, 0.006172561086714268, -0.0177761297672987, -0.038828812539577484, 0.12650059163570404, -0.11703420430421829, 0.08591799437999725, 0.05363254249095917, -0.054406896233558655, -0.040539324283599854, -0.021140925586223602, 0.027352306991815567, -0.019266372546553612, -0.0058393776416778564, -0.011437720619142056, -0.20907700061798096, 0.014363976195454597, -0.11447108536958694, 0.0566641204059124, -0.1859462410211563, -0.010239223949611187, 0.011943423189222813, -0.016594022512435913, -0.027395954355597496, 0.04459802433848381, 0.049073781818151474, -0.0393022857606411, -0.017353128641843796, 0.04243578761816025, 0.03551536798477173, 0.10400308668613434, -0.06739119440317154, -0.041122302412986755 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in es (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official website [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-es")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-es")

# load dataset
ds = load_dataset("common_voice", "es", split="validation[:1%]")

# Common Voice audio is sampled at 48 kHz, but the model expects 16 kHz
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy (argmax) CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
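To put a number on the transcriptions, the greedy predictions can be scored against the reference sentences with word error rate. A minimal sketch, assuming the `evaluate` and `jiwer` packages are installed (neither is part of the original card); it reuses `ds`, `processor`, and `predicted_ids` from the snippet above:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")

predictions = processor.batch_decode(predicted_ids)
references = ds[:5]["sentence"]  # Common Voice stores the transcript in "sentence"

# WER = (substitutions + insertions + deletions) / number of reference words; lower is better
print(wer_metric.compute(predictions=predictions, references=references))
```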
{"language": "es", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-es
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "es", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "es" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in es (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in es (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in es (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #es #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in es (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09323783963918686, 0.09018179774284363, -0.0023670310620218515, -0.015055375173687935, 0.10755133628845215, -0.08020974695682526, 0.1417023241519928, 0.050931934267282486, 0.08627186715602875, 0.06648993492126465, -0.0043400307185947895, 0.024727066978812218, 0.1048627570271492, 0.09259165823459625, 0.02947070635855198, -0.20543575286865234, 0.04910057783126831, -0.07182599604129791, 0.137551948428154, 0.04145433008670807, 0.08746146410703659, -0.0834336131811142, 0.03668307512998581, 0.07390196621417999, -0.042554594576358795, 0.049806829541921616, -0.018886391073465347, -0.13056011497974396, 0.10339123010635376, 0.07905147969722748, -0.037896301597356796, 0.006731367204338312, 0.08091669529676437, -0.13564899563789368, 0.023636993020772934, 0.041210707277059555, 0.007851548492908478, -0.002579695777967572, 0.11719877272844315, -0.08857689052820206, 0.0774926245212555, 0.01511745247989893, -0.006414105650037527, 0.09079362452030182, -0.08347836881875992, -0.163072407245636, -0.041999805718660355, 0.08026258647441864, 0.02125518023967743, 0.08482538908720016, -0.11124809086322784, 0.059408679604530334, -0.024076206609606743, 0.04040093347430229, 0.14238639175891876, -0.2375541627407074, -0.013079116120934486, -0.01934751681983471, 0.07789662480354309, -0.008140449412167072, -0.04280519858002663, 0.07000385969877243, 0.044856391847133636, 0.008479788899421692, -0.0979081243276596, -0.029982617124915123, 0.025427188724279404, -0.12171000242233276, -0.09145556390285492, 0.03838365152478218, 0.2513069212436676, 0.0796302855014801, -0.08306233584880829, -0.10691852122545242, 0.039258915930986404, 0.21966864168643951, -0.05382619798183441, -0.09782407432794571, -0.0033221575431525707, 0.00705947307869792, 0.03138300031423569, -0.049417342990636826, -0.06796421855688095, 0.00616110023111105, -0.04867815598845482, 0.08268957585096359, 0.0046645705588161945, 0.008556749671697617, -0.030534613877534866, -0.017416683956980705, -0.04873508960008621, -0.07318052649497986, 0.030359085649251938, -0.07916882634162903, -0.07156239449977875, 0.004784691613167524, -0.042869362980127335, -0.13526631891727448, 0.050973113626241684, 0.016001423820853233, 0.09218920767307281, 0.02017533965408802, -0.15298086404800415, 0.03761156648397446, 0.11300120502710342, 0.06330602616071701, -0.20131249725818634, -0.08020525425672531, -0.009014895185828209, -0.03896990045905113, 0.011650933884084225, -0.030930016189813614, -0.07388997822999954, 0.013965554535388947, -0.031067542731761932, 0.030985744670033455, 0.010886930860579014, -0.03812038153409958, -0.09518914669752121, -0.10188388079404831, -0.007462513167411089, -0.06316391378641129, 0.0010403371416032314, 0.06661228835582733, -0.046059101819992065, 0.18442372977733612, -0.020177876576781273, 0.09818056970834732, -0.10636856406927109, -0.08026692271232605, -0.01450456865131855, 0.032813750207424164, 0.040220558643341064, -0.06599505990743637, 0.043066300451755524, 0.02639682963490486, 0.015579896979033947, -0.16604046523571014, -0.04343822970986366, -0.09300802648067474, -0.03220619261264801, -0.08637390285730362, 0.002285383176058531, -0.06580589711666107, 0.05700458586215973, -0.03589552640914917, -0.037713680416345596, -0.029538346454501152, -0.05662877485156059, 0.010706719942390919, -0.058884359896183014, 0.10257193446159363, 0.019516147673130035, 0.08892274647951126, -0.021707670763134956, -0.0459558367729187, -0.1056070625782013, 0.14289069175720215, -0.10655679553747177, 0.025257790461182594, -0.1297607123851776, -0.04341761767864227, 
-0.05215394124388695, 0.05352816730737686, 0.06597836315631866, 0.12538567185401917, -0.22150231897830963, -0.10313241928815842, 0.16508953273296356, -0.07593033462762833, -0.04218912869691849, 0.21170973777770996, 0.0254657045006752, 0.09027422964572906, 0.11036932468414307, 0.21629804372787476, -0.02357574552297592, -0.1465386301279068, -0.05985935777425766, -0.034808699041604996, 0.04386112466454506, 0.07755947858095169, 0.04270131513476372, -0.05519290640950203, 0.04943821579217911, 0.0022242655977606773, 0.08026988804340363, -0.0361531563103199, -0.013824080117046833, -0.046999476850032806, 0.04419342428445816, -0.08504476398229599, 0.07583130896091461, -0.008292436599731445, -0.028169147670269012, -0.016036398708820343, -0.056833747774362564, -0.04102588817477226, 0.06939103454351425, -0.050887711346149445, 0.02690519392490387, -0.12818916141986847, 0.1179826632142067, 0.013773735612630844, 0.028565729036927223, -0.14199842512607574, 0.12241215258836746, 0.00606000330299139, -0.027905317023396492, 0.11523262411355972, 0.0879548192024231, -0.02730315737426281, -0.015150775201618671, 0.009494997560977936, 0.023759575560688972, -0.03672006353735924, -0.00020619436691049486, -0.014383096247911453, -0.11733315885066986, 0.02669445611536503, -0.06203530728816986, 0.08738884329795837, -0.09117922186851501, -0.0238481555134058, 0.05407536402344704, 0.10225753486156464, 0.004809598438441753, -0.011073708534240723, 0.06406162679195404, 0.057866256684064865, 0.036425743252038956, 0.006700464989989996, 0.03892049565911293, -0.013975724577903748, -0.024780740961432457, 0.07257486879825592, -0.08563616126775742, 0.06610770523548126, 0.11610735952854156, -0.08888483047485352, -0.015201613306999207, 0.04723503440618515, -0.004093264229595661, -0.010989493690431118, -0.04517300799489021, -0.053478144109249115, 0.2714502513408661, 0.011420021764934063, 0.057414788752794266, -0.07053334265947342, 0.018661294132471085, 0.037860672920942307, -0.05027352645993233, -0.08159547299146652, 0.0758228451013565, 0.028353769332170486, -0.07048516720533371, 0.008611497469246387, 0.09313290566205978, 0.06495444476604462, 0.17445853352546692, 0.005944592412561178, -0.08926896750926971, -0.013310498557984829, -0.08050227165222168, -0.047811321914196014, 0.06118956580758095, -0.2620195746421814, -0.053723447024822235, 0.02433530054986477, -0.012475083582103252, 0.030146511271595955, -0.013811344280838966, 0.022026630118489265, -0.029631435871124268, -0.030890781432390213, -0.027577992528676987, 0.02059902809560299, 0.00618721405044198, 0.08729840070009232, -0.012930134311318398, -0.0563412569463253, -0.04455628618597984, -0.06485272943973541, -0.11185338348150253, 0.0695977658033371, -0.07899933308362961, -0.40830034017562866, -0.02676454558968544, -0.06946311146020889, -0.06455659866333008, 0.017764968797564507, 0.036715343594551086, -0.10510458052158356, -0.07003562152385712, -0.034850478172302246, 0.1114763617515564, 0.041955552995204926, -0.04066970571875572, 0.1219853088259697, 0.00229963893070817, 0.05574836954474449, -0.1036674901843071, -0.001808327971957624, -0.07063165307044983, -0.04580363258719444, -0.06593436747789383, 0.05471020191907883, 0.045636095106601715, 0.10031550377607346, 0.0806063711643219, -0.005611944943666458, -0.01042043138295412, 0.16428101062774658, -0.12013441324234009, 0.005843072663992643, 0.19864988327026367, -0.0750727653503418, -0.02497250586748123, 0.10653546452522278, 0.01903069019317627, -0.10154931247234344, 0.046865064650774, 0.01500120759010315, -0.0016599537339061499, 
-0.2410135716199875, -0.14812147617340088, -0.030054889619350433, 0.06626462936401367, 0.04231534153223038, -0.023726047948002815, 0.004401771351695061, -0.019273415207862854, -0.04047667235136032, 0.0026968675665557384, 0.03200143203139305, -0.004052617587149143, 0.14820711314678192, -0.0614469088613987, 0.09922496229410172, -0.04575269669294357, -0.03399929404258728, 0.08933298289775848, 0.00750192254781723, 0.060812611132860184, 0.10130301862955093, 0.024500248953700066, 0.06392596662044525, 0.01748318038880825, 0.02051609195768833, -0.011804110370576382, 0.0035046522971242666, 0.0003342387790326029, -0.03140673041343689, -0.05433870106935501, -0.0031498016323894262, 0.0665610060095787, 0.17871785163879395, -0.12759916484355927, -0.09150364995002747, -0.022986967116594315, 0.06692096590995789, 0.1417737454175949, 0.12756933271884918, -0.047261953353881836, -0.07860896736383438, 0.03068138100206852, -0.09625072777271271, -0.038952797651290894, 0.10147570073604584, 0.09674171358346939, -0.15797260403633118, 0.10644727200269699, 0.08440998941659927, 0.05016455426812172, -0.09686244279146194, 0.04855646193027496, -0.1299068033695221, -0.02602747641503811, 0.010534973815083504, 0.036959320306777954, -0.21112097799777985, 0.13901810348033905, 0.032262276858091354, 0.04090389609336853, -0.09744799137115479, 0.014697222039103508, 0.06084573268890381, -0.06696683913469315, 0.1625412404537201, -0.02338622324168682, 0.026941031217575073, 0.030107123777270317, -0.11414448171854019, 0.016691237688064575, 0.029271721839904785, -0.02103281021118164, 0.009650095365941525, 0.020819108933210373, 0.004359312355518341, -0.025673242285847664, 0.056940291076898575, -0.22471751272678375, -0.11371102929115295, 0.028933776542544365, 0.021101465448737144, 0.10538656264543533, -0.024794483557343483, -0.11248191446065903, -0.1769159585237503, 0.10391152650117874, -0.04326140135526657, 0.005306055303663015, -0.0739593356847763, 0.08994017541408539, 0.030195314437150955, -0.01713784970343113, 0.02612879127264023, 0.07552048563957214, 0.13560734689235687, -0.060685895383358, -0.05557962507009506, 0.06994389742612839, -0.11160892993211746, -0.11001486331224442, -0.014947273768484592, 0.19477078318595886, 0.14527671039104462, 0.048584140837192535, 0.0722803995013237, -0.02060873620212078, 0.031202606856822968, -0.05921381339430809, 0.10350368916988373, 0.09353988617658615, -0.07608969509601593, 0.10100157558917999, -0.0032933650072664022, -0.34810107946395874, -0.12630948424339294, -0.022798923775553703, 0.181306853890419, 0.09645657241344452, -0.011978390626609325, 0.14801274240016937, 0.28426724672317505, -0.07439284771680832, -0.23126143217086792, 0.022862164303660393, 0.02280246466398239, 0.03298650309443474, 0.03262316808104515, -0.23318630456924438, 0.0992351546883583, 0.05402902513742447, 0.0011793377343565226, -0.012004773132503033, -0.1838875114917755, -0.13016478717327118, 0.24142417311668396, -0.018832510337233543, 0.19283261895179749, -0.0016062305076047778, -0.051065366715192795, -0.06246357783675194, 0.042760416865348816, 0.0498332753777504, -0.11615267395973206, 0.07980293035507202, 0.07427915930747986, 0.023055139929056168, 0.019848132506012917, 0.03357160836458206, 0.06467431783676147, 0.09502139687538147, -0.019010985270142555, -0.054666437208652496, 0.06612993031740189, 0.02485349401831627, 0.05492613837122917, 0.08810008317232132, 0.08021722733974457, -0.05058874189853668, -0.01674746535718441, -0.07217912375926971, -0.0722661018371582, 0.09654577821493149, -0.0024154335260391235, 
-0.03279634565114975, -0.013788914307951927, 0.06479166448116302, -0.016145819798111916, 0.032228127121925354, -0.07007229328155518, -0.09414549916982651, 0.03948649391531944, 0.11166928708553314, 0.2309531569480896, -0.04596275836229324, 0.019443368539214134, -0.027337059378623962, -0.06947323679924011, 0.043457988649606705, 0.04261106625199318, 0.03192205727100372, 0.07274874299764633, 0.010942090302705765, 0.11101284623146057, -0.010615705512464046, -0.14461080729961395, 0.08519724756479263, 0.032569754868745804, -0.0774707943201065, -0.17853571474552155, -0.045968055725097656, 0.032604292035102844, 0.002524656243622303, 0.03662564232945442, 0.21164819598197937, -0.03523292765021324, -0.055025964975357056, -0.023189811035990715, 0.06918492913246155, -0.031246552243828773, 0.1001523956656456, 0.04695369675755501, 0.005921971518546343, -0.13416573405265808, 0.08849120140075684, 0.07016304135322571, -0.06524644792079926, 0.1013113409280777, -0.00019730285566765815, -0.026299193501472473, -0.07254143059253693, -0.21606415510177612, 0.05314294248819351, 0.07287686318159103, -0.09034933894872665, -0.018200207501649857, -0.13333821296691895, 0.04404418542981148, 0.10753510892391205, 0.00555789889767766, -0.018893450498580933, -0.0627657026052475, -0.049116674810647964, -0.0014462494291365147, 0.08525113761425018, 0.06475329399108887, -0.07374012470245361, -0.0962274819612503, 0.03427501767873764, 0.034302037209272385, 0.06942948698997498, -0.02698800340294838, -0.10016857832670212, -0.09481658041477203, 0.014608368277549744, -0.19099514186382294, -0.033636484295129776, -0.10584227740764618, -0.0002753911539912224, -0.014342832379043102, -0.04588520899415016, -0.0303045567125082, 0.020093675702810287, -0.057892777025699615, 0.03117687627673149, 0.03782657906413078, 0.09974535554647446, -0.14953328669071198, 0.04526311904191971, 0.027363702654838562, -0.03078390471637249, 0.1294519454240799, 0.07982004433870316, -0.036219339817762375, 0.05758356302976608, -0.1881193220615387, -0.04724854230880737, 0.06861237436532974, 0.08015424758195877, 0.012475521303713322, -0.14505994319915771, 0.0028097329195588827, 0.020206063985824585, 0.002463806653395295, -0.014180352911353111, 0.08594869822263718, -0.019660912454128265, 0.026406168937683105, -0.0675738975405693, -0.026825714856386185, -0.016968460753560066, 0.028262827545404434, 0.06832846999168396, 0.048164643347263336, 0.05898863822221756, -0.09092439711093903, 0.06649477779865265, -0.07389350235462189, 0.0396585650742054, -0.042738933116197586, -0.0208632480353117, 0.047999121248722076, -0.09939365088939667, 0.08300291001796722, -0.005987498909235001, 0.1724000722169876, 0.018707290291786194, 0.021666409447789192, 0.02597232721745968, -0.06759262830018997, -0.13081558048725128, 0.05712537467479706, -0.009064576588571072, 0.06321686506271362, -0.0025581184308975935, -0.06869799643754959, -0.01606864482164383, 0.0797593966126442, 0.1693180650472641, 0.055527493357658386, 0.0922127366065979, 0.079926036298275, 0.02322336658835411, 0.13897931575775146, -0.0943681076169014, -0.0795963853597641, 0.03494740277528763, -0.12006966024637222, 0.07834500819444656, -0.06982183456420898, -0.0037739707622677088, 0.007058426272124052, -0.0780240073800087, 0.04636622965335846, -0.021576011553406715, -0.04325921833515167, -0.09883435070514679, -0.0994224101305008, -0.050777334719896317, -0.07873167097568512, -0.009685364551842213, -0.09114520996809006, 0.005223830696195364, -0.02418516017496586, 0.04182421788573265, -0.10285519808530807, 0.11476822197437286, 
-0.09924253821372986, -0.12979617714881897, 0.20239190757274628, -0.03507786989212036, -0.019547581672668457, -0.09809954464435577, 0.01761683262884617, 0.034556932747364044, 0.1302812695503235, 0.028651094064116478, 0.04531988501548767, -0.0580248199403286, 0.024501919746398926, -0.0694262832403183, -0.07641299813985825, 0.01963984966278076, -0.04916497692465782, -0.005840042140334845, 0.1435004621744156, 0.11258535832166672, -0.033333636820316315, 0.019592881202697754, 0.1087818518280983, -0.03152521699666977, -0.011968035250902176, -0.15291093289852142, -0.02424605004489422, -0.01568295992910862, 0.03957558050751686, -0.04316021129488945, -0.09709352254867554, -0.0184780340641737, 0.19809246063232422, 0.1602030098438263, -0.07536313682794571, 0.03314364701509476, -0.004180139396339655, 0.00699717178940773, -0.001863517565652728, 0.034382276237010956, 0.13358640670776367, 0.21427375078201294, -0.0047489069402217865, 0.023483769968152046, -0.0010616317158564925, -0.10857748985290527, -0.024210957810282707, 0.10355674475431442, 0.0004598128725774586, -0.022471072152256966, -0.007944692857563496, 0.17166957259178162, -0.18394596874713898, -0.22913533449172974, -0.11990401893854141, -0.09682896733283997, -0.13820162415504456, 0.004151377361267805, -0.059309642761945724, 0.17655906081199646, 0.05790350213646889, -0.00767130684107542, -0.013591361232101917, 0.1479690670967102, 0.03724369406700134, -0.010931425727903843, -0.04433730989694595, 0.04480130225419998, -0.12746505439281464, 0.05738898366689682, -0.023163069039583206, 0.016822461038827896, 0.02891865000128746, 0.09145612269639969, -0.04041315242648125, 0.022035302594304085, 0.009979582391679287, 0.013835519552230835, 0.032980289310216904, 0.1335090547800064, 0.02205415442585945, 0.15445122122764587, 0.12932845950126648, -0.0899709165096283, 0.005196527577936649, 0.007802579086273909, -0.01985071413218975, -0.03691256046295166, 0.1256818175315857, -0.11783342808485031, 0.08570920675992966, 0.05269763991236687, -0.05424240976572037, -0.041302528232336044, -0.020018689334392548, 0.027862045913934708, -0.018219267949461937, -0.003512722672894597, -0.01013445109128952, -0.20992988348007202, 0.013733661733567715, -0.11115747690200806, 0.05854673311114311, -0.18563011288642883, -0.009451187215745449, 0.010351794771850109, -0.016878454014658928, -0.029319513589143753, 0.044332876801490784, 0.04954845830798149, -0.0397743321955204, -0.018075574189424515, 0.04184674099087715, 0.03532014787197113, 0.10539666563272476, -0.06702850013971329, -0.040393177419900894 ]
null
null
null
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in Estonian (`et`); refer to Table 1 of the paper for more information.

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official [VoxPopuli repository](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how to run inference with the model on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-et")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-et")

# load dataset
ds = load_dataset("common_voice", "et", split="validation[:1%]")

# Common Voice audio is 48 kHz, while the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# inference
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
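A possible follow-up to the snippet above is scoring the decoded text against the Common Voice references with word error rate. This is a minimal sketch, not an official evaluation recipe: it assumes the third-party `jiwer` package is installed, that the loaded split exposes a `sentence` column, and that simple lower-casing is an acceptable stand-in for proper text normalization.

```python
# Minimal WER sketch (assumes `pip install jiwer` and the variables
# `ds`, `processor`, and `predicted_ids` from the snippet above)
from jiwer import wer

references = [s.lower() for s in ds[:5]["sentence"]]                     # ground-truth transcripts
hypotheses = [s.lower() for s in processor.batch_decode(predicted_ids)]  # model predictions

print(f"WER on 5 samples: {wer(references, hypotheses):.2%}")
```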
{"language": "et", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-et
[ "audio", "automatic-speech-recognition", "voxpopuli", "et", "arxiv:2101.00390", "license:cc-by-nc-4.0", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "et" ]
TAGS #audio #automatic-speech-recognition #voxpopuli #et #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in et (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in et (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#audio #automatic-speech-recognition #voxpopuli #et #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in et (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 45, 162, 30 ]
[ "passage: TAGS\n#audio #automatic-speech-recognition #voxpopuli #et #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in et (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.0738009437918663, -0.00885866954922676, -0.0002188233338529244, -0.011409154161810875, 0.07528825104236603, -0.03403417766094208, 0.16217175126075745, 0.038505591452121735, 0.1667454093694687, 0.07180925458669662, -0.00003261511665186845, 0.037124428898096085, 0.06008007749915123, 0.024375272914767265, 0.004984518513083458, -0.20385724306106567, 0.04206943139433861, -0.07126416265964508, 0.20556381344795227, 0.028565693646669388, 0.07509557157754898, -0.06203463301062584, 0.004986913874745369, 0.044866420328617096, -0.02405676245689392, 0.009045291692018509, -0.0017474303022027016, -0.10530222207307816, 0.11452542990446091, 0.07147204130887985, -0.05136843025684357, -0.004327807109802961, 0.08496897667646408, -0.1427171230316162, 0.008838498964905739, -0.0002707386447582394, 0.010840247385203838, -0.01492321863770485, 0.06889548897743225, -0.06141563132405281, 0.10170667618513107, 0.13880757987499237, 0.017470309510827065, 0.060552503913640976, -0.10729008167982101, -0.07225582003593445, -0.0021653305739164352, 0.016964424401521683, 0.0593564547598362, 0.06132978945970535, -0.11135855317115784, 0.06919626891613007, -0.0566922165453434, 0.050233010202646255, 0.020576409995555878, -0.23704499006271362, -0.004713496193289757, 0.08570211380720139, 0.10769243538379669, 0.017534693703055382, -0.019939539954066277, 0.056575071066617966, 0.0669635608792305, 0.019627690315246582, -0.13598789274692535, -0.047258295118808746, -0.05678965151309967, -0.11526656150817871, -0.11278833448886871, 0.04373878985643387, 0.3291718065738678, 0.05127323791384697, -0.08657683432102203, -0.09151952713727951, 0.05459831282496452, 0.21119999885559082, -0.04706466570496559, -0.08981150388717651, 0.0051202187314629555, 0.030452346429228783, 0.05952481925487518, -0.0038973044138401747, -0.0409378819167614, -0.05985838547348976, -0.02115180715918541, 0.11911962926387787, 0.003402339294552803, 0.011503953486680984, -0.07135210186243057, -0.015609217807650566, -0.21546921133995056, -0.0532926507294178, 0.01736573874950409, -0.06369026005268097, -0.05667596310377121, -0.014474299736320972, -0.055122703313827515, -0.12144490331411362, 0.0586695596575737, 0.09365187585353851, 0.1357107162475586, 0.050993021577596664, -0.13557565212249756, 0.05358002707362175, 0.12795212864875793, -0.022403748705983162, -0.15410232543945312, -0.04669377580285072, -0.0157921239733696, -0.02365182526409626, 0.05109778419137001, -0.04554072394967079, -0.10536262392997742, 0.04971432685852051, -0.05739372968673706, 0.034718677401542664, -0.011819968931376934, -0.011302650906145573, -0.10821318626403809, -0.08623041212558746, -0.06024457514286041, -0.024171968922019005, -0.039365384727716446, 0.060673657804727554, -0.059631768614053726, 0.10049259662628174, -0.023592079058289528, 0.10791841149330139, -0.07761266082525253, -0.0764319971203804, -0.04144129157066345, 0.02939784899353981, 0.0649467259645462, -0.08115166425704956, 0.026642486453056335, -0.0038142255507409573, 0.03151193633675575, -0.16280484199523926, -0.061010655015707016, -0.1297701746225357, -0.008133296854794025, -0.10618694126605988, -0.02474990300834179, -0.11092931032180786, 0.048698216676712036, -0.004475786816328764, -0.04973429813981056, -0.014133339747786522, -0.07180415093898773, -0.003080032067373395, -0.01597020961344242, 0.11826568096876144, -0.017409099265933037, 0.07244507223367691, -0.04873450845479965, -0.03011433035135269, -0.08240310847759247, 0.14220191538333893, -0.09405872970819473, 0.010939976200461388, -0.1324787735939026, 0.00021679105702787638, 
-0.06968683749437332, 0.07187805324792862, 0.06232810392975807, 0.1381630152463913, -0.22615154087543488, -0.08213416486978531, 0.12386509776115417, -0.08569943904876709, -0.0377945601940155, 0.1942431926727295, 0.024450048804283142, 0.14935871958732605, 0.09626810997724533, 0.3421175479888916, -0.07072371989488602, -0.1207582950592041, 0.01116027683019638, -0.005736252758651972, 0.08112199604511261, 0.10724766552448273, 0.05632045120000839, -0.06292442977428436, -0.019133400171995163, -0.004936612211167812, 0.06025392562150955, -0.043986208736896515, -0.01441564504057169, -0.04556331783533096, 0.02405090071260929, -0.0870852917432785, 0.040379367768764496, -0.009640869684517384, -0.025520486757159233, -0.015562349930405617, -0.04069080203771591, -0.05400414392352104, 0.09757822006940842, -0.08411872386932373, 0.02652098424732685, -0.14787374436855316, 0.1585668921470642, -0.010127771645784378, 0.022766824811697006, -0.15638123452663422, 0.1877250224351883, 0.012311385944485664, 0.03593713790178299, 0.10531670600175858, 0.07697214186191559, -0.041435569524765015, -0.00900033488869667, 0.016380675137043, 0.04180328547954559, 0.04228181391954422, 0.006461726501584053, -0.03618525713682175, -0.1827196329832077, 0.017782550305128098, -0.07998472452163696, 0.06724861264228821, -0.12008097022771835, -0.030743762850761414, -0.016054635867476463, 0.04146786034107208, -0.01706828735768795, -0.021647006273269653, 0.08114080876111984, 0.08608002215623856, 0.011478519067168236, 0.00717553123831749, 0.07194265723228455, 0.0025168631691485643, -0.06196435168385506, 0.0967484563589096, -0.15556617081165314, -0.0887521281838417, 0.08142101764678955, -0.12377507984638214, -0.02903016470372677, 0.02154938504099846, 0.029598280787467957, -0.015545347705483437, -0.04553141072392464, -0.03719787299633026, 0.2594352066516876, -0.03648275136947632, 0.04626484215259552, -0.07311926782131195, 0.037160348147153854, 0.04020316153764725, -0.0854148268699646, -0.035070087760686874, 0.10790438205003738, 0.03328603506088257, -0.03844088315963745, -0.008898495696485043, 0.03352070972323418, 0.015306729823350906, 0.1709764450788498, -0.005586984101682901, -0.059330929070711136, -0.029079901054501534, -0.079068623483181, -0.04906906932592392, 0.07788010686635971, -0.3180710971355438, -0.07744815945625305, 0.02457970567047596, 0.018419312313199043, 0.032626356929540634, -0.031093591824173927, 0.020823998376727104, -0.02946966141462326, -0.013694094493985176, -0.0734572783112526, 0.017092464491724968, -0.08104655891656876, 0.08127010613679886, -0.06307559460401535, -0.08578094840049744, -0.020479129627346992, -0.07020703703165054, -0.16001135110855103, 0.09279907494783401, -0.04039419814944267, -0.2367936372756958, 0.015115953050553799, -0.09995357692241669, -0.024435725063085556, 0.0346851572394371, 0.038020119071006775, -0.06505991518497467, -0.061932217329740524, -0.05542649328708649, 0.10255848616361618, 0.023756790906190872, -0.04673425108194351, 0.05123459920287132, -0.022199925035238266, 0.025635486468672752, -0.09568104892969131, -0.006095037795603275, -0.08690661191940308, -0.03605753183364868, -0.04372745752334595, -0.006474905647337437, 0.02226494438946247, 0.20146042108535767, 0.11073706299066544, -0.027376610785722733, -0.01578805036842823, 0.18373000621795654, -0.09238076955080032, 0.0021140233147889376, 0.19542934000492096, -0.0846073180437088, -0.04337738826870918, 0.17222103476524353, 0.024701902642846107, -0.11535049229860306, 0.04424107447266579, -0.005441491957753897, -0.044738978147506714, 
-0.216498002409935, -0.15677370131015778, -0.03806678205728531, 0.09759224206209183, 0.05838719382882118, -0.04301050305366516, 0.02303922548890114, 0.00885926652699709, -0.019712166860699654, 0.02261924184858799, 0.03420858085155487, 0.01793917454779148, 0.14459200203418732, -0.0466613695025444, 0.12787938117980957, -0.02301391214132309, -0.0081460727378726, 0.04509102180600166, 0.0035790579859167337, 0.04569295793771744, 0.13078585267066956, 0.03463244065642357, 0.08261246979236603, -0.012864548712968826, 0.034071166068315506, -0.02062159590423107, -0.03261448070406914, -0.0002886889560613781, -0.05594569817185402, -0.07372108101844788, -0.015697389841079712, 0.07891878485679626, 0.1397544890642166, -0.12167857587337494, -0.09384282678365707, -0.007248121779412031, 0.04842739924788475, 0.07182040065526962, 0.1890825778245926, -0.057672545313835144, -0.07093190401792526, 0.04730064049363136, -0.08112238347530365, -0.003534671850502491, 0.14410430192947388, 0.14952611923217773, -0.10335182398557663, 0.0882021114230156, 0.09885066002607346, 0.05032363533973694, -0.08169635385274887, 0.05209752544760704, -0.16120603680610657, -0.023522423580288887, -0.001877048285678029, 0.03606756404042244, -0.24602195620536804, 0.14229974150657654, 0.029892221093177795, 0.04804791510105133, -0.10826940089464188, -0.01948496140539646, 0.057766664773225784, -0.07040965557098389, 0.1797427088022232, -0.022808043286204338, -0.0601654015481472, 0.001389197539538145, -0.104624442756176, 0.04679729789495468, 0.029566913843154907, -0.013428517617285252, 0.01923847757279873, 0.02589106373488903, 0.044614940881729126, -0.01043392438441515, 0.04696323722600937, -0.23582157492637634, -0.08676842600107193, -0.009036813862621784, 0.06607522815465927, 0.08497358113527298, 0.014035678468644619, -0.07396999001502991, -0.12825952470302582, -0.04872375354170799, -0.05992399901151657, 0.011004404164850712, -0.031104648485779762, 0.08669799566268921, 0.03412512317299843, 0.0027034529484808445, 0.016619384288787842, 0.07314622402191162, 0.07997560501098633, -0.062424082309007645, -0.02298097498714924, 0.1058439090847969, -0.13322240114212036, -0.09546799957752228, -0.03114042431116104, 0.17111019790172577, 0.138824000954628, 0.08359082788228989, 0.08124683797359467, 0.011711816303431988, 0.05183209478855133, -0.05169861018657684, 0.11816653609275818, 0.03614693880081177, -0.08786311000585556, 0.09493110328912735, -0.03281930461525917, -0.2624569237232208, -0.12297143042087555, -0.0557788722217083, 0.12742163240909576, 0.08477459847927094, 0.017127441242337227, 0.14848262071609497, 0.29963043332099915, -0.11063221842050552, -0.22251062095165253, 0.006723110564053059, 0.022223150357604027, 0.04646701738238335, 0.03801644220948219, -0.2257908135652542, 0.10730410367250443, 0.045639362186193466, 0.006893748417496681, 0.010299350135028362, -0.23118126392364502, -0.13352905213832855, 0.24527785181999207, 0.007359078619629145, 0.18294383585453033, -0.035601165145635605, -0.06321222335100174, -0.06641925871372223, 0.06957965344190598, 0.04868840426206589, -0.11533796042203903, 0.096352219581604, 0.08318451046943665, 0.0026534032076597214, 0.024790480732917786, 0.04924037307500839, 0.11403793841600418, 0.13545793294906616, -0.015464544296264648, -0.05014453083276749, 0.006281231064349413, -0.024517608806490898, 0.0634102001786232, 0.06496202945709229, 0.04671448841691017, -0.031143968924880028, 0.000012884001080237795, -0.07615694403648376, -0.04697129502892494, 0.05631692335009575, 0.0006833629449829459, -0.04406534135341644, 
-0.0010977465426549315, 0.04905925691127777, -0.026683708652853966, 0.041153907775878906, -0.04270559176802635, -0.1381218284368515, 0.07091210037469864, 0.09607698023319244, 0.2577323019504547, -0.04118860140442848, 0.0024283791426569223, 0.0015623180661350489, -0.07088571786880493, 0.0750093013048172, -0.04921390488743782, 0.01013533491641283, 0.06915079057216644, 0.05894676595926285, 0.1342637538909912, 0.021068982779979706, -0.10729817301034927, 0.09076274931430817, 0.024941643700003624, -0.059284061193466187, -0.1339734047651291, -0.009663003496825695, 0.0527476966381073, 0.006012953817844391, 0.018226437270641327, 0.1856239289045334, -0.04724421724677086, -0.025932811200618744, -0.01639619655907154, 0.0645962506532669, -0.06086752191185951, 0.16081194579601288, 0.06867530941963196, 0.025395896285772324, -0.12475886195898056, 0.09032982587814331, 0.03893747553229332, -0.0429588221013546, 0.1159059926867485, -0.05816085264086723, -0.03815088048577309, -0.07268235832452774, -0.2666708827018738, 0.06789954751729965, 0.054647646844387054, -0.11480671167373657, 0.008942494168877602, -0.12154749035835266, 0.00951383262872696, 0.08121161162853241, 0.0209275521337986, -0.02561016194522381, -0.10147525370121002, -0.06582800298929214, -0.025888893753290176, 0.0751100406050682, 0.05914481729269028, -0.05320090800523758, -0.08850378543138504, 0.04546724259853363, 0.04956573620438576, 0.038861360400915146, -0.04544693976640701, -0.12506969273090363, -0.15967373549938202, 0.04281221702694893, -0.16736340522766113, 0.027521640062332153, -0.1323993057012558, -0.00008573015657020733, 0.016014866530895233, -0.003546329913660884, -0.05757912993431091, 0.00024819321697577834, -0.07379966974258423, 0.05554209649562836, 0.0559675395488739, 0.10966505110263824, -0.06871523708105087, 0.042538296431303024, 0.017116336151957512, -0.02936423569917679, 0.09692265093326569, 0.08653159439563751, -0.04507400095462799, 0.07534587383270264, -0.2108496129512787, -0.00002060739825537894, 0.05106806382536888, 0.06902879476547241, -0.029700759798288345, -0.15444499254226685, -0.04201298952102661, 0.009549563750624657, 0.00994921661913395, -0.014415450394153595, 0.07323221862316132, -0.02335680089890957, 0.04688810557126999, -0.06528110057115555, -0.04934610426425934, -0.029424630105495453, 0.022168153896927834, 0.02143307402729988, 0.048524677753448486, 0.06222876161336899, -0.0988222286105156, 0.05528692156076431, -0.060683585703372955, 0.05942145362496376, -0.04400666058063507, 0.011053362861275673, -0.02601729892194271, -0.10135050863027573, 0.059981364756822586, -0.03527071326971054, 0.16599498689174652, 0.03526124730706215, -0.001836731331422925, -0.011994695290923119, 0.005210966803133488, -0.053215645253658295, 0.04276988282799721, 0.08704435080289841, 0.07985690236091614, 0.003442002460360527, -0.07674606144428253, 0.041577208787202835, 0.06814330816268921, 0.2484390139579773, -0.0035929614678025246, 0.046609144657850266, 0.06580868363380432, 0.03862394765019417, 0.11138615012168884, -0.10596463084220886, 0.02269785851240158, 0.10385065525770187, -0.041636232286691666, 0.0948265939950943, -0.037279702723026276, 0.014737552963197231, -0.011213749647140503, -0.06547416746616364, 0.040217094123363495, -0.0346580371260643, -0.06899169832468033, -0.14233219623565674, -0.1915472149848938, -0.029029475525021553, -0.10726054012775421, -0.02741972915828228, -0.10476253181695938, -0.025240113958716393, 0.08111788332462311, 0.037826817482709885, -0.0759420245885849, 0.06527116894721985, -0.18342891335487366, 
-0.12258202582597733, 0.16639797389507294, -0.050374679267406464, -0.021997947245836258, -0.1476433426141739, -0.0031868850346654654, 0.06869969516992569, 0.06790347397327423, 0.0008744002552703023, 0.019457200542092323, -0.015910955145955086, 0.022441839799284935, -0.033461276441812515, -0.05852622166275978, 0.006345361936837435, -0.037365272641181946, 0.01776277832686901, 0.1497153639793396, 0.09791626781225204, -0.05969976633787155, 0.03326958417892456, 0.12425354868173599, -0.010926873423159122, -0.03560188412666321, -0.1750776320695877, 0.024598725140094757, -0.08123691380023956, 0.0366741381585598, -0.06023775413632393, -0.10131016373634338, 0.010557536967098713, 0.2139083296060562, 0.1609581559896469, -0.06718815118074417, 0.012519562616944313, 0.01491751428693533, 0.000543437257874757, 0.007037339732050896, 0.0011848987778648734, 0.0632481500506401, 0.18075796961784363, -0.01595364324748516, 0.037772417068481445, -0.009108017198741436, -0.0973338782787323, 0.012427563779056072, 0.06866337358951569, 0.03349033370614052, -0.054532330483198166, -0.01783391460776329, 0.1672486513853073, -0.2168883979320526, -0.13795605301856995, -0.10094650834798813, -0.051114682108163834, -0.10940086841583252, -0.028232278302311897, -0.06726348400115967, 0.2012963443994522, 0.07975222170352936, -0.05532459169626236, -0.010808631777763367, 0.0575786791741848, 0.025391260161995888, -0.04515383765101433, -0.08622760325670242, 0.03803110867738724, -0.016330525279045105, 0.04604637622833252, -0.039000511169433594, 0.11984706670045853, 0.03326214849948883, 0.09862241894006729, 0.005093172192573547, 0.04884788393974304, 0.01775229349732399, 0.027162184938788414, 0.052952080965042114, 0.11680617928504944, 0.019744137302041054, 0.1997029334306717, 0.12497754395008087, -0.14333419501781464, 0.01489983033388853, 0.07368697226047516, -0.051968593150377274, -0.05602269992232323, 0.09118442982435226, -0.12411284446716309, 0.11385124921798706, 0.03628358617424965, -0.06290317326784134, -0.08595234900712967, -0.00015216923202387989, 0.05353134870529175, -0.0140604879707098, -0.03399743139743805, 0.0057495348155498505, -0.23894435167312622, 0.03649109601974487, -0.1089448556303978, 0.005830301903188229, -0.19274093210697174, 0.01634252443909645, 0.007402075920253992, -0.018622945994138718, -0.0014273462584242225, 0.013609804213047028, 0.11573079228401184, -0.013972059823572636, -0.046996887773275375, -0.035206686705350876, 0.06285201013088226, 0.12170122563838959, -0.026836900040507317, -0.08215313404798508 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in Finnish (`fi`); refer to Table 1 of the paper for more information.

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official [VoxPopuli repository](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how to run inference with the model on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fi")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fi")

# load dataset
ds = load_dataset("common_voice", "fi", split="validation[:1%]")

# Common Voice audio is 48 kHz, while the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# inference
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
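If the installed `datasets` version exposes Common Voice with an `audio` column and supports the `Audio` feature, the manual torchaudio resampling above can instead be delegated to the library. A minimal sketch under that assumption:

```python
# Sketch: let `datasets` resample to 16 kHz on the fly instead of torchaudio
# (assumes a `datasets` version with the Audio feature and an `audio` column)
import torch
from datasets import load_dataset, Audio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fi")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fi")

ds = load_dataset("common_voice", "fi", split="validation[:1%]")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))  # decode + resample lazily

inputs = processor(
    [sample["array"] for sample in ds[:5]["audio"]],
    sampling_rate=16_000,
    return_tensors="pt",
    padding=True,
)
with torch.no_grad():
    logits = model(**inputs).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1)))
```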
{"language": "fi", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-fi
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "fi", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "fi" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fi #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fi (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fi (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fi #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fi (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fi #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fi (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09506887197494507, 0.09257300198078156, -0.0023753917776048183, -0.013553828001022339, 0.10607288777828217, -0.08222272247076035, 0.1415407359600067, 0.05075497925281525, 0.08564722537994385, 0.06841672956943512, -0.005269103683531284, 0.021496610715985298, 0.10727255046367645, 0.09384819120168686, 0.0263367872685194, -0.20320557057857513, 0.0487951897084713, -0.07225048542022705, 0.13507366180419922, 0.041051533073186874, 0.08734805136919022, -0.08201958984136581, 0.03688494488596916, 0.0744212344288826, -0.05060054734349251, 0.04785515367984772, -0.017817681655287743, -0.12868347764015198, 0.10368014127016068, 0.08259448409080505, -0.03815542161464691, 0.004986762534826994, 0.07985436916351318, -0.13410653173923492, 0.024105293676257133, 0.041197486221790314, 0.009093021973967552, -0.0022515705786645412, 0.11866907775402069, -0.08652680367231369, 0.07977215945720673, 0.021970069035887718, -0.006740243639796972, 0.09251046180725098, -0.08293482661247253, -0.15328943729400635, -0.044038351625204086, 0.08370725065469742, 0.017762893810868263, 0.08484246581792831, -0.1103711798787117, 0.05791105329990387, -0.024892063811421394, 0.04293625056743622, 0.14226746559143066, -0.2371687889099121, -0.014791670255362988, -0.016697127372026443, 0.07693552225828171, -0.014837888069450855, -0.043881673365831375, 0.06841175258159637, 0.04484221339225769, 0.008786901831626892, -0.09761882573366165, -0.03022449091076851, 0.020803630352020264, -0.12322159111499786, -0.0946839451789856, 0.04146896302700043, 0.24591204524040222, 0.0796719565987587, -0.08475617319345474, -0.105802983045578, 0.039877407252788544, 0.21851246058940887, -0.057254608720541, -0.09378794580698013, -0.005592613015323877, 0.006823284551501274, 0.0365639291703701, -0.051132939755916595, -0.06664355099201202, 0.004246369935572147, -0.051159460097551346, 0.08653950691223145, 0.005061962176114321, 0.010334997437894344, -0.03203608840703964, -0.01809103786945343, -0.05890250205993652, -0.07136782258749008, 0.03022720478475094, -0.07958174496889114, -0.07303384691476822, 0.004728295840322971, -0.04650784656405449, -0.133575901389122, 0.05335794389247894, 0.014263473451137543, 0.09176652878522873, 0.018169861286878586, -0.1554517149925232, 0.03787225857377052, 0.1088942289352417, 0.05746086314320564, -0.2039458006620407, -0.07570037990808487, -0.006774469278752804, -0.03739062324166298, 0.009133675135672092, -0.03149479627609253, -0.0715450644493103, 0.01466735452413559, -0.035149168223142624, 0.030145641416311264, 0.010621972382068634, -0.03551880642771721, -0.09335333853960037, -0.10225756466388702, -0.010738522745668888, -0.06401526927947998, 0.0002001696266233921, 0.06986623257398605, -0.047459617257118225, 0.1807941049337387, -0.02095557190477848, 0.09828098863363266, -0.10661374777555466, -0.0815766379237175, -0.013705343939363956, 0.03367877006530762, 0.04167182371020317, -0.06729083508253098, 0.043206192553043365, 0.022183788940310478, 0.015021264553070068, -0.16631868481636047, -0.04290343075990677, -0.09190333634614944, -0.03215343877673149, -0.08703567832708359, 0.0008095739176496863, -0.0637386292219162, 0.052605144679546356, -0.037023235112428665, -0.035369403660297394, -0.026962395757436752, -0.0561186745762825, 0.012745109386742115, -0.05772839114069939, 0.10342366993427277, 0.014078997075557709, 0.08853898197412491, -0.018994461745023727, -0.045742057263851166, -0.10042987018823624, 0.141480952501297, -0.10735415667295456, 0.024564441293478012, -0.1312253475189209, -0.041617922484874725, -0.051841557025909424, 
0.05186617001891136, 0.0672612413764, 0.126471608877182, -0.2240535169839859, -0.10269765555858612, 0.16718968749046326, -0.07661571353673935, -0.04164350777864456, 0.21197669208049774, 0.026286223903298378, 0.08963153511285782, 0.10953904688358307, 0.21779699623584747, -0.017084363847970963, -0.15227842330932617, -0.058569908142089844, -0.03496994823217392, 0.04281305894255638, 0.07908952981233597, 0.04184043034911156, -0.05195408686995506, 0.04663471877574921, 0.0028397964779287577, 0.07603632658720016, -0.03381401300430298, -0.014289919286966324, -0.049330681562423706, 0.041772522032260895, -0.08883573859930038, 0.07600072771310806, -0.007606233935803175, -0.027936497703194618, -0.013121395371854305, -0.0512503907084465, -0.044388726353645325, 0.07210448384284973, -0.05018429830670357, 0.026498204097151756, -0.1284221112728119, 0.12267529219388962, 0.014145438559353352, 0.02925925888121128, -0.1438760906457901, 0.12471944838762283, 0.005109683610498905, -0.02652478776872158, 0.11394210904836655, 0.0891202837228775, -0.026396367698907852, -0.013472839258611202, 0.010576846078038216, 0.026720477268099785, -0.03505834564566612, -0.0004828172968700528, -0.013878065161406994, -0.11802314966917038, 0.0246136337518692, -0.062150947749614716, 0.0927593857049942, -0.09738008677959442, -0.021540137007832527, 0.05554269626736641, 0.10183466225862503, 0.005223060958087444, -0.011413081549108028, 0.0660969465970993, 0.057083066552877426, 0.03748093917965889, 0.006852696184068918, 0.03986319899559021, -0.012644771486520767, -0.022688934579491615, 0.0714305117726326, -0.08242485672235489, 0.06985951215028763, 0.1164892166852951, -0.08340416848659515, -0.01714889518916607, 0.046496398746967316, -0.004452637862414122, -0.010557446628808975, -0.0463034026324749, -0.051580917090177536, 0.26686033606529236, 0.011856108903884888, 0.05859100818634033, -0.06954169273376465, 0.019346699118614197, 0.03998107463121414, -0.04973863065242767, -0.0793757438659668, 0.07804742455482483, 0.026383042335510254, -0.06908250600099564, 0.006569126155227423, 0.08838333934545517, 0.0626450851559639, 0.17279773950576782, 0.0054784794338047504, -0.0905458927154541, -0.013591400347650051, -0.07739821076393127, -0.045964840799570084, 0.06112019345164299, -0.25925666093826294, -0.0541452057659626, 0.02423902042210102, -0.015262681059539318, 0.02819659747183323, -0.012835125438869, 0.021370448172092438, -0.030582496896386147, -0.032844919711351395, -0.025226177647709846, 0.02112131007015705, 0.007430939935147762, 0.0897010937333107, -0.0112863564863801, -0.06008068472146988, -0.045968279242515564, -0.06537281721830368, -0.1124364510178566, 0.06920989602804184, -0.0758858174085617, -0.4060250222682953, -0.02475668489933014, -0.07624440640211105, -0.06865537166595459, 0.016816262155771255, 0.033414680510759354, -0.1022593080997467, -0.06891173124313354, -0.03742719069123268, 0.11159948259592056, 0.04237596318125725, -0.038216665387153625, 0.11884128302335739, 0.0023537324741482735, 0.05748344957828522, -0.10202597826719284, -0.0005934067885391414, -0.06977344304323196, -0.04367546737194061, -0.06571011245250702, 0.0567815862596035, 0.04814508557319641, 0.09821753948926926, 0.08080360293388367, -0.003309179563075304, -0.008854582905769348, 0.17141707241535187, -0.11982723325490952, 0.006610915530472994, 0.19898366928100586, -0.07749736309051514, -0.025850236415863037, 0.10955186188220978, 0.02133355848491192, -0.10180139541625977, 0.04575810581445694, 0.016823887825012207, 0.0002960173587780446, -0.234529510140419, 
-0.14852407574653625, -0.031058410182595253, 0.06829912215471268, 0.04340258613228798, -0.022867461666464806, 0.005527873523533344, -0.020778153091669083, -0.03824568912386894, 0.0015739973168820143, 0.03299030661582947, -0.004485874902456999, 0.14636573195457458, -0.061115942895412445, 0.0996207594871521, -0.046025875955820084, -0.034468427300453186, 0.09006768465042114, 0.007912068627774715, 0.06364744156599045, 0.09879454225301743, 0.020287716761231422, 0.06428972631692886, 0.01995816081762314, 0.02246657758951187, -0.012652288191020489, 0.0009876678232103586, -0.0010844046482816339, -0.029342642053961754, -0.056046683341264725, -0.0007197201484814286, 0.06434336304664612, 0.1775529831647873, -0.12601515650749207, -0.0926845520734787, -0.027614230290055275, 0.06525254249572754, 0.14370600879192352, 0.1294637769460678, -0.04578637704253197, -0.07895568758249283, 0.02750670723617077, -0.09820541739463806, -0.039493657648563385, 0.10128116607666016, 0.09850553423166275, -0.15875153243541718, 0.10459373891353607, 0.08247797936201096, 0.051690004765987396, -0.0955880805850029, 0.04736091569066048, -0.1295342594385147, -0.024900009855628014, 0.009636753238737583, 0.03451939672231674, -0.21190202236175537, 0.1372893750667572, 0.03334752097725868, 0.04044831171631813, -0.09774800390005112, 0.012797074392437935, 0.06229313835501671, -0.06314536929130554, 0.16117295622825623, -0.023981008678674698, 0.03941992670297623, 0.02169063687324524, -0.11422761529684067, 0.015658805146813393, 0.029465237632393837, -0.020732004195451736, 0.01065848022699356, 0.02060205675661564, 0.003665475407615304, -0.02712429314851761, 0.05748070031404495, -0.22155004739761353, -0.11289173364639282, 0.031231330707669258, 0.025056155398488045, 0.1044345423579216, -0.024264270439743996, -0.11274220794439316, -0.1783670037984848, 0.10473749041557312, -0.04460565745830536, 0.006436141673475504, -0.07547485083341599, 0.09167817234992981, 0.02950809895992279, -0.01506049744784832, 0.027686554938554764, 0.07315289974212646, 0.1362854242324829, -0.06137697026133537, -0.057039692997932434, 0.06606566160917282, -0.10998231172561646, -0.10977671295404434, -0.013138446025550365, 0.19635643064975739, 0.14443105459213257, 0.04780768230557442, 0.07221083343029022, -0.020968375727534294, 0.030603259801864624, -0.059967461973428726, 0.09914146363735199, 0.09285953640937805, -0.0757589265704155, 0.09792662411928177, -0.0019751463551074266, -0.3455045819282532, -0.12619498372077942, -0.02275332249701023, 0.18286316096782684, 0.10049744695425034, -0.011993271298706532, 0.1480819135904312, 0.2786467373371124, -0.07115311920642853, -0.23065859079360962, 0.025419432669878006, 0.02249717339873314, 0.033916790038347244, 0.02733571268618107, -0.22958971560001373, 0.09918449819087982, 0.05315413326025009, 0.00201466865837574, -0.003392599057406187, -0.1831391155719757, -0.12857532501220703, 0.24209856986999512, -0.016912782564759254, 0.19292670488357544, -0.0035423776134848595, -0.051972854882478714, -0.05961953103542328, 0.03699759393930435, 0.0523386150598526, -0.12228987365961075, 0.08093477785587311, 0.07314394414424896, 0.021442722529172897, 0.021457582712173462, 0.03298809379339218, 0.06387230008840561, 0.09174641966819763, -0.018941935151815414, -0.054847318679094315, 0.06540945172309875, 0.024270201101899147, 0.05681564286351204, 0.09040165692567825, 0.07853623479604721, -0.04892871901392937, -0.014701644890010357, -0.07237698882818222, -0.07506133615970612, 0.09800844639539719, -0.0035217951517552137, -0.03155612573027611, 
-0.014087226241827011, 0.06654930859804153, -0.016375388950109482, 0.032749567180871964, -0.07913139462471008, -0.09471730887889862, 0.04004549980163574, 0.11473990231752396, 0.22723624110221863, -0.04291672632098198, 0.01894281432032585, -0.026131732389330864, -0.06982746720314026, 0.04387442395091057, 0.04303302243351936, 0.030729809775948524, 0.07298436015844345, 0.013733193278312683, 0.1120058000087738, -0.01113374438136816, -0.14343765377998352, 0.08174453675746918, 0.03168311342597008, -0.08057843148708344, -0.17605231702327728, -0.04676581546664238, 0.03154913708567619, 0.0001842430792748928, 0.0378834567964077, 0.21274515986442566, -0.038767267018556595, -0.05325303226709366, -0.021342314779758453, 0.06773964315652847, -0.033485282212495804, 0.10066292434930801, 0.050312574952840805, 0.005660280119627714, -0.13598798215389252, 0.08489910513162613, 0.06830546259880066, -0.06235842779278755, 0.09985419362783432, -0.0010822529438883066, -0.028360825031995773, -0.07407698780298233, -0.2163075953722, 0.05144396051764488, 0.07532928138971329, -0.09000702202320099, -0.013592182658612728, -0.13411813974380493, 0.04468068107962608, 0.11483573168516159, 0.0054636201821267605, -0.01792127639055252, -0.06280472129583359, -0.05141128972172737, -0.002876146463677287, 0.08608358353376389, 0.06207755208015442, -0.07445396482944489, -0.09712395071983337, 0.03978865593671799, 0.031390029937028885, 0.07059887051582336, -0.027405260130763054, -0.10014594346284866, -0.09547627717256546, 0.013884366489946842, -0.18859519064426422, -0.033591024577617645, -0.10419736802577972, 0.0025389210786670446, -0.016818031668663025, -0.04594813659787178, -0.029095815494656563, 0.019157133996486664, -0.05745899677276611, 0.030612971633672714, 0.036160312592983246, 0.09525539726018906, -0.14980864524841309, 0.04655294492840767, 0.027144094929099083, -0.031091194599866867, 0.12996016442775726, 0.08165425807237625, -0.03696060925722122, 0.05688063055276871, -0.1829821914434433, -0.04687988758087158, 0.0688813254237175, 0.07926593720912933, 0.012687258422374725, -0.14656798541545868, 0.0013503901427611709, 0.019998280331492424, 0.002954490715637803, -0.012110868468880653, 0.09409167617559433, -0.01795351132750511, 0.023063139989972115, -0.06884359568357468, -0.023437034338712692, -0.016934946179389954, 0.02831769734621048, 0.06996335834264755, 0.051437947899103165, 0.057945363223552704, -0.0899549052119255, 0.06786123663187027, -0.07212956249713898, 0.03987409546971321, -0.0430268831551075, -0.02301904186606407, 0.04958903416991234, -0.09890586137771606, 0.08331876248121262, -0.005808641668409109, 0.17155806720256805, 0.016175221651792526, 0.021577369421720505, 0.024535538628697395, -0.07075373083353043, -0.13249571621418, 0.05735104903578758, -0.00417289650067687, 0.06585627794265747, -0.0031214419286698103, -0.06809738278388977, -0.016134951263666153, 0.07778412103652954, 0.17256532609462738, 0.056156620383262634, 0.09456967562437057, 0.07752323150634766, 0.020680613815784454, 0.1409125179052353, -0.09479515254497528, -0.07344263046979904, 0.03622203692793846, -0.12417715787887573, 0.08011886477470398, -0.07216896861791611, 0.001788474852219224, 0.006853807717561722, -0.07703282684087753, 0.047003697603940964, -0.020778456702828407, -0.04417857155203819, -0.09679210931062698, -0.09901207685470581, -0.04982083663344383, -0.07421557605266571, -0.010226847603917122, -0.09061210602521896, 0.007494359277188778, -0.027883876115083694, 0.04210052266716957, -0.10162797570228577, 0.11856338381767273, -0.10061229765415192, 
-0.12853080034255981, 0.20597392320632935, -0.036306679248809814, -0.02324695512652397, -0.0974757969379425, 0.018626747652888298, 0.035934098064899445, 0.12802861630916595, 0.029091278091073036, 0.045938290655612946, -0.055917635560035706, 0.025568589568138123, -0.07068347185850143, -0.07693589478731155, 0.01976984180510044, -0.04955758526921272, -0.0025725795421749353, 0.1431283950805664, 0.11334726214408875, -0.03237137198448181, 0.018407614901661873, 0.10894329100847244, -0.031640417873859406, -0.015283677726984024, -0.15586945414543152, -0.02856469340622425, -0.018234090879559517, 0.0374063178896904, -0.045593976974487305, -0.09745112806558609, -0.018987266346812248, 0.20620118081569672, 0.1598614603281021, -0.07998201996088028, 0.036076467484235764, -0.0006750574102625251, 0.006172075401991606, -0.0012773032067343593, 0.032950181514024734, 0.13319693505764008, 0.21172328293323517, -0.005396961700171232, 0.021249448880553246, 0.0002749584673438221, -0.1113031730055809, -0.02538120746612549, 0.10802028328180313, 0.0008165938197635114, -0.023040181025862694, -0.010451008565723896, 0.16943009197711945, -0.1841069757938385, -0.23108038306236267, -0.12132736295461655, -0.09696178138256073, -0.139055997133255, 0.0018302052048966289, -0.06796923279762268, 0.17676249146461487, 0.05571053922176361, -0.00934088509529829, -0.009773831814527512, 0.14656329154968262, 0.0354103185236454, -0.008331537246704102, -0.04784120246767998, 0.046748291701078415, -0.12264999747276306, 0.050912827253341675, -0.022293364629149437, 0.016942860558629036, 0.02891680598258972, 0.09053272008895874, -0.04264691844582558, 0.022366920486092567, 0.009251384064555168, 0.010695851407945156, 0.029345804825425148, 0.13722245395183563, 0.022597316652536392, 0.1504879891872406, 0.12766382098197937, -0.08949737995862961, 0.004732797853648663, 0.007360893301665783, -0.018631264567375183, -0.03947293013334274, 0.12603236734867096, -0.11678916960954666, 0.08509603142738342, 0.0565933883190155, -0.05509316548705101, -0.040228698402643204, -0.022382361814379692, 0.02751566283404827, -0.01842370070517063, -0.0052440352737903595, -0.011192681267857552, -0.20958156883716583, 0.014551463536918163, -0.11389248818159103, 0.057466309517621994, -0.18773631751537323, -0.01070142537355423, 0.008833339437842369, -0.016044827178120613, -0.027168620377779007, 0.043872296810150146, 0.05266880616545677, -0.038809169083833694, -0.017244506627321243, 0.0393383651971817, 0.0358455628156662, 0.10413473099470139, -0.0664299800992012, -0.04033324494957924 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in French (`fr`); refer to Table 1 of the paper for more information.

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

For more information, see the official [VoxPopuli repository](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how to run inference with the model on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fr")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fr")

# load dataset
ds = load_dataset("common_voice", "fr", split="validation[:1%]")

# Common Voice audio is 48 kHz, while the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# inference
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
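The same model can also transcribe a single local recording. A minimal sketch, where `sample.wav` is a hypothetical placeholder file and the resampling mirrors the approach in the snippet above:

```python
# Sketch: transcribe one local file ("sample.wav" is a placeholder name)
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fr")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-fr")

speech, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:  # the model expects 16 kHz input
    speech = torchaudio.transforms.Resample(sample_rate, 16_000)(speech)

inputs = processor(speech[0], sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(processor.decode(torch.argmax(logits, dim=-1)[0]))
```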
{"language": "fr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-fr
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "fr", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "fr" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fr (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #fr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in fr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ …768-dimensional embedding vector omitted… ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model, pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed Croatian (hr) data (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the [official repository](https://github.com/facebookresearch/voxpopuli/) for more information.

# Usage for inference

The following shows how to run inference with the model on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-hr")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-hr")

# load dataset
ds = load_dataset("common_voice", "hr", split="validation[:1%]")

# Common Voice audio is sampled at 48 kHz; the model expects 16 kHz
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# preprocess the first 5 samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# run inference and greedily decode the most likely token ids
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)

print(processor.batch_decode(predicted_ids))
```
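To sanity-check the transcriptions, one can score them against the Common Voice references. This is a hedged sketch, not part of the original card: it assumes the `jiwer` package is installed, that `ds`, `processor`, and `predicted_ids` come from the snippet above, and that the reference transcripts live in Common Voice's `sentence` column.

```python
# Hedged sketch (assumes `pip install jiwer`); `ds`, `processor`, and
# `predicted_ids` are taken from the inference snippet above.
from jiwer import wer

hypotheses = processor.batch_decode(predicted_ids)
references = ds[:5]["sentence"]  # Common Voice reference transcripts
print(f"WER: {wer(references, hypotheses):.2%}")
```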
{"language": "hr", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-hr
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "hr", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "hr" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hr (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hr #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hr (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ …768-dimensional embedding vector omitted… ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model, pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed Hungarian (hu) data (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the [official repository](https://github.com/facebookresearch/voxpopuli/) for more information.

# Usage for inference

The following shows how to run inference with the model on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-hu")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-hu")

# load dataset
ds = load_dataset("common_voice", "hu", split="validation[:1%]")

# Common Voice audio is sampled at 48 kHz; the model expects 16 kHz
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# mapping fn to read in a sound file and resample it
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# preprocess the first 5 samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# run inference and greedily decode the most likely token ids
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)

print(processor.batch_decode(predicted_ids))
```
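Newer versions of the `datasets` library can perform the resampling step directly via the `Audio` feature, avoiding the manual torchaudio mapping above. The following is a sketch under stated assumptions, not part of the original card: it assumes a `datasets` version that exposes `Audio` (roughly 1.18+) and that the Common Voice split provides an `audio` column.

```python
# Hedged sketch: let `datasets` resample on the fly via the Audio feature
# (assumes datasets>=1.18 and that the split exposes an "audio" column).
from datasets import Audio, load_dataset

ds = load_dataset("common_voice", "hu", split="validation[:1%]")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

speech = ds[0]["audio"]["array"]  # float array, already resampled to 16 kHz
```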
{"language": "hu", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-hu
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "hu", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "hu" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hu #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hu (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hu (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hu #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hu (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #hu #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in hu (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ …768-dimensional embedding vector omitted… ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in it (Italian; refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-it")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-it")

# load dataset
ds = load_dataset("common_voice", "it", split="validation[:1%]")

# Common Voice audio is 48 kHz; the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
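As a supplementary sketch (not part of the original card), the same checkpoint can also be driven through the high-level `pipeline` API in `transformers`, which bundles the processor, the model, and greedy CTC decoding; `sample.wav` below is a hypothetical Italian recording, and automatic file decoding/resampling assumes a reasonably recent `transformers` with ffmpeg available.

```python
from transformers import pipeline

# wrap processor + model + greedy CTC decoding in one object
asr = pipeline(
    "automatic-speech-recognition",
    model="facebook/wav2vec2-base-10k-voxpopuli-ft-it",
)

# "sample.wav" is a hypothetical recording; file inputs are decoded
# (and resampled to the model's 16 kHz rate) via ffmpeg
print(asr("sample.wav"))  # -> {"text": "..."}
```

This trades the explicit control of the snippet above for brevity; the manual torchaudio route remains preferable when batching many utterances.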
{"language": "it", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-it
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "it", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "it" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned

Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of the VoxPopuli corpus and fine-tuned on the transcribed data in it (Italian; refer to Table 1 of the paper for more information).

Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation*

Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information here.

# Usage for inference

The following shows how the model can be used for inference on a sample of the Common Voice dataset.
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in it (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in it (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #it #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in it (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09134902060031891, 0.09064485877752304, -0.0023795629385858774, -0.01574917696416378, 0.10655370354652405, -0.08017381280660629, 0.1403677612543106, 0.05175754800438881, 0.08356725424528122, 0.06611648947000504, -0.003109343582764268, 0.024685131385922432, 0.10556115955114365, 0.09029187262058258, 0.030287092551589012, -0.20415520668029785, 0.04792051762342453, -0.0725761204957962, 0.13755163550376892, 0.04272441938519478, 0.08665870875120163, -0.08478636294603348, 0.03661565110087395, 0.07245463877916336, -0.044706325978040695, 0.04849162697792053, -0.0178001020103693, -0.12939530611038208, 0.10302940756082535, 0.07858628779649734, -0.03700088709592819, 0.0064415764063596725, 0.07993092387914658, -0.13567720353603363, 0.024805570021271706, 0.04200565442442894, 0.0071930596604943275, -0.0020525692962110043, 0.11664929240942001, -0.08655061572790146, 0.08223628997802734, 0.016933010891079903, -0.007444136310368776, 0.09349529445171356, -0.0854703038930893, -0.16123904287815094, -0.04367322847247124, 0.08475074172019958, 0.01968454197049141, 0.08466516435146332, -0.1119244173169136, 0.06034902110695839, -0.025695277377963066, 0.04193342104554176, 0.13877278566360474, -0.2377062290906906, -0.014800156466662884, -0.015535148791968822, 0.07898077368736267, -0.008845411241054535, -0.04493357241153717, 0.07096514850854874, 0.043335191905498505, 0.00937814824283123, -0.09911029040813446, -0.03128652647137642, 0.027306517586112022, -0.12166991084814072, -0.0931013822555542, 0.03719301149249077, 0.2498445361852646, 0.0808384120464325, -0.08268939703702927, -0.10656082630157471, 0.040006499737501144, 0.2186179757118225, -0.05681958794593811, -0.09621018171310425, -0.0041873762384057045, 0.00845259428024292, 0.029338466003537178, -0.05280780792236328, -0.06807297468185425, 0.004979058634489775, -0.049806833267211914, 0.08433562517166138, 0.0036637720186263323, 0.009675854817032814, -0.031807463616132736, -0.016856811940670013, -0.04833522066473961, -0.07173991948366165, 0.03000337816774845, -0.08012552559375763, -0.07158888876438141, 0.004915847443044186, -0.043932314962148666, -0.13582123816013336, 0.054383181035518646, 0.016311390325427055, 0.09536053985357285, 0.020018374547362328, -0.15709634125232697, 0.038144443184137344, 0.11182054877281189, 0.063468337059021, -0.20520444214344025, -0.07865852117538452, -0.008794627152383327, -0.03699405491352081, 0.012342685833573341, -0.03066226653754711, -0.07165652513504028, 0.01519037876278162, -0.03156281262636185, 0.03193649277091026, 0.014687987044453621, -0.03480999544262886, -0.09386251866817474, -0.10210975259542465, -0.010008035227656364, -0.06327126175165176, 0.0012909320648759604, 0.0687335655093193, -0.046113189309835434, 0.18419483304023743, -0.02048431523144245, 0.09696190059185028, -0.10505238175392151, -0.08256837725639343, -0.01497811172157526, 0.033927250653505325, 0.03952819108963013, -0.06701187789440155, 0.0430084653198719, 0.02690116874873638, 0.013127915561199188, -0.16484546661376953, -0.04149959608912468, -0.09242897480726242, -0.0311672892421484, -0.0860748440027237, 0.005851092282682657, -0.06499864906072617, 0.05671601742506027, -0.035251300781965256, -0.03660452738404274, -0.028439274057745934, -0.05701644346117973, 0.012973821721971035, -0.06089448556303978, 0.10425553470849991, 0.01697723940014839, 0.08824291825294495, -0.02057044208049774, -0.044012729078531265, -0.10342949628829956, 0.14244922995567322, -0.10714065283536911, 0.027130914852023125, -0.13121020793914795, -0.04241977259516716, -0.05423732101917267, 
0.051808636635541916, 0.06715182960033417, 0.1266336590051651, -0.22252333164215088, -0.10286101698875427, 0.16670338809490204, -0.07580003887414932, -0.04179589822888374, 0.21154023706912994, 0.025478903204202652, 0.09118541330099106, 0.11092162132263184, 0.21547697484493256, -0.01916254125535488, -0.14782463014125824, -0.061674125492572784, -0.03272119536995888, 0.03952346369624138, 0.08107351511716843, 0.041906729340553284, -0.05475077033042908, 0.04550905153155327, 0.002900213235989213, 0.08071544021368027, -0.03458324819803238, -0.013637647964060307, -0.048108574002981186, 0.043557148426771164, -0.08394782245159149, 0.07670824974775314, -0.0063757807947695255, -0.030450019985437393, -0.013676942326128483, -0.05590884014964104, -0.04401281848549843, 0.07110047340393066, -0.05020693689584732, 0.02683701179921627, -0.1284402310848236, 0.11923669278621674, 0.013247000053524971, 0.03078250028192997, -0.14309585094451904, 0.12360288947820663, 0.005127302370965481, -0.0261722169816494, 0.11663707345724106, 0.0878247618675232, -0.026085099205374718, -0.017045723274350166, 0.009816049598157406, 0.02370934933423996, -0.037238709628582, -0.0000761649280320853, -0.012938295491039753, -0.11747031658887863, 0.026533911004662514, -0.06115783751010895, 0.09113869816064835, -0.09204985946416855, -0.023623673245310783, 0.05472259968519211, 0.10021606087684631, 0.004485628567636013, -0.010241091251373291, 0.06621558964252472, 0.05699998140335083, 0.03737163543701172, 0.006327479612082243, 0.04110930114984512, -0.013413903303444386, -0.026683524250984192, 0.07153604924678802, -0.08550937473773956, 0.06298857182264328, 0.11790265142917633, -0.0900348350405693, -0.01539279893040657, 0.04585067555308342, -0.005156177561730146, -0.011094293557107449, -0.041183460503816605, -0.05258585512638092, 0.2658243179321289, 0.012070458382368088, 0.05681384727358818, -0.06911686807870865, 0.019251849502325058, 0.03838462010025978, -0.04921504855155945, -0.08073921501636505, 0.07587374746799469, 0.027505341917276382, -0.07393475621938705, 0.007118950132280588, 0.09612417221069336, 0.06578537076711655, 0.1778845489025116, 0.006878354120999575, -0.08893106132745743, -0.015512500889599323, -0.08045846968889236, -0.047151852399110794, 0.060850683599710464, -0.2622227966785431, -0.053424105048179626, 0.02346595749258995, -0.01366897951811552, 0.02848079614341259, -0.013116007670760155, 0.023014936596155167, -0.032618869096040726, -0.031403250992298126, -0.024859974160790443, 0.019925283268094063, 0.00659947469830513, 0.08974713832139969, -0.009782521985471249, -0.05512480065226555, -0.044950250536203384, -0.06500893831253052, -0.11080631613731384, 0.06987303495407104, -0.07901222258806229, -0.40822741389274597, -0.025228356942534447, -0.0730215534567833, -0.06703291088342667, 0.016717858612537384, 0.03334667161107063, -0.10384401679039001, -0.06846558302640915, -0.03542958199977875, 0.11167927086353302, 0.04074012488126755, -0.04046367108821869, 0.11760853230953217, 0.0027918091509491205, 0.056563254445791245, -0.10284017771482468, -0.0013127649435773492, -0.07012729346752167, -0.047489866614341736, -0.06475566327571869, 0.05504193156957626, 0.04547061771154404, 0.09927928447723389, 0.08191247284412384, -0.0031558191403746605, -0.01039276272058487, 0.16325820982456207, -0.12181495130062103, 0.007613146211951971, 0.19716130197048187, -0.07462390512228012, -0.02550029754638672, 0.10835708677768707, 0.02041546255350113, -0.10122675448656082, 0.04651268571615219, 0.01564466953277588, -0.0010708397021517158, -0.23784393072128296, 
-0.1485576629638672, -0.030186835676431656, 0.071855328977108, 0.042828962206840515, -0.023915253579616547, 0.0038910459261387587, -0.021204905584454536, -0.04005499929189682, 0.001961869653314352, 0.032289039343595505, -0.004190959967672825, 0.1485644280910492, -0.0628446489572525, 0.09945060312747955, -0.0467076450586319, -0.03336447477340698, 0.0909477099776268, 0.007772716693580151, 0.05894267186522484, 0.10113627463579178, 0.02410946600139141, 0.06439737975597382, 0.018736422061920166, 0.02028190717101097, -0.01460203155875206, 0.004553188104182482, -0.0011842184467241168, -0.030240003019571304, -0.05581294000148773, -0.003301473567262292, 0.06494349986314774, 0.1832466721534729, -0.12838852405548096, -0.0897161215543747, -0.023345664143562317, 0.06805147230625153, 0.14285412430763245, 0.12798449397087097, -0.046303361654281616, -0.07774365693330765, 0.028371131047606468, -0.09489990770816803, -0.03930317610502243, 0.10091102868318558, 0.09647166728973389, -0.15887339413166046, 0.10575146228075027, 0.08294326812028885, 0.049303088337183, -0.09690078347921371, 0.04724957048892975, -0.13127882778644562, -0.026736130937933922, 0.01030611153692007, 0.037498217076063156, -0.2122049182653427, 0.13831235468387604, 0.033887263387441635, 0.039676420390605927, -0.09642553329467773, 0.011721394024789333, 0.06334683299064636, -0.06613422930240631, 0.16167771816253662, -0.023601585999131203, 0.03137701377272606, 0.027003852650523186, -0.11554092913866043, 0.0162721648812294, 0.03173165023326874, -0.01900257170200348, 0.009222922846674919, 0.019158244132995605, 0.0025012146215885878, -0.02789466083049774, 0.05825581029057503, -0.223064586520195, -0.11152090132236481, 0.02870376966893673, 0.02166251838207245, 0.1069941520690918, -0.02442299760878086, -0.11100257188081741, -0.17270226776599884, 0.10567118227481842, -0.04448751360177994, 0.00805656798183918, -0.07429465651512146, 0.09333551675081253, 0.027984553948044777, -0.01659984327852726, 0.026549750939011574, 0.07254919409751892, 0.13420018553733826, -0.059504661709070206, -0.05725846812129021, 0.06969954818487167, -0.11010710150003433, -0.10986938327550888, -0.013701136223971844, 0.19608299434185028, 0.14391259849071503, 0.0496540404856205, 0.07296840101480484, -0.020233139395713806, 0.030282381922006607, -0.06042179837822914, 0.10380587726831436, 0.09313861280679703, -0.07512785494327545, 0.1009388417005539, -0.002058852929621935, -0.3477655053138733, -0.12611091136932373, -0.020746460184454918, 0.18616200983524323, 0.09780020266771317, -0.0134363928809762, 0.14691713452339172, 0.28300783038139343, -0.07402093708515167, -0.2358686327934265, 0.022645749151706696, 0.021224698051810265, 0.03119221143424511, 0.028794867917895317, -0.23261477053165436, 0.0980895608663559, 0.0533134900033474, 0.001778645091690123, -0.010248535312712193, -0.1823602020740509, -0.1298208087682724, 0.2411172240972519, -0.017647037282586098, 0.19215819239616394, -0.0020413235761225224, -0.05148819833993912, -0.060605451464653015, 0.04284299165010452, 0.05205114930868149, -0.11988125741481781, 0.08111041784286499, 0.07271147519350052, 0.020128946751356125, 0.019934143871068954, 0.031399089843034744, 0.06321489810943604, 0.09123874455690384, -0.018076132982969284, -0.05605475604534149, 0.06401719897985458, 0.02669019065797329, 0.054792620241642, 0.08903007209300995, 0.07956602424383163, -0.04943243786692619, -0.014016595669090748, -0.0715501606464386, -0.07548974454402924, 0.0955757424235344, -0.0033628884702920914, -0.03294987604022026, -0.014843971468508244, 
0.06708694994449615, -0.018178049474954605, 0.033309001475572586, -0.07157979160547256, -0.0955388993024826, 0.03974239528179169, 0.11131339520215988, 0.23154932260513306, -0.045547451823949814, 0.02128743752837181, -0.0285190362483263, -0.06959268450737, 0.043432362377643585, 0.04255819693207741, 0.03157056123018265, 0.07252595573663712, 0.013273727148771286, 0.11266834288835526, -0.011135508306324482, -0.14346984028816223, 0.08381017297506332, 0.03322460129857063, -0.08069583773612976, -0.18061944842338562, -0.046737246215343475, 0.0329943522810936, -0.0005901122349314392, 0.035350948572158813, 0.21176375448703766, -0.034983959048986435, -0.05360301956534386, -0.021673301234841347, 0.0682087317109108, -0.03270668908953667, 0.09819617122411728, 0.04828402400016785, 0.005737678613513708, -0.13385432958602905, 0.08667400479316711, 0.06952875107526779, -0.06492887437343597, 0.10088928043842316, 0.0012029442004859447, -0.026698729023337364, -0.07459867000579834, -0.21780693531036377, 0.05364473909139633, 0.07701349258422852, -0.08938787877559662, -0.015887193381786346, -0.1312718391418457, 0.044129811227321625, 0.11004029214382172, 0.00452555064111948, -0.016252433881163597, -0.0631488710641861, -0.05108712613582611, -0.0008065678994171321, 0.08632770925760269, 0.05887637287378311, -0.07532409578561783, -0.09525054693222046, 0.03748453035950661, 0.035620953887701035, 0.07082119584083557, -0.026680346578359604, -0.10413026064634323, -0.09641174972057343, 0.01396278664469719, -0.19133710861206055, -0.03477328643202782, -0.10639829933643341, 0.00045410278835333884, -0.015478401444852352, -0.043749261647462845, -0.030282462015748024, 0.017732512205839157, -0.0578121691942215, 0.029174594208598137, 0.037270687520504, 0.09898243844509125, -0.151168093085289, 0.0466357097029686, 0.027358464896678925, -0.02967255748808384, 0.12949752807617188, 0.07970570027828217, -0.03440070152282715, 0.05766601487994194, -0.18289537727832794, -0.04799440875649452, 0.06683177500963211, 0.07894448935985565, 0.011668098159134388, -0.14572013914585114, 0.0026522409170866013, 0.018857209011912346, 0.001271665096282959, -0.012420978397130966, 0.08749738335609436, -0.019936438649892807, 0.028414685279130936, -0.06745564937591553, -0.02621086686849594, -0.017448659986257553, 0.029735514894127846, 0.06518686562776566, 0.04792334884405136, 0.058552928268909454, -0.0918312594294548, 0.06600116193294525, -0.07563398778438568, 0.039692677557468414, -0.043512992560863495, -0.022981423884630203, 0.0482042133808136, -0.10094963759183884, 0.08241844922304153, -0.005782498978078365, 0.16963008046150208, 0.019528960809111595, 0.019905265420675278, 0.027237262576818466, -0.0663037896156311, -0.13374018669128418, 0.055493030697107315, -0.00873627420514822, 0.0631764754652977, -0.004199614282697439, -0.06651621311903, -0.015404712408781052, 0.07746151834726334, 0.16950997710227966, 0.05562188848853111, 0.09392556548118591, 0.08169751614332199, 0.020939746871590614, 0.1408870816230774, -0.09390928596258163, -0.07965365052223206, 0.03613007441163063, -0.12320341169834137, 0.08127278834581375, -0.07148566842079163, -0.004108842462301254, 0.0074129668064415455, -0.07904227823019028, 0.04551132023334503, -0.02236945740878582, -0.04340406134724617, -0.09712682664394379, -0.10120420902967453, -0.0501282773911953, -0.07744351029396057, -0.009272954426705837, -0.08977142721414566, 0.004545741714537144, -0.025159865617752075, 0.04159044474363327, -0.10419310629367828, 0.11798017472028732, -0.1001724973320961, -0.13012897968292236, 
0.20489619672298431, -0.035185426473617554, -0.020706025883555412, -0.09905066341161728, 0.019429706037044525, 0.03371414542198181, 0.12765346467494965, 0.029330004006624222, 0.0466361865401268, -0.056807003915309906, 0.025379959493875504, -0.07200530916452408, -0.07742106169462204, 0.020644430071115494, -0.04871184378862381, -0.004877656232565641, 0.14300662279129028, 0.11316853016614914, -0.03246660903096199, 0.019868044182658195, 0.1082478016614914, -0.030297785997390747, -0.01328909583389759, -0.1541711539030075, -0.02071109227836132, -0.018354155123233795, 0.03579867631196976, -0.04456467926502228, -0.0957302451133728, -0.017562732100486755, 0.20144201815128326, 0.16110986471176147, -0.07761860638856888, 0.03489353880286217, -0.0025770687498152256, 0.006894433870911598, -0.0007486467366106808, 0.03669184073805809, 0.13246017694473267, 0.21114325523376465, -0.004530563484877348, 0.022673673927783966, -0.0024142328184098005, -0.11180296540260315, -0.022701628506183624, 0.10271120071411133, 0.0011014938354492188, -0.021680062636733055, -0.009185460396111012, 0.17118105292320251, -0.18505315482616425, -0.22801071405410767, -0.12031079083681107, -0.09623511880636215, -0.13814422488212585, 0.0030799556989222765, -0.061590537428855896, 0.17715927958488464, 0.05669792741537094, -0.00817457027733326, -0.011723218485713005, 0.14521566033363342, 0.03568699583411217, -0.010047891177237034, -0.04694908484816551, 0.047265663743019104, -0.12590312957763672, 0.057362329214811325, -0.023516766726970673, 0.012522037141025066, 0.03055647201836109, 0.08980067074298859, -0.0420260988175869, 0.02337690070271492, 0.010498176328837872, 0.013595146127045155, 0.030446544289588928, 0.13397757709026337, 0.02374880202114582, 0.15421327948570251, 0.12932851910591125, -0.08607253432273865, 0.0064004650339484215, 0.0038983931299299, -0.02033281698822975, -0.037950750440359116, 0.12771287560462952, -0.11628164350986481, 0.08662980049848557, 0.05591706931591034, -0.054101914167404175, -0.040867626667022705, -0.022327205166220665, 0.026269113644957542, -0.01947229541838169, -0.004839490167796612, -0.012342065572738647, -0.20584560930728912, 0.013596901670098305, -0.11280542612075806, 0.05685283616185188, -0.1886831820011139, -0.010720247402787209, 0.013621044345200062, -0.015469749458134174, -0.028707154095172882, 0.0442795529961586, 0.05033661797642708, -0.039047278463840485, -0.017270050942897797, 0.04353923723101616, 0.03595671430230141, 0.10397518426179886, -0.06729014217853546, -0.039365172386169434 ]
null
null
null
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in lt (Lithuanian; refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-lt")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-lt")

# load dataset
ds = load_dataset("common_voice", "lt", split="validation[:1%]")

# Common Voice audio is 48 kHz; the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
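As a hedged follow-up (not part of the original card), the greedy transcriptions produced above can be scored against Common Voice's reference sentences with the external `jiwer` package; `ds`, `processor`, and `model` are assumed to be the objects built in the snippet above, and `sentence` is Common Voice's column name for the reference transcript.

```python
import torch
from jiwer import wer  # external dependency: pip install jiwer

# transcribe the same 5 validation samples as in the card's snippet
inputs = processor(ds[:5]["speech"], sampling_rate=16000,
                   return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits
predictions = processor.batch_decode(torch.argmax(logits, dim=-1))

# Common Voice keeps the reference transcript in the "sentence" column
references = ds[:5]["sentence"]

# word error rate over the 5 utterances (lower is better); a serious
# evaluation would normalize casing and punctuation on both sides first
print(wer(references, predictions))
```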
{"language": "lt", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-lt
[ "audio", "automatic-speech-recognition", "voxpopuli", "lt", "arxiv:2101.00390", "license:cc-by-nc-4.0", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "lt" ]
TAGS #audio #automatic-speech-recognition #voxpopuli #lt #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned

Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of the VoxPopuli corpus and fine-tuned on the transcribed data in lt (Lithuanian; refer to Table 1 of the paper for more information).

Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation*

Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information here.

# Usage for inference

The following shows how the model can be used for inference on a sample of the Common Voice dataset.
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in lt (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#audio #automatic-speech-recognition #voxpopuli #lt #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in lt (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 45, 163, 30 ]
[ "passage: TAGS\n#audio #automatic-speech-recognition #voxpopuli #lt #arxiv-2101.00390 #license-cc-by-nc-4.0 #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in lt (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.07576017826795578, -0.003191756084561348, -0.0007253228104673326, -0.009054006077349186, 0.07737651467323303, -0.03119838424026966, 0.1600542664527893, 0.04717142507433891, 0.16748760640621185, 0.07181598246097565, -0.003422314999625087, 0.02985331043601036, 0.061837684363126755, 0.014013818465173244, 0.0015813843347132206, -0.19913220405578613, 0.04126093164086342, -0.07360070943832397, 0.20425152778625488, 0.032898180186748505, 0.07756521552801132, -0.05689379572868347, 0.005250229965895414, 0.046590086072683334, -0.02960476279258728, 0.007433892693370581, 0.003461610758677125, -0.11096461862325668, 0.12040511518716812, 0.07576315850019455, -0.04961733520030975, -0.007165540475398302, 0.07981839030981064, -0.14561791718006134, 0.008419225923717022, -0.0019372686510905623, 0.007493382319808006, -0.015123418532311916, 0.06575833261013031, -0.06805184483528137, 0.10935425758361816, 0.13214486837387085, 0.02147258073091507, 0.06758362799882889, -0.10929641872644424, -0.0847315713763237, -0.0032517160288989544, 0.020034918561577797, 0.05015082284808159, 0.06247454136610031, -0.107169508934021, 0.07349367439746857, -0.05896630510687828, 0.04600046947598457, 0.029645895585417747, -0.24156321585178375, -0.006599960848689079, 0.08890105038881302, 0.10079105943441391, 0.013295914977788925, -0.02027531899511814, 0.06281808763742447, 0.06587136536836624, 0.018272364512085915, -0.12290109694004059, -0.041391562670469284, -0.03942505642771721, -0.11149397492408752, -0.110425665974617, 0.03942082077264786, 0.32730692625045776, 0.04495932161808014, -0.07932732254266739, -0.09947312623262405, 0.052749451249837875, 0.21857336163520813, -0.04466186836361885, -0.08910307288169861, 0.005052960943430662, 0.032337237149477005, 0.07639683783054352, -0.0045998841524124146, -0.04329608753323555, -0.05475388839840889, -0.026114964857697487, 0.1213517114520073, 0.004954823758453131, 0.005456465296447277, -0.06517698615789413, -0.008281445130705833, -0.21161793172359467, -0.05349479615688324, 0.012890282087028027, -0.06352950632572174, -0.05159446597099304, -0.00861304346472025, -0.05761313810944557, -0.11946012079715729, 0.05529593676328659, 0.08512995392084122, 0.13455620408058167, 0.047312576323747635, -0.13347196578979492, 0.05184505879878998, 0.11605382710695267, -0.013006329536437988, -0.15142472088336945, -0.03673761710524559, -0.014262359589338303, -0.022496355697512627, 0.046139221638441086, -0.047491591423749924, -0.10250169783830643, 0.04438018426299095, -0.04947420209646225, 0.038089632987976074, -0.011696603149175644, -0.01235623937100172, -0.1099044680595398, -0.08285451680421829, -0.05261659994721413, -0.03152599185705185, -0.03541674464941025, 0.06101479381322861, -0.06418471783399582, 0.09662799537181854, -0.01649649813771248, 0.1016584113240242, -0.07576686888933182, -0.07306176424026489, -0.043749239295721054, 0.025887861847877502, 0.06197132170200348, -0.08084768801927567, 0.023917408660054207, -0.0037043823394924402, 0.02350758947432041, -0.16049717366695404, -0.0635586753487587, -0.12530046701431274, -0.014663973823189735, -0.1128295287489891, -0.020831292495131493, -0.10714416205883026, 0.042933300137519836, -0.0027931309305131435, -0.04190707579255104, -0.025030676275491714, -0.06942310184240341, 0.0016042692586779594, -0.008772014640271664, 0.113739512860775, -0.016738206148147583, 0.06826005131006241, -0.046920377761125565, -0.023116927593946457, -0.08860144019126892, 0.1493164598941803, -0.09961661696434021, 0.01605190522968769, -0.13183866441249847, -0.00044933348544873297, 
-0.061163246631622314, 0.06897210329771042, 0.055842719972133636, 0.1439918875694275, -0.22712573409080505, -0.08457645028829575, 0.1410415917634964, -0.08453377336263657, -0.04079102352261543, 0.18846727907657623, 0.020829349756240845, 0.14981798827648163, 0.10089140385389328, 0.32797864079475403, -0.054316967725753784, -0.11663911491632462, 0.00410511763766408, -0.012915008701384068, 0.0854686051607132, 0.09881042689085007, 0.054732467979192734, -0.06344621628522873, -0.02509511634707451, -0.0012430842034518719, 0.047656383365392685, -0.04695450887084007, -0.012865656986832619, -0.05029381439089775, 0.019575556740164757, -0.08259239792823792, 0.04461560398340225, -0.010286022908985615, -0.024219196289777756, -0.018132219091057777, -0.045762937515974045, -0.05643255636096001, 0.09694010019302368, -0.08146662265062332, 0.025621572509407997, -0.14883212745189667, 0.154174342751503, -0.00013197207590565085, 0.02440142259001732, -0.1501941978931427, 0.17072103917598724, 0.014897878281772137, 0.03536192327737808, 0.10328695923089981, 0.07060843706130981, -0.03743172809481621, -0.010535502806305885, 0.012449526228010654, 0.052536945790052414, 0.041810423135757446, 0.00447753444314003, -0.03342442587018013, -0.1799531877040863, 0.02433694899082184, -0.07739289849996567, 0.08444100618362427, -0.128616601228714, -0.02199908159673214, -0.002243644092231989, 0.044916991144418716, -0.016557995229959488, -0.02084314450621605, 0.07874644547700882, 0.07820042967796326, 0.01165086217224598, 0.005949634127318859, 0.07127396017313004, -0.00314600532874465, -0.07169267535209656, 0.09141995757818222, -0.1568290740251541, -0.08953137695789337, 0.08097148686647415, -0.10809793323278427, -0.029137276113033295, 0.0073684933595359325, 0.027651136741042137, -0.014717862010002136, -0.040412843227386475, -0.042288634926080704, 0.24630169570446014, -0.03370935842394829, 0.048719871789216995, -0.07362524420022964, 0.03047802485525608, 0.03783832862973213, -0.0788964033126831, -0.03659069538116455, 0.10573963820934296, 0.039168067276477814, -0.03174144774675369, -0.007627928163856268, 0.029386555776000023, 0.026878733187913895, 0.1595020294189453, 0.0041061220690608025, -0.05593254044651985, -0.025784233585000038, -0.07108332961797714, -0.04033825919032097, 0.07548452168703079, -0.3168206214904785, -0.08049198985099792, 0.02389702945947647, 0.0161694698035717, 0.03316386416554451, -0.03139950707554817, 0.01706121489405632, -0.025186125189065933, -0.014933442696928978, -0.06019657850265503, 0.01585702784359455, -0.0750238299369812, 0.0866687223315239, -0.05487051606178284, -0.0859871432185173, -0.022304261103272438, -0.06792080402374268, -0.15368060767650604, 0.0971950814127922, -0.04052707180380821, -0.2507229745388031, 0.012912219390273094, -0.10161008685827255, -0.02918495237827301, 0.03479394689202309, 0.03514214977622032, -0.07449141144752502, -0.06227731332182884, -0.05632103607058525, 0.09585792571306229, 0.022232819348573685, -0.047982241958379745, 0.049721747636795044, -0.022340958938002586, 0.027537962421774864, -0.09820069372653961, -0.008127985522150993, -0.08412837982177734, -0.03830166533589363, -0.0415104441344738, -0.01020719949156046, 0.027006087824702263, 0.20617742836475372, 0.10975721478462219, -0.02519458532333374, -0.019050661474466324, 0.18336540460586548, -0.09373179078102112, -0.0047170380130410194, 0.18325595557689667, -0.08259377628564835, -0.039948929101228714, 0.16391369700431824, 0.025993788614869118, -0.12231142818927765, 0.042334578931331635, -0.004330724943429232, -0.04131224751472473, 
-0.21303172409534454, -0.15832599997520447, -0.036882057785987854, 0.09312833100557327, 0.05961279571056366, -0.03803027421236038, 0.026571620255708694, 0.005497980862855911, -0.019097188487648964, 0.022204311564564705, 0.03572836518287659, 0.019173037260770798, 0.15960849821567535, -0.043829772621393204, 0.1246376484632492, -0.022076504305005074, -0.007277755066752434, 0.051594849675893784, 0.004435907118022442, 0.04879024252295494, 0.12105127424001694, 0.04238443449139595, 0.07836628705263138, 0.0011419373331591487, 0.027091126888990402, -0.02748137153685093, -0.03416193276643753, 0.004709266591817141, -0.05489136651158333, -0.07170853763818741, -0.017721062526106834, 0.08015091717243195, 0.1274813860654831, -0.11862649768590927, -0.09135765582323074, -0.010796802118420601, 0.05492987111210823, 0.07773778587579727, 0.1837073266506195, -0.06013764068484306, -0.06956318765878677, 0.04646183177828789, -0.0813155397772789, -0.005605929531157017, 0.13952165842056274, 0.14350709319114685, -0.10348621010780334, 0.09405897557735443, 0.09433648735284805, 0.04979971796274185, -0.08072768151760101, 0.05083028972148895, -0.16193439066410065, -0.02133268490433693, -0.0002521845162846148, 0.04104467108845711, -0.24174369871616364, 0.14201365411281586, 0.027192294597625732, 0.040725428611040115, -0.11873079091310501, -0.022991059347987175, 0.057680193334817886, -0.0701964721083641, 0.17761856317520142, -0.022775642573833466, -0.024149730801582336, 0.0021843735594302416, -0.11530282348394394, 0.040748998522758484, 0.029297972097992897, -0.026431739330291748, 0.026452772319316864, 0.022993162274360657, 0.04162651300430298, -0.006277117412537336, 0.04829781502485275, -0.2199147492647171, -0.08941194415092468, 0.0005972751532681286, 0.0626106709241867, 0.0723494440317154, 0.011423678137362003, -0.07460550218820572, -0.133317768573761, -0.030550898984074593, -0.06104189530014992, 0.006624995265156031, -0.03467201814055443, 0.08662773668766022, 0.0325130969285965, 0.0016358379507437348, 0.015314887277781963, 0.07035751640796661, 0.08102370798587799, -0.060925956815481186, -0.017495643347501755, 0.10699699074029922, -0.12582556903362274, -0.09555567055940628, -0.029287835583090782, 0.1731608510017395, 0.13576258718967438, 0.08056823164224625, 0.08077473938465118, 0.011453872546553612, 0.04874667152762413, -0.05515950173139572, 0.1159227043390274, 0.0411817692220211, -0.0942595899105072, 0.09825251251459122, -0.03291592746973038, -0.2616277039051056, -0.1297530084848404, -0.05802469700574875, 0.13242433965206146, 0.09539081156253815, 0.009933172725141048, 0.14998354017734528, 0.28177735209465027, -0.11172787100076675, -0.20754781365394592, 0.008896821178495884, 0.009723014198243618, 0.04104122892022133, 0.041131287813186646, -0.2199203073978424, 0.09877081215381622, 0.051519304513931274, 0.005320359952747822, 0.013892588205635548, -0.24162183701992035, -0.1337740272283554, 0.2403242588043213, 0.008082980290055275, 0.1883414089679718, -0.0436578243970871, -0.0680599957704544, -0.0659652054309845, 0.06730671226978302, 0.047344669699668884, -0.11036598682403564, 0.09601815044879913, 0.07465142011642456, -0.004006019327789545, 0.02809310145676136, 0.04602041468024254, 0.11642809957265854, 0.12684865295886993, -0.015655918046832085, -0.05460254102945328, 0.004772562067955732, -0.01938067376613617, 0.058451395481824875, 0.06667699664831161, 0.04549412801861763, -0.03163789585232735, -0.006125269457697868, -0.08130204677581787, -0.04963449388742447, 0.05696888267993927, -0.004185781814157963, -0.038701415061950684, 
-0.008025438524782658, 0.05205870792269707, -0.02685011364519596, 0.04005788266658783, -0.06481120735406876, -0.14274010062217712, 0.07850589603185654, 0.0964125394821167, 0.2592504322528839, -0.03449980542063713, 0.012455573305487633, 0.001903953612782061, -0.06590702384710312, 0.07201148569583893, -0.047028131783008575, 0.009599130600690842, 0.07062210887670517, 0.05715688690543175, 0.13368196785449982, 0.016786061227321625, -0.10759156942367554, 0.09238465130329132, 0.02894260361790657, -0.056667033582925797, -0.14289076626300812, -0.015290581621229649, 0.049902621656656265, -0.0020882836543023586, 0.018015632405877113, 0.1861843466758728, -0.05058478191494942, -0.02343691885471344, -0.009456437081098557, 0.06113520264625549, -0.05263923481106758, 0.15468788146972656, 0.06843963265419006, 0.024513505399227142, -0.12252812087535858, 0.08776542544364929, 0.04532971233129501, -0.029532158747315407, 0.113851398229599, -0.04134652391076088, -0.04298144578933716, -0.06872343271970749, -0.27226021885871887, 0.06673823297023773, 0.0387335941195488, -0.1095905750989914, 0.0004969913861714303, -0.12857216596603394, 0.015496090054512024, 0.0934949517250061, 0.021894674748182297, -0.023997269570827484, -0.09763176739215851, -0.062064770609140396, -0.026632534340023994, 0.07218623906373978, 0.06480055302381516, -0.05689701810479164, -0.08689489960670471, 0.04800998792052269, 0.049089252948760986, 0.03809945285320282, -0.04478468373417854, -0.12395850569009781, -0.15516619384288788, 0.042912986129522324, -0.18645310401916504, 0.025272123515605927, -0.12645679712295532, 0.0032907268032431602, 0.01186898723244667, -0.006238239351660013, -0.05383770912885666, -0.0033555440604686737, -0.07358547300100327, 0.05098384618759155, 0.05332523584365845, 0.10831186920404434, -0.07985866069793701, 0.04067937284708023, 0.024961762130260468, -0.029542962089180946, 0.10151640325784683, 0.08235284686088562, -0.04785723239183426, 0.07509301602840424, -0.19427178800106049, -0.0017684713238850236, 0.055412907153367996, 0.06620507687330246, -0.030451111495494843, -0.15210293233394623, -0.04118551313877106, 0.012156592682003975, 0.010358105413615704, -0.006791797466576099, 0.0773698017001152, -0.02306857332587242, 0.04419509693980217, -0.06008080393075943, -0.05063507333397865, -0.02358081378042698, 0.019501350820064545, 0.022683294489979744, 0.043611444532871246, 0.055973537266254425, -0.0970023050904274, 0.05557059124112129, -0.06474640965461731, 0.05627528950572014, -0.04616953432559967, 0.012157615274190903, -0.022550705820322037, -0.09155651926994324, 0.062386468052864075, -0.028507091104984283, 0.17420464754104614, 0.029282862320542336, -0.009959300048649311, -0.008535439148545265, 0.008346788585186005, -0.06523040682077408, 0.035588573664426804, 0.08387202024459839, 0.09238313883543015, 0.004456077236682177, -0.0579986646771431, 0.04011233150959015, 0.06594636291265488, 0.22754248976707458, 0.011796186678111553, 0.044794805347919464, 0.07272603362798691, 0.033981796354055405, 0.1154560074210167, -0.10526420176029205, 0.017030328512191772, 0.10589434951543808, -0.049012795090675354, 0.09314268827438354, -0.03625379130244255, 0.011188211850821972, -0.00129693525377661, -0.06698186695575714, 0.04101387783885002, -0.0324091836810112, -0.07130642235279083, -0.1394605040550232, -0.1848713606595993, -0.032378990203142166, -0.1030125543475151, -0.02647743746638298, -0.1071493849158287, -0.02999078296124935, 0.08019472658634186, 0.040658023208379745, -0.07320032268762589, 0.06782346218824387, -0.1749964952468872, 
-0.1232338398694992, 0.16774287819862366, -0.04820241034030914, -0.01927555911242962, -0.15578697621822357, -0.002803556388244033, 0.05956495180726051, 0.08269370347261429, 0.0002716625458560884, 0.02352343685925007, -0.011740019544959068, 0.017231088131666183, -0.036831505596637726, -0.0596265010535717, 0.004749412182718515, -0.02922273427248001, 0.01651652529835701, 0.14400920271873474, 0.09524346143007278, -0.05807945504784584, 0.0302510317414999, 0.12682539224624634, -0.012493012472987175, -0.04038652777671814, -0.16818799078464508, 0.029239187017083168, -0.08479330688714981, 0.028790198266506195, -0.06085975468158722, -0.10194914788007736, 0.016513194888830185, 0.22191761434078217, 0.16459892690181732, -0.07681255042552948, 0.015000816434621811, 0.010511105880141258, 0.0027115358971059322, 0.01660637930035591, 0.003760766005143523, 0.06293778121471405, 0.18883433938026428, -0.01913200318813324, 0.03994125872850418, -0.001656626001931727, -0.09087610989809036, 0.012391005642712116, 0.06920990347862244, 0.03412216901779175, -0.04853493720293045, -0.027687346562743187, 0.16750261187553406, -0.20468486845493317, -0.1323457956314087, -0.09716545790433884, -0.056761760264635086, -0.11281746625900269, -0.028989074751734734, -0.08003553748130798, 0.19085344672203064, 0.07482968270778656, -0.04903501644730568, -0.011322331614792347, 0.06312958896160126, 0.026286568492650986, -0.04811931401491165, -0.08977010846138, 0.04991631954908371, -0.003418376436457038, 0.05383715406060219, -0.037493571639060974, 0.10743825882673264, 0.03514270856976509, 0.09435392916202545, 0.00030771808815188706, 0.04230637475848198, 0.01786644756793976, 0.03598492965102196, 0.05014462396502495, 0.10606817156076431, 0.02187846228480339, 0.19665400683879852, 0.12862984836101532, -0.13920369744300842, 0.016636304557323456, 0.08248057961463928, -0.05297236889600754, -0.06099795550107956, 0.08904464542865753, -0.12444832175970078, 0.11478639394044876, 0.04724691063165665, -0.06380346417427063, -0.08249828964471817, 0.0002830896992236376, 0.05055340379476547, -0.019118400290608406, -0.030688051134347916, 0.003008900908753276, -0.235605388879776, 0.03330560028553009, -0.0930621474981308, 0.007209252566099167, -0.20401693880558014, 0.008431206457316875, 0.009268602356314659, -0.014557723887264729, -0.0019297180697321892, 0.020163798704743385, 0.11075813323259354, -0.009170317091047764, -0.051944054663181305, -0.0400548130273819, 0.05781479552388191, 0.12484593689441681, -0.03243332728743553, -0.07407642155885696 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in nl (Dutch; refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-nl")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-nl")

# load dataset
ds = load_dataset("common_voice", "nl", split="validation[:1%]")

# Common Voice audio is 48 kHz; the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000
resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# greedy CTC decoding
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
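As an alternative sketch (not from the original card), newer versions of the `datasets` library can resample on the fly by casting the audio column, replacing the manual torchaudio step above; this assumes the loaded Common Voice split exposes an `audio` column, which holds for recent `datasets` releases but not necessarily the version the card targeted, and reuses the `processor` from the snippet above.

```python
from datasets import load_dataset, Audio

ds = load_dataset("common_voice", "nl", split="validation[:1%]")

# decode and resample lazily to the 16 kHz rate the model expects
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

sample = ds[0]["audio"]            # dict with "array" and "sampling_rate"
assert sample["sampling_rate"] == 16_000

# `processor` is the Wav2Vec2Processor loaded in the snippet above
inputs = processor(sample["array"], sampling_rate=16_000, return_tensors="pt")
```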
{"language": "nl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-nl
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "nl", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "nl" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned

Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of the VoxPopuli corpus and fine-tuned on the transcribed data in nl (Dutch; refer to Table 1 of the paper for more information).

Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation*

Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information here.

# Usage for inference

The following shows how the model can be used for inference on a sample of the Common Voice dataset.
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in nl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in nl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 163, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #nl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in nl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.08891139924526215, 0.11620017141103745, -0.002495171269401908, -0.010285938158631325, 0.09814292192459106, -0.07875268906354904, 0.14157842099666595, 0.05025091394782066, 0.07987379282712936, 0.06705956161022186, -0.003557124175131321, 0.022526472806930542, 0.09616891294717789, 0.07747630774974823, 0.02314375340938568, -0.20456892251968384, 0.04746890813112259, -0.07104773074388504, 0.1217641532421112, 0.04092801362276077, 0.0847955048084259, -0.07954626530408859, 0.03427222743630409, 0.06719494611024857, -0.04151389002799988, 0.05126700550317764, -0.016270838677883148, -0.12884244322776794, 0.10242369025945663, 0.08413289487361908, -0.03755418583750725, 0.012270404025912285, 0.08872435241937637, -0.14831015467643738, 0.02082713320851326, 0.03416382148861885, 0.012451398186385632, -0.0077270399779081345, 0.10710152238607407, -0.08263709396123886, 0.08660895377397537, 0.019648171961307526, 0.003017686540260911, 0.08610528707504272, -0.08241761475801468, -0.15654732286930084, -0.045966342091560364, 0.08279667049646378, 0.020204665139317513, 0.0823383778333664, -0.10638578981161118, 0.06780599802732468, -0.03046533651649952, 0.045411113649606705, 0.1368386447429657, -0.2367110401391983, -0.017347773537039757, -0.014063386246562004, 0.07397202402353287, -0.009987163357436657, -0.03951648250222206, 0.07752864807844162, 0.046168334782123566, 0.006559258326888084, -0.10376295447349548, -0.03304480016231537, 0.0276121124625206, -0.11577906459569931, -0.09430413693189621, 0.04008281230926514, 0.25539493560791016, 0.07498347014188766, -0.07958948612213135, -0.10851782560348511, 0.0481150820851326, 0.22491179406642914, -0.05402981862425804, -0.10130918771028519, -0.00278464425355196, 0.007122589275240898, 0.023746496066451073, -0.040998999029397964, -0.0635669007897377, 0.006289399694651365, -0.053583819419145584, 0.09050263464450836, 0.004344331566244364, -0.0017186688492074609, -0.023916834965348244, -0.016762161627411842, -0.04436717554926872, -0.07303204387426376, 0.028598379343748093, -0.07959788292646408, -0.06620356440544128, 0.003480667946860194, -0.03760305047035217, -0.12462770938873291, 0.05620026960968971, 0.02948339842259884, 0.09463515877723694, 0.021868735551834106, -0.15055790543556213, 0.03011469356715679, 0.11217011511325836, 0.06567997485399246, -0.19595645368099213, -0.08020021766424179, -0.01383144035935402, -0.034575920552015305, 0.010983546264469624, -0.027541959658265114, -0.07258013635873795, 0.02238857001066208, -0.029347660019993782, 0.03671594336628914, 0.013229086995124817, -0.037492748349905014, -0.09546146541833878, -0.10280398279428482, 0.0009518467704765499, -0.0649547129869461, 0.0020545050501823425, 0.06540253013372421, -0.04394102469086647, 0.1804492026567459, -0.019496800377964973, 0.09903628379106522, -0.10002291947603226, -0.07727538049221039, -0.01163472980260849, 0.02679392136633396, 0.046169720590114594, -0.0684400349855423, 0.04343254491686821, 0.03553108498454094, 0.012715864926576614, -0.16360557079315186, -0.03599011152982712, -0.09238023310899734, -0.032256413251161575, -0.08506119251251221, -0.0034586319234222174, -0.0730033740401268, 0.05834305286407471, -0.03562351316213608, -0.03902309760451317, -0.04639379680156708, -0.055635686963796616, 0.009487025439739227, -0.049010638147592545, 0.10026966035366058, 0.01852828823029995, 0.08615802228450775, -0.02381514571607113, -0.04364457726478577, -0.10250553488731384, 0.1435539424419403, -0.10890747606754303, 0.023636579513549805, -0.13190360367298126, -0.046281129121780396, -0.041420865803956985, 
0.06011682376265526, 0.060958050191402435, 0.1291351169347763, -0.20941859483718872, -0.10487674176692963, 0.16700701415538788, -0.0749143585562706, -0.043280020356178284, 0.21070416271686554, 0.027051545679569244, 0.07887959480285645, 0.11458531022071838, 0.21235741674900055, -0.016547512263059616, -0.15255394577980042, -0.059610433876514435, -0.02638256922364235, 0.05200955271720886, 0.0909942165017128, 0.04281381517648697, -0.056688033044338226, 0.04047161713242531, 0.0025022136978805065, 0.07720424979925156, -0.036018840968608856, -0.015269024297595024, -0.051870569586753845, 0.04758211970329285, -0.08579196035861969, 0.07465385645627975, 0.000674082082696259, -0.028090868145227432, -0.01316011231392622, -0.059075962752103806, -0.01919737458229065, 0.07655182480812073, -0.05807638168334961, 0.024517955258488655, -0.12887661159038544, 0.10353244096040726, 0.01029972080141306, 0.023557210341095924, -0.14011643826961517, 0.11023173481225967, 0.00897472444921732, -0.04059562832117081, 0.1215103343129158, 0.08657663315534592, -0.027031704783439636, -0.01198241114616394, -0.00010072187433252111, 0.026089349761605263, -0.0262994933873415, 0.002817899454385042, -0.010739263147115707, -0.12497176229953766, 0.031961921602487564, -0.0629008412361145, 0.07847937941551208, -0.1006758064031601, -0.02207663096487522, 0.05364341288805008, 0.09977693855762482, 0.004327067639678717, -0.01136034820228815, 0.06315561383962631, 0.06608213484287262, 0.028969604521989822, 0.0051245735958218575, 0.035916924476623535, -0.019320592284202576, -0.036028388887643814, 0.07224314659833908, -0.08834022283554077, 0.06881073862314224, 0.11620490252971649, -0.07904678583145142, -0.014942719601094723, 0.04790930822491646, -0.0050100600346922874, -0.014600244350731373, -0.0372154526412487, -0.05119716748595238, 0.2599776089191437, 0.011240377090871334, 0.058848634362220764, -0.06738085299730301, 0.013845396228134632, 0.03283916786313057, -0.05011869966983795, -0.08510050922632217, 0.0856117233633995, 0.03117520734667778, -0.06583719700574875, 0.014662330970168114, 0.08523714542388916, 0.0647815689444542, 0.17089101672172546, 0.002825164934620261, -0.08524118363857269, -0.01515259686857462, -0.08210291713476181, -0.03962589055299759, 0.062097541987895966, -0.25080811977386475, -0.057666271924972534, 0.02173767238855362, -0.012713062576949596, 0.028005704283714294, -0.01906788907945156, 0.018511030822992325, -0.02755298651754856, -0.034543249756097794, -0.025829162448644638, 0.02075444720685482, -0.0015444800956174731, 0.0851408988237381, -0.014827142469584942, -0.05724570155143738, -0.04657502844929695, -0.06469592452049255, -0.10862835496664047, 0.07210051268339157, -0.07731908559799194, -0.4070397913455963, -0.022331099957227707, -0.07375890016555786, -0.07199511677026749, 0.017743676900863647, 0.03601669520139694, -0.10820671170949936, -0.07544305175542831, -0.041077855974435806, 0.10289488732814789, 0.03726162016391754, -0.039356883615255356, 0.10755348950624466, 0.0033008295577019453, 0.0569206066429615, -0.10825126618146896, -0.0003612341242842376, -0.06666839122772217, -0.039468880742788315, -0.0666743814945221, 0.044056445360183716, 0.055751968175172806, 0.10853168368339539, 0.08780580013990402, -0.009787407703697681, -0.01052366103976965, 0.15547987818717957, -0.12176573276519775, -0.001343278563581407, 0.19232630729675293, -0.07053873687982559, -0.026761841028928757, 0.11298374086618423, 0.02270718663930893, -0.10414086282253265, 0.04968676343560219, 0.015811815857887268, 0.0010928662959486246, 
-0.2362116277217865, -0.15541931986808777, -0.0339638888835907, 0.07051505893468857, 0.04587814211845398, -0.022985093295574188, -0.003682787762954831, -0.028511568903923035, -0.04201355576515198, 0.0031010389793664217, 0.03514041379094124, 0.0011668625520542264, 0.16533958911895752, -0.05909174680709839, 0.10274486988782883, -0.045998044312000275, -0.044624071568250656, 0.09177502989768982, 0.00911717489361763, 0.058628279715776443, 0.10243658721446991, 0.029432088136672974, 0.05991361290216446, 0.01658998616039753, 0.019305210560560226, -0.021041985601186752, 0.008918216452002525, 0.00540801789611578, -0.027900701388716698, -0.054455820471048355, -0.0012555932626128197, 0.06685297936201096, 0.18607713282108307, -0.13445380330085754, -0.08396392315626144, -0.026463381946086884, 0.06488070636987686, 0.14488977193832397, 0.12649434804916382, -0.04553839936852455, -0.08147306740283966, 0.03234829381108284, -0.09812598675489426, -0.03359316289424896, 0.10263348370790482, 0.09068481624126434, -0.1510465294122696, 0.11298689246177673, 0.08375273644924164, 0.04795549809932709, -0.09776178747415543, 0.0458611361682415, -0.12376026809215546, -0.03114839270710945, 0.011290531605482101, 0.0353681705892086, -0.21704484522342682, 0.14214357733726501, 0.030220381915569305, 0.03732902929186821, -0.09081321954727173, 0.017332784831523895, 0.060155805200338364, -0.06219780445098877, 0.16340510547161102, -0.023040449246764183, 0.023334955796599388, 0.018568692728877068, -0.1133078783750534, 0.018871497362852097, 0.033323608338832855, -0.02885136939585209, 0.011278046295046806, 0.019990060478448868, 0.00026703497860580683, -0.02069488726556301, 0.03492521867156029, -0.22398769855499268, -0.11956760287284851, 0.03530731052160263, 0.009780735708773136, 0.09163690358400345, -0.022542566061019897, -0.11641653627157211, -0.18453800678253174, 0.10674851387739182, -0.049400076270103455, 0.006406156346201897, -0.07048864662647247, 0.08414868265390396, 0.03774520009756088, -0.01496005617082119, 0.02390756644308567, 0.07608985155820847, 0.12752851843833923, -0.05621722713112831, -0.04952847957611084, 0.07006097584962845, -0.10871388018131256, -0.11710584163665771, -0.016203870996832848, 0.19435130059719086, 0.1475868821144104, 0.05105871334671974, 0.07168453931808472, -0.018404126167297363, 0.03796177729964256, -0.06108309328556061, 0.09324788302183151, 0.08974228799343109, -0.07799563556909561, 0.09932544082403183, -0.003933184314519167, -0.33755192160606384, -0.13244928419589996, -0.025632895529270172, 0.16707096993923187, 0.10918772220611572, -0.008398681879043579, 0.1498364508152008, 0.2824649512767792, -0.0768156349658966, -0.22999021410942078, 0.014874313026666641, 0.0170895978808403, 0.02869841456413269, 0.02945609763264656, -0.23258835077285767, 0.11088283360004425, 0.04992927983403206, 0.00004126761996303685, -0.01721072569489479, -0.1871752142906189, -0.13476672768592834, 0.24956266582012177, -0.02146000973880291, 0.1872062385082245, -0.00834591407328844, -0.047478776425123215, -0.06432952731847763, 0.04059050232172012, 0.05746685341000557, -0.10235665738582611, 0.07717130333185196, 0.07156375795602798, 0.015295556746423244, 0.021518321707844734, 0.035078294575214386, 0.06876441091299057, 0.09784244745969772, -0.014997814781963825, -0.052597180008888245, 0.05240953341126442, 0.025252284482121468, 0.05925513431429863, 0.08680212497711182, 0.06953322887420654, -0.04749053344130516, -0.016632938757538795, -0.07736861705780029, -0.06541406363248825, 0.09878463298082352, -0.006673008669167757, 
-0.037173833698034286, -0.019646456465125084, 0.06572965532541275, -0.011961326003074646, 0.03553011640906334, -0.06643964350223541, -0.09662801772356033, 0.03370992839336395, 0.10907099395990372, 0.23411406576633453, -0.035221830010414124, 0.029207471758127213, -0.026762723922729492, -0.07045705616474152, 0.04657962545752525, 0.04336964711546898, 0.03512360155582428, 0.0794290229678154, 0.017010819166898727, 0.11122224479913712, -0.015648992732167244, -0.13709673285484314, 0.08111652731895447, 0.028534701094031334, -0.0777757465839386, -0.1731642782688141, -0.04128367826342583, 0.02043977752327919, 0.00034174163010902703, 0.03712537884712219, 0.20996329188346863, -0.0356709361076355, -0.05247615650296211, -0.019235530868172646, 0.06647700071334839, -0.029342880472540855, 0.11129100620746613, 0.047005631029605865, 0.005860489793121815, -0.13127471506595612, 0.08269345015287399, 0.06871479749679565, -0.06990373879671097, 0.09697112441062927, -0.0013319605495780706, -0.02325984090566635, -0.0692724734544754, -0.21109864115715027, 0.05114196613430977, 0.06637830287218094, -0.08609962463378906, -0.023464249446988106, -0.13854004442691803, 0.043282169848680496, 0.10773786902427673, 0.007956077344715595, -0.01337225828319788, -0.06451258808374405, -0.04324769601225853, -0.004268378019332886, 0.08773545175790787, 0.07477745413780212, -0.06806041300296783, -0.09871415793895721, 0.0327228307723999, 0.03274805471301079, 0.05911720171570778, -0.02877645194530487, -0.09724396467208862, -0.09970246255397797, 0.013278699479997158, -0.19058114290237427, -0.03104303777217865, -0.10450568050146103, 0.00935196690261364, -0.01281195692718029, -0.050625238567590714, -0.029918786138296127, 0.014356673695147038, -0.05967877060174942, 0.0317513607442379, 0.03755538538098335, 0.10283000022172928, -0.14516057074069977, 0.044772062450647354, 0.029929809272289276, -0.03329303488135338, 0.12614187598228455, 0.07286734133958817, -0.03579312190413475, 0.06222779303789139, -0.19078558683395386, -0.04741557314991951, 0.066774383187294, 0.09104429185390472, 0.007335123606026173, -0.1443099081516266, -0.0029342672787606716, 0.027744144201278687, 0.006545294541865587, -0.012654034420847893, 0.07684674859046936, -0.022308671846985817, 0.019887758418917656, -0.06871187686920166, -0.03666174039244652, -0.013995630666613579, 0.02727598324418068, 0.07248726487159729, 0.04759311303496361, 0.05962067097425461, -0.09651262313127518, 0.06262355297803879, -0.07316309213638306, 0.03780001774430275, -0.03810424730181694, -0.024178028106689453, 0.04155489802360535, -0.09242282807826996, 0.08415902405977249, -0.002570776268839836, 0.17805570363998413, 0.013636265881359577, 0.01693604700267315, 0.02541010081768036, -0.07176564633846283, -0.12881574034690857, 0.05805874243378639, 0.0077959042973816395, 0.06962936371564865, -0.00032036268385127187, -0.06918200850486755, -0.018986191600561142, 0.07333368808031082, 0.16480468213558197, 0.05582351237535477, 0.09723758697509766, 0.07354092597961426, 0.017294511198997498, 0.1427178978919983, -0.09124284982681274, -0.07663314044475555, 0.05257204547524452, -0.12154880911111832, 0.08613601326942444, -0.06687556207180023, 0.0037047096993774176, 0.0010097541380673647, -0.08225637674331665, 0.042831793427467346, -0.022646864876151085, -0.04276818037033081, -0.1043790951371193, -0.10173894464969635, -0.054198477417230606, -0.0771162286400795, -0.007733041420578957, -0.09268021583557129, 0.00424807658419013, -0.029499242082238197, 0.04511300474405289, -0.09802989661693573, 0.10609492659568787, 
-0.10572131723165512, -0.12549854815006256, 0.19562138617038727, -0.039041079580783844, -0.026866378262639046, -0.0974087193608284, 0.016856489703059196, 0.03388800844550133, 0.13443875312805176, 0.027041565626859665, 0.04502246901392937, -0.0587582029402256, 0.022817065939307213, -0.06791962683200836, -0.07495550811290741, 0.019398607313632965, -0.04444531351327896, 0.002783152274787426, 0.1430477499961853, 0.11348607391119003, -0.03412578999996185, 0.021503794938325882, 0.10515201091766357, -0.032538872212171555, -0.025016650557518005, -0.15287616848945618, -0.00662857573479414, -0.021354682743549347, 0.04274353012442589, -0.04177121818065643, -0.08985581248998642, -0.014782643876969814, 0.1928589940071106, 0.15923596918582916, -0.07010220736265182, 0.0330619178712368, -0.004501082934439182, 0.0073790522292256355, -0.0038902251981198788, 0.034360822290182114, 0.13294678926467896, 0.21853116154670715, -0.0005640924791805446, 0.005908309947699308, 0.0016797241987660527, -0.10680653899908066, -0.02238544076681137, 0.10136011242866516, 0.007429210003465414, -0.024530448019504547, -0.013907271437346935, 0.1738402098417282, -0.1861000657081604, -0.21922889351844788, -0.11827201396226883, -0.10147742927074432, -0.13963682949543, 0.004337060730904341, -0.060734111815690994, 0.17207719385623932, 0.05027081444859505, -0.011272774077951908, -0.014242874458432198, 0.14472696185112, 0.03609151020646095, -0.01938406005501747, -0.03540632501244545, 0.049161601811647415, -0.1112234964966774, 0.059986215084791183, -0.023240476846694946, 0.02465810440480709, 0.03191273659467697, 0.08750129491090775, -0.03409353271126747, 0.02272723987698555, 0.01397035177797079, 0.0032199081033468246, 0.03700591251254082, 0.13092631101608276, 0.019680818542838097, 0.17092551290988922, 0.1305432915687561, -0.08117137104272842, 0.007206934504210949, 0.008647367358207703, -0.027364732697606087, -0.04585924744606018, 0.12208551168441772, -0.1186615526676178, 0.08890638500452042, 0.056525442749261856, -0.052295926958322525, -0.037153374403715134, -0.018795527517795563, 0.03248031809926033, -0.023240244016051292, 0.002012139419093728, -0.01141931302845478, -0.20594890415668488, 0.01477532647550106, -0.12251033633947372, 0.052136968821287155, -0.18203040957450867, -0.014352232217788696, 0.009867479093372822, -0.01117321290075779, -0.028233086690306664, 0.047666408121585846, 0.04984480142593384, -0.03473741561174393, -0.021458841860294342, 0.0255898367613554, 0.03405430167913437, 0.10333562642335892, -0.06726768612861633, -0.03715343773365021 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in pl (Polish; refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official VoxPopuli website for more information [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-pl")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-pl")

# load dataset
ds = load_dataset("common_voice", "pl", split="validation[:1%]")

# Common Voice audio (48 kHz) does not match the model's expected sampling rate (16 kHz)
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
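As a quicker alternative to the manual pre-processing above, the high-level `pipeline` API can wrap the same checkpoint and handle feature extraction internally. This is a minimal sketch, assuming a local audio file at the hypothetical path `sample_pl.wav`:

```python
from transformers import pipeline

# the ASR pipeline bundles the processor and the CTC model behind a single call
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-base-10k-voxpopuli-ft-pl")

# the pipeline accepts a path to an audio file (or a raw 16 kHz numpy waveform)
transcription = asr("sample_pl.wav")
print(transcription["text"])
```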
{"language": "pl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-pl
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "pl", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "pl" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #pl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in pl (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in pl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #pl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in pl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #pl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in pl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09500859677791595, 0.08716567605733871, -0.002235318999737501, -0.015593139454722404, 0.1057034358382225, -0.08178805559873581, 0.14164914190769196, 0.05079566314816475, 0.08528017997741699, 0.06766592711210251, -0.002972338115796447, 0.022758817300200462, 0.1052321195602417, 0.09006527811288834, 0.031671300530433655, -0.2073868215084076, 0.04834473878145218, -0.07093576341867447, 0.13795599341392517, 0.04295933246612549, 0.08656564354896545, -0.08501982688903809, 0.035761989653110504, 0.075558140873909, -0.042631447315216064, 0.04736912623047829, -0.019858473911881447, -0.12895528972148895, 0.10298080742359161, 0.07964424043893814, -0.039900220930576324, 0.006849798373878002, 0.08216811716556549, -0.1368667334318161, 0.0242178775370121, 0.04001440852880478, 0.008306298404932022, -0.0034564551897346973, 0.11807902902364731, -0.0865948423743248, 0.0800388902425766, 0.01894402503967285, -0.006022120360285044, 0.09275569021701813, -0.08229704946279526, -0.16178296506404877, -0.04329314082860947, 0.0841870903968811, 0.016283389180898666, 0.08590080589056015, -0.11067093163728714, 0.06538072228431702, -0.026685917750000954, 0.040984757244586945, 0.13729378581047058, -0.24221254885196686, -0.013717688620090485, -0.017561698332428932, 0.07717132568359375, -0.010092461481690407, -0.043246444314718246, 0.07101387530565262, 0.04339737445116043, 0.008502672426402569, -0.10009919106960297, -0.030042119324207306, 0.02728492021560669, -0.12213051319122314, -0.09369248151779175, 0.03830268979072571, 0.24822057783603668, 0.07865281403064728, -0.08238270878791809, -0.10924803465604782, 0.041972823441028595, 0.21759064495563507, -0.055714093148708344, -0.09429331123828888, -0.0044092219322919846, 0.008864015340805054, 0.027575379237532616, -0.04910093545913696, -0.06687838584184647, 0.00669340742751956, -0.04895026981830597, 0.08243107050657272, 0.005551780574023724, 0.009130405262112617, -0.029788298532366753, -0.017303122207522392, -0.051866497844457626, -0.07155047357082367, 0.029342111200094223, -0.07973235100507736, -0.07081468403339386, 0.004819056484848261, -0.04460402578115463, -0.13677601516246796, 0.05237890034914017, 0.01714295521378517, 0.09485448896884918, 0.021468697115778923, -0.15479791164398193, 0.037316061556339264, 0.11102855205535889, 0.06725899130105972, -0.20327158272266388, -0.07680229842662811, -0.006378542631864548, -0.03700205311179161, 0.012784549035131931, -0.027932453900575638, -0.07213131338357925, 0.01656917855143547, -0.030557872727513313, 0.031328290700912476, 0.010545708239078522, -0.03509673848748207, -0.09443185478448868, -0.10149891674518585, -0.008932765573263168, -0.06360044330358505, 0.0010038638720288873, 0.0684581771492958, -0.0466729998588562, 0.18544869124889374, -0.020184684544801712, 0.09694826602935791, -0.10776915401220322, -0.08200593292713165, -0.014161844737827778, 0.03552718088030815, 0.041773438453674316, -0.06346766650676727, 0.044095274060964584, 0.03183472901582718, 0.012335043400526047, -0.16454406082630157, -0.04471934214234352, -0.09174942225217819, -0.03278350830078125, -0.08615454286336899, 0.002379343146458268, -0.06253079324960709, 0.05871614068746567, -0.037727199494838715, -0.037620216608047485, -0.03289131820201874, -0.05638296157121658, 0.012258877046406269, -0.05707881599664688, 0.101103276014328, 0.017036661505699158, 0.08964790403842926, -0.023217320442199707, -0.04600926488637924, -0.1128818690776825, 0.14213396608829498, -0.10846878588199615, 0.027575917541980743, -0.13124828040599823, -0.045633088797330856, -0.05082616209983826, 
0.051510293036699295, 0.06758047640323639, 0.12321729958057404, -0.21854238212108612, -0.10269688814878464, 0.16636350750923157, -0.07608147710561752, -0.04327978193759918, 0.21183344721794128, 0.02625238336622715, 0.08720900118350983, 0.11014725267887115, 0.21485593914985657, -0.015691347420215607, -0.14588361978530884, -0.06116929650306702, -0.03341458737850189, 0.04535084590315819, 0.07726522535085678, 0.04236660152673721, -0.05605445057153702, 0.045138753950595856, 0.0038049032445997, 0.08211442083120346, -0.03486676886677742, -0.013647302985191345, -0.04838510975241661, 0.042336806654930115, -0.08566376566886902, 0.07559093087911606, -0.00490901293233037, -0.028018765151500702, -0.014570537954568863, -0.05562160536646843, -0.04107455909252167, 0.07074534893035889, -0.052064940333366394, 0.026991667225956917, -0.12706221640110016, 0.12104515731334686, 0.012227549217641354, 0.02781563065946102, -0.14450344443321228, 0.11989754438400269, 0.0057688746601343155, -0.026691699400544167, 0.11764418333768845, 0.08629895001649857, -0.02543395571410656, -0.015948185697197914, 0.008455866947770119, 0.023494983091950417, -0.03706352785229683, -0.0017509449971839786, -0.013060761615633965, -0.11554016172885895, 0.02636452205479145, -0.062464989721775055, 0.08897191286087036, -0.09141218662261963, -0.02406412363052368, 0.05627559497952461, 0.10389841347932816, 0.0025813132524490356, -0.01220724731683731, 0.06647352874279022, 0.058472104370594025, 0.03783067688345909, 0.006855824496597052, 0.04018864408135414, -0.015071622096002102, -0.026559097692370415, 0.07370682805776596, -0.08065294474363327, 0.0631464496254921, 0.11848942935466766, -0.08848051726818085, -0.014012503437697887, 0.04188384488224983, -0.0062647294253110886, -0.009469395503401756, -0.043073467910289764, -0.0519840344786644, 0.26955652236938477, 0.01181879173964262, 0.05832025408744812, -0.0716797336935997, 0.017627593129873276, 0.03742580488324165, -0.046415261924266815, -0.0799148827791214, 0.07678437232971191, 0.03200680762529373, -0.06904833018779755, 0.008079503662884235, 0.08998540043830872, 0.06784038245677948, 0.17221929132938385, 0.003971274010837078, -0.08799199759960175, -0.01136765442788601, -0.07771587371826172, -0.0479004867374897, 0.06165330484509468, -0.25962817668914795, -0.05267636850476265, 0.02444380521774292, -0.0132287060841918, 0.028547294437885284, -0.014947550371289253, 0.022966304793953896, -0.0321078859269619, -0.03273432329297066, -0.025273172184824944, 0.01966366171836853, 0.007401720155030489, 0.0888674408197403, -0.012381761334836483, -0.05432989075779915, -0.04475067928433418, -0.06305797398090363, -0.11153226345777512, 0.06947729736566544, -0.07810261100530624, -0.41103091835975647, -0.025621196255087852, -0.06748133897781372, -0.06871376931667328, 0.015371639281511307, 0.03315040096640587, -0.10555404424667358, -0.07008527964353561, -0.035757165402173996, 0.11054406315088272, 0.04045885056257248, -0.03878229111433029, 0.11832736432552338, 0.0012132007395848632, 0.05826106667518616, -0.10452859848737717, -0.0011727444361895323, -0.06990829855203629, -0.0449887216091156, -0.0646452009677887, 0.05758233740925789, 0.04409288242459297, 0.09818542748689651, 0.07973475009202957, -0.0034203340765088797, -0.010486978106200695, 0.1628761887550354, -0.11941798776388168, 0.005538554862141609, 0.1998119205236435, -0.07435497641563416, -0.025955528020858765, 0.1056687980890274, 0.02001027762889862, -0.10256689786911011, 0.045660600066185, 0.014391697011888027, -0.0005461907130666077, -0.2398066222667694, 
-0.15208780765533447, -0.03142109885811806, 0.06945560872554779, 0.04363183304667473, -0.023142671212553978, -0.0007335678674280643, -0.019998742267489433, -0.04164166375994682, 0.003941431641578674, 0.030431346967816353, -0.004934330470860004, 0.15365514159202576, -0.06261755526065826, 0.09953777492046356, -0.04749257490038872, -0.03578173741698265, 0.08935876935720444, 0.0012024181196466088, 0.06055246666073799, 0.10007251799106598, 0.01867184415459633, 0.06140478700399399, 0.02064189687371254, 0.02041413076221943, -0.013050317764282227, 0.0026648195926100016, -0.0012754632625728846, -0.028807099908590317, -0.05405254289507866, -0.004209624137729406, 0.06461254507303238, 0.1835642158985138, -0.12839047610759735, -0.08984857052564621, -0.027523888275027275, 0.06776432693004608, 0.13750645518302917, 0.1283014863729477, -0.04593981057405472, -0.07843422889709473, 0.026504144072532654, -0.09747359901666641, -0.039235301315784454, 0.10258683562278748, 0.0908111035823822, -0.15831726789474487, 0.10852521657943726, 0.08197887241840363, 0.04978124052286148, -0.09879357367753983, 0.04663145914673805, -0.12984026968479156, -0.02431788481771946, 0.011389903724193573, 0.03754361346364021, -0.20970343053340912, 0.13686098158359528, 0.032599713653326035, 0.03885660693049431, -0.09839007258415222, 0.014584275893867016, 0.06144091859459877, -0.06598086655139923, 0.1590612530708313, -0.023117410019040108, 0.034669551998376846, 0.031019333750009537, -0.11657460033893585, 0.016657326370477676, 0.03048408217728138, -0.01743645779788494, 0.00876473169773817, 0.020283561199903488, 0.004455359652638435, -0.027429983019828796, 0.05178964510560036, -0.22591404616832733, -0.11216682195663452, 0.03014374151825905, 0.02060059644281864, 0.10844030231237411, -0.02439599670469761, -0.11302201449871063, -0.18241888284683228, 0.10841098427772522, -0.03702998161315918, 0.005941824987530708, -0.07535460591316223, 0.0935867577791214, 0.0318150669336319, -0.016137540340423584, 0.027587536722421646, 0.0733601376414299, 0.1313914656639099, -0.060355398803949356, -0.05523756891489029, 0.06715535372495651, -0.10876426100730896, -0.11256569623947144, -0.013516639359295368, 0.19808867573738098, 0.14181829988956451, 0.048563916236162186, 0.07224065810441971, -0.019362447783350945, 0.031191090121865273, -0.062288880348205566, 0.10134194046258926, 0.09240929037332535, -0.07464960962533951, 0.09990284591913223, -0.0028429306112229824, -0.3403429687023163, -0.12422865629196167, -0.023066585883498192, 0.18513554334640503, 0.09858923405408859, -0.011860379949212074, 0.14890684187412262, 0.2786906957626343, -0.06983619183301926, -0.23441244661808014, 0.025182103738188744, 0.02130410075187683, 0.03077031299471855, 0.029458805918693542, -0.231137216091156, 0.100121408700943, 0.05182180553674698, 0.0015385037986561656, -0.01221648883074522, -0.1815481334924698, -0.13168731331825256, 0.24079735577106476, -0.018653733655810356, 0.1905527412891388, -0.0015424180310219526, -0.051585592329502106, -0.06132005900144577, 0.04177292063832283, 0.04868520423769951, -0.11744949221611023, 0.07986459136009216, 0.0713668167591095, 0.022646788507699966, 0.02013840712606907, 0.030878383666276932, 0.06271759420633316, 0.09080091118812561, -0.02104610577225685, -0.055033113807439804, 0.0706174299120903, 0.026970969513058662, 0.05829896405339241, 0.0915740355849266, 0.07584241777658463, -0.048953860998153687, -0.02145618572831154, -0.07291696965694427, -0.07284317910671234, 0.09698468446731567, -0.002243236405774951, -0.03209199383854866, 
-0.015212230384349823, 0.06603924930095673, -0.018327519297599792, 0.03273947536945343, -0.07262707501649857, -0.0949849784374237, 0.03678295761346817, 0.11203770339488983, 0.23190660774707794, -0.042202193289995193, 0.019440170377492905, -0.028949009254574776, -0.06931644678115845, 0.043183762580156326, 0.0421244315803051, 0.031862806528806686, 0.07427903264760971, 0.012213691137731075, 0.11024808883666992, -0.011749195866286755, -0.14342062175273895, 0.08234629034996033, 0.03234757110476494, -0.07974638044834137, -0.18293191492557526, -0.04622887447476387, 0.032205238938331604, 0.0008763633668422699, 0.038281772285699844, 0.21150358021259308, -0.03790704160928726, -0.05540391057729721, -0.021526262164115906, 0.06720983982086182, -0.03197816386818886, 0.10114503651857376, 0.04827836528420448, 0.0052498504519462585, -0.13587015867233276, 0.08625628799200058, 0.06959390640258789, -0.06426529586315155, 0.09927800297737122, 0.002194973872974515, -0.026945648714900017, -0.07468637079000473, -0.21348032355308533, 0.051228947937488556, 0.0751432552933693, -0.08787427097558975, -0.018246006220579147, -0.1359066665172577, 0.046254150569438934, 0.10998959094285965, 0.004237284418195486, -0.020126482471823692, -0.062322333455085754, -0.050960689783096313, -0.0017838201019912958, 0.08556011319160461, 0.06327469646930695, -0.07301726192235947, -0.09361077100038528, 0.037844229489564896, 0.0335138775408268, 0.06963076442480087, -0.026643728837370872, -0.10046698153018951, -0.09743332862854004, 0.014041599817574024, -0.1950240433216095, -0.03673996031284332, -0.10401958972215652, 0.0013944751117378473, -0.014958575367927551, -0.04559856280684471, -0.028451161459088326, 0.019637219607830048, -0.05821266025304794, 0.029687415808439255, 0.0353364571928978, 0.09823071211576462, -0.15134313702583313, 0.04772987589240074, 0.0281173475086689, -0.028743209317326546, 0.13099364936351776, 0.08009487390518188, -0.036284610629081726, 0.05759592354297638, -0.17973603308200836, -0.04873325675725937, 0.06736645847558975, 0.08095531910657883, 0.011599370278418064, -0.14326976239681244, 0.004090155474841595, 0.02234744466841221, 0.0018537717405706644, -0.011145072057843208, 0.09224341809749603, -0.02095266804099083, 0.02592097967863083, -0.06855171918869019, -0.026027312502264977, -0.016943037509918213, 0.028375281020998955, 0.06326836347579956, 0.049829527735710144, 0.05782409384846687, -0.09221478551626205, 0.06673388928174973, -0.0763559564948082, 0.03858131915330887, -0.04265865683555603, -0.022671205922961235, 0.04965771734714508, -0.09952739626169205, 0.08354641497135162, -0.004295896273106337, 0.1741364300251007, 0.01960381306707859, 0.01684708520770073, 0.02624022774398327, -0.06243066489696503, -0.13125662505626678, 0.05459582433104515, -0.007779323495924473, 0.0643717348575592, -0.0018451237119734287, -0.0681472197175026, -0.01629347912967205, 0.0773768201470375, 0.16655214130878448, 0.055564068257808685, 0.09212902933359146, 0.07680033892393112, 0.02052455209195614, 0.14108093082904816, -0.09401025623083115, -0.07618163526058197, 0.03410489112138748, -0.12137410789728165, 0.0809713825583458, -0.07044492661952972, 0.003471687203273177, 0.006197107490152121, -0.07805388420820236, 0.04587624594569206, -0.021649228408932686, -0.044686321169137955, -0.09714234620332718, -0.10003047436475754, -0.05146949738264084, -0.07979024201631546, -0.007034499663859606, -0.09009160101413727, 0.004346069879829884, -0.027557754889130592, 0.042097825556993484, -0.10173731297254562, 0.11729150265455246, -0.09985226392745972, 
-0.1297381967306137, 0.20138031244277954, -0.03466970473527908, -0.01965351775288582, -0.09580688923597336, 0.017118265852332115, 0.03265843540430069, 0.12498588860034943, 0.028882617130875587, 0.04696820676326752, -0.058751147240400314, 0.02501854859292507, -0.06775419414043427, -0.07564736157655716, 0.01903749443590641, -0.049485061317682266, -0.0025529558770358562, 0.14667315781116486, 0.11188483983278275, -0.03154701367020607, 0.017377719283103943, 0.10522456467151642, -0.031095610931515694, -0.014683995395898819, -0.15177632868289948, -0.016540348529815674, -0.017742427065968513, 0.036426566541194916, -0.04314636439085007, -0.09535083919763565, -0.01476065069437027, 0.2042783498764038, 0.1602870225906372, -0.07678921520709991, 0.03596030920743942, -0.002931764116510749, 0.005840514320880175, 0.001546597690321505, 0.03634706139564514, 0.13534344732761383, 0.2111366093158722, -0.006623825989663601, 0.021889135241508484, -0.0023945942521095276, -0.11025822907686234, -0.025275856256484985, 0.10380824655294418, 0.0012608995893970132, -0.02454243414103985, -0.010440091602504253, 0.16979819536209106, -0.18497911095619202, -0.22384332120418549, -0.11731690913438797, -0.09646378457546234, -0.13899262249469757, 0.0017960947006940842, -0.06155047565698624, 0.17709676921367645, 0.055206961929798126, -0.008729294873774052, -0.01053641363978386, 0.1459444910287857, 0.03682073950767517, -0.011935160495340824, -0.044208645820617676, 0.04741981253027916, -0.1254357248544693, 0.053350817412137985, -0.022646866738796234, 0.018901770934462547, 0.030176417902112007, 0.09095778316259384, -0.04153333231806755, 0.021595023572444916, 0.01031836960464716, 0.008134848438203335, 0.030282914638519287, 0.13291360437870026, 0.023649917915463448, 0.1541176736354828, 0.12805452942848206, -0.08255486935377121, 0.006098182871937752, 0.008523495867848396, -0.018407467752695084, -0.03761833533644676, 0.12742114067077637, -0.11686889082193375, 0.08712691813707352, 0.054424796253442764, -0.055332254618406296, -0.04111100360751152, -0.022116385400295258, 0.027092646807432175, -0.02017839625477791, -0.006753251422196627, -0.01118552777916193, -0.20934753119945526, 0.012856284156441689, -0.10903386026620865, 0.05651457980275154, -0.1866208016872406, -0.010171672329306602, 0.010833860374987125, -0.015224019065499306, -0.029331298545002937, 0.04712579399347305, 0.04938291385769844, -0.03981509804725647, -0.017547499388456345, 0.042406003922224045, 0.035163432359695435, 0.10340229421854019, -0.06823866814374924, -0.03824200853705406 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in ro (Romanian; refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official VoxPopuli website for more information [here](https://github.com/facebookresearch/voxpopuli/).

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets):

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-ro")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-ro")

# load dataset
ds = load_dataset("common_voice", "ro", split="validation[:1%]")

# Common Voice audio (48 kHz) does not match the model's expected sampling rate (16 kHz)
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# run inference on the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
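To sanity-check the transcriptions against the ground-truth sentences, the word error rate (WER) can be computed on the same batch. A minimal sketch, assuming the `jiwer` package is installed and that `ds`, `processor`, and `predicted_ids` from the snippet above are still in scope:

```python
import jiwer

# decode the greedy CTC predictions and compare against the reference transcripts
predictions = processor.batch_decode(predicted_ids)
references = ds[:5]["sentence"]  # Common Voice stores the transcript in the "sentence" column

# jiwer.wer returns the word error rate aggregated over the whole batch
print(f"WER: {jiwer.wer(references, predictions):.3f}")
```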
{"language": "ro", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-ro
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "ro", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "ro" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #ro #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in ro (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in ro (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #ro #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in ro (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #ro #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in ro (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.08973532170057297, 0.08796430379152298, -0.0023184537421911955, -0.015529678203165531, 0.10893253982067108, -0.08172090351581573, 0.1383569985628128, 0.05117728188633919, 0.08056996017694473, 0.0674988180398941, -0.0036585964262485504, 0.02569444663822651, 0.10669676214456558, 0.09159301966428757, 0.02732708305120468, -0.20874272286891937, 0.045949749648571014, -0.07467986643314362, 0.14219380915164948, 0.04306994378566742, 0.08692269772291183, -0.08187345415353775, 0.03886209428310394, 0.07357428222894669, -0.04526526480913162, 0.05071970820426941, -0.018162855878472328, -0.13092517852783203, 0.10505668073892593, 0.08016300201416016, -0.03384888917207718, 0.007948215119540691, 0.08590191602706909, -0.13448089361190796, 0.024405211210250854, 0.04206770658493042, 0.006407782435417175, 0.00047219241969287395, 0.11634578555822372, -0.09272981435060501, 0.08333025872707367, 0.01759076677262783, -0.007976527325809002, 0.0921943187713623, -0.08520710468292236, -0.1635197401046753, -0.045357465744018555, 0.07739759981632233, 0.016416925936937332, 0.08522869646549225, -0.11080296337604523, 0.06349772959947586, -0.026342123746871948, 0.04236644133925438, 0.15180589258670807, -0.23770712316036224, -0.016588320955634117, -0.012930408120155334, 0.07674535363912582, -0.010916042141616344, -0.0489090159535408, 0.06989936530590057, 0.04414115473628044, 0.008064170368015766, -0.0996280387043953, -0.030544616281986237, 0.020038506016135216, -0.12127985805273056, -0.09234584867954254, 0.041304465383291245, 0.2425071895122528, 0.08050283789634705, -0.08088035881519318, -0.10083570331335068, 0.038264527916908264, 0.21298670768737793, -0.055822476744651794, -0.09366470575332642, -0.004768551792949438, 0.005024701822549105, 0.02187318727374077, -0.051582273095846176, -0.06644134968519211, 0.005710144527256489, -0.04956744238734245, 0.08457692712545395, 0.004099471028894186, 0.008932575583457947, -0.03264034539461136, -0.016610819846391678, -0.05041258782148361, -0.07205849140882492, 0.030172070488333702, -0.08153009414672852, -0.07220450043678284, 0.00858397874981165, -0.04238152503967285, -0.12451451271772385, 0.053545840084552765, 0.02239828370511532, 0.09163900464773178, 0.021856600418686867, -0.15333396196365356, 0.03582777455449104, 0.10735085606575012, 0.06582985818386078, -0.2008182406425476, -0.08441076427698135, -0.007588810287415981, -0.03857853263616562, 0.010820417664945126, -0.029456447809934616, -0.07001321762800217, 0.016760017722845078, -0.03339443355798721, 0.029254356399178505, 0.009794825688004494, -0.03510095924139023, -0.09295900911092758, -0.09737922996282578, -0.013335581868886948, -0.06371702998876572, 0.0006648472626693547, 0.07071381062269211, -0.04779621586203575, 0.18511401116847992, -0.019172679632902145, 0.09728018194437027, -0.10474605858325958, -0.08567381650209427, -0.011816495098173618, 0.03391958400607109, 0.03937516734004021, -0.06791183352470398, 0.04279572516679764, 0.01948080211877823, 0.014717002399265766, -0.16869139671325684, -0.0366474874317646, -0.09412388503551483, -0.03362786024808884, -0.0895342230796814, 0.004389177542179823, -0.06994114816188812, 0.056664109230041504, -0.03513868898153305, -0.03489426523447037, -0.024728797376155853, -0.053086623549461365, 0.01350381039083004, -0.054569292813539505, 0.10522207617759705, 0.020010748878121376, 0.08905801922082901, -0.02411036752164364, -0.048124879598617554, -0.09893132001161575, 0.14174719154834747, -0.10857266932725906, 0.024592438712716103, -0.134879007935524, -0.044691603630781174, -0.05375347286462784, 
0.05125785619020462, 0.06679787486791611, 0.12139970809221268, -0.2111952155828476, -0.1028313860297203, 0.16940854489803314, -0.072227843105793, -0.033264171332120895, 0.2117268145084381, 0.024050461128354073, 0.09058428555727005, 0.11048520356416702, 0.21640606224536896, -0.021261515095829964, -0.1480409950017929, -0.05899811536073685, -0.032731663435697556, 0.041893403977155685, 0.07851842790842056, 0.042581479996442795, -0.05165768414735794, 0.043342843651771545, 0.003640635870397091, 0.08086272329092026, -0.034874219447374344, -0.015392023138701916, -0.048496901988983154, 0.04612451419234276, -0.08377128094434738, 0.07650099694728851, -0.0038807287346571684, -0.026921391487121582, -0.013977467082440853, -0.05749313533306122, -0.05208086967468262, 0.0747242122888565, -0.053554341197013855, 0.025805525481700897, -0.12918712198734283, 0.11973901838064194, 0.012925405986607075, 0.02928490936756134, -0.14407862722873688, 0.12333802878856659, 0.0036461411509662867, -0.023294296115636826, 0.11636131256818771, 0.09342418611049652, -0.024230439215898514, -0.01751774549484253, 0.009946108795702457, 0.022976044565439224, -0.03634704649448395, -0.0020374469459056854, -0.010684690438210964, -0.11455434560775757, 0.02520640194416046, -0.06464748829603195, 0.0888100415468216, -0.09927403181791306, -0.02617495320737362, 0.05228336527943611, 0.1019703596830368, 0.005340008996427059, -0.009662147611379623, 0.06421281397342682, 0.05742452293634415, 0.035249579697847366, 0.005827097222208977, 0.040871791541576385, -0.015943706035614014, -0.02258520945906639, 0.07375013828277588, -0.08340723067522049, 0.056429192423820496, 0.11394559592008591, -0.09227757155895233, -0.01744348369538784, 0.04889199882745743, -0.0060705202631652355, -0.008434056304395199, -0.04583980515599251, -0.04919206723570824, 0.27018117904663086, 0.011550089344382286, 0.058874357491731644, -0.06809624284505844, 0.022093243896961212, 0.039561983197927475, -0.05297774821519852, -0.07930371910333633, 0.08441472798585892, 0.030145950615406036, -0.0767621248960495, 0.0078292740508914, 0.08482478559017181, 0.06126692146062851, 0.1734325885772705, 0.007933271117508411, -0.08809909969568253, -0.014126830734312534, -0.07901136577129364, -0.04731408879160881, 0.05554790422320366, -0.24975815415382385, -0.055252328515052795, 0.024837033823132515, -0.014507300220429897, 0.02769819088280201, -0.0146541902795434, 0.022735614329576492, -0.033043984323740005, -0.03059142269194126, -0.026997078210115433, 0.02270236425101757, 0.005998425651341677, 0.08967991173267365, -0.014469052664935589, -0.06423289328813553, -0.047027599066495895, -0.06472320854663849, -0.10738605260848999, 0.07004023343324661, -0.07398904114961624, -0.4042978882789612, -0.024632597342133522, -0.08069833368062973, -0.0642775148153305, 0.015851009637117386, 0.03414097800850868, -0.10678552836179733, -0.06860068440437317, -0.03320778161287308, 0.10893634706735611, 0.0403786227107048, -0.043132148683071136, 0.11715027689933777, 0.0049855150282382965, 0.05492106452584267, -0.1029776930809021, -0.0008706892840564251, -0.07391834259033203, -0.05235813185572624, -0.06041625142097473, 0.049641918390989304, 0.04577835276722908, 0.09473322331905365, 0.08130388706922531, -0.0040356977842748165, -0.011214327067136765, 0.15552377700805664, -0.12107711285352707, 0.00395713746547699, 0.19624778628349304, -0.07048123329877853, -0.02343703806400299, 0.10350996255874634, 0.016406379640102386, -0.1006687581539154, 0.046386245638132095, 0.01656092144548893, -0.00395341357216239, -0.2420867383480072, 
-0.14573422074317932, -0.029880313202738762, 0.06677266210317612, 0.04568857327103615, -0.024607256054878235, -0.0018936103442683816, -0.020709533244371414, -0.03976309299468994, 0.00068675383226946, 0.0337025485932827, -0.005101324059069157, 0.1460687816143036, -0.06049539893865585, 0.10037077963352203, -0.04903242364525795, -0.03665808215737343, 0.08769603818655014, 0.006512183230370283, 0.07050006091594696, 0.0992523804306984, 0.023735394701361656, 0.05994245409965515, 0.024736560881137848, 0.02254888229072094, -0.01196844968944788, 0.003945810254663229, 0.0006368915201164782, -0.032957322895526886, -0.05321526899933815, -0.004246896598488092, 0.06574660539627075, 0.1808955818414688, -0.1296376883983612, -0.08862068504095078, -0.021399080753326416, 0.06423277407884598, 0.14584623277187347, 0.12854841351509094, -0.05253630131483078, -0.07496091723442078, 0.02536897175014019, -0.09542151540517807, -0.03585429862141609, 0.10407039523124695, 0.09217331558465958, -0.16011853516101837, 0.1077810600399971, 0.08062447607517242, 0.04978331923484802, -0.0951625406742096, 0.0462239533662796, -0.13436320424079895, -0.02056988701224327, 0.011389304883778095, 0.039080310612916946, -0.20476894080638885, 0.145686075091362, 0.030664119869470596, 0.038575779646635056, -0.09523703902959824, 0.010574333369731903, 0.06203994154930115, -0.07262617349624634, 0.1623327136039734, -0.02398386038839817, 0.028102349489927292, 0.023642484098672867, -0.11627352982759476, 0.018747380003333092, 0.03214637562632561, -0.021341247484087944, 0.011680912226438522, 0.016674362123012543, 0.002334491116926074, -0.028384549543261528, 0.04780002310872078, -0.2156178504228592, -0.11090060323476791, 0.02660774253308773, 0.019468307495117188, 0.10155850648880005, -0.022766131907701492, -0.11220753937959671, -0.17409390211105347, 0.10027099400758743, -0.041106197983026505, 0.011216443032026291, -0.07475471496582031, 0.09853333234786987, 0.028621947392821312, -0.020111987367272377, 0.022057289257645607, 0.07172735035419464, 0.13244424760341644, -0.06036738306283951, -0.056706130504608154, 0.06896943598985672, -0.10897301882505417, -0.11468391865491867, -0.010070721618831158, 0.1953115314245224, 0.14785589277744293, 0.04993538558483124, 0.07279035449028015, -0.01854850910604, 0.029089156538248062, -0.058528997004032135, 0.10275973379611969, 0.09300514310598373, -0.07822905480861664, 0.10143187642097473, 0.0005362281808629632, -0.3494260907173157, -0.12809689342975616, -0.019489476457238197, 0.18443207442760468, 0.09992076456546783, -0.013255857862532139, 0.15120169520378113, 0.2836824655532837, -0.07168164849281311, -0.23452122509479523, 0.020310094580054283, 0.02263360470533371, 0.032290831208229065, 0.034340426325798035, -0.23097899556159973, 0.09443262964487076, 0.05033865571022034, 0.0015877317637205124, -0.01811007410287857, -0.1779908388853073, -0.12529414892196655, 0.2417154312133789, -0.019854318350553513, 0.1983722448348999, 0.0010008547687903047, -0.04772936552762985, -0.060637395828962326, 0.046585533767938614, 0.04735022410750389, -0.11972799897193909, 0.0828544944524765, 0.0730736032128334, 0.028494738042354584, 0.02076234482228756, 0.03299250453710556, 0.060051970183849335, 0.08798082917928696, -0.020749209448695183, -0.05419004335999489, 0.058240536600351334, 0.027674086391925812, 0.055804092437028885, 0.08400481939315796, 0.07754414528608322, -0.04912610724568367, -0.01744713820517063, -0.07283688336610794, -0.0748610869050026, 0.09765128791332245, -0.00008450664608972147, -0.02817765809595585, -0.015302972868084908, 
0.0677269995212555, -0.015709886327385902, 0.035010214895009995, -0.06437162309885025, -0.09523557126522064, 0.037532005459070206, 0.11523649096488953, 0.22662152349948883, -0.03143790364265442, 0.02169395424425602, -0.02992435358464718, -0.06841131299734116, 0.04325897991657257, 0.039976611733436584, 0.030113661661744118, 0.07215403765439987, 0.013458631932735443, 0.11072386801242828, -0.012326358817517757, -0.14605799317359924, 0.0883609876036644, 0.0322510190308094, -0.07574115693569183, -0.17930817604064941, -0.044506680220365524, 0.03186027333140373, 0.0012670134892687201, 0.039871834218502045, 0.2145652323961258, -0.03728087618947029, -0.05586111545562744, -0.023210061714053154, 0.072910375893116, -0.033412907272577286, 0.09998548030853271, 0.048169832676649094, 0.0053204805590212345, -0.1357860565185547, 0.08549953252077103, 0.07159322500228882, -0.05968739464879036, 0.1014387384057045, 0.0002365766413277015, -0.026391707360744476, -0.07350040972232819, -0.21696147322654724, 0.05152078717947006, 0.07139996439218521, -0.08783644437789917, -0.016370318830013275, -0.1282982975244522, 0.04216655716300011, 0.10235770791769028, 0.005307336803525686, -0.020217331126332283, -0.06208354979753494, -0.05099378898739815, -0.0036395892966538668, 0.08286231756210327, 0.06484242528676987, -0.07673665881156921, -0.09246040135622025, 0.030265076085925102, 0.03404362499713898, 0.07301387935876846, -0.027317021042108536, -0.10272669792175293, -0.09810926765203476, 0.015308946371078491, -0.18969492614269257, -0.032040610909461975, -0.10309487581253052, 0.0021805143915116787, -0.01777157001197338, -0.04592267796397209, -0.032156188040971756, 0.019102007150650024, -0.060533735901117325, 0.0323261022567749, 0.037023141980171204, 0.09843681007623672, -0.14936916530132294, 0.046337176114320755, 0.028815628960728645, -0.028843197971582413, 0.12747718393802643, 0.08198296278715134, -0.033786024898290634, 0.05981963500380516, -0.18601560592651367, -0.05041011795401573, 0.06797968596220016, 0.07829564809799194, 0.013990454375743866, -0.14655663073062897, 0.003458505729213357, 0.017875835299491882, 0.006014938931912184, -0.01412895042449236, 0.08969899266958237, -0.02251886948943138, 0.0329374335706234, -0.07463831454515457, -0.030835887417197227, -0.01808253861963749, 0.030471958220005035, 0.059058815240859985, 0.053252093493938446, 0.06060827523469925, -0.09103265404701233, 0.069302998483181, -0.07343040406703949, 0.03992263600230217, -0.04384954273700714, -0.024377362802624702, 0.04840991646051407, -0.1014716774225235, 0.08304159343242645, -0.005219806917011738, 0.16463878750801086, 0.022671561688184738, 0.019234120845794678, 0.02278522215783596, -0.07313058525323868, -0.13660050928592682, 0.05429202690720558, -0.016162950545549393, 0.06605180352926254, -0.002333552110940218, -0.0642697662115097, -0.01754576899111271, 0.08005266636610031, 0.1731238216161728, 0.056995391845703125, 0.09965229034423828, 0.08319199085235596, 0.02081046625971794, 0.14018729329109192, -0.09819237142801285, -0.07450989633798599, 0.04018763080239296, -0.12320652604103088, 0.07875863462686539, -0.07137391716241837, -0.003565412713214755, 0.013886376284062862, -0.07378560304641724, 0.04256369546055794, -0.024181289598345757, -0.04231061413884163, -0.10097838938236237, -0.10254401713609695, -0.05440983921289444, -0.07798463106155396, -0.004899194929748774, -0.08769712597131729, 0.0075134048238396645, -0.024345638230443, 0.04201876372098923, -0.10040191560983658, 0.11105813831090927, -0.09843191504478455, -0.13166256248950958, 
0.20329925417900085, -0.03560420498251915, -0.021583721041679382, -0.09482624381780624, 0.01592477224767208, 0.037522900849580765, 0.128004789352417, 0.03116375394165516, 0.045543912798166275, -0.05455115810036659, 0.02821274660527706, -0.07110817730426788, -0.07871725410223007, 0.018609866499900818, -0.04568831995129585, -0.0010812277905642986, 0.1412244737148285, 0.11343260109424591, -0.031589966267347336, 0.019101114943623543, 0.11141414940357208, -0.031238948926329613, -0.009140252135694027, -0.15313616394996643, -0.01974429376423359, -0.017094291746616364, 0.038690805435180664, -0.046415407210588455, -0.09347260743379593, -0.01616900973021984, 0.2009183168411255, 0.16870102286338806, -0.0702020525932312, 0.034158553928136826, -0.005822003819048405, 0.007367256097495556, 0.000012625720046344213, 0.03520609810948372, 0.1346583068370819, 0.21054618060588837, -0.007580135948956013, 0.01906435191631317, 0.0018474938115105033, -0.10694736987352371, -0.020946377888321877, 0.10287459939718246, 0.0017092435155063868, -0.022116001695394516, -0.010342647321522236, 0.16994298994541168, -0.18364322185516357, -0.23714545369148254, -0.11618635058403015, -0.09684930741786957, -0.13900581002235413, 0.0034780821297317743, -0.06002504378557205, 0.18097737431526184, 0.06135717034339905, -0.01024904940277338, -0.01486649364233017, 0.14718176424503326, 0.03596093878149986, -0.013097116723656654, -0.04495806246995926, 0.04730412736535072, -0.12653422355651855, 0.05180799961090088, -0.02594338171184063, 0.015459652058780193, 0.03189415857195854, 0.0890606939792633, -0.03979649767279625, 0.02465958334505558, 0.009569784626364708, 0.009659433737397194, 0.03147374838590622, 0.13472847640514374, 0.019925441592931747, 0.15565747022628784, 0.1294679492712021, -0.0863909125328064, 0.006454397924244404, 0.008229621686041355, -0.016861610114574432, -0.03692995384335518, 0.12651993334293365, -0.11584516614675522, 0.08526811748743057, 0.057161152362823486, -0.05525089427828789, -0.038962703198194504, -0.02300102822482586, 0.028422988951206207, -0.01936766877770424, -0.008504120633006096, -0.013230613432824612, -0.2076413780450821, 0.00992914941161871, -0.11316157877445221, 0.05722523480653763, -0.18669338524341583, -0.007043821271508932, 0.00756250973790884, -0.01634044013917446, -0.02754688449203968, 0.04610100015997887, 0.04489267244935036, -0.03893112763762474, -0.01611286774277687, 0.03831017017364502, 0.03413808345794678, 0.10461442917585373, -0.07090891152620316, -0.04019155353307724 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in sk (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets)

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-sk")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-sk")

# load dataset
ds = load_dataset("common_voice", "sk", split="validation[:1%]")

# Common Voice audio is 48 kHz, while the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# preprocess the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# run inference
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
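As a side note on the resampling step above: newer versions of the `datasets` library can decode and resample audio on the fly, which avoids the manual `torchaudio` mapping. Below is a minimal alternative sketch, assuming a `datasets` version whose `common_voice` script exposes an `audio` column; the model and processor would be loaded exactly as above.

```python
# Alternative resampling sketch (assumes a datasets version where the
# common_voice script exposes an "audio" column of type Audio)
from datasets import load_dataset, Audio

ds = load_dataset("common_voice", "sk", split="validation[:1%]")

# decode + resample lazily to the model's expected 16 kHz on each access
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

sample = ds[0]["audio"]  # {"path": ..., "array": np.ndarray, "sampling_rate": 16000}
```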
{"language": "sk", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-sk
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "sk", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "sk" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sk #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sk (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sk (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sk #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sk (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sk #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sk (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09447353333234787, 0.08548613637685776, -0.0022000542376190424, -0.0196307934820652, 0.10718734562397003, -0.0813937857747078, 0.14219516515731812, 0.049294326454401016, 0.09208300709724426, 0.07134798914194107, -0.002324407920241356, 0.02863006480038166, 0.10590332746505737, 0.09510144591331482, 0.03112775832414627, -0.2042689174413681, 0.049988921731710434, -0.07526323199272156, 0.12548232078552246, 0.043848276138305664, 0.08565669506788254, -0.08638176321983337, 0.03582827001810074, 0.07444505393505096, -0.043793026357889175, 0.04464445635676384, -0.018410714343190193, -0.1336204558610916, 0.10242026299238205, 0.07890792191028595, -0.0385429784655571, 0.007998758926987648, 0.08142880350351334, -0.13471472263336182, 0.02513783797621727, 0.03974185138940811, 0.013471891172230244, -0.0026381034404039383, 0.11044604331254959, -0.08542680740356445, 0.0801033154129982, 0.012514191679656506, -0.005880454555153847, 0.08964353799819946, -0.08124329149723053, -0.1565016359090805, -0.04471737891435623, 0.08679794520139694, 0.023510390892624855, 0.08239814639091492, -0.1099056825041771, 0.06173036992549896, -0.027166442945599556, 0.04106814041733742, 0.14245951175689697, -0.23400866985321045, -0.014402132481336594, -0.015134521760046482, 0.06788469105958939, -0.011111950501799583, -0.04552576690912247, 0.07188399881124496, 0.041235726326704025, 0.007310500834137201, -0.0910271555185318, -0.03187449648976326, 0.016513913869857788, -0.11797057837247849, -0.0951392725110054, 0.03762055188417435, 0.24816125631332397, 0.08044280856847763, -0.08159065991640091, -0.10899163037538528, 0.04062049835920334, 0.21603336930274963, -0.05879554897546768, -0.09413482993841171, -0.007286659441888332, 0.0042977663688361645, 0.024015409871935844, -0.05411488562822342, -0.06948566436767578, 0.003456323640421033, -0.05077386274933815, 0.095366470515728, 0.006187528371810913, 0.011621086858212948, -0.02975624054670334, -0.015570188872516155, -0.053002748638391495, -0.0729740634560585, 0.029129179194569588, -0.08019135147333145, -0.07590888440608978, 0.006485886871814728, -0.04116211086511612, -0.1383855938911438, 0.053485557436943054, 0.02132871374487877, 0.0995011106133461, 0.02137063257396221, -0.14718079566955566, 0.035895947366952896, 0.11042921990156174, 0.06687698513269424, -0.20197805762290955, -0.07335270941257477, -0.004323948640376329, -0.0357857309281826, 0.012497232295572758, -0.033669739961624146, -0.07184761017560959, 0.01321903895586729, -0.03122369758784771, 0.03059513121843338, 0.011688338592648506, -0.03134332597255707, -0.08820216357707977, -0.10093867033720016, -0.00565901817753911, -0.06456618010997772, -0.0015200164634734392, 0.06692980974912643, -0.04383613169193268, 0.18388663232326508, -0.020712431520223618, 0.09899565577507019, -0.10369188338518143, -0.07369605451822281, -0.013773186132311821, 0.03439895436167717, 0.043352626264095306, -0.06292542070150375, 0.045083027333021164, 0.0251857191324234, 0.013203004375100136, -0.16556279361248016, -0.043572623282670975, -0.09322196990251541, -0.03744449466466904, -0.08339918404817581, -0.0015552864642813802, -0.062020447105169296, 0.051578640937805176, -0.03243367001414299, -0.039013538509607315, -0.024825917556881905, -0.05677486211061478, 0.01202145405113697, -0.06171572208404541, 0.10195141285657883, 0.020812783390283585, 0.0868029072880745, -0.022615855559706688, -0.04394092783331871, -0.11852972954511642, 0.14410868287086487, -0.11547897011041641, 0.02709408663213253, -0.13028165698051453, -0.0439644381403923, -0.060524847358465195, 
0.05051803216338158, 0.06718312203884125, 0.12867289781570435, -0.2194686383008957, -0.10277564823627472, 0.1656588762998581, -0.07793502509593964, -0.04520433023571968, 0.20940831303596497, 0.022966418415308, 0.08881327509880066, 0.10846269130706787, 0.21780742704868317, -0.01810499280691147, -0.15552765130996704, -0.06402648240327835, -0.03187039867043495, 0.04072699695825577, 0.07602250576019287, 0.04391743987798691, -0.05556395277380943, 0.049693457782268524, 0.005116157699376345, 0.07071255147457123, -0.030841827392578125, -0.012237798422574997, -0.04760453477501869, 0.042250778526067734, -0.0870818942785263, 0.07806320488452911, -0.008273436687886715, -0.023690760135650635, -0.014592181891202927, -0.05209733545780182, -0.033337291330099106, 0.07193250954151154, -0.05030817911028862, 0.028342554345726967, -0.12756554782390594, 0.11739055812358856, 0.010530160740017891, 0.028625143691897392, -0.14273980259895325, 0.11300947517156601, 0.004574838560074568, -0.030896782875061035, 0.12079418450593948, 0.08298317342996597, -0.025580203160643578, -0.019182192161679268, 0.009260162711143494, 0.025266475975513458, -0.03723382204771042, 0.00274062342941761, -0.017351660877466202, -0.11520242691040039, 0.025648673996329308, -0.05852363631129265, 0.09183526784181595, -0.08611554652452469, -0.02121213637292385, 0.06012308597564697, 0.1063268706202507, 0.001133937737904489, -0.011394071392714977, 0.06885433197021484, 0.055587414652109146, 0.04007461294531822, 0.007280478719621897, 0.03782409057021141, -0.012067550793290138, -0.028819294646382332, 0.07706548273563385, -0.08111046254634857, 0.05193891003727913, 0.11842720210552216, -0.08478765934705734, -0.013714385218918324, 0.043429892510175705, -0.0033525575418025255, -0.011493437923491001, -0.04656977951526642, -0.054619234055280685, 0.26587164402008057, 0.014312326908111572, 0.05337248742580414, -0.06899107992649078, 0.013066723011434078, 0.03668157011270523, -0.04990958794951439, -0.07787613570690155, 0.07622285187244415, 0.03218277543783188, -0.06759384274482727, 0.008185479789972305, 0.08945214748382568, 0.06322740763425827, 0.1727292537689209, 0.0009798366809263825, -0.08951359987258911, -0.013792539946734905, -0.0830492153763771, -0.04986625537276268, 0.06252824515104294, -0.2578187584877014, -0.05561163276433945, 0.026793746277689934, -0.014327428303658962, 0.02779570408165455, -0.014762808568775654, 0.023336419835686684, -0.03424105793237686, -0.03253135085105896, -0.019045600667595863, 0.026433689519762993, 0.007784249261021614, 0.08760633319616318, -0.010615918785333633, -0.04934011027216911, -0.04619476571679115, -0.06311328709125519, -0.11110103130340576, 0.06924130022525787, -0.0790959820151329, -0.4020746350288391, -0.03245386481285095, -0.059489209204912186, -0.0701025202870369, 0.013546963222324848, 0.030622275546193123, -0.10531282424926758, -0.06992301344871521, -0.03625313937664032, 0.11213496327400208, 0.043867163360118866, -0.038332391530275345, 0.11871515959501266, -0.0005864035920239985, 0.050528641790151596, -0.10227049887180328, -0.0021776447538286448, -0.06896389275789261, -0.03942792862653732, -0.06314273923635483, 0.0605585053563118, 0.04991893470287323, 0.10321734845638275, 0.08065583556890488, -0.004225487355142832, -0.010056659579277039, 0.1665513664484024, -0.12110822647809982, 0.0142596997320652, 0.196665957570076, -0.0762728676199913, -0.02435125783085823, 0.10525230318307877, 0.01895294338464737, -0.0972190871834755, 0.048119619488716125, 0.019993992522358894, -0.005915766581892967, -0.23848405480384827, 
-0.1479029357433319, -0.03095127083361149, 0.06637673079967499, 0.04059234634041786, -0.023745736107230186, 0.00405431492254138, -0.015061584301292896, -0.042473986744880676, -0.0025937447790056467, 0.03214549645781517, -0.0024445762392133474, 0.15750549733638763, -0.05872611701488495, 0.1001347079873085, -0.050054386258125305, -0.034140948206186295, 0.09032388776540756, 0.004674837458878756, 0.05885401740670204, 0.09218940883874893, 0.020399432629346848, 0.06485728174448013, 0.023505276069045067, 0.02438465878367424, -0.016264470294117928, 0.004787406884133816, 0.00040624820394441485, -0.027736855670809746, -0.05512072518467903, -0.006628550589084625, 0.0682443156838417, 0.17513775825500488, -0.1334250122308731, -0.08561869710683823, -0.02044207975268364, 0.06859151273965836, 0.14772772789001465, 0.12642385065555573, -0.03843715041875839, -0.0847405344247818, 0.02954643964767456, -0.09521012008190155, -0.0367683470249176, 0.10005839914083481, 0.0975811555981636, -0.1627831906080246, 0.11013441532850266, 0.08255114406347275, 0.04830850660800934, -0.09167995303869247, 0.043399009853601456, -0.13309481739997864, -0.02273319661617279, 0.008098875172436237, 0.036135222762823105, -0.20662462711334229, 0.13637006282806396, 0.03167199715971947, 0.039125360548496246, -0.09966802597045898, 0.012332198210060596, 0.06538302451372147, -0.07199051231145859, 0.15923668444156647, -0.020037807524204254, 0.011637208051979542, 0.022193748503923416, -0.11614863574504852, 0.015543675981462002, 0.028047766536474228, -0.022499671205878258, 0.011507582850754261, 0.02187211625277996, 0.003120955778285861, -0.028967438265681267, 0.05327955260872841, -0.21623745560646057, -0.10903304070234299, 0.02784111723303795, 0.023749668151140213, 0.09910145401954651, -0.025385648012161255, -0.11098421365022659, -0.1833193302154541, 0.0957367792725563, -0.042347729206085205, 0.001992121571674943, -0.07549303025007248, 0.09197773039340973, 0.03506256639957428, -0.02114100195467472, 0.02724488638341427, 0.07385587692260742, 0.1347585767507553, -0.05908021703362465, -0.05874774977564812, 0.06475415080785751, -0.10781499743461609, -0.1163632795214653, -0.008360869251191616, 0.19960172474384308, 0.14425547420978546, 0.048802830278873444, 0.06928304582834244, -0.01645585335791111, 0.03419319540262222, -0.06591074913740158, 0.10220114141702652, 0.09610259532928467, -0.07322431355714798, 0.10465353727340698, -0.00037969500408507884, -0.34234619140625, -0.12897975742816925, -0.024753982201218605, 0.18360576033592224, 0.10414457321166992, -0.01415350753813982, 0.1491418331861496, 0.27929314970970154, -0.07043834775686264, -0.23923754692077637, 0.024099787697196007, 0.02013906091451645, 0.03201274201273918, 0.0305935125797987, -0.22787965834140778, 0.10476398468017578, 0.05357317626476288, 0.00006869375647511333, -0.018032211810350418, -0.18419961631298065, -0.13733525574207306, 0.24330677092075348, -0.01767158880829811, 0.19076035916805267, -0.002575123682618141, -0.05230757221579552, -0.05934154614806175, 0.03453212231397629, 0.05011354386806488, -0.11979212611913681, 0.07608823478221893, 0.0727953240275383, 0.028969159349799156, 0.021288195624947548, 0.030354678630828857, 0.065824493765831, 0.09786885231733322, -0.01946292072534561, -0.056516941636800766, 0.06520942598581314, 0.030111325904726982, 0.05349361151456833, 0.09219375997781754, 0.0801294669508934, -0.04711005464196205, -0.014268317259848118, -0.0711652860045433, -0.07705047726631165, 0.09676103293895721, -0.002036819700151682, -0.033734794706106186, -0.022116776555776596, 
0.06622733175754547, -0.015782712027430534, 0.031188590452075005, -0.06980254501104355, -0.09599178284406662, 0.037846505641937256, 0.10904978215694427, 0.22893860936164856, -0.040789198130369186, 0.01858886331319809, -0.026756787672638893, -0.07107485830783844, 0.04056243225932121, 0.040393903851509094, 0.03221021965146065, 0.07385829091072083, 0.01597192883491516, 0.10882596671581268, -0.013872949406504631, -0.14367832243442535, 0.08522139489650726, 0.029808159917593002, -0.07868606597185135, -0.1793752908706665, -0.04473288357257843, 0.028940808027982712, -0.0011673478875309229, 0.03553350642323494, 0.21167761087417603, -0.03654433414340019, -0.05510081350803375, -0.022740809246897697, 0.06805909425020218, -0.029495954513549805, 0.10164940357208252, 0.05342058837413788, 0.007938733324408531, -0.13338297605514526, 0.08577612787485123, 0.06666499376296997, -0.0627591535449028, 0.10195685923099518, 0.0005026625585742295, -0.032067783176898956, -0.07322997599840164, -0.21010681986808777, 0.046846553683280945, 0.07024450600147247, -0.08755964785814285, -0.01484904158860445, -0.13364753127098083, 0.041283395141363144, 0.11184476315975189, 0.004419182427227497, -0.01881813071668148, -0.06044270843267441, -0.048359379172325134, -0.0005999195855110884, 0.08866731822490692, 0.06365446746349335, -0.07404810190200806, -0.09823589771986008, 0.042519938200712204, 0.030866120010614395, 0.07351097464561462, -0.028083598241209984, -0.09884534031152725, -0.0945344939827919, 0.01328999176621437, -0.18810246884822845, -0.034205202013254166, -0.11158265918493271, 0.0008120705024339259, -0.015878181904554367, -0.05056335777044296, -0.030616309493780136, 0.02101845294237137, -0.058621544390916824, 0.03253064304590225, 0.032881494611501694, 0.09822461009025574, -0.14862048625946045, 0.04528718441724777, 0.03281111642718315, -0.03095017932355404, 0.12880197167396545, 0.08609288185834885, -0.03476475551724434, 0.06414556503295898, -0.17557065188884735, -0.04377472400665283, 0.06951958686113358, 0.08183947205543518, 0.009257570840418339, -0.1409153938293457, 0.00010914247104665264, 0.01991971582174301, 0.009825214743614197, -0.010840038768947124, 0.0882917121052742, -0.017169587314128876, 0.024352382868528366, -0.0676388368010521, -0.023043537512421608, -0.019582092761993408, 0.030746497213840485, 0.07292152941226959, 0.05248428136110306, 0.05630671977996826, -0.09298720955848694, 0.06555425375699997, -0.07448974251747131, 0.039732303470373154, -0.043465547263622284, -0.02292359434068203, 0.05294514447450638, -0.100614532828331, 0.08418679237365723, -0.003939489368349314, 0.17169766128063202, 0.013598501682281494, 0.012331301346421242, 0.028261622413992882, -0.06562267243862152, -0.12840913236141205, 0.05932597443461418, -0.0032434738241136074, 0.06498762965202332, -0.005588832776993513, -0.06625451147556305, -0.015630267560482025, 0.07518135756254196, 0.16751882433891296, 0.062088653445243835, 0.09215714782476425, 0.07919345051050186, 0.02334246039390564, 0.14177122712135315, -0.09162601083517075, -0.07296549528837204, 0.03940403461456299, -0.1310301423072815, 0.078548364341259, -0.07297130674123764, 0.006337208207696676, 0.0071569629944860935, -0.08355768769979477, 0.04738098755478859, -0.01937187649309635, -0.046342916786670685, -0.10063651949167252, -0.10497795045375824, -0.05080946162343025, -0.07800298929214478, -0.006770329549908638, -0.08829286694526672, 0.0027813795022666454, -0.0254579558968544, 0.04219437390565872, -0.1011790856719017, 0.12351883202791214, -0.10120851546525955, -0.12828604876995087, 
0.2030225545167923, -0.0356321781873703, -0.025123318657279015, -0.09538757055997849, 0.018103672191500664, 0.04086792841553688, 0.12957598268985748, 0.02997216396033764, 0.042825739830732346, -0.059354450553655624, 0.025600174441933632, -0.0686243399977684, -0.07691225409507751, 0.01917126029729843, -0.047750748693943024, 0.002399848774075508, 0.13933958113193512, 0.1159992665052414, -0.03161901235580444, 0.015660984441637993, 0.10856401920318604, -0.0316191092133522, -0.018796544522047043, -0.15628114342689514, -0.03290550038218498, -0.019626935943961143, 0.03317132964730263, -0.04128367826342583, -0.09770341962575912, -0.01613420620560646, 0.1982060968875885, 0.15783576667308807, -0.07400738447904587, 0.036346517503261566, -0.009304533712565899, 0.006815333850681782, -0.0033097947016358376, 0.035530827939510345, 0.13133837282657623, 0.21052712202072144, -0.005518221762031317, 0.02209477685391903, 0.0016910883132368326, -0.11357419192790985, -0.024911683052778244, 0.10291591286659241, -0.00197752402164042, -0.026991646736860275, -0.00985493790358305, 0.17087402939796448, -0.18735328316688538, -0.23353271186351776, -0.11959981918334961, -0.0962245762348175, -0.13811053335666656, 0.005132443271577358, -0.06872519105672836, 0.17762692272663116, 0.05234206095337868, -0.008907568641006947, -0.00932153407484293, 0.14339791238307953, 0.038151178508996964, -0.00873023271560669, -0.04228387027978897, 0.05129522457718849, -0.11857099831104279, 0.04933663085103035, -0.02257661707699299, 0.022355476394295692, 0.03199364244937897, 0.0896204337477684, -0.042096566408872604, 0.022977225482463837, 0.011701530776917934, 0.008836093358695507, 0.02876206487417221, 0.13574416935443878, 0.023780643939971924, 0.14792877435684204, 0.13167822360992432, -0.09005507826805115, 0.004194698762148619, 0.004048276226967573, -0.019585365429520607, -0.0418534018099308, 0.1265041083097458, -0.12062205374240875, 0.08797243237495422, 0.051887739449739456, -0.055792685598134995, -0.04141465201973915, -0.019245268777012825, 0.024542970582842827, -0.023079359903931618, -0.0052132816053926945, -0.010786384344100952, -0.20834366977214813, 0.014027895405888557, -0.1195514053106308, 0.05754542350769043, -0.18650978803634644, -0.010246113874018192, 0.007986442185938358, -0.01389356330037117, -0.02336432971060276, 0.04775552079081535, 0.04685549810528755, -0.04076850041747093, -0.016945142298936844, 0.03809526190161705, 0.03440297394990921, 0.10565172880887985, -0.06954380124807358, -0.038015205413103104 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli-Finetuned

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10K unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390) and fine-tuned on the transcribed data in sl (refer to Table 1 of the paper for more information).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)

# Usage for inference

The following shows how the model can be used for inference on a sample of the [Common Voice dataset](https://commonvoice.mozilla.org/en/datasets)

```python
#!/usr/bin/env python3
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torchaudio
import torch

# load model & processor
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-sl")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli-ft-sl")

# load dataset
ds = load_dataset("common_voice", "sl", split="validation[:1%]")

# Common Voice audio is 48 kHz, while the model expects 16 kHz input
common_voice_sample_rate = 48000
target_sample_rate = 16000

resampler = torchaudio.transforms.Resample(common_voice_sample_rate, target_sample_rate)

# define mapping fn to read in sound file and resample
def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    batch["speech"] = speech[0]
    return batch

# load and resample all audio files
ds = ds.map(map_to_array)

# preprocess the first 5 data samples
inputs = processor(ds[:5]["speech"], sampling_rate=target_sample_rate, return_tensors="pt", padding=True)

# run inference
logits = model(**inputs).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
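To sanity-check the transcriptions, the predictions from the snippet above can be scored against the Common Voice references. The following is a hedged sketch that continues from the variables defined above; it assumes the `evaluate` library (plus `jiwer`) is installed and that the reference column is named `sentence`, the Common Voice convention. Keep in mind that WER is sensitive to casing and punctuation, so references usually need the same normalization as the model's vocabulary before the numbers are meaningful.

```python
# Hedged follow-up sketch: word error rate of the 5 predictions above.
# Continues from `processor`, `predicted_ids`, and `ds` defined in the
# snippet above; "sentence" is the Common Voice reference column.
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")

predictions = processor.batch_decode(predicted_ids)
references = ds[:5]["sentence"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
```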
{"language": "sl", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli-ft-sl
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "audio", "voxpopuli", "sl", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "sl" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli-Finetuned Facebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sl (refer to Table 1 of paper for more information). Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Usage for inference In the following it is shown how the model can be used in inference on a sample of the Common Voice dataset
[ "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ 66, 162, 30 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #audio #voxpopuli #sl #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli-Finetuned\n\nFacebook's Wav2Vec2 base model pretrained on the 10K unlabeled subset of VoxPopuli corpus and fine-tuned on the transcribed data in sl (refer to Table 1 of paper for more information).\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Usage for inference\n\nIn the following it is shown how the model can be used in inference on a sample of the Common Voice dataset" ]
[ -0.09498270601034164, 0.09017936885356903, -0.002271765610203147, -0.016813870519399643, 0.10944867879152298, -0.08176805078983307, 0.14260025322437286, 0.04720914363861084, 0.08715367317199707, 0.06841940432786942, -0.0024940494913607836, 0.025712454691529274, 0.10259051620960236, 0.0871032252907753, 0.030965685844421387, -0.2071848213672638, 0.05023670569062233, -0.07736752182245255, 0.13532370328903198, 0.04143448919057846, 0.0853452980518341, -0.08545846492052078, 0.03439681604504585, 0.0732603594660759, -0.04363333433866501, 0.048359885811805725, -0.01887955516576767, -0.12900523841381073, 0.10259542614221573, 0.07979579269886017, -0.039332255721092224, 0.007171011995524168, 0.08631600439548492, -0.1357080489397049, 0.024258827790617943, 0.03769412636756897, 0.009476716630160809, -0.005308778490871191, 0.11530714482069016, -0.08910847455263138, 0.087847039103508, 0.015144619159400463, -0.006280737463384867, 0.08948835730552673, -0.08091815561056137, -0.16191799938678741, -0.043006688356399536, 0.07794881612062454, 0.018493803218007088, 0.08220672607421875, -0.1093912422657013, 0.057686686515808105, -0.03010154888033867, 0.045454174280166626, 0.13555842638015747, -0.23863914608955383, -0.014795896597206593, -0.013310413807630539, 0.07512680441141129, -0.012248599901795387, -0.04332681745290756, 0.07412758469581604, 0.04194192215800285, 0.007864877581596375, -0.0952964797616005, -0.029758049175143242, 0.021173173561692238, -0.11769337207078934, -0.0946078673005104, 0.0362519845366478, 0.24908135831356049, 0.07840103656053543, -0.08261903375387192, -0.1080370545387268, 0.04154534265398979, 0.2173786610364914, -0.056431181728839874, -0.09277071803808212, -0.0049997977912425995, 0.006822814233601093, 0.025149399414658546, -0.05178718641400337, -0.06584304571151733, 0.003306495724245906, -0.04749652370810509, 0.08825819194316864, 0.0049084750935435295, 0.011657236143946648, -0.03012305311858654, -0.018323101103305817, -0.05207979306578636, -0.07132679224014282, 0.02973405458033085, -0.07823684811592102, -0.07476325333118439, 0.00766408909112215, -0.044559039175510406, -0.13722606003284454, 0.054120734333992004, 0.021606680005788803, 0.10073033720254898, 0.022045789286494255, -0.15144501626491547, 0.03587259352207184, 0.10871412605047226, 0.06475542485713959, -0.20236355066299438, -0.07711056619882584, -0.007238844875246286, -0.039582107216119766, 0.016065869480371475, -0.03186729922890663, -0.07363622635602951, 0.01452702283859253, -0.03263174742460251, 0.030861366540193558, 0.010602389462292194, -0.03304942324757576, -0.09214001148939133, -0.09901975095272064, -0.017060061916708946, -0.06518044322729111, 0.0006512263789772987, 0.06901054829359055, -0.04776409640908241, 0.18198202550411224, -0.01768878474831581, 0.09789331257343292, -0.10473065823316574, -0.07686905562877655, -0.012741677463054657, 0.0366402268409729, 0.03874015808105469, -0.06572464853525162, 0.04420633241534233, 0.02653391659259796, 0.01351513434201479, -0.16472239792346954, -0.05297320336103439, -0.09325384348630905, -0.03448934853076935, -0.08542802929878235, 0.003059634706005454, -0.0684441328048706, 0.05489308014512062, -0.034564319998025894, -0.03901680186390877, -0.03118305280804634, -0.05794818326830864, 0.012101046741008759, -0.05805319920182228, 0.10196540504693985, 0.014154775068163872, 0.08814740180969238, -0.021207589656114578, -0.04549940302968025, -0.10905694961547852, 0.1455642580986023, -0.11170327663421631, 0.030815143138170242, -0.12704260647296906, -0.04490895941853523, -0.05718828737735748, 
0.053068965673446655, 0.06960073113441467, 0.12677842378616333, -0.22363391518592834, -0.10136674344539642, 0.1665482074022293, -0.07703395187854767, -0.036786384880542755, 0.21181118488311768, 0.02536606229841709, 0.08314337581396103, 0.11040442436933517, 0.22135832905769348, -0.014260750263929367, -0.14641541242599487, -0.05922318622469902, -0.03131532669067383, 0.042214129120111465, 0.07569752633571625, 0.04351227730512619, -0.05303853377699852, 0.04976167902350426, 0.006841941736638546, 0.08292900025844574, -0.033840715885162354, -0.01441760454326868, -0.04699627682566643, 0.04097027704119682, -0.0868675485253334, 0.07965542376041412, -0.006045402493327856, -0.026925981044769287, -0.015302072279155254, -0.053973641246557236, -0.031974680721759796, 0.07220020145177841, -0.052252285182476044, 0.027051180601119995, -0.13024525344371796, 0.12375885248184204, 0.016090838238596916, 0.028150590136647224, -0.14509625732898712, 0.11602654308080673, 0.0038560591638088226, -0.018425563350319862, 0.11719658970832825, 0.08836537599563599, -0.02544981800019741, -0.01509199757128954, 0.011312112212181091, 0.02688795141875744, -0.034188222140073776, -0.0014380143256857991, -0.013218380510807037, -0.11542775481939316, 0.023284675553441048, -0.061995696276426315, 0.0925520732998848, -0.09658525139093399, -0.02186518907546997, 0.058879148215055466, 0.10628778487443924, 0.0016178025398403406, -0.013831321150064468, 0.06786716729402542, 0.05855964124202728, 0.03806556761264801, 0.007368335034698248, 0.04037664830684662, -0.014049559831619263, -0.030536368489265442, 0.07151120156049728, -0.08208446204662323, 0.06413371115922928, 0.11643152683973312, -0.08881845325231552, -0.01071489043533802, 0.042924296110868454, -0.003107335651293397, -0.009104331955313683, -0.04416302964091301, -0.05369924381375313, 0.2663058340549469, 0.010962434113025665, 0.055957261472940445, -0.06783939152956009, 0.020073456689715385, 0.03451896086335182, -0.048288408666849136, -0.07950102537870407, 0.07950984686613083, 0.028184836730360985, -0.06392429023981094, 0.007010420318692923, 0.08149515092372894, 0.06368407607078552, 0.1735301911830902, 0.0027166884392499924, -0.08814352005720139, -0.01359194703400135, -0.0790742039680481, -0.050527121871709824, 0.060805004090070724, -0.26302802562713623, -0.05582718178629875, 0.024955688044428825, -0.014074440114200115, 0.028075870126485825, -0.014820413663983345, 0.022352345287799835, -0.03126055747270584, -0.03444724157452583, -0.029789531603455544, 0.020734084770083427, 0.007887446321547031, 0.08788203448057175, -0.01599102094769478, -0.05461294576525688, -0.045174695551395416, -0.06466009467840195, -0.1107272356748581, 0.06867286562919617, -0.07917576283216476, -0.4058177173137665, -0.02970847673714161, -0.06507774442434311, -0.06894804537296295, 0.016489064320921898, 0.03339352831244469, -0.1023818850517273, -0.07065370678901672, -0.03579504415392876, 0.10568027198314667, 0.04369354248046875, -0.039194073528051376, 0.11970431357622147, 0.00002754144043137785, 0.05461065471172333, -0.10357317328453064, -0.0010962977539747953, -0.07023681700229645, -0.041518956422805786, -0.06492838263511658, 0.05957013741135597, 0.047662124037742615, 0.09649014472961426, 0.08063387125730515, -0.0034363982267677784, -0.010461405850946903, 0.16230933368206024, -0.11753354966640472, 0.003023602068424225, 0.20238527655601501, -0.07789920270442963, -0.025958213955163956, 0.10633919388055801, 0.01919489912688732, -0.10256347805261612, 0.04715485870838165, 0.01373151782900095, -0.005918307229876518, 
-0.23660074174404144, -0.14995846152305603, -0.031737711280584335, 0.06976228952407837, 0.03729766234755516, -0.023333407938480377, -0.006496576592326164, -0.020550450310111046, -0.03970106318593025, -0.0004926955443806946, 0.03349825367331505, -0.004836910404264927, 0.15728658437728882, -0.06325602531433105, 0.10212145000696182, -0.04842913895845413, -0.032987143844366074, 0.08994007855653763, -0.003553785150870681, 0.05992846190929413, 0.09576646983623505, 0.021264715120196342, 0.06126625835895538, 0.024561282247304916, 0.021087590605020523, -0.015195908956229687, 0.0047289221547544, 0.00024325713457074016, -0.028709517791867256, -0.05504992604255676, -0.011257314123213291, 0.06560134887695312, 0.18187949061393738, -0.12868115305900574, -0.0880545899271965, -0.016901077702641487, 0.06746295839548111, 0.14326328039169312, 0.12635497748851776, -0.04957159608602524, -0.07945040613412857, 0.030203765258193016, -0.09592217952013016, -0.03674742951989174, 0.10312336683273315, 0.10201361030340195, -0.15708976984024048, 0.10936293005943298, 0.08540788292884827, 0.04785066843032837, -0.09545924514532089, 0.04815955087542534, -0.1311139166355133, -0.025347165763378143, 0.01143729779869318, 0.03649592772126198, -0.20996709167957306, 0.13330328464508057, 0.031276922672986984, 0.038560524582862854, -0.09767002612352371, 0.014048651792109013, 0.0625845417380333, -0.06576769053936005, 0.1597054898738861, -0.022523578256368637, 0.016839023679494858, 0.030909817665815353, -0.11722450703382492, 0.0171723160892725, 0.031000778079032898, -0.017634104937314987, 0.01007340382784605, 0.019117865711450577, 0.002307301154360175, -0.028706712648272514, 0.05042508989572525, -0.22118335962295532, -0.10966494679450989, 0.028093526139855385, 0.02455703355371952, 0.10553929209709167, -0.025788316503167152, -0.11088965088129044, -0.17781485617160797, 0.10463935881853104, -0.041385918855667114, 0.007060924544930458, -0.078105129301548, 0.09695982187986374, 0.030784741044044495, -0.015197200700640678, 0.028136657550930977, 0.07644262164831161, 0.13081029057502747, -0.056057076901197433, -0.0565827377140522, 0.06657896935939789, -0.10694736987352371, -0.11003901064395905, -0.01079224981367588, 0.19651272892951965, 0.1413070410490036, 0.04966600611805916, 0.07129484415054321, -0.01786479726433754, 0.036019280552864075, -0.06160217151045799, 0.09922041743993759, 0.10206104815006256, -0.07631750404834747, 0.10533318668603897, -0.001772386021912098, -0.3485141098499298, -0.12042173743247986, -0.022902145981788635, 0.18570426106452942, 0.09872020035982132, -0.015070200897753239, 0.149611696600914, 0.27635085582733154, -0.07270023971796036, -0.23937362432479858, 0.025750095024704933, 0.020118359476327896, 0.03617735207080841, 0.03344261273741722, -0.22970324754714966, 0.101526640355587, 0.05327271670103073, 0.0017784833908081055, -0.01167168840765953, -0.1857510209083557, -0.1341332346200943, 0.24065719544887543, -0.01762646622955799, 0.19407479465007782, -0.0006033391109667718, -0.0543241910636425, -0.06401921808719635, 0.0418260283768177, 0.044026024639606476, -0.11917994916439056, 0.0786496251821518, 0.07464874535799026, 0.0270317941904068, 0.023562820628285408, 0.031658969819545746, 0.06323438882827759, 0.09277799725532532, -0.016439910978078842, -0.05747034400701523, 0.06853899359703064, 0.02176739275455475, 0.05679096654057503, 0.09152548015117645, 0.07554183900356293, -0.048336368054151535, -0.02114647999405861, -0.0719008594751358, -0.07098604738712311, 0.09693410247564316, -0.0010323101887479424, -0.02875676192343235, 
-0.015052208676934242, 0.06272555887699127, -0.01812654174864292, 0.033376771956682205, -0.07162737846374512, -0.09887492656707764, 0.03527447581291199, 0.11205215007066727, 0.23429571092128754, -0.04074438288807869, 0.025022627785801888, -0.027424143627285957, -0.06677878648042679, 0.039636071771383286, 0.04220164567232132, 0.031095420941710472, 0.07524614036083221, 0.01667197048664093, 0.1094178557395935, -0.012077041901648045, -0.13785839080810547, 0.0843670591711998, 0.032368410378694534, -0.0817711278796196, -0.18316255509853363, -0.04635688289999962, 0.028180686756968498, 0.0018103907350450754, 0.037369001656770706, 0.2134772092103958, -0.036796119064092636, -0.05229761824011803, -0.024605967104434967, 0.06818275898694992, -0.03155616670846939, 0.09934159368276596, 0.04821155592799187, 0.006277676671743393, -0.13458821177482605, 0.08725312352180481, 0.06732743978500366, -0.06413918733596802, 0.10006023198366165, 0.0007733809761703014, -0.028062867000699043, -0.07339512556791306, -0.21856999397277832, 0.05104180425405502, 0.07754883170127869, -0.08986221998929977, -0.019113751128315926, -0.1344577968120575, 0.04386691749095917, 0.11492419987916946, 0.005225330125540495, -0.017261875793337822, -0.06260963529348373, -0.04892643541097641, 0.00018261719378642738, 0.08781338483095169, 0.06477411091327667, -0.07583963871002197, -0.09751420468091965, 0.0420602448284626, 0.030182166025042534, 0.06805441528558731, -0.028659669682383537, -0.10035325586795807, -0.09809695929288864, 0.016384374350309372, -0.19513222575187683, -0.033869389444589615, -0.1063782349228859, 0.0017921550897881389, -0.013493849895894527, -0.04880421608686447, -0.02954195812344551, 0.020775359123945236, -0.058114975690841675, 0.03232241049408913, 0.036321092396974564, 0.09736282378435135, -0.15181563794612885, 0.045087672770023346, 0.027407487854361534, -0.02867431938648224, 0.12991486489772797, 0.08706389367580414, -0.03685487434267998, 0.060352351516485214, -0.17951098084449768, -0.04756905138492584, 0.0701306089758873, 0.08079706132411957, 0.011345633305609226, -0.13820502161979675, 0.0020481212995946407, 0.02219325862824917, 0.0063291387632489204, -0.011471481062471867, 0.08940820395946503, -0.018985560163855553, 0.02514268271625042, -0.06692152470350266, -0.02715490758419037, -0.0172718595713377, 0.02798527479171753, 0.06741876155138016, 0.052665673196315765, 0.059515077620744705, -0.09257834404706955, 0.06400894373655319, -0.07308834046125412, 0.04101052135229111, -0.04338034987449646, -0.023549873381853104, 0.043564461171627045, -0.10066746920347214, 0.08506742864847183, -0.004899925552308559, 0.16756881773471832, 0.020505931228399277, 0.013724323362112045, 0.02510892041027546, -0.0627361610531807, -0.12713561952114105, 0.057153068482875824, -0.009891328401863575, 0.06479677557945251, -0.0015666215913370252, -0.06349411606788635, -0.015169945545494556, 0.07867703586816788, 0.16721220314502716, 0.056757744401693344, 0.09382777661085129, 0.07821270078420639, 0.024067075923085213, 0.14134950935840607, -0.0928492620587349, -0.07909213751554489, 0.03648320585489273, -0.12467160075902939, 0.08191469311714172, -0.07608694583177567, 0.0036201972980052233, 0.005537334363907576, -0.07902469485998154, 0.046342868357896805, -0.021532157436013222, -0.04588702693581581, -0.10122843086719513, -0.1055012196302414, -0.05062726140022278, -0.07763820141553879, -0.007004458457231522, -0.09157079458236694, 0.005540122743695974, -0.02243645489215851, 0.04131842777132988, -0.10101993381977081, 0.11597823351621628, -0.10552505403757095, 
-0.12595881521701813, 0.20029084384441376, -0.03754550591111183, -0.021690715104341507, -0.09630007296800613, 0.013886382803320885, 0.03633808717131615, 0.12864995002746582, 0.028475500643253326, 0.044243715703487396, -0.0551060251891613, 0.024407440796494484, -0.06887222081422806, -0.07468527555465698, 0.018647700548171997, -0.047322094440460205, -0.001051086699590087, 0.13952112197875977, 0.11507018655538559, -0.0345650240778923, 0.01648516021668911, 0.112763412296772, -0.03411584720015526, -0.015111943706870079, -0.1531372368335724, -0.024541698396205902, -0.018427986651659012, 0.03734752535820007, -0.044128451496362686, -0.09686581790447235, -0.014813930727541447, 0.19749325513839722, 0.16044066846370697, -0.07609286904335022, 0.035535622388124466, -0.007995209656655788, 0.006766351871192455, -0.0017425349215045571, 0.03390234708786011, 0.1330736130475998, 0.21037797629833221, -0.0024456053506582975, 0.0191776342689991, 0.00017352556460537016, -0.11239471286535263, -0.026247868314385414, 0.10028766095638275, -0.0037095188163220882, -0.02295452542603016, -0.008819041773676872, 0.17082500457763672, -0.19005201756954193, -0.22326987981796265, -0.11829821765422821, -0.09434051811695099, -0.13795453310012817, 0.0029517188668251038, -0.06102776899933815, 0.17974020540714264, 0.054044775664806366, -0.012237670831382275, -0.011819087900221348, 0.14312440156936646, 0.03593632951378822, -0.012094894424080849, -0.043441083282232285, 0.04735661298036575, -0.12003116309642792, 0.05207495763897896, -0.025086047127842903, 0.020713666453957558, 0.030547035858035088, 0.09304394572973251, -0.04161018133163452, 0.020809337496757507, 0.009344353340566158, 0.006061222404241562, 0.03014497458934784, 0.1345689743757248, 0.02047773264348507, 0.1613560914993286, 0.12707309424877167, -0.08649381250143051, 0.004914505872875452, 0.008898316882550716, -0.019000014290213585, -0.03634733706712723, 0.1249651312828064, -0.11848551779985428, 0.08623452484607697, 0.054342854768037796, -0.05412086471915245, -0.040627118200063705, -0.021529430523514748, 0.026008307933807373, -0.023990022018551826, -0.008105513639748096, -0.009982360526919365, -0.20990712940692902, 0.01197848841547966, -0.1115441769361496, 0.05498960241675377, -0.18665365874767303, -0.011220624670386314, 0.007360341027379036, -0.014963464811444283, -0.025389492511749268, 0.04608359932899475, 0.05211392790079117, -0.03989342972636223, -0.017735891044139862, 0.034290120005607605, 0.03697902709245682, 0.10634973645210266, -0.06871471554040909, -0.040511757135391235 ]
null
null
transformers
# Wav2Vec2-Base-VoxPopuli

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 10k unlabeled subset of the [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).

**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*

**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*

See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)

# Fine-Tuning

Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
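As a hedged illustration of that checkpoint swap, a minimal setup sketch is shown below. It assumes a `./vocab.json` file built for the target language as described in the blog post; the config overrides are the ones the blog uses and may need adjusting for your data.

```python
# Minimal fine-tuning setup sketch, assuming ./vocab.json was built for the
# target language as described in the blog post linked above.
from transformers import (
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)

# language-specific CTC tokenizer built from the assumed vocab.json
tokenizer = Wav2Vec2CTCTokenizer(
    "./vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)

# reuse the feature extractor settings shipped with this checkpoint
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-base-10k-voxpopuli")
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

# this checkpoint replaces "facebook/wav2vec2-large-xlsr-53" from the blog post;
# a fresh CTC head sized to the new vocabulary is initialized on top
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-base-10k-voxpopuli",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)

# common practice: keep the convolutional feature encoder frozen during fine-tuning
model.freeze_feature_extractor()
```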
{"language": "multilingual", "license": "cc-by-nc-4.0", "tags": ["audio", "automatic-speech-recognition", "voxpopuli"]}
automatic-speech-recognition
facebook/wav2vec2-base-10k-voxpopuli
[ "transformers", "pytorch", "wav2vec2", "pretraining", "audio", "automatic-speech-recognition", "voxpopuli", "multilingual", "arxiv:2101.00390", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:05+00:00
[ "2101.00390" ]
[ "multilingual" ]
TAGS #transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us
# Wav2Vec2-Base-VoxPopuli Facebook's Wav2Vec2 base model pretrained on the 10k unlabeled subset of VoxPopuli corpus. Paper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation Learning, Semi-Supervised Learning and Interpretation* Authors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI* See the official website for more information, here # Fine-Tuning Please refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '"facebook/wav2vec2-large-xlsr-53"' with this checkpoint for fine-tuning.
[ "# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n", "# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here", "# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]
[ 71, 133, 57 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #pretraining #audio #automatic-speech-recognition #voxpopuli #multilingual #arxiv-2101.00390 #license-cc-by-nc-4.0 #endpoints_compatible #region-us \n# Wav2Vec2-Base-VoxPopuli\n\nFacebook's Wav2Vec2 base model pretrained on the 10k unlabeled subset of VoxPopuli corpus.\n\nPaper: *VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation\nLearning, Semi-Supervised Learning and Interpretation*\n\nAuthors: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*\n\nSee the official website for more information, here# Fine-Tuning\n\nPlease refer to this blog on how to fine-tune this model on a specific language. Note that you should replace '\"facebook/wav2vec2-large-xlsr-53\"' with this checkpoint for fine-tuning." ]