sha
null | last_modified
null | library_name
stringclasses 154
values | text
stringlengths 1
900k
| metadata
stringlengths 2
348k
| pipeline_tag
stringclasses 45
values | id
stringlengths 5
122
| tags
listlengths 1
1.84k
| created_at
stringlengths 25
25
| arxiv
listlengths 0
201
| languages
listlengths 0
1.83k
| tags_str
stringlengths 17
9.34k
| text_str
stringlengths 0
389k
| text_lists
listlengths 0
722
| processed_texts
listlengths 1
723
| tokens_length
listlengths 1
723
| input_texts
listlengths 1
61
| embeddings
listlengths 768
768
|
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
null | null |
flair
|
## English Part-of-Speech Tagging in Flair (fast model)
This is the fast part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98,10** (Ontonotes)
Predicts fine-grained POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADD | Email |
|AFX | Affix |
|CC | Coordinating conjunction |
|CD | Cardinal number |
|DT | Determiner |
|EX | Existential there |
|FW | Foreign word |
|HYPH | Hyphen |
|IN | Preposition or subordinating conjunction |
|JJ | Adjective |
|JJR |Adjective, comparative |
|JJS | Adjective, superlative |
|LS | List item marker |
|MD | Modal |
|NFP | Superfluous punctuation |
|NN | Noun, singular or mass |
|NNP |Proper noun, singular |
|NNPS | Proper noun, plural |
|NNS |Noun, plural |
|PDT | Predeterminer |
|POS | Possessive ending |
|PRP | Personal pronoun |
|PRP$ | Possessive pronoun |
|RB | Adverb |
|RBR | Adverb, comparative |
|RBS | Adverb, superlative |
|RP | Particle |
|SYM | Symbol |
|TO | to |
|UH | Interjection |
|VB | Verb, base form |
|VBD | Verb, past tense |
|VBG | Verb, gerund or present participle |
|VBN | Verb, past participle |
|VBP | Verb, non-3rd person singular present |
|VBZ | Verb, 3rd person singular present |
|WDT | Wh-determiner |
|WP | Wh-pronoun |
|WP$ | Possessive wh-pronoun |
|WRB | Wh-adverb |
|XX | Unknown |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/pos-english-fast")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRP (1.0)]
Span [2]: "love" [− Labels: VBP (0.9998)]
Span [3]: "Berlin" [− Labels: NNP (0.9999)]
Span [4]: "." [− Labels: . (0.9998)]
```
So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'pos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/pos-english-fast',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": "en", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "widget": [{"text": "I love Berlin."}]}
|
token-classification
|
flair/pos-english-fast
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:ontonotes",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us
|
English Part-of-Speech Tagging in Flair (fast model)
----------------------------------------------------
This is the fast part-of-speech tagging model for English that ships with Flair.
F1-Score: 98,10 (Ontonotes)
Predicts fine-grained POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the word "*I*" is labeled as a pronoun (PRP), "*love*" is labeled as a verb (VBP) and "*Berlin*" is labeled as a proper noun (NNP) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
41,
97,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.09654349833726883,
0.02451726607978344,
-0.0009043450118042529,
0.10612541437149048,
0.057738520205020905,
0.06040659174323082,
0.002132750814780593,
0.07508105784654617,
0.08618345856666565,
0.08102487027645111,
0.1256256401538849,
-0.026729634031653404,
0.066706083714962,
0.15705415606498718,
0.055722109973430634,
-0.28822821378707886,
0.012350550852715969,
0.000387382140615955,
0.02420773357152939,
0.15740936994552612,
0.12913918495178223,
0.015539132989943027,
0.0006168977706693113,
-0.011361246928572655,
-0.09074467420578003,
0.008635882288217545,
-0.025122981518507004,
-0.022447116672992706,
0.14025990664958954,
-0.05348089337348938,
0.15944015979766846,
0.0026885056868195534,
0.06456459313631058,
-0.18054886162281036,
0.021371154114603996,
0.02195402793586254,
-0.05337318778038025,
0.05478101596236229,
0.01444229669868946,
-0.08794758468866348,
0.30248796939849854,
-0.10867413133382797,
-0.015168821439146996,
0.04804442822933197,
-0.15579603612422943,
-0.3037412464618683,
-0.060181617736816406,
0.18956506252288818,
0.032371580600738525,
0.08444575220346451,
-0.05128515511751175,
0.09979059547185898,
-0.10420067608356476,
0.007598940748721361,
0.05655868723988533,
-0.1893027126789093,
-0.058593764901161194,
0.08073531836271286,
0.07005283981561661,
0.047991376370191574,
-0.11503483355045319,
0.021208276972174644,
0.030959798023104668,
0.07127212733030319,
0.03718951344490051,
-0.012765012681484222,
0.08459179848432541,
0.012641940265893936,
-0.20550601184368134,
-0.09668470174074173,
0.2757447063922882,
0.042059119790792465,
-0.07436185330152512,
-0.02248149923980236,
-0.02811729535460472,
-0.12338393926620483,
0.02391539141535759,
-0.12532569468021393,
-0.05830457806587219,
0.013386699371039867,
0.13014355301856995,
-0.12159647047519684,
-0.10382846742868423,
-0.050668735057115555,
0.11408938467502594,
0.15825700759887695,
-0.03149263560771942,
0.0406000092625618,
-0.02833086997270584,
0.12537270784378052,
-0.11252626031637192,
-0.08531612157821655,
0.05594327300786972,
-0.08415671437978745,
-0.011642892844974995,
0.016787996515631676,
-0.04280257597565651,
0.01935560442507267,
-0.011219857260584831,
0.07230779528617859,
0.05083087086677551,
-0.052142683416604996,
0.05426621064543724,
0.07736276835203171,
0.06708814948797226,
0.16122998297214508,
-0.037757474929094315,
-0.052723705768585205,
-0.0010586639400571585,
0.0022147269919514656,
0.02142849750816822,
-0.014102272689342499,
-0.0960581824183464,
-0.05373915657401085,
0.027593662962317467,
0.02784627117216587,
-0.03391953557729721,
0.03857726603746414,
-0.09796261042356491,
0.026546524837613106,
0.01644781045615673,
-0.07551521062850952,
0.03395429253578186,
0.06350745260715485,
-0.07755459100008011,
0.04625697806477547,
0.09197364002466202,
-0.05438993498682976,
-0.03669838607311249,
0.09190564602613449,
-0.0583089254796505,
0.049307674169540405,
-0.12003470212221146,
-0.09019772708415985,
0.01456284150481224,
-0.06116273254156113,
-0.03431373089551926,
-0.1029582992196083,
-0.006102720275521278,
-0.0223750788718462,
0.031352449208498,
0.00643532071262598,
-0.01781802624464035,
-0.05993780121207237,
-0.06106599420309067,
-0.002907335991039872,
-0.006529592908918858,
0.012751425616443157,
-0.04651622474193573,
0.04817333444952965,
-0.03259122744202614,
0.06636903434991837,
-0.13654744625091553,
0.005392964463680983,
-0.13267304003238678,
0.04297686368227005,
-0.10289536416530609,
-0.018322864547371864,
-0.14377810060977936,
-0.061314042657613754,
-0.0737595334649086,
-0.08680371940135956,
0.005281387362629175,
-0.009370357729494572,
0.048375699669122696,
0.18667621910572052,
-0.33414775133132935,
-0.008304080925881863,
0.18613484501838684,
-0.1435844451189041,
-0.027398429811000824,
0.16836312413215637,
-0.06507764011621475,
0.15237785875797272,
0.021295564249157906,
0.1342548280954361,
0.08589435368776321,
-0.3144737780094147,
-0.060631487518548965,
0.08468030393123627,
-0.16800861060619354,
0.027717918157577515,
0.04919926077127457,
0.021558407694101334,
-0.18036223948001862,
0.016674183309078217,
-0.10283906757831573,
0.06786636263132095,
-0.008870590478181839,
-0.07240442931652069,
-0.03174148127436638,
0.01576855778694153,
0.10253316164016724,
0.01486112643033266,
-0.06717977672815323,
-0.06452478468418121,
-0.16437889635562897,
-0.08420775085687637,
0.038230665028095245,
0.05609584599733353,
0.011591781862080097,
0.007224091794341803,
0.1313849836587906,
0.09868127852678299,
-0.03599319979548454,
-0.10092776268720627,
0.016687555238604546,
-0.00775841623544693,
0.01622757315635681,
0.05020027235150337,
0.14975935220718384,
0.019004305824637413,
0.06313087046146393,
-0.002380733611062169,
-0.008359403349459171,
-0.1549115777015686,
0.03646942228078842,
0.0031268319580703974,
-0.03255760297179222,
-0.1151120513677597,
-0.09190405905246735,
0.14565791189670563,
-0.168509379029274,
0.0446762852370739,
0.055642008781433105,
0.03999912366271019,
-0.05713918060064316,
-0.003682720707729459,
0.014676885679364204,
0.0733199268579483,
0.0003182740183547139,
0.008870783261954784,
0.11005440354347229,
0.01023772731423378,
-0.08611954003572464,
-0.05896572396159172,
-0.011393795721232891,
-0.20722997188568115,
0.07721958309412003,
-0.019508209079504013,
-0.155054971575737,
-0.08550034463405609,
-0.0907987430691719,
0.038992077112197876,
-0.03032503090798855,
-0.06148061528801918,
0.16018302738666534,
0.024725226685404778,
0.023012328892946243,
-0.03957991674542427,
0.051015473902225494,
-0.017211923375725746,
-0.08594661951065063,
-0.09393284469842911,
0.09015396982431412,
0.05976274609565735,
0.02889120951294899,
0.0741429477930069,
0.1834300309419632,
-0.10664292424917221,
0.1424597054719925,
0.02599274553358555,
-0.004691032227128744,
-0.11297191679477692,
0.03626289591193199,
-0.008506099693477154,
0.2177244871854782,
-0.15187302231788635,
0.04808866232633591,
0.03297877311706543,
-0.018566610291600227,
0.02526535652577877,
-0.15987299382686615,
-0.05886440351605415,
-0.020777815952897072,
-0.05714540183544159,
-0.049372706562280655,
0.04334864020347595,
-0.03660037741065025,
0.10921653360128403,
-0.032665036618709564,
-0.08643388748168945,
0.036707013845443726,
0.026264691725373268,
-0.09122142195701599,
0.11506912112236023,
-0.06578247994184494,
-0.18402068316936493,
-0.08584919571876526,
0.019116245210170746,
-0.007954459637403488,
-0.003651122795417905,
-0.0035441608633846045,
-0.05468648299574852,
0.04360922425985336,
-0.005158815532922745,
0.1384381651878357,
-0.05026436224579811,
-0.06293123215436935,
-0.19210432469844818,
0.004383372142910957,
0.04731924086809158,
-0.10701151192188263,
-0.02777833864092827,
-0.05738640949130058,
0.06889387965202332,
0.051762934774160385,
0.03084980882704258,
0.14149446785449982,
0.10723143815994263,
-0.02345019392669201,
0.02403932623565197,
-0.03517106547951698,
0.28185921907424927,
-0.07797277718782425,
0.12484151870012283,
0.10822001844644547,
-0.07692208141088486,
0.06783422827720642,
0.14060775935649872,
0.02806658111512661,
-0.0602586567401886,
-0.018114034086465836,
-0.06367950886487961,
-0.10443444550037384,
-0.09897508472204208,
-0.16767092049121857,
-0.06922001391649246,
0.03728710487484932,
-0.003221469931304455,
0.031374748796224594,
-0.03586937487125397,
0.03503295034170151,
0.030463777482509613,
-0.13661573827266693,
-0.10727471113204956,
0.010272487066686153,
0.0679023340344429,
-0.07517021894454956,
-0.009674138389527798,
-0.010836930014193058,
-0.07045077532529831,
0.04880908131599426,
0.051277223974466324,
0.07959394156932831,
0.0325545035302639,
-0.050139568746089935,
0.10864761471748352,
0.19786830246448517,
0.0465998537838459,
0.07858366519212723,
0.037717241793870926,
-0.0283699631690979,
-0.026990706101059914,
-0.06295546889305115,
-0.013234889134764671,
0.07753997296094894,
0.09833000600337982,
0.0725872814655304,
-0.0653204619884491,
-0.09279774129390717,
0.07184170186519623,
-0.018181126564741135,
0.13479028642177582,
-0.15591229498386383,
-0.00037333040381781757,
0.008057261817157269,
0.08058270066976547,
-0.05667074769735336,
-0.01620633155107498,
0.07511929422616959,
-0.1135353296995163,
0.010974272154271603,
-0.04785867780447006,
0.05353822186589241,
0.022535737603902817,
0.024120861664414406,
-0.029906639829277992,
0.0963607206940651,
-0.03806338086724281,
0.14712293446063995,
-0.2504882216453552,
0.32345086336135864,
-0.023821182548999786,
0.07953835278749466,
-0.04686220735311508,
-0.019308047369122505,
0.020163122564554214,
0.057752158492803574,
0.23767049610614777,
0.027053238824009895,
-0.0716351792216301,
-0.06463108956813812,
-0.11577863991260529,
0.03547376021742821,
0.06753750890493393,
-0.012182130478322506,
0.035823188722133636,
0.08210675418376923,
0.030347539111971855,
0.002335169119760394,
0.24539044499397278,
-0.20588836073875427,
-0.049204014241695404,
-0.03580697998404503,
0.019155332818627357,
-0.018061399459838867,
0.005425512325018644,
-0.05693863704800606,
-0.018731724470853806,
0.05268546938896179,
0.006551346275955439,
0.0020378651097416878,
-0.1297585368156433,
0.11338187754154205,
-0.005597994197160006,
-0.04299185797572136,
-0.06513018161058426,
-0.011996760033071041,
0.01926562748849392,
-0.14023180305957794,
0.019839750602841377,
0.04445360228419304,
-0.07103235274553299,
-0.01697973720729351,
-0.06690482050180435,
0.14661693572998047,
0.11115468293428421,
0.06799653172492981,
0.00943867675960064,
0.015519805252552032,
0.00033329802681691945,
-0.1604759395122528,
0.16479939222335815,
-0.17816615104675293,
-0.061541326344013214,
0.004820319823920727,
-0.15174798667430878,
-0.12566150724887848,
-0.11129096895456314,
0.07165972143411636,
0.21848627924919128,
0.34441083669662476,
-0.1381617784500122,
0.08276300877332687,
0.03774351254105568,
-0.10525152087211609,
-0.2614096701145172,
-0.008403051644563675,
0.07403957843780518,
0.05510846525430679,
0.027985630556941032,
-0.16437183320522308,
0.020131828263401985,
0.0826042965054512,
-0.03619759902358055,
0.042953845113515854,
-0.2326297014951706,
-0.04053967446088791,
0.16857528686523438,
0.039376284927129745,
0.031595416367053986,
0.0166407972574234,
-0.012429691851139069,
0.003408466000109911,
-0.016964947804808617,
0.1501934677362442,
0.021021423861384392,
0.047888629138469696,
0.0017128053586930037,
0.014843103475868702,
0.0015150088584050536,
-0.05245388671755791,
0.16644679009914398,
0.08610314875841141,
0.06061932072043419,
-0.042651355266571045,
-0.13736571371555328,
0.018506653606891632,
-0.028785524889826775,
0.1248832643032074,
-0.03175545856356621,
-0.02318432182073593,
-0.20873819291591644,
-0.004652945324778557,
-0.15279674530029297,
0.23398935794830322,
-0.08352203667163849,
-0.11376859247684479,
-0.078595831990242,
0.08345093578100204,
-0.039002493023872375,
-0.041900940239429474,
-0.16808746755123138,
-0.05423648655414581,
0.04137036204338074,
0.14810499548912048,
-0.014193885959684849,
0.16433125734329224,
-0.14294368028640747,
0.06051809340715408,
-0.012386086396872997,
0.0892258882522583,
-0.04956945776939392,
-0.050376053899526596,
0.05334747955203056,
0.024356646463274956,
0.12530970573425293,
0.031355973333120346,
-0.1121717020869255,
0.008862817659974098,
0.07107852399349213,
-0.18576636910438538,
-0.11760594695806503,
-0.0498458594083786,
-0.11357173323631287,
-0.024456258863210678,
-0.11434551328420639,
0.05138155445456505,
-0.14052371680736542,
0.03305118903517723,
-0.040505558252334595,
-0.004637460224330425,
-0.15388861298561096,
-0.020517488941550255,
0.15572071075439453,
0.07224233448505402,
-0.009336553514003754,
0.019432656466960907,
0.036695435643196106,
-0.1170850396156311,
0.05184022709727287,
0.055045973509550095,
-0.06746392697095871,
-0.056511908769607544,
0.0006572363781742752,
0.22095002233982086,
0.042096588760614395,
-0.06470713019371033,
0.016990890726447105,
-0.08274456113576889,
-0.005894308909773827,
0.08140632510185242,
0.11069116741418839,
-0.031077589839696884,
-0.10129667818546295,
-0.01710035838186741,
-0.08388561010360718,
0.06377337872982025,
0.02357630431652069,
-0.034902531653642654,
-0.027710627764463425,
0.13747702538967133,
0.03130072355270386,
0.09343323111534119,
-0.059251818805933,
-0.06979189068078995,
-0.13671758770942688,
0.01035293098539114,
-0.02170216478407383,
-0.04980892688035965,
0.0011700709583237767,
-0.010582450777292252,
-0.026477504521608353,
-0.014173110947012901,
-0.021675467491149902,
-0.01563854329288006,
-0.07044845819473267,
0.017527468502521515,
0.003117381827905774,
0.08274634927511215,
-0.07639053463935852,
-0.01949647255241871,
0.09657271951436996,
-0.044197928160429,
0.05014793574810028,
0.07104749977588654,
-0.050664421170949936,
-0.015385539270937443,
-0.18006427586078644,
0.02523793652653694,
0.053399428725242615,
-0.017053650692105293,
0.05684219300746918,
-0.09865777939558029,
0.05568376183509827,
-0.038116205483675,
0.0067168124951422215,
0.021562810987234116,
0.11159685999155045,
-0.0969448909163475,
0.0718189999461174,
0.123601533472538,
-0.09850044548511505,
-0.029934177175164223,
0.01606467366218567,
0.05602497234940529,
0.037644580006599426,
0.16779550909996033,
-0.049599792808294296,
0.1392974704504013,
-0.10860351473093033,
-0.016151510179042816,
-0.020448386669158936,
-0.016196638345718384,
-0.06537719815969467,
-0.04365089163184166,
0.05683998391032219,
-0.007181486580520868,
0.13891403377056122,
0.12593284249305725,
-0.0539538636803627,
0.021485356613993645,
0.08599252998828888,
-0.050131794065237045,
-0.034561075270175934,
0.0971488356590271,
-0.02541561797261238,
-0.02621333859860897,
0.028755508363246918,
0.048197802156209946,
-0.009396148845553398,
0.11660394072532654,
0.2005600482225418,
0.12525087594985962,
0.06718727201223373,
0.09833662211894989,
0.06169101968407631,
-0.018017644062638283,
-0.10229471325874329,
-0.09245290607213974,
0.12380281090736389,
0.08228480070829391,
-0.0771845355629921,
0.05098697915673256,
0.01009625755250454,
-0.14183630049228668,
0.09136179089546204,
0.01469414122402668,
-0.11759254336357117,
-0.06328114122152328,
-0.19759276509284973,
0.038532208651304245,
-0.04842725396156311,
0.012505260296165943,
-0.020634649321436882,
-0.08328688144683838,
0.10383974760770798,
-0.020170828327536583,
-0.05189869552850723,
0.1260385513305664,
-0.07289645820856094,
-0.05965018644928932,
0.09597901254892349,
-0.006950587499886751,
0.04564748704433441,
-0.17123864591121674,
0.05047936365008354,
-0.022492270916700363,
-0.044763900339603424,
0.025936288759112358,
0.01981051079928875,
-0.00041560360114090145,
-0.04956124722957611,
-0.11493407934904099,
-0.06832985579967499,
-0.040348704904317856,
0.014911445789039135,
0.014735518023371696,
0.2118280678987503,
0.02940770424902439,
-0.024821728467941284,
-0.025838959962129593,
0.0918378159403801,
0.0019120767246931791,
-0.09625037759542465,
-0.11862639337778091,
0.2842627465724945,
-0.11010586470365524,
0.04139804095029831,
-0.05851386487483978,
-0.04215075820684433,
0.004858880303800106,
0.15505851805210114,
0.2257886379957199,
-0.05948816239833832,
0.005257703363895416,
-0.002079133875668049,
0.02702886052429676,
0.0480661503970623,
0.12033313512802124,
0.058334335684776306,
0.14797388017177582,
-0.13539579510688782,
0.06538469344377518,
-0.1882101148366928,
-0.05537115037441254,
-0.035184938460588455,
0.16595694422721863,
0.05321642383933067,
-0.06500408798456192,
-0.1100936308503151,
0.16626213490962982,
-0.1464977115392685,
-0.024177035316824913,
0.024810634553432465,
-0.022302672266960144,
-0.13736259937286377,
-0.012110835872590542,
0.10356265306472778,
0.054903995245695114,
0.13429483771324158,
-0.06638762354850769,
-0.059239376336336136,
0.16335146129131317,
-0.02582857385277748,
-0.05359547212719917,
-0.07328547537326813,
0.06262251734733582,
-0.12775519490242004,
0.2271105796098709,
-0.00813837070018053,
0.1494491994380951,
0.07310298085212708,
0.011424229480326176,
-0.06051492318511009,
0.11930312216281891,
0.05867581441998482,
0.054500021040439606,
-0.08212697505950928,
0.027891799807548523,
-0.008054145611822605,
-0.04579014703631401,
0.01131847407668829,
-0.14373335242271423,
0.09121666848659515,
-0.012667728587985039,
-0.06842784583568573,
-0.08423512428998947,
0.10208429396152496,
-0.04959965869784355,
0.09512639790773392,
0.18378540873527527,
-0.03132130950689316,
-0.039014026522636414,
-0.09134366363286972,
0.020749062299728394,
0.06847763806581497,
0.023359328508377075,
0.022349465638399124,
-0.0546804741024971,
-0.026085106655955315,
0.11425512284040451,
-0.06914772093296051,
-0.08983386307954788,
-0.06924548000097275,
-0.028479956090450287,
-0.024734703823924065,
0.031307365745306015,
0.02865668013691902,
-0.024646908044815063,
0.07564270496368408,
-0.0010594041086733341,
0.033643629401922226,
0.0074340240098536015,
0.08209767192602158,
-0.07868613302707672,
-0.0454300194978714
] |
null | null |
flair
|
## English Part-of-Speech Tagging in Flair (default model)
This is the standard part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98,19** (Ontonotes)
Predicts fine-grained POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADD | Email |
|AFX | Affix |
|CC | Coordinating conjunction |
|CD | Cardinal number |
|DT | Determiner |
|EX | Existential there |
|FW | Foreign word |
|HYPH | Hyphen |
|IN | Preposition or subordinating conjunction |
|JJ | Adjective |
|JJR |Adjective, comparative |
|JJS | Adjective, superlative |
|LS | List item marker |
|MD | Modal |
|NFP | Superfluous punctuation |
|NN | Noun, singular or mass |
|NNP |Proper noun, singular |
|NNPS | Proper noun, plural |
|NNS |Noun, plural |
|PDT | Predeterminer |
|POS | Possessive ending |
|PRP | Personal pronoun |
|PRP$ | Possessive pronoun |
|RB | Adverb |
|RBR | Adverb, comparative |
|RBS | Adverb, superlative |
|RP | Particle |
|SYM | Symbol |
|TO | to |
|UH | Interjection |
|VB | Verb, base form |
|VBD | Verb, past tense |
|VBG | Verb, gerund or present participle |
|VBN | Verb, past participle |
|VBP | Verb, non-3rd person singular present |
|VBZ | Verb, 3rd person singular present |
|WDT | Wh-determiner |
|WP | Wh-pronoun |
|WP$ | Possessive wh-pronoun |
|WRB | Wh-adverb |
|XX | Unknown |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/pos-english")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRP (1.0)]
Span [2]: "love" [− Labels: VBP (1.0)]
Span [3]: "Berlin" [− Labels: NNP (0.9999)]
Span [4]: "." [− Labels: . (1.0)]
```
So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'pos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/pos-english',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": "en", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "inference": false}
|
token-classification
|
flair/pos-english
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:ontonotes",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us
|
English Part-of-Speech Tagging in Flair (default model)
-------------------------------------------------------
This is the standard part-of-speech tagging model for English that ships with Flair.
F1-Score: 98,19 (Ontonotes)
Predicts fine-grained POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the word "*I*" is labeled as a pronoun (PRP), "*love*" is labeled as a verb (VBP) and "*Berlin*" is labeled as a proper noun (NNP) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
41,
97,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #has_space #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRP), \"*love*\" is labeled as a verb (VBP) and \"*Berlin*\" is labeled as a proper noun (NNP) in the sentence \"*I love Berlin*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.09654349833726883,
0.02451726607978344,
-0.0009043450118042529,
0.10612541437149048,
0.057738520205020905,
0.06040659174323082,
0.002132750814780593,
0.07508105784654617,
0.08618345856666565,
0.08102487027645111,
0.1256256401538849,
-0.026729634031653404,
0.066706083714962,
0.15705415606498718,
0.055722109973430634,
-0.28822821378707886,
0.012350550852715969,
0.000387382140615955,
0.02420773357152939,
0.15740936994552612,
0.12913918495178223,
0.015539132989943027,
0.0006168977706693113,
-0.011361246928572655,
-0.09074467420578003,
0.008635882288217545,
-0.025122981518507004,
-0.022447116672992706,
0.14025990664958954,
-0.05348089337348938,
0.15944015979766846,
0.0026885056868195534,
0.06456459313631058,
-0.18054886162281036,
0.021371154114603996,
0.02195402793586254,
-0.05337318778038025,
0.05478101596236229,
0.01444229669868946,
-0.08794758468866348,
0.30248796939849854,
-0.10867413133382797,
-0.015168821439146996,
0.04804442822933197,
-0.15579603612422943,
-0.3037412464618683,
-0.060181617736816406,
0.18956506252288818,
0.032371580600738525,
0.08444575220346451,
-0.05128515511751175,
0.09979059547185898,
-0.10420067608356476,
0.007598940748721361,
0.05655868723988533,
-0.1893027126789093,
-0.058593764901161194,
0.08073531836271286,
0.07005283981561661,
0.047991376370191574,
-0.11503483355045319,
0.021208276972174644,
0.030959798023104668,
0.07127212733030319,
0.03718951344490051,
-0.012765012681484222,
0.08459179848432541,
0.012641940265893936,
-0.20550601184368134,
-0.09668470174074173,
0.2757447063922882,
0.042059119790792465,
-0.07436185330152512,
-0.02248149923980236,
-0.02811729535460472,
-0.12338393926620483,
0.02391539141535759,
-0.12532569468021393,
-0.05830457806587219,
0.013386699371039867,
0.13014355301856995,
-0.12159647047519684,
-0.10382846742868423,
-0.050668735057115555,
0.11408938467502594,
0.15825700759887695,
-0.03149263560771942,
0.0406000092625618,
-0.02833086997270584,
0.12537270784378052,
-0.11252626031637192,
-0.08531612157821655,
0.05594327300786972,
-0.08415671437978745,
-0.011642892844974995,
0.016787996515631676,
-0.04280257597565651,
0.01935560442507267,
-0.011219857260584831,
0.07230779528617859,
0.05083087086677551,
-0.052142683416604996,
0.05426621064543724,
0.07736276835203171,
0.06708814948797226,
0.16122998297214508,
-0.037757474929094315,
-0.052723705768585205,
-0.0010586639400571585,
0.0022147269919514656,
0.02142849750816822,
-0.014102272689342499,
-0.0960581824183464,
-0.05373915657401085,
0.027593662962317467,
0.02784627117216587,
-0.03391953557729721,
0.03857726603746414,
-0.09796261042356491,
0.026546524837613106,
0.01644781045615673,
-0.07551521062850952,
0.03395429253578186,
0.06350745260715485,
-0.07755459100008011,
0.04625697806477547,
0.09197364002466202,
-0.05438993498682976,
-0.03669838607311249,
0.09190564602613449,
-0.0583089254796505,
0.049307674169540405,
-0.12003470212221146,
-0.09019772708415985,
0.01456284150481224,
-0.06116273254156113,
-0.03431373089551926,
-0.1029582992196083,
-0.006102720275521278,
-0.0223750788718462,
0.031352449208498,
0.00643532071262598,
-0.01781802624464035,
-0.05993780121207237,
-0.06106599420309067,
-0.002907335991039872,
-0.006529592908918858,
0.012751425616443157,
-0.04651622474193573,
0.04817333444952965,
-0.03259122744202614,
0.06636903434991837,
-0.13654744625091553,
0.005392964463680983,
-0.13267304003238678,
0.04297686368227005,
-0.10289536416530609,
-0.018322864547371864,
-0.14377810060977936,
-0.061314042657613754,
-0.0737595334649086,
-0.08680371940135956,
0.005281387362629175,
-0.009370357729494572,
0.048375699669122696,
0.18667621910572052,
-0.33414775133132935,
-0.008304080925881863,
0.18613484501838684,
-0.1435844451189041,
-0.027398429811000824,
0.16836312413215637,
-0.06507764011621475,
0.15237785875797272,
0.021295564249157906,
0.1342548280954361,
0.08589435368776321,
-0.3144737780094147,
-0.060631487518548965,
0.08468030393123627,
-0.16800861060619354,
0.027717918157577515,
0.04919926077127457,
0.021558407694101334,
-0.18036223948001862,
0.016674183309078217,
-0.10283906757831573,
0.06786636263132095,
-0.008870590478181839,
-0.07240442931652069,
-0.03174148127436638,
0.01576855778694153,
0.10253316164016724,
0.01486112643033266,
-0.06717977672815323,
-0.06452478468418121,
-0.16437889635562897,
-0.08420775085687637,
0.038230665028095245,
0.05609584599733353,
0.011591781862080097,
0.007224091794341803,
0.1313849836587906,
0.09868127852678299,
-0.03599319979548454,
-0.10092776268720627,
0.016687555238604546,
-0.00775841623544693,
0.01622757315635681,
0.05020027235150337,
0.14975935220718384,
0.019004305824637413,
0.06313087046146393,
-0.002380733611062169,
-0.008359403349459171,
-0.1549115777015686,
0.03646942228078842,
0.0031268319580703974,
-0.03255760297179222,
-0.1151120513677597,
-0.09190405905246735,
0.14565791189670563,
-0.168509379029274,
0.0446762852370739,
0.055642008781433105,
0.03999912366271019,
-0.05713918060064316,
-0.003682720707729459,
0.014676885679364204,
0.0733199268579483,
0.0003182740183547139,
0.008870783261954784,
0.11005440354347229,
0.01023772731423378,
-0.08611954003572464,
-0.05896572396159172,
-0.011393795721232891,
-0.20722997188568115,
0.07721958309412003,
-0.019508209079504013,
-0.155054971575737,
-0.08550034463405609,
-0.0907987430691719,
0.038992077112197876,
-0.03032503090798855,
-0.06148061528801918,
0.16018302738666534,
0.024725226685404778,
0.023012328892946243,
-0.03957991674542427,
0.051015473902225494,
-0.017211923375725746,
-0.08594661951065063,
-0.09393284469842911,
0.09015396982431412,
0.05976274609565735,
0.02889120951294899,
0.0741429477930069,
0.1834300309419632,
-0.10664292424917221,
0.1424597054719925,
0.02599274553358555,
-0.004691032227128744,
-0.11297191679477692,
0.03626289591193199,
-0.008506099693477154,
0.2177244871854782,
-0.15187302231788635,
0.04808866232633591,
0.03297877311706543,
-0.018566610291600227,
0.02526535652577877,
-0.15987299382686615,
-0.05886440351605415,
-0.020777815952897072,
-0.05714540183544159,
-0.049372706562280655,
0.04334864020347595,
-0.03660037741065025,
0.10921653360128403,
-0.032665036618709564,
-0.08643388748168945,
0.036707013845443726,
0.026264691725373268,
-0.09122142195701599,
0.11506912112236023,
-0.06578247994184494,
-0.18402068316936493,
-0.08584919571876526,
0.019116245210170746,
-0.007954459637403488,
-0.003651122795417905,
-0.0035441608633846045,
-0.05468648299574852,
0.04360922425985336,
-0.005158815532922745,
0.1384381651878357,
-0.05026436224579811,
-0.06293123215436935,
-0.19210432469844818,
0.004383372142910957,
0.04731924086809158,
-0.10701151192188263,
-0.02777833864092827,
-0.05738640949130058,
0.06889387965202332,
0.051762934774160385,
0.03084980882704258,
0.14149446785449982,
0.10723143815994263,
-0.02345019392669201,
0.02403932623565197,
-0.03517106547951698,
0.28185921907424927,
-0.07797277718782425,
0.12484151870012283,
0.10822001844644547,
-0.07692208141088486,
0.06783422827720642,
0.14060775935649872,
0.02806658111512661,
-0.0602586567401886,
-0.018114034086465836,
-0.06367950886487961,
-0.10443444550037384,
-0.09897508472204208,
-0.16767092049121857,
-0.06922001391649246,
0.03728710487484932,
-0.003221469931304455,
0.031374748796224594,
-0.03586937487125397,
0.03503295034170151,
0.030463777482509613,
-0.13661573827266693,
-0.10727471113204956,
0.010272487066686153,
0.0679023340344429,
-0.07517021894454956,
-0.009674138389527798,
-0.010836930014193058,
-0.07045077532529831,
0.04880908131599426,
0.051277223974466324,
0.07959394156932831,
0.0325545035302639,
-0.050139568746089935,
0.10864761471748352,
0.19786830246448517,
0.0465998537838459,
0.07858366519212723,
0.037717241793870926,
-0.0283699631690979,
-0.026990706101059914,
-0.06295546889305115,
-0.013234889134764671,
0.07753997296094894,
0.09833000600337982,
0.0725872814655304,
-0.0653204619884491,
-0.09279774129390717,
0.07184170186519623,
-0.018181126564741135,
0.13479028642177582,
-0.15591229498386383,
-0.00037333040381781757,
0.008057261817157269,
0.08058270066976547,
-0.05667074769735336,
-0.01620633155107498,
0.07511929422616959,
-0.1135353296995163,
0.010974272154271603,
-0.04785867780447006,
0.05353822186589241,
0.022535737603902817,
0.024120861664414406,
-0.029906639829277992,
0.0963607206940651,
-0.03806338086724281,
0.14712293446063995,
-0.2504882216453552,
0.32345086336135864,
-0.023821182548999786,
0.07953835278749466,
-0.04686220735311508,
-0.019308047369122505,
0.020163122564554214,
0.057752158492803574,
0.23767049610614777,
0.027053238824009895,
-0.0716351792216301,
-0.06463108956813812,
-0.11577863991260529,
0.03547376021742821,
0.06753750890493393,
-0.012182130478322506,
0.035823188722133636,
0.08210675418376923,
0.030347539111971855,
0.002335169119760394,
0.24539044499397278,
-0.20588836073875427,
-0.049204014241695404,
-0.03580697998404503,
0.019155332818627357,
-0.018061399459838867,
0.005425512325018644,
-0.05693863704800606,
-0.018731724470853806,
0.05268546938896179,
0.006551346275955439,
0.0020378651097416878,
-0.1297585368156433,
0.11338187754154205,
-0.005597994197160006,
-0.04299185797572136,
-0.06513018161058426,
-0.011996760033071041,
0.01926562748849392,
-0.14023180305957794,
0.019839750602841377,
0.04445360228419304,
-0.07103235274553299,
-0.01697973720729351,
-0.06690482050180435,
0.14661693572998047,
0.11115468293428421,
0.06799653172492981,
0.00943867675960064,
0.015519805252552032,
0.00033329802681691945,
-0.1604759395122528,
0.16479939222335815,
-0.17816615104675293,
-0.061541326344013214,
0.004820319823920727,
-0.15174798667430878,
-0.12566150724887848,
-0.11129096895456314,
0.07165972143411636,
0.21848627924919128,
0.34441083669662476,
-0.1381617784500122,
0.08276300877332687,
0.03774351254105568,
-0.10525152087211609,
-0.2614096701145172,
-0.008403051644563675,
0.07403957843780518,
0.05510846525430679,
0.027985630556941032,
-0.16437183320522308,
0.020131828263401985,
0.0826042965054512,
-0.03619759902358055,
0.042953845113515854,
-0.2326297014951706,
-0.04053967446088791,
0.16857528686523438,
0.039376284927129745,
0.031595416367053986,
0.0166407972574234,
-0.012429691851139069,
0.003408466000109911,
-0.016964947804808617,
0.1501934677362442,
0.021021423861384392,
0.047888629138469696,
0.0017128053586930037,
0.014843103475868702,
0.0015150088584050536,
-0.05245388671755791,
0.16644679009914398,
0.08610314875841141,
0.06061932072043419,
-0.042651355266571045,
-0.13736571371555328,
0.018506653606891632,
-0.028785524889826775,
0.1248832643032074,
-0.03175545856356621,
-0.02318432182073593,
-0.20873819291591644,
-0.004652945324778557,
-0.15279674530029297,
0.23398935794830322,
-0.08352203667163849,
-0.11376859247684479,
-0.078595831990242,
0.08345093578100204,
-0.039002493023872375,
-0.041900940239429474,
-0.16808746755123138,
-0.05423648655414581,
0.04137036204338074,
0.14810499548912048,
-0.014193885959684849,
0.16433125734329224,
-0.14294368028640747,
0.06051809340715408,
-0.012386086396872997,
0.0892258882522583,
-0.04956945776939392,
-0.050376053899526596,
0.05334747955203056,
0.024356646463274956,
0.12530970573425293,
0.031355973333120346,
-0.1121717020869255,
0.008862817659974098,
0.07107852399349213,
-0.18576636910438538,
-0.11760594695806503,
-0.0498458594083786,
-0.11357173323631287,
-0.024456258863210678,
-0.11434551328420639,
0.05138155445456505,
-0.14052371680736542,
0.03305118903517723,
-0.040505558252334595,
-0.004637460224330425,
-0.15388861298561096,
-0.020517488941550255,
0.15572071075439453,
0.07224233448505402,
-0.009336553514003754,
0.019432656466960907,
0.036695435643196106,
-0.1170850396156311,
0.05184022709727287,
0.055045973509550095,
-0.06746392697095871,
-0.056511908769607544,
0.0006572363781742752,
0.22095002233982086,
0.042096588760614395,
-0.06470713019371033,
0.016990890726447105,
-0.08274456113576889,
-0.005894308909773827,
0.08140632510185242,
0.11069116741418839,
-0.031077589839696884,
-0.10129667818546295,
-0.01710035838186741,
-0.08388561010360718,
0.06377337872982025,
0.02357630431652069,
-0.034902531653642654,
-0.027710627764463425,
0.13747702538967133,
0.03130072355270386,
0.09343323111534119,
-0.059251818805933,
-0.06979189068078995,
-0.13671758770942688,
0.01035293098539114,
-0.02170216478407383,
-0.04980892688035965,
0.0011700709583237767,
-0.010582450777292252,
-0.026477504521608353,
-0.014173110947012901,
-0.021675467491149902,
-0.01563854329288006,
-0.07044845819473267,
0.017527468502521515,
0.003117381827905774,
0.08274634927511215,
-0.07639053463935852,
-0.01949647255241871,
0.09657271951436996,
-0.044197928160429,
0.05014793574810028,
0.07104749977588654,
-0.050664421170949936,
-0.015385539270937443,
-0.18006427586078644,
0.02523793652653694,
0.053399428725242615,
-0.017053650692105293,
0.05684219300746918,
-0.09865777939558029,
0.05568376183509827,
-0.038116205483675,
0.0067168124951422215,
0.021562810987234116,
0.11159685999155045,
-0.0969448909163475,
0.0718189999461174,
0.123601533472538,
-0.09850044548511505,
-0.029934177175164223,
0.01606467366218567,
0.05602497234940529,
0.037644580006599426,
0.16779550909996033,
-0.049599792808294296,
0.1392974704504013,
-0.10860351473093033,
-0.016151510179042816,
-0.020448386669158936,
-0.016196638345718384,
-0.06537719815969467,
-0.04365089163184166,
0.05683998391032219,
-0.007181486580520868,
0.13891403377056122,
0.12593284249305725,
-0.0539538636803627,
0.021485356613993645,
0.08599252998828888,
-0.050131794065237045,
-0.034561075270175934,
0.0971488356590271,
-0.02541561797261238,
-0.02621333859860897,
0.028755508363246918,
0.048197802156209946,
-0.009396148845553398,
0.11660394072532654,
0.2005600482225418,
0.12525087594985962,
0.06718727201223373,
0.09833662211894989,
0.06169101968407631,
-0.018017644062638283,
-0.10229471325874329,
-0.09245290607213974,
0.12380281090736389,
0.08228480070829391,
-0.0771845355629921,
0.05098697915673256,
0.01009625755250454,
-0.14183630049228668,
0.09136179089546204,
0.01469414122402668,
-0.11759254336357117,
-0.06328114122152328,
-0.19759276509284973,
0.038532208651304245,
-0.04842725396156311,
0.012505260296165943,
-0.020634649321436882,
-0.08328688144683838,
0.10383974760770798,
-0.020170828327536583,
-0.05189869552850723,
0.1260385513305664,
-0.07289645820856094,
-0.05965018644928932,
0.09597901254892349,
-0.006950587499886751,
0.04564748704433441,
-0.17123864591121674,
0.05047936365008354,
-0.022492270916700363,
-0.044763900339603424,
0.025936288759112358,
0.01981051079928875,
-0.00041560360114090145,
-0.04956124722957611,
-0.11493407934904099,
-0.06832985579967499,
-0.040348704904317856,
0.014911445789039135,
0.014735518023371696,
0.2118280678987503,
0.02940770424902439,
-0.024821728467941284,
-0.025838959962129593,
0.0918378159403801,
0.0019120767246931791,
-0.09625037759542465,
-0.11862639337778091,
0.2842627465724945,
-0.11010586470365524,
0.04139804095029831,
-0.05851386487483978,
-0.04215075820684433,
0.004858880303800106,
0.15505851805210114,
0.2257886379957199,
-0.05948816239833832,
0.005257703363895416,
-0.002079133875668049,
0.02702886052429676,
0.0480661503970623,
0.12033313512802124,
0.058334335684776306,
0.14797388017177582,
-0.13539579510688782,
0.06538469344377518,
-0.1882101148366928,
-0.05537115037441254,
-0.035184938460588455,
0.16595694422721863,
0.05321642383933067,
-0.06500408798456192,
-0.1100936308503151,
0.16626213490962982,
-0.1464977115392685,
-0.024177035316824913,
0.024810634553432465,
-0.022302672266960144,
-0.13736259937286377,
-0.012110835872590542,
0.10356265306472778,
0.054903995245695114,
0.13429483771324158,
-0.06638762354850769,
-0.059239376336336136,
0.16335146129131317,
-0.02582857385277748,
-0.05359547212719917,
-0.07328547537326813,
0.06262251734733582,
-0.12775519490242004,
0.2271105796098709,
-0.00813837070018053,
0.1494491994380951,
0.07310298085212708,
0.011424229480326176,
-0.06051492318511009,
0.11930312216281891,
0.05867581441998482,
0.054500021040439606,
-0.08212697505950928,
0.027891799807548523,
-0.008054145611822605,
-0.04579014703631401,
0.01131847407668829,
-0.14373335242271423,
0.09121666848659515,
-0.012667728587985039,
-0.06842784583568573,
-0.08423512428998947,
0.10208429396152496,
-0.04959965869784355,
0.09512639790773392,
0.18378540873527527,
-0.03132130950689316,
-0.039014026522636414,
-0.09134366363286972,
0.020749062299728394,
0.06847763806581497,
0.023359328508377075,
0.022349465638399124,
-0.0546804741024971,
-0.026085106655955315,
0.11425512284040451,
-0.06914772093296051,
-0.08983386307954788,
-0.06924548000097275,
-0.028479956090450287,
-0.024734703823924065,
0.031307365745306015,
0.02865668013691902,
-0.024646908044815063,
0.07564270496368408,
-0.0010594041086733341,
0.033643629401922226,
0.0074340240098536015,
0.08209767192602158,
-0.07868613302707672,
-0.0454300194978714
] |
null | null |
flair
|
## English Universal Part-of-Speech Tagging in Flair (fast model)
This is the fast universal part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98,47** (Ontonotes)
Predicts universal POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/upos-english-fast")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRON (0.9996)]
Span [2]: "love" [− Labels: VERB (1.0)]
Span [3]: "Berlin" [− Labels: PROPN (0.9986)]
Span [4]: "." [− Labels: PUNCT (1.0)]
```
So, the word "*I*" is labeled as a **pronoun** (PRON), "*love*" is labeled as a **verb** (VERB) and "*Berlin*" is labeled as a **proper noun** (PROPN) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'upos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward-fast'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward-fast'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/upos-english-fast',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": "en", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "widget": [{"text": "I love Berlin."}]}
|
token-classification
|
flair/upos-english-fast
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:ontonotes",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us
|
English Universal Part-of-Speech Tagging in Flair (fast model)
--------------------------------------------------------------
This is the fast universal part-of-speech tagging model for English that ships with Flair.
F1-Score: 98,47 (Ontonotes)
Predicts universal POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the word "*I*" is labeled as a pronoun (PRON), "*love*" is labeled as a verb (VERB) and "*Berlin*" is labeled as a proper noun (PROPN) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
37,
97,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.0898815169930458,
0.023771122097969055,
-0.001487379427999258,
0.10299017280340195,
0.052614711225032806,
0.06480679661035538,
0.038192663341760635,
0.04685518518090248,
0.11425352096557617,
0.05003012716770172,
0.11690953373908997,
-0.031146982684731483,
0.07001783698797226,
0.13879141211509705,
0.04326340928673744,
-0.2914661169052124,
0.020744502544403076,
-0.011923144571483135,
0.01169240940362215,
0.153060182929039,
0.1331934630870819,
0.02655882015824318,
0.011601182632148266,
-0.008133061230182648,
-0.08580024540424347,
0.014900912530720234,
-0.013496975414454937,
-0.01303007360547781,
0.15075726807117462,
-0.05623955279588699,
0.16982485353946686,
-0.016898274421691895,
0.08262526243925095,
-0.18445122241973877,
0.021106772124767303,
0.025009935721755028,
-0.04166584089398384,
0.028432374820113182,
0.0059976824559271336,
-0.09080269932746887,
0.30034705996513367,
-0.06705763936042786,
0.001381663023494184,
0.03998275473713875,
-0.15722939372062683,
-0.2496788650751114,
-0.06137925386428833,
0.17162500321865082,
0.026198578998446465,
0.11042067408561707,
-0.040206968784332275,
0.10602319985628128,
-0.11377564817667007,
0.006259925663471222,
0.024749480187892914,
-0.1585025042295456,
-0.05927029252052307,
0.042430490255355835,
0.055525388568639755,
0.05056805908679962,
-0.11841873079538345,
0.01349883433431387,
0.030059687793254852,
0.06374078243970871,
0.024612532928586006,
-0.035108014941215515,
0.09984966367483139,
-0.005662199575453997,
-0.2025691568851471,
-0.0728834941983223,
0.2694748342037201,
0.04430845007300377,
-0.09202708303928375,
0.0008329666452482343,
-0.02826727367937565,
-0.09516042470932007,
0.03174348548054695,
-0.1292840540409088,
-0.04425473138689995,
0.014057469554245472,
0.1462254375219345,
-0.12285458296537399,
-0.09163574874401093,
-0.06660646200180054,
0.10284695029258728,
0.17492416501045227,
-0.016730215400457382,
0.03591138496994972,
-0.03977494686841965,
0.10094918310642242,
-0.1143990159034729,
-0.08769087493419647,
0.04054101184010506,
-0.08084934949874878,
-0.02886119857430458,
0.0008614523685537279,
-0.034494396299123764,
0.0006567630334757268,
0.00006497783033410087,
0.09852224588394165,
0.05847626551985741,
-0.04213908314704895,
0.04860220104455948,
0.06352544575929642,
0.06965192407369614,
0.1789100617170334,
-0.017274443060159683,
-0.06970396637916565,
-0.0047951312735676765,
-0.002610258525237441,
0.03152114525437355,
-0.00793085154145956,
-0.09801054000854492,
-0.047128330916166306,
0.022331377491354942,
0.023647412657737732,
-0.06088045611977577,
0.0346975214779377,
-0.10376155376434326,
0.02789650671184063,
-0.01456197164952755,
-0.0741216316819191,
0.018752500414848328,
0.05458265170454979,
-0.07495538890361786,
0.07906625419855118,
0.06753302365541458,
-0.02416156232357025,
-0.04508918523788452,
0.0970316156744957,
-0.05162978544831276,
0.06319284439086914,
-0.1218646764755249,
-0.08501695096492767,
0.0061088851653039455,
-0.08302149921655655,
-0.0285832230001688,
-0.10237408429384232,
-0.007083098404109478,
-0.03640022873878479,
0.051680486649274826,
0.004662372637540102,
-0.014351153746247292,
-0.08431732654571533,
-0.042870696634054184,
-0.008128315210342407,
-0.0025762321893125772,
-0.018542950972914696,
-0.04570265859365463,
0.032436639070510864,
-0.026661740615963936,
0.03598421439528465,
-0.15958154201507568,
-0.010902747511863708,
-0.1512676477432251,
0.033837802708148956,
-0.1146857813000679,
-0.0249230545014143,
-0.13525724411010742,
-0.04884737730026245,
-0.09263661503791809,
-0.07050321251153946,
0.010233508422970772,
-0.008764013648033142,
0.04503672569990158,
0.19616882503032684,
-0.30200710892677307,
-0.008923140354454517,
0.19682587683200836,
-0.17516162991523743,
-0.029986577108502388,
0.17988649010658264,
-0.07246404141187668,
0.16754749417304993,
0.04123549163341522,
0.12043961882591248,
0.08837124705314636,
-0.3562321066856384,
-0.018417008221149445,
0.08611693233251572,
-0.15958347916603088,
0.03799496591091156,
0.050578195601701736,
-0.019890479743480682,
-0.20920144021511078,
0.015433547087013721,
-0.14237040281295776,
0.07827798277139664,
0.0028234149795025587,
-0.06630746275186539,
-0.019352389499545097,
0.022131022065877914,
0.07748380303382874,
0.010248089209198952,
-0.06287945061922073,
-0.054252997040748596,
-0.1499505341053009,
-0.116937056183815,
0.03764430060982704,
0.04879239574074745,
0.017393972724676132,
-0.0015096609713509679,
0.10221609473228455,
0.12539555132389069,
-0.03485417738556862,
-0.09814292192459106,
0.0340767577290535,
-0.006666012108325958,
0.03218095377087593,
0.049701984971761703,
0.11170683801174164,
0.02142612636089325,
0.06940871477127075,
0.0057291206903755665,
-0.010342957451939583,
-0.14541436731815338,
0.03992427885532379,
0.0030358564108610153,
-0.04804525896906853,
-0.111871637403965,
-0.08093813061714172,
0.17679722607135773,
-0.1533837467432022,
0.03716573491692543,
0.028068674728274345,
0.015247117727994919,
-0.050416089594364166,
-0.0023662890307605267,
-0.00413854792714119,
0.07998897135257721,
-0.007735489401966333,
0.008942054584622383,
0.12087557464838028,
0.010936119593679905,
-0.10252264142036438,
-0.07460156828165054,
0.01487889513373375,
-0.212365984916687,
0.05623725801706314,
-0.01881849765777588,
-0.1642047017812729,
-0.12440452724695206,
-0.08044865727424622,
0.038893137127161026,
-0.039356883615255356,
-0.06462139636278152,
0.20748431980609894,
0.05073857307434082,
0.01694762520492077,
-0.022073592990636826,
0.06901208311319351,
-0.03121216967701912,
-0.10157162696123123,
-0.08482599258422852,
0.07279036194086075,
0.09494420140981674,
0.015534200705587864,
0.05376916378736496,
0.13386395573616028,
-0.11451682448387146,
0.12835155427455902,
0.01628490351140499,
0.008541291579604149,
-0.10275119543075562,
0.03242132067680359,
-0.0065600210800766945,
0.22303655743598938,
-0.17655839025974274,
0.04046853631734848,
0.011347397230565548,
-0.017488060519099236,
0.0032335410360246897,
-0.16060581803321838,
-0.04455229640007019,
-0.023232119157910347,
-0.05123237520456314,
-0.05573230981826782,
0.03301989287137985,
-0.04923583194613457,
0.10197853296995163,
-0.03805481642484665,
-0.1141560897231102,
0.026981815695762634,
0.020379500463604927,
-0.0943702682852745,
0.13842032849788666,
-0.04054646193981171,
-0.14895102381706238,
-0.07320088148117065,
-0.008757123723626137,
0.00356266088783741,
-0.00835609994828701,
-0.02066211961209774,
-0.06309610605239868,
0.02715228497982025,
0.006254683248698711,
0.16018477082252502,
-0.04550106078386307,
-0.07313425093889236,
-0.18113337457180023,
-0.0002764715172816068,
0.033544592559337616,
-0.11049854010343552,
-0.033356327563524246,
-0.0665290430188179,
0.06360512971878052,
0.029604781419038773,
0.00185615592636168,
0.150980144739151,
0.10164634138345718,
0.00916291680186987,
0.026793397963047028,
-0.042458176612854004,
0.2884517312049866,
-0.07879526168107986,
0.13444553315639496,
0.09921272844076157,
-0.10011971741914749,
0.0684748962521553,
0.1569395512342453,
0.02416958287358284,
-0.06256715208292007,
-0.00903116911649704,
-0.04893215745687485,
-0.08828334510326385,
-0.08101218193769455,
-0.15415704250335693,
-0.06216505542397499,
0.028658825904130936,
0.0006397911347448826,
0.016075361520051956,
-0.04434935748577118,
0.04623495787382126,
0.03691063076257706,
-0.14931556582450867,
-0.08002548664808273,
0.009640413336455822,
0.06999705731868744,
-0.04832753166556358,
-0.02104419656097889,
-0.006157474592328072,
-0.07314592599868774,
0.03670820593833923,
0.0535360723733902,
0.0737357959151268,
0.04113136976957321,
-0.015342574566602707,
0.11256469786167145,
0.1689920425415039,
0.027463145554065704,
0.08941153436899185,
0.022620627656579018,
-0.007234051823616028,
-0.019971519708633423,
-0.05275975540280342,
-0.01355003658682108,
0.05556487664580345,
0.05934477224946022,
0.0853646919131279,
-0.07872757315635681,
-0.0367402546107769,
0.07334405928850174,
-0.011544733308255672,
0.12120134383440018,
-0.1469651758670807,
0.012373965233564377,
0.004728917498141527,
0.07151028513908386,
-0.05305381491780281,
-0.012835515663027763,
0.04643014818429947,
-0.13080547749996185,
0.029811760410666466,
-0.06532241404056549,
0.04459981247782707,
0.005902419798076153,
0.023805199190974236,
-0.0157120693475008,
0.13091905415058136,
-0.03811614215373993,
0.1363297998905182,
-0.2602112293243408,
0.336235374212265,
-0.03692622855305672,
0.10981101542711258,
-0.05958091840147972,
-0.014212301932275295,
0.022238193079829216,
0.07817142456769943,
0.24444620311260223,
0.02973197214305401,
-0.04588458687067032,
-0.07326139509677887,
-0.13672584295272827,
0.04460277780890465,
0.06848480552434921,
-0.018432466313242912,
0.06345726549625397,
0.059996966272592545,
0.026230990886688232,
-0.001555597293190658,
0.20078861713409424,
-0.22012169659137726,
-0.06648235023021698,
-0.04858449846506119,
0.002474370179697871,
-0.004793323110789061,
0.0005759952473454177,
-0.05429087206721306,
0.020678870379924774,
0.038263119757175446,
0.002964450279250741,
-0.008101001381874084,
-0.11814379692077637,
0.11312797665596008,
-0.020037200301885605,
-0.0523291639983654,
-0.08144507557153702,
-0.021734267473220825,
0.008143910206854343,
-0.1435418426990509,
0.026103945448994637,
0.040529634803533554,
-0.06761395931243896,
0.004134626127779484,
-0.07008503377437592,
0.13368579745292664,
0.12391141057014465,
0.06065106391906738,
0.026719551533460617,
0.0036677694879472256,
0.017565898597240448,
-0.16690148413181305,
0.14031511545181274,
-0.19897636771202087,
-0.07395902276039124,
0.0019207334844395518,
-0.16418513655662537,
-0.11845581233501434,
-0.1321670413017273,
0.07062552124261856,
0.2097981572151184,
0.29835495352745056,
-0.10392815619707108,
0.06954342871904373,
0.04755028709769249,
-0.12475559115409851,
-0.24273677170276642,
-0.0006197944167070091,
0.09548034518957138,
0.045727282762527466,
-0.009646390564739704,
-0.18834161758422852,
0.03447475656867027,
0.0818508192896843,
-0.023883482441306114,
0.03321679309010506,
-0.2587031126022339,
-0.03284367918968201,
0.1884077936410904,
0.06629036366939545,
0.06797950714826584,
0.014033270068466663,
-0.01376405544579029,
-0.013652621768414974,
-0.032552268356084824,
0.12519674003124237,
0.02303091436624527,
0.03526823967695236,
0.020781349390745163,
-0.012490931898355484,
-0.008473760448396206,
-0.05418547987937927,
0.13205586373806,
0.10665445774793625,
0.07584100216627121,
-0.04074835777282715,
-0.15029066801071167,
-0.012967033311724663,
-0.018688805401325226,
0.14403069019317627,
-0.015324699692428112,
-0.030964761972427368,
-0.17597830295562744,
-0.020685795694589615,
-0.1353050172328949,
0.23296716809272766,
-0.07694875448942184,
-0.11541455239057541,
-0.05695611238479614,
0.08691581338644028,
-0.05040032044053078,
-0.04760302975773811,
-0.161513552069664,
-0.040737804025411606,
0.01683737523853779,
0.10201193392276764,
0.006449053063988686,
0.17348994314670563,
-0.1654031127691269,
0.06061621755361557,
0.0013864635257050395,
0.08555713295936584,
0.02343907579779625,
-0.057367268949747086,
0.042930543422698975,
0.025384027510881424,
0.12333493679761887,
0.031133588403463364,
-0.10746170580387115,
0.01927465945482254,
0.055092550814151764,
-0.17190025746822357,
-0.11530495434999466,
-0.04282107204198837,
-0.13536518812179565,
-0.02341591566801071,
-0.10175365209579468,
0.054470647126436234,
-0.12480266392230988,
0.03396201506257057,
-0.047873273491859436,
0.0036299063358455896,
-0.13314267992973328,
-0.01806429587304592,
0.14172202348709106,
0.060974884778261185,
-0.01416811440140009,
0.042942073196172714,
0.028527820482850075,
-0.11717578768730164,
0.059595394879579544,
0.056194014847278595,
-0.0836978480219841,
-0.034280359745025635,
0.004118233919143677,
0.2706238627433777,
0.036549586802721024,
-0.07702073454856873,
-0.0021921286825090647,
-0.09695515036582947,
-0.026007818058133125,
0.12048900127410889,
0.11804647743701935,
-0.01622922718524933,
-0.11188836395740509,
-0.015402276068925858,
-0.08700945228338242,
0.05905122682452202,
0.03091423027217388,
-0.04664026200771332,
-0.014942952431738377,
0.11329488456249237,
0.03112051449716091,
0.062032490968704224,
-0.05094687268137932,
-0.08429044485092163,
-0.14692062139511108,
0.01117468811571598,
-0.019465796649456024,
-0.03836091235280037,
0.0057831499725580215,
0.0050759888254106045,
-0.02257705293595791,
-0.015195132233202457,
0.002818683860823512,
-0.011374175548553467,
-0.05861470103263855,
0.011758342385292053,
0.020560702309012413,
0.0743308886885643,
-0.06810213625431061,
-0.008571668528020382,
0.10328253358602524,
-0.05350097641348839,
0.045763131231069565,
0.07745879143476486,
-0.05387529730796814,
-0.010789348743855953,
-0.19259965419769287,
0.02553585171699524,
0.04600789025425911,
-0.027159309014678,
0.06099128723144531,
-0.11601406335830688,
0.05228552594780922,
-0.033104099333286285,
0.03668760508298874,
0.014980271458625793,
0.12325882166624069,
-0.0906621441245079,
0.07217783480882645,
0.1482895165681839,
-0.12009642273187637,
-0.024194778874516487,
0.012532703578472137,
0.05817165970802307,
0.012007111683487892,
0.20531602203845978,
-0.06059364229440689,
0.14371365308761597,
-0.09605105966329575,
0.0017151922220364213,
0.004002497531473637,
-0.00445089116692543,
-0.11271866410970688,
-0.051328495144844055,
0.05827435478568077,
-0.005275709554553032,
0.11476492136716843,
0.11478540301322937,
0.01628744974732399,
0.021961787715554237,
0.09114065766334534,
-0.0448027104139328,
-0.0338531956076622,
0.08444898575544357,
-0.012007747776806355,
-0.027859004214406013,
0.03465801477432251,
0.03852783143520355,
0.004441526252776384,
0.09889516979455948,
0.23105819523334503,
0.12473151087760925,
0.06292504072189331,
0.10576481372117996,
0.06534860283136368,
-0.0035515546333044767,
-0.0784880742430687,
-0.12224901467561722,
0.13685913383960724,
0.069983571767807,
-0.05981208011507988,
0.06071731820702553,
-0.0038184334989637136,
-0.12759903073310852,
0.07057873159646988,
0.028462760150432587,
-0.13664086163043976,
-0.0823068767786026,
-0.20139212906360626,
0.03345528617501259,
-0.022971970960497856,
0.01623893529176712,
-0.024648049846291542,
-0.08758675307035446,
0.09069687873125076,
-0.009683573618531227,
-0.05121735483407974,
0.14137782156467438,
-0.0726037323474884,
-0.07032899558544159,
0.11296145617961884,
0.0006733745685778558,
0.04512431472539902,
-0.14738117158412933,
0.050878845155239105,
-0.03519922494888306,
-0.0483323410153389,
0.030570797622203827,
0.018431413918733597,
0.006659801118075848,
-0.0437842532992363,
-0.11441260576248169,
-0.07590807229280472,
-0.02750110626220703,
0.018117723986506462,
0.011151387356221676,
0.20532436668872833,
0.03336218744516373,
-0.017224811017513275,
-0.024767227470874786,
0.08552828431129456,
-0.000138137984322384,
-0.12597258388996124,
-0.13693031668663025,
0.295724481344223,
-0.10422991961240768,
0.04548569396138191,
-0.05753675848245621,
-0.0338369682431221,
0.016779400408267975,
0.18896102905273438,
0.20881734788417816,
-0.06347009539604187,
-0.008355078287422657,
-0.013836994767189026,
0.022617386654019356,
0.05795544758439064,
0.13515709340572357,
0.04157590493559837,
0.14717058837413788,
-0.13533060252666473,
0.06852736324071884,
-0.16655151546001434,
-0.05244576930999756,
-0.03257973492145538,
0.1808294653892517,
0.06254994124174118,
-0.04706014692783356,
-0.12075438350439072,
0.18435131013393402,
-0.14072152972221375,
-0.011634008958935738,
0.003690115176141262,
0.013966896571218967,
-0.14030328392982483,
-0.027834448963403702,
0.10534191876649857,
0.07249721884727478,
0.15156474709510803,
-0.045954976230859756,
-0.06632585823535919,
0.18402941524982452,
-0.006378110032528639,
-0.06077969819307327,
-0.1067739874124527,
0.0772295817732811,
-0.09132950007915497,
0.21867454051971436,
-0.0013022507773712277,
0.1692190170288086,
0.06703164428472519,
0.0005252040573395789,
-0.050734713673591614,
0.13815093040466309,
0.05040137097239494,
0.06936892867088318,
-0.07845488935709,
-0.01130489632487297,
-0.0034196244087070227,
-0.03305864706635475,
-0.002251965692266822,
-0.13530965149402618,
0.11277751624584198,
-0.020297182723879814,
-0.08940798789262772,
-0.08497004956007004,
0.11560584604740143,
-0.06088733673095703,
0.09324125945568085,
0.19660809636116028,
-0.021913893520832062,
-0.02970086596906185,
-0.09225732088088989,
0.022480564191937447,
0.06635672599077225,
-0.011449257843196392,
0.01746012270450592,
-0.05471854656934738,
-0.032528575509786606,
0.1052054762840271,
-0.0816725566983223,
-0.09430181235074997,
-0.05640112981200218,
-0.03552388772368431,
-0.006583489943295717,
0.021327320486307144,
-0.007092682644724846,
-0.01946868747472763,
0.10377151519060135,
-0.005531010217964649,
0.05123837664723396,
0.016481341794133186,
0.08580677956342697,
-0.07432416826486588,
-0.05581441521644592
] |
null | null |
flair
|
## English Universal Part-of-Speech Tagging in Flair (default model)
This is the standard universal part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98,6** (Ontonotes)
Predicts universal POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/upos-english")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRON (0.9996)]
Span [2]: "love" [− Labels: VERB (1.0)]
Span [3]: "Berlin" [− Labels: PROPN (0.9986)]
Span [4]: "." [− Labels: PUNCT (1.0)]
```
So, the word "*I*" is labeled as a **pronoun** (PRON), "*love*" is labeled as a **verb** (VERB) and "*Berlin*" is labeled as a **proper noun** (PROPN) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'upos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/upos-english',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": "en", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "widget": [{"text": "I love Berlin."}]}
|
token-classification
|
flair/upos-english
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:ontonotes",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us
|
English Universal Part-of-Speech Tagging in Flair (default model)
-----------------------------------------------------------------
This is the standard universal part-of-speech tagging model for English that ships with Flair.
F1-Score: 98,6 (Ontonotes)
Predicts universal POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the word "*I*" is labeled as a pronoun (PRON), "*love*" is labeled as a verb (VERB) and "*Berlin*" is labeled as a proper noun (PROPN) in the sentence "*I love Berlin*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
37,
97,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #dataset-ontonotes #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the word \"*I*\" is labeled as a pronoun (PRON), \"*love*\" is labeled as a verb (VERB) and \"*Berlin*\" is labeled as a proper noun (PROPN) in the sentence \"*I love Berlin*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.0898815169930458,
0.023771122097969055,
-0.001487379427999258,
0.10299017280340195,
0.052614711225032806,
0.06480679661035538,
0.038192663341760635,
0.04685518518090248,
0.11425352096557617,
0.05003012716770172,
0.11690953373908997,
-0.031146982684731483,
0.07001783698797226,
0.13879141211509705,
0.04326340928673744,
-0.2914661169052124,
0.020744502544403076,
-0.011923144571483135,
0.01169240940362215,
0.153060182929039,
0.1331934630870819,
0.02655882015824318,
0.011601182632148266,
-0.008133061230182648,
-0.08580024540424347,
0.014900912530720234,
-0.013496975414454937,
-0.01303007360547781,
0.15075726807117462,
-0.05623955279588699,
0.16982485353946686,
-0.016898274421691895,
0.08262526243925095,
-0.18445122241973877,
0.021106772124767303,
0.025009935721755028,
-0.04166584089398384,
0.028432374820113182,
0.0059976824559271336,
-0.09080269932746887,
0.30034705996513367,
-0.06705763936042786,
0.001381663023494184,
0.03998275473713875,
-0.15722939372062683,
-0.2496788650751114,
-0.06137925386428833,
0.17162500321865082,
0.026198578998446465,
0.11042067408561707,
-0.040206968784332275,
0.10602319985628128,
-0.11377564817667007,
0.006259925663471222,
0.024749480187892914,
-0.1585025042295456,
-0.05927029252052307,
0.042430490255355835,
0.055525388568639755,
0.05056805908679962,
-0.11841873079538345,
0.01349883433431387,
0.030059687793254852,
0.06374078243970871,
0.024612532928586006,
-0.035108014941215515,
0.09984966367483139,
-0.005662199575453997,
-0.2025691568851471,
-0.0728834941983223,
0.2694748342037201,
0.04430845007300377,
-0.09202708303928375,
0.0008329666452482343,
-0.02826727367937565,
-0.09516042470932007,
0.03174348548054695,
-0.1292840540409088,
-0.04425473138689995,
0.014057469554245472,
0.1462254375219345,
-0.12285458296537399,
-0.09163574874401093,
-0.06660646200180054,
0.10284695029258728,
0.17492416501045227,
-0.016730215400457382,
0.03591138496994972,
-0.03977494686841965,
0.10094918310642242,
-0.1143990159034729,
-0.08769087493419647,
0.04054101184010506,
-0.08084934949874878,
-0.02886119857430458,
0.0008614523685537279,
-0.034494396299123764,
0.0006567630334757268,
0.00006497783033410087,
0.09852224588394165,
0.05847626551985741,
-0.04213908314704895,
0.04860220104455948,
0.06352544575929642,
0.06965192407369614,
0.1789100617170334,
-0.017274443060159683,
-0.06970396637916565,
-0.0047951312735676765,
-0.002610258525237441,
0.03152114525437355,
-0.00793085154145956,
-0.09801054000854492,
-0.047128330916166306,
0.022331377491354942,
0.023647412657737732,
-0.06088045611977577,
0.0346975214779377,
-0.10376155376434326,
0.02789650671184063,
-0.01456197164952755,
-0.0741216316819191,
0.018752500414848328,
0.05458265170454979,
-0.07495538890361786,
0.07906625419855118,
0.06753302365541458,
-0.02416156232357025,
-0.04508918523788452,
0.0970316156744957,
-0.05162978544831276,
0.06319284439086914,
-0.1218646764755249,
-0.08501695096492767,
0.0061088851653039455,
-0.08302149921655655,
-0.0285832230001688,
-0.10237408429384232,
-0.007083098404109478,
-0.03640022873878479,
0.051680486649274826,
0.004662372637540102,
-0.014351153746247292,
-0.08431732654571533,
-0.042870696634054184,
-0.008128315210342407,
-0.0025762321893125772,
-0.018542950972914696,
-0.04570265859365463,
0.032436639070510864,
-0.026661740615963936,
0.03598421439528465,
-0.15958154201507568,
-0.010902747511863708,
-0.1512676477432251,
0.033837802708148956,
-0.1146857813000679,
-0.0249230545014143,
-0.13525724411010742,
-0.04884737730026245,
-0.09263661503791809,
-0.07050321251153946,
0.010233508422970772,
-0.008764013648033142,
0.04503672569990158,
0.19616882503032684,
-0.30200710892677307,
-0.008923140354454517,
0.19682587683200836,
-0.17516162991523743,
-0.029986577108502388,
0.17988649010658264,
-0.07246404141187668,
0.16754749417304993,
0.04123549163341522,
0.12043961882591248,
0.08837124705314636,
-0.3562321066856384,
-0.018417008221149445,
0.08611693233251572,
-0.15958347916603088,
0.03799496591091156,
0.050578195601701736,
-0.019890479743480682,
-0.20920144021511078,
0.015433547087013721,
-0.14237040281295776,
0.07827798277139664,
0.0028234149795025587,
-0.06630746275186539,
-0.019352389499545097,
0.022131022065877914,
0.07748380303382874,
0.010248089209198952,
-0.06287945061922073,
-0.054252997040748596,
-0.1499505341053009,
-0.116937056183815,
0.03764430060982704,
0.04879239574074745,
0.017393972724676132,
-0.0015096609713509679,
0.10221609473228455,
0.12539555132389069,
-0.03485417738556862,
-0.09814292192459106,
0.0340767577290535,
-0.006666012108325958,
0.03218095377087593,
0.049701984971761703,
0.11170683801174164,
0.02142612636089325,
0.06940871477127075,
0.0057291206903755665,
-0.010342957451939583,
-0.14541436731815338,
0.03992427885532379,
0.0030358564108610153,
-0.04804525896906853,
-0.111871637403965,
-0.08093813061714172,
0.17679722607135773,
-0.1533837467432022,
0.03716573491692543,
0.028068674728274345,
0.015247117727994919,
-0.050416089594364166,
-0.0023662890307605267,
-0.00413854792714119,
0.07998897135257721,
-0.007735489401966333,
0.008942054584622383,
0.12087557464838028,
0.010936119593679905,
-0.10252264142036438,
-0.07460156828165054,
0.01487889513373375,
-0.212365984916687,
0.05623725801706314,
-0.01881849765777588,
-0.1642047017812729,
-0.12440452724695206,
-0.08044865727424622,
0.038893137127161026,
-0.039356883615255356,
-0.06462139636278152,
0.20748431980609894,
0.05073857307434082,
0.01694762520492077,
-0.022073592990636826,
0.06901208311319351,
-0.03121216967701912,
-0.10157162696123123,
-0.08482599258422852,
0.07279036194086075,
0.09494420140981674,
0.015534200705587864,
0.05376916378736496,
0.13386395573616028,
-0.11451682448387146,
0.12835155427455902,
0.01628490351140499,
0.008541291579604149,
-0.10275119543075562,
0.03242132067680359,
-0.0065600210800766945,
0.22303655743598938,
-0.17655839025974274,
0.04046853631734848,
0.011347397230565548,
-0.017488060519099236,
0.0032335410360246897,
-0.16060581803321838,
-0.04455229640007019,
-0.023232119157910347,
-0.05123237520456314,
-0.05573230981826782,
0.03301989287137985,
-0.04923583194613457,
0.10197853296995163,
-0.03805481642484665,
-0.1141560897231102,
0.026981815695762634,
0.020379500463604927,
-0.0943702682852745,
0.13842032849788666,
-0.04054646193981171,
-0.14895102381706238,
-0.07320088148117065,
-0.008757123723626137,
0.00356266088783741,
-0.00835609994828701,
-0.02066211961209774,
-0.06309610605239868,
0.02715228497982025,
0.006254683248698711,
0.16018477082252502,
-0.04550106078386307,
-0.07313425093889236,
-0.18113337457180023,
-0.0002764715172816068,
0.033544592559337616,
-0.11049854010343552,
-0.033356327563524246,
-0.0665290430188179,
0.06360512971878052,
0.029604781419038773,
0.00185615592636168,
0.150980144739151,
0.10164634138345718,
0.00916291680186987,
0.026793397963047028,
-0.042458176612854004,
0.2884517312049866,
-0.07879526168107986,
0.13444553315639496,
0.09921272844076157,
-0.10011971741914749,
0.0684748962521553,
0.1569395512342453,
0.02416958287358284,
-0.06256715208292007,
-0.00903116911649704,
-0.04893215745687485,
-0.08828334510326385,
-0.08101218193769455,
-0.15415704250335693,
-0.06216505542397499,
0.028658825904130936,
0.0006397911347448826,
0.016075361520051956,
-0.04434935748577118,
0.04623495787382126,
0.03691063076257706,
-0.14931556582450867,
-0.08002548664808273,
0.009640413336455822,
0.06999705731868744,
-0.04832753166556358,
-0.02104419656097889,
-0.006157474592328072,
-0.07314592599868774,
0.03670820593833923,
0.0535360723733902,
0.0737357959151268,
0.04113136976957321,
-0.015342574566602707,
0.11256469786167145,
0.1689920425415039,
0.027463145554065704,
0.08941153436899185,
0.022620627656579018,
-0.007234051823616028,
-0.019971519708633423,
-0.05275975540280342,
-0.01355003658682108,
0.05556487664580345,
0.05934477224946022,
0.0853646919131279,
-0.07872757315635681,
-0.0367402546107769,
0.07334405928850174,
-0.011544733308255672,
0.12120134383440018,
-0.1469651758670807,
0.012373965233564377,
0.004728917498141527,
0.07151028513908386,
-0.05305381491780281,
-0.012835515663027763,
0.04643014818429947,
-0.13080547749996185,
0.029811760410666466,
-0.06532241404056549,
0.04459981247782707,
0.005902419798076153,
0.023805199190974236,
-0.0157120693475008,
0.13091905415058136,
-0.03811614215373993,
0.1363297998905182,
-0.2602112293243408,
0.336235374212265,
-0.03692622855305672,
0.10981101542711258,
-0.05958091840147972,
-0.014212301932275295,
0.022238193079829216,
0.07817142456769943,
0.24444620311260223,
0.02973197214305401,
-0.04588458687067032,
-0.07326139509677887,
-0.13672584295272827,
0.04460277780890465,
0.06848480552434921,
-0.018432466313242912,
0.06345726549625397,
0.059996966272592545,
0.026230990886688232,
-0.001555597293190658,
0.20078861713409424,
-0.22012169659137726,
-0.06648235023021698,
-0.04858449846506119,
0.002474370179697871,
-0.004793323110789061,
0.0005759952473454177,
-0.05429087206721306,
0.020678870379924774,
0.038263119757175446,
0.002964450279250741,
-0.008101001381874084,
-0.11814379692077637,
0.11312797665596008,
-0.020037200301885605,
-0.0523291639983654,
-0.08144507557153702,
-0.021734267473220825,
0.008143910206854343,
-0.1435418426990509,
0.026103945448994637,
0.040529634803533554,
-0.06761395931243896,
0.004134626127779484,
-0.07008503377437592,
0.13368579745292664,
0.12391141057014465,
0.06065106391906738,
0.026719551533460617,
0.0036677694879472256,
0.017565898597240448,
-0.16690148413181305,
0.14031511545181274,
-0.19897636771202087,
-0.07395902276039124,
0.0019207334844395518,
-0.16418513655662537,
-0.11845581233501434,
-0.1321670413017273,
0.07062552124261856,
0.2097981572151184,
0.29835495352745056,
-0.10392815619707108,
0.06954342871904373,
0.04755028709769249,
-0.12475559115409851,
-0.24273677170276642,
-0.0006197944167070091,
0.09548034518957138,
0.045727282762527466,
-0.009646390564739704,
-0.18834161758422852,
0.03447475656867027,
0.0818508192896843,
-0.023883482441306114,
0.03321679309010506,
-0.2587031126022339,
-0.03284367918968201,
0.1884077936410904,
0.06629036366939545,
0.06797950714826584,
0.014033270068466663,
-0.01376405544579029,
-0.013652621768414974,
-0.032552268356084824,
0.12519674003124237,
0.02303091436624527,
0.03526823967695236,
0.020781349390745163,
-0.012490931898355484,
-0.008473760448396206,
-0.05418547987937927,
0.13205586373806,
0.10665445774793625,
0.07584100216627121,
-0.04074835777282715,
-0.15029066801071167,
-0.012967033311724663,
-0.018688805401325226,
0.14403069019317627,
-0.015324699692428112,
-0.030964761972427368,
-0.17597830295562744,
-0.020685795694589615,
-0.1353050172328949,
0.23296716809272766,
-0.07694875448942184,
-0.11541455239057541,
-0.05695611238479614,
0.08691581338644028,
-0.05040032044053078,
-0.04760302975773811,
-0.161513552069664,
-0.040737804025411606,
0.01683737523853779,
0.10201193392276764,
0.006449053063988686,
0.17348994314670563,
-0.1654031127691269,
0.06061621755361557,
0.0013864635257050395,
0.08555713295936584,
0.02343907579779625,
-0.057367268949747086,
0.042930543422698975,
0.025384027510881424,
0.12333493679761887,
0.031133588403463364,
-0.10746170580387115,
0.01927465945482254,
0.055092550814151764,
-0.17190025746822357,
-0.11530495434999466,
-0.04282107204198837,
-0.13536518812179565,
-0.02341591566801071,
-0.10175365209579468,
0.054470647126436234,
-0.12480266392230988,
0.03396201506257057,
-0.047873273491859436,
0.0036299063358455896,
-0.13314267992973328,
-0.01806429587304592,
0.14172202348709106,
0.060974884778261185,
-0.01416811440140009,
0.042942073196172714,
0.028527820482850075,
-0.11717578768730164,
0.059595394879579544,
0.056194014847278595,
-0.0836978480219841,
-0.034280359745025635,
0.004118233919143677,
0.2706238627433777,
0.036549586802721024,
-0.07702073454856873,
-0.0021921286825090647,
-0.09695515036582947,
-0.026007818058133125,
0.12048900127410889,
0.11804647743701935,
-0.01622922718524933,
-0.11188836395740509,
-0.015402276068925858,
-0.08700945228338242,
0.05905122682452202,
0.03091423027217388,
-0.04664026200771332,
-0.014942952431738377,
0.11329488456249237,
0.03112051449716091,
0.062032490968704224,
-0.05094687268137932,
-0.08429044485092163,
-0.14692062139511108,
0.01117468811571598,
-0.019465796649456024,
-0.03836091235280037,
0.0057831499725580215,
0.0050759888254106045,
-0.02257705293595791,
-0.015195132233202457,
0.002818683860823512,
-0.011374175548553467,
-0.05861470103263855,
0.011758342385292053,
0.020560702309012413,
0.0743308886885643,
-0.06810213625431061,
-0.008571668528020382,
0.10328253358602524,
-0.05350097641348839,
0.045763131231069565,
0.07745879143476486,
-0.05387529730796814,
-0.010789348743855953,
-0.19259965419769287,
0.02553585171699524,
0.04600789025425911,
-0.027159309014678,
0.06099128723144531,
-0.11601406335830688,
0.05228552594780922,
-0.033104099333286285,
0.03668760508298874,
0.014980271458625793,
0.12325882166624069,
-0.0906621441245079,
0.07217783480882645,
0.1482895165681839,
-0.12009642273187637,
-0.024194778874516487,
0.012532703578472137,
0.05817165970802307,
0.012007111683487892,
0.20531602203845978,
-0.06059364229440689,
0.14371365308761597,
-0.09605105966329575,
0.0017151922220364213,
0.004002497531473637,
-0.00445089116692543,
-0.11271866410970688,
-0.051328495144844055,
0.05827435478568077,
-0.005275709554553032,
0.11476492136716843,
0.11478540301322937,
0.01628744974732399,
0.021961787715554237,
0.09114065766334534,
-0.0448027104139328,
-0.0338531956076622,
0.08444898575544357,
-0.012007747776806355,
-0.027859004214406013,
0.03465801477432251,
0.03852783143520355,
0.004441526252776384,
0.09889516979455948,
0.23105819523334503,
0.12473151087760925,
0.06292504072189331,
0.10576481372117996,
0.06534860283136368,
-0.0035515546333044767,
-0.0784880742430687,
-0.12224901467561722,
0.13685913383960724,
0.069983571767807,
-0.05981208011507988,
0.06071731820702553,
-0.0038184334989637136,
-0.12759903073310852,
0.07057873159646988,
0.028462760150432587,
-0.13664086163043976,
-0.0823068767786026,
-0.20139212906360626,
0.03345528617501259,
-0.022971970960497856,
0.01623893529176712,
-0.024648049846291542,
-0.08758675307035446,
0.09069687873125076,
-0.009683573618531227,
-0.05121735483407974,
0.14137782156467438,
-0.0726037323474884,
-0.07032899558544159,
0.11296145617961884,
0.0006733745685778558,
0.04512431472539902,
-0.14738117158412933,
0.050878845155239105,
-0.03519922494888306,
-0.0483323410153389,
0.030570797622203827,
0.018431413918733597,
0.006659801118075848,
-0.0437842532992363,
-0.11441260576248169,
-0.07590807229280472,
-0.02750110626220703,
0.018117723986506462,
0.011151387356221676,
0.20532436668872833,
0.03336218744516373,
-0.017224811017513275,
-0.024767227470874786,
0.08552828431129456,
-0.000138137984322384,
-0.12597258388996124,
-0.13693031668663025,
0.295724481344223,
-0.10422991961240768,
0.04548569396138191,
-0.05753675848245621,
-0.0338369682431221,
0.016779400408267975,
0.18896102905273438,
0.20881734788417816,
-0.06347009539604187,
-0.008355078287422657,
-0.013836994767189026,
0.022617386654019356,
0.05795544758439064,
0.13515709340572357,
0.04157590493559837,
0.14717058837413788,
-0.13533060252666473,
0.06852736324071884,
-0.16655151546001434,
-0.05244576930999756,
-0.03257973492145538,
0.1808294653892517,
0.06254994124174118,
-0.04706014692783356,
-0.12075438350439072,
0.18435131013393402,
-0.14072152972221375,
-0.011634008958935738,
0.003690115176141262,
0.013966896571218967,
-0.14030328392982483,
-0.027834448963403702,
0.10534191876649857,
0.07249721884727478,
0.15156474709510803,
-0.045954976230859756,
-0.06632585823535919,
0.18402941524982452,
-0.006378110032528639,
-0.06077969819307327,
-0.1067739874124527,
0.0772295817732811,
-0.09132950007915497,
0.21867454051971436,
-0.0013022507773712277,
0.1692190170288086,
0.06703164428472519,
0.0005252040573395789,
-0.050734713673591614,
0.13815093040466309,
0.05040137097239494,
0.06936892867088318,
-0.07845488935709,
-0.01130489632487297,
-0.0034196244087070227,
-0.03305864706635475,
-0.002251965692266822,
-0.13530965149402618,
0.11277751624584198,
-0.020297182723879814,
-0.08940798789262772,
-0.08497004956007004,
0.11560584604740143,
-0.06088733673095703,
0.09324125945568085,
0.19660809636116028,
-0.021913893520832062,
-0.02970086596906185,
-0.09225732088088989,
0.022480564191937447,
0.06635672599077225,
-0.011449257843196392,
0.01746012270450592,
-0.05471854656934738,
-0.032528575509786606,
0.1052054762840271,
-0.0816725566983223,
-0.09430181235074997,
-0.05640112981200218,
-0.03552388772368431,
-0.006583489943295717,
0.021327320486307144,
-0.007092682644724846,
-0.01946868747472763,
0.10377151519060135,
-0.005531010217964649,
0.05123837664723396,
0.016481341794133186,
0.08580677956342697,
-0.07432416826486588,
-0.05581441521644592
] |
null | null |
flair
|
## Multilingual Universal Part-of-Speech Tagging in Flair (fast model)
This is the fast multilingual universal part-of-speech tagging model that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **92,88** (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)
Predicts universal POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/upos-multi-fast")
# make example sentence
sentence = Sentence("Ich liebe Berlin, as they say. ")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "Ich" [− Labels: PRON (0.9999)]
Span [2]: "liebe" [− Labels: VERB (0.9999)]
Span [3]: "Berlin" [− Labels: PROPN (0.9997)]
Span [4]: "," [− Labels: PUNCT (1.0)]
Span [5]: "as" [− Labels: SCONJ (0.9991)]
Span [6]: "they" [− Labels: PRON (0.9998)]
Span [7]: "say" [− Labels: VERB (0.9998)]
Span [8]: "." [− Labels: PUNCT (1.0)]
```
So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import MultiCorpus
from flair.datasets import UD_ENGLISH, UD_GERMAN, UD_FRENCH, UD_ITALIAN, UD_POLISH, UD_DUTCH, UD_CZECH, \
UD_DANISH, UD_SPANISH, UD_SWEDISH, UD_NORWEGIAN, UD_FINNISH
from flair.embeddings import StackedEmbeddings, FlairEmbeddings
# 1. make a multi corpus consisting of 12 UD treebanks (in_memory=False here because this corpus becomes large)
corpus = MultiCorpus([
UD_ENGLISH(in_memory=False),
UD_GERMAN(in_memory=False),
UD_DUTCH(in_memory=False),
UD_FRENCH(in_memory=False),
UD_ITALIAN(in_memory=False),
UD_SPANISH(in_memory=False),
UD_POLISH(in_memory=False),
UD_CZECH(in_memory=False),
UD_DANISH(in_memory=False),
UD_SWEDISH(in_memory=False),
UD_NORWEGIAN(in_memory=False),
UD_FINNISH(in_memory=False),
])
# 2. what tag do we want to predict?
tag_type = 'upos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('multi-forward-fast'),
# contextual string embeddings, backward
FlairEmbeddings('multi-backward-fast'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type,
use_crf=False)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/upos-multi-fast',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": ["en", "de", "fr", "it", "nl", "pl", "es", "sv", "da", false, "fi", "cs"], "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "widget": [{"text": "Ich liebe Berlin, as they say."}]}
|
token-classification
|
flair/upos-multi-fast
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"de",
"fr",
"it",
"nl",
"pl",
"es",
"sv",
"da",
"no",
"fi",
"cs",
"dataset:ontonotes",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en",
"de",
"fr",
"it",
"nl",
"pl",
"es",
"sv",
"da",
"no",
"fi",
"cs"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us
|
Multilingual Universal Part-of-Speech Tagging in Flair (fast model)
-------------------------------------------------------------------
This is the fast multilingual universal part-of-speech tagging model that ships with Flair.
F1-Score: 92,88 (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)
Predicts universal POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the words "*Ich*" and "*they*" are labeled as pronouns (PRON), while "*liebe*" and "*say*" are labeled as verbs (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
59,
101,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.08705111593008041,
-0.06974589824676514,
-0.0029402044601738453,
0.13343004882335663,
0.11380794644355774,
0.04046819359064102,
0.1234075129032135,
0.09400173276662827,
0.08065958321094513,
0.08943095058202744,
0.14182615280151367,
0.039049942046403885,
0.07930684834718704,
0.10644066333770752,
-0.01729878969490528,
-0.2673886716365814,
0.01436537317931652,
-0.053039174526929855,
0.03259987384080887,
0.14099104702472687,
0.14983488619327545,
0.0181419774889946,
0.03364994749426842,
-0.005140720866620541,
-0.08035092055797577,
-0.033900413662195206,
0.013763324357569218,
-0.04290793836116791,
0.13254331052303314,
0.03713918477296829,
0.12139558792114258,
-0.02358778938651085,
0.09159675240516663,
-0.20092029869556427,
0.018963128328323364,
0.022806446999311447,
-0.023296255618333817,
0.027669893577694893,
-0.010314518585801125,
-0.019491391256451607,
0.26680538058280945,
-0.05961274355649948,
-0.005362371914088726,
0.04460129886865616,
-0.16420994699001312,
-0.20219846069812775,
-0.09917870163917542,
0.07012404501438141,
0.000780441623646766,
0.08860646188259125,
-0.04734252765774727,
0.1386774629354477,
-0.1079372987151146,
0.06303069740533829,
0.09271877259016037,
-0.18567509949207306,
-0.07767379283905029,
0.06374889612197876,
0.09728702157735825,
0.04479850083589554,
-0.13468341529369354,
0.022231008857488632,
0.015567345544695854,
0.07337559759616852,
-0.056193504482507706,
-0.0464339405298233,
0.07506085187196732,
0.025755656883120537,
-0.17265266180038452,
-0.07514820992946625,
0.2782323956489563,
0.057655248790979385,
-0.08449651300907135,
-0.060215216130018234,
-0.025447869673371315,
-0.03713280335068703,
0.03233492374420166,
-0.07263645529747009,
-0.06051342934370041,
0.024095546454191208,
0.19349494576454163,
-0.0956026166677475,
-0.10124565660953522,
-0.04626401141285896,
0.05712262913584709,
0.17984230816364288,
-0.022176319733262062,
0.04624127969145775,
-0.0943414643406868,
0.04802579805254936,
-0.0613694004714489,
-0.09749160706996918,
0.03429834917187691,
-0.09257692098617554,
-0.041234906762838364,
0.016179077327251434,
-0.04098982363939285,
-0.023932194337248802,
0.045141175389289856,
0.09791187942028046,
0.006479648873209953,
-0.005071300081908703,
-0.042382240295410156,
0.08820044249296188,
0.05148367956280708,
0.18538452684879303,
-0.04178445041179657,
-0.01356926653534174,
-0.0018817429663613439,
-0.004270296078175306,
0.0262509323656559,
-0.04581872746348381,
-0.14908021688461304,
-0.02416214719414711,
-0.05663151293992996,
0.013190428726375103,
-0.035801902413368225,
0.026816459372639656,
-0.0769190564751625,
-0.0004179490206297487,
0.010339428670704365,
-0.10791890323162079,
0.011428655125200748,
0.05422237515449524,
-0.026629900559782982,
0.1418619602918625,
0.035250771790742874,
-0.02102135308086872,
-0.03654438257217407,
0.07169277220964432,
-0.04483336582779884,
0.06797372549772263,
-0.07833654433488846,
-0.09884674102067947,
0.007542961742728949,
-0.08391544222831726,
-0.016693497076630592,
-0.0686226636171341,
-0.0633910521864891,
-0.024519434198737144,
0.04136720299720764,
-0.03742571175098419,
0.015127215534448624,
-0.06820674985647202,
-0.08250764012336731,
-0.007874137721955776,
0.03354879468679428,
-0.04853960871696472,
-0.046268776059150696,
0.07736379653215408,
0.02545568346977234,
0.04945754259824753,
-0.14167985320091248,
0.010609478689730167,
-0.10058707743883133,
0.020462408661842346,
-0.16475240886211395,
0.01099385879933834,
-0.07889459282159805,
-0.0855245441198349,
-0.06626423448324203,
-0.10348190367221832,
0.01262164581567049,
0.02284119836986065,
0.02937336452305317,
0.19363512098789215,
-0.3112894296646118,
-0.004099764861166477,
0.1954658031463623,
-0.12762749195098877,
-0.02764982171356678,
0.1809024214744568,
-0.051456812769174576,
0.13623753190040588,
0.012626494280993938,
0.15531621873378754,
0.055707722902297974,
-0.30688008666038513,
0.035481810569763184,
0.05645043030381203,
-0.14251159131526947,
0.11074382066726685,
0.11394944041967392,
-0.03203630819916725,
-0.12062140554189682,
0.02993321605026722,
-0.12675727903842926,
0.037419576197862625,
-0.013634638860821724,
-0.07312754541635513,
-0.03406817838549614,
0.05418349802494049,
0.06587903201580048,
0.007699925918132067,
-0.10209636390209198,
-0.03933915123343468,
-0.11873791366815567,
-0.10764838755130768,
0.03420544043183327,
0.014554808847606182,
0.02803596295416355,
-0.05252809077501297,
0.1247745156288147,
0.06297074258327484,
-0.015049824491143227,
-0.14358030259609222,
0.0015940126031637192,
-0.00772726908326149,
0.008642273023724556,
0.06612740457057953,
0.10314267128705978,
0.021073132753372192,
0.0665469765663147,
0.0019509571138769388,
0.03383135050535202,
-0.12952382862567902,
-0.000002904607072196086,
0.0076846168376505375,
-0.11558974534273148,
-0.0337667316198349,
-0.08223187178373337,
0.13785061240196228,
-0.181343674659729,
0.023228434845805168,
0.06841281801462173,
0.057989004999399185,
-0.03591005876660347,
-0.009641778655350208,
-0.038715701550245285,
0.05066358670592308,
-0.026871278882026672,
-0.0006881244480609894,
0.07697766274213791,
0.012853976339101791,
-0.05500398948788643,
-0.09289027005434036,
-0.017231928184628487,
-0.19523866474628448,
0.0769948810338974,
-0.03185148537158966,
-0.13979074358940125,
-0.12694452702999115,
-0.06385914981365204,
0.007585304323583841,
-0.05381587892770767,
-0.06036067008972168,
0.14544373750686646,
0.05543679744005203,
0.04000027850270271,
-0.07388129085302353,
0.05560673773288727,
0.03483403101563454,
-0.09168213605880737,
-0.06836552917957306,
0.11795546859502792,
0.14447811245918274,
0.021201390773057938,
0.08850517868995667,
0.10569603741168976,
-0.015243145637214184,
0.16119271516799927,
0.008890695869922638,
-0.048671212047338486,
-0.13823813199996948,
0.05440310016274452,
-0.0036995878908783197,
0.20138950645923615,
-0.17289450764656067,
0.05466026812791824,
0.04284132644534111,
-0.009352241642773151,
-0.008039792068302631,
-0.138696551322937,
-0.059188973158597946,
-0.026452578604221344,
-0.07139705866575241,
-0.03354692831635475,
0.05774153396487236,
-0.007147215772420168,
0.1097869873046875,
-0.01004873774945736,
-0.13824455440044403,
0.06827925145626068,
-0.0068110516294837,
-0.09273705631494522,
0.10718333721160889,
-0.0930112823843956,
-0.2507244944572449,
-0.15774023532867432,
-0.0817408412694931,
-0.10432083159685135,
0.0020375631283968687,
0.025199977681040764,
-0.06138137727975845,
0.0011386392870917916,
0.018133951351046562,
0.1570911556482315,
-0.03255407512187958,
-0.06889273226261139,
-0.06945717334747314,
0.02646620385348797,
-0.023347022011876106,
-0.08644875884056091,
-0.05947257950901985,
-0.028527280315756798,
0.06239132955670357,
0.03170974925160408,
-0.0016626653959974647,
0.12264080345630646,
0.09424838423728943,
0.010677866637706757,
0.048036012798547745,
-0.017986048012971878,
0.29898470640182495,
-0.09035682678222656,
0.06851483136415482,
0.11065705120563507,
-0.10387775301933289,
0.06286371499300003,
0.1687413454055786,
0.0659724771976471,
-0.06224358081817627,
-0.016697783023118973,
-0.03312095254659653,
-0.08447825163602829,
-0.17416033148765564,
-0.1488671898841858,
-0.08523491770029068,
0.015901021659374237,
-0.0348709374666214,
0.030206769704818726,
0.03305324539542198,
0.001882388023659587,
0.03123004175722599,
-0.10513836145401001,
-0.016591662541031837,
0.04269026219844818,
0.13650953769683838,
-0.04470813274383545,
0.011284707114100456,
0.002834755228832364,
-0.06789655983448029,
0.07686693221330643,
0.04885580763220787,
0.1012224331498146,
0.10113757103681564,
0.0017976294038817286,
0.10067527741193771,
0.09956087917089462,
0.029263805598020554,
0.0839005708694458,
0.02926507219672203,
-0.02299341931939125,
-0.028045546263456345,
-0.09606859087944031,
0.0005080642877146602,
0.06855929642915726,
0.05710052326321602,
0.027008401229977608,
-0.057866647839546204,
-0.013192396610975266,
0.09649109095335007,
0.031890928745269775,
0.06236094608902931,
-0.10781780630350113,
-0.02110930159687996,
0.000006720773399138125,
0.06366812437772751,
-0.07920460402965546,
0.030083345249295235,
0.09758997708559036,
-0.17291435599327087,
0.02043823152780533,
-0.05038036033511162,
0.08386006951332092,
0.014025107957422733,
0.04220478609204292,
-0.004840019159018993,
0.07748525589704514,
-0.01855560578405857,
0.12808793783187866,
-0.29737377166748047,
0.2865842580795288,
-0.01012710202485323,
0.053525473922491074,
-0.08121137320995331,
0.02554846927523613,
-0.014446315355598927,
0.09746064245700836,
0.2709955871105194,
0.02706221491098404,
-0.10616706311702728,
-0.06878217309713364,
-0.01732451096177101,
0.003462771186605096,
0.11121214181184769,
0.013639683835208416,
0.06874939054250717,
0.004729495383799076,
0.03275630250573158,
-0.0072945114225149155,
0.11586722731590271,
-0.13285276293754578,
-0.09430736303329468,
0.02693282440304756,
-0.0005902772536501288,
-0.0891193151473999,
0.010255404748022556,
-0.05394336208701134,
-0.11409572511911392,
0.0890493243932724,
-0.10825794190168381,
-0.011350667104125023,
-0.10506631433963776,
0.07519563287496567,
0.010371123440563679,
-0.052380386739969254,
-0.07734676450490952,
-0.06443537771701813,
0.046696316450834274,
-0.0896696075797081,
0.035093970596790314,
0.03902412950992584,
-0.048669345676898956,
-0.06239250674843788,
-0.06731295585632324,
0.14045315980911255,
0.08735057711601257,
0.03456025943160057,
0.0021952157840132713,
0.023328667506575584,
-0.02259809710085392,
-0.18717965483665466,
0.08303411304950714,
-0.05244465917348862,
-0.021907921880483627,
0.03895164281129837,
-0.16054292023181915,
-0.07711809128522873,
-0.10360085219144821,
0.02050616778433323,
0.19734954833984375,
0.281703919172287,
-0.09538464993238449,
0.09073720127344131,
0.1160346195101738,
-0.09177602082490921,
-0.21033239364624023,
-0.011606321670114994,
0.028117770329117775,
0.04125901311635971,
0.020716091617941856,
-0.13783012330532074,
0.05317532643675804,
0.09156697988510132,
0.006425467785447836,
0.058720361441373825,
-0.28507503867149353,
-0.04176902025938034,
0.11898346245288849,
-0.010303123854100704,
-0.02181669883430004,
-0.06827034056186676,
-0.053767129778862,
-0.036995042115449905,
-0.08639046549797058,
0.07834042608737946,
0.023234333842992783,
0.07135344296693802,
0.00852221716195345,
0.002940323669463396,
0.00235059205442667,
-0.03993557393550873,
0.2124142348766327,
0.08153977990150452,
0.028903940692543983,
-0.050356119871139526,
-0.2093721181154251,
0.09946766495704651,
-0.0012667089467868209,
0.1095355823636055,
-0.020104771479964256,
-0.01807679608464241,
-0.21816086769104004,
-0.04760408028960228,
-0.11077464371919632,
0.1729581654071808,
-0.0905865952372551,
-0.08472093939781189,
-0.06263784319162369,
0.05353253707289696,
-0.027965446934103966,
-0.01842469535768032,
-0.06174496188759804,
-0.09425867348909378,
0.013116486370563507,
0.13452543318271637,
-0.009317141957581043,
0.14272183179855347,
-0.15742714703083038,
0.013250860385596752,
0.007901295088231564,
0.030781513080000877,
0.0041734399273991585,
-0.052660174667835236,
0.09536018967628479,
0.029970180243253708,
0.17863774299621582,
0.05181486904621124,
-0.05358077958226204,
0.014110603369772434,
0.030842138454318047,
-0.16836705803871155,
-0.09092903882265091,
-0.03971976041793823,
-0.09054659307003021,
-0.03522966802120209,
-0.06817728281021118,
0.11237261444330215,
-0.058432333171367645,
0.03375846892595291,
-0.03348436579108238,
0.03843693435192108,
-0.13328386843204498,
-0.0182456374168396,
0.13595980405807495,
0.053805071860551834,
-0.06099938973784447,
0.022069955244660378,
0.06230258196592331,
-0.08647166192531586,
0.025386599823832512,
0.08927716314792633,
-0.0741160586476326,
-0.05035557225346565,
-0.055852003395557404,
0.25212278962135315,
0.002485155826434493,
-0.08654436469078064,
-0.012478560209274292,
-0.08453109115362167,
0.014443491585552692,
0.09994199872016907,
0.1043563187122345,
-0.015889659523963928,
-0.1143878847360611,
-0.0246109776198864,
-0.10899487882852554,
0.071408711373806,
0.01867596060037613,
-0.052145157009363174,
-0.039491839706897736,
0.07088986039161682,
0.03338969871401787,
0.10820219665765762,
-0.04269056394696236,
-0.0885724276304245,
-0.16157007217407227,
0.01507399883121252,
-0.02247929759323597,
0.023369986563920975,
-0.006531170103698969,
0.017570670694112778,
0.012794976122677326,
-0.06913632899522781,
-0.00655630836263299,
0.0009633592562749982,
-0.04180445522069931,
0.02006601355969906,
0.030771156772971153,
0.09086988866329193,
-0.11055073142051697,
-0.03252612054347992,
0.08351702243089676,
-0.06406558305025101,
0.06682723015546799,
0.1116165891289711,
-0.09151753038167953,
0.011723286472260952,
-0.18996214866638184,
0.0747336745262146,
0.006721190642565489,
0.014740523882210255,
0.048040326684713364,
-0.12292000651359558,
0.0567970909178257,
-0.0048340135253965855,
0.026646161451935768,
0.0291077122092247,
0.09160348773002625,
-0.10610151290893555,
0.05533445253968239,
0.08711086213588715,
-0.16037730872631073,
-0.0154305724427104,
0.04594172164797783,
0.08312457799911499,
0.052031852304935455,
0.14675025641918182,
-0.025522267445921898,
0.15500329434871674,
-0.08021291345357895,
-0.009927927516400814,
-0.011741274036467075,
-0.02378021739423275,
-0.06481175869703293,
-0.054432183504104614,
0.056843388825654984,
-0.00695212185382843,
0.15363483130931854,
0.106279656291008,
0.05151715129613876,
0.008580857887864113,
0.0526885986328125,
-0.07530295848846436,
-0.012984998524188995,
0.08414556831121445,
-0.00046652639866806567,
-0.006104880012571812,
-0.0145065663382411,
-0.005253596697002649,
-0.028262397274374962,
0.044469788670539856,
0.17472070455551147,
0.1174127385020256,
0.07337664812803268,
0.0691879540681839,
0.09673512727022171,
-0.012840134091675282,
-0.08602400869131088,
-0.015742020681500435,
0.13969747722148895,
0.04092835634946823,
-0.10121326148509979,
0.10583733022212982,
0.02398528903722763,
-0.13009491562843323,
0.10779231041669846,
-0.024071330204606056,
-0.12843649089336395,
-0.09360121935606003,
-0.14594422280788422,
-0.002513404469937086,
-0.045515451580286026,
0.015219355002045631,
-0.07342991977930069,
0.0233951136469841,
0.10657239705324173,
-0.00030172415426932275,
-0.07450336217880249,
0.13890500366687775,
-0.10052287578582764,
-0.046457163989543915,
0.07628848403692245,
-0.003831435926258564,
0.08141548931598663,
-0.12124442309141159,
0.030484087765216827,
-0.007553652860224247,
-0.03678925335407257,
0.02595188282430172,
0.05634414404630661,
0.07205834239721298,
-0.09078286588191986,
-0.1647803783416748,
-0.04911109432578087,
-0.008396687917411327,
0.014166361652314663,
-0.04508893936872482,
0.220604807138443,
0.03887951374053955,
-0.017881128937005997,
-0.0031764714512974024,
0.09177324920892715,
-0.014331053011119366,
-0.1518728882074356,
-0.15894629061222076,
0.23094096779823303,
-0.08281504362821579,
0.054272327572107315,
-0.07226412743330002,
-0.05656775087118149,
0.0032781551126390696,
0.20716965198516846,
0.17090293765068054,
-0.044723913073539734,
0.000808430602774024,
0.046354733407497406,
0.03426588326692581,
0.06391479074954987,
0.05013500899076462,
0.06266828626394272,
0.20107071101665497,
-0.06217397749423981,
0.03116804175078869,
-0.15385377407073975,
-0.03891938179731369,
-0.03127703815698624,
0.05345260351896286,
0.06294560432434082,
-0.062118351459503174,
-0.10149741917848587,
0.18302014470100403,
-0.09768680483102798,
-0.11771786957979202,
0.036863118410110474,
-0.04377567395567894,
-0.12017152458429337,
0.010110844857990742,
0.0628480315208435,
0.03612038865685463,
0.09348275512456894,
-0.05345597863197327,
-0.07800760120153427,
0.17600764334201813,
-0.010476944036781788,
-0.04541082680225372,
-0.03663362190127373,
0.06506641954183578,
-0.1606539934873581,
0.16746363043785095,
0.0056464700028300285,
0.17847168445587158,
0.09330553561449051,
0.06372460722923279,
-0.014174547977745533,
0.09469445794820786,
0.06459193676710129,
0.053659625351428986,
-0.05319172143936157,
0.013758210465312004,
-0.03410036116838455,
-0.04546844959259033,
0.02720591239631176,
-0.127155140042305,
0.07588998973369598,
0.07947952300310135,
-0.04727035388350487,
-0.08334808051586151,
0.13701516389846802,
-0.11191265285015106,
0.10909269750118256,
0.1788807511329651,
-0.0010090359719470143,
-0.04326869547367096,
-0.07740757614374161,
0.05495963245630264,
0.04936456307768822,
0.0482381172478199,
0.009784197434782982,
-0.13386431336402893,
0.011407796293497086,
0.06641896814107895,
0.0074300398118793964,
-0.1431928128004074,
-0.10247389227151871,
0.038996923714876175,
0.011397022753953934,
0.025043241679668427,
0.036201801151037216,
-0.01739087887108326,
0.028750982135534286,
-0.031635481864213943,
0.013580176047980785,
-0.002938572783023119,
0.11783294379711151,
-0.07402995228767395,
-0.038732483983039856
] |
null | null |
flair
|
## Multilingual Universal Part-of-Speech Tagging in Flair (default model)
This is the default multilingual universal part-of-speech tagging model that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98,47** (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)
Predicts universal POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/upos-multi")
# make example sentence
sentence = Sentence("Ich liebe Berlin, as they say. ")
# predict POS tags
tagger.predict(sentence)
# print sentence
print(sentence)
# iterate over tokens and print the predicted POS label
print("The following POS tags are found:")
for token in sentence:
print(token.get_label("upos"))
```
This yields the following output:
```
Token[0]: "Ich" → PRON (0.9999)
Token[1]: "liebe" → VERB (0.9999)
Token[2]: "Berlin" → PROPN (0.9997)
Token[3]: "," → PUNCT (1.0)
Token[4]: "as" → SCONJ (0.9991)
Token[5]: "they" → PRON (0.9998)
Token[6]: "say" → VERB (0.9998)
Token[7]: "." → PUNCT (1.0)
```
So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import MultiCorpus
from flair.datasets import UD_ENGLISH, UD_GERMAN, UD_FRENCH, UD_ITALIAN, UD_POLISH, UD_DUTCH, UD_CZECH, \
UD_DANISH, UD_SPANISH, UD_SWEDISH, UD_NORWEGIAN, UD_FINNISH
from flair.embeddings import StackedEmbeddings, FlairEmbeddings
# 1. make a multi corpus consisting of 12 UD treebanks (in_memory=False here because this corpus becomes large)
corpus = MultiCorpus([
UD_ENGLISH(in_memory=False),
UD_GERMAN(in_memory=False),
UD_DUTCH(in_memory=False),
UD_FRENCH(in_memory=False),
UD_ITALIAN(in_memory=False),
UD_SPANISH(in_memory=False),
UD_POLISH(in_memory=False),
UD_CZECH(in_memory=False),
UD_DANISH(in_memory=False),
UD_SWEDISH(in_memory=False),
UD_NORWEGIAN(in_memory=False),
UD_FINNISH(in_memory=False),
])
# 2. what tag do we want to predict?
tag_type = 'upos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('multi-forward'),
# contextual string embeddings, backward
FlairEmbeddings('multi-backward'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type,
use_crf=False)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/upos-multi',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
|
{"language": ["en", "de", "fr", "it", "nl", "pl", "es", "sv", "da", false, "fi", "cs"], "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["ontonotes"], "widget": [{"text": "Ich liebe Berlin, as they say"}]}
|
token-classification
|
flair/upos-multi
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"de",
"fr",
"it",
"nl",
"pl",
"es",
"sv",
"da",
"no",
"fi",
"cs",
"dataset:ontonotes",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en",
"de",
"fr",
"it",
"nl",
"pl",
"es",
"sv",
"da",
"no",
"fi",
"cs"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us
|
Multilingual Universal Part-of-Speech Tagging in Flair (default model)
----------------------------------------------------------------------
This is the default multilingual universal part-of-speech tagging model that ships with Flair.
F1-Score: 98,47 (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)
Predicts universal POS tags:
Based on Flair embeddings and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: Flair ('pip install flair')
This yields the following output:
So, the words "*Ich*" and "*they*" are labeled as pronouns (PRON), while "*liebe*" and "*say*" are labeled as verbs (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
---
### Cite
Please cite the following paper when using this model.
---
### Issues?
The Flair issue tracker is available here.
|
[
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us \n",
"### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---",
"### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---",
"### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---",
"### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
59,
101,
22,
15,
15
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #en #de #fr #it #nl #pl #es #sv #da #no #fi #cs #dataset-ontonotes #region-us \n### Demo: How to use in Flair\n\n\nRequires: Flair ('pip install flair')\n\n\nThis yields the following output:\n\n\nSo, the words \"*Ich*\" and \"*they*\" are labeled as pronouns (PRON), while \"*liebe*\" and \"*say*\" are labeled as verbs (VERB) in the multilingual sentence \"*Ich liebe Berlin, as they say*\".\n\n\n\n\n---### Training: Script to train this model\n\n\nThe following Flair script was used to train this model:\n\n\n\n\n---### Cite\n\n\nPlease cite the following paper when using this model.\n\n\n\n\n---### Issues?\n\n\nThe Flair issue tracker is available here."
] |
[
-0.08705111593008041,
-0.06974589824676514,
-0.0029402044601738453,
0.13343004882335663,
0.11380794644355774,
0.04046819359064102,
0.1234075129032135,
0.09400173276662827,
0.08065958321094513,
0.08943095058202744,
0.14182615280151367,
0.039049942046403885,
0.07930684834718704,
0.10644066333770752,
-0.01729878969490528,
-0.2673886716365814,
0.01436537317931652,
-0.053039174526929855,
0.03259987384080887,
0.14099104702472687,
0.14983488619327545,
0.0181419774889946,
0.03364994749426842,
-0.005140720866620541,
-0.08035092055797577,
-0.033900413662195206,
0.013763324357569218,
-0.04290793836116791,
0.13254331052303314,
0.03713918477296829,
0.12139558792114258,
-0.02358778938651085,
0.09159675240516663,
-0.20092029869556427,
0.018963128328323364,
0.022806446999311447,
-0.023296255618333817,
0.027669893577694893,
-0.010314518585801125,
-0.019491391256451607,
0.26680538058280945,
-0.05961274355649948,
-0.005362371914088726,
0.04460129886865616,
-0.16420994699001312,
-0.20219846069812775,
-0.09917870163917542,
0.07012404501438141,
0.000780441623646766,
0.08860646188259125,
-0.04734252765774727,
0.1386774629354477,
-0.1079372987151146,
0.06303069740533829,
0.09271877259016037,
-0.18567509949207306,
-0.07767379283905029,
0.06374889612197876,
0.09728702157735825,
0.04479850083589554,
-0.13468341529369354,
0.022231008857488632,
0.015567345544695854,
0.07337559759616852,
-0.056193504482507706,
-0.0464339405298233,
0.07506085187196732,
0.025755656883120537,
-0.17265266180038452,
-0.07514820992946625,
0.2782323956489563,
0.057655248790979385,
-0.08449651300907135,
-0.060215216130018234,
-0.025447869673371315,
-0.03713280335068703,
0.03233492374420166,
-0.07263645529747009,
-0.06051342934370041,
0.024095546454191208,
0.19349494576454163,
-0.0956026166677475,
-0.10124565660953522,
-0.04626401141285896,
0.05712262913584709,
0.17984230816364288,
-0.022176319733262062,
0.04624127969145775,
-0.0943414643406868,
0.04802579805254936,
-0.0613694004714489,
-0.09749160706996918,
0.03429834917187691,
-0.09257692098617554,
-0.041234906762838364,
0.016179077327251434,
-0.04098982363939285,
-0.023932194337248802,
0.045141175389289856,
0.09791187942028046,
0.006479648873209953,
-0.005071300081908703,
-0.042382240295410156,
0.08820044249296188,
0.05148367956280708,
0.18538452684879303,
-0.04178445041179657,
-0.01356926653534174,
-0.0018817429663613439,
-0.004270296078175306,
0.0262509323656559,
-0.04581872746348381,
-0.14908021688461304,
-0.02416214719414711,
-0.05663151293992996,
0.013190428726375103,
-0.035801902413368225,
0.026816459372639656,
-0.0769190564751625,
-0.0004179490206297487,
0.010339428670704365,
-0.10791890323162079,
0.011428655125200748,
0.05422237515449524,
-0.026629900559782982,
0.1418619602918625,
0.035250771790742874,
-0.02102135308086872,
-0.03654438257217407,
0.07169277220964432,
-0.04483336582779884,
0.06797372549772263,
-0.07833654433488846,
-0.09884674102067947,
0.007542961742728949,
-0.08391544222831726,
-0.016693497076630592,
-0.0686226636171341,
-0.0633910521864891,
-0.024519434198737144,
0.04136720299720764,
-0.03742571175098419,
0.015127215534448624,
-0.06820674985647202,
-0.08250764012336731,
-0.007874137721955776,
0.03354879468679428,
-0.04853960871696472,
-0.046268776059150696,
0.07736379653215408,
0.02545568346977234,
0.04945754259824753,
-0.14167985320091248,
0.010609478689730167,
-0.10058707743883133,
0.020462408661842346,
-0.16475240886211395,
0.01099385879933834,
-0.07889459282159805,
-0.0855245441198349,
-0.06626423448324203,
-0.10348190367221832,
0.01262164581567049,
0.02284119836986065,
0.02937336452305317,
0.19363512098789215,
-0.3112894296646118,
-0.004099764861166477,
0.1954658031463623,
-0.12762749195098877,
-0.02764982171356678,
0.1809024214744568,
-0.051456812769174576,
0.13623753190040588,
0.012626494280993938,
0.15531621873378754,
0.055707722902297974,
-0.30688008666038513,
0.035481810569763184,
0.05645043030381203,
-0.14251159131526947,
0.11074382066726685,
0.11394944041967392,
-0.03203630819916725,
-0.12062140554189682,
0.02993321605026722,
-0.12675727903842926,
0.037419576197862625,
-0.013634638860821724,
-0.07312754541635513,
-0.03406817838549614,
0.05418349802494049,
0.06587903201580048,
0.007699925918132067,
-0.10209636390209198,
-0.03933915123343468,
-0.11873791366815567,
-0.10764838755130768,
0.03420544043183327,
0.014554808847606182,
0.02803596295416355,
-0.05252809077501297,
0.1247745156288147,
0.06297074258327484,
-0.015049824491143227,
-0.14358030259609222,
0.0015940126031637192,
-0.00772726908326149,
0.008642273023724556,
0.06612740457057953,
0.10314267128705978,
0.021073132753372192,
0.0665469765663147,
0.0019509571138769388,
0.03383135050535202,
-0.12952382862567902,
-0.000002904607072196086,
0.0076846168376505375,
-0.11558974534273148,
-0.0337667316198349,
-0.08223187178373337,
0.13785061240196228,
-0.181343674659729,
0.023228434845805168,
0.06841281801462173,
0.057989004999399185,
-0.03591005876660347,
-0.009641778655350208,
-0.038715701550245285,
0.05066358670592308,
-0.026871278882026672,
-0.0006881244480609894,
0.07697766274213791,
0.012853976339101791,
-0.05500398948788643,
-0.09289027005434036,
-0.017231928184628487,
-0.19523866474628448,
0.0769948810338974,
-0.03185148537158966,
-0.13979074358940125,
-0.12694452702999115,
-0.06385914981365204,
0.007585304323583841,
-0.05381587892770767,
-0.06036067008972168,
0.14544373750686646,
0.05543679744005203,
0.04000027850270271,
-0.07388129085302353,
0.05560673773288727,
0.03483403101563454,
-0.09168213605880737,
-0.06836552917957306,
0.11795546859502792,
0.14447811245918274,
0.021201390773057938,
0.08850517868995667,
0.10569603741168976,
-0.015243145637214184,
0.16119271516799927,
0.008890695869922638,
-0.048671212047338486,
-0.13823813199996948,
0.05440310016274452,
-0.0036995878908783197,
0.20138950645923615,
-0.17289450764656067,
0.05466026812791824,
0.04284132644534111,
-0.009352241642773151,
-0.008039792068302631,
-0.138696551322937,
-0.059188973158597946,
-0.026452578604221344,
-0.07139705866575241,
-0.03354692831635475,
0.05774153396487236,
-0.007147215772420168,
0.1097869873046875,
-0.01004873774945736,
-0.13824455440044403,
0.06827925145626068,
-0.0068110516294837,
-0.09273705631494522,
0.10718333721160889,
-0.0930112823843956,
-0.2507244944572449,
-0.15774023532867432,
-0.0817408412694931,
-0.10432083159685135,
0.0020375631283968687,
0.025199977681040764,
-0.06138137727975845,
0.0011386392870917916,
0.018133951351046562,
0.1570911556482315,
-0.03255407512187958,
-0.06889273226261139,
-0.06945717334747314,
0.02646620385348797,
-0.023347022011876106,
-0.08644875884056091,
-0.05947257950901985,
-0.028527280315756798,
0.06239132955670357,
0.03170974925160408,
-0.0016626653959974647,
0.12264080345630646,
0.09424838423728943,
0.010677866637706757,
0.048036012798547745,
-0.017986048012971878,
0.29898470640182495,
-0.09035682678222656,
0.06851483136415482,
0.11065705120563507,
-0.10387775301933289,
0.06286371499300003,
0.1687413454055786,
0.0659724771976471,
-0.06224358081817627,
-0.016697783023118973,
-0.03312095254659653,
-0.08447825163602829,
-0.17416033148765564,
-0.1488671898841858,
-0.08523491770029068,
0.015901021659374237,
-0.0348709374666214,
0.030206769704818726,
0.03305324539542198,
0.001882388023659587,
0.03123004175722599,
-0.10513836145401001,
-0.016591662541031837,
0.04269026219844818,
0.13650953769683838,
-0.04470813274383545,
0.011284707114100456,
0.002834755228832364,
-0.06789655983448029,
0.07686693221330643,
0.04885580763220787,
0.1012224331498146,
0.10113757103681564,
0.0017976294038817286,
0.10067527741193771,
0.09956087917089462,
0.029263805598020554,
0.0839005708694458,
0.02926507219672203,
-0.02299341931939125,
-0.028045546263456345,
-0.09606859087944031,
0.0005080642877146602,
0.06855929642915726,
0.05710052326321602,
0.027008401229977608,
-0.057866647839546204,
-0.013192396610975266,
0.09649109095335007,
0.031890928745269775,
0.06236094608902931,
-0.10781780630350113,
-0.02110930159687996,
0.000006720773399138125,
0.06366812437772751,
-0.07920460402965546,
0.030083345249295235,
0.09758997708559036,
-0.17291435599327087,
0.02043823152780533,
-0.05038036033511162,
0.08386006951332092,
0.014025107957422733,
0.04220478609204292,
-0.004840019159018993,
0.07748525589704514,
-0.01855560578405857,
0.12808793783187866,
-0.29737377166748047,
0.2865842580795288,
-0.01012710202485323,
0.053525473922491074,
-0.08121137320995331,
0.02554846927523613,
-0.014446315355598927,
0.09746064245700836,
0.2709955871105194,
0.02706221491098404,
-0.10616706311702728,
-0.06878217309713364,
-0.01732451096177101,
0.003462771186605096,
0.11121214181184769,
0.013639683835208416,
0.06874939054250717,
0.004729495383799076,
0.03275630250573158,
-0.0072945114225149155,
0.11586722731590271,
-0.13285276293754578,
-0.09430736303329468,
0.02693282440304756,
-0.0005902772536501288,
-0.0891193151473999,
0.010255404748022556,
-0.05394336208701134,
-0.11409572511911392,
0.0890493243932724,
-0.10825794190168381,
-0.011350667104125023,
-0.10506631433963776,
0.07519563287496567,
0.010371123440563679,
-0.052380386739969254,
-0.07734676450490952,
-0.06443537771701813,
0.046696316450834274,
-0.0896696075797081,
0.035093970596790314,
0.03902412950992584,
-0.048669345676898956,
-0.06239250674843788,
-0.06731295585632324,
0.14045315980911255,
0.08735057711601257,
0.03456025943160057,
0.0021952157840132713,
0.023328667506575584,
-0.02259809710085392,
-0.18717965483665466,
0.08303411304950714,
-0.05244465917348862,
-0.021907921880483627,
0.03895164281129837,
-0.16054292023181915,
-0.07711809128522873,
-0.10360085219144821,
0.02050616778433323,
0.19734954833984375,
0.281703919172287,
-0.09538464993238449,
0.09073720127344131,
0.1160346195101738,
-0.09177602082490921,
-0.21033239364624023,
-0.011606321670114994,
0.028117770329117775,
0.04125901311635971,
0.020716091617941856,
-0.13783012330532074,
0.05317532643675804,
0.09156697988510132,
0.006425467785447836,
0.058720361441373825,
-0.28507503867149353,
-0.04176902025938034,
0.11898346245288849,
-0.010303123854100704,
-0.02181669883430004,
-0.06827034056186676,
-0.053767129778862,
-0.036995042115449905,
-0.08639046549797058,
0.07834042608737946,
0.023234333842992783,
0.07135344296693802,
0.00852221716195345,
0.002940323669463396,
0.00235059205442667,
-0.03993557393550873,
0.2124142348766327,
0.08153977990150452,
0.028903940692543983,
-0.050356119871139526,
-0.2093721181154251,
0.09946766495704651,
-0.0012667089467868209,
0.1095355823636055,
-0.020104771479964256,
-0.01807679608464241,
-0.21816086769104004,
-0.04760408028960228,
-0.11077464371919632,
0.1729581654071808,
-0.0905865952372551,
-0.08472093939781189,
-0.06263784319162369,
0.05353253707289696,
-0.027965446934103966,
-0.01842469535768032,
-0.06174496188759804,
-0.09425867348909378,
0.013116486370563507,
0.13452543318271637,
-0.009317141957581043,
0.14272183179855347,
-0.15742714703083038,
0.013250860385596752,
0.007901295088231564,
0.030781513080000877,
0.0041734399273991585,
-0.052660174667835236,
0.09536018967628479,
0.029970180243253708,
0.17863774299621582,
0.05181486904621124,
-0.05358077958226204,
0.014110603369772434,
0.030842138454318047,
-0.16836705803871155,
-0.09092903882265091,
-0.03971976041793823,
-0.09054659307003021,
-0.03522966802120209,
-0.06817728281021118,
0.11237261444330215,
-0.058432333171367645,
0.03375846892595291,
-0.03348436579108238,
0.03843693435192108,
-0.13328386843204498,
-0.0182456374168396,
0.13595980405807495,
0.053805071860551834,
-0.06099938973784447,
0.022069955244660378,
0.06230258196592331,
-0.08647166192531586,
0.025386599823832512,
0.08927716314792633,
-0.0741160586476326,
-0.05035557225346565,
-0.055852003395557404,
0.25212278962135315,
0.002485155826434493,
-0.08654436469078064,
-0.012478560209274292,
-0.08453109115362167,
0.014443491585552692,
0.09994199872016907,
0.1043563187122345,
-0.015889659523963928,
-0.1143878847360611,
-0.0246109776198864,
-0.10899487882852554,
0.071408711373806,
0.01867596060037613,
-0.052145157009363174,
-0.039491839706897736,
0.07088986039161682,
0.03338969871401787,
0.10820219665765762,
-0.04269056394696236,
-0.0885724276304245,
-0.16157007217407227,
0.01507399883121252,
-0.02247929759323597,
0.023369986563920975,
-0.006531170103698969,
0.017570670694112778,
0.012794976122677326,
-0.06913632899522781,
-0.00655630836263299,
0.0009633592562749982,
-0.04180445522069931,
0.02006601355969906,
0.030771156772971153,
0.09086988866329193,
-0.11055073142051697,
-0.03252612054347992,
0.08351702243089676,
-0.06406558305025101,
0.06682723015546799,
0.1116165891289711,
-0.09151753038167953,
0.011723286472260952,
-0.18996214866638184,
0.0747336745262146,
0.006721190642565489,
0.014740523882210255,
0.048040326684713364,
-0.12292000651359558,
0.0567970909178257,
-0.0048340135253965855,
0.026646161451935768,
0.0291077122092247,
0.09160348773002625,
-0.10610151290893555,
0.05533445253968239,
0.08711086213588715,
-0.16037730872631073,
-0.0154305724427104,
0.04594172164797783,
0.08312457799911499,
0.052031852304935455,
0.14675025641918182,
-0.025522267445921898,
0.15500329434871674,
-0.08021291345357895,
-0.009927927516400814,
-0.011741274036467075,
-0.02378021739423275,
-0.06481175869703293,
-0.054432183504104614,
0.056843388825654984,
-0.00695212185382843,
0.15363483130931854,
0.106279656291008,
0.05151715129613876,
0.008580857887864113,
0.0526885986328125,
-0.07530295848846436,
-0.012984998524188995,
0.08414556831121445,
-0.00046652639866806567,
-0.006104880012571812,
-0.0145065663382411,
-0.005253596697002649,
-0.028262397274374962,
0.044469788670539856,
0.17472070455551147,
0.1174127385020256,
0.07337664812803268,
0.0691879540681839,
0.09673512727022171,
-0.012840134091675282,
-0.08602400869131088,
-0.015742020681500435,
0.13969747722148895,
0.04092835634946823,
-0.10121326148509979,
0.10583733022212982,
0.02398528903722763,
-0.13009491562843323,
0.10779231041669846,
-0.024071330204606056,
-0.12843649089336395,
-0.09360121935606003,
-0.14594422280788422,
-0.002513404469937086,
-0.045515451580286026,
0.015219355002045631,
-0.07342991977930069,
0.0233951136469841,
0.10657239705324173,
-0.00030172415426932275,
-0.07450336217880249,
0.13890500366687775,
-0.10052287578582764,
-0.046457163989543915,
0.07628848403692245,
-0.003831435926258564,
0.08141548931598663,
-0.12124442309141159,
0.030484087765216827,
-0.007553652860224247,
-0.03678925335407257,
0.02595188282430172,
0.05634414404630661,
0.07205834239721298,
-0.09078286588191986,
-0.1647803783416748,
-0.04911109432578087,
-0.008396687917411327,
0.014166361652314663,
-0.04508893936872482,
0.220604807138443,
0.03887951374053955,
-0.017881128937005997,
-0.0031764714512974024,
0.09177324920892715,
-0.014331053011119366,
-0.1518728882074356,
-0.15894629061222076,
0.23094096779823303,
-0.08281504362821579,
0.054272327572107315,
-0.07226412743330002,
-0.05656775087118149,
0.0032781551126390696,
0.20716965198516846,
0.17090293765068054,
-0.044723913073539734,
0.000808430602774024,
0.046354733407497406,
0.03426588326692581,
0.06391479074954987,
0.05013500899076462,
0.06266828626394272,
0.20107071101665497,
-0.06217397749423981,
0.03116804175078869,
-0.15385377407073975,
-0.03891938179731369,
-0.03127703815698624,
0.05345260351896286,
0.06294560432434082,
-0.062118351459503174,
-0.10149741917848587,
0.18302014470100403,
-0.09768680483102798,
-0.11771786957979202,
0.036863118410110474,
-0.04377567395567894,
-0.12017152458429337,
0.010110844857990742,
0.0628480315208435,
0.03612038865685463,
0.09348275512456894,
-0.05345597863197327,
-0.07800760120153427,
0.17600764334201813,
-0.010476944036781788,
-0.04541082680225372,
-0.03663362190127373,
0.06506641954183578,
-0.1606539934873581,
0.16746363043785095,
0.0056464700028300285,
0.17847168445587158,
0.09330553561449051,
0.06372460722923279,
-0.014174547977745533,
0.09469445794820786,
0.06459193676710129,
0.053659625351428986,
-0.05319172143936157,
0.013758210465312004,
-0.03410036116838455,
-0.04546844959259033,
0.02720591239631176,
-0.127155140042305,
0.07588998973369598,
0.07947952300310135,
-0.04727035388350487,
-0.08334808051586151,
0.13701516389846802,
-0.11191265285015106,
0.10909269750118256,
0.1788807511329651,
-0.0010090359719470143,
-0.04326869547367096,
-0.07740757614374161,
0.05495963245630264,
0.04936456307768822,
0.0482381172478199,
0.009784197434782982,
-0.13386431336402893,
0.011407796293497086,
0.06641896814107895,
0.0074300398118793964,
-0.1431928128004074,
-0.10247389227151871,
0.038996923714876175,
0.011397022753953934,
0.025043241679668427,
0.036201801151037216,
-0.01739087887108326,
0.028750982135534286,
-0.031635481864213943,
0.013580176047980785,
-0.002938572783023119,
0.11783294379711151,
-0.07402995228767395,
-0.038732483983039856
] |
null | null |
flair
|
## Test model README
Some test README description
|
{"tags": ["flair", "token-classification"], "widget": [{"text": "does this work"}]}
|
token-classification
|
flairbook/flairmodel
|
[
"flair",
"pytorch",
"token-classification",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#flair #pytorch #token-classification #region-us
|
## Test model README
Some test README description
|
[
"## Test model README\nSome test README description"
] |
[
"TAGS\n#flair #pytorch #token-classification #region-us \n",
"## Test model README\nSome test README description"
] |
[
19,
10
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #region-us \n## Test model README\nSome test README description"
] |
[
-0.11308077722787857,
-0.01656332239508629,
-0.007916594855487347,
0.03041435033082962,
0.06971556693315506,
0.08721547573804855,
0.09208288043737411,
0.08502449840307236,
0.1892099529504776,
0.007454387843608856,
0.10683540254831314,
0.08785640448331833,
-0.044069040566682816,
0.09052418172359467,
-0.03874298557639122,
-0.2347431182861328,
0.039679598063230515,
0.01965559646487236,
0.030266137793660164,
0.17573939263820648,
0.027123723179101944,
-0.09404817968606949,
0.04726940765976906,
-0.03817085921764374,
0.05426694080233574,
0.010162838734686375,
-0.04899922385811806,
-0.08070073276758194,
0.11825799196958542,
-0.09751284122467041,
0.14213402569293976,
-0.001217600074596703,
-0.03331335633993149,
-0.16097794473171234,
0.03740919381380081,
-0.003639115020632744,
-0.008811979554593563,
0.06443936377763748,
0.09869976341724396,
-0.03802308440208435,
0.12476006150245667,
0.0957443118095398,
0.0842539444565773,
0.0171030443161726,
-0.09583329409360886,
-0.18435586988925934,
-0.05233621597290039,
0.03701166808605194,
0.041818514466285706,
0.09847034513950348,
-0.031792234629392624,
0.27570655941963196,
-0.24688321352005005,
0.047785088419914246,
0.20115530490875244,
-0.15563736855983734,
-0.017779750749468803,
0.22986438870429993,
0.08231793344020844,
0.09702572226524353,
-0.08596006780862808,
0.021545764058828354,
0.05114690586924553,
0.054298270493745804,
-0.16361165046691895,
-0.057628072798252106,
0.002759644528850913,
0.09105531126260757,
-0.16051003336906433,
-0.07151646167039871,
0.25743237137794495,
-0.01929253712296486,
-0.0011169321369379759,
-0.04386420175433159,
-0.03483372554183006,
0.0145464688539505,
0.05797697231173515,
-0.046906836330890656,
-0.018918631598353386,
-0.00775892985984683,
0.11637778580188751,
0.016801638528704643,
-0.07749594748020172,
-0.11230149120092392,
-0.1417669653892517,
0.1656215488910675,
0.02559426613152027,
0.08005545288324356,
-0.17317599058151245,
0.08189776539802551,
0.03595671057701111,
-0.058976322412490845,
-0.006676849443465471,
-0.12500382959842682,
0.016684584319591522,
-0.030671877786517143,
-0.05612558498978615,
-0.028734495863318443,
0.12860481441020966,
0.08434142172336578,
-0.03802172839641571,
0.11012603342533112,
-0.002387549728155136,
0.09967362880706787,
0.05134853348135948,
0.10586333274841309,
0.07808136940002441,
0.08102797716856003,
-0.05827450752258301,
0.01787547953426838,
-0.041793640702962875,
-0.06347310543060303,
-0.015395758673548698,
-0.0157475583255291,
0.0600113570690155,
0.11206947267055511,
-0.03978660702705383,
-0.08768467605113983,
-0.054922789335250854,
-0.049827691167593,
0.09768771380186081,
-0.060294412076473236,
-0.018070489168167114,
0.14173442125320435,
-0.03450809046626091,
0.033247895538806915,
-0.03346730023622513,
-0.06500875949859619,
-0.0007345884223468602,
0.08533928543329239,
-0.07992865145206451,
-0.002963310806080699,
-0.03875809907913208,
-0.019752761349081993,
0.01034665759652853,
-0.05772107094526291,
0.015938831493258476,
-0.07751895487308502,
-0.16529828310012817,
-0.056691791862249374,
-0.020254353061318398,
0.007832209579646587,
0.04783652350306511,
0.012302888557314873,
0.005912972614169121,
-0.07746301591396332,
0.02132393978536129,
0.01553402841091156,
-0.07124543935060501,
0.1318616420030594,
0.005418602377176285,
-0.0005554805393330753,
0.0542326420545578,
0.013475975021719933,
-0.166856050491333,
0.033470772206783295,
-0.11423531174659729,
-0.013697165064513683,
-0.1454470455646515,
0.07280910760164261,
-0.054981719702482224,
-0.14733311533927917,
-0.0865374431014061,
-0.026036405935883522,
-0.03516294062137604,
0.2271091789007187,
-0.15954133868217468,
-0.07167476415634155,
0.06908895820379257,
-0.1627880185842514,
-0.11882244050502777,
0.043462879955768585,
-0.022381016984581947,
0.047558337450027466,
0.05029818415641785,
0.27791306376457214,
0.15394185483455658,
-0.10499629378318787,
0.10910210013389587,
0.15751735866069794,
-0.025217346847057343,
-0.031760312616825104,
0.11028601974248886,
-0.09044149518013,
-0.19428934156894684,
0.051019225269556046,
-0.06949801743030548,
-0.057968009263277054,
-0.05142669752240181,
-0.05434617027640343,
0.0034966066014021635,
-0.016983933746814728,
0.2019866555929184,
0.09274562448263168,
0.0831335037946701,
-0.09317304939031601,
0.00819638092070818,
-0.03769126161932945,
0.12456037104129791,
0.06033815070986748,
-0.029694756492972374,
-0.02564673125743866,
0.19130173325538635,
0.015359828248620033,
-0.05233205109834671,
-0.2037455290555954,
-0.0640963762998581,
0.040519941598176956,
0.0043898215517401695,
-0.01549469493329525,
0.15457509458065033,
0.10040044784545898,
-0.0669240728020668,
0.009783459827303886,
-0.038211554288864136,
-0.0007688644691370428,
-0.012665129266679287,
-0.10963291674852371,
-0.12352794408798218,
-0.00550314225256443,
-0.05104219168424606,
0.008228457532823086,
-0.1756768375635147,
-0.021268421784043312,
0.01008185837417841,
0.02461445890367031,
-0.03536604717373848,
-0.0017333250725641847,
0.07123807072639465,
-0.0005086341989226639,
0.024428490549325943,
-0.0052622416988015175,
0.059741146862506866,
-0.07074189186096191,
-0.0774977058172226,
-0.002189476042985916,
-0.0529455691576004,
0.05278891697525978,
0.12160191684961319,
-0.09503234922885895,
-0.014669623225927353,
-0.07474444061517715,
-0.05988496541976929,
0.022625427693128586,
-0.05874758958816528,
0.04269444942474365,
0.14165878295898438,
0.033995628356933594,
0.04372953623533249,
-0.07250835746526718,
-0.07353148609399796,
-0.02552074007689953,
-0.01153199840337038,
-0.005826093256473541,
0.21602903306484222,
0.043149761855602264,
-0.09873434156179428,
0.05586076155304909,
0.1752813160419464,
-0.022709090262651443,
0.09176665544509888,
-0.07916892319917679,
-0.07716772705316544,
0.05347762629389763,
0.0006342821288853884,
-0.06791175901889801,
0.14287061989307404,
-0.172483429312706,
0.09871445596218109,
0.08674561232328415,
0.06300988048315048,
0.07133207470178604,
-0.119688980281353,
-0.10226850211620331,
-0.013522251509130001,
-0.0062535847537219524,
-0.16888035833835602,
0.08428099006414413,
0.035210832953453064,
0.06693949550390244,
-0.0289009939879179,
-0.0793747678399086,
0.04906727746129036,
-0.024376899003982544,
-0.14548207819461823,
0.23059780895709991,
-0.07614704966545105,
-0.17757734656333923,
-0.05192829668521881,
0.0869792252779007,
0.005453256890177727,
-0.04275801405310631,
0.05580931901931763,
-0.17567603290081024,
0.0014796190662309527,
0.0025273950304836035,
-0.01069837436079979,
-0.09865739941596985,
-0.00023955557844601572,
-0.09951527416706085,
0.042989954352378845,
-0.0645831897854805,
-0.08449146151542664,
-0.007977028377354145,
-0.09753967076539993,
0.059955619275569916,
0.09695345163345337,
-0.14568772912025452,
0.0014527600724250078,
0.20729123055934906,
0.03378980979323387,
0.06821601092815399,
-0.014100383967161179,
0.2998522222042084,
-0.11646204441785812,
-0.05461667850613594,
0.15080422163009644,
-0.058592669665813446,
0.030360929667949677,
0.027750158682465553,
0.023747004568576813,
-0.0649774968624115,
0.014687876217067242,
-0.033614084124565125,
-0.06733842194080353,
-0.3488038182258606,
-0.08918686956167221,
-0.038628995418548584,
0.07655642181634903,
-0.019396396353840828,
0.06497710198163986,
0.02840433642268181,
0.06556114554405212,
0.0807749554514885,
-0.17485520243644714,
-0.05990574508905411,
-0.041174713522195816,
0.12694841623306274,
-0.03094988502562046,
0.0060103717260062695,
-0.09198112040758133,
-0.05787896737456322,
0.08839794248342514,
0.0018156897276639938,
0.15427473187446594,
0.0837930217385292,
-0.028634773567318916,
0.06347477436065674,
0.08478830754756927,
0.09656881541013718,
0.09773866087198257,
-0.012340742163360119,
-0.011761450208723545,
0.005058911629021168,
-0.06771117448806763,
-0.02672644518315792,
0.0013548929709941149,
0.056999459862709045,
-0.06807637959718704,
-0.09494412690401077,
-0.10527195781469345,
0.07640162110328674,
-0.08828192949295044,
0.09232888370752335,
-0.11249420791864395,
-0.008526998572051525,
0.01417937409132719,
0.014945066533982754,
-0.04357561096549034,
0.0838211327791214,
-0.012661337852478027,
-0.18945397436618805,
0.040761303156614304,
-0.008341284468770027,
0.11382494121789932,
0.013580547645688057,
0.03660972788929939,
-0.07582467049360275,
-0.12786218523979187,
-0.02750321477651596,
0.1611238569021225,
-0.1451503336429596,
0.2579284906387329,
-0.03135104849934578,
-0.05207231268286705,
-0.07514439523220062,
-0.08790787309408188,
0.05861316993832588,
0.1616879105567932,
0.1235339418053627,
0.02892005257308483,
-0.06881368160247803,
-0.09307331591844559,
-0.0862969383597374,
0.03890462592244148,
0.0472002811729908,
-0.031709425151348114,
-0.02054363675415516,
-0.0034668550360947847,
0.04994013160467148,
-0.07787653803825378,
-0.02320883981883526,
-0.002233303850516677,
0.09888854622840881,
0.02592664584517479,
-0.10515963286161423,
-0.09531772136688232,
-0.029064089059829712,
-0.0868891179561615,
-0.029476964846253395,
0.04236914590001106,
-0.0385182648897171,
-0.07490235567092896,
-0.05941825360059738,
-0.08408236503601074,
0.20653870701789856,
-0.06708124279975891,
-0.05174141004681587,
-0.040766991674900055,
-0.03835150599479675,
-0.022569656372070312,
-0.103338822722435,
0.03536221385002136,
-0.06986796110868454,
-0.15260647237300873,
-0.05321803316473961,
0.16211694478988647,
-0.02732076495885849,
0.10986922681331635,
0.03186660632491112,
0.02232091873884201,
-0.11546517163515091,
-0.10124117881059647,
0.06941788643598557,
-0.13011597096920013,
-0.09191329777240753,
0.11645689606666565,
0.028175458312034607,
0.056016165763139725,
-0.056699372828006744,
0.033991385251283646,
0.1792486160993576,
0.14199958741664886,
0.011651183478534222,
0.15209107100963593,
0.15952149033546448,
-0.09915870428085327,
-0.2090172916650772,
0.031898461282253265,
-0.09723532199859619,
-0.05066673457622528,
0.09884977340698242,
-0.16712434589862823,
0.17394235730171204,
0.07274431735277176,
-0.07742103189229965,
0.2314997911453247,
-0.3033122420310974,
-0.05604686960577965,
0.22956174612045288,
0.007907474413514137,
0.23373201489448547,
-0.133794367313385,
-0.10916568338871002,
0.06384429335594177,
-0.018118517473340034,
-0.05956067517399788,
-0.04388510063290596,
0.041794780641794205,
-0.09169624745845795,
0.05220566689968109,
0.07198776304721832,
-0.06721958518028259,
0.2321241796016693,
0.02454546093940735,
0.05119612067937851,
-0.03910301625728607,
-0.06385569274425507,
0.04790237918496132,
0.007407217752188444,
0.0659203976392746,
-0.03171176835894585,
0.09711410105228424,
-0.26321810483932495,
-0.0031755033414810896,
-0.07115887850522995,
0.05563422664999962,
-0.016639620065689087,
-0.023685617372393608,
-0.05784289538860321,
0.033635463565588,
-0.10200556367635727,
-0.00571964867413044,
0.1187824159860611,
-0.05468040332198143,
0.1484922617673874,
-0.028909914195537567,
0.15987011790275574,
0.09285999834537506,
-0.18830344080924988,
-0.010569683276116848,
-0.023809298872947693,
0.0796542763710022,
-0.1447528749704361,
-0.06484639644622803,
0.16372708976268768,
0.0633537620306015,
0.00465251412242651,
0.0815199688076973,
-0.09199996292591095,
-0.023511767387390137,
0.09853077679872513,
-0.16507980227470398,
-0.003311028005555272,
-0.02322833612561226,
-0.0601353645324707,
-0.07022696733474731,
0.02589651755988598,
0.04004853591322899,
0.050094861537218094,
-0.023801203817129135,
-0.03119332157075405,
0.018970485776662827,
-0.05459105968475342,
0.13992425799369812,
0.15281742811203003,
0.10540521889925003,
-0.09826890379190445,
0.05211769416928291,
0.04929029941558838,
-0.03193596377968788,
-0.03711269423365593,
0.17710652947425842,
-0.08579123765230179,
-0.11296512186527252,
0.035744234919548035,
0.17921869456768036,
0.03406373783946037,
-0.046743426471948624,
-0.11650554090738297,
-0.07857248932123184,
0.016233326867222786,
0.1830173134803772,
0.15911883115768433,
0.06828980892896652,
-0.0019180243834853172,
-0.06298694014549255,
0.03675652667880058,
0.05265362188220024,
-0.0125295240432024,
0.0048495689406991005,
-0.09328269213438034,
-0.047633491456508636,
0.015118526294827461,
0.17276664078235626,
-0.09518419951200485,
-0.10154422372579575,
-0.16400158405303955,
0.11275799572467804,
-0.17881211638450623,
0.0385923758149147,
0.012723319232463837,
-0.025028055533766747,
-0.005560328718274832,
-0.04188492149114609,
-0.012213421985507011,
-0.02460089884698391,
-0.11865053325891495,
0.09675627201795578,
0.03605908155441284,
0.06642162054777145,
-0.0901038721203804,
0.027303745970129967,
0.15267539024353027,
-0.04684300348162651,
0.12604066729545593,
0.11365662515163422,
-0.04371662065386772,
0.10881853848695755,
-0.05807493254542351,
0.02361820824444294,
0.06070135533809662,
0.06015120446681976,
-0.0030323935206979513,
-0.06692758202552795,
0.01566910929977894,
0.0019570006988942623,
0.04638819023966789,
0.1519407331943512,
0.04982617497444153,
-0.11654988676309586,
0.0055101546458899975,
-0.00620963703840971,
-0.17280836403369904,
0.013456523418426514,
-0.09642393887042999,
0.051990024745464325,
0.04939398169517517,
0.10007123649120331,
0.023294692859053612,
0.12513230741024017,
-0.10306396335363388,
-0.03245475888252258,
-0.021425433456897736,
-0.06989435106515884,
-0.06706097722053528,
0.021415771916508675,
0.07260183244943619,
-0.03776790574193001,
0.2645152509212494,
0.07068436592817307,
0.023043839260935783,
0.0110383415594697,
0.18458324670791626,
0.09928671270608902,
0.01698111742734909,
0.19918589293956757,
0.06726695597171783,
-0.030018873512744904,
-0.026905624195933342,
0.00963357463479042,
-0.062148116528987885,
-0.06006699800491333,
0.17826630175113678,
0.1460893601179123,
-0.016902567818760872,
0.016595792025327682,
-0.026565978303551674,
0.042536187916994095,
0.004851771518588066,
-0.13488446176052094,
0.033654406666755676,
0.033915333449840546,
0.0819438025355339,
0.06835748255252838,
0.09294398128986359,
-0.0629645511507988,
0.09627456218004227,
-0.06615497916936874,
-0.03472619876265526,
-0.1884179711341858,
-0.0017560557462275028,
-0.05043480172753334,
-0.10389669984579086,
0.055175065994262695,
-0.0476364940404892,
-0.09968722611665726,
0.15816010534763336,
0.04635777696967125,
-0.04998982325196266,
0.13573938608169556,
0.001591752632521093,
0.007595515809953213,
0.08950801193714142,
-0.03617965057492256,
-0.05048485845327377,
-0.005033173132687807,
0.006228032521903515,
-0.006548653822392225,
-0.07330617308616638,
-0.054099712520837784,
-0.04924248158931732,
-0.11135594546794891,
-0.024523191154003143,
-0.11616925150156021,
-0.09730265289545059,
-0.02641933038830757,
0.014653664082288742,
-0.04453616589307785,
0.12838304042816162,
0.02872535213828087,
0.07131318747997284,
-0.005812854506075382,
0.12986814975738525,
0.012860303744673729,
-0.10622739791870117,
-0.02014712244272232,
0.30572667717933655,
-0.036863163113594055,
0.026052838191390038,
0.035237181931734085,
-0.05532725155353546,
0.0024759299121797085,
0.22692997753620148,
0.2699424922466278,
-0.06459269672632217,
-0.012733090668916702,
0.05601631477475166,
0.022249426692724228,
0.11632449179887772,
0.09385839104652405,
0.033180151134729385,
0.2495465725660324,
-0.11138667911291122,
-0.025879699736833572,
-0.07201850414276123,
-0.06161686033010483,
-0.0557568185031414,
0.007725134026259184,
0.16351532936096191,
-0.08189736306667328,
-0.1236477792263031,
0.17540068924427032,
-0.2269848734140396,
0.07113877683877945,
0.05925579369068146,
-0.1198384091258049,
-0.12516602873802185,
-0.08095809072256088,
0.10671965032815933,
-0.035107873380184174,
0.03899591043591499,
-0.05472744628787041,
-0.07951866090297699,
-0.04811447486281395,
-0.02183673158288002,
-0.18651866912841797,
-0.17667876183986664,
0.13920271396636963,
-0.019715914502739906,
0.10846824944019318,
0.025979647412896156,
0.2038266360759735,
0.02610635571181774,
-0.006616220343858004,
-0.03658325597643852,
0.0676497220993042,
0.11972663551568985,
0.02408069185912609,
-0.06608036905527115,
0.013177779503166676,
0.06245183199644089,
-0.09692537039518356,
0.0864705741405487,
-0.06342470645904541,
-0.007206129841506481,
-0.001741621526889503,
-0.050540149211883545,
-0.10713513195514679,
0.07749524712562561,
-0.09195718914270401,
0.04490965977311134,
0.12124476581811905,
-0.02937474474310875,
-0.01465719472616911,
-0.06417932361364365,
0.052498508244752884,
0.008263825438916683,
-0.15926885604858398,
-0.05602861940860748,
-0.011950074695050716,
-0.057501837611198425,
0.006377951707690954,
-0.056913331151008606,
-0.26669538021087646,
-0.012986710295081139,
-0.05481226369738579,
0.010188952088356018,
0.010268333368003368,
0.038642823696136475,
0.023499200120568275,
-0.00692563084885478,
0.022786108776926994,
0.06391911953687668,
0.01906925067305565,
0.02846306748688221,
-0.1277388036251068,
-0.06300888955593109
] |
null | null |
flair
|
## Test model README
Some test README description
|
{"tags": ["flair", "token-classification"], "widget": [{"text": "does this work"}]}
|
token-classification
|
flairbook2/flairmodel
|
[
"flair",
"pytorch",
"token-classification",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#flair #pytorch #token-classification #region-us
|
## Test model README
Some test README description
|
[
"## Test model README\nSome test README description"
] |
[
"TAGS\n#flair #pytorch #token-classification #region-us \n",
"## Test model README\nSome test README description"
] |
[
19,
10
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #region-us \n## Test model README\nSome test README description"
] |
[
-0.11308077722787857,
-0.01656332239508629,
-0.007916594855487347,
0.03041435033082962,
0.06971556693315506,
0.08721547573804855,
0.09208288043737411,
0.08502449840307236,
0.1892099529504776,
0.007454387843608856,
0.10683540254831314,
0.08785640448331833,
-0.044069040566682816,
0.09052418172359467,
-0.03874298557639122,
-0.2347431182861328,
0.039679598063230515,
0.01965559646487236,
0.030266137793660164,
0.17573939263820648,
0.027123723179101944,
-0.09404817968606949,
0.04726940765976906,
-0.03817085921764374,
0.05426694080233574,
0.010162838734686375,
-0.04899922385811806,
-0.08070073276758194,
0.11825799196958542,
-0.09751284122467041,
0.14213402569293976,
-0.001217600074596703,
-0.03331335633993149,
-0.16097794473171234,
0.03740919381380081,
-0.003639115020632744,
-0.008811979554593563,
0.06443936377763748,
0.09869976341724396,
-0.03802308440208435,
0.12476006150245667,
0.0957443118095398,
0.0842539444565773,
0.0171030443161726,
-0.09583329409360886,
-0.18435586988925934,
-0.05233621597290039,
0.03701166808605194,
0.041818514466285706,
0.09847034513950348,
-0.031792234629392624,
0.27570655941963196,
-0.24688321352005005,
0.047785088419914246,
0.20115530490875244,
-0.15563736855983734,
-0.017779750749468803,
0.22986438870429993,
0.08231793344020844,
0.09702572226524353,
-0.08596006780862808,
0.021545764058828354,
0.05114690586924553,
0.054298270493745804,
-0.16361165046691895,
-0.057628072798252106,
0.002759644528850913,
0.09105531126260757,
-0.16051003336906433,
-0.07151646167039871,
0.25743237137794495,
-0.01929253712296486,
-0.0011169321369379759,
-0.04386420175433159,
-0.03483372554183006,
0.0145464688539505,
0.05797697231173515,
-0.046906836330890656,
-0.018918631598353386,
-0.00775892985984683,
0.11637778580188751,
0.016801638528704643,
-0.07749594748020172,
-0.11230149120092392,
-0.1417669653892517,
0.1656215488910675,
0.02559426613152027,
0.08005545288324356,
-0.17317599058151245,
0.08189776539802551,
0.03595671057701111,
-0.058976322412490845,
-0.006676849443465471,
-0.12500382959842682,
0.016684584319591522,
-0.030671877786517143,
-0.05612558498978615,
-0.028734495863318443,
0.12860481441020966,
0.08434142172336578,
-0.03802172839641571,
0.11012603342533112,
-0.002387549728155136,
0.09967362880706787,
0.05134853348135948,
0.10586333274841309,
0.07808136940002441,
0.08102797716856003,
-0.05827450752258301,
0.01787547953426838,
-0.041793640702962875,
-0.06347310543060303,
-0.015395758673548698,
-0.0157475583255291,
0.0600113570690155,
0.11206947267055511,
-0.03978660702705383,
-0.08768467605113983,
-0.054922789335250854,
-0.049827691167593,
0.09768771380186081,
-0.060294412076473236,
-0.018070489168167114,
0.14173442125320435,
-0.03450809046626091,
0.033247895538806915,
-0.03346730023622513,
-0.06500875949859619,
-0.0007345884223468602,
0.08533928543329239,
-0.07992865145206451,
-0.002963310806080699,
-0.03875809907913208,
-0.019752761349081993,
0.01034665759652853,
-0.05772107094526291,
0.015938831493258476,
-0.07751895487308502,
-0.16529828310012817,
-0.056691791862249374,
-0.020254353061318398,
0.007832209579646587,
0.04783652350306511,
0.012302888557314873,
0.005912972614169121,
-0.07746301591396332,
0.02132393978536129,
0.01553402841091156,
-0.07124543935060501,
0.1318616420030594,
0.005418602377176285,
-0.0005554805393330753,
0.0542326420545578,
0.013475975021719933,
-0.166856050491333,
0.033470772206783295,
-0.11423531174659729,
-0.013697165064513683,
-0.1454470455646515,
0.07280910760164261,
-0.054981719702482224,
-0.14733311533927917,
-0.0865374431014061,
-0.026036405935883522,
-0.03516294062137604,
0.2271091789007187,
-0.15954133868217468,
-0.07167476415634155,
0.06908895820379257,
-0.1627880185842514,
-0.11882244050502777,
0.043462879955768585,
-0.022381016984581947,
0.047558337450027466,
0.05029818415641785,
0.27791306376457214,
0.15394185483455658,
-0.10499629378318787,
0.10910210013389587,
0.15751735866069794,
-0.025217346847057343,
-0.031760312616825104,
0.11028601974248886,
-0.09044149518013,
-0.19428934156894684,
0.051019225269556046,
-0.06949801743030548,
-0.057968009263277054,
-0.05142669752240181,
-0.05434617027640343,
0.0034966066014021635,
-0.016983933746814728,
0.2019866555929184,
0.09274562448263168,
0.0831335037946701,
-0.09317304939031601,
0.00819638092070818,
-0.03769126161932945,
0.12456037104129791,
0.06033815070986748,
-0.029694756492972374,
-0.02564673125743866,
0.19130173325538635,
0.015359828248620033,
-0.05233205109834671,
-0.2037455290555954,
-0.0640963762998581,
0.040519941598176956,
0.0043898215517401695,
-0.01549469493329525,
0.15457509458065033,
0.10040044784545898,
-0.0669240728020668,
0.009783459827303886,
-0.038211554288864136,
-0.0007688644691370428,
-0.012665129266679287,
-0.10963291674852371,
-0.12352794408798218,
-0.00550314225256443,
-0.05104219168424606,
0.008228457532823086,
-0.1756768375635147,
-0.021268421784043312,
0.01008185837417841,
0.02461445890367031,
-0.03536604717373848,
-0.0017333250725641847,
0.07123807072639465,
-0.0005086341989226639,
0.024428490549325943,
-0.0052622416988015175,
0.059741146862506866,
-0.07074189186096191,
-0.0774977058172226,
-0.002189476042985916,
-0.0529455691576004,
0.05278891697525978,
0.12160191684961319,
-0.09503234922885895,
-0.014669623225927353,
-0.07474444061517715,
-0.05988496541976929,
0.022625427693128586,
-0.05874758958816528,
0.04269444942474365,
0.14165878295898438,
0.033995628356933594,
0.04372953623533249,
-0.07250835746526718,
-0.07353148609399796,
-0.02552074007689953,
-0.01153199840337038,
-0.005826093256473541,
0.21602903306484222,
0.043149761855602264,
-0.09873434156179428,
0.05586076155304909,
0.1752813160419464,
-0.022709090262651443,
0.09176665544509888,
-0.07916892319917679,
-0.07716772705316544,
0.05347762629389763,
0.0006342821288853884,
-0.06791175901889801,
0.14287061989307404,
-0.172483429312706,
0.09871445596218109,
0.08674561232328415,
0.06300988048315048,
0.07133207470178604,
-0.119688980281353,
-0.10226850211620331,
-0.013522251509130001,
-0.0062535847537219524,
-0.16888035833835602,
0.08428099006414413,
0.035210832953453064,
0.06693949550390244,
-0.0289009939879179,
-0.0793747678399086,
0.04906727746129036,
-0.024376899003982544,
-0.14548207819461823,
0.23059780895709991,
-0.07614704966545105,
-0.17757734656333923,
-0.05192829668521881,
0.0869792252779007,
0.005453256890177727,
-0.04275801405310631,
0.05580931901931763,
-0.17567603290081024,
0.0014796190662309527,
0.0025273950304836035,
-0.01069837436079979,
-0.09865739941596985,
-0.00023955557844601572,
-0.09951527416706085,
0.042989954352378845,
-0.0645831897854805,
-0.08449146151542664,
-0.007977028377354145,
-0.09753967076539993,
0.059955619275569916,
0.09695345163345337,
-0.14568772912025452,
0.0014527600724250078,
0.20729123055934906,
0.03378980979323387,
0.06821601092815399,
-0.014100383967161179,
0.2998522222042084,
-0.11646204441785812,
-0.05461667850613594,
0.15080422163009644,
-0.058592669665813446,
0.030360929667949677,
0.027750158682465553,
0.023747004568576813,
-0.0649774968624115,
0.014687876217067242,
-0.033614084124565125,
-0.06733842194080353,
-0.3488038182258606,
-0.08918686956167221,
-0.038628995418548584,
0.07655642181634903,
-0.019396396353840828,
0.06497710198163986,
0.02840433642268181,
0.06556114554405212,
0.0807749554514885,
-0.17485520243644714,
-0.05990574508905411,
-0.041174713522195816,
0.12694841623306274,
-0.03094988502562046,
0.0060103717260062695,
-0.09198112040758133,
-0.05787896737456322,
0.08839794248342514,
0.0018156897276639938,
0.15427473187446594,
0.0837930217385292,
-0.028634773567318916,
0.06347477436065674,
0.08478830754756927,
0.09656881541013718,
0.09773866087198257,
-0.012340742163360119,
-0.011761450208723545,
0.005058911629021168,
-0.06771117448806763,
-0.02672644518315792,
0.0013548929709941149,
0.056999459862709045,
-0.06807637959718704,
-0.09494412690401077,
-0.10527195781469345,
0.07640162110328674,
-0.08828192949295044,
0.09232888370752335,
-0.11249420791864395,
-0.008526998572051525,
0.01417937409132719,
0.014945066533982754,
-0.04357561096549034,
0.0838211327791214,
-0.012661337852478027,
-0.18945397436618805,
0.040761303156614304,
-0.008341284468770027,
0.11382494121789932,
0.013580547645688057,
0.03660972788929939,
-0.07582467049360275,
-0.12786218523979187,
-0.02750321477651596,
0.1611238569021225,
-0.1451503336429596,
0.2579284906387329,
-0.03135104849934578,
-0.05207231268286705,
-0.07514439523220062,
-0.08790787309408188,
0.05861316993832588,
0.1616879105567932,
0.1235339418053627,
0.02892005257308483,
-0.06881368160247803,
-0.09307331591844559,
-0.0862969383597374,
0.03890462592244148,
0.0472002811729908,
-0.031709425151348114,
-0.02054363675415516,
-0.0034668550360947847,
0.04994013160467148,
-0.07787653803825378,
-0.02320883981883526,
-0.002233303850516677,
0.09888854622840881,
0.02592664584517479,
-0.10515963286161423,
-0.09531772136688232,
-0.029064089059829712,
-0.0868891179561615,
-0.029476964846253395,
0.04236914590001106,
-0.0385182648897171,
-0.07490235567092896,
-0.05941825360059738,
-0.08408236503601074,
0.20653870701789856,
-0.06708124279975891,
-0.05174141004681587,
-0.040766991674900055,
-0.03835150599479675,
-0.022569656372070312,
-0.103338822722435,
0.03536221385002136,
-0.06986796110868454,
-0.15260647237300873,
-0.05321803316473961,
0.16211694478988647,
-0.02732076495885849,
0.10986922681331635,
0.03186660632491112,
0.02232091873884201,
-0.11546517163515091,
-0.10124117881059647,
0.06941788643598557,
-0.13011597096920013,
-0.09191329777240753,
0.11645689606666565,
0.028175458312034607,
0.056016165763139725,
-0.056699372828006744,
0.033991385251283646,
0.1792486160993576,
0.14199958741664886,
0.011651183478534222,
0.15209107100963593,
0.15952149033546448,
-0.09915870428085327,
-0.2090172916650772,
0.031898461282253265,
-0.09723532199859619,
-0.05066673457622528,
0.09884977340698242,
-0.16712434589862823,
0.17394235730171204,
0.07274431735277176,
-0.07742103189229965,
0.2314997911453247,
-0.3033122420310974,
-0.05604686960577965,
0.22956174612045288,
0.007907474413514137,
0.23373201489448547,
-0.133794367313385,
-0.10916568338871002,
0.06384429335594177,
-0.018118517473340034,
-0.05956067517399788,
-0.04388510063290596,
0.041794780641794205,
-0.09169624745845795,
0.05220566689968109,
0.07198776304721832,
-0.06721958518028259,
0.2321241796016693,
0.02454546093940735,
0.05119612067937851,
-0.03910301625728607,
-0.06385569274425507,
0.04790237918496132,
0.007407217752188444,
0.0659203976392746,
-0.03171176835894585,
0.09711410105228424,
-0.26321810483932495,
-0.0031755033414810896,
-0.07115887850522995,
0.05563422664999962,
-0.016639620065689087,
-0.023685617372393608,
-0.05784289538860321,
0.033635463565588,
-0.10200556367635727,
-0.00571964867413044,
0.1187824159860611,
-0.05468040332198143,
0.1484922617673874,
-0.028909914195537567,
0.15987011790275574,
0.09285999834537506,
-0.18830344080924988,
-0.010569683276116848,
-0.023809298872947693,
0.0796542763710022,
-0.1447528749704361,
-0.06484639644622803,
0.16372708976268768,
0.0633537620306015,
0.00465251412242651,
0.0815199688076973,
-0.09199996292591095,
-0.023511767387390137,
0.09853077679872513,
-0.16507980227470398,
-0.003311028005555272,
-0.02322833612561226,
-0.0601353645324707,
-0.07022696733474731,
0.02589651755988598,
0.04004853591322899,
0.050094861537218094,
-0.023801203817129135,
-0.03119332157075405,
0.018970485776662827,
-0.05459105968475342,
0.13992425799369812,
0.15281742811203003,
0.10540521889925003,
-0.09826890379190445,
0.05211769416928291,
0.04929029941558838,
-0.03193596377968788,
-0.03711269423365593,
0.17710652947425842,
-0.08579123765230179,
-0.11296512186527252,
0.035744234919548035,
0.17921869456768036,
0.03406373783946037,
-0.046743426471948624,
-0.11650554090738297,
-0.07857248932123184,
0.016233326867222786,
0.1830173134803772,
0.15911883115768433,
0.06828980892896652,
-0.0019180243834853172,
-0.06298694014549255,
0.03675652667880058,
0.05265362188220024,
-0.0125295240432024,
0.0048495689406991005,
-0.09328269213438034,
-0.047633491456508636,
0.015118526294827461,
0.17276664078235626,
-0.09518419951200485,
-0.10154422372579575,
-0.16400158405303955,
0.11275799572467804,
-0.17881211638450623,
0.0385923758149147,
0.012723319232463837,
-0.025028055533766747,
-0.005560328718274832,
-0.04188492149114609,
-0.012213421985507011,
-0.02460089884698391,
-0.11865053325891495,
0.09675627201795578,
0.03605908155441284,
0.06642162054777145,
-0.0901038721203804,
0.027303745970129967,
0.15267539024353027,
-0.04684300348162651,
0.12604066729545593,
0.11365662515163422,
-0.04371662065386772,
0.10881853848695755,
-0.05807493254542351,
0.02361820824444294,
0.06070135533809662,
0.06015120446681976,
-0.0030323935206979513,
-0.06692758202552795,
0.01566910929977894,
0.0019570006988942623,
0.04638819023966789,
0.1519407331943512,
0.04982617497444153,
-0.11654988676309586,
0.0055101546458899975,
-0.00620963703840971,
-0.17280836403369904,
0.013456523418426514,
-0.09642393887042999,
0.051990024745464325,
0.04939398169517517,
0.10007123649120331,
0.023294692859053612,
0.12513230741024017,
-0.10306396335363388,
-0.03245475888252258,
-0.021425433456897736,
-0.06989435106515884,
-0.06706097722053528,
0.021415771916508675,
0.07260183244943619,
-0.03776790574193001,
0.2645152509212494,
0.07068436592817307,
0.023043839260935783,
0.0110383415594697,
0.18458324670791626,
0.09928671270608902,
0.01698111742734909,
0.19918589293956757,
0.06726695597171783,
-0.030018873512744904,
-0.026905624195933342,
0.00963357463479042,
-0.062148116528987885,
-0.06006699800491333,
0.17826630175113678,
0.1460893601179123,
-0.016902567818760872,
0.016595792025327682,
-0.026565978303551674,
0.042536187916994095,
0.004851771518588066,
-0.13488446176052094,
0.033654406666755676,
0.033915333449840546,
0.0819438025355339,
0.06835748255252838,
0.09294398128986359,
-0.0629645511507988,
0.09627456218004227,
-0.06615497916936874,
-0.03472619876265526,
-0.1884179711341858,
-0.0017560557462275028,
-0.05043480172753334,
-0.10389669984579086,
0.055175065994262695,
-0.0476364940404892,
-0.09968722611665726,
0.15816010534763336,
0.04635777696967125,
-0.04998982325196266,
0.13573938608169556,
0.001591752632521093,
0.007595515809953213,
0.08950801193714142,
-0.03617965057492256,
-0.05048485845327377,
-0.005033173132687807,
0.006228032521903515,
-0.006548653822392225,
-0.07330617308616638,
-0.054099712520837784,
-0.04924248158931732,
-0.11135594546794891,
-0.024523191154003143,
-0.11616925150156021,
-0.09730265289545059,
-0.02641933038830757,
0.014653664082288742,
-0.04453616589307785,
0.12838304042816162,
0.02872535213828087,
0.07131318747997284,
-0.005812854506075382,
0.12986814975738525,
0.012860303744673729,
-0.10622739791870117,
-0.02014712244272232,
0.30572667717933655,
-0.036863163113594055,
0.026052838191390038,
0.035237181931734085,
-0.05532725155353546,
0.0024759299121797085,
0.22692997753620148,
0.2699424922466278,
-0.06459269672632217,
-0.012733090668916702,
0.05601631477475166,
0.022249426692724228,
0.11632449179887772,
0.09385839104652405,
0.033180151134729385,
0.2495465725660324,
-0.11138667911291122,
-0.025879699736833572,
-0.07201850414276123,
-0.06161686033010483,
-0.0557568185031414,
0.007725134026259184,
0.16351532936096191,
-0.08189736306667328,
-0.1236477792263031,
0.17540068924427032,
-0.2269848734140396,
0.07113877683877945,
0.05925579369068146,
-0.1198384091258049,
-0.12516602873802185,
-0.08095809072256088,
0.10671965032815933,
-0.035107873380184174,
0.03899591043591499,
-0.05472744628787041,
-0.07951866090297699,
-0.04811447486281395,
-0.02183673158288002,
-0.18651866912841797,
-0.17667876183986664,
0.13920271396636963,
-0.019715914502739906,
0.10846824944019318,
0.025979647412896156,
0.2038266360759735,
0.02610635571181774,
-0.006616220343858004,
-0.03658325597643852,
0.0676497220993042,
0.11972663551568985,
0.02408069185912609,
-0.06608036905527115,
0.013177779503166676,
0.06245183199644089,
-0.09692537039518356,
0.0864705741405487,
-0.06342470645904541,
-0.007206129841506481,
-0.001741621526889503,
-0.050540149211883545,
-0.10713513195514679,
0.07749524712562561,
-0.09195718914270401,
0.04490965977311134,
0.12124476581811905,
-0.02937474474310875,
-0.01465719472616911,
-0.06417932361364365,
0.052498508244752884,
0.008263825438916683,
-0.15926885604858398,
-0.05602861940860748,
-0.011950074695050716,
-0.057501837611198425,
0.006377951707690954,
-0.056913331151008606,
-0.26669538021087646,
-0.012986710295081139,
-0.05481226369738579,
0.010188952088356018,
0.010268333368003368,
0.038642823696136475,
0.023499200120568275,
-0.00692563084885478,
0.022786108776926994,
0.06391911953687668,
0.01906925067305565,
0.02846306748688221,
-0.1277388036251068,
-0.06300888955593109
] |
null | null |
transformers
|
# Marty DialoGPT Model
|
{"tags": ["conversational"]}
|
text-generation
|
flakje/DialoGPT-small-Marty
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Marty DialoGPT Model
|
[
"# Marty DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Marty DialoGPT Model"
] |
[
51,
8
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Marty DialoGPT Model"
] |
[
-0.020030727609992027,
0.0847432091832161,
-0.006213377229869366,
0.017665669322013855,
0.13358208537101746,
0.0012770810863003135,
0.1421806514263153,
0.12129658460617065,
-0.03508952260017395,
-0.04897429421544075,
0.1366945058107376,
0.17743852734565735,
-0.017397338524460793,
0.10377705097198486,
-0.05749646574258804,
-0.3072653114795685,
0.046609893441200256,
0.04910017177462578,
-0.008296145126223564,
0.12280184030532837,
0.09277141094207764,
-0.047213662415742874,
0.08141328394412994,
0.01384738925844431,
-0.159077450633049,
0.0007154151680879295,
0.015971483662724495,
-0.10370542109012604,
0.11406316608190536,
0.05870465189218521,
0.020152488723397255,
0.02040106989443302,
-0.05394740402698517,
-0.15114480257034302,
0.034991975873708725,
-0.020921044051647186,
-0.03261110559105873,
0.04645416513085365,
0.02331377938389778,
-0.07483548671007156,
0.13995115458965302,
0.11581376940011978,
-0.01148312445729971,
0.05315208062529564,
-0.1462228000164032,
0.00286112236790359,
-0.0010831383988261223,
0.06071384996175766,
0.054310623556375504,
0.09958481043577194,
-0.039547212421894073,
0.10456178337335587,
-0.05899342894554138,
0.11060802638530731,
0.11036800593137741,
-0.3212601840496063,
-0.02468465268611908,
0.12420766055583954,
0.029788799583911896,
0.046628210693597794,
-0.03477342426776886,
0.0922837033867836,
0.03296327218413353,
0.008504169061779976,
-0.017828579992055893,
-0.08032436668872833,
-0.06161884218454361,
0.02428535372018814,
-0.08548302203416824,
-0.013266580179333687,
0.2505522668361664,
-0.042853645980358124,
0.06387681514024734,
-0.0819157138466835,
-0.09776382148265839,
-0.003445397363975644,
-0.04274051636457443,
-0.04813767224550247,
-0.07153783738613129,
0.08380654454231262,
-0.0033193500712513924,
-0.0973864421248436,
-0.12104081362485886,
-0.02432486228644848,
-0.16412605345249176,
0.1799820363521576,
0.025522826239466667,
0.03994422405958176,
-0.19820226728916168,
0.09774649143218994,
0.0021144833881407976,
-0.08803810924291611,
0.02785114198923111,
-0.09977994114160538,
0.03430003300309181,
0.024158867076039314,
-0.02850153110921383,
-0.013850867748260498,
0.08019974082708359,
0.13629521429538727,
0.016825927421450615,
0.004742112010717392,
-0.016713209450244904,
0.04770481958985329,
0.03635669872164726,
0.06098846346139908,
-0.03087007813155651,
-0.08952168375253677,
0.029347043484449387,
-0.10388249158859253,
0.004413435235619545,
-0.05264643579721451,
-0.18098905682563782,
-0.02784336917102337,
0.06158473715186119,
0.05072476342320442,
0.042864784598350525,
0.13423897325992584,
-0.017872026190161705,
-0.04473312944173813,
0.029712725430727005,
-0.022376393899321556,
-0.031280048191547394,
0.002931659808382392,
-0.0013261106796562672,
0.13983778655529022,
0.010067389346659184,
0.04418987035751343,
-0.11582987010478973,
0.015177988447248936,
-0.062429867684841156,
-0.017308304086327553,
-0.021824542433023453,
-0.04850739985704422,
0.0047957985661923885,
-0.003400486195459962,
0.016747908666729927,
-0.15182600915431976,
-0.16047894954681396,
0.009925863705575466,
-0.004008060321211815,
-0.04953364282846451,
-0.09259393066167831,
-0.1045968234539032,
-0.021587001159787178,
0.0411381721496582,
-0.06290606409311295,
0.007952629588544369,
-0.05841854587197304,
0.08768186718225479,
-0.02005518227815628,
0.09719816595315933,
-0.07558822631835938,
0.07225769013166428,
-0.07947805523872375,
-0.03376293554902077,
-0.07431609183549881,
0.11938801407814026,
0.011698049493134022,
0.06708472967147827,
-0.017897389829158783,
-0.014612148515880108,
-0.10515822470188141,
0.07191712409257889,
-0.03754901885986328,
0.26127901673316956,
-0.09909781068563461,
-0.10973045974969864,
0.265170693397522,
-0.053417403250932693,
-0.1292732059955597,
0.12536972761154175,
0.003955288324505091,
0.09294691681861877,
0.13096500933170319,
0.18980112671852112,
-0.018336523324251175,
-0.019027985632419586,
0.09577048569917679,
0.11818873137235641,
-0.08750108629465103,
-0.0010857894085347652,
0.030311331152915955,
-0.009885401464998722,
-0.10310664772987366,
0.04588887095451355,
0.05108746886253357,
0.06972815096378326,
-0.05733141675591469,
-0.029308080673217773,
0.005669190548360348,
-0.006816883571445942,
0.08295628428459167,
-0.03552457317709923,
0.11860113590955734,
-0.04611868038773537,
-0.06407526135444641,
-0.02056128717958927,
0.012939726933836937,
-0.03960179165005684,
0.040298786014318466,
-0.07567106187343597,
0.054596491158008575,
-0.020206542685627937,
0.06840401887893677,
-0.14454376697540283,
-0.04676700755953789,
-0.048840660601854324,
0.16165503859519958,
0.053854361176490784,
0.13744671642780304,
0.05330290272831917,
-0.05130042880773544,
-0.019079498946666718,
0.030001264065504074,
0.18217481672763824,
-0.018264446407556534,
-0.07899755984544754,
-0.09334008395671844,
0.10799556225538254,
-0.05698864161968231,
0.07959506660699844,
-0.05075913295149803,
0.00945606455206871,
0.01585417613387108,
0.09703052788972855,
-0.022756217047572136,
0.03175203502178192,
0.017121275886893272,
-0.030396752059459686,
-0.057607557624578476,
0.004853216931223869,
0.10469845682382584,
0.0073389518074691296,
-0.08242414146661758,
0.23300810158252716,
-0.16691792011260986,
0.1578788459300995,
0.19403012096881866,
-0.23035970330238342,
0.012334628961980343,
-0.11714520305395126,
-0.03132545202970505,
-0.003749075112864375,
0.07005901634693146,
-0.05201664939522743,
0.18659944832324982,
-0.014598171226680279,
0.17468763887882233,
-0.03476157411932945,
-0.03459553048014641,
-0.03501349315047264,
-0.05079485848546028,
-0.003920670598745346,
0.08900972455739975,
0.046193599700927734,
-0.11177577078342438,
0.1818557232618332,
0.08144126087427139,
0.055791184306144714,
0.1958763152360916,
0.04541458562016487,
0.0042588962242007256,
0.050616372376680374,
-0.012979956343770027,
-0.050757914781570435,
-0.0745081976056099,
-0.2792695462703705,
-0.036395199596881866,
0.08190702646970749,
0.03347247838973999,
0.11455487459897995,
-0.09668238461017609,
-0.04256266728043556,
-0.00020238709112163633,
-0.018142590299248695,
0.05284206569194794,
0.1216626688838005,
0.0027854377403855324,
0.12276455014944077,
-0.009860118851065636,
-0.05117687210440636,
0.055904287844896317,
0.00625570910051465,
-0.09133028239011765,
0.17383375763893127,
-0.13554000854492188,
-0.363113671541214,
-0.11393433064222336,
-0.17666038870811462,
-0.06370443105697632,
0.054597944021224976,
0.10062459856271744,
-0.12266729772090912,
-0.026681724935770035,
0.0013931270223110914,
0.09557530283927917,
-0.09037384390830994,
0.0004255353705957532,
-0.014086072333157063,
-0.00451954361051321,
-0.12581956386566162,
-0.11004329472780228,
-0.05421933904290199,
-0.038974788039922714,
-0.060426853597164154,
0.12395898252725601,
-0.15261885523796082,
0.03521677479147911,
0.23324690759181976,
0.065894216299057,
0.055714089423418045,
-0.026285093277692795,
0.2096838653087616,
-0.09513400495052338,
-0.000058578592870617285,
0.16404849290847778,
-0.026681294664740562,
0.055856551975011826,
0.11531427502632141,
-0.006038577761501074,
-0.06943480670452118,
0.04116486757993698,
-0.032129283994436264,
-0.06672976911067963,
-0.21952871978282928,
-0.14297957718372345,
-0.10496770590543747,
0.0855453684926033,
0.03352176398038864,
0.045088645070791245,
0.1496182531118393,
0.06770189851522446,
-0.03954111412167549,
0.007429094519466162,
0.0708300918340683,
0.0901174396276474,
0.276827335357666,
-0.0859750285744667,
0.1417376697063446,
-0.003766546491533518,
-0.15002644062042236,
0.07936078310012817,
0.06889882683753967,
0.05762546509504318,
0.07046937197446823,
0.0982748344540596,
0.008884130977094173,
0.011557246558368206,
0.11080510169267654,
0.08363261818885803,
0.018123235553503036,
-0.030217934399843216,
-0.043973665684461594,
-0.051020409911870956,
-0.040628932416439056,
0.05460847169160843,
0.03678392618894577,
-0.14304205775260925,
-0.006527220364660025,
-0.015201874077320099,
0.08256146311759949,
0.06778636574745178,
0.080228790640831,
-0.1731240153312683,
-0.02601417899131775,
0.07383805513381958,
-0.030447840690612793,
-0.10645575821399689,
0.08981015533208847,
0.041598327457904816,
-0.12267430126667023,
0.03776146098971367,
-0.013003112748265266,
0.11497340351343155,
-0.05231069028377533,
0.07977716624736786,
-0.11210697889328003,
-0.05805721506476402,
0.005645395256578922,
0.10983072966337204,
-0.3111764192581177,
0.19280114769935608,
-0.005454530008137226,
-0.05381658673286438,
-0.09764721989631653,
-0.012983468361198902,
0.024451179429888725,
0.09304152429103851,
0.12811218202114105,
-0.007602228783071041,
0.05214796960353851,
0.012591972947120667,
-0.0628354474902153,
0.02687288261950016,
0.09382542222738266,
-0.03941839560866356,
-0.02854122780263424,
-0.04850964993238449,
-0.006812914740294218,
-0.003450811840593815,
-0.03662559390068054,
0.0001651074708206579,
-0.17648634314537048,
0.0901002585887909,
0.05283290147781372,
0.07115399837493896,
0.028789097443223,
-0.04415560141205788,
-0.07430196553468704,
0.2432287037372589,
-0.04151202738285065,
-0.1019434705376625,
-0.08672969788312912,
-0.025124909356236458,
0.04537520930171013,
-0.06262114644050598,
0.010901286266744137,
-0.06331218779087067,
0.045348186045885086,
-0.07972985506057739,
-0.18135984241962433,
0.11353737115859985,
-0.09551018476486206,
-0.04286634549498558,
-0.03253961354494095,
0.216544046998024,
-0.0131383640691638,
0.02909267507493496,
0.041049614548683167,
0.001602702192030847,
-0.11768788844347,
-0.0967332124710083,
-0.011394599452614784,
0.03724360093474388,
0.033950455486774445,
0.010438754223287106,
-0.045284949243068695,
-0.045296426862478256,
-0.06100137531757355,
-0.017223205417394638,
0.3207705318927765,
0.13048674166202545,
-0.04535268619656563,
0.15990690886974335,
0.14420388638973236,
-0.07201290130615234,
-0.2938636839389801,
-0.1014721617102623,
-0.09123840183019638,
-0.050281982868909836,
-0.0744631215929985,
-0.16553692519664764,
0.09357602894306183,
-0.009004248306155205,
-0.011069664731621742,
0.10331477224826813,
-0.25198081135749817,
-0.11164354532957077,
0.18268360197544098,
-0.02892284095287323,
0.4145981967449188,
-0.11372840404510498,
-0.08332522213459015,
-0.059619009494781494,
-0.2000875174999237,
0.13767218589782715,
0.03846307098865509,
0.10815320909023285,
-0.012485208921134472,
0.17211492359638214,
0.05168554559350014,
-0.011466880328953266,
0.08290847390890121,
0.04089028388261795,
-0.05886851251125336,
-0.09969869256019592,
-0.11163587123155594,
-0.0015369069296866655,
0.025009766221046448,
0.04410422965884209,
-0.062036532908678055,
0.026413600891828537,
-0.10523711889982224,
-0.0607014037668705,
-0.09379681944847107,
0.03732960671186447,
0.02075497806072235,
-0.0883345827460289,
-0.002457025460898876,
-0.03628590703010559,
0.002831860212609172,
0.0105486074462533,
0.15310263633728027,
-0.10887701064348221,
0.1286923587322235,
0.09101252257823944,
0.13608914613723755,
-0.16923289000988007,
-0.007669280283153057,
-0.05241791903972626,
-0.05942101404070854,
0.06922625005245209,
-0.10475797206163406,
0.02870512381196022,
0.11680955439805984,
-0.021668048575520515,
0.08292041718959808,
0.09514445066452026,
-0.0077131567522883415,
0.003251010086387396,
0.09421029686927795,
-0.2492358684539795,
-0.08497293293476105,
-0.08072952926158905,
-0.0024684497620910406,
0.08127453178167343,
0.08474121242761612,
0.20121578872203827,
-0.022235624492168427,
-0.03431381285190582,
0.02075730264186859,
0.030681097880005836,
-0.04495052248239517,
0.09143040329217911,
0.0023637115955352783,
0.01949170231819153,
-0.1477947235107422,
0.06397666782140732,
-0.023011216893792152,
-0.0891345739364624,
0.01162138395011425,
0.13133157789707184,
-0.10795652121305466,
-0.120596744120121,
-0.06046739220619202,
0.09523200988769531,
-0.11683358997106552,
0.009501894004642963,
-0.018299320712685585,
-0.11824419349431992,
0.05978541448712349,
0.0718151330947876,
0.0587020218372345,
0.08149762451648712,
-0.09760802239179611,
-0.011749315075576305,
-0.015605438500642776,
0.013962877914309502,
0.0454636849462986,
-0.022947488352656364,
-0.040550414472818375,
0.082310251891613,
-0.03635421767830849,
0.11510492861270905,
-0.09424736350774765,
-0.10485246032476425,
-0.14707283675670624,
0.034097883850336075,
-0.125465989112854,
-0.08771344274282455,
-0.12979207932949066,
-0.046158239245414734,
-0.01278872974216938,
-0.041186511516571045,
-0.04344053566455841,
-0.03372785449028015,
-0.10468150675296783,
0.03768612816929817,
-0.04087495431303978,
0.002168756676837802,
-0.07469265908002853,
0.030134867876768112,
0.053839217871427536,
-0.028268655762076378,
0.1530703902244568,
0.14758245646953583,
-0.1126907616853714,
0.09308616816997528,
-0.1579611450433731,
-0.07552140951156616,
0.10387180000543594,
0.008559782058000565,
0.05475565046072006,
0.0441054068505764,
0.0018287262646481395,
0.05795168876647949,
0.06867251545190811,
0.04446881636977196,
0.06283863633871078,
-0.08083226531744003,
0.047199998050928116,
-0.04710569232702255,
-0.12412980198860168,
-0.052684132009744644,
-0.03543633595108986,
0.05117907002568245,
0.042489662766456604,
0.09595022350549698,
-0.052695032209157944,
0.07966162264347076,
-0.057824671268463135,
0.0401453971862793,
0.019095461815595627,
-0.17009679973125458,
0.00552047835662961,
-0.07184427976608276,
0.03877994790673256,
0.008736291900277138,
0.222553551197052,
0.010242406278848648,
-0.04021679237484932,
0.03669368103146553,
0.07217080891132355,
0.027703335508704185,
-0.004788432270288467,
0.1841358244419098,
0.10350586473941803,
-0.046320125460624695,
-0.10170423239469528,
0.0977131724357605,
0.05186300724744797,
0.057964637875556946,
0.11711668223142624,
-0.011938962154090405,
-0.01807776838541031,
0.0894106775522232,
-0.0012002710718661547,
0.03527875244617462,
-0.1552322506904602,
-0.13866347074508667,
-0.03436778858304024,
0.056542232632637024,
-0.05798352137207985,
0.13675536215305328,
0.15952159464359283,
-0.05329958721995354,
0.024483568966388702,
-0.026776380836963654,
-0.05728567764163017,
-0.1770561933517456,
-0.21195170283317566,
-0.07711020112037659,
-0.1372574269771576,
0.004689884837716818,
-0.1384928822517395,
0.03352496773004532,
0.04369649291038513,
0.09525992721319199,
-0.06792015582323074,
0.06268616765737534,
0.0323861725628376,
-0.11793182045221329,
0.08119828253984451,
-0.027203239500522614,
0.07652521878480911,
-0.03833819553256035,
0.004856863059103489,
-0.07803089916706085,
0.042881593108177185,
0.014837691560387611,
0.04568677023053169,
-0.043774720281362534,
0.0024288108106702566,
-0.1305389404296875,
-0.07903799414634705,
-0.07139542698860168,
0.0685136616230011,
-0.0009876277763396502,
0.14220492541790009,
0.009686610661447048,
-0.03650299087166786,
0.038645774126052856,
0.23784278333187103,
-0.06683716922998428,
-0.09662216156721115,
-0.07932019978761673,
0.1784534752368927,
-0.013410951942205429,
0.11818526685237885,
-0.023713160306215286,
0.01548156701028347,
-0.09493178874254227,
0.35304945707321167,
0.33008867502212524,
-0.10036345571279526,
0.004294414073228836,
0.005679020192474127,
0.04433741793036461,
0.10610008239746094,
0.08866327255964279,
0.10839366912841797,
0.29750755429267883,
-0.057468730956315994,
-0.014974205754697323,
-0.015181221999228,
-0.045732248574495316,
-0.04203708469867706,
0.060726750642061234,
0.04230400174856186,
-0.06838192790746689,
-0.02943365089595318,
0.09585627913475037,
-0.22984156012535095,
0.07639000564813614,
-0.14864031970500946,
-0.16843101382255554,
-0.06842673569917679,
0.011293895542621613,
0.07012207806110382,
0.02322119101881981,
0.09878680109977722,
0.0018658211920410395,
-0.07153339684009552,
0.0612003393471241,
0.020919539034366608,
-0.2060096561908722,
-0.016489319503307343,
0.08597592264413834,
-0.022777216508984566,
-0.02447371929883957,
-0.017109060660004616,
0.058730073273181915,
0.0652097538113594,
0.06957957148551941,
-0.02507229521870613,
0.04039539024233818,
-0.0017699551535770297,
-0.06765463948249817,
0.021997438743710518,
0.0203914362937212,
0.006611094810068607,
-0.07760434597730637,
0.08768989145755768,
-0.14091870188713074,
0.043639954179525375,
0.039230652153491974,
-0.03784037381410599,
-0.03360338881611824,
0.02994699403643608,
-0.07875097543001175,
0.0763547345995903,
0.09118154644966125,
-0.012685962952673435,
-0.019743308424949646,
-0.03005385957658291,
-0.008147938176989555,
-0.02788851223886013,
-0.03396109119057655,
-0.08606580644845963,
-0.1787433624267578,
-0.10694897174835205,
0.05992187187075615,
-0.0045709447003901005,
-0.1767561137676239,
0.009886219166219234,
-0.13016408681869507,
0.0554911233484745,
-0.1094127744436264,
0.11045897752046585,
0.07091840356588364,
0.01891222596168518,
0.002886052243411541,
-0.030890027061104774,
0.05068419501185417,
0.09113617986440659,
-0.1282532513141632,
-0.0885261818766594
] |
null | null |
transformers
|
# FlauBERT: Unsupervised Language Model Pre-training for French
**FlauBERT** is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) [Jean Zay](http://www.idris.fr/eng/jean-zay/ ) supercomputer.
Along with FlauBERT comes [**FLUE**](https://github.com/getalp/Flaubert/tree/master/flue): an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the [official website](https://github.com/getalp/Flaubert).
## FlauBERT models
| Model name | Number of layers | Attention Heads | Embedding Dimension | Total Parameters |
| :------: | :---: | :---: | :---: | :---: |
| `flaubert-small-cased` | 6 | 8 | 512 | 54 M |
| `flaubert-base-uncased` | 12 | 12 | 768 | 137 M |
| `flaubert-base-cased` | 12 | 12 | 768 | 138 M |
| `flaubert-large-cased` | 24 | 16 | 1024 | 373 M |
**Note:** `flaubert-small-cased` is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
## Using FlauBERT with Hugging Face's Transformers
```python
import torch
from transformers import FlaubertModel, FlaubertTokenizer
# Choose among ['flaubert/flaubert_small_cased', 'flaubert/flaubert_base_uncased',
# 'flaubert/flaubert_base_cased', 'flaubert/flaubert_large_cased']
modelname = 'flaubert/flaubert_base_cased'
# Load pretrained model and tokenizer
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname, do_lowercase=False)
# do_lowercase=False if using cased models, True if using uncased ones
sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]
print(last_layer.shape)
# torch.Size([1, 8, 768]) -> (batch size x number of tokens x embedding dimension)
# The BERT [CLS] token correspond to the first hidden state of the last layer
cls_embedding = last_layer[:, 0, :]
```
**Notes:** if your `transformers` version is <=2.10.0, `modelname` should take one
of the following values:
```
['flaubert-small-cased', 'flaubert-base-uncased', 'flaubert-base-cased', 'flaubert-large-cased']
```
## References
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
[LREC paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.302.pdf)
```
@InProceedings{le2020flaubert,
author = {Le, Hang and Vial, Lo\"{i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb\'{e}, Beno\^{i}t and Besacier, Laurent and Schwab, Didier},
title = {FlauBERT: Unsupervised Language Model Pre-training for French},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {2479--2490},
url = {https://www.aclweb.org/anthology/2020.lrec-1.302}
}
```
[TALN paper](https://hal.archives-ouvertes.fr/hal-02784776/)
```
@inproceedings{le2020flaubert,
title = {FlauBERT: des mod{\`e}les de langue contextualis{\'e}s pr{\'e}-entra{\^\i}n{\'e}s pour le fran{\c{c}}ais},
author = {Le, Hang and Vial, Lo{\"\i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb{\'e}, Beno{\^\i}t and Besacier, Laurent and Schwab, Didier},
booktitle = {Actes de la 6e conf{\'e}rence conjointe Journ{\'e}es d'{\'E}tudes sur la Parole (JEP, 31e {\'e}dition), Traitement Automatique des Langues Naturelles (TALN, 27e {\'e}dition), Rencontre des {\'E}tudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (R{\'E}CITAL, 22e {\'e}dition). Volume 2: Traitement Automatique des Langues Naturelles},
pages = {268--278},
year = {2020},
organization = {ATALA}
}
```
|
{"language": "fr", "license": "mit", "tags": ["bert", "language-model", "flaubert", "flue", "french", "bert-base", "flaubert-base", "cased"], "datasets": ["flaubert"], "metrics": ["flue"]}
|
fill-mask
|
flaubert/flaubert_base_cased
|
[
"transformers",
"pytorch",
"flaubert",
"fill-mask",
"bert",
"language-model",
"flue",
"french",
"bert-base",
"flaubert-base",
"cased",
"fr",
"dataset:flaubert",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-base #flaubert-base #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FlauBERT: Unsupervised Language Model Pre-training for French
=============================================================
FlauBERT is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) Jean Zay supercomputer.
Along with FlauBERT comes FLUE: an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the official website.
FlauBERT models
---------------
Note: 'flaubert-small-cased' is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
Using FlauBERT with Hugging Face's Transformers
-----------------------------------------------
Notes: if your 'transformers' version is <=2.10.0, 'modelname' should take one
of the following values:
References
----------
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
LREC paper
TALN paper
|
[] |
[
"TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-base #flaubert-base #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
82
] |
[
"passage: TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-base #flaubert-base #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
-0.03594275563955307,
0.10239780694246292,
-0.005152486264705658,
0.08841285854578018,
0.03922083228826523,
0.010242111049592495,
0.09319330751895905,
0.08315373212099075,
0.029410289600491524,
0.016455069184303284,
0.17709310352802277,
0.05014636367559433,
-0.018809163942933083,
0.12007272988557816,
-0.000813956547062844,
-0.2803550362586975,
0.07596202939748764,
0.0025383606553077698,
-0.08697254955768585,
0.06630450487136841,
0.12870025634765625,
-0.05183872953057289,
0.10424041748046875,
0.031470295041799545,
-0.04871535673737526,
0.06276889890432358,
0.023304523900151253,
-0.07588393986225128,
0.14625203609466553,
0.08435684442520142,
0.1359914094209671,
0.051557231694459915,
0.01596611551940441,
-0.11282148957252502,
0.02674940600991249,
0.0006664104294031858,
-0.08090293407440186,
0.05052681267261505,
0.01078164391219616,
-0.037168923765420914,
0.1275012195110321,
0.023441588506102562,
-0.0025110638234764338,
0.04036964103579521,
-0.12291193008422852,
-0.15844525396823883,
-0.07095345109701157,
0.017405914142727852,
-0.08995700627565384,
0.06488851457834244,
-0.0038957647047936916,
0.1043553575873375,
-0.12716364860534668,
0.05946378782391548,
0.0991533175110817,
-0.3388030529022217,
-0.0255325585603714,
0.10669569671154022,
0.1465214192867279,
0.012804493308067322,
-0.07373673468828201,
0.08827032893896103,
0.055565353482961655,
0.015735743567347527,
0.0586773119866848,
-0.07424743473529816,
-0.03633997216820717,
0.050563983619213104,
-0.08099206537008286,
-0.021589992567896843,
0.1255629062652588,
-0.004823984578251839,
0.021360276266932487,
-0.05151233449578285,
-0.06653737276792526,
0.004591619595885277,
-0.02984941564500332,
-0.005371382925659418,
0.03354008123278618,
0.05948124825954437,
0.05678528547286987,
-0.08309143036603928,
-0.10300186276435852,
0.018452046439051628,
-0.1879841536283493,
0.0700882226228714,
-0.024776719510555267,
0.06104983389377594,
-0.07212492823600769,
0.008280009962618351,
-0.06276729702949524,
-0.13201607763767242,
0.02752254530787468,
-0.07102066278457642,
0.07137023657560349,
0.024913785979151726,
-0.06381456553936005,
0.07626426219940186,
0.13183416426181793,
0.18356025218963623,
-0.02432139590382576,
0.01013859175145626,
-0.04038647934794426,
0.09503939002752304,
0.04427316412329674,
0.10114544630050659,
-0.08616743236780167,
-0.10893534868955612,
-0.012002862989902496,
-0.06695391982793808,
-0.0006702577229589224,
-0.04997877776622772,
-0.1495075821876526,
-0.03506777063012123,
-0.03922344744205475,
0.011261927895247936,
0.021798592060804367,
0.03532553091645241,
-0.03672150522470474,
0.0436299592256546,
0.10592517256736755,
-0.05989694222807884,
0.06751243770122528,
0.033801835030317307,
0.011626754887402058,
0.10725562274456024,
0.0032307568471878767,
0.013294574804604053,
0.05039777234196663,
0.027872856706380844,
-0.10860170423984528,
-0.031836189329624176,
-0.03542449325323105,
-0.13552825152873993,
0.045133333653211594,
-0.1449265331029892,
0.03715243563055992,
-0.12430574744939804,
-0.0722590908408165,
0.008397442288696766,
0.09253443777561188,
-0.07371634244918823,
0.012167612090706825,
-0.010282017290592194,
-0.06309231370687485,
0.07635102421045303,
-0.024517320096492767,
0.016695117577910423,
-0.07734603434801102,
0.05697767809033394,
-0.06952375918626785,
0.10540389269590378,
-0.21896547079086304,
0.037696097046136856,
-0.06524653732776642,
0.015093482099473476,
-0.1389205902814865,
-0.006750092376023531,
-0.10555324703454971,
0.03777085244655609,
-0.04294004291296005,
-0.08949402719736099,
-0.019385026767849922,
0.07752899080514908,
0.006186294835060835,
0.11852837353944778,
-0.17153562605381012,
-0.07669441401958466,
0.15766240656375885,
-0.09878876805305481,
-0.0845966562628746,
0.11783698201179504,
-0.03542102873325348,
0.01101093553006649,
-0.004199317656457424,
0.1864866465330124,
0.018617605790495872,
-0.1474854052066803,
0.06088327243924141,
0.11255219578742981,
-0.011870251037180424,
-0.034434109926223755,
0.07997243106365204,
-0.024150462821125984,
-0.09694741666316986,
0.03214230388402939,
-0.036095574498176575,
0.09013896435499191,
-0.05147731676697731,
-0.06907057762145996,
-0.005926487036049366,
-0.0469597652554512,
0.14908486604690552,
0.02930031530559063,
0.07131631672382355,
-0.08801127970218658,
-0.057335786521434784,
0.03178064897656441,
0.016184283420443535,
0.04683089256286621,
-0.000219213412492536,
-0.03205899894237518,
0.15660344064235687,
0.04871336743235588,
-0.038732293993234634,
-0.1231672391295433,
-0.05579891428351402,
-0.05932389944791794,
0.08229097723960876,
-0.016852565109729767,
0.20274247229099274,
0.09698141366243362,
-0.005522903986275196,
-0.023969441652297974,
0.03850965574383736,
0.07566047459840775,
0.0552976168692112,
0.01770309917628765,
-0.14584170281887054,
0.015726327896118164,
-0.10655234754085541,
0.05539271980524063,
-0.12013231217861176,
-0.01296822726726532,
0.033792074769735336,
0.11222793906927109,
-0.028309686109423637,
0.052040886133909225,
-0.07189897447824478,
0.04660170152783394,
-0.06823834776878357,
-0.00324719725176692,
0.06843592971563339,
-0.0022891873959451914,
0.0014952209312468767,
0.13469122350215912,
-0.04101935774087906,
0.30346250534057617,
0.2038169950246811,
-0.1363822966814041,
-0.03475352004170418,
-0.012465184554457664,
-0.013514530844986439,
0.054226022213697433,
0.048449866473674774,
-0.02574617601931095,
0.0016701550921425223,
-0.020954232662916183,
0.1421494483947754,
-0.047631002962589264,
0.019129905849695206,
0.04311293736100197,
-0.045229289680719376,
-0.11446234583854675,
0.06219431385397911,
0.056167807430028915,
-0.05196608975529671,
0.2139398157596588,
0.2491159588098526,
0.0054231006652116776,
0.20080699026584625,
-0.0182082187384367,
0.03126504272222519,
-0.0241844542324543,
-0.05575024336576462,
-0.05836658552289009,
0.153744637966156,
-0.18370023369789124,
-0.04439842328429222,
0.06992276012897491,
-0.024568557739257812,
0.0337008573114872,
-0.09546294063329697,
-0.0485445111989975,
0.014462635852396488,
-0.007828732952475548,
-0.06230877712368965,
0.10310956090688705,
-0.0019683758728206158,
0.08991706371307373,
0.007907519116997719,
-0.20595115423202515,
0.03677557036280632,
0.012181648053228855,
-0.014149545691907406,
0.1386040896177292,
-0.16118815541267395,
-0.21132123470306396,
-0.07922761142253876,
-0.1756589114665985,
-0.08050451427698135,
0.017624642699956894,
0.06115792691707611,
-0.051749445497989655,
-0.034381840378046036,
-0.018899967893958092,
-0.03872188553214073,
-0.08949718624353409,
0.013620393350720406,
-0.07297954708337784,
-0.033789005130529404,
-0.038538526743650436,
-0.1371362805366516,
-0.07564224302768707,
-0.009763499721884727,
-0.0013738378183916211,
0.07180794328451157,
-0.04359707608819008,
0.08984938263893127,
0.08719730377197266,
-0.017381198704242706,
0.04152151197195053,
-0.020185939967632294,
0.2723843455314636,
-0.028710227459669113,
0.056874047964811325,
0.11878922581672668,
0.04481754079461098,
0.06191471591591835,
0.17282763123512268,
0.05596538633108139,
-0.028954606503248215,
-0.03602944314479828,
-0.00396698247641325,
-0.06932324916124344,
-0.0987570658326149,
-0.14784497022628784,
-0.09572242945432663,
-0.029151570051908493,
0.050212129950523376,
0.06715524196624756,
0.1007111594080925,
0.058911435306072235,
0.08150321990251541,
-0.055517252534627914,
-0.03917669877409935,
0.00013231516641099006,
0.2281382828950882,
-0.06639323383569717,
0.13275931775569916,
-0.038866836577653885,
-0.10312947630882263,
0.10749564319849014,
0.03041515313088894,
-0.004067154601216316,
0.09594409167766571,
0.011329778470098972,
0.06900889426469803,
0.1530248075723648,
0.05410178750753403,
0.10844004154205322,
-0.005168956238776445,
-0.026167692616581917,
-0.04920294135808945,
-0.05345292016863823,
0.02843172289431095,
0.037831615656614304,
0.0853123888373375,
-0.12114905565977097,
-0.04221802204847336,
-0.17083820700645447,
0.06908391416072845,
0.006862806621938944,
0.10582919418811798,
-0.19056810438632965,
0.01368635892868042,
0.06020892411470413,
0.02814536914229393,
-0.08941085636615753,
0.046591535210609436,
0.09734128415584564,
-0.09606586396694183,
0.056424666196107864,
0.019160382449626923,
0.08461911976337433,
0.11410371959209442,
0.06352393329143524,
-0.017802631482481956,
-0.06587392836809158,
0.02841281332075596,
0.056692227721214294,
-0.22923041880130768,
0.2733176648616791,
-0.014584965072572231,
-0.07555025070905685,
-0.0383765734732151,
0.01304937619715929,
0.07248469442129135,
0.1776622235774994,
0.1622784435749054,
0.05433730036020279,
-0.05527302250266075,
-0.08392000943422318,
0.012516725808382034,
0.00979752279818058,
0.0259433351457119,
0.000170273837284185,
-0.025997048243880272,
-0.012016097083687782,
-0.03798988461494446,
0.013517424464225769,
0.1361171007156372,
-0.12149640917778015,
-0.1598537564277649,
0.04814525693655014,
0.057398345321416855,
-0.013784217648208141,
-0.01614242233335972,
-0.09242857247591019,
-0.23582354187965393,
0.14640553295612335,
0.019338296726346016,
-0.014392673037946224,
-0.1324435919523239,
-0.048385776579380035,
0.12455342710018158,
-0.04883325845003128,
0.09612244367599487,
-0.0789014920592308,
0.022545987740159035,
-0.10330036282539368,
-0.11010874062776566,
0.11974652856588364,
-0.09290184825658798,
0.01212308183312416,
-0.06684595346450806,
0.06890150904655457,
-0.043195903301239014,
0.039446596056222916,
0.05762464553117752,
0.058315470814704895,
-0.10759145021438599,
-0.10954916477203369,
0.002812165068462491,
-0.11266957968473434,
0.054074034094810486,
-0.051382578909397125,
-0.0792863741517067,
-0.06510481983423233,
0.07203858345746994,
0.04281622916460037,
0.23083069920539856,
0.19989028573036194,
-0.11820899695158005,
0.16812650859355927,
0.12667863070964813,
-0.04751031473278999,
-0.30719318985939026,
-0.054145585745573044,
-0.12830740213394165,
0.012157703749835491,
0.0046707154251635075,
-0.17550835013389587,
0.04470008611679077,
-0.001985548995435238,
-0.051998477429151535,
0.08039382100105286,
-0.16989627480506897,
-0.07134853303432465,
0.21674242615699768,
-0.0272599458694458,
0.35018736124038696,
-0.09609522670507431,
-0.058948811143636703,
-0.04507073014974594,
-0.1527319699525833,
0.0856330618262291,
-0.0412641316652298,
0.04847998917102814,
0.008856325410306454,
0.013782353140413761,
0.0035922015085816383,
-0.05434909090399742,
0.1516784131526947,
-0.005844662897288799,
0.0029540376272052526,
-0.07083762437105179,
-0.1291743367910385,
0.10850490629673004,
0.006498955190181732,
0.02294132485985756,
-0.0979825034737587,
-0.006321134977042675,
-0.16604429483413696,
0.0009475880651734769,
-0.10987886041402817,
0.13255955278873444,
-0.030679792165756226,
-0.055979833006858826,
-0.042720209807157516,
0.06211471930146217,
0.019194258376955986,
-0.00234066229313612,
0.04015117883682251,
-0.039595138281583786,
0.0747566968202591,
0.16332727670669556,
-0.017328687012195587,
-0.1248612180352211,
-0.010607022792100906,
0.029890792444348335,
-0.06337375193834305,
0.06041240692138672,
-0.06330645829439163,
0.007520704064518213,
0.11059720069169998,
-0.02851405180990696,
0.08596460521221161,
0.09430849552154541,
-0.0006876355619169772,
-0.020148318260908127,
0.14028355479240417,
-0.14495673775672913,
-0.014307770878076553,
-0.013336516916751862,
-0.09439173340797424,
0.06619010120630264,
-0.05211834982037544,
0.11680414527654648,
0.006974115502089262,
-0.014424458146095276,
-0.020353620871901512,
-0.015100277960300446,
-0.09164611995220184,
-0.004202682059258223,
0.09044439345598221,
0.023760318756103516,
-0.1128423810005188,
-0.0335761196911335,
0.004707427229732275,
-0.13622799515724182,
-0.007545579224824905,
0.06523256748914719,
-0.07587762176990509,
-0.13595731556415558,
0.024048125371336937,
0.14400562644004822,
-0.14032894372940063,
-0.026473788544535637,
-0.005697597749531269,
-0.1527782529592514,
0.029557522386312485,
0.22262881696224213,
0.11080096662044525,
0.03595811501145363,
-0.02202817052602768,
-0.06986579298973083,
0.038131143897771835,
0.019952917471528053,
0.012412765994668007,
0.00458479393273592,
-0.02681654319167137,
0.010261842049658298,
-0.04708896204829216,
0.09731101244688034,
-0.08190684765577316,
-0.020567238330841064,
-0.17709723114967346,
-0.03738138824701309,
-0.1569719761610031,
-0.06324554979801178,
-0.07304729521274567,
-0.045781105756759644,
0.009782698936760426,
-0.10573552548885345,
-0.03066563419997692,
-0.04311099648475647,
-0.1155947744846344,
0.0065630050376057625,
0.03597831726074219,
0.07318584620952606,
-0.13805606961250305,
-0.031900644302368164,
0.09739468991756439,
-0.044536542147397995,
0.06739972531795502,
0.06818944960832596,
-0.03359605371952057,
0.05810783803462982,
-0.13164596259593964,
-0.1392739713191986,
0.02271813340485096,
0.018773630261421204,
0.09108950197696686,
-0.01734386570751667,
0.021975286304950714,
0.034006740897893906,
0.050926752388477325,
0.04037731513381004,
0.09142416715621948,
-0.06584032624959946,
0.008467541076242924,
0.02662913128733635,
-0.14992989599704742,
0.028557760640978813,
-0.016681771725416183,
0.18041609227657318,
-0.005798796657472849,
0.09695430099964142,
-0.045969221740961075,
0.03999268263578415,
-0.0637323334813118,
0.01209651306271553,
-0.05210011079907417,
-0.14648011326789856,
-0.02182125486433506,
-0.055466387420892715,
-0.004845402669161558,
0.018995143473148346,
0.29469266533851624,
0.10678107291460037,
-0.04827273637056351,
0.02206508442759514,
0.03228425607085228,
-0.004618890583515167,
0.04153918847441673,
0.23156361281871796,
0.07252166420221329,
-0.025126075372099876,
-0.09853900969028473,
0.07843201607465744,
0.04258539900183678,
0.0940551832318306,
0.05356062948703766,
0.12156990170478821,
0.09679296612739563,
0.07926740497350693,
0.06252311170101166,
-0.010244636796414852,
-0.042524874210357666,
-0.08185899257659912,
0.03193257749080658,
0.08440139889717102,
-0.093053437769413,
0.025519905611872673,
0.06151644140481949,
-0.10997676104307175,
0.08044719696044922,
-0.04717780277132988,
-0.02929830364882946,
-0.13181234896183014,
-0.06577516347169876,
-0.05541274696588516,
-0.09324230998754501,
-0.012891589663922787,
-0.05295712500810623,
0.053596656769514084,
0.06638354808092117,
0.06028244271874428,
0.00623358553275466,
0.029199499636888504,
-0.13575385510921478,
-0.014514239504933357,
0.037982601672410965,
0.01876903884112835,
0.0471869632601738,
-0.08129625767469406,
0.0036538264248520136,
-0.11392349749803543,
0.0057596382685005665,
-0.05739039555191994,
0.03395381197333336,
0.00020287993538659066,
0.0130890728905797,
-0.08625715970993042,
-0.07653603702783585,
-0.0731208398938179,
0.010025396011769772,
-0.0036682388745248318,
0.16858148574829102,
0.01093028113245964,
0.015501786023378372,
0.022754374891519547,
0.11977105587720871,
-0.050574805587530136,
-0.10976896435022354,
-0.10208938270807266,
0.11278736591339111,
-0.009992990642786026,
0.1250232607126236,
-0.0626220628619194,
0.016069473698735237,
-0.07559517025947571,
0.2744385898113251,
0.3848537504673004,
-0.10340263694524765,
0.047974132001399994,
0.06789244711399078,
0.013131399638950825,
0.09797322750091553,
0.03297954425215721,
0.06118832156062126,
0.19550445675849915,
-0.09371066838502884,
-0.06840776652097702,
-0.0857609435915947,
-0.05302630364894867,
-0.08036420494318008,
0.08165048807859421,
0.05755855143070221,
-0.07721535861492157,
-0.054305531084537506,
0.07623444497585297,
-0.08714795857667923,
0.05879194289445877,
0.10962506383657455,
-0.18041670322418213,
-0.06175016984343529,
-0.009704191237688065,
0.0825614333152771,
0.057111311703920364,
0.08231794834136963,
-0.03336436673998833,
-0.056443918496370316,
0.0996023416519165,
-0.006532253231853247,
-0.15381726622581482,
-0.06265959143638611,
0.08535782247781754,
0.009532389231026173,
0.18370455503463745,
-0.006763385608792305,
0.015390804968774319,
0.11681618541479111,
0.06737036257982254,
-0.045840512961149216,
-0.002358147641643882,
0.04085312411189079,
-0.04534933343529701,
-0.07789164036512375,
-0.09915757924318314,
0.0023028047289699316,
-0.09710340946912766,
0.005488802678883076,
-0.10654479265213013,
0.10940106958150864,
0.021638520061969757,
-0.10828074812889099,
-0.010418310761451721,
0.0765710324048996,
-0.08846425265073776,
0.06758444756269455,
0.0762534886598587,
0.014398889616131783,
-0.02674390748143196,
-0.030105946585536003,
0.01000696700066328,
0.05475230515003204,
-0.08376430720090866,
-0.11542397737503052,
-0.034658707678318024,
-0.04257775470614433,
0.007838181219995022,
-0.02868693135678768,
-0.1720627248287201,
-0.03701392933726311,
-0.09059958159923553,
0.09302670508623123,
-0.05756818503141403,
0.03047344647347927,
0.07649655640125275,
0.001309831626713276,
-0.0043069832026958466,
0.010007020086050034,
0.019229531288146973,
0.043708935379981995,
-0.056480683386325836,
-0.03237311542034149
] |
null | null |
transformers
|
# FlauBERT: Unsupervised Language Model Pre-training for French
**FlauBERT** is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) [Jean Zay](http://www.idris.fr/eng/jean-zay/ ) supercomputer.
Along with FlauBERT comes [**FLUE**](https://github.com/getalp/Flaubert/tree/master/flue): an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the [official website](https://github.com/getalp/Flaubert).
## FlauBERT models
| Model name | Number of layers | Attention Heads | Embedding Dimension | Total Parameters |
| :------: | :---: | :---: | :---: | :---: |
| `flaubert-small-cased` | 6 | 8 | 512 | 54 M |
| `flaubert-base-uncased` | 12 | 12 | 768 | 137 M |
| `flaubert-base-cased` | 12 | 12 | 768 | 138 M |
| `flaubert-large-cased` | 24 | 16 | 1024 | 373 M |
**Note:** `flaubert-small-cased` is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
## Using FlauBERT with Hugging Face's Transformers
```python
import torch
from transformers import FlaubertModel, FlaubertTokenizer
# Choose among ['flaubert/flaubert_small_cased', 'flaubert/flaubert_base_uncased',
# 'flaubert/flaubert_base_cased', 'flaubert/flaubert_large_cased']
modelname = 'flaubert/flaubert_base_cased'
# Load pretrained model and tokenizer
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname, do_lowercase=False)
# do_lowercase=False if using cased models, True if using uncased ones
sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]
print(last_layer.shape)
# torch.Size([1, 8, 768]) -> (batch size x number of tokens x embedding dimension)
# The BERT [CLS] token correspond to the first hidden state of the last layer
cls_embedding = last_layer[:, 0, :]
```
**Notes:** if your `transformers` version is <=2.10.0, `modelname` should take one
of the following values:
```
['flaubert-small-cased', 'flaubert-base-uncased', 'flaubert-base-cased', 'flaubert-large-cased']
```
## References
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
[LREC paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.302.pdf)
```
@InProceedings{le2020flaubert,
author = {Le, Hang and Vial, Lo\"{i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb\'{e}, Beno\^{i}t and Besacier, Laurent and Schwab, Didier},
title = {FlauBERT: Unsupervised Language Model Pre-training for French},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {2479--2490},
url = {https://www.aclweb.org/anthology/2020.lrec-1.302}
}
```
[TALN paper](https://hal.archives-ouvertes.fr/hal-02784776/)
```
@inproceedings{le2020flaubert,
title = {FlauBERT: des mod{\`e}les de langue contextualis{\'e}s pr{\'e}-entra{\^\i}n{\'e}s pour le fran{\c{c}}ais},
author = {Le, Hang and Vial, Lo{\"\i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb{\'e}, Beno{\^\i}t and Besacier, Laurent and Schwab, Didier},
booktitle = {Actes de la 6e conf{\'e}rence conjointe Journ{\'e}es d'{\'E}tudes sur la Parole (JEP, 31e {\'e}dition), Traitement Automatique des Langues Naturelles (TALN, 27e {\'e}dition), Rencontre des {\'E}tudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (R{\'E}CITAL, 22e {\'e}dition). Volume 2: Traitement Automatique des Langues Naturelles},
pages = {268--278},
year = {2020},
organization = {ATALA}
}
```
|
{"language": "fr", "license": "mit", "tags": ["bert", "language-model", "flaubert", "flue", "french", "flaubert-base", "uncased"], "datasets": ["flaubert"], "metrics": ["flue"]}
|
fill-mask
|
flaubert/flaubert_base_uncased
|
[
"transformers",
"pytorch",
"flaubert",
"fill-mask",
"bert",
"language-model",
"flue",
"french",
"flaubert-base",
"uncased",
"fr",
"dataset:flaubert",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-base #uncased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FlauBERT: Unsupervised Language Model Pre-training for French
=============================================================
FlauBERT is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) Jean Zay supercomputer.
Along with FlauBERT comes FLUE: an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the official website.
FlauBERT models
---------------
Note: 'flaubert-small-cased' is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
Using FlauBERT with Hugging Face's Transformers
-----------------------------------------------
Notes: if your 'transformers' version is <=2.10.0, 'modelname' should take one
of the following values:
References
----------
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
LREC paper
TALN paper
|
[] |
[
"TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-base #uncased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-base #uncased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
-0.03516087308526039,
0.11940024048089981,
-0.004682100843638182,
0.0935697928071022,
0.046558212488889694,
0.01091738510876894,
0.09807609766721725,
0.08356732875108719,
0.024230575188994408,
0.014178517274558544,
0.1683124452829361,
0.060332510620355606,
-0.017668256536126137,
0.1318313330411911,
-0.009374325163662434,
-0.27532610297203064,
0.06914090365171432,
-0.005142427980899811,
-0.07499302178621292,
0.061681028455495834,
0.12239064276218414,
-0.04768932983279228,
0.105004221200943,
0.037084028124809265,
-0.054348353296518326,
0.048940509557724,
0.0241694338619709,
-0.07716108858585358,
0.13782228529453278,
0.08942033350467682,
0.1282309740781784,
0.05289020389318466,
0.01946006342768669,
-0.09094163030385971,
0.025084972381591797,
0.0006281562964431942,
-0.08845910429954529,
0.04862552136182785,
0.01794670894742012,
-0.021662281826138496,
0.14660272002220154,
0.019816236570477486,
-0.00964658334851265,
0.04710299149155617,
-0.1138916015625,
-0.15473705530166626,
-0.08403569459915161,
0.0116500873118639,
-0.09468197822570801,
0.07725448906421661,
0.002285226248204708,
0.11239098757505417,
-0.1187819391489029,
0.06236681342124939,
0.08487274497747421,
-0.3491414189338684,
-0.023097500205039978,
0.10463708639144897,
0.12697508931159973,
-0.013037259690463543,
-0.08117872476577759,
0.0815541222691536,
0.053660061210393906,
0.017168911173939705,
0.04647691920399666,
-0.07266012579202652,
-0.08257617056369781,
0.036168139427900314,
-0.07347690314054489,
-0.011827260255813599,
0.11054157465696335,
-0.008232658728957176,
0.015934891998767853,
-0.06517311930656433,
-0.06796880811452866,
0.011897563934326172,
-0.027259450405836105,
0.002368166809901595,
0.03444159775972366,
0.05725158005952835,
0.0575885996222496,
-0.08852328360080719,
-0.09988529980182648,
0.028765706345438957,
-0.2050420045852661,
0.07416315376758575,
-0.021604517474770546,
0.06758909672498703,
-0.06939607858657837,
0.0002463201526552439,
-0.060055892914533615,
-0.12850630283355713,
0.021600430831313133,
-0.07588532567024231,
0.07215222716331482,
0.02168327197432518,
-0.05587322264909744,
0.0502985417842865,
0.12492541968822479,
0.1785808503627777,
-0.03751007467508316,
0.006578860338777304,
-0.04459228366613388,
0.09676448255777359,
0.04402093589305878,
0.090738944709301,
-0.07577714323997498,
-0.09009899944067001,
-0.011623432859778404,
-0.07515585422515869,
0.008804433047771454,
-0.057973362505435944,
-0.14612285792827606,
-0.03520926088094711,
-0.057543713599443436,
0.02054201066493988,
0.028487740084528923,
0.028440045192837715,
-0.02828439697623253,
0.04137958213686943,
0.10626693814992905,
-0.055721137672662735,
0.06805463880300522,
0.029666803777217865,
0.010950610972940922,
0.11121080070734024,
0.011966963298618793,
0.00966259092092514,
0.044692303985357285,
0.036158524453639984,
-0.1063060387969017,
-0.022950319573283195,
-0.02755620703101158,
-0.12969832122325897,
0.04240068048238754,
-0.15221092104911804,
0.03257796913385391,
-0.130731463432312,
-0.07830984890460968,
0.011061507277190685,
0.08925264328718185,
-0.07948832213878632,
0.009960981085896492,
0.0003101087349932641,
-0.06284062564373016,
0.07666490226984024,
-0.024989917874336243,
0.0041801827028393745,
-0.07321888953447342,
0.05282675102353096,
-0.07210680842399597,
0.09967105090618134,
-0.22673213481903076,
0.03992781415581703,
-0.07321986556053162,
0.00450302567332983,
-0.11549583077430725,
-0.01786004938185215,
-0.1136760264635086,
0.03822515159845352,
-0.02761109545826912,
-0.08977300673723221,
-0.005003246013075113,
0.07122036069631577,
0.0049264091067016125,
0.11540963500738144,
-0.17267149686813354,
-0.06757421046495438,
0.1506035178899765,
-0.10699627548456192,
-0.08637979626655579,
0.11176647990942001,
-0.02349490486085415,
0.01326136663556099,
-0.025426631793379784,
0.17392130196094513,
0.014141561463475227,
-0.14706698060035706,
0.041165631264448166,
0.12384065240621567,
-0.030508331954479218,
-0.03483184427022934,
0.07116294652223587,
-0.03016507439315319,
-0.09048197418451309,
0.03279704228043556,
-0.017161855474114418,
0.088301882147789,
-0.04547847434878349,
-0.07914715260267258,
-0.01831940934062004,
-0.04814264923334122,
0.15713953971862793,
0.03481170907616615,
0.07727569341659546,
-0.07673104107379913,
-0.048174310475587845,
0.010595595464110374,
0.016515647992491722,
0.05289728194475174,
-0.0012996236328035593,
-0.03080589510500431,
0.14768753945827484,
0.061624038964509964,
-0.04097363352775574,
-0.12313244491815567,
-0.07535667717456818,
-0.05919366702437401,
0.04723216965794563,
-0.021015627309679985,
0.2109113186597824,
0.09198671579360962,
0.007942743599414825,
-0.02125687152147293,
0.04490288347005844,
0.0726293995976448,
0.06032341718673706,
0.014119317755103111,
-0.14028504490852356,
0.02510415017604828,
-0.09951793402433395,
0.04285309091210365,
-0.1156735047698021,
-0.0156581811606884,
0.02365993522107601,
0.11369258165359497,
-0.028833601623773575,
0.048796940594911575,
-0.06911095231771469,
0.04744413122534752,
-0.07291020452976227,
-0.013433627784252167,
0.07415683567523956,
0.002894883742555976,
0.003160102292895317,
0.12581108510494232,
-0.043466147035360336,
0.304710328578949,
0.20640242099761963,
-0.12389109283685684,
-0.04290822148323059,
0.009160315617918968,
-0.019723497331142426,
0.046736832708120346,
0.028965504840016365,
0.008422422222793102,
-0.007870105095207691,
-0.021973464637994766,
0.14851421117782593,
-0.035015322268009186,
0.025227073580026627,
0.047136012464761734,
-0.04216289147734642,
-0.11851856857538223,
0.049617573618888855,
0.061074450612068176,
-0.04964497685432434,
0.21119087934494019,
0.24937838315963745,
0.000057968391047324985,
0.1974613517522812,
-0.0135662741959095,
0.03570462390780449,
-0.024371236562728882,
-0.05637476220726967,
-0.04867067560553551,
0.14920960366725922,
-0.1654287725687027,
-0.03159138560295105,
0.0733741968870163,
-0.028560230508446693,
0.03522592410445213,
-0.0939095988869667,
-0.053785186260938644,
0.007168389856815338,
-0.001388618373312056,
-0.06524477154016495,
0.12444012612104416,
-0.004526678007096052,
0.08380025625228882,
-0.0025061522610485554,
-0.20094145834445953,
0.04399453103542328,
0.01749887689948082,
-0.012849190272390842,
0.1353963315486908,
-0.16925014555454254,
-0.23119468986988068,
-0.07926357537508011,
-0.15179511904716492,
-0.09380458295345306,
0.0075610424391925335,
0.06489983201026917,
-0.046059366315603256,
-0.04785144329071045,
-0.02199167013168335,
-0.060102157294750214,
-0.08249689638614655,
0.008549596183001995,
-0.06392645835876465,
-0.025888031348586082,
-0.035166945308446884,
-0.13519524037837982,
-0.07127509266138077,
-0.003211784176528454,
-0.010063502937555313,
0.07164309173822403,
-0.01986025646328926,
0.090397909283638,
0.07857350260019302,
-0.022928472608327866,
0.03891155123710632,
-0.014897463843226433,
0.2663639485836029,
-0.030278153717517853,
0.056564413011074066,
0.12549832463264465,
0.05511485040187836,
0.06390009820461273,
0.17061559855937958,
0.05968979746103287,
-0.027907630428671837,
-0.03439616784453392,
-0.002989736385643482,
-0.07017575204372406,
-0.10648102313280106,
-0.15304896235466003,
-0.10113264620304108,
-0.013719107955694199,
0.04445459321141243,
0.0697983056306839,
0.11064828187227249,
0.06296773999929428,
0.08088871091604233,
-0.05108344554901123,
-0.05199756473302841,
0.006392201874405146,
0.2284117490053177,
-0.06331369280815125,
0.13162997364997864,
-0.04940083622932434,
-0.097068190574646,
0.11552601307630539,
0.01271107792854309,
0.002882956760004163,
0.09494996070861816,
0.004622680135071278,
0.07086889445781708,
0.15436983108520508,
0.05138025060296059,
0.10828305780887604,
0.003934579435735941,
-0.031395550817251205,
-0.04492662474513054,
-0.051180630922317505,
0.029732026159763336,
0.04077319800853729,
0.08865734934806824,
-0.1155882477760315,
-0.05414355918765068,
-0.1504439115524292,
0.0654161348938942,
-0.003658731235191226,
0.11077245324850082,
-0.17138664424419403,
0.008903815411031246,
0.04952307045459747,
0.037279754877090454,
-0.09238997101783752,
0.04716429486870766,
0.09082039445638657,
-0.10679592192173004,
0.0645226389169693,
0.011537939310073853,
0.07876399904489517,
0.12078085541725159,
0.06119980290532112,
-0.011166758835315704,
-0.06508180499076843,
0.032729700207710266,
0.06565040349960327,
-0.24045130610466003,
0.26086848974227905,
-0.014692651107907295,
-0.06551839411258698,
-0.041948117315769196,
0.010895539075136185,
0.0819072350859642,
0.17729410529136658,
0.1704656183719635,
0.056428443640470505,
-0.06404922157526016,
-0.07982528954744339,
0.01278678234666586,
0.0038027935661375523,
0.026672210544347763,
0.024358002468943596,
-0.03337262198328972,
-0.013986121863126755,
-0.03868919983506203,
0.010805639438331127,
0.1336643546819687,
-0.132135808467865,
-0.1636248081922531,
0.06296785175800323,
0.06265899538993835,
-0.030668003484606743,
-0.012643440626561642,
-0.09830519556999207,
-0.23954229056835175,
0.1535232812166214,
0.00917595811188221,
-0.015224677510559559,
-0.1345304399728775,
-0.058052483946084976,
0.11979692429304123,
-0.05030769482254982,
0.09773998707532883,
-0.08245514333248138,
0.024884987622499466,
-0.10621600598096848,
-0.12373301386833191,
0.11357250064611435,
-0.10254823416471481,
-0.0013455328298732638,
-0.06535420566797256,
0.04675493761897087,
-0.057746969163417816,
0.036554865539073944,
0.05305992439389229,
0.059378065168857574,
-0.10581595450639725,
-0.11609122157096863,
0.003325071418657899,
-0.10186509042978287,
0.05636950582265854,
-0.03961964696645737,
-0.07202056050300598,
-0.06797885149717331,
0.08045922964811325,
0.02754787914454937,
0.233725443482399,
0.19457007944583893,
-0.1263311505317688,
0.1885082721710205,
0.11174042522907257,
-0.04792153462767601,
-0.31244152784347534,
-0.0618344284594059,
-0.13471850752830505,
0.00949939340353012,
0.02606169879436493,
-0.16490912437438965,
0.041473034769296646,
-0.003124360926449299,
-0.05433344468474388,
0.09566618502140045,
-0.16968190670013428,
-0.07006403058767319,
0.22007691860198975,
-0.026828361675143242,
0.3466115891933441,
-0.10070957243442535,
-0.057942625135183334,
-0.05124363303184509,
-0.14622046053409576,
0.09309619665145874,
-0.04922699183225632,
0.057388484477996826,
0.001584253623150289,
0.012698899954557419,
0.0020328289829194546,
-0.05962042137980461,
0.1555704027414322,
-0.003003879450261593,
0.005005761981010437,
-0.07691030949354172,
-0.11475406587123871,
0.11368060857057571,
0.005996096413582563,
0.032765522599220276,
-0.0869583934545517,
0.003206664463505149,
-0.15288932621479034,
0.005651733372360468,
-0.10912009328603745,
0.14851386845111847,
-0.03578665107488632,
-0.05214236304163933,
-0.042857978492975235,
0.06997822225093842,
0.028680887073278427,
-0.004317054059356451,
0.053546905517578125,
-0.016367582604289055,
0.08032018691301346,
0.18089382350444794,
-0.02773081324994564,
-0.11250069737434387,
-0.02222524583339691,
0.02185954712331295,
-0.06279555708169937,
0.0526113286614418,
-0.04850374534726143,
0.0008253756677731872,
0.11016495525836945,
-0.019534176215529442,
0.07594278454780579,
0.09242945909500122,
0.0046652876771986485,
-0.017250610515475273,
0.13599276542663574,
-0.14240458607673645,
-0.00684928335249424,
-0.014232714660465717,
-0.08191253244876862,
0.06945574283599854,
-0.0471014678478241,
0.1146838515996933,
-0.0031128590926527977,
-0.008200151845812798,
-0.01942390576004982,
-0.009973733685910702,
-0.08181643486022949,
-0.010178310796618462,
0.08172657340765,
0.01868141070008278,
-0.11371247470378876,
-0.022991308942437172,
0.009585202671587467,
-0.13581489026546478,
-0.009411376900970936,
0.06635615229606628,
-0.07707135379314423,
-0.14404521882534027,
0.029940232634544373,
0.13111993670463562,
-0.15274359285831451,
-0.021728895604610443,
-0.0032958784140646458,
-0.1471066176891327,
0.02607187256217003,
0.21188239753246307,
0.10830787569284439,
0.03398815914988518,
-0.024869896471500397,
-0.0778919905424118,
0.025844991207122803,
0.019512543454766273,
0.004902667831629515,
0.005935638677328825,
-0.04601429030299187,
0.018565410748124123,
-0.03792800381779671,
0.10925672948360443,
-0.07956119626760483,
-0.015627851709723473,
-0.17801503837108612,
-0.048150449991226196,
-0.14965814352035522,
-0.05707947537302971,
-0.08255442976951599,
-0.04355522617697716,
0.004336780868470669,
-0.09601069986820221,
-0.0275779627263546,
-0.040577422827482224,
-0.10914168506860733,
0.0004734791873488575,
0.036622241139411926,
0.07207534462213516,
-0.1355963945388794,
-0.03167015686631203,
0.0868075042963028,
-0.03594556823372841,
0.06484941393136978,
0.06531041115522385,
-0.029934410005807877,
0.054546959698200226,
-0.12902367115020752,
-0.14438000321388245,
0.011093341745436192,
0.021681543439626694,
0.09789002686738968,
-0.015174664556980133,
0.01575511321425438,
0.030506394803524017,
0.05218570679426193,
0.043677229434251785,
0.10221798717975616,
-0.07185878604650497,
-0.007573239970952272,
0.019160989671945572,
-0.13794562220573425,
0.02510572038590908,
-0.011134653352200985,
0.17436224222183228,
-0.01431060116738081,
0.11272184550762177,
-0.05184697359800339,
0.04977615922689438,
-0.054868318140506744,
0.008955089375376701,
-0.05926140770316124,
-0.13801121711730957,
-0.014150144532322884,
-0.04871322587132454,
-0.0033427728340029716,
0.014515007846057415,
0.2939539849758148,
0.11707068979740143,
-0.040190789848566055,
0.02335405722260475,
0.019260842353105545,
-0.03401004150509834,
0.028194712474942207,
0.2353457659482956,
0.07354753464460373,
-0.022481955587863922,
-0.09174574166536331,
0.0795784667134285,
0.03410549461841583,
0.08226571977138519,
0.07984086126089096,
0.11082280427217484,
0.06862037628889084,
0.07930229604244232,
0.06228383257985115,
-0.03599625080823898,
-0.052822716534137726,
-0.07034455984830856,
0.03392309322953224,
0.09416608512401581,
-0.0930086150765419,
-0.010337821207940578,
0.04822364076972008,
-0.10528415441513062,
0.08432037383317947,
-0.05135410279035568,
-0.03248283267021179,
-0.1354382187128067,
-0.06557662785053253,
-0.0532558411359787,
-0.08570202440023422,
-0.00567605858668685,
-0.042342059314250946,
0.061306413263082504,
0.058315351605415344,
0.058329664170742035,
-0.0014739467296749353,
0.028763648122549057,
-0.16017095744609833,
-0.006319567561149597,
0.042829498648643494,
0.020952126011252403,
0.03623661771416664,
-0.07880731672048569,
0.008088207803666592,
-0.1206422820687294,
0.011418260633945465,
-0.057651765644550323,
0.029360415413975716,
0.002094605704769492,
0.023576095700263977,
-0.0813915953040123,
-0.07514410465955734,
-0.07308366149663925,
0.01189337857067585,
0.0070733968168497086,
0.16686587035655975,
0.015222843736410141,
0.028061797842383385,
0.01703471690416336,
0.12109672278165817,
-0.05592139810323715,
-0.10789363086223602,
-0.09535243362188339,
0.12288587540388107,
-0.0012350198812782764,
0.11499820649623871,
-0.05725330486893654,
0.01576067879796028,
-0.07776810973882675,
0.26591232419013977,
0.3786320686340332,
-0.11059869080781937,
0.046825576573610306,
0.07312433421611786,
0.011205260641872883,
0.09481093287467957,
0.03595526143908501,
0.061266105622053146,
0.19814880192279816,
-0.08849306404590607,
-0.07023503631353378,
-0.09120165556669235,
-0.0546991229057312,
-0.09342531859874725,
0.08378211408853531,
0.052658408880233765,
-0.07457979023456573,
-0.04883089289069176,
0.07145092636346817,
-0.09578977525234222,
0.08193550258874893,
0.11212363839149475,
-0.18292491137981415,
-0.06484594941139221,
-0.005154161714017391,
0.08792555332183838,
0.06722096353769302,
0.07500456273555756,
-0.0337410531938076,
-0.048892803490161896,
0.11549311876296997,
-0.009883267804980278,
-0.14963504672050476,
-0.04721229150891304,
0.07179267704486847,
-0.004144050180912018,
0.1922713816165924,
-0.0037076890002936125,
0.025522654876112938,
0.11509671062231064,
0.07241283357143402,
-0.0502307154238224,
0.008059139363467693,
0.03956672549247742,
-0.04941704869270325,
-0.06485317647457123,
-0.08670739084482193,
-0.0020971959456801414,
-0.10314112901687622,
0.003978022374212742,
-0.11387309432029724,
0.11023184657096863,
0.044334981590509415,
-0.10577280819416046,
-0.006802599877119064,
0.08587632328271866,
-0.0832446962594986,
0.07506994158029556,
0.08742725849151611,
0.01463833637535572,
-0.03422993794083595,
-0.02419264428317547,
0.012064938433468342,
0.05048312246799469,
-0.07654224336147308,
-0.11073930561542511,
-0.021747848019003868,
-0.031348083168268204,
-0.01816270314157009,
-0.01584477722644806,
-0.17347228527069092,
-0.033825308084487915,
-0.0957815945148468,
0.08267153054475784,
-0.05945990979671478,
0.027002552524209023,
0.06736461818218231,
-0.008840779773890972,
-0.006573789287358522,
0.025358671322464943,
0.012366941198706627,
0.037617992609739304,
-0.05342267453670502,
-0.03033590316772461
] |
null | null |
transformers
|
# FlauBERT: Unsupervised Language Model Pre-training for French
**FlauBERT** is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) [Jean Zay](http://www.idris.fr/eng/jean-zay/ ) supercomputer.
Along with FlauBERT comes [**FLUE**](https://github.com/getalp/Flaubert/tree/master/flue): an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the [official website](https://github.com/getalp/Flaubert).
## FlauBERT models
| Model name | Number of layers | Attention Heads | Embedding Dimension | Total Parameters |
| :------: | :---: | :---: | :---: | :---: |
| `flaubert-small-cased` | 6 | 8 | 512 | 54 M |
| `flaubert-base-uncased` | 12 | 12 | 768 | 137 M |
| `flaubert-base-cased` | 12 | 12 | 768 | 138 M |
| `flaubert-large-cased` | 24 | 16 | 1024 | 373 M |
**Note:** `flaubert-small-cased` is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
## Using FlauBERT with Hugging Face's Transformers
```python
import torch
from transformers import FlaubertModel, FlaubertTokenizer
# Choose among ['flaubert/flaubert_small_cased', 'flaubert/flaubert_base_uncased',
# 'flaubert/flaubert_base_cased', 'flaubert/flaubert_large_cased']
modelname = 'flaubert/flaubert_base_cased'
# Load pretrained model and tokenizer
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname, do_lowercase=False)
# do_lowercase=False if using cased models, True if using uncased ones
sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]
print(last_layer.shape)
# torch.Size([1, 8, 768]) -> (batch size x number of tokens x embedding dimension)
# The BERT [CLS] token correspond to the first hidden state of the last layer
cls_embedding = last_layer[:, 0, :]
```
**Notes:** if your `transformers` version is <=2.10.0, `modelname` should take one
of the following values:
```
['flaubert-small-cased', 'flaubert-base-uncased', 'flaubert-base-cased', 'flaubert-large-cased']
```
## References
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
[LREC paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.302.pdf)
```
@InProceedings{le2020flaubert,
author = {Le, Hang and Vial, Lo\"{i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb\'{e}, Beno\^{i}t and Besacier, Laurent and Schwab, Didier},
title = {FlauBERT: Unsupervised Language Model Pre-training for French},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {2479--2490},
url = {https://www.aclweb.org/anthology/2020.lrec-1.302}
}
```
[TALN paper](https://hal.archives-ouvertes.fr/hal-02784776/)
```
@inproceedings{le2020flaubert,
title = {FlauBERT: des mod{\`e}les de langue contextualis{\'e}s pr{\'e}-entra{\^\i}n{\'e}s pour le fran{\c{c}}ais},
author = {Le, Hang and Vial, Lo{\"\i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb{\'e}, Beno{\^\i}t and Besacier, Laurent and Schwab, Didier},
booktitle = {Actes de la 6e conf{\'e}rence conjointe Journ{\'e}es d'{\'E}tudes sur la Parole (JEP, 31e {\'e}dition), Traitement Automatique des Langues Naturelles (TALN, 27e {\'e}dition), Rencontre des {\'E}tudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (R{\'E}CITAL, 22e {\'e}dition). Volume 2: Traitement Automatique des Langues Naturelles},
pages = {268--278},
year = {2020},
organization = {ATALA}
}
```
|
{"language": "fr", "license": "mit", "tags": ["bert", "language-model", "flaubert", "flue", "french", "bert-large", "flaubert-large", "cased"], "datasets": ["flaubert"], "metrics": ["flue"]}
|
fill-mask
|
flaubert/flaubert_large_cased
|
[
"transformers",
"pytorch",
"flaubert",
"fill-mask",
"bert",
"language-model",
"flue",
"french",
"bert-large",
"flaubert-large",
"cased",
"fr",
"dataset:flaubert",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-large #flaubert-large #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FlauBERT: Unsupervised Language Model Pre-training for French
=============================================================
FlauBERT is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) Jean Zay supercomputer.
Along with FlauBERT comes FLUE: an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the official website.
FlauBERT models
---------------
Note: 'flaubert-small-cased' is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
Using FlauBERT with Hugging Face's Transformers
-----------------------------------------------
Notes: if your 'transformers' version is <=2.10.0, 'modelname' should take one
of the following values:
References
----------
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
LREC paper
TALN paper
|
[] |
[
"TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-large #flaubert-large #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
84
] |
[
"passage: TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #bert-large #flaubert-large #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
-0.0331268385052681,
0.0841115266084671,
-0.005659740883857012,
0.08823967725038528,
0.035086315125226974,
0.002305259695276618,
0.08345094323158264,
0.07538271695375443,
0.04276278242468834,
0.03224891424179077,
0.18587173521518707,
0.05034339427947998,
-0.020650213584303856,
0.12385439872741699,
-0.011887774802744389,
-0.2834935486316681,
0.06840873509645462,
-0.006162711884826422,
-0.10203873366117477,
0.06964706629514694,
0.1310259848833084,
-0.06985766440629959,
0.1102273166179657,
0.03794284537434578,
-0.05598658695816994,
0.04612767696380615,
0.022797798737883568,
-0.07369780540466309,
0.1438562124967575,
0.09486781805753708,
0.12894240021705627,
0.054996076971292496,
0.005544581450521946,
-0.10002217441797256,
0.024313217028975487,
-0.011396986432373524,
-0.07302433997392654,
0.04501320421695709,
0.02967289835214615,
-0.003435565624386072,
0.12698136270046234,
-0.008415963500738144,
-0.011027294211089611,
0.04393065348267555,
-0.10944958031177521,
-0.17116038501262665,
-0.07292596995830536,
0.007801608648151159,
-0.1012350544333458,
0.06715334206819534,
-0.005962857510894537,
0.11660676449537277,
-0.13303570449352264,
0.0516887865960598,
0.1013985350728035,
-0.3389606475830078,
-0.02355608530342579,
0.1017114445567131,
0.1522524654865265,
0.008205035701394081,
-0.0775161013007164,
0.09326967597007751,
0.06114397943019867,
0.01524536032229662,
0.06121474876999855,
-0.07734601944684982,
-0.02725071832537651,
0.0414336696267128,
-0.08408192545175552,
-0.022086061537265778,
0.12498652935028076,
-0.0024396898224949837,
0.024825597181916237,
-0.08690851181745529,
-0.07171023637056351,
0.013607561588287354,
-0.019912170246243477,
-0.012307250872254372,
0.04343170300126076,
0.03954807296395302,
0.05788009986281395,
-0.08422303944826126,
-0.11554572731256485,
0.03614392131567001,
-0.2104663848876953,
0.0666603147983551,
-0.024048013612627983,
0.05945848301053047,
-0.08464136719703674,
-0.013531758449971676,
-0.05688324198126793,
-0.11838647723197937,
0.026017744094133377,
-0.07170964777469635,
0.06095102056860924,
0.025992339476943016,
-0.059674620628356934,
0.06388909369707108,
0.13379515707492828,
0.15125061571598053,
-0.048240553587675095,
-0.0005635818815790117,
-0.02043338492512703,
0.10173922032117844,
0.048216018825769424,
0.11417866498231888,
-0.0944419652223587,
-0.08866960555315018,
-0.02451794035732746,
-0.059801314026117325,
0.009915350005030632,
-0.04485982283949852,
-0.14534379541873932,
-0.054706573486328125,
-0.04543818533420563,
0.020815525203943253,
0.02407308667898178,
0.031365469098091125,
-0.023686381056904793,
0.047623082995414734,
0.10008101165294647,
-0.04591669142246246,
0.06803396344184875,
0.02836972288787365,
0.022600285708904266,
0.12388582527637482,
-0.025196997448801994,
0.005484776105731726,
0.05899333953857422,
0.05331205949187279,
-0.10910508781671524,
-0.028128672391176224,
-0.025372212752699852,
-0.12994040548801422,
0.055117037147283554,
-0.14163659512996674,
0.03351239487528801,
-0.13282816112041473,
-0.044122274965047836,
0.014089245349168777,
0.06988993287086487,
-0.0660238191485405,
0.0067356345243752,
0.012860646471381187,
-0.057601068168878555,
0.06778424233198166,
-0.025607716292142868,
0.025672180578112602,
-0.07047630846500397,
0.06035671383142471,
-0.0709528848528862,
0.11125493049621582,
-0.22301211953163147,
0.02636687643826008,
-0.06609100848436356,
0.014900077134370804,
-0.10773029178380966,
-0.014404430985450745,
-0.07922586053609848,
0.034656450152397156,
-0.04502841830253601,
-0.10139227658510208,
-0.01928658038377762,
0.08303729444742203,
-0.002463362878188491,
0.10956227779388428,
-0.18981748819351196,
-0.07460873574018478,
0.17550089955329895,
-0.08291353285312653,
-0.08588247001171112,
0.14130324125289917,
-0.02052614651620388,
0.0010737242409959435,
-0.0023652585223317146,
0.17623457312583923,
-0.005495345685631037,
-0.14514487981796265,
0.03040417842566967,
0.11087632924318314,
-0.033082958310842514,
-0.029738962650299072,
0.09547697752714157,
-0.02944457344710827,
-0.08516128361225128,
0.026148436591029167,
-0.01794004999101162,
0.08306713402271271,
-0.05185246095061302,
-0.06750205159187317,
-0.008818674832582474,
-0.04432240128517151,
0.14594927430152893,
0.026489805430173874,
0.06668287515640259,
-0.09484794735908508,
-0.05087590590119362,
0.0013489554403349757,
0.011147408746182919,
0.06135394796729088,
0.002268789801746607,
-0.037256840616464615,
0.15245336294174194,
0.043399617075920105,
-0.03826552629470825,
-0.11653061211109161,
-0.062201131135225296,
-0.054217252880334854,
0.053082454949617386,
-0.01913505047559738,
0.23435291647911072,
0.09215302020311356,
-0.012094005942344666,
-0.027735376730561256,
0.0473099946975708,
0.08098588138818741,
0.03845260292291641,
0.015139162540435791,
-0.1200198233127594,
0.026438243687152863,
-0.0952354148030281,
0.04087987542152405,
-0.12369196116924286,
-0.02128942683339119,
0.0441468246281147,
0.09037802368402481,
-0.025540882721543312,
0.057796910405159,
-0.0683988705277443,
0.0374126210808754,
-0.06746740639209747,
0.008623646572232246,
0.06510434299707413,
-0.010961056686937809,
-0.008083414286375046,
0.1426069736480713,
-0.04985734447836876,
0.31635764241218567,
0.19937439262866974,
-0.11945691704750061,
-0.03390192613005638,
-0.022665582597255707,
-0.021175729110836983,
0.06162780150771141,
0.03252698481082916,
-0.027460092678666115,
-0.014200350269675255,
-0.02263719029724598,
0.14112244546413422,
-0.05272817239165306,
0.018012242391705513,
0.05072629824280739,
-0.04048159345984459,
-0.11310172080993652,
0.07076726853847504,
0.05928782373666763,
-0.06607253104448318,
0.20298881828784943,
0.27598971128463745,
0.0016168050933629274,
0.21198812127113342,
-0.017030181363224983,
0.0285054799169302,
-0.02460191398859024,
-0.052016615867614746,
-0.05323508009314537,
0.1378861516714096,
-0.1569969356060028,
-0.035748884081840515,
0.08250495791435242,
-0.025394225493073463,
0.021964622661471367,
-0.09080037474632263,
-0.05734798684716225,
0.01673114113509655,
-0.002342566614970565,
-0.06542423367500305,
0.12996181845664978,
0.010641573928296566,
0.09596330672502518,
0.002420063130557537,
-0.19982923567295074,
0.03483551740646362,
0.01139542181044817,
-0.006506624631583691,
0.13081349432468414,
-0.13967536389827728,
-0.24842830002307892,
-0.07502023130655289,
-0.14845596253871918,
-0.07964026182889938,
0.007735030259937048,
0.06191271170973778,
-0.05593528598546982,
-0.033225592225790024,
-0.016413308680057526,
-0.008912109769880772,
-0.09484338015317917,
0.02955559454858303,
-0.08007267117500305,
-0.028536735102534294,
-0.03898686170578003,
-0.1296738088130951,
-0.07855752110481262,
-0.015572071075439453,
-0.016387516632676125,
0.07340014725923538,
-0.02742318995296955,
0.10392141342163086,
0.0782850831747055,
-0.018368834629654884,
0.03787808492779732,
-0.02315249852836132,
0.2606695592403412,
-0.04100457578897476,
0.049619004130363464,
0.1326572746038437,
0.0597333200275898,
0.06183160841464996,
0.17997293174266815,
0.05870576202869415,
-0.028884507715702057,
-0.041431307792663574,
-0.005551045760512352,
-0.06693973392248154,
-0.0972554162144661,
-0.15176424384117126,
-0.10038519650697708,
-0.006389545276761055,
0.0449887253344059,
0.06295733898878098,
0.09014013409614563,
0.05040156468749046,
0.07151132822036743,
-0.0599549375474453,
-0.051480650901794434,
-0.0003631616127677262,
0.24206703901290894,
-0.06359812617301941,
0.13300596177577972,
-0.04468072950839996,
-0.11328347027301788,
0.11855494976043701,
0.012394876219332218,
0.0059822616167366505,
0.09849844127893448,
0.02779664471745491,
0.07586938887834549,
0.12022653222084045,
0.0546613484621048,
0.10423853993415833,
0.0017578459810465574,
-0.035373758524656296,
-0.05520827695727348,
-0.05475872755050659,
0.0439101904630661,
0.03781898319721222,
0.12012837082147598,
-0.11779672652482986,
-0.056066226214170456,
-0.1552804559469223,
0.07773934304714203,
-0.005935414228588343,
0.11287423968315125,
-0.1845899075269699,
0.019577549770474434,
0.051185715943574905,
0.022427666932344437,
-0.081080362200737,
0.048717666417360306,
0.12076308578252792,
-0.10047850012779236,
0.060577813535928726,
0.012261523865163326,
0.07174098491668701,
0.1147206574678421,
0.08212228119373322,
-0.026426009833812714,
-0.05340701341629028,
0.03481542319059372,
0.06022244319319725,
-0.23572120070457458,
0.2806490659713745,
-0.007786135654896498,
-0.09285714477300644,
-0.04815855994820595,
0.004015514627099037,
0.07152488827705383,
0.16836221516132355,
0.17671774327754974,
0.05537668243050575,
-0.02575857564806938,
-0.07131856679916382,
0.014530264772474766,
0.008080749772489071,
0.036764442920684814,
0.0060120997950434685,
-0.03107486478984356,
-0.019610751420259476,
-0.035737331956624985,
0.010475367307662964,
0.15162187814712524,
-0.10056700557470322,
-0.16314572095870972,
0.05310263857245445,
0.07534237951040268,
-0.05338887870311737,
-0.015112068504095078,
-0.09395692497491837,
-0.25301826000213623,
0.1296142190694809,
0.021167786791920662,
-0.004266274161636829,
-0.12139653414487839,
-0.04775094613432884,
0.1260310709476471,
-0.057386722415685654,
0.08500345051288605,
-0.08965660631656647,
0.026967043057084084,
-0.1142403855919838,
-0.08510865271091461,
0.1245473325252533,
-0.09408168494701385,
-0.005353205371648073,
-0.06362278014421463,
0.06990408152341843,
-0.06549380719661713,
0.04558330401778221,
0.052199289202690125,
0.06582561880350113,
-0.10781524330377579,
-0.11286425590515137,
0.010256418026983738,
-0.09821386635303497,
0.058070313185453415,
-0.04256858676671982,
-0.056632157415151596,
-0.0661081001162529,
0.08723205327987671,
0.02438628487288952,
0.2467849850654602,
0.22447824478149414,
-0.12101367115974426,
0.16616788506507874,
0.11209249496459961,
-0.023681648075580597,
-0.318008691072464,
-0.052360642701387405,
-0.132961705327034,
0.011826717294752598,
0.019867360591888428,
-0.14736305177211761,
0.03786960616707802,
0.011794484220445156,
-0.04931998252868652,
0.08734884858131409,
-0.19281411170959473,
-0.07158942520618439,
0.18993502855300903,
-0.04983458295464516,
0.36226916313171387,
-0.0892128124833107,
-0.05344073474407196,
-0.05168433114886284,
-0.12608391046524048,
0.09015713632106781,
-0.02094453014433384,
0.05175820365548134,
0.0033722028601914644,
0.008306140080094337,
0.008288874290883541,
-0.06075238063931465,
0.15884728729724884,
-0.024242252111434937,
-0.0031549199484288692,
-0.06798875331878662,
-0.1460718810558319,
0.1045774519443512,
0.008138653822243214,
0.025202345103025436,
-0.11630796641111374,
-0.01356478314846754,
-0.14443790912628174,
-0.009726359508931637,
-0.1262349933385849,
0.14169085025787354,
-0.03657890483736992,
-0.054886795580387115,
-0.058280810713768005,
0.06209010258316994,
0.027199866250157356,
-0.004442454781383276,
0.02607676573097706,
-0.039930637925863266,
0.03827766329050064,
0.13907583057880402,
0.00627494789659977,
-0.12993644177913666,
-0.04052986949682236,
0.026880456134676933,
-0.06455688923597336,
0.0561927892267704,
-0.052244577556848526,
0.011829918250441551,
0.11237890273332596,
-0.026856034994125366,
0.07197096198797226,
0.0917583629488945,
0.009573114104568958,
-0.018825700506567955,
0.13955390453338623,
-0.13200703263282776,
-0.05555861070752144,
-0.015200119465589523,
-0.061416078358888626,
0.06256420165300369,
-0.0582452192902565,
0.09954883903265,
0.014856168068945408,
-0.008534849621355534,
-0.016221754252910614,
-0.009676420129835606,
-0.07368068397045135,
-0.013944012112915516,
0.07549767941236496,
0.027011191472411156,
-0.11314141750335693,
-0.04113328084349632,
0.010073554702103138,
-0.1530742198228836,
-0.015721825882792473,
0.06841681897640228,
-0.06422045826911926,
-0.12911830842494965,
0.016157913953065872,
0.11525481194257736,
-0.1390153020620346,
-0.028214598074555397,
0.00917518138885498,
-0.1559017300605774,
0.028852902352809906,
0.20001226663589478,
0.11260174959897995,
0.03073188103735447,
-0.01209003385156393,
-0.07875674217939377,
0.032265063375234604,
0.022656219080090523,
0.0014898008666932583,
0.008439130149781704,
-0.042263224720954895,
0.024518610909581184,
-0.0507645383477211,
0.10817639529705048,
-0.07582421600818634,
-0.006365222856402397,
-0.16345088183879852,
-0.04537329450249672,
-0.14773587882518768,
-0.05970773845911026,
-0.06848336011171341,
-0.0549931675195694,
0.0028266608715057373,
-0.09554124623537064,
-0.033074770122766495,
-0.05266052857041359,
-0.11496288329362869,
0.00032940879464149475,
0.03773384913802147,
0.08833526819944382,
-0.13008171319961548,
-0.037933990359306335,
0.09741678833961487,
-0.03512343019247055,
0.07865023612976074,
0.03800847753882408,
-0.03329635038971901,
0.0537128783762455,
-0.11226312816143036,
-0.15047965943813324,
0.030314944684505463,
0.022166498005390167,
0.09209892898797989,
-0.013680589385330677,
0.018428737297654152,
0.04086071997880936,
0.05980739742517471,
0.0474892221391201,
0.05925339087843895,
-0.06356367468833923,
0.025129539892077446,
0.03196973726153374,
-0.15851819515228271,
0.030432702973484993,
-0.025800596922636032,
0.14803873002529144,
-0.011180088855326176,
0.1092236116528511,
-0.03979549556970596,
0.04010948911309242,
-0.06403768807649612,
0.014697226695716381,
-0.06151088699698448,
-0.1455913931131363,
-0.02412160113453865,
-0.048934195190668106,
0.004691666457802057,
0.014166045002639294,
0.29826104640960693,
0.11288300156593323,
-0.06002506986260414,
0.03218778222799301,
0.03922595828771591,
-0.02411915734410286,
0.03504302725195885,
0.2307346761226654,
0.06782888621091843,
-0.025673456490039825,
-0.083501897752285,
0.07190622389316559,
0.04841916263103485,
0.12045808136463165,
0.08668752014636993,
0.135223388671875,
0.08867308497428894,
0.0762060210108757,
0.059118881821632385,
-0.02237301878631115,
-0.06132635846734047,
-0.08471529930830002,
0.01952962763607502,
0.08863931894302368,
-0.09165847301483154,
0.011730269528925419,
0.0607302300632,
-0.10437972843647003,
0.0746784508228302,
-0.06277825683355331,
-0.022391000762581825,
-0.13022702932357788,
-0.05616685375571251,
-0.06667184084653854,
-0.06254972517490387,
-0.012170902453362942,
-0.04389916732907295,
0.05998504161834717,
0.06453361362218857,
0.0599384680390358,
-0.00717788003385067,
0.03270005062222481,
-0.15877462923526764,
-0.014271106570959091,
0.04039904102683067,
0.018825791776180267,
0.055278994143009186,
-0.06503951549530029,
0.005794626194983721,
-0.12397710978984833,
0.003910339903086424,
-0.051749370992183685,
0.03334962949156761,
-0.015627559274435043,
-0.003958967048674822,
-0.09672816097736359,
-0.07200850546360016,
-0.07942795008420944,
0.010011098347604275,
0.013638630509376526,
0.14784115552902222,
0.013510608114302158,
0.019081665202975273,
0.013490424491465092,
0.11321700364351273,
-0.054023537784814835,
-0.10861475765705109,
-0.08397713303565979,
0.10748562961816788,
-0.018139608204364777,
0.1272696554660797,
-0.05931311473250389,
0.016071604564785957,
-0.08661205321550369,
0.26905158162117004,
0.39296770095825195,
-0.11485283076763153,
0.04979337379336357,
0.0635669082403183,
0.018329473212361336,
0.08997660875320435,
0.04059018939733505,
0.06071699038147926,
0.2269883155822754,
-0.09116072952747345,
-0.05470682680606842,
-0.08150431513786316,
-0.037314850836992264,
-0.08152701705694199,
0.08627539128065109,
0.057978130877017975,
-0.06756996363401413,
-0.05153525248169899,
0.0740206241607666,
-0.06888891011476517,
0.07249405235052109,
0.10174285620450974,
-0.19203004240989685,
-0.06082887947559357,
0.005403841845691204,
0.09376700222492218,
0.05711963400244713,
0.09246642887592316,
-0.04213371500372887,
-0.06445156782865524,
0.07009068131446838,
-0.008361869491636753,
-0.16803531348705292,
-0.038433704525232315,
0.08950939774513245,
0.02667950466275215,
0.18807674944400787,
-0.0005099590634927154,
0.03139420598745346,
0.1214979887008667,
0.05415647104382515,
-0.046244919300079346,
0.0011629491345956922,
0.049990370869636536,
-0.0393093004822731,
-0.07452265918254852,
-0.08952166140079498,
0.002295152749866247,
-0.11671366542577744,
0.01639942266047001,
-0.11932079493999481,
0.10310766845941544,
0.04480323567986488,
-0.10430633276700974,
-0.01855236105620861,
0.09543867409229279,
-0.08262983709573746,
0.06336088478565216,
0.08638268709182739,
0.016153603792190552,
-0.0313429981470108,
-0.024645714089274406,
0.011371978558599949,
0.05877434089779854,
-0.08093944936990738,
-0.12266074866056442,
-0.01787024736404419,
-0.04037834703922272,
-0.015974314883351326,
-0.01874420791864395,
-0.16629183292388916,
-0.03800012916326523,
-0.08927217125892639,
0.09627308696508408,
-0.05959710106253624,
0.02650132030248642,
0.06838878989219666,
-0.008926006965339184,
0.0032927505671977997,
0.027026964351534843,
0.016324710100889206,
0.03265579789876938,
-0.06157737970352173,
-0.031504083424806595
] |
null | null |
transformers
|
# FlauBERT: Unsupervised Language Model Pre-training for French
**FlauBERT** is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) [Jean Zay](http://www.idris.fr/eng/jean-zay/ ) supercomputer.
Along with FlauBERT comes [**FLUE**](https://github.com/getalp/Flaubert/tree/master/flue): an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the [official website](https://github.com/getalp/Flaubert).
## FlauBERT models
| Model name | Number of layers | Attention Heads | Embedding Dimension | Total Parameters |
| :------: | :---: | :---: | :---: | :---: |
| `flaubert-small-cased` | 6 | 8 | 512 | 54 M |
| `flaubert-base-uncased` | 12 | 12 | 768 | 137 M |
| `flaubert-base-cased` | 12 | 12 | 768 | 138 M |
| `flaubert-large-cased` | 24 | 16 | 1024 | 373 M |
**Note:** `flaubert-small-cased` is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
## Using FlauBERT with Hugging Face's Transformers
```python
import torch
from transformers import FlaubertModel, FlaubertTokenizer
# Choose among ['flaubert/flaubert_small_cased', 'flaubert/flaubert_base_uncased',
# 'flaubert/flaubert_base_cased', 'flaubert/flaubert_large_cased']
modelname = 'flaubert/flaubert_base_cased'
# Load pretrained model and tokenizer
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname, do_lowercase=False)
# do_lowercase=False if using cased models, True if using uncased ones
sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]
print(last_layer.shape)
# torch.Size([1, 8, 768]) -> (batch size x number of tokens x embedding dimension)
# The BERT [CLS] token correspond to the first hidden state of the last layer
cls_embedding = last_layer[:, 0, :]
```
**Notes:** if your `transformers` version is <=2.10.0, `modelname` should take one
of the following values:
```
['flaubert-small-cased', 'flaubert-base-uncased', 'flaubert-base-cased', 'flaubert-large-cased']
```
## References
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
[LREC paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.302.pdf)
```
@InProceedings{le2020flaubert,
author = {Le, Hang and Vial, Lo\"{i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb\'{e}, Beno\^{i}t and Besacier, Laurent and Schwab, Didier},
title = {FlauBERT: Unsupervised Language Model Pre-training for French},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {2479--2490},
url = {https://www.aclweb.org/anthology/2020.lrec-1.302}
}
```
[TALN paper](https://hal.archives-ouvertes.fr/hal-02784776/)
```
@inproceedings{le2020flaubert,
title = {FlauBERT: des mod{\`e}les de langue contextualis{\'e}s pr{\'e}-entra{\^\i}n{\'e}s pour le fran{\c{c}}ais},
author = {Le, Hang and Vial, Lo{\"\i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb{\'e}, Beno{\^\i}t and Besacier, Laurent and Schwab, Didier},
booktitle = {Actes de la 6e conf{\'e}rence conjointe Journ{\'e}es d'{\'E}tudes sur la Parole (JEP, 31e {\'e}dition), Traitement Automatique des Langues Naturelles (TALN, 27e {\'e}dition), Rencontre des {\'E}tudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (R{\'E}CITAL, 22e {\'e}dition). Volume 2: Traitement Automatique des Langues Naturelles},
pages = {268--278},
year = {2020},
organization = {ATALA}
}
```
|
{"language": "fr", "license": "mit", "tags": ["bert", "language-model", "flaubert", "flue", "french", "flaubert-small", "cased"], "datasets": ["flaubert"], "metrics": ["flue"]}
|
fill-mask
|
flaubert/flaubert_small_cased
|
[
"transformers",
"pytorch",
"flaubert",
"fill-mask",
"bert",
"language-model",
"flue",
"french",
"flaubert-small",
"cased",
"fr",
"dataset:flaubert",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-small #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FlauBERT: Unsupervised Language Model Pre-training for French
=============================================================
FlauBERT is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) Jean Zay supercomputer.
Along with FlauBERT comes FLUE: an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language.For more details please refer to the official website.
FlauBERT models
---------------
Note: 'flaubert-small-cased' is partially trained so performance is not guaranteed. Consider using it for debugging purpose only.
Using FlauBERT with Hugging Face's Transformers
-----------------------------------------------
Notes: if your 'transformers' version is <=2.10.0, 'modelname' should take one
of the following values:
References
----------
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
LREC paper
TALN paper
|
[] |
[
"TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-small #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #flaubert #fill-mask #bert #language-model #flue #french #flaubert-small #cased #fr #dataset-flaubert #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
-0.025711165741086006,
0.07708094269037247,
-0.0053148046135902405,
0.09785391390323639,
0.038614749908447266,
-0.0020832475274801254,
0.10573698580265045,
0.0799461230635643,
0.015016863122582436,
0.01853981241583824,
0.17120608687400818,
0.05714277923107147,
-0.014544486999511719,
0.12664540112018585,
-0.003513304516673088,
-0.29467764496803284,
0.0624074749648571,
0.0036398046649992466,
-0.0746241882443428,
0.06868281215429306,
0.13639581203460693,
-0.06555881351232529,
0.10894595831632614,
0.05292198434472084,
-0.05196097865700722,
0.04247714951634407,
0.035604219883680344,
-0.08496151119470596,
0.13963523507118225,
0.09164544194936752,
0.13278305530548096,
0.054985225200653076,
0.019673824310302734,
-0.08710119873285294,
0.025067375972867012,
-0.010188870131969452,
-0.07726605981588364,
0.040999047458171844,
0.023571383208036423,
-0.013676564209163189,
0.14663642644882202,
0.011179577559232712,
-0.010944748297333717,
0.039695776998996735,
-0.10630957037210464,
-0.15769705176353455,
-0.06765873730182648,
0.013363105244934559,
-0.09336603432893753,
0.075559601187706,
0.0008581596775911748,
0.12835049629211426,
-0.14334212243556976,
0.053422607481479645,
0.1054949015378952,
-0.3419117331504822,
-0.027495499700307846,
0.10786110162734985,
0.14933167397975922,
-0.006243622396141291,
-0.09044156223535538,
0.08264721184968948,
0.05459102243185043,
0.018422430381178856,
0.05807704105973244,
-0.07573293894529343,
-0.06999335438013077,
0.04459717497229576,
-0.07884878665208817,
-0.014951440505683422,
0.11105817556381226,
-0.018764441832900047,
0.01855500042438507,
-0.08718963712453842,
-0.06122651696205139,
-0.00094973249360919,
-0.03692660480737686,
-0.015511809848248959,
0.042296528816223145,
0.05041828006505966,
0.0647110864520073,
-0.06079944223165512,
-0.11628979444503784,
0.03551439195871353,
-0.20998907089233398,
0.06585197895765305,
-0.024561282247304916,
0.060120828449726105,
-0.09535223245620728,
-0.015206800773739815,
-0.05005768686532974,
-0.11289595067501068,
0.024193722754716873,
-0.067935049533844,
0.09313152730464935,
0.014914466999471188,
-0.05410108342766762,
0.055080097168684006,
0.1280391663312912,
0.15786103904247284,
-0.02683313563466072,
-0.0001927650155266747,
-0.01849639043211937,
0.09677768498659134,
0.044976167380809784,
0.09632433950901031,
-0.06906996667385101,
-0.0908256471157074,
-0.016229569911956787,
-0.07750149816274643,
0.013966756872832775,
-0.05041735991835594,
-0.15146353840827942,
-0.040455158799886703,
-0.06108366325497627,
0.02256128378212452,
0.01969350501894951,
0.030196538195014,
-0.02836422808468342,
0.049124401062726974,
0.08943867683410645,
-0.050874996930360794,
0.06970193237066269,
0.03026292659342289,
0.019350886344909668,
0.12213481962680817,
-0.01938321255147457,
0.005859480705112219,
0.04628235474228859,
0.05660460516810417,
-0.10474223643541336,
-0.023547213524580002,
-0.021293584257364273,
-0.13748130202293396,
0.047288041561841965,
-0.13661062717437744,
0.036298442631959915,
-0.14314383268356323,
-0.057354871183633804,
0.017768559977412224,
0.07308967411518097,
-0.07492250949144363,
0.0017640533624216914,
0.003278017044067383,
-0.0737101212143898,
0.0786866545677185,
-0.019263630732893944,
0.009223707020282745,
-0.07782495021820068,
0.05264440178871155,
-0.07891544699668884,
0.10493113845586777,
-0.2033286988735199,
0.02959965169429779,
-0.07064545154571533,
-0.001645081676542759,
-0.10731853544712067,
-0.028019387274980545,
-0.09072759747505188,
0.030824167653918266,
-0.03561171144247055,
-0.09609760344028473,
-0.014161830767989159,
0.07854559272527695,
-0.010619457811117172,
0.11654829978942871,
-0.18243034183979034,
-0.07691876590251923,
0.16105644404888153,
-0.10305892676115036,
-0.08006396889686584,
0.11901407688856125,
-0.021392500028014183,
0.02358825132250786,
-0.010410737246274948,
0.180743008852005,
0.030718177556991577,
-0.15160509943962097,
0.03900633007287979,
0.12595395743846893,
-0.04391312599182129,
-0.04786965996026993,
0.08138775825500488,
-0.018578730523586273,
-0.08193334937095642,
0.023251937702298164,
-0.015336333774030209,
0.08652111887931824,
-0.049603644758462906,
-0.0703430101275444,
-0.0015192501014098525,
-0.04220764338970184,
0.15315143764019012,
0.03433562070131302,
0.08334308862686157,
-0.09056399017572403,
-0.041560642421245575,
0.012520604766905308,
0.009419688023626804,
0.06380684673786163,
-0.006763504818081856,
-0.05444101616740227,
0.15135715901851654,
0.056846924126148224,
-0.04397956654429436,
-0.10757613182067871,
-0.05395258590579033,
-0.05468408763408661,
0.053824808448553085,
-0.028725871816277504,
0.239209845662117,
0.10175953060388565,
-0.007197430357336998,
-0.017644451931118965,
0.06042710319161415,
0.0764743983745575,
0.05159260705113411,
0.004270830191671848,
-0.13466332852840424,
0.029194997623562813,
-0.0947888121008873,
0.04858113452792168,
-0.12005100399255753,
-0.027163609862327576,
0.05115630850195885,
0.09203289449214935,
-0.02596675045788288,
0.05895133689045906,
-0.07613228261470795,
0.03778592869639397,
-0.0656212568283081,
0.00014542222197633237,
0.07399866729974747,
0.006104654632508755,
0.014896625652909279,
0.14526315033435822,
-0.04737357050180435,
0.3048739731311798,
0.2024160623550415,
-0.12686236202716827,
-0.038633380085229874,
-0.012067039497196674,
-0.023214135318994522,
0.057686690241098404,
0.014549069106578827,
-0.005254361778497696,
-0.01451980322599411,
-0.026944393292069435,
0.1452304869890213,
-0.047689907252788544,
0.017183363437652588,
0.05432403087615967,
-0.04239720478653908,
-0.1057034432888031,
0.0633058026432991,
0.06094187870621681,
-0.036532189697027206,
0.20882548391819,
0.25177237391471863,
0.009496132843196392,
0.196907639503479,
-0.017709434032440186,
0.037297602742910385,
-0.01933078095316887,
-0.053810395300388336,
-0.04232348874211311,
0.13931481540203094,
-0.14537787437438965,
-0.035611242055892944,
0.08093035221099854,
-0.033201854676008224,
0.02200446091592312,
-0.09631969034671783,
-0.05560123920440674,
0.018784480169415474,
-0.004272899124771357,
-0.05374099686741829,
0.12573638558387756,
0.0010454216971993446,
0.09370647370815277,
-0.0015400485135614872,
-0.21000289916992188,
0.043125610798597336,
0.018230505287647247,
-0.013796125538647175,
0.13197766244411469,
-0.14678403735160828,
-0.24441726505756378,
-0.08202985674142838,
-0.14244027435779572,
-0.06387772411108017,
0.010258020833134651,
0.06515490263700485,
-0.05160244554281235,
-0.046949807554483414,
-0.022730819880962372,
-0.037091679871082306,
-0.08012836426496506,
0.01779729314148426,
-0.08413787931203842,
-0.02107042632997036,
-0.05008777230978012,
-0.1366010159254074,
-0.07652120292186737,
-0.007812843658030033,
-0.0011238469742238522,
0.07950717210769653,
-0.026014741510152817,
0.07835976034402847,
0.09147413074970245,
-0.024470314383506775,
0.04005221277475357,
-0.026636866852641106,
0.2649618983268738,
-0.030588261783123016,
0.0590047761797905,
0.1342015415430069,
0.0665254145860672,
0.05681663006544113,
0.17506060004234314,
0.056631606072187424,
-0.028457509353756905,
-0.0348690003156662,
-0.004516839049756527,
-0.06938117742538452,
-0.09704766422510147,
-0.15150707960128784,
-0.10129973292350769,
0.002255244180560112,
0.046679288148880005,
0.0704246461391449,
0.09976918995380402,
0.06089312955737114,
0.08749011904001236,
-0.04459799453616142,
-0.04451792314648628,
0.0020664329640567303,
0.2474433183670044,
-0.055556267499923706,
0.13719329237937927,
-0.05286417156457901,
-0.11654604226350784,
0.11425117403268814,
-0.013100825250148773,
-0.01069052517414093,
0.10410646349191666,
0.016423813998699188,
0.07522706687450409,
0.13063426315784454,
0.07293249666690826,
0.11214601993560791,
0.012476748786866665,
-0.03711455315351486,
-0.05039963126182556,
-0.05663329362869263,
0.030811620876193047,
0.03236536309123039,
0.09754136204719543,
-0.11998439580202103,
-0.05804116651415825,
-0.13895465433597565,
0.06067870184779167,
0.006356256548315287,
0.10636518150568008,
-0.17694100737571716,
0.008779490366578102,
0.04841279238462448,
0.02352927066385746,
-0.07750731706619263,
0.05465321242809296,
0.11605654656887054,
-0.10972362756729126,
0.045319605618715286,
0.00940786488354206,
0.07058114558458328,
0.11849064379930496,
0.07678798586130142,
-0.020352384075522423,
-0.06755492091178894,
0.04184514656662941,
0.063385970890522,
-0.2347584217786789,
0.25761309266090393,
-0.016080789268016815,
-0.06674358993768692,
-0.05508764833211899,
0.008848713710904121,
0.07431007921695709,
0.16413208842277527,
0.1700575053691864,
0.058366481214761734,
-0.059350255876779556,
-0.08559117466211319,
0.012769373133778572,
0.0014434725744649768,
0.033216044306755066,
0.016980299726128578,
-0.03707325458526611,
-0.03251535817980766,
-0.03681159391999245,
0.012550923973321915,
0.12176524102687836,
-0.10098499059677124,
-0.1590656191110611,
0.060734640806913376,
0.07232441753149033,
-0.05083784461021423,
-0.014085139147937298,
-0.10073696076869965,
-0.24490268528461456,
0.12403850257396698,
0.009462802670896053,
-0.010565947741270065,
-0.12510395050048828,
-0.05924181640148163,
0.10259614139795303,
-0.05342797935009003,
0.09609158337116241,
-0.07429154217243195,
0.0338653139770031,
-0.10936484485864639,
-0.1057620644569397,
0.11474418640136719,
-0.10237545520067215,
-0.007325630635023117,
-0.06047864258289337,
0.05320703983306885,
-0.06044776365160942,
0.048123352229595184,
0.05920523777604103,
0.061890482902526855,
-0.09936438500881195,
-0.1113346517086029,
-0.010285452008247375,
-0.1032876968383789,
0.04173991456627846,
-0.05857980623841286,
-0.05902033671736717,
-0.0618225522339344,
0.08732607215642929,
0.022722724825143814,
0.24279560148715973,
0.20288145542144775,
-0.1308935433626175,
0.18101666867733002,
0.10740078240633011,
-0.03606295585632324,
-0.3300617039203644,
-0.05645492672920227,
-0.13977187871932983,
0.006705355364829302,
0.026658592745661736,
-0.136381596326828,
0.059483520686626434,
-0.0010233876528218389,
-0.05439447984099388,
0.08075238764286041,
-0.20584897696971893,
-0.07247748225927353,
0.20580847561359406,
-0.03204631432890892,
0.3430027961730957,
-0.09829720109701157,
-0.057045165449380875,
-0.04258126765489578,
-0.14983220398426056,
0.09408595412969589,
-0.035794440656900406,
0.05424086004495621,
0.001068496610969305,
0.005959521513432264,
0.005508817732334137,
-0.05657196789979935,
0.14635561406612396,
-0.013721106573939323,
-0.0018377960659563541,
-0.08118198812007904,
-0.1401560753583908,
0.10985217243432999,
0.016391590237617493,
0.017857640981674194,
-0.10185529291629791,
-0.016486328095197678,
-0.1455005556344986,
0.00000839150834508473,
-0.11346389353275299,
0.14807899296283722,
-0.03458943963050842,
-0.059292424470186234,
-0.06251800805330276,
0.06443870067596436,
0.026868754997849464,
-0.007638367358595133,
0.03233477473258972,
-0.03193608671426773,
0.060074400156736374,
0.14482717216014862,
-0.0008825515978969634,
-0.12763161957263947,
-0.03763384744524956,
0.01974640041589737,
-0.05853606015443802,
0.05137612298130989,
-0.04313785955309868,
0.0035327693913131952,
0.11238034814596176,
-0.02390696294605732,
0.07605729252099991,
0.09425091743469238,
0.01772460900247097,
-0.02192285843193531,
0.14077012240886688,
-0.13269567489624023,
-0.03501331806182861,
-0.012409446761012077,
-0.06584769487380981,
0.06472889333963394,
-0.06698629260063171,
0.10124903172254562,
-0.00943102315068245,
-0.0085962088778615,
-0.01756766252219677,
-0.004385015927255154,
-0.078116774559021,
-0.009423982352018356,
0.07532200962305069,
0.027723239734768867,
-0.11049535870552063,
-0.038697049021720886,
0.013188592158257961,
-0.15532611310482025,
-0.013455831445753574,
0.06932353973388672,
-0.07527977973222733,
-0.13736675679683685,
0.05515407770872116,
0.12309665977954865,
-0.14839856326580048,
-0.025331491604447365,
0.000278819992672652,
-0.1701512634754181,
0.027848610654473305,
0.20086033642292023,
0.10992135107517242,
0.03891634941101074,
-0.020659776404500008,
-0.07496728748083115,
0.01987871527671814,
0.022623086348176003,
0.01186207216233015,
0.004147916100919247,
-0.04033908620476723,
0.03811587020754814,
-0.044696394354104996,
0.1056780070066452,
-0.08045752346515656,
-0.010417409241199493,
-0.17481078207492828,
-0.042360417544841766,
-0.1315542757511139,
-0.03397505357861519,
-0.07927875220775604,
-0.04923971742391586,
0.00829261913895607,
-0.08286622166633606,
-0.04042639583349228,
-0.046216193586587906,
-0.11216793209314346,
0.009120438247919083,
0.0371130108833313,
0.08717932552099228,
-0.11751123517751694,
-0.03236841410398483,
0.09451986849308014,
-0.032034896314144135,
0.06701956689357758,
0.0484299473464489,
-0.03621464967727661,
0.05717208981513977,
-0.11052735149860382,
-0.14410147070884705,
0.022908352315425873,
0.022815633565187454,
0.09691066294908524,
-0.0124546242877841,
0.0259922556579113,
0.043356385082006454,
0.0549369715154171,
0.05038842186331749,
0.08608365058898926,
-0.06607935577630997,
0.018657950684428215,
0.020552413538098335,
-0.1490126997232437,
0.02197328768670559,
-0.028324682265520096,
0.16306708753108978,
-0.02719726413488388,
0.1197151243686676,
-0.05416841804981232,
0.04525483772158623,
-0.051186636090278625,
0.009718302637338638,
-0.06428097188472748,
-0.14327751100063324,
-0.028602976351976395,
-0.048495516180992126,
-0.0003569654072634876,
0.010864007286727428,
0.2814100682735443,
0.13046903908252716,
-0.05687787011265755,
0.03337851166725159,
0.030108662322163582,
-0.021589454263448715,
0.033600084483623505,
0.226088285446167,
0.0768994390964508,
-0.031345874071121216,
-0.08805409073829651,
0.07116998732089996,
0.04708600416779518,
0.09340100735425949,
0.0988425612449646,
0.12743782997131348,
0.06840486824512482,
0.089663065969944,
0.050473883748054504,
-0.027119601145386696,
-0.061326079070568085,
-0.07778659462928772,
0.024118199944496155,
0.08915739506483078,
-0.09944621473550797,
0.009579142555594444,
0.06477218866348267,
-0.1003069281578064,
0.08036128431558609,
-0.07301294803619385,
-0.03028409741818905,
-0.14149931073188782,
-0.07448655366897583,
-0.06224429979920387,
-0.06728547811508179,
-0.01589994505047798,
-0.040147826075553894,
0.058139409869909286,
0.06375259160995483,
0.0664532259106636,
-0.0038591634947806597,
0.028601408004760742,
-0.18071135878562927,
-0.0027917323168367147,
0.03463279455900192,
0.01161332055926323,
0.04960833489894867,
-0.0919300764799118,
0.004782691597938538,
-0.1350601315498352,
0.004984715487807989,
-0.054325804114341736,
0.02876276522874832,
0.001709054340608418,
0.0036107918713241816,
-0.09334467351436615,
-0.07647763937711716,
-0.06865942478179932,
-0.0019912973511964083,
0.0065512885339558125,
0.14657196402549744,
0.01941075176000595,
0.026121551170945168,
0.019653024151921272,
0.11426038295030594,
-0.05359364673495293,
-0.13616493344306946,
-0.08390173316001892,
0.10972913354635239,
-0.007874789647758007,
0.1247735545039177,
-0.057596415281295776,
0.021941011771559715,
-0.0875561535358429,
0.2911628484725952,
0.3926505446434021,
-0.10255169868469238,
0.05508247762918472,
0.0701628103852272,
0.011837095022201538,
0.09060882776975632,
0.048233214765787125,
0.051738008856773376,
0.2222217619419098,
-0.09046031534671783,
-0.054162830114364624,
-0.09273049980401993,
-0.04615050181746483,
-0.09279460459947586,
0.09364214539527893,
0.057461928576231,
-0.06630418449640274,
-0.044297877699136734,
0.06625492870807648,
-0.08089639246463776,
0.09247488528490067,
0.11157431453466415,
-0.1826648861169815,
-0.05649120733141899,
0.0009037640411406755,
0.08849543333053589,
0.06785320490598679,
0.08955472707748413,
-0.032526008784770966,
-0.060900453478097916,
0.07439383864402771,
-0.004840359557420015,
-0.1680312305688858,
-0.031212201341986656,
0.0882144570350647,
-0.0049402411095798016,
0.19747895002365112,
-0.006637440994381905,
0.02550453133881092,
0.12354534864425659,
0.07258624583482742,
-0.0523524284362793,
0.006224335636943579,
0.04685194417834282,
-0.05170975625514984,
-0.062104105949401855,
-0.09374874085187912,
-0.0003674123727250844,
-0.10837506502866745,
0.0068703824654221535,
-0.1205538883805275,
0.11260499060153961,
0.04331614449620247,
-0.08699501305818558,
-0.012304170988500118,
0.08532001078128815,
-0.0803588405251503,
0.06234296038746834,
0.07921423017978668,
0.0209344569593668,
-0.03991236165165901,
-0.02268623188138008,
0.00837014615535736,
0.050630901008844376,
-0.07875526696443558,
-0.1099335327744484,
-0.021760856732726097,
-0.0399625264108181,
-0.027026811614632607,
-0.007953262887895107,
-0.1661440134048462,
-0.037727825343608856,
-0.09727953374385834,
0.08389591425657272,
-0.06971289962530136,
0.02350008673965931,
0.0781843438744545,
-0.008130521513521671,
-0.001218560733832419,
0.0216978769749403,
0.011893361806869507,
0.04474666714668274,
-0.055300161242485046,
-0.03441067039966583
] |
null | null |
transformers
|
MLM fine-tuned from Bertimbau-Base model on the Brazilian Federal Official Gazette (200k instances)
|
{}
|
fill-mask
|
flavio-nakasato/berdou_200k
|
[
"transformers",
"pytorch",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
MLM fine-tuned from Bertimbau-Base model on the Brazilian Federal Official Gazette (200k instances)
|
[] |
[
"TAGS\n#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
36
] |
[
"passage: TAGS\n#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.06357412785291672,
0.00690077617764473,
-0.008467365056276321,
0.020235946401953697,
0.12968459725379944,
0.03302915394306183,
0.09807441383600235,
0.07729126513004303,
0.10806342214345932,
-0.009440856985747814,
0.15823203325271606,
0.20325462520122528,
-0.03393663093447685,
0.18361465632915497,
-0.065280981361866,
-0.2617916762828827,
0.06820162385702133,
0.06229938939213753,
-0.06546879559755325,
0.11254725605249405,
0.05687131732702255,
-0.0869792252779007,
0.07119062542915344,
-0.02728140354156494,
-0.10962541401386261,
0.04230697825551033,
0.05219662934541702,
-0.10162917524576187,
0.12035926431417465,
0.021910926327109337,
0.2123224288225174,
0.016079774126410484,
-0.07168376445770264,
-0.09227655827999115,
0.046415410935878754,
-0.0007212258642539382,
-0.07019324600696564,
0.04336128383874893,
0.01872076466679573,
-0.07103253155946732,
-0.03834306448698044,
0.05254431441426277,
0.027913549914956093,
0.0400172658264637,
-0.146876260638237,
-0.1195456013083458,
-0.012633946724236012,
0.03010692074894905,
0.04268079623579979,
0.060200124979019165,
0.019220897927880287,
0.21056215465068817,
-0.12365733087062836,
0.10502377897500992,
0.15344665944576263,
-0.3129054605960846,
0.002867099829018116,
0.06838289648294449,
0.07103738188743591,
-0.04424971342086792,
-0.023489616811275482,
0.05856989696621895,
0.01071459986269474,
0.022148624062538147,
0.044038139283657074,
-0.07715853303670883,
-0.044328734278678894,
0.01152737345546484,
-0.0813736543059349,
-0.059741389006376266,
0.16159793734550476,
-0.04301191866397858,
0.04482201114296913,
0.012044340372085571,
-0.13646052777767181,
-0.04112602770328522,
-0.0220506452023983,
-0.0066766394302248955,
-0.034124407917261124,
0.043702688068151474,
-0.030891025438904762,
-0.01450628973543644,
-0.11146465688943863,
0.02620311640202999,
-0.2388714849948883,
0.25444263219833374,
0.025913868099451065,
0.06962989270687103,
-0.19036757946014404,
0.04825044423341751,
-0.032656311988830566,
-0.12202991545200348,
0.05392675846815109,
-0.09348990768194199,
0.023280160501599312,
-0.004289025440812111,
-0.06674729287624359,
-0.024281397461891174,
0.07810400426387787,
0.19070357084274292,
0.07175330817699432,
0.038729287683963776,
0.022616418078541756,
0.10197576135396957,
0.015252627432346344,
0.0927167534828186,
0.02304348163306713,
-0.03693375736474991,
0.058425456285476685,
-0.11234977096319199,
0.02645397186279297,
-0.06413000077009201,
-0.13045111298561096,
-0.03655298054218292,
0.026817962527275085,
0.07910523563623428,
0.039121128618717194,
0.05921967700123787,
-0.09775126725435257,
-0.00039136706618592143,
0.10266957432031631,
-0.07596733421087265,
0.011552278883755207,
-0.012669426389038563,
0.05071375519037247,
0.10554905235767365,
0.019699934870004654,
-0.013951314613223076,
-0.02595921792089939,
0.12606576085090637,
-0.07414942234754562,
-0.0338914580643177,
-0.057671189308166504,
-0.0717587098479271,
0.04045988991856575,
-0.12276265770196915,
0.03537328913807869,
-0.18295785784721375,
-0.12786880135536194,
0.05937539413571358,
0.05743318796157837,
0.0075002796947956085,
-0.02198064886033535,
0.027785688638687134,
0.0016502209473401308,
0.014095489867031574,
-0.051991820335388184,
-0.05198881775140762,
-0.03936923295259476,
0.10392222553491592,
0.01174288708716631,
0.12164439260959625,
-0.12003052234649658,
0.04834338277578354,
-0.08534543961286545,
0.014317753724753857,
-0.15386703610420227,
-0.04163011908531189,
-0.028065448626875877,
0.1477208137512207,
-0.0017313070129603148,
-0.044967345893383026,
-0.1107422336935997,
0.03536098450422287,
-0.008266955614089966,
0.174587219953537,
-0.0640043392777443,
-0.13445088267326355,
0.238468199968338,
-0.10142715275287628,
-0.15124346315860748,
0.08353633433580399,
0.00263609504327178,
-0.00937309768050909,
0.05675121024250984,
0.109283447265625,
0.03876260668039322,
-0.14184610545635223,
0.0926179513335228,
0.11292947828769684,
-0.13638317584991455,
-0.12760622799396515,
0.022435644641518593,
-0.00732642924413085,
-0.12322323024272919,
0.04600539803504944,
0.07860185950994492,
0.1112794280052185,
-0.07251705229282379,
-0.04695776477456093,
-0.01390511728823185,
-0.03809646517038345,
0.1488271951675415,
0.03689313679933548,
0.09978006780147552,
-0.07845763862133026,
-0.02166028693318367,
-0.028704503551125526,
-0.008114258758723736,
0.06035853177309036,
0.038866739720106125,
-0.08729325234889984,
0.1360790729522705,
-0.0566742941737175,
0.010620499961078167,
-0.180439755320549,
-0.12009736150503159,
-0.0016330704092979431,
0.05382363870739937,
-0.027322817593812943,
0.12601551413536072,
0.11395162343978882,
-0.03539265692234039,
-0.007137839682400227,
-0.03099343180656433,
0.09945479035377502,
0.025088751688599586,
-0.03798593953251839,
-0.0885278731584549,
0.007986658252775669,
-0.08452948927879333,
-0.014333197847008705,
0.01457307767122984,
0.002566321985796094,
0.00016168280853889883,
0.13817834854125977,
-0.0010485704988241196,
0.03795786574482918,
-0.05177360028028488,
0.04081299155950546,
-0.034957047551870346,
0.01450793631374836,
0.09004251658916473,
-0.000576441758312285,
-0.06362977623939514,
0.15637962520122528,
-0.14581918716430664,
0.35973721742630005,
0.19387078285217285,
-0.3088320195674896,
-0.016066158190369606,
0.01958218589425087,
-0.01481733750551939,
-0.0028315566014498472,
0.056414127349853516,
-0.015269504860043526,
0.04143389314413071,
0.014644528739154339,
0.15166911482810974,
-0.015120322816073895,
-0.02077334001660347,
0.027502890676259995,
-0.0772947371006012,
-0.04431246966123581,
0.03279697522521019,
0.09859511256217957,
-0.13104848563671112,
0.17962734401226044,
0.2618531882762909,
0.004645867273211479,
0.13293692469596863,
0.01004520058631897,
-0.0017370838904753327,
0.012384308502078056,
-0.03448771312832832,
-0.02204137109220028,
0.036397550255060196,
-0.19078975915908813,
-0.037138842046260834,
0.07815047353506088,
-0.030133357271552086,
0.05545393377542496,
-0.11835511028766632,
-0.03323771432042122,
0.029111113399267197,
0.05119411274790764,
-0.07707978785037994,
0.12659992277622223,
0.04097466543316841,
0.0710253193974495,
0.0037192106246948242,
-0.07951492071151733,
0.11071927845478058,
0.007798798382282257,
-0.038606591522693634,
0.15219268202781677,
-0.13388566672801971,
-0.3540363311767578,
-0.1352192908525467,
-0.186979740858078,
0.010174541734158993,
0.04617423936724663,
0.07225015014410019,
-0.08286191523075104,
-0.05899273604154587,
0.09581182152032852,
-0.003480511251837015,
-0.02892324887216091,
0.06940968334674835,
-0.06169416382908821,
0.011217288672924042,
-0.027349013835191727,
-0.06347832828760147,
-0.07560451328754425,
-0.028934668749570847,
-0.02698061801493168,
0.15005719661712646,
-0.09269136935472488,
0.08664495497941971,
0.13057461380958557,
0.0057759047485888,
0.07016542553901672,
-0.0002483248827047646,
0.18727800250053406,
-0.06556744873523712,
-0.005412220023572445,
0.18072476983070374,
-0.05880381539463997,
0.1026553139090538,
0.1556575745344162,
0.020712751895189285,
-0.05158966779708862,
0.00875561498105526,
-0.05700365826487541,
-0.11636948585510254,
-0.1564129889011383,
-0.11075278371572495,
-0.13123051822185516,
-0.011434734798967838,
0.05559059977531433,
0.04917698726058006,
0.13644592463970184,
0.08514466881752014,
0.03654884546995163,
-0.018586870282888412,
-0.06805557757616043,
0.0498523935675621,
0.17366138100624084,
-0.030056441202759743,
0.1334504783153534,
-0.036830224096775055,
-0.14371523261070251,
0.059510327875614166,
0.0252390094101429,
0.12022719532251358,
0.10808205604553223,
-0.004712763242423534,
0.03895212337374687,
0.16281089186668396,
0.1563887745141983,
0.16660696268081665,
0.025009524077177048,
-0.057338543236255646,
-0.004954719450324774,
-0.009356440976262093,
-0.058457158505916595,
0.02018333598971367,
0.15226905047893524,
-0.1055486798286438,
-0.051534514874219894,
-0.145093634724617,
0.05207017809152603,
0.09619975835084915,
0.06738487631082535,
-0.22444024682044983,
0.012990519404411316,
0.06385935842990875,
0.007989094592630863,
-0.06883342564105988,
0.03757710009813309,
-0.02228686586022377,
-0.13463854789733887,
0.06749572604894638,
-0.05030853673815727,
0.09488040208816528,
0.03667333722114563,
0.07960424572229385,
-0.03426273167133331,
-0.06298200786113739,
0.04128245636820793,
0.0669965147972107,
-0.2517971694469452,
0.2858309745788574,
-0.008294520899653435,
-0.051533956080675125,
-0.08108772337436676,
-0.009787647053599358,
0.04465258866548538,
0.12031106650829315,
0.0992002934217453,
0.032960955053567886,
-0.021231580525636673,
-0.15835201740264893,
-0.012746589258313179,
0.028594577684998512,
0.10843918472528458,
-0.02854795753955841,
-0.016072293743491173,
-0.02141297422349453,
-0.054353177547454834,
-0.007548002991825342,
0.09288700670003891,
0.00021381601982284337,
-0.13055965304374695,
0.0781245231628418,
0.056197553873062134,
0.0030072317458689213,
-0.010090996511280537,
-0.05736343935132027,
-0.11168934404850006,
0.18835410475730896,
-0.02566578984260559,
-0.054508499801158905,
-0.10566588491201401,
-0.11198879778385162,
0.09742310643196106,
-0.10951992124319077,
0.1106313019990921,
-0.09603893011808395,
0.004723524209111929,
-0.09463068842887878,
-0.18368598818778992,
0.1582668572664261,
-0.1269671618938446,
-0.006225429475307465,
-0.07936962693929672,
0.15473303198814392,
-0.0639534443616867,
0.02866891399025917,
0.003773587988689542,
0.028899380937218666,
-0.10591752827167511,
-0.05296826362609863,
0.030782422050833702,
-0.05678727477788925,
0.04187817499041557,
0.044521696865558624,
-0.06555546075105667,
-0.01695936545729637,
0.019335398450493813,
0.04292288422584534,
0.23622342944145203,
0.2353804111480713,
-0.052708715200424194,
0.1417168378829956,
0.1806049793958664,
-0.028383145108819008,
-0.3410240709781647,
-0.11411335319280624,
-0.13666872680187225,
-0.003915437962859869,
0.007809142116457224,
-0.1327342689037323,
0.09345895051956177,
-0.032195452600717545,
-0.04637759178876877,
0.12031539529561996,
-0.15053622424602509,
-0.09246959537267685,
0.2436363250017166,
0.008315314538776875,
0.4863871932029724,
-0.09246446192264557,
-0.06652036309242249,
-0.03995967283844948,
-0.14584210515022278,
0.05183078721165657,
0.024809755384922028,
0.08875752240419388,
-0.015901152044534683,
0.08785346150398254,
0.03374331444501877,
-0.09186475723981857,
0.09677482396364212,
-0.03436388820409775,
0.01234909426420927,
-0.10329624265432358,
-0.09800854325294495,
0.06808411329984665,
-0.01401363592594862,
-0.01322801224887371,
0.015540778636932373,
0.007425607182085514,
-0.04579975828528404,
-0.020523425191640854,
-0.10680554807186127,
0.10987795889377594,
0.03320621699094772,
-0.062224309891462326,
0.03879779577255249,
-0.017917169257998466,
-0.009515928104519844,
0.0034782900474965572,
0.1910327970981598,
-0.008325624279677868,
0.17571797966957092,
0.08782124519348145,
0.0300945732742548,
-0.16413554549217224,
-0.0698731392621994,
-0.050175994634628296,
-0.0846821516752243,
0.08663877099752426,
0.008863678202033043,
0.05756894871592522,
0.11674199998378754,
-0.021469993516802788,
0.040903765708208084,
0.11679863929748535,
0.013281558640301228,
-0.03635825589299202,
0.15106870234012604,
-0.2260168492794037,
0.040877439081668854,
-0.024700431153178215,
-0.002281648339703679,
0.06495176255702972,
0.0602131113409996,
0.08886897563934326,
0.04362958297133446,
-0.03604341670870781,
-0.0080631198361516,
-0.011103777214884758,
-0.059563565999269485,
0.05411487817764282,
0.060502372682094574,
0.05677267909049988,
-0.13078919053077698,
0.0061960369348526,
-0.020739618688821793,
-0.2086004763841629,
-0.016145547851920128,
0.07876262068748474,
-0.12113361060619354,
-0.10942773520946503,
0.0038382872007787228,
0.09838655591011047,
-0.08085829019546509,
-0.03981052711606026,
-0.06243035942316055,
-0.11349830776453018,
0.05747007206082344,
0.2176428735256195,
0.1169067993760109,
0.0780315026640892,
-0.01989174261689186,
-0.01007353700697422,
-0.002601395593956113,
-0.015962328761816025,
0.02512223646044731,
0.033555783331394196,
-0.08247660100460052,
0.01702079549431801,
-0.008670814335346222,
0.16094514727592468,
-0.11036427319049835,
-0.05973701551556587,
-0.1687975972890854,
0.04017099365592003,
-0.06963387876749039,
-0.10318976640701294,
-0.09188957512378693,
-0.07771022617816925,
0.01973199099302292,
-0.07843679189682007,
-0.04138858988881111,
-0.03797203674912453,
-0.1261909008026123,
0.025888055562973022,
0.036669645458459854,
-0.015996644273400307,
-0.06865283101797104,
-0.044388484209775925,
0.13997533917427063,
-0.050470441579818726,
0.06897341459989548,
0.14721760153770447,
-0.08223868906497955,
0.08987827599048615,
-0.11864562332630157,
-0.14169776439666748,
0.09844960272312164,
0.024490095674991608,
0.09209379553794861,
0.06073470786213875,
0.01991713047027588,
0.054184310138225555,
0.03840716555714607,
0.039452992379665375,
0.08403609693050385,
-0.11287132650613785,
0.06809459626674652,
0.011329096741974354,
-0.1869479864835739,
-0.02397647127509117,
-0.09611000120639801,
0.07828033715486526,
0.0018079385627061129,
0.11844782531261444,
-0.0382930189371109,
0.10906048864126205,
-0.0436384454369545,
0.014289634302258492,
-0.02247670851647854,
-0.16372942924499512,
-0.004627579357475042,
-0.048289380967617035,
0.012862684205174446,
-0.013447915203869343,
0.23876222968101501,
-0.024661000818014145,
0.024913061410188675,
0.03820062428712845,
0.0719211995601654,
-0.003087579505518079,
0.0022083136718720198,
0.15241484344005585,
0.09013786166906357,
-0.05284610390663147,
-0.0749572142958641,
0.09104806929826736,
0.019679788500070572,
-0.05150250345468521,
0.13582676649093628,
0.06253648549318314,
0.04935529828071594,
0.09676174819469452,
0.00193702126853168,
0.04410434886813164,
-0.13451460003852844,
-0.2456214725971222,
-0.04142381623387337,
0.06802476942539215,
0.022965481504797935,
0.02864265814423561,
0.12449731677770615,
-0.011933309026062489,
0.057093679904937744,
-0.02881103754043579,
-0.022149965167045593,
-0.1927638053894043,
-0.12258896976709366,
-0.08218653500080109,
-0.07139991223812103,
0.023771436884999275,
-0.02313394285738468,
-0.020754177123308182,
0.09821733087301254,
0.034732282161712646,
-0.026418423280119896,
0.15178021788597107,
-0.003468479262664914,
-0.011058829724788666,
0.016801699995994568,
-0.01001247763633728,
0.0172751322388649,
0.032349079847335815,
-0.03294634073972702,
-0.16857078671455383,
0.004473234061151743,
-0.05259554460644722,
0.0047274017706513405,
-0.08785852044820786,
0.02359730750322342,
-0.09015554189682007,
-0.13330627977848053,
-0.07091958820819855,
0.0264219231903553,
-0.04996372386813164,
0.09263461828231812,
-0.013066912069916725,
0.05031539872288704,
0.0013845227658748627,
0.1200626865029335,
-0.07606708258390427,
-0.09816689789295197,
-0.04547613114118576,
0.1901932656764984,
0.041288163512945175,
0.0920717865228653,
-0.015353480353951454,
0.030952494591474533,
-0.11943032592535019,
0.34167152643203735,
0.314802885055542,
-0.049354273825883865,
0.0750916451215744,
0.054602526128292084,
0.03442682698369026,
0.07451198995113373,
0.1279372125864029,
0.0763775110244751,
0.2879911959171295,
-0.09316780418157578,
-0.04345858469605446,
-0.044293951243162155,
-0.03673816844820976,
-0.1208759993314743,
0.01128399558365345,
0.03953966125845909,
-0.03837299346923828,
-0.0634862631559372,
0.07261399179697037,
-0.17381651699543,
0.12662146985530853,
0.057949863374233246,
-0.21046149730682373,
-0.04841303452849388,
-0.027771536260843277,
0.17428803443908691,
0.017816947773098946,
0.1136963814496994,
-0.03833884000778198,
-0.08398560434579849,
0.062350668013095856,
0.022619010880589485,
-0.20338550209999084,
-0.06756751984357834,
0.10970646142959595,
-0.012227135710418224,
0.05940033122897148,
-0.017002668231725693,
0.031783878803253174,
0.0780811533331871,
0.07013798505067825,
-0.014899644069373608,
0.02075999788939953,
0.023412270471453667,
-0.10955478996038437,
-0.07060349732637405,
0.01478694099932909,
-0.0013840675819665194,
-0.11833599954843521,
0.02185012586414814,
-0.16461415588855743,
0.04151973873376846,
-0.09669603407382965,
-0.027114197611808777,
-0.0026749002281576395,
0.05793723464012146,
-0.04355005547404289,
0.04500356316566467,
0.06464733183383942,
0.018565697595477104,
-0.0383153110742569,
-0.05022261664271355,
-0.011393008753657341,
0.0629846602678299,
-0.11954975128173828,
-0.17594216763973236,
-0.08240210264921188,
-0.07172682136297226,
0.04485165327787399,
-0.010793168097734451,
-0.13988232612609863,
-0.04391428083181381,
-0.10527841746807098,
0.032555706799030304,
-0.15290100872516632,
0.04201599210500717,
0.04696520045399666,
0.04337937757372856,
0.017507996410131454,
-0.04434172064065933,
0.04486740753054619,
0.049446675926446915,
-0.155558243393898,
-0.09162718802690506
] |
null | null |
transformers
|
MLM fine-tuned from Bertimbau-Base model on the Brazilian Federal Official Gazette (500k instances)
|
{}
|
fill-mask
|
flavio-nakasato/berdou_500k
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
MLM fine-tuned from Bertimbau-Base model on the Brazilian Federal Official Gazette (500k instances)
|
[] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
40
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.07790084183216095,
0.016518043354153633,
-0.008212950080633163,
0.046188149601221085,
0.10820154845714569,
0.03358736261725426,
0.13804419338703156,
0.08940190821886063,
0.11855188012123108,
0.02208547294139862,
0.1473609358072281,
0.2036140114068985,
-0.019671602174639702,
0.10159783065319061,
-0.054509591311216354,
-0.28330767154693604,
0.022876782342791557,
0.09180302917957306,
-0.10638835281133652,
0.09700179100036621,
0.03200136125087738,
-0.10263264924287796,
0.05319419875741005,
-0.0021460808347910643,
-0.1289295256137848,
0.04976126179099083,
0.05984581634402275,
-0.10990167409181595,
0.12254879623651505,
0.03920142352581024,
0.2380499243736267,
0.03947988525032997,
-0.06493404507637024,
-0.05188068374991417,
0.056791115552186966,
0.01835254579782486,
-0.0852741226553917,
0.07888620346784592,
0.0528767853975296,
-0.07912319898605347,
-0.04303424805402756,
0.038408491760492325,
0.03866856172680855,
0.020875008776783943,
-0.1238415539264679,
-0.09124433994293213,
-0.012357817031443119,
0.029323458671569824,
0.03295022249221802,
0.03917679563164711,
0.02539454773068428,
0.1978382021188736,
-0.08303739875555038,
0.09481292217969894,
0.15477293729782104,
-0.3562786877155304,
-0.02150500752031803,
0.10535745322704315,
0.07656166702508926,
-0.022566968575119972,
-0.047764409333467484,
0.06430140137672424,
0.012407893314957619,
0.02227991633117199,
0.07929351180791855,
-0.08231029659509659,
-0.027204114943742752,
0.016501979902386665,
-0.09856409579515457,
0.0013579472433775663,
0.09178505092859268,
-0.0531139150261879,
0.06253664940595627,
-0.012256533838808537,
-0.14276117086410522,
-0.05819517746567726,
-0.02497401088476181,
-0.021335484459996223,
-0.03644772619009018,
0.028043638914823532,
-0.04732730612158775,
-0.05991828814148903,
-0.13152219355106354,
0.02421870268881321,
-0.23034344613552094,
0.24477382004261017,
0.01693771407008171,
0.05773843452334404,
-0.18603426218032837,
0.04713644087314606,
-0.021673882380127907,
-0.12839902937412262,
0.08886776864528656,
-0.07202836126089096,
0.0006479526055045426,
-0.008409288711845875,
-0.04997296631336212,
-0.17373839020729065,
0.07299933582544327,
0.12691183388233185,
0.054207004606723785,
0.05734189972281456,
0.025039875879883766,
0.1201840341091156,
0.006598496809601784,
0.10288801044225693,
-0.008097893558442593,
-0.015497330576181412,
0.05203497037291527,
-0.080804742872715,
0.04675235599279404,
-0.07400276511907578,
-0.1494344174861908,
-0.025927912443876266,
0.029222549870610237,
0.047016967087984085,
0.02587195485830307,
0.05586226284503937,
-0.08110863715410233,
-0.014490109868347645,
0.0735277608036995,
-0.07120494544506073,
0.03395641967654228,
-0.0065535977482795715,
0.049428485333919525,
0.11689461022615433,
0.03487372025847435,
-0.026272600516676903,
-0.00731932558119297,
0.13330204784870148,
-0.08505000919103622,
-0.02419033646583557,
-0.07531776279211044,
-0.08962216228246689,
0.04826059192419052,
-0.1632310003042221,
0.02469579502940178,
-0.15286129713058472,
-0.09554176032543182,
0.051516033709049225,
0.07790232449769974,
-0.016488375142216682,
0.005278232507407665,
0.05114215239882469,
-0.014284017495810986,
0.033739712089300156,
-0.025700587779283524,
-0.028234267607331276,
-0.01529658678919077,
0.08671203255653381,
-0.018218006938695908,
0.12540732324123383,
-0.10648874938488007,
0.03853633254766464,
-0.05722050368785858,
0.03026474080979824,
-0.18027721345424652,
-0.0430169440805912,
-0.05288250744342804,
0.11058122664690018,
-0.010321443900465965,
-0.03522532433271408,
-0.1426706314086914,
0.0367639921605587,
0.02239118702709675,
0.13548487424850464,
-0.10513918846845627,
-0.13305321335792542,
0.21288810670375824,
-0.1016547754406929,
-0.11583919823169708,
0.1047796681523323,
-0.002022174187004566,
-0.014103610068559647,
0.03722289949655533,
0.1264645904302597,
0.06430704891681671,
-0.13776272535324097,
0.0722510814666748,
0.12743568420410156,
-0.11741554737091064,
-0.1355130523443222,
0.0006078224396333098,
0.005416044499725103,
-0.05701550096273422,
0.025835078209638596,
0.12810392677783966,
0.09405088424682617,
-0.06690292060375214,
-0.052552465349435806,
-0.015467210672795773,
-0.04039204120635986,
0.16365651786327362,
0.07179825752973557,
0.12719546258449554,
-0.057909417897462845,
-0.05095156654715538,
0.02343592792749405,
-0.025478344410657883,
0.016641924157738686,
0.037690892815589905,
-0.0898546352982521,
0.14999167621135712,
-0.10085631161928177,
-0.010246340185403824,
-0.19158701598644257,
-0.14655375480651855,
-0.015788329765200615,
0.025937765836715698,
-0.0017375507159158587,
0.15789695084095,
0.1447407305240631,
-0.02243388257920742,
-0.019109655171632767,
-0.005066946614533663,
0.11249406635761261,
0.019448185339570045,
-0.07525867223739624,
-0.13098235428333282,
0.00973969791084528,
-0.10094733536243439,
-0.023953260853886604,
-0.035868123173713684,
0.017509840428829193,
0.036097750067710876,
0.12777051329612732,
0.03281712904572487,
0.037003710865974426,
-0.03767522796988487,
0.03916104882955551,
-0.040597960352897644,
0.0037048272788524628,
0.0844956561923027,
-0.0021049424540251493,
-0.09299628436565399,
0.13193143904209137,
-0.16167443990707397,
0.3401946723461151,
0.18673387169837952,
-0.26746660470962524,
-0.022602055221796036,
0.0033191246911883354,
-0.015736890956759453,
-0.00679839076474309,
0.050552334636449814,
0.0019227248849347234,
0.062129292637109756,
0.01416977122426033,
0.13805356621742249,
-0.009513786062598228,
-0.03573422506451607,
0.04569558799266815,
-0.06468183547258377,
-0.07068756967782974,
0.04037341848015785,
0.12745486199855804,
-0.09928955882787704,
0.1722826361656189,
0.23145046830177307,
-0.06363307684659958,
0.18414750695228577,
0.014316155575215816,
-0.017707981169223785,
0.011351192370057106,
-0.029390960931777954,
0.0027528635691851377,
0.07604313641786575,
-0.2058791071176529,
-0.039232268929481506,
0.06647104769945145,
-0.06172962114214897,
0.06304269284009933,
-0.1546151489019394,
-0.024685638025403023,
0.008748271502554417,
0.05809599161148071,
-0.01735253818333149,
0.12114395946264267,
0.0281747467815876,
0.06908408552408218,
-0.02054799348115921,
-0.09589458256959915,
0.08945789933204651,
0.01011973712593317,
-0.026470042765140533,
0.16203516721725464,
-0.09715394675731659,
-0.3078756332397461,
-0.13901609182357788,
-0.1378953605890274,
-0.0008588169002905488,
0.02411355823278427,
0.04285533353686333,
-0.07035351544618607,
-0.04864861071109772,
0.07027912884950638,
-0.002551848301663995,
-0.016505226492881775,
0.08433528244495392,
-0.06856820732355118,
0.00623853225260973,
-0.03675711154937744,
-0.07506686449050903,
-0.05477834865450859,
-0.0531604178249836,
-0.01982000283896923,
0.13998807966709137,
-0.05833425745368004,
0.07497313618659973,
0.17828361690044403,
0.008051199838519096,
0.0559733510017395,
-0.006132024340331554,
0.12713803350925446,
-0.07607276737689972,
0.034274373203516006,
0.14822211861610413,
-0.06883109360933304,
0.09923606365919113,
0.1303607076406479,
0.056760236620903015,
-0.033458054065704346,
-0.011679700575768948,
-0.018221072852611542,
-0.13439784944057465,
-0.18390651047229767,
-0.07532092928886414,
-0.12441830337047577,
-0.0020640946459025145,
0.05165429040789604,
0.06139416992664337,
0.14612902700901031,
0.11318808048963547,
0.06912413984537125,
0.005081723909825087,
-0.06176111847162247,
0.023246942088007927,
0.13775521516799927,
-0.01742352731525898,
0.14519280195236206,
-0.03863596171140671,
-0.14853787422180176,
0.04498060792684555,
0.06126818805932999,
0.1259230524301529,
0.08870499581098557,
0.055460020899772644,
0.04202448949217796,
0.1566590815782547,
0.15940159559249878,
0.14825759828090668,
-0.012775593437254429,
-0.0755966305732727,
0.003909200429916382,
-0.01944836787879467,
-0.0008270479738712311,
0.022503167390823364,
0.15964414179325104,
-0.09427092224359512,
-0.00026090306346304715,
-0.09893545508384705,
0.04528913274407387,
0.10411030799150467,
0.05783972889184952,
-0.24251513183116913,
0.007850920781493187,
0.053264129906892776,
0.020254788920283318,
-0.04307316988706589,
0.025548191741108894,
0.027017300948500633,
-0.07546722143888474,
0.06020618602633476,
-0.10184729844331741,
0.07104078680276871,
0.013667120598256588,
0.04894994571805,
-0.018582401797175407,
-0.02691625989973545,
0.02450469881296158,
0.03875031694769859,
-0.2130439430475235,
0.2659459710121155,
0.005649178754538298,
-0.03467016667127609,
-0.08263656497001648,
0.0037175267934799194,
0.04836471378803253,
0.1055850014090538,
0.10236800462007523,
-0.001176556688733399,
0.008999454788863659,
-0.10877294838428497,
-0.038522299379110336,
0.015868959948420525,
0.10096142441034317,
-0.023684149608016014,
-0.020515648648142815,
0.0026090943720191717,
-0.05497108772397041,
0.02122051827609539,
0.06213882938027382,
0.01239787694066763,
-0.14617830514907837,
0.0892142727971077,
0.039175644516944885,
-0.11760716140270233,
0.005709649529308081,
-0.10346440970897675,
-0.15497536957263947,
0.21856728196144104,
-0.06183752790093422,
-0.05061936378479004,
-0.1110835075378418,
-0.07628781348466873,
0.0917704626917839,
-0.10510115325450897,
0.09091821312904358,
-0.0827544629573822,
0.017984915524721146,
-0.08907201141119003,
-0.20557613670825958,
0.17663627862930298,
-0.10945448279380798,
0.004430670756846666,
-0.09284458309412003,
0.14442124962806702,
-0.051370467990636826,
0.043046217411756516,
-0.01118745282292366,
0.026180638000369072,
-0.07149670273065567,
-0.047090671956539154,
0.04490131884813309,
-0.05696627125144005,
0.040822092443704605,
-0.027018742635846138,
-0.053463514894247055,
-0.02119051292538643,
0.036833833903074265,
0.06451930850744247,
0.22675089538097382,
0.18738184869289398,
-0.07550175487995148,
0.13442745804786682,
0.16283321380615234,
-0.02740528993308544,
-0.3473333418369293,
-0.04864295944571495,
-0.11036817729473114,
-0.0019252547062933445,
0.019654469564557076,
-0.1441228836774826,
0.1275452971458435,
-0.014579832553863525,
-0.04974827542901039,
0.15298835933208466,
-0.20405764877796173,
-0.11463084816932678,
0.21788786351680756,
0.059968747198581696,
0.4127348065376282,
-0.13012529909610748,
-0.07623867690563202,
0.0021725953556597233,
-0.0894550010561943,
0.08088383078575134,
-0.06273724883794785,
0.09463869780302048,
-0.00976912397891283,
0.07469606399536133,
0.03378957137465477,
-0.10241201519966125,
0.08097034692764282,
-0.060108982026576996,
0.013725215569138527,
-0.08789633959531784,
-0.10093788802623749,
0.09182241559028625,
-0.012423292733728886,
-0.029473239555954933,
0.002856625709682703,
-0.014331793412566185,
-0.006276264786720276,
-0.029628615826368332,
-0.08653043210506439,
0.11843180656433105,
0.028422541916370392,
-0.07164952158927917,
0.02578289620578289,
-0.017831752076745033,
-0.03424234688282013,
-0.008479396812617779,
0.25119802355766296,
0.008233427070081234,
0.17862387001514435,
0.14305737614631653,
0.053003013134002686,
-0.14458122849464417,
-0.09232106804847717,
-0.04653064161539078,
-0.08021276444196701,
0.09691376239061356,
-0.017238890752196312,
0.038613080978393555,
0.12192171066999435,
-0.0017617236590012908,
0.030962733551859856,
0.11938457190990448,
-0.014485117979347706,
-0.014398516155779362,
0.14780978858470917,
-0.2269173562526703,
-0.03197126090526581,
-0.009658879600465298,
-0.055370088666677475,
0.04336284473538399,
0.10206849873065948,
0.1151357963681221,
0.03379256650805473,
-0.017248602584004402,
0.021890543401241302,
-0.02489619329571724,
-0.03648986294865608,
0.07161223143339157,
0.08151238411664963,
0.04008433595299721,
-0.10956866294145584,
0.0040501500479876995,
-0.006473991554230452,
-0.24123698472976685,
-0.017690198495984077,
0.09091350436210632,
-0.08740647882223129,
-0.1156194731593132,
0.010513732209801674,
0.12744635343551636,
-0.06532266736030579,
-0.028154218569397926,
-0.08661408722400665,
-0.09894219040870667,
0.04130775108933449,
0.24822163581848145,
0.09755055606365204,
0.049951013177633286,
-0.0471113920211792,
0.014125198125839233,
-0.012747100554406643,
0.01794242300093174,
0.014788154512643814,
0.05297001451253891,
-0.10302300751209259,
0.024993175640702248,
-0.006157109513878822,
0.1379939764738083,
-0.1127796396613121,
-0.04276144877076149,
-0.1717560589313507,
0.0217585451900959,
-0.07340721786022186,
-0.07134810090065002,
-0.08807346969842911,
-0.08023303002119064,
0.003779386170208454,
-0.06777804344892502,
-0.06129820644855499,
-0.040749453008174896,
-0.11543262749910355,
0.024692784994840622,
0.0334535576403141,
-0.025824250653386116,
-0.08772984147071838,
-0.041338033974170685,
0.10188698768615723,
-0.04076341539621353,
0.07254716753959656,
0.10029710084199905,
-0.05980180576443672,
0.08815409988164902,
-0.13207171857357025,
-0.08468326181173325,
0.09916973114013672,
0.024830590933561325,
0.1020490825176239,
0.09013800323009491,
0.013591969385743141,
0.026373881846666336,
0.057460978627204895,
0.04053720831871033,
0.06950108706951141,
-0.09748523682355881,
0.07849955558776855,
-0.028306126594543457,
-0.17846614122390747,
-0.04443996772170067,
-0.05164189636707306,
0.07400795817375183,
0.009766239672899246,
0.11034847050905228,
-0.05275709927082062,
0.09021607786417007,
-0.07860458642244339,
0.023079929873347282,
-0.0076187714003026485,
-0.14477665722370148,
0.04681834205985069,
-0.022911399602890015,
0.01480258721858263,
-0.04106546565890312,
0.20254354178905487,
0.00441964715719223,
-0.009260923601686954,
0.030396943911910057,
0.046589937061071396,
0.0006307119620032609,
0.019100962206721306,
0.1454123854637146,
0.06847364455461502,
-0.04422718286514282,
-0.09039520472288132,
0.10201897472143173,
0.03856464475393295,
0.007739732041954994,
0.1524369865655899,
0.04237629100680351,
-0.008512073196470737,
0.11374673992395401,
0.026358820497989655,
0.04652214050292969,
-0.13723181188106537,
-0.13093525171279907,
-0.0804644301533699,
0.08795841038227081,
0.023557627573609352,
0.03866160288453102,
0.15562687814235687,
-0.0073074717074632645,
0.051113441586494446,
-0.030721379444003105,
-0.05015246570110321,
-0.1872643530368805,
-0.18156903982162476,
-0.07743525505065918,
-0.05105085298418999,
0.02389856055378914,
-0.016819871962070465,
-0.04393294081091881,
0.06642135977745056,
0.050926972180604935,
-0.021122347563505173,
0.18589669466018677,
0.032143332064151764,
0.020533282309770584,
0.010158732533454895,
0.028195280581712723,
0.0009987286757677794,
0.00857510045170784,
-0.019199101254343987,
-0.17074912786483765,
0.01708327978849411,
-0.06384968012571335,
-0.012376051396131516,
-0.05795752629637718,
0.022778676822781563,
-0.06266462802886963,
-0.1312699019908905,
-0.05778918415307999,
0.019016889855265617,
-0.016934411600232124,
0.06166604906320572,
-0.006308829877525568,
0.0370224192738533,
-0.013597169890999794,
0.10929606109857559,
-0.08079570531845093,
-0.03252962976694107,
-0.05378003045916557,
0.14547099173069,
-0.0008521296549588442,
0.08852189034223557,
-0.011867829598486423,
0.00954501610249281,
-0.09102854132652283,
0.3167937695980072,
0.33853089809417725,
-0.06568694114685059,
0.07932788878679276,
0.07627791911363602,
0.028063280507922173,
0.03325807675719261,
0.12174034118652344,
0.05747270584106445,
0.298409104347229,
-0.10066726058721542,
-0.08270370215177536,
-0.042463675141334534,
-0.027296585962176323,
-0.11450732499361038,
0.047248221933841705,
0.05669313296675682,
-0.01981496810913086,
-0.0656062662601471,
0.06934014707803726,
-0.17018432915210724,
0.061008259654045105,
0.07885685563087463,
-0.23269113898277283,
-0.07523820549249649,
-0.019193513318896294,
0.18734173476696014,
-0.02395877242088318,
0.11407557129859924,
-0.03585958108305931,
-0.08768345415592194,
-0.0014376965118572116,
0.02380351908504963,
-0.20881132781505585,
0.0018643672810867429,
0.08331044018268585,
-0.034444715827703476,
0.08946173638105392,
-0.029837770387530327,
0.01205562986433506,
0.09660354256629944,
0.07604175060987473,
0.001057648565620184,
-0.012915709987282753,
0.032605838030576706,
-0.12737132608890533,
-0.06311450153589249,
0.03893883153796196,
-0.00010283156007062644,
-0.12024521827697754,
0.03789159283041954,
-0.128483384847641,
0.04040243849158287,
-0.14165377616882324,
-0.03062446601688862,
-0.007550285663455725,
0.05289151519536972,
-0.0467054508626461,
0.04853461682796478,
0.06265388429164886,
0.0425020307302475,
-0.03611145541071892,
-0.04011775180697441,
-0.019597195088863373,
0.06175994500517845,
-0.07281678169965744,
-0.164351224899292,
-0.1014758050441742,
-0.05118677392601967,
-0.005980412941426039,
-0.007813491858541965,
-0.19126109778881073,
-0.05188528075814247,
-0.08566392213106155,
0.020126499235630035,
-0.16986747086048126,
0.011818689294159412,
0.09483654797077179,
0.04221990332007408,
0.005898203235119581,
-0.03080839477479458,
0.03208480775356293,
0.025603074580430984,
-0.18278644979000092,
-0.07823255658149719
] |
null | null |
transformers
|
RoBERTa model pretrained on the Brazilian Federal Official Gazette (200k instances).
|
{}
|
fill-mask
|
flavio-nakasato/deeppolicytracker_200k
|
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
RoBERTa model pretrained on the Brazilian Federal Official Gazette (200k instances).
|
[] |
[
"TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
37
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.05978045240044594,
0.0027343870606273413,
-0.008724397048354149,
0.02515793778002262,
0.13307689130306244,
0.027639828622341156,
0.09509950131177902,
0.08148215711116791,
0.05693569406867027,
-0.005708751268684864,
0.15464650094509125,
0.21959826350212097,
-0.03345884382724762,
0.17867937684059143,
-0.058117255568504333,
-0.2713718116283417,
0.06318216025829315,
0.048634059727191925,
-0.07361909002065659,
0.11866326630115509,
0.0721997618675232,
-0.07688385248184204,
0.06776180863380432,
-0.018220216035842896,
-0.12242016196250916,
0.042348217219114304,
0.05527849495410919,
-0.1122080385684967,
0.12108562141656876,
0.021961210295557976,
0.2060822993516922,
0.013724464923143387,
-0.06524214893579483,
-0.0831608921289444,
0.05297759547829628,
-0.0005626529455184937,
-0.07771056890487671,
0.039234988391399384,
0.006228282582014799,
-0.09797824174165726,
0.0029437756165862083,
0.05069584399461746,
0.031919922679662704,
0.043076999485492706,
-0.14973030984401703,
-0.12170203030109406,
-0.021631255745887756,
0.03224768117070198,
0.04861219599843025,
0.06754541397094727,
0.019578367471694946,
0.20521271228790283,
-0.1258353292942047,
0.10418124496936798,
0.15854887664318085,
-0.29196202754974365,
-0.011583889834582806,
0.07336939871311188,
0.07550900429487228,
-0.05153534933924675,
-0.025016577914357185,
0.061107337474823,
0.0111276526004076,
0.02207767404615879,
0.03159128874540329,
-0.08054082095623016,
-0.06527844071388245,
0.004459382500499487,
-0.07510198652744293,
-0.05628051981329918,
0.14868685603141785,
-0.051077596843242645,
0.04103284329175949,
0.010015198029577732,
-0.12913082540035248,
-0.036706481128931046,
-0.022282211109995842,
0.0016800274606794119,
-0.03736185282468796,
0.03933337330818176,
-0.04058491811156273,
-0.013890751637518406,
-0.10593511909246445,
0.020706869661808014,
-0.23111006617546082,
0.27582958340644836,
0.02612913027405739,
0.07012132555246353,
-0.1858518421649933,
0.043225426226854324,
-0.029383337125182152,
-0.12287542223930359,
0.04163218289613724,
-0.09743601828813553,
0.012437986209988594,
0.0019431845284998417,
-0.06488244980573654,
-0.03970800340175629,
0.08601388335227966,
0.2266247421503067,
0.08983870595693588,
0.03475968539714813,
0.03882957249879837,
0.09770892560482025,
0.014865895733237267,
0.08014364540576935,
0.017019858583807945,
-0.040938593447208405,
0.06849705427885056,
-0.12720969319343567,
0.04322979971766472,
-0.059905603528022766,
-0.12469810247421265,
-0.052705299109220505,
0.004645936656743288,
0.08657316863536835,
0.0478980652987957,
0.04835785925388336,
-0.08745969086885452,
0.0031580787617713213,
0.07787331938743591,
-0.07194057106971741,
-0.002319851191714406,
-0.029221290722489357,
0.05152527242898941,
0.10475372523069382,
0.021636445075273514,
-0.010703074745833874,
-0.016043487936258316,
0.11903663724660873,
-0.07282981276512146,
-0.03583741933107376,
-0.05615774169564247,
-0.055342018604278564,
0.03769473731517792,
-0.14162738621234894,
0.04631955549120903,
-0.1937199980020523,
-0.14164285361766815,
0.05045240744948387,
0.058013904839754105,
-0.004311853088438511,
-0.032746389508247375,
0.032482124865055084,
-0.005240604747086763,
0.01717841997742653,
-0.04425594210624695,
-0.03813016042113304,
-0.03695604205131531,
0.10319358855485916,
0.015862006694078445,
0.1258252114057541,
-0.10736379027366638,
0.04201505705714226,
-0.08612877130508423,
0.012246696278452873,
-0.16673244535923004,
-0.037367139011621475,
-0.02627558261156082,
0.16236986219882965,
0.002689799526706338,
-0.04094931110739708,
-0.11292707920074463,
0.03153347223997116,
-0.005968436133116484,
0.16826218366622925,
-0.050691187381744385,
-0.1258428990840912,
0.23801551759243011,
-0.10858677327632904,
-0.137539803981781,
0.0768466368317604,
-0.0008964669541455805,
0.00503029627725482,
0.04865459352731705,
0.09409047663211823,
0.054460909217596054,
-0.1282840520143509,
0.09256289899349213,
0.09043212980031967,
-0.15783078968524933,
-0.13727834820747375,
0.027883639559149742,
-0.005475207231938839,
-0.10261815786361694,
0.044968098402023315,
0.09180816262960434,
0.11191844940185547,
-0.07093978673219681,
-0.0524664968252182,
-0.019335782155394554,
-0.04510034993290901,
0.12913495302200317,
0.03954192250967026,
0.09601214528083801,
-0.07954450696706772,
-0.026266923174262047,
-0.06726016104221344,
0.00639816839247942,
0.07083439826965332,
0.03640305995941162,
-0.08596820384263992,
0.1376219093799591,
-0.05640283599495888,
0.00475625554099679,
-0.18434511125087738,
-0.10832203924655914,
-0.019048554822802544,
0.06233084574341774,
-0.024080509319901466,
0.11038170754909515,
0.11115530878305435,
-0.04524664208292961,
-0.012584532611072063,
-0.016787465661764145,
0.09273066371679306,
0.02054671198129654,
-0.019367150962352753,
-0.09770754724740982,
0.024277135729789734,
-0.08850374072790146,
0.00890185683965683,
0.020691927522420883,
0.0034111374989151955,
-0.004997740965336561,
0.14283043146133423,
-0.00161478400696069,
0.03947531431913376,
-0.04557863250374794,
0.03357338905334473,
-0.04402673617005348,
0.007829888723790646,
0.08130250126123428,
0.00620792992413044,
-0.052449412643909454,
0.15417484939098358,
-0.1333128660917282,
0.3405405282974243,
0.18508437275886536,
-0.2874508798122406,
-0.03881249949336052,
0.04533010721206665,
-0.018499813973903656,
-0.0015848495531827211,
0.04968307539820671,
0.007090166676789522,
0.027636928483843803,
0.00756347319111228,
0.14252737164497375,
-0.008502244018018246,
-0.015343432314693928,
0.03761560097336769,
-0.0861063003540039,
-0.03436943516135216,
0.034367144107818604,
0.09854230284690857,
-0.11538184434175491,
0.17259535193443298,
0.22679723799228668,
-0.014759926125407219,
0.13486558198928833,
0.019756602123379707,
0.0008263704366981983,
0.004766570404171944,
-0.04244794696569443,
-0.010318059474229813,
0.04416158050298691,
-0.1703941971063614,
-0.037341129034757614,
0.07009443640708923,
-0.04086581617593765,
0.054106131196022034,
-0.1080540344119072,
-0.04539399966597557,
0.023229053243994713,
0.05735735595226288,
-0.057464465498924255,
0.14314846694469452,
0.02993035688996315,
0.07184498757123947,
0.001863340032286942,
-0.0830451101064682,
0.10455971211194992,
0.011808671057224274,
-0.027575377374887466,
0.15615016222000122,
-0.11499504745006561,
-0.3407493829727173,
-0.14060579240322113,
-0.1856870949268341,
0.012841945514082909,
0.04907330498099327,
0.07068860530853271,
-0.0886731967329979,
-0.06028576195240021,
0.1009501963853836,
-0.002118155127391219,
-0.030502429232001305,
0.06733111292123795,
-0.059993140399456024,
0.033099252730607986,
-0.03546803444623947,
-0.05871487408876419,
-0.06814772635698318,
-0.030771153047680855,
-0.026918867602944374,
0.15232053399085999,
-0.08588875830173492,
0.09874926507472992,
0.12222868949174881,
0.012513059191405773,
0.06416334956884384,
0.003602869575843215,
0.16369159519672394,
-0.08151564747095108,
-0.004985830280929804,
0.19588902592658997,
-0.039671871811151505,
0.09784439206123352,
0.16251394152641296,
0.01572851650416851,
-0.04611923173069954,
0.007210151292383671,
-0.054344769567251205,
-0.1310671716928482,
-0.16383560001850128,
-0.10668035596609116,
-0.13408245146274567,
-0.021317366510629654,
0.045856814831495285,
0.050257451832294464,
0.15384143590927124,
0.10692618787288666,
0.039593882858753204,
-0.022173523902893066,
-0.07040359079837799,
0.060743995010852814,
0.16259261965751648,
-0.02038007788360119,
0.13690350949764252,
-0.05275079607963562,
-0.15010952949523926,
0.060834385454654694,
0.017343392595648766,
0.13702066242694855,
0.10000376403331757,
-0.018073182553052902,
0.04605214297771454,
0.15819214284420013,
0.157943993806839,
0.15094222128391266,
0.040314868092536926,
-0.057792484760284424,
-0.00135100819170475,
-0.00355874327942729,
-0.054626744240522385,
0.024207331240177155,
0.13539178669452667,
-0.10293795168399811,
-0.0397566519677639,
-0.122000552713871,
0.05419766530394554,
0.11576513946056366,
0.0632915124297142,
-0.2221747636795044,
0.010753236711025238,
0.06315211206674576,
0.00856467429548502,
-0.06559337675571442,
0.03360356017947197,
-0.0504627525806427,
-0.1452513188123703,
0.0742940679192543,
-0.05110727250576019,
0.09041508287191391,
0.04023086279630661,
0.06344828754663467,
-0.05134069547057152,
-0.05041798576712608,
0.03384535759687424,
0.066301628947258,
-0.24279962480068207,
0.28508660197257996,
-0.015547695569694042,
-0.0414053238928318,
-0.0798855721950531,
-0.0072272163815796375,
0.05470259487628937,
0.10367409884929657,
0.11631010472774506,
0.026701275259256363,
-0.05831453949213028,
-0.12985455989837646,
-0.010049611330032349,
0.025459015741944313,
0.10010068118572235,
-0.023013504222035408,
-0.008190451189875603,
-0.026562752202153206,
-0.05088624730706215,
-0.014673394151031971,
0.07104095071554184,
0.008848524652421474,
-0.12703938782215118,
0.07680127024650574,
0.049460720270872116,
-0.018087532371282578,
-0.008617108687758446,
-0.053773753345012665,
-0.09175468981266022,
0.19304224848747253,
-0.01958429254591465,
-0.05116080120205879,
-0.11072391271591187,
-0.10805923491716385,
0.10193557292222977,
-0.11249701678752899,
0.12616074085235596,
-0.10047021508216858,
0.009487134404480457,
-0.09380611777305603,
-0.1732918620109558,
0.14263565838336945,
-0.12662816047668457,
-0.005548976361751556,
-0.07589493691921234,
0.14169755578041077,
-0.06785442680120468,
0.024742592126131058,
0.0028587186243385077,
0.04321930930018425,
-0.11557991802692413,
-0.053762901574373245,
0.029133325442671776,
-0.0661218911409378,
0.03599350154399872,
0.054347891360521317,
-0.04755578562617302,
-0.035690948367118835,
0.016383660957217216,
0.022910388186573982,
0.2239362597465515,
0.23655645549297333,
-0.060722775757312775,
0.14300964772701263,
0.16329653561115265,
-0.02534516341984272,
-0.334926038980484,
-0.11384004354476929,
-0.13816533982753754,
0.000659266603179276,
0.005350308958441019,
-0.1263294517993927,
0.09073201566934586,
-0.018374400213360786,
-0.05229932442307472,
0.11701809614896774,
-0.15716011822223663,
-0.09006257355213165,
0.23355786502361298,
0.007530366536229849,
0.5038071870803833,
-0.09967079758644104,
-0.05945106968283653,
-0.052533090114593506,
-0.14235186576843262,
0.02291753888130188,
0.0022186338901519775,
0.09462001919746399,
-0.029315819963812828,
0.07914337515830994,
0.03301545977592468,
-0.09031011164188385,
0.09982404112815857,
-0.03833628445863724,
0.016298605129122734,
-0.11617961525917053,
-0.08525510132312775,
0.10806074738502502,
-0.0136068444699049,
-0.016709906980395317,
0.02158220298588276,
0.01456777099519968,
-0.04083799198269844,
-0.022307492792606354,
-0.10379772633314133,
0.10585320740938187,
0.03499794006347656,
-0.05941828712821007,
0.022518588230013847,
-0.009889909066259861,
-0.012628845870494843,
-0.004628289956599474,
0.1912514567375183,
-0.008090890944004059,
0.17767208814620972,
0.06429888308048248,
0.021506020799279213,
-0.12491537630558014,
-0.06406809389591217,
-0.04999396950006485,
-0.08511979877948761,
0.07266417145729065,
-0.009923536330461502,
0.04696602374315262,
0.10173138976097107,
-0.012479927390813828,
0.029755644500255585,
0.1106644868850708,
0.010039771907031536,
-0.014950153417885303,
0.16745342314243317,
-0.2137589156627655,
0.04184075817465782,
-0.015450791455805302,
-0.018000587821006775,
0.06724268943071365,
0.059818465262651443,
0.09292822331190109,
0.042055394500494,
-0.040214166045188904,
-0.010924269445240498,
-0.007243160158395767,
-0.06570540368556976,
0.04728737846016884,
0.07078813016414642,
0.0481405109167099,
-0.1261121779680252,
0.010777958668768406,
-0.019267624244093895,
-0.18276818096637726,
-0.0193245317786932,
0.08627685904502869,
-0.11631006002426147,
-0.10884512960910797,
0.009783122688531876,
0.08114659041166306,
-0.12021104246377945,
-0.030119983479380608,
-0.08283551782369614,
-0.11298330873250961,
0.05262024328112602,
0.22306393086910248,
0.1127481460571289,
0.06906166672706604,
-0.01038964930921793,
-0.013437945395708084,
-0.019051581621170044,
-0.020863153040409088,
0.04014519229531288,
0.036231301724910736,
-0.08725123107433319,
0.00853132363408804,
-0.009417156688869,
0.15634506940841675,
-0.10960228741168976,
-0.05792605131864548,
-0.15895655751228333,
0.04835722595453262,
-0.07342072576284409,
-0.09718167781829834,
-0.09943155944347382,
-0.07611285150051117,
0.009972813539206982,
-0.07039758563041687,
-0.05352642014622688,
-0.033090610057115555,
-0.1172407865524292,
0.028002966195344925,
0.02894745022058487,
-0.025185875594615936,
-0.06389441341161728,
-0.04643470048904419,
0.1366305649280548,
-0.04931804537773132,
0.07242259383201599,
0.1486574411392212,
-0.07077633589506149,
0.07762903720140457,
-0.12045170366764069,
-0.12859106063842773,
0.09266626089811325,
0.011787940748035908,
0.0829651728272438,
0.04641987010836601,
0.028651097789406776,
0.05126902461051941,
0.04517265781760216,
0.04532682150602341,
0.05619427561759949,
-0.11496394872665405,
0.07823242247104645,
0.008991614915430546,
-0.1918167769908905,
-0.027399636805057526,
-0.09008971601724625,
0.08355460315942764,
0.0007113930769264698,
0.12100622057914734,
-0.03608888015151024,
0.11319220811128616,
-0.03684284910559654,
0.015498606488108635,
-0.03130309283733368,
-0.1580456793308258,
-0.0009196364553645253,
-0.04583831876516342,
0.005965739022940397,
-0.008378063328564167,
0.23622959852218628,
-0.019403686746954918,
0.020302124321460724,
0.03706345707178116,
0.08312346041202545,
0.011911443434655666,
0.0034686587750911713,
0.14071707427501678,
0.09389068931341171,
-0.05033082515001297,
-0.07029570639133453,
0.09465329349040985,
0.022331232205033302,
-0.06154777854681015,
0.12758882343769073,
0.07004008442163467,
0.07805091887712479,
0.09229323267936707,
0.00525510823354125,
0.048385001718997955,
-0.10149682313203812,
-0.22425489127635956,
-0.04245093837380409,
0.04338166117668152,
0.030511466786265373,
-0.009528924711048603,
0.16239036619663239,
-0.007807712536305189,
0.05438727140426636,
-0.0286843404173851,
-0.0146601852029562,
-0.1913124918937683,
-0.12455741316080093,
-0.08852022141218185,
-0.06035058572888374,
0.034696102142333984,
-0.019905485212802887,
-0.019849425181746483,
0.10410076379776001,
0.03071051649749279,
-0.029069487005472183,
0.1391187459230423,
0.007366697769612074,
-0.012996077537536621,
0.016209116205573082,
-0.007277622353285551,
0.015048467554152012,
0.04374290257692337,
-0.018845049664378166,
-0.17080430686473846,
-0.003531701397150755,
-0.05279584601521492,
0.0019921238999813795,
-0.08741256594657898,
0.027429690584540367,
-0.09686073660850525,
-0.12442773580551147,
-0.06279069930315018,
0.03428082540631294,
-0.036186520010232925,
0.08635444939136505,
-0.008224183693528175,
0.04980190843343735,
0.003494719509035349,
0.13008199632167816,
-0.06635979562997818,
-0.10865864157676697,
-0.0572807714343071,
0.15417969226837158,
0.044320207089185715,
0.07846927642822266,
-0.024377785623073578,
0.025428062304854393,
-0.09749093651771545,
0.3293962776660919,
0.31430360674858093,
-0.0364663191139698,
0.07512470334768295,
0.049478501081466675,
0.03143766522407532,
0.07872943580150604,
0.10979169607162476,
0.07679310441017151,
0.2867131531238556,
-0.09095823019742966,
-0.0540350005030632,
-0.04679121449589729,
-0.03458467498421669,
-0.12083368748426437,
0.018574830144643784,
0.026663949713110924,
-0.03865300863981247,
-0.05562908574938774,
0.08287046104669571,
-0.17973420023918152,
0.13521145284175873,
0.08000985532999039,
-0.2064894735813141,
-0.05751892551779747,
-0.022878373041749,
0.15401794016361237,
0.030826324597001076,
0.11301977932453156,
-0.03987570479512215,
-0.09233494848012924,
0.0507376492023468,
0.02869175747036934,
-0.20688870549201965,
-0.07890072464942932,
0.10709986090660095,
0.004827416036278009,
0.06665920466184616,
-0.034168489277362823,
0.019203342497348785,
0.09574570506811142,
0.06589552015066147,
-0.023569602519273758,
0.026999808847904205,
0.021599261090159416,
-0.10421290993690491,
-0.05063549056649208,
0.027946757152676582,
-0.005161251872777939,
-0.1348150372505188,
0.025101054459810257,
-0.1394609659910202,
0.04613953083753586,
-0.0913073867559433,
-0.008668651804327965,
-0.005664225202053785,
0.0692635253071785,
-0.0486815981566906,
0.048449039459228516,
0.06823991239070892,
0.01269973162561655,
-0.031037641689181328,
-0.048129696398973465,
-0.011327249929308891,
0.06746246665716171,
-0.10943937301635742,
-0.1738860160112381,
-0.08208338916301727,
-0.07020771503448486,
0.0458214171230793,
-0.008811558596789837,
-0.1559021770954132,
-0.0473502054810524,
-0.11499189585447311,
0.015934964641928673,
-0.14897628128528595,
0.045110028237104416,
0.051641616970300674,
0.04488077014684677,
0.021195126697421074,
-0.024013997986912727,
0.037815019488334656,
0.04779626056551933,
-0.15743644535541534,
-0.09625527262687683
] |
null | null |
transformers
|
RoBERTa model pretrained on the Brazilian Federal Official Gazette (500k instances).
|
{}
|
fill-mask
|
flavio-nakasato/deeppolicytracker_500k
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
RoBERTa model pretrained on the Brazilian Federal Official Gazette (500k instances).
|
[] |
[
"TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
41
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.07536142319440842,
0.011643580161035061,
-0.00835113413631916,
0.0492580346763134,
0.11646921187639236,
0.029791247099637985,
0.14224649965763092,
0.09276818484067917,
0.09238187223672867,
0.024627817794680595,
0.1382230967283249,
0.21884886920452118,
-0.01824801415205002,
0.10394322127103806,
-0.04900607094168663,
-0.2826099097728729,
0.021032076328992844,
0.07813074439764023,
-0.11514049023389816,
0.10202093422412872,
0.04388943314552307,
-0.09126440435647964,
0.05398508161306381,
0.0019215689972043037,
-0.13354837894439697,
0.046049755066633224,
0.06058808043599129,
-0.11292058229446411,
0.1267593950033188,
0.03910156339406967,
0.23068459331989288,
0.03846213221549988,
-0.06667101383209229,
-0.04984160140156746,
0.06151588261127472,
0.01903361827135086,
-0.0901922881603241,
0.07385588437318802,
0.037024207413196564,
-0.09428852796554565,
-0.01936214417219162,
0.038302384316921234,
0.03929413855075836,
0.025226851925253868,
-0.12772437930107117,
-0.09758399426937103,
-0.020749686285853386,
0.032078418880701065,
0.02952827326953411,
0.04375957325100899,
0.025577746331691742,
0.199665829539299,
-0.0813383236527443,
0.09534847736358643,
0.14762166142463684,
-0.33770424127578735,
-0.029496081173419952,
0.10334993153810501,
0.07637634128332138,
-0.01740833930671215,
-0.04144992679357529,
0.0696975365281105,
0.008724279701709747,
0.024305645376443863,
0.0707755908370018,
-0.08440755307674408,
-0.05688151344656944,
0.0048796734772622585,
-0.09055546671152115,
-0.003646841738373041,
0.09179401397705078,
-0.05700644478201866,
0.05975775048136711,
-0.013548480346798897,
-0.1419999599456787,
-0.047491688281297684,
-0.027559608221054077,
-0.014387750998139381,
-0.03847982734441757,
0.024923112243413925,
-0.05187215656042099,
-0.06267747282981873,
-0.1258300095796585,
0.02094395086169243,
-0.22548384964466095,
0.2526896595954895,
0.017926666885614395,
0.05929281562566757,
-0.18012534081935883,
0.04407263174653053,
-0.01782998815178871,
-0.13070490956306458,
0.07886628061532974,
-0.07635053992271423,
-0.003469733288511634,
0.0015593197895213962,
-0.04362665116786957,
-0.17636847496032715,
0.08668458461761475,
0.153885155916214,
0.06293786317110062,
0.05358508601784706,
0.03612338379025459,
0.11782229691743851,
0.0063111986964941025,
0.0923241376876831,
-0.010199482552707195,
-0.021956128999590874,
0.062498535960912704,
-0.08140407502651215,
0.0609424002468586,
-0.0726000964641571,
-0.14206090569496155,
-0.03571780025959015,
0.016590986400842667,
0.056322451680898666,
0.04228614270687103,
0.051905643194913864,
-0.07544241845607758,
-0.013531920500099659,
0.06438880413770676,
-0.07095561921596527,
0.02005528099834919,
-0.01722138375043869,
0.05141943320631981,
0.11986014246940613,
0.04023151099681854,
-0.02564370073378086,
-0.0028225162532180548,
0.11516845226287842,
-0.08231042325496674,
-0.02747245319187641,
-0.07134890556335449,
-0.07685630768537521,
0.04569564387202263,
-0.16943246126174927,
0.03011397272348404,
-0.16449978947639465,
-0.1081668883562088,
0.043964844197034836,
0.07641260325908661,
-0.020299863070249557,
-0.0071940128691494465,
0.05432133004069328,
-0.015058392658829689,
0.03252285346388817,
-0.022440703585743904,
-0.025891484692692757,
-0.014742359519004822,
0.08755237609148026,
-0.00789224449545145,
0.12928980588912964,
-0.09928245842456818,
0.03606573864817619,
-0.058870814740657806,
0.0307245384901762,
-0.1866442859172821,
-0.03470376506447792,
-0.05528080090880394,
0.1218569427728653,
-0.007537215482443571,
-0.031544819474220276,
-0.14193503558635712,
0.03643137216567993,
0.02311236597597599,
0.13091778755187988,
-0.09511425346136093,
-0.13027188181877136,
0.22143329679965973,
-0.10969307273626328,
-0.11662596464157104,
0.10313235223293304,
-0.00034573476295918226,
-0.0036734805908054113,
0.03303375095129013,
0.11808130145072937,
0.07438530027866364,
-0.12114878743886948,
0.06985096633434296,
0.11100877076387405,
-0.13137638568878174,
-0.14083588123321533,
0.0030346063431352377,
0.005365296266973019,
-0.04092581570148468,
0.027416786178946495,
0.13061366975307465,
0.09913130849599838,
-0.06533166021108627,
-0.05915512889623642,
-0.023207442834973335,
-0.04470452666282654,
0.14266611635684967,
0.07155412435531616,
0.11709458380937576,
-0.058789126574993134,
-0.05110654979944229,
-0.0073927296325564384,
-0.012311553582549095,
0.02001902088522911,
0.03692452237010002,
-0.08588939905166626,
0.15879537165164948,
-0.093654565513134,
-0.010519672185182571,
-0.1942829191684723,
-0.14208821952342987,
-0.025270549580454826,
0.023185132071375847,
0.003105397103354335,
0.14544132351875305,
0.14126184582710266,
-0.027896536514163017,
-0.018546201288700104,
0.0029071602039039135,
0.10028667747974396,
0.01416551973670721,
-0.05348880589008331,
-0.1332474648952484,
0.02096777968108654,
-0.10247951000928879,
0.006227395497262478,
-0.02685815840959549,
0.02070692367851734,
0.02255779691040516,
0.12652181088924408,
0.0296952985227108,
0.03667345643043518,
-0.030322488397359848,
0.03483564406633377,
-0.046718694269657135,
-0.0012969496892765164,
0.07919879257678986,
-0.0009799518156796694,
-0.08541922271251678,
0.1320667266845703,
-0.1549011915922165,
0.3245454728603363,
0.18324318528175354,
-0.25222039222717285,
-0.032582465559244156,
0.022534402087330818,
-0.018408451229333878,
-0.005483679939061403,
0.05049596354365349,
0.017742786556482315,
0.047079913318157196,
0.011380371637642384,
0.13061010837554932,
-0.006580917164683342,
-0.02327839843928814,
0.05173202231526375,
-0.07258372753858566,
-0.05810311809182167,
0.041968636214733124,
0.12771274149417877,
-0.094915010035038,
0.16964642703533173,
0.20874622464179993,
-0.07404834777116776,
0.18059514462947845,
0.016984177753329277,
-0.011659330688416958,
0.01344555988907814,
-0.029398901388049126,
0.007692015264183283,
0.07665667682886124,
-0.19013014435768127,
-0.0347762331366539,
0.0664834976196289,
-0.06981958448886871,
0.05914191156625748,
-0.1451771855354309,
-0.03094998002052307,
0.005737306550145149,
0.05980359762907028,
-0.006538025103509426,
0.1327260285615921,
0.02405189909040928,
0.07175885140895844,
-0.021226324141025543,
-0.09227590262889862,
0.08278077095746994,
0.01392919197678566,
-0.01752237230539322,
0.16750000417232513,
-0.08590082824230194,
-0.3081378638744354,
-0.1378791183233261,
-0.13565769791603088,
-0.0020439650397747755,
0.023394674062728882,
0.04295716434717178,
-0.07105544954538345,
-0.04922148212790489,
0.07316073030233383,
0.003232675837352872,
-0.004003997892141342,
0.0792648047208786,
-0.06448457390069962,
0.020873762667179108,
-0.03631814196705818,
-0.07549013197422028,
-0.05022209882736206,
-0.057148922234773636,
-0.02323177643120289,
0.14593857526779175,
-0.05327093228697777,
0.08802726864814758,
0.16905586421489716,
0.012550285086035728,
0.04885319247841835,
0.00021545936760958284,
0.11775598675012589,
-0.08743122965097427,
0.035373978316783905,
0.16900746524333954,
-0.05439622700214386,
0.09900873154401779,
0.13870090246200562,
0.053156640380620956,
-0.02953900210559368,
-0.015192784368991852,
-0.018502797931432724,
-0.14376892149448395,
-0.1903413087129593,
-0.07805060595273972,
-0.12872184813022614,
-0.004581542685627937,
0.04064694419503212,
0.06347502768039703,
0.15849946439266205,
0.12698149681091309,
0.06485909223556519,
-0.0025883859489113092,
-0.0653119683265686,
0.029330747202038765,
0.13889311254024506,
-0.010205605067312717,
0.14547351002693176,
-0.04937714338302612,
-0.15135471522808075,
0.04621342942118645,
0.06483875215053558,
0.13727127015590668,
0.0801081731915474,
0.04709290340542793,
0.05030903220176697,
0.15404266119003296,
0.15667839348316193,
0.13466326892375946,
0.0009187188697978854,
-0.0751815140247345,
0.0040469043888151646,
-0.015250175260007381,
-0.0018351945327594876,
0.019688205793499947,
0.14701808989048004,
-0.09335312247276306,
0.008337021805346012,
-0.08287381380796432,
0.052713699638843536,
0.11318081617355347,
0.0508648119866848,
-0.2409767359495163,
0.004237337503582239,
0.0528276301920414,
0.021228590980172157,
-0.04692624509334564,
0.01798040047287941,
0.0023427275009453297,
-0.08692162483930588,
0.06385073065757751,
-0.101706862449646,
0.07074058055877686,
0.01137698907405138,
0.03919457271695137,
-0.032614629715681076,
-0.010928098112344742,
0.022640569135546684,
0.039460305124521255,
-0.20382986962795258,
0.26586997509002686,
0.0006460759323090315,
-0.028984175994992256,
-0.0791526660323143,
0.0052904533222317696,
0.05522213131189346,
0.09401737153530121,
0.10869816690683365,
-0.004247578326612711,
-0.009676774963736534,
-0.08739175647497177,
-0.035155024379491806,
0.017052363604307175,
0.0919615849852562,
-0.01816433109343052,
-0.016529787331819534,
-0.0011856307974085212,
-0.051692042499780655,
0.011339298449456692,
0.04073398932814598,
0.015987176448106766,
-0.1467483788728714,
0.08853159844875336,
0.038112204521894455,
-0.12825120985507965,
0.007739827502518892,
-0.09882144629955292,
-0.14375539124011993,
0.22278666496276855,
-0.05415858328342438,
-0.054699961096048355,
-0.11399660259485245,
-0.07663819938898087,
0.09547971189022064,
-0.10935681313276291,
0.10527587682008743,
-0.09263046830892563,
0.014461711049079895,
-0.09164245426654816,
-0.20145848393440247,
0.1647089719772339,
-0.11031888425350189,
0.002526141470298171,
-0.084161177277565,
0.13648229837417603,
-0.057976134121418,
0.037640731781721115,
-0.013474050909280777,
0.03188606724143028,
-0.08154460787773132,
-0.04830831289291382,
0.042859990149736404,
-0.056861426681280136,
0.028397666290402412,
-0.012770370580255985,
-0.0398712083697319,
-0.04134887456893921,
0.029281508177518845,
0.052181635051965714,
0.2190672755241394,
0.19832973182201385,
-0.07709798961877823,
0.13352513313293457,
0.15129157900810242,
-0.019316932186484337,
-0.3460615277290344,
-0.057502035051584244,
-0.1116613820195198,
-0.001208813046105206,
0.0252317376434803,
-0.13937091827392578,
0.11665228009223938,
-0.009730532765388489,
-0.05563841015100479,
0.15958282351493835,
-0.2076561003923416,
-0.11159199476242065,
0.20932956039905548,
0.05999184027314186,
0.41863301396369934,
-0.13329580426216125,
-0.07144133001565933,
-0.002442446770146489,
-0.09258022159337997,
0.0597749724984169,
-0.0672234296798706,
0.10199921578168869,
-0.018120495602488518,
0.0700591430068016,
0.03106049634516239,
-0.10092440992593765,
0.08434760570526123,
-0.0639733150601387,
0.012281510978937149,
-0.09392406791448593,
-0.084839828312397,
0.11441865563392639,
-0.007932365871965885,
-0.03103458881378174,
0.0011230839882045984,
-0.007150697521865368,
-0.0027813713531941175,
-0.02906716614961624,
-0.08911383897066116,
0.12029516696929932,
0.03192722797393799,
-0.06788427382707596,
0.01380825974047184,
-0.012726900167763233,
-0.037128861993551254,
-0.0073455823585391045,
0.24216188490390778,
0.006354764569550753,
0.1807618886232376,
0.1381017118692398,
0.040524858981370926,
-0.11423755437135696,
-0.08632928878068924,
-0.04852750524878502,
-0.08041905611753464,
0.08240843564271927,
-0.032596904784440994,
0.03226194903254509,
0.106667160987854,
0.00039467812166549265,
0.020127320662140846,
0.11224869638681412,
-0.01845281943678856,
0.0007117748027667403,
0.15909788012504578,
-0.21771253645420074,
-0.036202117800712585,
-0.004651798866689205,
-0.06833845376968384,
0.0439615361392498,
0.10093237459659576,
0.1189456507563591,
0.03399030491709709,
-0.02380356192588806,
0.019473014399409294,
-0.018287353217601776,
-0.04162200167775154,
0.066010020673275,
0.08927890658378601,
0.03649584576487541,
-0.10859853029251099,
0.009288039058446884,
-0.0074465712532401085,
-0.2260841578245163,
-0.01962312124669552,
0.09775625169277191,
-0.0873168557882309,
-0.12063051760196686,
0.012361934408545494,
0.1076221913099289,
-0.0945601537823677,
-0.02367554046213627,
-0.10341771692037582,
-0.09936756640672684,
0.03371421620249748,
0.24353840947151184,
0.09638676047325134,
0.04413692280650139,
-0.03771790862083435,
0.007647861260920763,
-0.022548077628016472,
0.014704525470733643,
0.025894340127706528,
0.05291705206036568,
-0.10283101350069046,
0.0103202685713768,
-0.0031987454276531935,
0.14119109511375427,
-0.11176518350839615,
-0.0445166677236557,
-0.16276757419109344,
0.024837028235197067,
-0.06897640973329544,
-0.07351154088973999,
-0.09175602346658707,
-0.07753758877515793,
-0.0025944914668798447,
-0.06624450534582138,
-0.06782262772321701,
-0.03718223795294762,
-0.11055635660886765,
0.021310102194547653,
0.0274514127522707,
-0.030197784304618835,
-0.08776760846376419,
-0.04201033338904381,
0.10244587808847427,
-0.039260171353816986,
0.077738456428051,
0.10315977782011032,
-0.05513725429773331,
0.07956555485725403,
-0.13255318999290466,
-0.08017243444919586,
0.09610585123300552,
0.021217774599790573,
0.0930950716137886,
0.08084149658679962,
0.018979595974087715,
0.022007931023836136,
0.05984656140208244,
0.0405455008149147,
0.05628363788127899,
-0.09750266373157501,
0.08858586102724075,
-0.03438200429081917,
-0.17740845680236816,
-0.046122271567583084,
-0.046413931995630264,
0.07583437114953995,
0.009188736788928509,
0.10698036849498749,
-0.052560169249773026,
0.09255007654428482,
-0.08093990385532379,
0.023442106321454048,
-0.0121239572763443,
-0.1443140059709549,
0.040407415479421616,
-0.023588387295603752,
0.011160891503095627,
-0.03913760557770729,
0.2005992829799652,
0.005455664359033108,
-0.010600819252431393,
0.0269656702876091,
0.058000002056360245,
0.005491250194609165,
0.01392460148781538,
0.13484342396259308,
0.0717557743191719,
-0.04031308367848396,
-0.08650077879428864,
0.0985933393239975,
0.03556261584162712,
-0.007228076923638582,
0.1446225494146347,
0.05392833426594734,
0.008456836454570293,
0.10799476504325867,
0.03437824919819832,
0.05138842388987541,
-0.11866090446710587,
-0.12615646421909332,
-0.07431069761514664,
0.07176808267831802,
0.024765869602560997,
0.008841742761433125,
0.17953330278396606,
0.0006053848192095757,
0.05149250477552414,
-0.02716839499771595,
-0.044437021017074585,
-0.18419994413852692,
-0.1819099634885788,
-0.08210629969835281,
-0.04845074936747551,
0.030102035030722618,
-0.013569125905632973,
-0.04375358298420906,
0.07335899770259857,
0.044054556638002396,
-0.025877084583044052,
0.18008431792259216,
0.04738849028944969,
0.016384540125727654,
0.008439317345619202,
0.02951398864388466,
0.002837710315361619,
0.019373703747987747,
-0.014868595637381077,
-0.16954737901687622,
0.018676601350307465,
-0.061047427356243134,
-0.011373689398169518,
-0.05893642455339432,
0.032360322773456573,
-0.0726073831319809,
-0.1289254128932953,
-0.05397604778409004,
0.027042578905820847,
-0.012943590059876442,
0.059897784143686295,
-0.0029397914186120033,
0.03484475240111351,
-0.009864917024970055,
0.11938705295324326,
-0.0711914449930191,
-0.04292607679963112,
-0.06010475009679794,
0.13547980785369873,
0.00017876559286378324,
0.0751817598938942,
-0.016827866435050964,
0.006995109841227531,
-0.08060892671346664,
0.31413179636001587,
0.33989062905311584,
-0.060949187725782394,
0.07872892171144485,
0.06968272477388382,
0.02710200473666191,
0.03434217721223831,
0.11524045467376709,
0.057777415961027145,
0.29345428943634033,
-0.09891874343156815,
-0.08564654737710953,
-0.04613710939884186,
-0.02714039757847786,
-0.118376724421978,
0.04868235066533089,
0.04831523448228836,
-0.020950807258486748,
-0.058233924210071564,
0.07887919247150421,
-0.1758730411529541,
0.06970184296369553,
0.09044566750526428,
-0.23246674239635468,
-0.08339087665081024,
-0.022852426394820213,
0.1753229796886444,
-0.010412333533167839,
0.11065186560153961,
-0.039113614708185196,
-0.08929820358753204,
-0.005031270440667868,
0.02762444317340851,
-0.2081141173839569,
-0.00354451360180974,
0.07916070520877838,
-0.020803915336728096,
0.08382081985473633,
-0.040164027363061905,
0.0016908731777220964,
0.10634655505418777,
0.07064057141542435,
-0.00388703471980989,
-0.002501838840544224,
0.03220739960670471,
-0.117055743932724,
-0.04883642867207527,
0.04766792804002762,
-0.0011937525123357773,
-0.13988107442855835,
0.04205838963389397,
-0.10886580497026443,
0.044283486902713776,
-0.13161705434322357,
-0.017260242253541946,
-0.00801071710884571,
0.059082821011543274,
-0.04891728237271309,
0.05079202726483345,
0.06328269094228745,
0.03889768570661545,
-0.029752498492598534,
-0.04196132346987724,
-0.019663844257593155,
0.06980429589748383,
-0.07099960744380951,
-0.1688031554222107,
-0.09897769242525101,
-0.05279514938592911,
-0.012266378849744797,
-0.007850896567106247,
-0.19707658886909485,
-0.052326299250125885,
-0.09872795641422272,
0.008796208538115025,
-0.16692335903644562,
0.01695191115140915,
0.09764675796031952,
0.040505457669496536,
0.010890762321650982,
-0.01408459059894085,
0.028956154361367226,
0.026965294033288956,
-0.18535448610782623,
-0.07746874541044235
] |
null | null |
transformers
|
MLM fine-tuned from BR-BERTo model on the Brazilian Federal Official Gazette (100k instances)
|
{}
|
fill-mask
|
flavio-nakasato/roberdou_100k
|
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
MLM fine-tuned from BR-BERTo model on the Brazilian Federal Official Gazette (100k instances)
|
[] |
[
"TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
37
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.05978045240044594,
0.0027343870606273413,
-0.008724397048354149,
0.02515793778002262,
0.13307689130306244,
0.027639828622341156,
0.09509950131177902,
0.08148215711116791,
0.05693569406867027,
-0.005708751268684864,
0.15464650094509125,
0.21959826350212097,
-0.03345884382724762,
0.17867937684059143,
-0.058117255568504333,
-0.2713718116283417,
0.06318216025829315,
0.048634059727191925,
-0.07361909002065659,
0.11866326630115509,
0.0721997618675232,
-0.07688385248184204,
0.06776180863380432,
-0.018220216035842896,
-0.12242016196250916,
0.042348217219114304,
0.05527849495410919,
-0.1122080385684967,
0.12108562141656876,
0.021961210295557976,
0.2060822993516922,
0.013724464923143387,
-0.06524214893579483,
-0.0831608921289444,
0.05297759547829628,
-0.0005626529455184937,
-0.07771056890487671,
0.039234988391399384,
0.006228282582014799,
-0.09797824174165726,
0.0029437756165862083,
0.05069584399461746,
0.031919922679662704,
0.043076999485492706,
-0.14973030984401703,
-0.12170203030109406,
-0.021631255745887756,
0.03224768117070198,
0.04861219599843025,
0.06754541397094727,
0.019578367471694946,
0.20521271228790283,
-0.1258353292942047,
0.10418124496936798,
0.15854887664318085,
-0.29196202754974365,
-0.011583889834582806,
0.07336939871311188,
0.07550900429487228,
-0.05153534933924675,
-0.025016577914357185,
0.061107337474823,
0.0111276526004076,
0.02207767404615879,
0.03159128874540329,
-0.08054082095623016,
-0.06527844071388245,
0.004459382500499487,
-0.07510198652744293,
-0.05628051981329918,
0.14868685603141785,
-0.051077596843242645,
0.04103284329175949,
0.010015198029577732,
-0.12913082540035248,
-0.036706481128931046,
-0.022282211109995842,
0.0016800274606794119,
-0.03736185282468796,
0.03933337330818176,
-0.04058491811156273,
-0.013890751637518406,
-0.10593511909246445,
0.020706869661808014,
-0.23111006617546082,
0.27582958340644836,
0.02612913027405739,
0.07012132555246353,
-0.1858518421649933,
0.043225426226854324,
-0.029383337125182152,
-0.12287542223930359,
0.04163218289613724,
-0.09743601828813553,
0.012437986209988594,
0.0019431845284998417,
-0.06488244980573654,
-0.03970800340175629,
0.08601388335227966,
0.2266247421503067,
0.08983870595693588,
0.03475968539714813,
0.03882957249879837,
0.09770892560482025,
0.014865895733237267,
0.08014364540576935,
0.017019858583807945,
-0.040938593447208405,
0.06849705427885056,
-0.12720969319343567,
0.04322979971766472,
-0.059905603528022766,
-0.12469810247421265,
-0.052705299109220505,
0.004645936656743288,
0.08657316863536835,
0.0478980652987957,
0.04835785925388336,
-0.08745969086885452,
0.0031580787617713213,
0.07787331938743591,
-0.07194057106971741,
-0.002319851191714406,
-0.029221290722489357,
0.05152527242898941,
0.10475372523069382,
0.021636445075273514,
-0.010703074745833874,
-0.016043487936258316,
0.11903663724660873,
-0.07282981276512146,
-0.03583741933107376,
-0.05615774169564247,
-0.055342018604278564,
0.03769473731517792,
-0.14162738621234894,
0.04631955549120903,
-0.1937199980020523,
-0.14164285361766815,
0.05045240744948387,
0.058013904839754105,
-0.004311853088438511,
-0.032746389508247375,
0.032482124865055084,
-0.005240604747086763,
0.01717841997742653,
-0.04425594210624695,
-0.03813016042113304,
-0.03695604205131531,
0.10319358855485916,
0.015862006694078445,
0.1258252114057541,
-0.10736379027366638,
0.04201505705714226,
-0.08612877130508423,
0.012246696278452873,
-0.16673244535923004,
-0.037367139011621475,
-0.02627558261156082,
0.16236986219882965,
0.002689799526706338,
-0.04094931110739708,
-0.11292707920074463,
0.03153347223997116,
-0.005968436133116484,
0.16826218366622925,
-0.050691187381744385,
-0.1258428990840912,
0.23801551759243011,
-0.10858677327632904,
-0.137539803981781,
0.0768466368317604,
-0.0008964669541455805,
0.00503029627725482,
0.04865459352731705,
0.09409047663211823,
0.054460909217596054,
-0.1282840520143509,
0.09256289899349213,
0.09043212980031967,
-0.15783078968524933,
-0.13727834820747375,
0.027883639559149742,
-0.005475207231938839,
-0.10261815786361694,
0.044968098402023315,
0.09180816262960434,
0.11191844940185547,
-0.07093978673219681,
-0.0524664968252182,
-0.019335782155394554,
-0.04510034993290901,
0.12913495302200317,
0.03954192250967026,
0.09601214528083801,
-0.07954450696706772,
-0.026266923174262047,
-0.06726016104221344,
0.00639816839247942,
0.07083439826965332,
0.03640305995941162,
-0.08596820384263992,
0.1376219093799591,
-0.05640283599495888,
0.00475625554099679,
-0.18434511125087738,
-0.10832203924655914,
-0.019048554822802544,
0.06233084574341774,
-0.024080509319901466,
0.11038170754909515,
0.11115530878305435,
-0.04524664208292961,
-0.012584532611072063,
-0.016787465661764145,
0.09273066371679306,
0.02054671198129654,
-0.019367150962352753,
-0.09770754724740982,
0.024277135729789734,
-0.08850374072790146,
0.00890185683965683,
0.020691927522420883,
0.0034111374989151955,
-0.004997740965336561,
0.14283043146133423,
-0.00161478400696069,
0.03947531431913376,
-0.04557863250374794,
0.03357338905334473,
-0.04402673617005348,
0.007829888723790646,
0.08130250126123428,
0.00620792992413044,
-0.052449412643909454,
0.15417484939098358,
-0.1333128660917282,
0.3405405282974243,
0.18508437275886536,
-0.2874508798122406,
-0.03881249949336052,
0.04533010721206665,
-0.018499813973903656,
-0.0015848495531827211,
0.04968307539820671,
0.007090166676789522,
0.027636928483843803,
0.00756347319111228,
0.14252737164497375,
-0.008502244018018246,
-0.015343432314693928,
0.03761560097336769,
-0.0861063003540039,
-0.03436943516135216,
0.034367144107818604,
0.09854230284690857,
-0.11538184434175491,
0.17259535193443298,
0.22679723799228668,
-0.014759926125407219,
0.13486558198928833,
0.019756602123379707,
0.0008263704366981983,
0.004766570404171944,
-0.04244794696569443,
-0.010318059474229813,
0.04416158050298691,
-0.1703941971063614,
-0.037341129034757614,
0.07009443640708923,
-0.04086581617593765,
0.054106131196022034,
-0.1080540344119072,
-0.04539399966597557,
0.023229053243994713,
0.05735735595226288,
-0.057464465498924255,
0.14314846694469452,
0.02993035688996315,
0.07184498757123947,
0.001863340032286942,
-0.0830451101064682,
0.10455971211194992,
0.011808671057224274,
-0.027575377374887466,
0.15615016222000122,
-0.11499504745006561,
-0.3407493829727173,
-0.14060579240322113,
-0.1856870949268341,
0.012841945514082909,
0.04907330498099327,
0.07068860530853271,
-0.0886731967329979,
-0.06028576195240021,
0.1009501963853836,
-0.002118155127391219,
-0.030502429232001305,
0.06733111292123795,
-0.059993140399456024,
0.033099252730607986,
-0.03546803444623947,
-0.05871487408876419,
-0.06814772635698318,
-0.030771153047680855,
-0.026918867602944374,
0.15232053399085999,
-0.08588875830173492,
0.09874926507472992,
0.12222868949174881,
0.012513059191405773,
0.06416334956884384,
0.003602869575843215,
0.16369159519672394,
-0.08151564747095108,
-0.004985830280929804,
0.19588902592658997,
-0.039671871811151505,
0.09784439206123352,
0.16251394152641296,
0.01572851650416851,
-0.04611923173069954,
0.007210151292383671,
-0.054344769567251205,
-0.1310671716928482,
-0.16383560001850128,
-0.10668035596609116,
-0.13408245146274567,
-0.021317366510629654,
0.045856814831495285,
0.050257451832294464,
0.15384143590927124,
0.10692618787288666,
0.039593882858753204,
-0.022173523902893066,
-0.07040359079837799,
0.060743995010852814,
0.16259261965751648,
-0.02038007788360119,
0.13690350949764252,
-0.05275079607963562,
-0.15010952949523926,
0.060834385454654694,
0.017343392595648766,
0.13702066242694855,
0.10000376403331757,
-0.018073182553052902,
0.04605214297771454,
0.15819214284420013,
0.157943993806839,
0.15094222128391266,
0.040314868092536926,
-0.057792484760284424,
-0.00135100819170475,
-0.00355874327942729,
-0.054626744240522385,
0.024207331240177155,
0.13539178669452667,
-0.10293795168399811,
-0.0397566519677639,
-0.122000552713871,
0.05419766530394554,
0.11576513946056366,
0.0632915124297142,
-0.2221747636795044,
0.010753236711025238,
0.06315211206674576,
0.00856467429548502,
-0.06559337675571442,
0.03360356017947197,
-0.0504627525806427,
-0.1452513188123703,
0.0742940679192543,
-0.05110727250576019,
0.09041508287191391,
0.04023086279630661,
0.06344828754663467,
-0.05134069547057152,
-0.05041798576712608,
0.03384535759687424,
0.066301628947258,
-0.24279962480068207,
0.28508660197257996,
-0.015547695569694042,
-0.0414053238928318,
-0.0798855721950531,
-0.0072272163815796375,
0.05470259487628937,
0.10367409884929657,
0.11631010472774506,
0.026701275259256363,
-0.05831453949213028,
-0.12985455989837646,
-0.010049611330032349,
0.025459015741944313,
0.10010068118572235,
-0.023013504222035408,
-0.008190451189875603,
-0.026562752202153206,
-0.05088624730706215,
-0.014673394151031971,
0.07104095071554184,
0.008848524652421474,
-0.12703938782215118,
0.07680127024650574,
0.049460720270872116,
-0.018087532371282578,
-0.008617108687758446,
-0.053773753345012665,
-0.09175468981266022,
0.19304224848747253,
-0.01958429254591465,
-0.05116080120205879,
-0.11072391271591187,
-0.10805923491716385,
0.10193557292222977,
-0.11249701678752899,
0.12616074085235596,
-0.10047021508216858,
0.009487134404480457,
-0.09380611777305603,
-0.1732918620109558,
0.14263565838336945,
-0.12662816047668457,
-0.005548976361751556,
-0.07589493691921234,
0.14169755578041077,
-0.06785442680120468,
0.024742592126131058,
0.0028587186243385077,
0.04321930930018425,
-0.11557991802692413,
-0.053762901574373245,
0.029133325442671776,
-0.0661218911409378,
0.03599350154399872,
0.054347891360521317,
-0.04755578562617302,
-0.035690948367118835,
0.016383660957217216,
0.022910388186573982,
0.2239362597465515,
0.23655645549297333,
-0.060722775757312775,
0.14300964772701263,
0.16329653561115265,
-0.02534516341984272,
-0.334926038980484,
-0.11384004354476929,
-0.13816533982753754,
0.000659266603179276,
0.005350308958441019,
-0.1263294517993927,
0.09073201566934586,
-0.018374400213360786,
-0.05229932442307472,
0.11701809614896774,
-0.15716011822223663,
-0.09006257355213165,
0.23355786502361298,
0.007530366536229849,
0.5038071870803833,
-0.09967079758644104,
-0.05945106968283653,
-0.052533090114593506,
-0.14235186576843262,
0.02291753888130188,
0.0022186338901519775,
0.09462001919746399,
-0.029315819963812828,
0.07914337515830994,
0.03301545977592468,
-0.09031011164188385,
0.09982404112815857,
-0.03833628445863724,
0.016298605129122734,
-0.11617961525917053,
-0.08525510132312775,
0.10806074738502502,
-0.0136068444699049,
-0.016709906980395317,
0.02158220298588276,
0.01456777099519968,
-0.04083799198269844,
-0.022307492792606354,
-0.10379772633314133,
0.10585320740938187,
0.03499794006347656,
-0.05941828712821007,
0.022518588230013847,
-0.009889909066259861,
-0.012628845870494843,
-0.004628289956599474,
0.1912514567375183,
-0.008090890944004059,
0.17767208814620972,
0.06429888308048248,
0.021506020799279213,
-0.12491537630558014,
-0.06406809389591217,
-0.04999396950006485,
-0.08511979877948761,
0.07266417145729065,
-0.009923536330461502,
0.04696602374315262,
0.10173138976097107,
-0.012479927390813828,
0.029755644500255585,
0.1106644868850708,
0.010039771907031536,
-0.014950153417885303,
0.16745342314243317,
-0.2137589156627655,
0.04184075817465782,
-0.015450791455805302,
-0.018000587821006775,
0.06724268943071365,
0.059818465262651443,
0.09292822331190109,
0.042055394500494,
-0.040214166045188904,
-0.010924269445240498,
-0.007243160158395767,
-0.06570540368556976,
0.04728737846016884,
0.07078813016414642,
0.0481405109167099,
-0.1261121779680252,
0.010777958668768406,
-0.019267624244093895,
-0.18276818096637726,
-0.0193245317786932,
0.08627685904502869,
-0.11631006002426147,
-0.10884512960910797,
0.009783122688531876,
0.08114659041166306,
-0.12021104246377945,
-0.030119983479380608,
-0.08283551782369614,
-0.11298330873250961,
0.05262024328112602,
0.22306393086910248,
0.1127481460571289,
0.06906166672706604,
-0.01038964930921793,
-0.013437945395708084,
-0.019051581621170044,
-0.020863153040409088,
0.04014519229531288,
0.036231301724910736,
-0.08725123107433319,
0.00853132363408804,
-0.009417156688869,
0.15634506940841675,
-0.10960228741168976,
-0.05792605131864548,
-0.15895655751228333,
0.04835722595453262,
-0.07342072576284409,
-0.09718167781829834,
-0.09943155944347382,
-0.07611285150051117,
0.009972813539206982,
-0.07039758563041687,
-0.05352642014622688,
-0.033090610057115555,
-0.1172407865524292,
0.028002966195344925,
0.02894745022058487,
-0.025185875594615936,
-0.06389441341161728,
-0.04643470048904419,
0.1366305649280548,
-0.04931804537773132,
0.07242259383201599,
0.1486574411392212,
-0.07077633589506149,
0.07762903720140457,
-0.12045170366764069,
-0.12859106063842773,
0.09266626089811325,
0.011787940748035908,
0.0829651728272438,
0.04641987010836601,
0.028651097789406776,
0.05126902461051941,
0.04517265781760216,
0.04532682150602341,
0.05619427561759949,
-0.11496394872665405,
0.07823242247104645,
0.008991614915430546,
-0.1918167769908905,
-0.027399636805057526,
-0.09008971601724625,
0.08355460315942764,
0.0007113930769264698,
0.12100622057914734,
-0.03608888015151024,
0.11319220811128616,
-0.03684284910559654,
0.015498606488108635,
-0.03130309283733368,
-0.1580456793308258,
-0.0009196364553645253,
-0.04583831876516342,
0.005965739022940397,
-0.008378063328564167,
0.23622959852218628,
-0.019403686746954918,
0.020302124321460724,
0.03706345707178116,
0.08312346041202545,
0.011911443434655666,
0.0034686587750911713,
0.14071707427501678,
0.09389068931341171,
-0.05033082515001297,
-0.07029570639133453,
0.09465329349040985,
0.022331232205033302,
-0.06154777854681015,
0.12758882343769073,
0.07004008442163467,
0.07805091887712479,
0.09229323267936707,
0.00525510823354125,
0.048385001718997955,
-0.10149682313203812,
-0.22425489127635956,
-0.04245093837380409,
0.04338166117668152,
0.030511466786265373,
-0.009528924711048603,
0.16239036619663239,
-0.007807712536305189,
0.05438727140426636,
-0.0286843404173851,
-0.0146601852029562,
-0.1913124918937683,
-0.12455741316080093,
-0.08852022141218185,
-0.06035058572888374,
0.034696102142333984,
-0.019905485212802887,
-0.019849425181746483,
0.10410076379776001,
0.03071051649749279,
-0.029069487005472183,
0.1391187459230423,
0.007366697769612074,
-0.012996077537536621,
0.016209116205573082,
-0.007277622353285551,
0.015048467554152012,
0.04374290257692337,
-0.018845049664378166,
-0.17080430686473846,
-0.003531701397150755,
-0.05279584601521492,
0.0019921238999813795,
-0.08741256594657898,
0.027429690584540367,
-0.09686073660850525,
-0.12442773580551147,
-0.06279069930315018,
0.03428082540631294,
-0.036186520010232925,
0.08635444939136505,
-0.008224183693528175,
0.04980190843343735,
0.003494719509035349,
0.13008199632167816,
-0.06635979562997818,
-0.10865864157676697,
-0.0572807714343071,
0.15417969226837158,
0.044320207089185715,
0.07846927642822266,
-0.024377785623073578,
0.025428062304854393,
-0.09749093651771545,
0.3293962776660919,
0.31430360674858093,
-0.0364663191139698,
0.07512470334768295,
0.049478501081466675,
0.03143766522407532,
0.07872943580150604,
0.10979169607162476,
0.07679310441017151,
0.2867131531238556,
-0.09095823019742966,
-0.0540350005030632,
-0.04679121449589729,
-0.03458467498421669,
-0.12083368748426437,
0.018574830144643784,
0.026663949713110924,
-0.03865300863981247,
-0.05562908574938774,
0.08287046104669571,
-0.17973420023918152,
0.13521145284175873,
0.08000985532999039,
-0.2064894735813141,
-0.05751892551779747,
-0.022878373041749,
0.15401794016361237,
0.030826324597001076,
0.11301977932453156,
-0.03987570479512215,
-0.09233494848012924,
0.0507376492023468,
0.02869175747036934,
-0.20688870549201965,
-0.07890072464942932,
0.10709986090660095,
0.004827416036278009,
0.06665920466184616,
-0.034168489277362823,
0.019203342497348785,
0.09574570506811142,
0.06589552015066147,
-0.023569602519273758,
0.026999808847904205,
0.021599261090159416,
-0.10421290993690491,
-0.05063549056649208,
0.027946757152676582,
-0.005161251872777939,
-0.1348150372505188,
0.025101054459810257,
-0.1394609659910202,
0.04613953083753586,
-0.0913073867559433,
-0.008668651804327965,
-0.005664225202053785,
0.0692635253071785,
-0.0486815981566906,
0.048449039459228516,
0.06823991239070892,
0.01269973162561655,
-0.031037641689181328,
-0.048129696398973465,
-0.011327249929308891,
0.06746246665716171,
-0.10943937301635742,
-0.1738860160112381,
-0.08208338916301727,
-0.07020771503448486,
0.0458214171230793,
-0.008811558596789837,
-0.1559021770954132,
-0.0473502054810524,
-0.11499189585447311,
0.015934964641928673,
-0.14897628128528595,
0.045110028237104416,
0.051641616970300674,
0.04488077014684677,
0.021195126697421074,
-0.024013997986912727,
0.037815019488334656,
0.04779626056551933,
-0.15743644535541534,
-0.09625527262687683
] |
null | null |
transformers
|
# Image-captioning-Indonesia
This is an encoder-decoder image captioning model using [CLIP](https://huggingface.co/transformers/model_doc/clip.html) as the visual encoder and [Marian](https://huggingface.co/transformers/model_doc/marian.html) as the textual decoder on datasets with Indonesian captions.
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
At time of writing, you will need to install [HuggingFace](https://github.com/huggingface/) from its latest master branch in order to load `FlaxMarian`.
You will also need to have the [`flax_clip_vision_marian` folder](https://github.com/indonesian-nlp/Indonesia-Image-Captioning/tree/main/flax_clip_vision_marian) in your project directory to load the model using the `FlaxCLIPVisionMarianForConditionalGeneration` class.
```python
from torchvision.io import ImageReadMode, read_image
from torchvision.transforms import CenterCrop, ConvertImageDtype, Normalize, Resize
from torchvision.transforms.functional import InterpolationMode
import torch
import numpy as np
from transformers import MarianTokenizer
from flax_clip_vision_marian.modeling_clip_vision_marian import FlaxCLIPVisionMarianForConditionalGeneration
clip_marian_model_name = 'flax-community/Image-captioning-Indonesia'
model = FlaxCLIPVisionMarianForConditionalGeneration.from_pretrained(clip_marian_model_name)
marian_model_name = 'Helsinki-NLP/opus-mt-en-id'
tokenizer = MarianTokenizer.from_pretrained(marian_model_name)
config = model.config
image_size = config.clip_vision_config.image_size
# Image transformation
transforms = torch.nn.Sequential(
Resize([image_size], interpolation=InterpolationMode.BICUBIC),
CenterCrop(image_size),
ConvertImageDtype(torch.float),
Normalize((0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)),
)
# Hyperparameters
max_length = 8
num_beams = 4
gen_kwargs = {"max_length": max_length, "num_beams": num_beams}
def generate_step(batch):
output_ids = model.generate(pixel_values, **gen_kwargs)
token_ids = np.array(output_ids.sequences)[0]
caption = tokenizer.decode(token_ids)
return caption
image_file_path = image_file_path
image = read_image(image_file_path, mode=ImageReadMode.RGB)
image = transforms(image)
pixel_values = torch.stack([image]).permute(0, 2, 3, 1).numpy()
generated_ids = generate_step(pixel_values)
print(generated_ids)
```
## Training data
The Model was trained on translated Coco,Flickr and ViZWiz, each of them were translated using google translate and marian mt. we took only random 2 captions per image for each datasets
## Training procedure
The model was trained on a TPUv3-8 VM provided by the Google Cloud team.
## Team members
- Cahya Wirawan ([@cahya](https://huggingface.co/cahya))
- Galuh Sahid ([@Galuh](https://huggingface.co/Galuh))
- Muhammad Agung Hambali ([@AyameRushia](https://huggingface.co/AyameRushia))
- Samsul Rahmadani ([@munggok](https://huggingface.co/munggok))
|
{"language": "id"}
|
text2text-generation
|
flax-community/Image-captioning-Indonesia
|
[
"transformers",
"jax",
"clip-vision-marian",
"text2text-generation",
"id",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"id"
] |
TAGS
#transformers #jax #clip-vision-marian #text2text-generation #id #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# Image-captioning-Indonesia
This is an encoder-decoder image captioning model using CLIP as the visual encoder and Marian as the textual decoder on datasets with Indonesian captions.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
At time of writing, you will need to install HuggingFace from its latest master branch in order to load 'FlaxMarian'.
You will also need to have the 'flax_clip_vision_marian' folder in your project directory to load the model using the 'FlaxCLIPVisionMarianForConditionalGeneration' class.
## Training data
The Model was trained on translated Coco,Flickr and ViZWiz, each of them were translated using google translate and marian mt. we took only random 2 captions per image for each datasets
## Training procedure
The model was trained on a TPUv3-8 VM provided by the Google Cloud team.
## Team members
- Cahya Wirawan (@cahya)
- Galuh Sahid (@Galuh)
- Muhammad Agung Hambali (@AyameRushia)
- Samsul Rahmadani (@munggok)
|
[
"# Image-captioning-Indonesia\n\nThis is an encoder-decoder image captioning model using CLIP as the visual encoder and Marian as the textual decoder on datasets with Indonesian captions.\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use\nAt time of writing, you will need to install HuggingFace from its latest master branch in order to load 'FlaxMarian'.\n\nYou will also need to have the 'flax_clip_vision_marian' folder in your project directory to load the model using the 'FlaxCLIPVisionMarianForConditionalGeneration' class.",
"## Training data\nThe Model was trained on translated Coco,Flickr and ViZWiz, each of them were translated using google translate and marian mt. we took only random 2 captions per image for each datasets",
"## Training procedure \nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team.",
"## Team members\n- Cahya Wirawan (@cahya)\n- Galuh Sahid (@Galuh)\n- Muhammad Agung Hambali (@AyameRushia)\n- Samsul Rahmadani (@munggok)"
] |
[
"TAGS\n#transformers #jax #clip-vision-marian #text2text-generation #id #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# Image-captioning-Indonesia\n\nThis is an encoder-decoder image captioning model using CLIP as the visual encoder and Marian as the textual decoder on datasets with Indonesian captions.\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use\nAt time of writing, you will need to install HuggingFace from its latest master branch in order to load 'FlaxMarian'.\n\nYou will also need to have the 'flax_clip_vision_marian' folder in your project directory to load the model using the 'FlaxCLIPVisionMarianForConditionalGeneration' class.",
"## Training data\nThe Model was trained on translated Coco,Flickr and ViZWiz, each of them were translated using google translate and marian mt. we took only random 2 captions per image for each datasets",
"## Training procedure \nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team.",
"## Team members\n- Cahya Wirawan (@cahya)\n- Galuh Sahid (@Galuh)\n- Muhammad Agung Hambali (@AyameRushia)\n- Samsul Rahmadani (@munggok)"
] |
[
48,
104,
84,
54,
22,
43
] |
[
"passage: TAGS\n#transformers #jax #clip-vision-marian #text2text-generation #id #autotrain_compatible #endpoints_compatible #has_space #region-us \n# Image-captioning-Indonesia\n\nThis is an encoder-decoder image captioning model using CLIP as the visual encoder and Marian as the textual decoder on datasets with Indonesian captions.\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.## How to use\nAt time of writing, you will need to install HuggingFace from its latest master branch in order to load 'FlaxMarian'.\n\nYou will also need to have the 'flax_clip_vision_marian' folder in your project directory to load the model using the 'FlaxCLIPVisionMarianForConditionalGeneration' class.## Training data\nThe Model was trained on translated Coco,Flickr and ViZWiz, each of them were translated using google translate and marian mt. we took only random 2 captions per image for each datasets## Training procedure \nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team.## Team members\n- Cahya Wirawan (@cahya)\n- Galuh Sahid (@Galuh)\n- Muhammad Agung Hambali (@AyameRushia)\n- Samsul Rahmadani (@munggok)"
] |
[
-0.046085868030786514,
0.1054362878203392,
-0.0030231461860239506,
0.05762948840856552,
0.10260485857725143,
0.015436390414834023,
0.11893399804830551,
0.07949094474315643,
-0.03469846397638321,
0.04973042011260986,
0.030039705336093903,
-0.010048821568489075,
0.10538891702890396,
0.13418594002723694,
0.0580727756023407,
-0.26278796792030334,
-0.018448323011398315,
-0.07321105897426605,
-0.046329014003276825,
0.0800042375922203,
0.06399048864841461,
-0.04703113064169884,
0.10524514317512512,
0.001415379112586379,
-0.07740273326635361,
-0.004255734849721193,
-0.08398132026195526,
-0.0735689252614975,
0.04494438320398331,
0.07680079340934753,
0.07563404738903046,
0.025825858116149902,
0.08357047289609909,
-0.12730662524700165,
0.020663229748606682,
0.08352893590927124,
-0.024850979447364807,
0.02164848893880844,
0.15178655087947845,
0.01498578954488039,
0.15647287666797638,
-0.041594669222831726,
0.013643139973282814,
0.06532147526741028,
-0.1382257640361786,
-0.055240921676158905,
-0.12200110405683517,
0.05535994842648506,
0.08740172535181046,
0.07353245466947556,
-0.009704062715172768,
0.07075168192386627,
-0.05762849003076553,
0.06364676356315613,
0.06344129145145416,
-0.1322774738073349,
-0.09591126441955566,
0.14922846853733063,
0.03713974729180336,
0.04735245555639267,
-0.07194234430789948,
0.039836540818214417,
0.04559050872921944,
0.0025231328327208757,
0.032955992966890335,
-0.04502899572253227,
-0.06837742030620575,
-0.0589158870279789,
-0.08347123861312866,
-0.03892001509666443,
0.19224749505519867,
0.012386542744934559,
-0.0383848138153553,
-0.10069037228822708,
-0.02762712724506855,
0.06439349055290222,
0.0068275961093604565,
0.032275229692459106,
0.02365889959037304,
0.021517345681786537,
0.046142157167196274,
-0.06235133111476898,
-0.09808086603879929,
-0.04516476020216942,
0.0030732122249901295,
-0.01914411410689354,
0.0030774695333093405,
0.028924137353897095,
-0.09069908410310745,
0.11163696646690369,
-0.00041526524000801146,
-0.10630723834037781,
-0.02408934012055397,
-0.03171110153198242,
-0.09372403472661972,
-0.028632452711462975,
0.029168635606765747,
-0.13375526666641235,
0.02597680874168873,
0.10946643352508545,
0.047170355916023254,
0.04980101063847542,
-0.000033928979974007234,
0.00778510607779026,
0.056490905582904816,
0.12471039593219757,
-0.0805976614356041,
-0.03316301852464676,
0.03564848750829697,
-0.0415119044482708,
-0.019100498408079147,
-0.024772336706519127,
0.017396114766597748,
0.011961717158555984,
-0.08007398247718811,
0.09272092580795288,
0.03999793529510498,
0.08339381963014603,
0.001251532114110887,
-0.07315073907375336,
0.15210267901420593,
-0.10746445506811142,
0.018325520679354668,
0.004649218171834946,
-0.050418879836797714,
0.06781438738107681,
0.10208491235971451,
-0.018235668540000916,
-0.05327488109469414,
-0.016963118687272072,
-0.03179255127906799,
-0.00093232118524611,
-0.056950297206640244,
-0.021794695407152176,
0.04765492305159569,
-0.03767749294638634,
-0.06572291254997253,
-0.10442762821912766,
-0.15194956958293915,
-0.08868349343538284,
0.0542391836643219,
-0.08946341276168823,
-0.00786818191409111,
-0.013929860666394234,
0.01664835773408413,
0.02217910625040531,
0.03238120302557945,
0.11910592764616013,
-0.026614349335432053,
0.03788168355822563,
-0.04945644736289978,
0.05896908789873123,
0.0990029126405716,
0.055615589022636414,
-0.025682082399725914,
0.07603644579648972,
-0.17698581516742706,
0.1549612432718277,
-0.08186166733503342,
0.0800359919667244,
-0.14457614719867706,
0.015928927809000015,
0.019060296937823296,
-0.03216056153178215,
0.0017176421824842691,
0.1533675640821457,
-0.19606047868728638,
0.007226563058793545,
0.15715394914150238,
-0.11228930205106735,
-0.05861099436879158,
0.07822567224502563,
-0.0072252522222697735,
0.1498146802186966,
-0.001454640063457191,
0.07363582402467728,
0.07396918535232544,
-0.09109686315059662,
0.019176725298166275,
-0.04408331587910652,
-0.03200696408748627,
0.06050229072570801,
0.04872678965330124,
-0.01762296073138714,
0.025372907519340515,
0.01978367753326893,
-0.087071493268013,
0.07984290271997452,
-0.013321256265044212,
-0.05123366415500641,
0.02534189261496067,
-0.07113316655158997,
-0.07784158736467361,
0.015609047375619411,
0.03571416437625885,
0.013562487438321114,
-0.015627456828951836,
-0.06582985818386078,
0.1037851944565773,
-0.07610618323087692,
-0.020437901839613914,
-0.036912109702825546,
0.00826555397361517,
-0.08030592650175095,
0.023650670424103737,
-0.0645800530910492,
-0.10486720502376556,
0.07273349910974503,
-0.06990200281143188,
0.038039952516555786,
-0.06670375913381577,
0.043960824608802795,
0.05536985397338867,
-0.05480630323290825,
-0.0515865720808506,
0.029364600777626038,
-0.016485074535012245,
-0.006170359440147877,
-0.09505718946456909,
-0.053187258541584015,
-0.020761508494615555,
0.16899354755878448,
-0.17738699913024902,
0.015456467866897583,
0.03762492164969444,
0.12253906577825546,
-0.0017629089998081326,
-0.046732667833566666,
0.10712732374668121,
-0.041227150708436966,
0.0190245620906353,
-0.11589258164167404,
0.028721360489726067,
0.022693786770105362,
-0.024899711832404137,
0.1395064890384674,
-0.07786693423986435,
-0.19226789474487305,
0.060564786195755005,
-0.030121885240077972,
-0.08439968526363373,
0.03323329985141754,
-0.03934710472822189,
-0.07854211330413818,
-0.09323190897703171,
0.03203798085451126,
0.06147395074367523,
0.02988067828118801,
0.12167855352163315,
-0.07916371524333954,
-0.05180750787258148,
0.018498407676815987,
-0.03733942285180092,
-0.06285123527050018,
0.07751385122537613,
0.06369990110397339,
-0.13320818543434143,
0.05567000061273575,
0.006267446558922529,
0.04827568307518959,
0.15939615666866302,
0.005878294352442026,
-0.054470136761665344,
-0.047924429178237915,
0.04041087627410889,
0.05354049429297447,
0.02211664244532585,
-0.03635428845882416,
-0.015222021378576756,
0.059391431510448456,
0.004096771590411663,
0.02083219401538372,
-0.07406487315893173,
-0.002772468840703368,
0.0021099054720252752,
-0.030478816479444504,
0.006733693182468414,
0.050638433545827866,
-0.06980712711811066,
0.04280930012464523,
-0.048932891339063644,
0.08171396702528,
-0.008946209214627743,
-0.03677735850214958,
-0.09669080376625061,
0.10878917574882507,
-0.03337569907307625,
-0.1988181471824646,
-0.15019842982292175,
-0.004656611941754818,
-0.06555839627981186,
-0.016503790393471718,
0.01843188889324665,
-0.05956203117966652,
-0.06988952308893204,
-0.06745248287916183,
0.014357532374560833,
0.033224307000637054,
-0.07194847613573074,
-0.05783599615097046,
0.02399638667702675,
0.03845328465104103,
-0.12001969665288925,
0.023693060502409935,
0.0640815794467926,
-0.10023985803127289,
0.09584664553403854,
-0.07598359882831573,
0.10898406058549881,
0.05307677388191223,
0.001113756326958537,
0.015077481046319008,
0.025765853002667427,
0.21291422843933105,
-0.1331336349248886,
0.16458430886268616,
0.14934837818145752,
0.059556230902671814,
0.04084654897451401,
0.08482924848794937,
0.010683095082640648,
-0.07606665790081024,
0.03459611535072327,
0.05968505144119263,
-0.06264206022024155,
-0.2736766040325165,
-0.05324874818325043,
-0.04248576611280441,
-0.027827510610222816,
0.0786365494132042,
0.06924890726804733,
0.03306856378912926,
0.06655014306306839,
-0.07996372878551483,
0.06621333956718445,
0.04676292464137077,
0.04837043583393097,
-0.04151860252022743,
0.02873406559228897,
0.010789738968014717,
-0.08928711712360382,
0.025026431307196617,
0.10711590200662613,
0.026663511991500854,
0.1974615454673767,
-0.0061393738724291325,
0.13787539303302765,
0.067074716091156,
0.09923397749662399,
0.029936498031020164,
0.07407886534929276,
-0.017491556704044342,
0.029190275818109512,
-0.007373278960585594,
-0.08261627703905106,
0.004648709669709206,
0.04090367257595062,
0.010466949082911015,
-0.07912493497133255,
-0.002296475926414132,
-0.017102079465985298,
0.04502369090914726,
0.2648024260997772,
-0.031966980546712875,
-0.12833942472934723,
-0.004687753040343523,
0.004577454179525375,
-0.0010590917663648725,
-0.06388388574123383,
-0.029168469831347466,
0.11559170484542847,
-0.15946638584136963,
0.10219286382198334,
-0.035018738359212875,
0.1006644070148468,
-0.10288107395172119,
-0.07244877517223358,
0.012973391450941563,
0.028294917196035385,
-0.011555989272892475,
0.12145750224590302,
-0.18128925561904907,
0.12352557480335236,
0.016740160062909126,
0.08126462250947952,
-0.07879796624183655,
0.020805399864912033,
0.056508611887693405,
0.07718890905380249,
0.12360487133264542,
0.045163098722696304,
-0.048165906220674515,
-0.10172748565673828,
-0.08699554949998856,
-0.023490317165851593,
0.02313195914030075,
-0.05425353720784187,
0.055030565708875656,
0.018373144790530205,
-0.03322513401508331,
-0.058900292962789536,
0.06681359559297562,
-0.14097170531749725,
-0.12810981273651123,
0.04102170839905739,
-0.024275755509734154,
0.06803669035434723,
-0.054710742086172104,
-0.023361939936876297,
-0.10989437997341156,
0.05046207830309868,
0.06362095475196838,
-0.06060443073511124,
-0.13209903240203857,
-0.04220697283744812,
0.12148706614971161,
-0.058875009417533875,
0.003978280816227198,
-0.03314865380525589,
0.07303603738546371,
-0.016156882047653198,
-0.03822040557861328,
0.018033171072602272,
-0.08719589561223984,
-0.14610734581947327,
-0.0342787504196167,
0.04758230596780777,
0.05916852876543999,
0.015211476013064384,
0.01197749376296997,
0.016863655298948288,
0.013365060091018677,
-0.10744703561067581,
0.021200012415647507,
0.0993003100156784,
-0.006888887379318476,
0.03920867294073105,
-0.012537877075374126,
-0.020456276834011078,
-0.07514876872301102,
-0.1259494125843048,
0.027456771582365036,
0.16976024210453033,
-0.025015778839588165,
0.06759530305862427,
0.08269702643156052,
-0.09337034076452255,
-0.18087510764598846,
-0.08836604654788971,
0.0360666923224926,
-0.01842336170375347,
-0.0002335720491828397,
-0.15952689945697784,
0.02754666656255722,
0.10277961194515228,
-0.010827737860381603,
-0.011387696489691734,
-0.2689700722694397,
-0.1219695508480072,
-0.020316628739237785,
0.029379049316048622,
-0.07740955054759979,
-0.12856130301952362,
-0.06683450192213058,
-0.04598377272486687,
-0.10726987570524216,
0.05811184644699097,
-0.025948895141482353,
0.09710009396076202,
0.004088653717190027,
-0.01435729581862688,
0.0027174365241080523,
-0.05438847094774246,
0.15418006479740143,
0.021310996264219284,
0.05157165974378586,
-0.08541370928287506,
0.00012083930778317153,
0.11762990057468414,
-0.03861820697784424,
0.11480416357517242,
0.002407959895208478,
0.03339891880750656,
-0.13815969228744507,
-0.02780241146683693,
-0.072189562022686,
0.05435755103826523,
-0.05589228868484497,
-0.014386068098247051,
-0.033073361963033676,
0.10208416730165482,
0.10346747189760208,
0.02467014640569687,
-0.06183681637048721,
-0.01741071790456772,
-0.05275781452655792,
0.16240091621875763,
0.053557559847831726,
0.060475192964076996,
-0.05032677948474884,
-0.10942376405000687,
0.023859722539782524,
0.050371941179037094,
-0.11410043388605118,
0.02755182981491089,
0.03677898645401001,
0.01761818118393421,
0.11513885110616684,
-0.023119810968637466,
-0.12226317077875137,
0.03958568722009659,
0.0410870760679245,
-0.06406532973051071,
-0.17129147052764893,
-0.030378086492419243,
-0.015674054622650146,
0.02357455901801586,
-0.060104917734861374,
0.08485331386327744,
-0.04692235216498375,
-0.01086586806923151,
-0.01589925028383732,
0.04247145354747772,
-0.014783337712287903,
0.15251454710960388,
0.019154759123921394,
0.024138284847140312,
-0.07136356085538864,
0.1696990728378296,
0.17710047960281372,
-0.06110062450170517,
-0.0063193547539412975,
0.16655124723911285,
-0.07451975345611572,
-0.079368457198143,
0.05929829180240631,
0.1272507607936859,
-0.05880548059940338,
-0.042973071336746216,
0.01187846902757883,
-0.0161715280264616,
0.005423251539468765,
0.0457761213183403,
-0.0016624387353658676,
0.03144114091992378,
0.013206375762820244,
-0.04360222443938255,
-0.06260073930025101,
0.05614235997200012,
0.0918751060962677,
0.02641865238547325,
-0.06956963986158371,
0.06933357566595078,
0.05876395106315613,
0.05131183937191963,
-0.02772855944931507,
-0.05953998863697052,
-0.07868366688489914,
-0.010890763252973557,
-0.036176759749650955,
0.054963551461696625,
-0.07510402053594589,
0.006488865707069635,
-0.027936743572354317,
-0.013370855711400509,
0.014397239312529564,
0.02103635109961033,
-0.02782035805284977,
-0.04673664644360542,
-0.051509179174900055,
0.14793236553668976,
-0.07368456572294235,
-0.07594897598028183,
0.07103927433490753,
-0.058465927839279175,
0.07722605764865875,
0.009628144092857838,
-0.030361272394657135,
-0.05043311417102814,
-0.10713410377502441,
-0.013508595526218414,
-0.04521092027425766,
0.011694815941154957,
0.022732678800821304,
-0.12139547616243362,
0.015635274350643158,
-0.0309381615370512,
-0.052147217094898224,
-0.016930345445871353,
0.04767836630344391,
-0.1002979651093483,
-0.007036334369331598,
-0.01589309610426426,
-0.07580292969942093,
-0.08266689628362656,
0.07668720185756683,
0.03423494100570679,
0.017137283459305763,
0.07560029625892639,
-0.07566411793231964,
0.09408999234437943,
-0.1453235149383545,
-0.011312948539853096,
0.016604790464043617,
-0.05357864126563072,
-0.0034841177985072136,
-0.0330052450299263,
0.05037344619631767,
-0.014813518151640892,
0.07275194674730301,
0.06375223398208618,
0.007947700098156929,
-0.02417859621345997,
0.022401237860322,
-0.05271276831626892,
-0.005326354410499334,
0.08323659002780914,
-0.023524923250079155,
0.03424055501818657,
0.04077473282814026,
0.010004744865000248,
-0.03202332183718681,
0.09607136249542236,
0.0046721152029931545,
0.07118383049964905,
0.02223706804215908,
0.04534687101840973,
0.04408823698759079,
-0.08431529998779297,
-0.03852425515651703,
-0.011534668505191803,
-0.037230685353279114,
0.07712038606405258,
-0.09289504587650299,
-0.0301506444811821,
0.0925387516617775,
-0.1931970715522766,
0.08596006780862808,
-0.01077218446880579,
-0.06212789937853813,
-0.023919694125652313,
-0.27920013666152954,
-0.038703154772520065,
-0.04494807869195938,
0.02907823957502842,
-0.04626454412937164,
0.05457651615142822,
0.007230259943753481,
0.030306650325655937,
-0.05011220648884773,
0.13133791089057922,
-0.07681689411401749,
-0.08015160262584686,
0.05764981731772423,
0.0142940953373909,
0.030988961458206177,
-0.05950634181499481,
0.039028432220220566,
0.01463716384023428,
0.02217189036309719,
-0.01294140238314867,
0.06825528293848038,
0.06480380147695541,
0.03812337666749954,
0.004913033451884985,
-0.10027260333299637,
0.004742816090583801,
-0.027557412162423134,
-0.0008379156934097409,
0.12734755873680115,
0.0592944361269474,
0.020826026797294617,
0.004501407500356436,
0.11504099518060684,
0.00015811108460184187,
-0.03808535262942314,
-0.15195348858833313,
0.024690674617886543,
-0.07509365677833557,
-0.03723464533686638,
0.00037274215719662607,
-0.09283588081598282,
-0.015248531475663185,
0.2688194215297699,
0.11562012881040573,
0.007464681286364794,
-0.009954776614904404,
0.010704629123210907,
0.0005220030434429646,
0.01531763281673193,
0.10888165235519409,
0.021296707913279533,
0.08217892795801163,
-0.07227961719036102,
0.08576805889606476,
-0.027277573943138123,
-0.07416243851184845,
-0.07204332202672958,
0.14566345512866974,
-0.03791088983416557,
-0.00853741355240345,
-0.01535332016646862,
0.10511009395122528,
-0.0506451353430748,
-0.1511763334274292,
0.08947090059518814,
-0.01549992710351944,
-0.09533840417861938,
-0.034151341766119,
-0.02726704627275467,
0.07930774986743927,
0.02726762369275093,
-0.004954199306666851,
0.039168886840343475,
0.18524989485740662,
0.04224863275885582,
-0.04602116346359253,
-0.08030116558074951,
0.05594513192772865,
-0.11635125428438187,
0.21658898890018463,
0.006265876814723015,
0.016317829489707947,
0.06493645906448364,
-0.01725444570183754,
-0.15858671069145203,
0.07636962085962296,
0.020979130640625954,
0.005093862768262625,
0.044973451644182205,
0.17721553146839142,
-0.029295379295945168,
0.023363381624221802,
0.012279021553695202,
-0.005244506988674402,
0.062226951122283936,
0.10250871628522873,
0.08294613659381866,
-0.07317529618740082,
0.10509618371725082,
-0.07224710285663605,
0.1518883854150772,
0.16064175963401794,
-0.03198660910129547,
-0.011634753085672855,
-0.08212420344352722,
0.028885219246149063,
0.018698468804359436,
0.007927305065095425,
-0.025613073259592056,
-0.10693210363388062,
-0.019365012645721436,
-0.1243114024400711,
0.09150373935699463,
-0.06170230358839035,
-0.015847358852624893,
-0.06099238619208336,
-0.08233759552240372,
-0.0021166247315704823,
0.12854847311973572,
0.06340809911489487,
0.04345777630805969,
-0.008889667689800262,
-0.08462096005678177,
-0.004254550207406282,
0.0799400806427002,
-0.11486704647541046,
-0.02914465218782425
] |
null | null | null |
# Neural ODE with Flax
This is the result of project ["Reproduce Neural ODE and SDE"][projectlink] in [HuggingFace Flax/JAX community week][comweeklink].
<code>main.py</code> will execute training of ResNet or OdeNet for MNIST dataset.
[projectlink]: https://discuss.huggingface.co/t/reproduce-neural-ode-and-neural-sde/7590
[comweeklink]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/jax-projects#projects
## Dependency
### JAX and Flax
For JAX installation, please follow [here][jaxinstalllink].
or simply, type
```bash
pip install jax jaxlib
```
For Flax installation,
```bash
pip install flax
```
[jaxinstalllink]: https://github.com/google/jax#installation
Tensorflow-datasets will download MNIST dataset to environment.
## How to run training
For (small) ResNet training,
```bash
python main.py --model=resnet --lr=1e-4 --n_epoch=20 --batch_size=64
```
For Neural ODE training,
```bash
python main.py --model=odenet --lr=1e-4 --n_epoch=20 --batch_size=64
```
For Continuous Normalizing Flow,
```bash
python main.py --model=cnf --sample_dataset=circles
```
Sample datasets can be chosen as circles, moons, or scurve.
# Sample Results



# Bird Call generation Score SDE
These are the codes for the bird call generation score sde model.
<code>core-sde-sampler.py</code> will execute the sampler. The sampler uses pretrained weight to generate bird calls. The weight can be found [here](https://github.com/mandelbrot-walker/Birdcall-score-sde/blob/main/ckpt.flax)
For using different sample generation parameters change the argument values. For example,
```bash
python main.py --sigma=25 --num_steps=500 --signal_to_noise_ratio=0.10 --etol=1e-5 --sample_batch_size = 128 --sample_no = 47
```
In order to generate the audios, these dependencies are required,
```bash
pip install librosa
pip install soundfile
```
In order to train the model from scratch, please generate the dataset using this [link](www.kaggle.com/ibraheemmoosa/birdsong-spectogram-generation). The dataset is generated in kaggle. Therefore, during training your username and api key is required in the specified section inside the script.
```bash
python main.py --sigma=35 --n_epochs=1000 --batch_size=512 --lr=1e-3 --num_steps=500 --signal_to_noise_ratio=0.15 --etol=1e-5 --sample_batch_size = 64 --sample_no = 23
```
Generated samples can be found [here](https://github.com/mandelbrot-walker/Birdcall-score-sde/tree/main/generated_samples)
and [here](https://colab.research.google.com/drive/1AbF4aIMkSfNs-G__MXzqY7JSrz6qvLYN)
|
{}
| null |
flax-community/NeuralODE_SDE
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Neural ODE with Flax
This is the result of project ["Reproduce Neural ODE and SDE"][projectlink] in [HuggingFace Flax/JAX community week][comweeklink].
<code>URL</code> will execute training of ResNet or OdeNet for MNIST dataset.
[projectlink]: URL
[comweeklink]: URL
## Dependency
### JAX and Flax
For JAX installation, please follow [here][jaxinstalllink].
or simply, type
For Flax installation,
[jaxinstalllink]: URL
Tensorflow-datasets will download MNIST dataset to environment.
## How to run training
For (small) ResNet training,
For Neural ODE training,
For Continuous Normalizing Flow,
Sample datasets can be chosen as circles, moons, or scurve.
# Sample Results
!cnf-viz
!cnf-viz
!cnf-viz
# Bird Call generation Score SDE
These are the codes for the bird call generation score sde model.
<code>URL</code> will execute the sampler. The sampler uses pretrained weight to generate bird calls. The weight can be found here
For using different sample generation parameters change the argument values. For example,
In order to generate the audios, these dependencies are required,
In order to train the model from scratch, please generate the dataset using this link. The dataset is generated in kaggle. Therefore, during training your username and api key is required in the specified section inside the script.
Generated samples can be found here
and here
|
[
"# Neural ODE with Flax\nThis is the result of project [\"Reproduce Neural ODE and SDE\"][projectlink] in [HuggingFace Flax/JAX community week][comweeklink].\n\n<code>URL</code> will execute training of ResNet or OdeNet for MNIST dataset.\n\n[projectlink]: URL\n\n[comweeklink]: URL",
"## Dependency",
"### JAX and Flax\n\nFor JAX installation, please follow [here][jaxinstalllink].\n\nor simply, type\n\n\nFor Flax installation,\n\n\n[jaxinstalllink]: URL\n\n\nTensorflow-datasets will download MNIST dataset to environment.",
"## How to run training\n\nFor (small) ResNet training,\n\n\nFor Neural ODE training, \n\n\nFor Continuous Normalizing Flow,\n\nSample datasets can be chosen as circles, moons, or scurve.",
"# Sample Results\n\n!cnf-viz\n!cnf-viz\n!cnf-viz",
"# Bird Call generation Score SDE \n\nThese are the codes for the bird call generation score sde model. \n\n<code>URL</code> will execute the sampler. The sampler uses pretrained weight to generate bird calls. The weight can be found here\n\nFor using different sample generation parameters change the argument values. For example,\n \nIn order to generate the audios, these dependencies are required,\n\n\nIn order to train the model from scratch, please generate the dataset using this link. The dataset is generated in kaggle. Therefore, during training your username and api key is required in the specified section inside the script. \n \nGenerated samples can be found here\nand here"
] |
[
"TAGS\n#region-us \n",
"# Neural ODE with Flax\nThis is the result of project [\"Reproduce Neural ODE and SDE\"][projectlink] in [HuggingFace Flax/JAX community week][comweeklink].\n\n<code>URL</code> will execute training of ResNet or OdeNet for MNIST dataset.\n\n[projectlink]: URL\n\n[comweeklink]: URL",
"## Dependency",
"### JAX and Flax\n\nFor JAX installation, please follow [here][jaxinstalllink].\n\nor simply, type\n\n\nFor Flax installation,\n\n\n[jaxinstalllink]: URL\n\n\nTensorflow-datasets will download MNIST dataset to environment.",
"## How to run training\n\nFor (small) ResNet training,\n\n\nFor Neural ODE training, \n\n\nFor Continuous Normalizing Flow,\n\nSample datasets can be chosen as circles, moons, or scurve.",
"# Sample Results\n\n!cnf-viz\n!cnf-viz\n!cnf-viz",
"# Bird Call generation Score SDE \n\nThese are the codes for the bird call generation score sde model. \n\n<code>URL</code> will execute the sampler. The sampler uses pretrained weight to generate bird calls. The weight can be found here\n\nFor using different sample generation parameters change the argument values. For example,\n \nIn order to generate the audios, these dependencies are required,\n\n\nIn order to train the model from scratch, please generate the dataset using this link. The dataset is generated in kaggle. Therefore, during training your username and api key is required in the specified section inside the script. \n \nGenerated samples can be found here\nand here"
] |
[
6,
88,
4,
57,
50,
19,
147
] |
[
"passage: TAGS\n#region-us \n# Neural ODE with Flax\nThis is the result of project [\"Reproduce Neural ODE and SDE\"][projectlink] in [HuggingFace Flax/JAX community week][comweeklink].\n\n<code>URL</code> will execute training of ResNet or OdeNet for MNIST dataset.\n\n[projectlink]: URL\n\n[comweeklink]: URL## Dependency### JAX and Flax\n\nFor JAX installation, please follow [here][jaxinstalllink].\n\nor simply, type\n\n\nFor Flax installation,\n\n\n[jaxinstalllink]: URL\n\n\nTensorflow-datasets will download MNIST dataset to environment.## How to run training\n\nFor (small) ResNet training,\n\n\nFor Neural ODE training, \n\n\nFor Continuous Normalizing Flow,\n\nSample datasets can be chosen as circles, moons, or scurve.# Sample Results\n\n!cnf-viz\n!cnf-viz\n!cnf-viz# Bird Call generation Score SDE \n\nThese are the codes for the bird call generation score sde model. \n\n<code>URL</code> will execute the sampler. The sampler uses pretrained weight to generate bird calls. The weight can be found here\n\nFor using different sample generation parameters change the argument values. For example,\n \nIn order to generate the audios, these dependencies are required,\n\n\nIn order to train the model from scratch, please generate the dataset using this link. The dataset is generated in kaggle. Therefore, during training your username and api key is required in the specified section inside the script. \n \nGenerated samples can be found here\nand here"
] |
[
-0.024997305124998093,
0.28504425287246704,
-0.0035065102856606245,
0.04662267118692398,
0.11823076009750366,
-0.029642120003700256,
0.02738071046769619,
0.018419310450553894,
-0.16042795777320862,
0.04818063974380493,
-0.037272386252880096,
0.09799691289663315,
0.14802226424217224,
0.11373671144247055,
0.0575922392308712,
-0.19584402441978455,
0.010310590267181396,
-0.03202366083860397,
0.028034329414367676,
-0.0036087743937969208,
0.006326415576040745,
-0.04531414061784744,
0.05097773298621178,
0.09148693829774857,
-0.039386603981256485,
0.024507850408554077,
-0.0771416649222374,
0.01648278348147869,
0.0872935876250267,
0.015206355601549149,
0.07072553038597107,
-0.012067307718098164,
0.046887438744306564,
-0.16032801568508148,
0.036460284143686295,
0.10004988312721252,
0.02060258947312832,
0.028909681364893913,
0.14717641472816467,
0.03431547060608864,
0.1827588826417923,
0.012280258350074291,
-0.04346657171845436,
0.04084722697734833,
-0.0626700222492218,
-0.2246389538049698,
-0.18177548050880432,
-0.11347334086894989,
0.06267160922288895,
0.12077359110116959,
-0.012068161740899086,
0.0699305608868599,
0.09940914064645767,
0.07809069007635117,
0.11055901646614075,
-0.05395287647843361,
-0.016893910244107246,
0.12800142168998718,
-0.0015818077372387052,
0.11924976855516434,
-0.11404595524072647,
-0.021992480382323265,
0.07646992057561874,
0.007598241791129112,
0.042890772223472595,
-0.030995110049843788,
-0.04217269644141197,
-0.00842211488634348,
-0.08672231435775757,
-0.062072813510894775,
0.07324760407209396,
-0.01783382147550583,
-0.08708415925502777,
-0.05494179576635361,
0.007918299175798893,
-0.03475748375058174,
0.003768044291064143,
-0.031655922532081604,
0.04421753063797951,
0.03351704776287079,
0.14139896631240845,
-0.14822892844676971,
-0.06185709312558174,
-0.017722493037581444,
0.033059943467378616,
-0.015309876762330532,
-0.0026594726368784904,
0.015046408399939537,
-0.07363134622573853,
0.07628989964723587,
0.04343044012784958,
-0.05432988703250885,
-0.05396352335810661,
0.0022983411327004433,
-0.1171938106417656,
-0.03806299716234207,
0.01861284114420414,
-0.23372362554073334,
-0.03839242830872536,
-0.022268051281571388,
-0.01706596277654171,
0.02979099564254284,
-0.15669456124305725,
-0.03369162604212761,
0.036614175885915756,
0.0821307972073555,
-0.044466760009527206,
-0.12062206864356995,
0.009790937416255474,
0.028038309887051582,
-0.04208414629101753,
-0.062215711921453476,
-0.002428558887913823,
-0.02468697540462017,
-0.05808039754629135,
0.07353539019823074,
0.03520232066512108,
-0.028784513473510742,
-0.05039815604686737,
-0.06794010102748871,
0.13989289104938507,
-0.05891801789402962,
0.016797270625829697,
0.06442736834287643,
0.0189297366887331,
0.09537730365991592,
0.04013636335730553,
0.022453853860497475,
-0.057365067303180695,
-0.0011449643643572927,
-0.050770048052072525,
-0.026810962706804276,
-0.01459471695125103,
-0.06703373789787292,
0.017640497535467148,
-0.12256910651922226,
-0.056868746876716614,
0.02234264463186264,
-0.15351536870002747,
-0.0906803086400032,
-0.012786388397216797,
0.028983937576413155,
-0.04450196400284767,
-0.0694267526268959,
-0.07629998028278351,
-0.07223380357027054,
0.04126459360122681,
0.08742343634366989,
-0.04509865120053291,
-0.02251059003174305,
-0.04667087271809578,
0.06267360597848892,
-0.03676696866750717,
0.06757624447345734,
0.007143187336623669,
-0.03824220225214958,
-0.039334408938884735,
0.1478089839220047,
-0.08169963210821152,
-0.006674549542367458,
-0.06580664962530136,
-0.04614446684718132,
-0.05082511901855469,
-0.058587826788425446,
0.0024479085113853216,
0.04119842126965523,
-0.29847589135169983,
-0.019185345619916916,
0.026475584134459496,
-0.09079363942146301,
0.04862360656261444,
0.0509939081966877,
-0.020682157948613167,
-0.010200864635407925,
0.15147192776203156,
0.0988764762878418,
0.25572118163108826,
-0.1278727650642395,
-0.003731277771294117,
-0.02511633187532425,
-0.09975135326385498,
-0.07330481708049774,
0.05136611685156822,
-0.021439073607325554,
-0.13481399416923523,
0.029862307012081146,
-0.11215966939926147,
0.026093117892742157,
0.06023097410798073,
-0.049546848982572556,
0.05097968503832817,
-0.02082207426428795,
-0.09217020869255066,
-0.07918346673250198,
-0.09051371365785599,
0.06799722462892532,
0.07233496010303497,
-0.03937023878097534,
0.1175471767783165,
-0.10608109831809998,
0.030441228300333023,
0.045056622475385666,
0.057853080332279205,
-0.021360717713832855,
-0.0026560218539088964,
-0.1595570147037506,
-0.025574957951903343,
0.05543673783540726,
-0.06002103164792061,
0.12210696935653687,
0.09990661591291428,
-0.01131417416036129,
0.05682777240872383,
0.06402738392353058,
0.05722616985440254,
0.02885592170059681,
-0.0628586858510971,
-0.04109594225883484,
-0.04998289421200752,
-0.042412061244249344,
-0.06427507847547531,
0.012874183245003223,
-0.2293875813484192,
-0.03410438448190689,
0.08540908992290497,
0.026348497718572617,
0.0671667829155922,
-0.11169008165597916,
0.1091824397444725,
0.0011493484489619732,
-0.006307838950306177,
-0.07180222123861313,
-0.05766160413622856,
0.0631706714630127,
-0.028938069939613342,
0.035051051527261734,
-0.12161490321159363,
-0.1618986427783966,
0.00222919718362391,
0.20535659790039062,
0.033237818628549576,
-0.023051736876368523,
-0.0023420583456754684,
-0.027438830584287643,
-0.12576574087142944,
0.05084720999002457,
0.22852934896945953,
0.10538925975561142,
0.07319017499685287,
-0.11105644702911377,
-0.02751566469669342,
-0.04632490128278732,
0.06262510269880295,
-0.034342341125011444,
0.028622044250369072,
-0.05537084862589836,
0.03690515086054802,
0.011472112499177456,
-0.1109272763133049,
0.03808126971125603,
0.2277122586965561,
0.03592479228973389,
-0.053138602524995804,
-0.032956913113594055,
0.026970282196998596,
0.006257998291403055,
0.0030216218437999487,
0.10611312091350555,
0.11063738167285919,
0.04760896414518356,
-0.029953502118587494,
0.031097572296857834,
-0.13000813126564026,
-0.012513021007180214,
0.06583382934331894,
-0.12101321667432785,
-0.029850343242287636,
-0.017816713079810143,
0.032937563955783844,
0.005823667161166668,
-0.10457611829042435,
0.007688682060688734,
-0.07657134532928467,
-0.06192977353930473,
-0.09678313881158829,
0.12135165929794312,
-0.12507642805576324,
-0.15422944724559784,
-0.20571132004261017,
0.05193806812167168,
-0.09320084750652313,
-0.0007624084246344864,
-0.008140775375068188,
-0.02336592599749565,
-0.03553997352719307,
-0.10155501216650009,
0.04105094075202942,
-0.026019655168056488,
-0.011198211461305618,
-0.007457405794411898,
-0.020050790160894394,
-0.028557173907756805,
-0.17911988496780396,
0.0566805824637413,
0.0077057937160134315,
-0.03898424282670021,
-0.061273153871297836,
0.06502766162157059,
0.043310318142175674,
0.058251649141311646,
-0.002608114155009389,
-0.038418252021074295,
0.0024634595029056072,
0.2655927836894989,
-0.06730076670646667,
0.08515125513076782,
0.11627558618783951,
0.014262630604207516,
0.12646037340164185,
0.0768793374300003,
-0.013331315480172634,
-0.05519977957010269,
-0.008906451053917408,
0.08399034291505814,
-0.04925917461514473,
-0.25072863698005676,
-0.008570154197514057,
-0.058217886835336685,
-0.001917887246236205,
0.07003598660230637,
0.03668183833360672,
0.10038630664348602,
0.03758751228451729,
-0.03666364774107933,
0.009086397476494312,
0.018868690356612206,
0.01056315004825592,
0.008221970871090889,
0.008776286616921425,
-0.004968554247170687,
-0.10673684626817703,
-0.008234955370426178,
0.10483130812644958,
0.1384781152009964,
0.14179682731628418,
0.0030159722082316875,
0.17348612844944,
0.031172949820756912,
0.020815934985876083,
0.004098648205399513,
0.15699812769889832,
-0.07546864449977875,
-0.005514746066182852,
-0.04336302727460861,
-0.10158735513687134,
-0.004403667990118265,
0.032127708196640015,
0.21870078146457672,
-0.06068168953061104,
0.06652574986219406,
-0.007781484164297581,
-0.06801573932170868,
0.04255293309688568,
-0.04306882992386818,
-0.21520711481571198,
0.06395474076271057,
-0.020079487934708595,
0.014277927577495575,
-0.03866884857416153,
-0.007094309665262699,
0.147556871175766,
-0.056286659091711044,
0.09929252415895462,
-0.10003014653921127,
0.10583976656198502,
-0.10723092406988144,
-0.007392373401671648,
0.0028620543889701366,
0.16175545752048492,
-0.03837135061621666,
0.061352211982011795,
0.044872984290122986,
0.012269331142306328,
-0.00009266445704270154,
0.06634508073329926,
-0.05184518173336983,
0.0353451743721962,
0.038443442434072495,
-0.023784838616847992,
0.07332953810691833,
0.011815373785793781,
-0.22101832926273346,
-0.09006798267364502,
-0.04665055498480797,
0.0019749875646084547,
-0.02321665920317173,
-0.03982421010732651,
0.08748740702867508,
0.031596940010786057,
0.016997504979372025,
-0.056586697697639465,
-0.11786884814500809,
-0.0881006196141243,
-0.1774570792913437,
0.03796552121639252,
0.12296329438686371,
0.051022548228502274,
-0.026908470317721367,
-0.03884526342153549,
-0.020612133666872978,
0.01886652410030365,
-0.03666727617383003,
-0.03985465690493584,
-0.12274354696273804,
-0.03485477343201637,
0.14868006110191345,
-0.07505163550376892,
-0.007340706884860992,
-0.03019765019416809,
0.13544626533985138,
-0.1137101799249649,
-0.060648877173662186,
-0.0982140451669693,
0.016127847135066986,
-0.04531600698828697,
0.019192680716514587,
0.08016101270914078,
0.09441905468702316,
-0.0075308941304683685,
0.028826214373111725,
0.02337559498846531,
0.01456782128661871,
-0.15273134410381317,
-0.044041622430086136,
0.026485785841941833,
0.025656413286924362,
0.04887648671865463,
0.027460869401693344,
-0.11881230771541595,
-0.1309540867805481,
0.044272586703300476,
0.09483737498521805,
0.12270211428403854,
-0.02476906031370163,
0.10261768847703934,
0.12386052310466766,
-0.1203533262014389,
-0.14308470487594604,
0.025346355512738228,
0.09616687148809433,
0.008509770035743713,
0.1211310476064682,
-0.30328062176704407,
0.11944945156574249,
0.06199517101049423,
0.02971835248172283,
0.08218701183795929,
-0.23904858529567719,
-0.058712173253297806,
0.013138161040842533,
0.10317238420248032,
-0.18082527816295624,
-0.019483109936118126,
-0.05486715957522392,
-0.05393800511956215,
0.033799417316913605,
0.14909173548221588,
-0.022490059956908226,
0.1252850741147995,
0.04661747068166733,
0.047824520617723465,
0.0446212962269783,
-0.016203023493289948,
0.10744523257017136,
0.13758200407028198,
0.14539916813373566,
0.05392233282327652,
0.0034170635044574738,
-0.018610458821058273,
-0.0542483888566494,
0.14201083779335022,
-0.03240325674414635,
0.040497925132513046,
-0.1851416677236557,
-0.041497595608234406,
-0.03287443518638611,
0.07180799543857574,
0.019531937316060066,
0.05115107446908951,
-0.1417323350906372,
0.027341173961758614,
0.12278767675161362,
-0.005553088150918484,
0.015754342079162598,
0.02211795374751091,
-0.06429026275873184,
0.22988033294677734,
0.05846882984042168,
0.09149066358804703,
-0.08558709919452667,
-0.031093759462237358,
0.06362269073724747,
0.00328130298294127,
0.013949597254395485,
0.008225231431424618,
0.08348724991083145,
0.05269723758101463,
0.06576761603355408,
0.011977989226579666,
-0.15755301713943481,
0.007535157259553671,
0.049445830285549164,
-0.0334770642220974,
-0.17133517563343048,
-0.0715344101190567,
-0.1149190366268158,
-0.08057539165019989,
-0.037056196480989456,
0.09027812629938126,
-0.00722553301602602,
-0.00423070415854454,
-0.02435196191072464,
0.011498451232910156,
-0.006897435523569584,
0.13081373274326324,
0.03315575420856476,
0.005216405261307955,
-0.09646738320589066,
0.11423792690038681,
0.06813056766986847,
-0.030895324423909187,
0.021252209320664406,
0.059730276465415955,
-0.010280836373567581,
-0.030779479071497917,
0.041398681700229645,
0.13371270895004272,
-0.010779669508337975,
0.01815158501267433,
0.06613223254680634,
-0.00488515617325902,
0.03625050559639931,
0.08684513717889786,
0.00981628242880106,
0.08135666698217392,
-0.12947604060173035,
-0.024561025202274323,
-0.0885859876871109,
0.06595930457115173,
0.014565921388566494,
0.02981620654463768,
-0.11387774348258972,
0.10241854190826416,
0.0014913063496351242,
0.001347847399301827,
0.018632011488080025,
-0.08008448034524918,
-0.01079594623297453,
-0.042173683643341064,
0.11932916939258575,
0.06847285479307175,
-0.08440297842025757,
0.0395025759935379,
0.005534117575734854,
0.015761474147439003,
0.019380483776330948,
0.0672665387392044,
-0.057712484151124954,
-0.06431590020656586,
-0.045718204230070114,
0.13020867109298706,
-0.1356252133846283,
0.013762185350060463,
0.015653353184461594,
-0.028688933700323105,
-0.0006394481169991195,
-0.036226823925971985,
-0.029223689809441566,
0.0426427386701107,
-0.09439527988433838,
-0.04753785952925682,
0.08541186153888702,
0.03690585121512413,
-0.019837671890854836,
-0.14581872522830963,
-0.01827353611588478,
0.005654848180711269,
0.0370088592171669,
-0.05829796940088272,
0.009691865183413029,
-0.10466159880161285,
-0.09629564732313156,
0.0011743605136871338,
-0.06795259565114975,
0.0249708890914917,
0.04383767768740654,
0.1312168389558792,
0.020057648420333862,
0.2094862163066864,
0.06391861289739609,
0.06383855640888214,
-0.13894155621528625,
-0.017431043088436127,
-0.0291172843426466,
-0.06650366634130478,
-0.09019456058740616,
-0.07441100478172302,
0.01754358969628811,
0.011203769594430923,
0.010256147012114525,
-0.010416069068014622,
-0.0853436067700386,
-0.058087632060050964,
0.10100957006216049,
-0.06498416513204575,
0.005559603683650494,
0.16028203070163727,
0.023300619795918465,
-0.026656892150640488,
0.17067565023899078,
-0.019118426367640495,
-0.004461223259568214,
0.17716212570667267,
0.1430496722459793,
0.015128275379538536,
0.08996900916099548,
0.06355103105306625,
0.07712673395872116,
0.03146488219499588,
-0.07410876452922821,
0.044289182871580124,
0.014101512730121613,
0.03877943009138107,
0.035299818962812424,
0.026710888370871544,
0.07524958997964859,
-0.16940650343894958,
0.16242408752441406,
0.1517324447631836,
-0.07378149032592773,
-0.09332924336194992,
-0.27227410674095154,
-0.028067735955119133,
-0.011648554354906082,
-0.029802195727825165,
-0.12527063488960266,
0.09237095713615417,
0.04216684773564339,
0.02591683343052864,
0.008007939904928207,
0.12506826221942902,
-0.14311005175113678,
-0.1289844810962677,
0.05671653524041176,
-0.013727454468607903,
-0.00040614508907310665,
0.09084856510162354,
0.03907684609293938,
0.15228603780269623,
0.014772695489227772,
0.090983085334301,
0.04621633514761925,
0.13446001708507538,
0.10698521137237549,
-0.08145105838775635,
-0.06828738003969193,
0.0391799621284008,
-0.0564187653362751,
-0.011641232296824455,
0.12123377621173859,
0.07031301409006119,
-0.06328748911619186,
0.03322608023881912,
0.059671808034181595,
0.018363866955041885,
-0.00732112443074584,
-0.14542551338672638,
0.13230636715888977,
0.004127529915422201,
0.02796219289302826,
-0.004777128808200359,
-0.06333339214324951,
0.02312619611620903,
0.1320818066596985,
0.06649185717105865,
-0.02938174456357956,
-0.018902089446783066,
-0.04948420077562332,
0.0079383235424757,
-0.02872353419661522,
0.06425624340772629,
-0.017893223091959953,
0.061732567846775055,
0.04037584736943245,
0.08221633732318878,
-0.08987525850534439,
-0.08360972255468369,
-0.041498370468616486,
0.011760974302887917,
-0.04936372488737106,
-0.030758507549762726,
-0.022354429587721825,
0.11585451662540436,
-0.09190598875284195,
-0.1613706648349762,
-0.0760064572095871,
0.07765866070985794,
-0.09776319563388824,
0.0004981050733476877,
-0.0038729861844331026,
0.027435388416051865,
-0.03657730668783188,
0.021080659702420235,
-0.03870749846100807,
0.1923811137676239,
-0.020332366228103638,
-0.13937467336654663,
-0.08842049539089203,
0.030762050300836563,
-0.08901648968458176,
0.12335924059152603,
0.006280383560806513,
-0.06628705561161041,
0.006931371055543423,
-0.030708488076925278,
-0.13202662765979767,
0.03801959753036499,
-0.040717996656894684,
0.04137212038040161,
0.02565869130194187,
0.11480912566184998,
0.002844632836058736,
0.07219454646110535,
-0.024200454354286194,
0.011353742331266403,
0.03725689649581909,
0.04674805328249931,
0.03038858436048031,
-0.004987963475286961,
0.0029870790895074606,
-0.11910463124513626,
0.08850977569818497,
0.07675148546695709,
0.03429283946752548,
0.06205695867538452,
0.00787899736315012,
-0.05235380679368973,
-0.021522928029298782,
0.14158234000205994,
0.0067810919135808945,
-0.10602931678295135,
-0.08093216270208359,
-0.10272257030010223,
0.06624666601419449,
-0.022660037502646446,
0.002455055946484208,
-0.009436284191906452,
-0.08775228261947632,
0.08415286242961884,
0.06926047801971436,
-0.016571758314967155,
-0.07290954887866974,
-0.015013215132057667,
-0.15619441866874695,
0.014105305075645447,
0.08445391803979874,
-0.1077100858092308,
-0.07309475541114807
] |
null | null |
transformers
|
# NOTE: We have trained newer and better Finnish RoBERTa large model which can be found from different repository: [https://huggingface.co/Finnish-NLP/roberta-large-finnish](https://huggingface.co/Finnish-NLP/roberta-large-finnish). Our future Finnish models will be available at the [Finnish-NLP](https://huggingface.co/Finnish-NLP) Hugging Face organization
# RoBERTa large model for Finnish
Pretrained model on Finnish language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). This model is case-sensitive: it
makes a difference between finnish and Finnish.
## Model description
RoBERTa is a transformers model pretrained on a large corpus of Finnish data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the Finnish language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the RoBERTa model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='flax-community/RoBERTa-large-finnish')
>>> unmasker("Moikka olen <mask> kielimalli.")
[{'sequence': 'Moikka olen uusi kielimalli.',
'score': 0.05129234120249748,
'token': 1825,
'token_str': ' uusi'},
{'sequence': 'Moikka olen toinen kielimalli.',
'score': 0.03112379088997841,
'token': 2194,
'token_str': ' toinen'},
{'sequence': 'Moikka olen myös kielimalli.',
'score': 0.025534993037581444,
'token': 491,
'token_str': ' myös'},
{'sequence': 'Moikka olen ensimmäinen kielimalli.',
'score': 0.020146571099758148,
'token': 2832,
'token_str': ' ensimmäinen'},
{'sequence': 'Moikka olen vapaa kielimalli.',
'score': 0.018089469522237778,
'token': 2257,
'token_str': ' vapaa'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import RobertaTokenizer, RobertaModel
tokenizer = RobertaTokenizer.from_pretrained('flax-community/RoBERTa-large-finnish')
model = RobertaModel.from_pretrained('flax-community/RoBERTa-large-finnish')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import RobertaTokenizer, TFRobertaModel
tokenizer = RobertaTokenizer.from_pretrained('flax-community/RoBERTa-large-finnish')
model = TFRobertaModel.from_pretrained('flax-community/RoBERTa-large-finnish', from_pt=True)
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model contains a lot of unfiltered content from the internet, which is far from
neutral. Therefore, the model can have biased predictions.
## Training data
This Finnish RoBERTa model was pretrained on the combination of two datasets:
- [mc4](https://huggingface.co/datasets/mc4), the dataset mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus. We used the Finnish subset of the mC4 dataset
- [Yle Finnish News Archive](http://urn.fi/urn:nbn:fi:lb-2017070501)
Raw datasets were cleaned to filter out bad quality and non-Finnish examples. Together these cleaned datasets were around 51GB of text.
## Training procedure
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with `<s>` and the end of one by `</s>`
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on TPUv3-8 VM, sponsored by the Hugging Face JAX/Flax community week event, for 2 epochs with a sequence length of 128 and continuing for one more epoch with a sequence length of 512. The optimizer used is Adafactor with a learning rate of 2e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and \\(\epsilon = 1e-6\\), learning rate warmup for 1500 steps and linear decay of the learning rate after.
## Evaluation results
Evaluation was done by fine-tuning the model on downstream text classification task with two different labeled datasets: [Yle News](https://github.com/spyysalo/yle-corpus) and [Eduskunta](https://github.com/aajanki/eduskunta-vkk). Yle News classification fine-tuning was done with two different sequence lengths: 128 and 512 but Eduskunta only with 128 sequence length.
When fine-tuned on those datasets, this model (the first row of the table) achieves the following accuracy results compared to the [FinBERT (Finnish BERT)](https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1) and to our newer [Finnish RoBERTa-large](https://huggingface.co/Finnish-NLP/roberta-large-finnish) trained with larger dataset:
| | Average | Yle News 128 length | Yle News 512 length | Eduskunta 128 length |
|----------------------------------------|----------|---------------------|---------------------|----------------------|
|flax-community/RoBERTa-large-finnish |87.72 |94.42 |95.06 |73.67 |
|Finnish-NLP/roberta-large-finnish |88.02 |94.53 |95.23 |74.30 |
|TurkuNLP/bert-base-finnish-cased-v1 |**88.82** |**94.90** |**95.49** |**76.07** |
To conclude, this model slightly loses to our newer [Finnish RoBERTa-large](https://huggingface.co/Finnish-NLP/roberta-large-finnish) model trained with larger dataset and also slightly loses to the [FinBERT (Finnish BERT)](https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1) model.
## Team Members
- Aapo Tanskanen, [Hugging Face profile](https://huggingface.co/aapot), [LinkedIn profile](https://www.linkedin.com/in/aapotanskanen/)
- Rasmus Toivanen [Hugging Face profile](https://huggingface.co/RASMUS), [LinkedIn profile](https://www.linkedin.com/in/rasmustoivanen/)
- Tommi Vehviläinen [Hugging Face profile](https://huggingface.co/Tommi)
Feel free to contact us for more details 🤗
|
{"language": ["fi"], "license": "apache-2.0", "tags": ["finnish", "roberta"], "datasets": ["mc4"], "widget": [{"text": "Moikka olen <mask> kielimalli."}]}
|
fill-mask
|
flax-community/RoBERTa-large-finnish
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"finnish",
"fi",
"dataset:mc4",
"arxiv:1907.11692",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.11692"
] |
[
"fi"
] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #finnish #fi #dataset-mc4 #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
NOTE: We have trained newer and better Finnish RoBERTa large model which can be found from different repository: URL Our future Finnish models will be available at the Finnish-NLP Hugging Face organization
=============================================================================================================================================================================================================
RoBERTa large model for Finnish
===============================
Pretrained model on Finnish language using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is case-sensitive: it
makes a difference between finnish and Finnish.
Model description
-----------------
RoBERTa is a transformers model pretrained on a large corpus of Finnish data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the Finnish language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the RoBERTa model as inputs.
Intended uses & limitations
---------------------------
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given text in PyTorch:
and in TensorFlow:
### Limitations and bias
The training data used for this model contains a lot of unfiltered content from the internet, which is far from
neutral. Therefore, the model can have biased predictions.
Training data
-------------
This Finnish RoBERTa model was pretrained on the combination of two datasets:
* mc4, the dataset mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus. We used the Finnish subset of the mC4 dataset
* Yle Finnish News Archive
Raw datasets were cleaned to filter out bad quality and non-Finnish examples. Together these cleaned datasets were around 51GB of text.
Training procedure
------------------
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with '~~' and the end of one by '~~'
The details of the masking procedure for each sentence are the following:
* 15% of the tokens are masked.
* In 80% of the cases, the masked tokens are replaced by ''.
* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
* In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on TPUv3-8 VM, sponsored by the Hugging Face JAX/Flax community week event, for 2 epochs with a sequence length of 128 and continuing for one more epoch with a sequence length of 512. The optimizer used is Adafactor with a learning rate of 2e-4, \(\beta\_{1} = 0.9\), \(\beta\_{2} = 0.98\) and \(\epsilon = 1e-6\), learning rate warmup for 1500 steps and linear decay of the learning rate after.
Evaluation results
------------------
Evaluation was done by fine-tuning the model on downstream text classification task with two different labeled datasets: Yle News and Eduskunta. Yle News classification fine-tuning was done with two different sequence lengths: 128 and 512 but Eduskunta only with 128 sequence length.
When fine-tuned on those datasets, this model (the first row of the table) achieves the following accuracy results compared to the FinBERT (Finnish BERT) and to our newer Finnish RoBERTa-large trained with larger dataset:
To conclude, this model slightly loses to our newer Finnish RoBERTa-large model trained with larger dataset and also slightly loses to the FinBERT (Finnish BERT) model.
Team Members
------------
* Aapo Tanskanen, Hugging Face profile, LinkedIn profile
* Rasmus Toivanen Hugging Face profile, LinkedIn profile
* Tommi Vehviläinen Hugging Face profile
Feel free to contact us for more details
|
[
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\n\nand in TensorFlow:",
"### Limitations and bias\n\n\nThe training data used for this model contains a lot of unfiltered content from the internet, which is far from\nneutral. Therefore, the model can have biased predictions.\n\n\nTraining data\n-------------\n\n\nThis Finnish RoBERTa model was pretrained on the combination of two datasets:\n\n\n* mc4, the dataset mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus. We used the Finnish subset of the mC4 dataset\n* Yle Finnish News Archive\n\n\nRaw datasets were cleaned to filter out bad quality and non-Finnish examples. Together these cleaned datasets were around 51GB of text.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\n\n\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\n\n\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on TPUv3-8 VM, sponsored by the Hugging Face JAX/Flax community week event, for 2 epochs with a sequence length of 128 and continuing for one more epoch with a sequence length of 512. The optimizer used is Adafactor with a learning rate of 2e-4, \\(\\beta\\_{1} = 0.9\\), \\(\\beta\\_{2} = 0.98\\) and \\(\\epsilon = 1e-6\\), learning rate warmup for 1500 steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nEvaluation was done by fine-tuning the model on downstream text classification task with two different labeled datasets: Yle News and Eduskunta. Yle News classification fine-tuning was done with two different sequence lengths: 128 and 512 but Eduskunta only with 128 sequence length.\nWhen fine-tuned on those datasets, this model (the first row of the table) achieves the following accuracy results compared to the FinBERT (Finnish BERT) and to our newer Finnish RoBERTa-large trained with larger dataset:\n\n\n\nTo conclude, this model slightly loses to our newer Finnish RoBERTa-large model trained with larger dataset and also slightly loses to the FinBERT (Finnish BERT) model.\n\n\nTeam Members\n------------\n\n\n* Aapo Tanskanen, Hugging Face profile, LinkedIn profile\n* Rasmus Toivanen Hugging Face profile, LinkedIn profile\n* Tommi Vehviläinen Hugging Face profile\n\n\nFeel free to contact us for more details"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #finnish #fi #dataset-mc4 #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\n\nand in TensorFlow:",
"### Limitations and bias\n\n\nThe training data used for this model contains a lot of unfiltered content from the internet, which is far from\nneutral. Therefore, the model can have biased predictions.\n\n\nTraining data\n-------------\n\n\nThis Finnish RoBERTa model was pretrained on the combination of two datasets:\n\n\n* mc4, the dataset mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus. We used the Finnish subset of the mC4 dataset\n* Yle Finnish News Archive\n\n\nRaw datasets were cleaned to filter out bad quality and non-Finnish examples. Together these cleaned datasets were around 51GB of text.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\n\n\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\n\n\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on TPUv3-8 VM, sponsored by the Hugging Face JAX/Flax community week event, for 2 epochs with a sequence length of 128 and continuing for one more epoch with a sequence length of 512. The optimizer used is Adafactor with a learning rate of 2e-4, \\(\\beta\\_{1} = 0.9\\), \\(\\beta\\_{2} = 0.98\\) and \\(\\epsilon = 1e-6\\), learning rate warmup for 1500 steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nEvaluation was done by fine-tuning the model on downstream text classification task with two different labeled datasets: Yle News and Eduskunta. Yle News classification fine-tuning was done with two different sequence lengths: 128 and 512 but Eduskunta only with 128 sequence length.\nWhen fine-tuned on those datasets, this model (the first row of the table) achieves the following accuracy results compared to the FinBERT (Finnish BERT) and to our newer Finnish RoBERTa-large trained with larger dataset:\n\n\n\nTo conclude, this model slightly loses to our newer Finnish RoBERTa-large model trained with larger dataset and also slightly loses to the FinBERT (Finnish BERT) model.\n\n\nTeam Members\n------------\n\n\n* Aapo Tanskanen, Hugging Face profile, LinkedIn profile\n* Rasmus Toivanen Hugging Face profile, LinkedIn profile\n* Tommi Vehviläinen Hugging Face profile\n\n\nFeel free to contact us for more details"
] |
[
72,
49,
163,
204,
377
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #finnish #fi #dataset-mc4 #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\n\nand in TensorFlow:### Limitations and bias\n\n\nThe training data used for this model contains a lot of unfiltered content from the internet, which is far from\nneutral. Therefore, the model can have biased predictions.\n\n\nTraining data\n-------------\n\n\nThis Finnish RoBERTa model was pretrained on the combination of two datasets:\n\n\n* mc4, the dataset mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus. We used the Finnish subset of the mC4 dataset\n* Yle Finnish News Archive\n\n\nRaw datasets were cleaned to filter out bad quality and non-Finnish examples. Together these cleaned datasets were around 51GB of text.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\n\n\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\n\n\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed)."
] |
[
-0.024754449725151062,
0.12540048360824585,
-0.004733323119580746,
0.015505122020840645,
0.031710874289274216,
0.0006881210138089955,
0.04229168966412544,
0.08868839591741562,
-0.04430896043777466,
0.08273852616548538,
0.0427568294107914,
-0.02526533603668213,
0.10933952778577805,
0.14414598047733307,
0.03251223638653755,
-0.21789176762104034,
0.07953713834285736,
-0.031624771654605865,
0.019809868186712265,
0.0901661217212677,
0.0744716078042984,
-0.08690106123685837,
0.009570139460265636,
-0.019042067229747772,
-0.004625741392374039,
-0.04922168329358101,
-0.04214964061975479,
-0.03241335600614548,
0.050070859491825104,
0.04721249267458916,
0.09747916460037231,
-0.017906062304973602,
0.010522901080548763,
-0.1317141205072403,
0.025014463812112808,
0.08240574598312378,
0.004265686962753534,
0.05227077379822731,
0.07451581954956055,
0.0523262582719326,
0.15844225883483887,
-0.08003499358892441,
-0.005536911077797413,
0.011219271458685398,
-0.08816435933113098,
-0.127985417842865,
-0.14254002273082733,
0.00981199648231268,
0.07083084434270859,
0.11111561954021454,
-0.04944948852062225,
0.12418598681688309,
-0.030273206532001495,
0.09125997871160507,
0.1385595202445984,
-0.2042878419160843,
-0.017816008999943733,
0.02488533779978752,
0.07088113576173782,
-0.03944062069058418,
-0.07628866285085678,
0.0153093496337533,
0.00400195037946105,
0.02999850921332836,
-0.021255867555737495,
-0.015264241024851799,
-0.04262609779834747,
-0.0784093365073204,
-0.09749633073806763,
-0.05264465883374214,
0.0665934830904007,
-0.011858743615448475,
-0.08103491365909576,
-0.10444412380456924,
-0.04685575142502785,
0.031964629888534546,
0.025808388367295265,
0.0028721613343805075,
-0.0021441008429974318,
0.014178052544593811,
0.005937756504863501,
-0.1311405748128891,
-0.10456904023885727,
0.0027101386804133654,
-0.058603886514902115,
0.15387210249900818,
0.042392075061798096,
0.037877682596445084,
-0.05694979056715965,
0.07364219427108765,
0.0009784844005480409,
-0.09040755033493042,
-0.042450323700904846,
-0.029008297249674797,
-0.08410447090864182,
-0.0083832498639822,
-0.053718239068984985,
-0.23954172432422638,
-0.0036054090596735477,
0.09711169451475143,
-0.04848511517047882,
0.02362077310681343,
-0.02085370011627674,
0.032332032918930054,
0.045554857701063156,
0.11697977781295776,
-0.1015460416674614,
-0.02707711048424244,
0.056584589183330536,
-0.045233353972435,
0.067594975233078,
-0.044972337782382965,
0.0019371930975466967,
0.006471933331340551,
0.046597741544246674,
0.05347064137458801,
0.061526138335466385,
0.06587523967027664,
-0.02424236387014389,
-0.048499446362257004,
0.043824635446071625,
-0.13774614036083221,
-0.00009177512401947752,
0.013341667130589485,
-0.04772307723760605,
0.0009438034030608833,
0.055760350078344345,
-0.029216019436717033,
-0.1358293741941452,
0.05758247524499893,
-0.042129036039114,
-0.024150637909770012,
-0.08113931119441986,
-0.14784866571426392,
0.03751426935195923,
-0.0793067067861557,
-0.06886695325374603,
-0.050522640347480774,
-0.1483817994594574,
-0.06724246591329575,
0.04734964668750763,
0.027809564024209976,
0.007543900515884161,
-0.029820065945386887,
-0.05223805457353592,
-0.01117685902863741,
0.010597366839647293,
0.02910337597131729,
-0.030595997348427773,
0.034607306122779846,
-0.13313916325569153,
0.036578238010406494,
0.03448300436139107,
0.013813456520438194,
-0.12521867454051971,
-0.013254323042929173,
-0.2147655338048935,
0.07103164494037628,
-0.05168269947171211,
-0.02079002931714058,
-0.05510903149843216,
-0.020604919642210007,
-0.07017579674720764,
0.0017926751170307398,
0.029507404193282127,
0.13183316588401794,
-0.15499337017536163,
-0.02682749554514885,
0.19627055525779724,
-0.13929031789302826,
0.0523931160569191,
0.14875055849552155,
-0.02492392808198929,
0.00414719944819808,
0.11113916337490082,
0.14550872147083282,
0.04079271852970123,
-0.08725671470165253,
-0.13232120871543884,
-0.044541966170072556,
-0.04197954386472702,
0.08437083661556244,
0.0319850780069828,
-0.08669205754995346,
0.07038751989603043,
0.014275400899350643,
-0.0198260098695755,
-0.013911315239965916,
-0.005247455555945635,
-0.044678106904029846,
0.047591906040906906,
0.003864416852593422,
-0.012654978781938553,
0.013062510639429092,
-0.010120482183992863,
-0.029046935960650444,
-0.0840362086892128,
-0.17019511759281158,
0.09585500508546829,
-0.08503822237253189,
0.09618924558162689,
-0.05834342911839485,
0.06198100000619888,
-0.0668850764632225,
0.04291433095932007,
-0.1634434163570404,
-0.10941778868436813,
0.032853465527296066,
-0.11957187205553055,
0.044285550713539124,
-0.03026985190808773,
0.019521882757544518,
0.03713444247841835,
-0.020844029262661934,
-0.020736470818519592,
-0.04309891164302826,
-0.05737994611263275,
-0.08604901283979416,
-0.11738650500774384,
-0.02594417706131935,
-0.06296899914741516,
0.03853660076856613,
-0.05828108638525009,
-0.010195741429924965,
0.054523568600416183,
0.1165706217288971,
0.05325804278254509,
-0.08520729094743729,
0.06844264268875122,
0.0387035608291626,
-0.005353816784918308,
-0.05754167586565018,
-0.014786841347813606,
0.027956632897257805,
-0.04682295024394989,
0.1149347573518753,
-0.1386815458536148,
-0.1520616114139557,
0.0716889351606369,
0.048513270914554596,
-0.06815554201602936,
0.11063962429761887,
-0.027282964438199997,
-0.01982487179338932,
-0.047320712357759476,
-0.012277011759579182,
0.19618454575538635,
0.047170061618089676,
0.054603833705186844,
-0.06542486697435379,
-0.012431710958480835,
0.008032678626477718,
-0.0515601709485054,
-0.04572785645723343,
0.05638003349304199,
-0.0019915143493562937,
-0.16898685693740845,
0.07147839665412903,
-0.05339687317609787,
-0.006356493569910526,
0.26272445917129517,
0.0365574024617672,
-0.11424795538187027,
-0.03379424661397934,
0.025889314711093903,
0.025693172588944435,
0.12123358249664307,
0.026328276842832565,
0.005086119286715984,
0.019091473892331123,
0.02879347838461399,
0.09598220884799957,
-0.06744900345802307,
0.047229960560798645,
0.004533919505774975,
-0.06854795664548874,
0.012704890221357346,
-0.00995495542883873,
-0.01268521323800087,
0.06752608716487885,
0.014361312612891197,
0.04916608706116676,
0.0008336022146977484,
-0.05631822720170021,
-0.0988730862736702,
0.19515888392925262,
-0.0995955765247345,
-0.26351118087768555,
-0.14317113161087036,
0.0233082827180624,
-0.05310911685228348,
-0.013390583917498589,
0.018087973818182945,
-0.03670845925807953,
-0.1051679253578186,
-0.10344407707452774,
0.06385094672441483,
0.04157905653119087,
-0.036800697445869446,
-0.04840724542737007,
-0.01869180239737034,
-0.0038292293902486563,
-0.12964743375778198,
0.021860886365175247,
-0.05014462396502495,
-0.04850415512919426,
0.027202777564525604,
0.00035839626798406243,
0.12696748971939087,
0.07249923050403595,
-0.03266407549381256,
-0.018726807087659836,
-0.0010394809069111943,
0.10249664634466171,
-0.11398600041866302,
0.06621137261390686,
-0.0010128425201401114,
-0.03705090284347534,
0.05473766848444939,
0.05921141058206558,
-0.02872643433511257,
-0.04583260044455528,
0.022782810032367706,
0.08945587277412415,
-0.035187575966119766,
-0.23832105100154877,
-0.08405012637376785,
-0.04253305867314339,
-0.004484292585402727,
-0.000357450102455914,
0.04587157070636749,
-0.020408032462000847,
-0.022415047511458397,
-0.11433150619268417,
-0.048743508756160736,
0.051210418343544006,
0.045436449348926544,
-0.04305001720786095,
-0.009146936237812042,
0.037497639656066895,
-0.07483582943677902,
0.02257935144007206,
0.08293506503105164,
-0.016389640048146248,
0.229167178273201,
-0.07236479222774506,
0.14348648488521576,
0.0541728213429451,
0.04454010725021362,
0.015837527811527252,
0.10899748653173447,
-0.027278466150164604,
0.03944982960820198,
-0.001859669340774417,
-0.062367357313632965,
-0.009331095032393932,
0.007259647827595472,
0.05014651641249657,
-0.034904271364212036,
-0.00914563238620758,
0.024919020012021065,
0.0699920803308487,
0.2658013701438904,
0.03826640173792839,
-0.08779150247573853,
-0.08151868730783463,
-0.005072996951639652,
-0.034122273325920105,
-0.0534512959420681,
-0.020828651264309883,
0.11748985201120377,
-0.10294244438409805,
0.03918353468179703,
-0.07669389992952347,
0.06563343107700348,
-0.10643990337848663,
-0.010457448661327362,
-0.016473285853862762,
0.06765986979007721,
-0.03408413380384445,
0.07557325810194016,
-0.17003615200519562,
0.1371505856513977,
0.03170201927423477,
0.1577865332365036,
-0.06723160296678543,
0.017002001404762268,
0.046574484556913376,
-0.053896043449640274,
0.1551719456911087,
0.03328245133161545,
-0.09233501553535461,
-0.06092628464102745,
-0.15550197660923004,
0.04659431055188179,
0.09147448092699051,
0.008286095224320889,
0.12945611774921417,
0.027316611260175705,
0.030334001407027245,
-0.0281277634203434,
0.03887847438454628,
-0.09046591818332672,
-0.12594956159591675,
0.05554145202040672,
-0.03142853081226349,
-0.06996916979551315,
-0.029520491138100624,
-0.061507344245910645,
-0.021887514740228653,
0.2136991322040558,
-0.135824054479599,
-0.07492852956056595,
-0.08730243891477585,
-0.011555437929928303,
0.10825710743665695,
-0.09747437387704849,
0.00239543872885406,
-0.003927342128008604,
0.09005199372768402,
-0.07402811199426651,
-0.07836217433214188,
0.04411141201853752,
-0.04339258745312691,
-0.12900374829769135,
0.001991137396544218,
0.05010901391506195,
0.10791563987731934,
0.032018549740314484,
0.0021040819119662046,
0.04972784221172333,
0.019964182749390602,
-0.10262564569711685,
-0.014688538387417793,
0.13597427308559418,
0.06529118120670319,
0.11947289109230042,
-0.0593545027077198,
-0.09988028556108475,
-0.10933411121368408,
0.007321844808757305,
0.121373251080513,
0.19032032787799835,
-0.027815118432044983,
0.1477786898612976,
0.21284039318561554,
-0.1366087943315506,
-0.2705586552619934,
-0.002222586888819933,
-0.004255571402609348,
0.09613338857889175,
-0.026724129915237427,
-0.21766766905784607,
0.018262267112731934,
0.05449815094470978,
-0.011376415379345417,
0.07080556452274323,
-0.16356459259986877,
-0.1127030998468399,
0.0856006070971489,
0.007837580516934395,
0.08879246562719345,
-0.055113546550273895,
-0.019576070830225945,
0.0006461912416853011,
0.03044245019555092,
0.07402727007865906,
-0.061301153153181076,
0.127011239528656,
0.03990674763917923,
-0.018614117056131363,
0.040112707763910294,
-0.07678675651550293,
0.09684012085199356,
-0.026788627728819847,
0.06744649261236191,
-0.05827026069164276,
0.008408209308981895,
0.13132204115390778,
-0.029560858383774757,
0.1411494016647339,
0.034049730747938156,
0.026107996702194214,
-0.07014267891645432,
-0.03843348100781441,
-0.09770762920379639,
0.09425851702690125,
-0.037659868597984314,
-0.019612858071923256,
-0.07644391804933548,
0.10071160644292831,
0.10736755281686783,
0.00559451337903738,
0.10848315060138702,
-0.04281500354409218,
0.04138122871518135,
0.0845053642988205,
0.0797228217124939,
0.009009617380797863,
-0.02358851209282875,
0.023746486753225327,
-0.027519553899765015,
0.09860005229711533,
0.019869785755872726,
0.023764625191688538,
0.09013651311397552,
0.017593616619706154,
0.10605829209089279,
0.008923619985580444,
-0.19825056195259094,
0.02188020385801792,
0.02420058660209179,
-0.19602249562740326,
-0.08478202670812607,
0.02375478856265545,
-0.09591623395681381,
-0.07654879242181778,
0.01372936088591814,
0.133570596575737,
-0.03438087925314903,
-0.0063190218061208725,
-0.020964089781045914,
0.09241653978824615,
-0.0025401883758604527,
0.14525647461414337,
0.031381551176309586,
0.008129102177917957,
-0.070083387196064,
0.17837463319301605,
0.07593286037445068,
-0.1616676151752472,
0.09316208213567734,
0.08974825590848923,
-0.06305202841758728,
-0.046650439500808716,
-0.02548118866980076,
0.1469225287437439,
0.09844668954610825,
-0.05783716216683388,
-0.1084279716014862,
-0.024001531302928925,
0.022549118846654892,
0.11801152676343918,
0.020178480073809624,
0.10007445514202118,
-0.026655543595552444,
-0.015544867143034935,
-0.12029976397752762,
0.08469416201114655,
0.07480844110250473,
-0.03670554608106613,
0.005615751724690199,
0.14035551249980927,
0.007825890555977821,
-0.024844251573085785,
-0.012673094868659973,
-0.015398004092276096,
-0.033390551805496216,
-0.023862631991505623,
0.013930807821452618,
0.012544611468911171,
-0.09280490130186081,
-0.01680191233754158,
-0.0413680300116539,
0.026307078078389168,
0.002345385029911995,
0.036004990339279175,
-0.02848813496530056,
-0.03528904169797897,
-0.06447312235832214,
-0.01985361985862255,
-0.12320496141910553,
-0.026597771793603897,
-0.01445973850786686,
-0.03954213485121727,
0.08897843956947327,
0.03947408124804497,
0.0016068039694800973,
0.0043977173045277596,
-0.07378684729337692,
0.005469317547976971,
-0.013355754315853119,
-0.0020232326351106167,
-0.0301875788718462,
-0.11403863877058029,
-0.03925614804029465,
0.007745754439383745,
-0.004779744427651167,
0.01039111241698265,
0.11766894906759262,
-0.09919573366641998,
0.06369893252849579,
0.007267081178724766,
-0.0008267228840850294,
-0.06959190219640732,
0.11421636492013931,
0.04733629152178764,
0.06275679171085358,
0.15370015799999237,
-0.0719270333647728,
0.07314302027225494,
-0.07971172779798508,
0.01739211194217205,
0.015939299017190933,
-0.011885746382176876,
-0.00846710056066513,
0.04233149439096451,
0.06061691418290138,
-0.0559053011238575,
0.06954693049192429,
-0.00044851936399936676,
-0.0394681878387928,
0.02201460301876068,
-0.035049572587013245,
-0.06598863750696182,
0.024218205362558365,
0.01094632875174284,
-0.060872625559568405,
-0.05035433918237686,
0.03749748691916466,
0.010796689428389072,
-0.009524623863399029,
0.08452683687210083,
0.2020799070596695,
0.1102234497666359,
0.2722569406032562,
0.11513172090053558,
-0.029342439025640488,
-0.06363850086927414,
-0.06314630806446075,
0.05832671746611595,
-0.01811284013092518,
0.07753177732229233,
0.04227720946073532,
-0.021193193271756172,
0.12926343083381653,
-0.1532747447490692,
0.09900607913732529,
0.03661128878593445,
-0.08831425756216049,
-0.08926978707313538,
-0.2746522128582001,
-0.03798435628414154,
0.10353343933820724,
0.00039442972047254443,
-0.14550837874412537,
0.05094281584024429,
0.059082817286252975,
0.01605285331606865,
-0.0337761789560318,
0.1503056138753891,
-0.12422729283571243,
-0.09448845684528351,
0.10034385323524475,
0.011905799619853497,
-0.009156320244073868,
0.06333105266094208,
-0.020066512748599052,
0.02625058963894844,
0.056518103927373886,
0.07824352383613586,
0.061463966965675354,
0.08715961873531342,
0.08296242356300354,
-0.03974348306655884,
-0.09452247619628906,
0.00478530814871192,
0.02013392560184002,
0.1281939595937729,
0.14638276398181915,
0.07472850382328033,
-0.06357939541339874,
-0.004399704281240702,
0.10243851691484451,
-0.025532124564051628,
-0.03552502393722534,
-0.13567228615283966,
0.10932504385709763,
0.02674819715321064,
-0.02500523068010807,
0.002430258085951209,
-0.11523820459842682,
0.05679158493876457,
0.1705264002084732,
0.19140489399433136,
0.008141704834997654,
0.007679100148379803,
-0.01711917668581009,
0.0008660929161123931,
0.02902526780962944,
0.0868324264883995,
0.008974676951766014,
0.22372470796108246,
-0.03500360623002052,
0.022218214347958565,
-0.05410198122262955,
-0.025618840008974075,
-0.08269035071134567,
0.08717833459377289,
-0.02843661420047283,
-0.013734417036175728,
-0.08906259387731552,
0.058939557522535324,
-0.09185824543237686,
-0.1939939260482788,
-0.0039962287992239,
-0.1075061783194542,
-0.1164248064160347,
0.00273289461620152,
0.03867786377668381,
0.026810284703969955,
0.11279058456420898,
0.021056411787867546,
0.03868582844734192,
0.11372386664152145,
0.01641690917313099,
-0.07838598638772964,
-0.0701218843460083,
0.026917144656181335,
-0.07200714200735092,
0.19449129700660706,
0.0548144206404686,
0.025707116350531578,
0.08001606166362762,
-0.004108808469027281,
-0.10857662558555603,
0.037192851305007935,
-0.009224694222211838,
-0.017099818214774132,
0.004124936647713184,
0.1460467278957367,
-0.02890734374523163,
0.08321196585893631,
0.050708748400211334,
-0.01644892431795597,
0.051334626972675323,
-0.04101531207561493,
-0.03461790457367897,
-0.04755208641290665,
0.08716932684183121,
-0.08750488609075546,
0.14608556032180786,
0.20160174369812012,
0.006021337117999792,
0.004844929091632366,
-0.03533300384879112,
-0.00986560806632042,
-0.01538596861064434,
0.03749113902449608,
-0.021377919241786003,
-0.162527397274971,
0.020849399268627167,
-0.11313465237617493,
0.04645002633333206,
-0.15182875096797943,
-0.028237123042345047,
0.04628517106175423,
-0.06926702708005905,
-0.022817250341176987,
0.05679618567228317,
-0.015182866714894772,
0.01124262623488903,
-0.02658722922205925,
0.07404274493455887,
0.038850847631692886,
0.06081492081284523,
-0.11810464411973953,
-0.10073184221982956
] |
null | null |
transformers
|
# Sinhala GPT2 trained on MC4 (manually cleaned)
### Overview
This is a smaller GPT2 model trained on [MC4](https://github.com/allenai/allennlp/discussions/5056) Sinhala dataset. As Sinhala is one of those low resource languages, there are only a handful of models been trained. So, this would be a great place to start training for more downstream tasks.
This model uses a manually cleaned version of MC4 dataset which can be found [here](https://huggingface.co/datasets/keshan/clean-si-mc4). Although the dataset is relatively small ~3GB. The finetuned model on [news articles](https://huggingface.co/keshan/sinhala-gpt2-newswire) generates good and acceptable results.
## Model Specification
The model chosen for training is GPT2 with the following specifications:
1. vocab_size=50257
2. n_embd=768
3. n_head=12
4. n_layer=12
5. n_positions=1024
## How to Use
You can use this model directly with a pipeline for causal language modeling:
```py
from transformers import pipeline
generator = pipeline('text-generation', model='flax-community/Sinhala-gpt2')
generator("මම", max_length=50, num_return_sequences=5)
```
|
{"language": "si", "tags": ["Sinhala", "text-generation", "gpt2"], "datasets": ["mc4"]}
|
text-generation
|
flax-community/Sinhala-gpt2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"gpt2",
"feature-extraction",
"Sinhala",
"text-generation",
"si",
"dataset:mc4",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"si"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #gpt2 #feature-extraction #Sinhala #text-generation #si #dataset-mc4 #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Sinhala GPT2 trained on MC4 (manually cleaned)
### Overview
This is a smaller GPT2 model trained on MC4 Sinhala dataset. As Sinhala is one of those low resource languages, there are only a handful of models been trained. So, this would be a great place to start training for more downstream tasks.
This model uses a manually cleaned version of MC4 dataset which can be found here. Although the dataset is relatively small ~3GB. The finetuned model on news articles generates good and acceptable results.
## Model Specification
The model chosen for training is GPT2 with the following specifications:
1. vocab_size=50257
2. n_embd=768
3. n_head=12
4. n_layer=12
5. n_positions=1024
## How to Use
You can use this model directly with a pipeline for causal language modeling:
|
[
"# Sinhala GPT2 trained on MC4 (manually cleaned)",
"### Overview\n\nThis is a smaller GPT2 model trained on MC4 Sinhala dataset. As Sinhala is one of those low resource languages, there are only a handful of models been trained. So, this would be a great place to start training for more downstream tasks.\n\nThis model uses a manually cleaned version of MC4 dataset which can be found here. Although the dataset is relatively small ~3GB. The finetuned model on news articles generates good and acceptable results.",
"## Model Specification\n\n\nThe model chosen for training is GPT2 with the following specifications:\n 1. vocab_size=50257\n 2. n_embd=768\n 3. n_head=12\n 4. n_layer=12\n 5. n_positions=1024",
"## How to Use\nYou can use this model directly with a pipeline for causal language modeling:"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #gpt2 #feature-extraction #Sinhala #text-generation #si #dataset-mc4 #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Sinhala GPT2 trained on MC4 (manually cleaned)",
"### Overview\n\nThis is a smaller GPT2 model trained on MC4 Sinhala dataset. As Sinhala is one of those low resource languages, there are only a handful of models been trained. So, this would be a great place to start training for more downstream tasks.\n\nThis model uses a manually cleaned version of MC4 dataset which can be found here. Although the dataset is relatively small ~3GB. The finetuned model on news articles generates good and acceptable results.",
"## Model Specification\n\n\nThe model chosen for training is GPT2 with the following specifications:\n 1. vocab_size=50257\n 2. n_embd=768\n 3. n_head=12\n 4. n_layer=12\n 5. n_positions=1024",
"## How to Use\nYou can use this model directly with a pipeline for causal language modeling:"
] |
[
72,
17,
110,
56,
21
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #gpt2 #feature-extraction #Sinhala #text-generation #si #dataset-mc4 #endpoints_compatible #has_space #text-generation-inference #region-us \n# Sinhala GPT2 trained on MC4 (manually cleaned)### Overview\n\nThis is a smaller GPT2 model trained on MC4 Sinhala dataset. As Sinhala is one of those low resource languages, there are only a handful of models been trained. So, this would be a great place to start training for more downstream tasks.\n\nThis model uses a manually cleaned version of MC4 dataset which can be found here. Although the dataset is relatively small ~3GB. The finetuned model on news articles generates good and acceptable results.## Model Specification\n\n\nThe model chosen for training is GPT2 with the following specifications:\n 1. vocab_size=50257\n 2. n_embd=768\n 3. n_head=12\n 4. n_layer=12\n 5. n_positions=1024## How to Use\nYou can use this model directly with a pipeline for causal language modeling:"
] |
[
-0.15760135650634766,
0.05572490021586418,
-0.002502570627257228,
0.04122080281376839,
0.08942589163780212,
-0.0035552880726754665,
0.0807962417602539,
0.10918034613132477,
-0.09000244736671448,
0.03357381746172905,
0.11336571723222733,
0.006480121519416571,
0.0676085501909256,
0.15593275427818298,
0.0791972279548645,
-0.2761014401912689,
0.008828227408230305,
0.03654683381319046,
-0.025585459545254707,
0.09989641606807709,
0.11238449066877365,
-0.044215768575668335,
0.04841860756278038,
0.0071714892983436584,
-0.12440001964569092,
0.008894359692931175,
-0.012704090215265751,
0.005097454413771629,
0.05076538771390915,
0.029973549768328667,
0.0763096883893013,
0.045895595103502274,
0.0912710428237915,
-0.1154167503118515,
0.034028343856334686,
0.04063570871949196,
0.014604680240154266,
0.018402131274342537,
0.04535940662026405,
0.06366506218910217,
0.18986749649047852,
-0.10944054275751114,
-0.006429579108953476,
0.07875386625528336,
-0.05735158920288086,
-0.10414610803127289,
-0.01997201144695282,
0.012143776752054691,
0.13325732946395874,
0.13802969455718994,
-0.043874263763427734,
0.07485952228307724,
-0.0931633859872818,
0.06341851502656937,
0.08134850859642029,
-0.21052773296833038,
-0.04810856655240059,
0.27110031247138977,
0.0673118382692337,
0.08669303357601166,
0.0018116625724360347,
0.030928945168852806,
0.019779130816459656,
0.06329553574323654,
0.09529169648885727,
-0.017013389617204666,
0.08591131120920181,
-0.010069293901324272,
-0.11827388405799866,
0.00275447778403759,
0.17654365301132202,
-0.01995108090341091,
-0.0310855433344841,
-0.13018353283405304,
-0.04739781841635704,
-0.0353781059384346,
-0.02331853099167347,
-0.005439092870801687,
-0.05141214281320572,
0.045283131301403046,
0.1212017685174942,
-0.14935430884361267,
-0.12245766073465347,
-0.040738001465797424,
0.006130645051598549,
0.014870567247271538,
0.027907196432352066,
0.047705840319395065,
-0.07659047096967697,
0.09896586090326309,
-0.038773953914642334,
-0.04937102273106575,
-0.034487176686525345,
-0.05713865906000137,
-0.09559029340744019,
0.015438931994140148,
0.044931810349226,
-0.0201497171074152,
-0.052725404500961304,
-0.0019165497506037354,
-0.04683246091008186,
0.02912631258368492,
0.11317960172891617,
0.0469924695789814,
-0.06556226313114166,
0.034377384930849075,
-0.13263796269893646,
0.03649798408150673,
0.0900096744298935,
0.016704922541975975,
0.051971435546875,
-0.024255357682704926,
-0.12148550897836685,
-0.07625816017389297,
-0.03172397240996361,
0.10521216690540314,
-0.003618813119828701,
0.0317261703312397,
0.003309195162728429,
-0.07779759168624878,
0.04703187569975853,
-0.12115239351987839,
-0.015129790641367435,
0.01939314231276512,
0.003554411232471466,
0.08755180239677429,
0.04315861314535141,
0.017738845199346542,
-0.1206267699599266,
-0.07105135917663574,
-0.035201545804739,
0.017620360478758812,
-0.08884754031896591,
-0.10154268145561218,
0.03724075108766556,
-0.02816821075975895,
-0.05991893634200096,
-0.11428192257881165,
-0.23706024885177612,
-0.01178728137165308,
0.06003604456782341,
-0.03961854800581932,
-0.09260555356740952,
-0.040935833007097244,
-0.06460653990507126,
-0.04108579084277153,
-0.0023007304407656193,
0.08420689404010773,
-0.023311132565140724,
-0.0027140795718878508,
-0.019989488646388054,
0.11113438755273819,
-0.0290658138692379,
-0.0033265533857047558,
0.02277081087231636,
0.04189508408308029,
-0.0480322539806366,
0.09371379762887955,
-0.04432433843612671,
0.049859728664159775,
-0.0791119784116745,
-0.05604490265250206,
-0.07124049961566925,
0.04982747882604599,
0.021481823176145554,
0.19005519151687622,
-0.24742721021175385,
-0.021514201536774635,
0.1791035234928131,
-0.09620033204555511,
-0.034820154309272766,
0.08977249264717102,
0.012984246015548706,
0.1854105442762375,
0.07370200008153915,
0.0672604888677597,
0.09250177443027496,
-0.12105098366737366,
0.06935099512338638,
0.013975930400192738,
-0.05933957174420357,
-0.10403551906347275,
0.07612331211566925,
-0.002591722644865513,
0.008240114897489548,
0.03370659798383713,
-0.02669159695506096,
0.04287703335285187,
-0.0365711934864521,
-0.04627395421266556,
-0.028101833537220955,
-0.06815468519926071,
-0.06169426441192627,
0.030163682997226715,
0.05806182324886322,
-0.006363072898238897,
-0.11204887926578522,
-0.01511769276112318,
0.10206849128007889,
-0.08382634818553925,
0.031967636197805405,
-0.09049975872039795,
0.11359746754169464,
-0.09891977906227112,
0.00883449800312519,
-0.10094534605741501,
-0.09803088009357452,
-0.017757313326001167,
0.01979530043900013,
0.06141659989953041,
0.0033169020898640156,
0.05702044442296028,
0.05396008491516113,
-0.04625052213668823,
0.0278750192373991,
0.028566177934408188,
-0.004573688376694918,
-0.07546943426132202,
-0.07020674645900726,
-0.025558188557624817,
-0.040660977363586426,
0.1378355622291565,
-0.2350424826145172,
0.07484080642461777,
0.04384142532944679,
0.08662550151348114,
0.02112485095858574,
-0.031017465516924858,
0.11448436975479126,
0.014825500547885895,
-0.0010814599227160215,
-0.10038173198699951,
0.05938185006380081,
-0.006164452526718378,
-0.06295576691627502,
0.09682458639144897,
-0.13359396159648895,
-0.01830187439918518,
0.0903945341706276,
0.031145533546805382,
-0.025087418034672737,
0.019880304113030434,
-0.04978254809975624,
0.009350989013910294,
-0.08403477817773819,
0.02589123509824276,
0.010147777386009693,
0.0014267966616898775,
0.10846497118473053,
-0.08551975339651108,
-0.005452923476696014,
0.002042354317381978,
0.0017368945991620421,
0.0238032229244709,
0.0837918221950531,
0.17337825894355774,
-0.05573257803916931,
0.03615002706646919,
-0.01639639027416706,
-0.0412585586309433,
0.15486791729927063,
0.02916889637708664,
-0.08573520928621292,
-0.009691386483609676,
0.03817905858159065,
0.03425327315926552,
0.11997206509113312,
-0.08918140083551407,
0.004966410342603922,
0.02248755469918251,
0.0033111083321273327,
0.1029309406876564,
-0.09526585787534714,
-0.03015899658203125,
0.01762321963906288,
-0.0948466807603836,
-0.003124311799183488,
0.095280222594738,
-0.0669950470328331,
0.06666345149278641,
-0.04663898050785065,
0.026893222704529762,
0.0006263618706725538,
0.00715772807598114,
-0.13956966996192932,
0.1932906061410904,
-0.032042406499385834,
-0.23440591990947723,
-0.1068868339061737,
0.05343572422862053,
0.02246907167136669,
-0.02141668274998665,
0.018771406263113022,
-0.1276063174009323,
-0.09208007156848907,
-0.12426183372735977,
-0.014461362734436989,
-0.03293696790933609,
-0.01104059536010027,
-0.03456319123506546,
0.016916560009121895,
-0.027744419872760773,
-0.08870068192481995,
0.0313817597925663,
0.0026977472007274628,
-0.05617108196020126,
0.04570215195417404,
-0.13976506888866425,
0.04198453575372696,
0.20704884827136993,
0.00561020290479064,
0.06965934485197067,
-0.008075687102973461,
0.2147262543439865,
-0.08316744863986969,
0.1478596329689026,
0.12204275280237198,
0.04915228486061096,
-0.00020335472072474658,
0.13445447385311127,
0.0056309946812689304,
-0.07524728775024414,
0.013274921104311943,
0.0029521954711526632,
-0.08854278922080994,
-0.24577662348747253,
-0.09833880513906479,
-0.11344574391841888,
-0.025142749771475792,
0.007770055439323187,
0.04128566384315491,
-0.05250753089785576,
0.06190698966383934,
-0.029948217794299126,
0.03410486876964569,
0.06279338151216507,
0.0718817263841629,
0.012982281856238842,
-0.02673107385635376,
0.08404921740293503,
-0.0764792412519455,
0.04692712798714638,
0.06917107850313187,
0.05058695003390312,
0.22338688373565674,
-0.030991334468126297,
0.1314389854669571,
0.08642932027578354,
0.07663673907518387,
0.10622036457061768,
0.07114699482917786,
-0.005593667738139629,
-0.007165553979575634,
0.02517220936715603,
-0.051909469068050385,
0.04902918264269829,
0.055268920958042145,
-0.007634750101715326,
-0.09693966060876846,
-0.05068693310022354,
0.1010059118270874,
-0.023019855841994286,
0.09656994789838791,
0.08186068385839462,
-0.20321039855480194,
-0.11035434156656265,
0.05389135330915451,
0.008474243804812431,
-0.05880748853087425,
0.00867056380957365,
0.17434577643871307,
-0.04625626280903816,
0.00847079697996378,
-0.023568719625473022,
0.10430194437503815,
0.009937351569533348,
-0.03090328723192215,
0.02947300672531128,
0.03674652799963951,
-0.015111882239580154,
0.07411915063858032,
-0.17746910452842712,
0.21622487902641296,
0.05110679939389229,
0.08178868889808655,
-0.07010891288518906,
-0.02239035815000534,
0.0015700978692620993,
0.13384541869163513,
0.130348801612854,
0.043459452688694,
-0.13698895275592804,
-0.06920285522937775,
-0.11716264486312866,
0.04810317978262901,
0.005281359888613224,
0.039299026131629944,
0.06465844810009003,
0.046911753714084625,
0.010088108479976654,
0.0017129926709458232,
-0.010503607802093029,
-0.2054959386587143,
-0.13166400790214539,
0.030862903222441673,
0.08377040177583694,
-0.11472741514444351,
-0.09158290922641754,
-0.10026197135448456,
0.08506622165441513,
0.22899562120437622,
-0.08735466003417969,
-0.12626467645168304,
-0.11257784068584442,
0.04437704011797905,
0.0880306139588356,
-0.05499078333377838,
0.032692912966012955,
0.018300550058484077,
0.15586622059345245,
-0.016260391101241112,
-0.047994162887334824,
0.056891825050115585,
-0.07199855893850327,
-0.0441155768930912,
0.02773190289735794,
0.14021353423595428,
0.05352839082479477,
0.03141089528799057,
0.03954717516899109,
-0.057729728519916534,
-0.009612195193767548,
-0.12036185711622238,
-0.0036982798483222723,
0.17489422857761383,
-0.05057932808995247,
0.06809085607528687,
-0.09782426804304123,
-0.12030770629644394,
-0.004583990667015314,
-0.08337393403053284,
0.16592741012573242,
0.13580220937728882,
-0.07903580367565155,
0.10609319061040878,
0.08298704773187637,
-0.09476917237043381,
-0.22516675293445587,
0.026966387405991554,
0.011361365206539631,
0.13151191174983978,
-0.04705021157860756,
-0.1353776454925537,
0.06171218305826187,
-0.008192004635930061,
-0.005284399259835482,
0.058016929775476456,
-0.2617405652999878,
-0.1391984075307846,
0.03322857245802879,
0.06891606003046036,
0.12211261689662933,
-0.09858263283967972,
-0.04359238222241402,
-0.006567836739122868,
-0.08979702740907669,
0.0806967169046402,
-0.2345535159111023,
0.10899956524372101,
-0.025697216391563416,
0.09671894460916519,
0.008266007527709007,
-0.07762647420167923,
0.12833063304424286,
0.021994754672050476,
-0.048577215522527695,
-0.05606836825609207,
0.10687398910522461,
0.12937751412391663,
-0.03285864740610123,
0.14365549385547638,
0.010180668905377388,
0.015400002710521221,
-0.1872938871383667,
-0.09847136586904526,
-0.07481745630502701,
0.004775306675583124,
-0.04110872372984886,
-0.045060329139232635,
-0.04800663888454437,
0.07514476031064987,
0.053020477294921875,
0.00887270551174879,
-0.15480080246925354,
-0.09691926091909409,
0.05931541323661804,
0.06719145178794861,
0.1716003268957138,
-0.1652996689081192,
-0.04823384806513786,
0.02807576209306717,
-0.008883440867066383,
0.03196582943201065,
-0.1863744854927063,
-0.02443867363035679,
0.08665069937705994,
-0.01923544891178608,
0.03691958263516426,
0.034078892320394516,
-0.10312281548976898,
0.019218724220991135,
0.0696328654885292,
-0.02772807516157627,
-0.10972404479980469,
-0.032707564532756805,
0.006613242439925671,
-0.0459601990878582,
-0.013320188038051128,
0.09138950705528259,
-0.08478141576051712,
0.009200154803693295,
0.0022169738076627254,
0.01678409054875374,
-0.04376070946455002,
0.09705030918121338,
0.08499573171138763,
0.06061329320073128,
-0.03183172270655632,
0.0026136687956750393,
0.04623781144618988,
-0.03251918777823448,
-0.00009423579467693344,
0.18855294585227966,
-0.17074519395828247,
-0.09747771918773651,
0.014116975478827953,
0.0006171948625706136,
-0.02243390493094921,
-0.03677354007959366,
-0.014573095366358757,
-0.08603210747241974,
0.03946933150291443,
-0.01919831708073616,
0.010034550912678242,
-0.0578354187309742,
-0.0629938393831253,
-0.025441180914640427,
-0.10132087767124176,
0.05428551137447357,
0.08871326595544815,
-0.02704007364809513,
-0.08894184231758118,
0.10949507355690002,
0.028235232457518578,
0.06655699759721756,
-0.037022095173597336,
0.004790524486452341,
-0.06004182994365692,
0.06002407893538475,
-0.1254725158214569,
0.03585117682814598,
-0.10138073563575745,
-0.04117996245622635,
-0.045338910073041916,
-0.036606159061193466,
-0.05912221968173981,
0.04021727666258812,
-0.03732921928167343,
0.00636790506541729,
-0.08439464122056961,
-0.017804251983761787,
0.04437290504574776,
-0.008226141333580017,
-0.004428425803780556,
-0.05413997545838356,
0.07374129444360733,
-0.04716057330369949,
-0.070011205971241,
0.035934578627347946,
-0.07428210973739624,
-0.013337106443941593,
0.008117136545479298,
0.03332740440964699,
0.024019386619329453,
-0.06724292039871216,
0.03120347671210766,
0.05117484927177429,
0.058814093470573425,
0.03907255083322525,
0.027662314474582672,
0.012187724001705647,
0.0024691747967153788,
-0.06456557661294937,
0.03088224120438099,
-0.05399671196937561,
0.0824771523475647,
-0.015687646344304085,
0.05841583013534546,
0.05315063148736954,
-0.07877977192401886,
0.07152121514081955,
-0.1616453379392624,
-0.005161621607840061,
-0.030656959861516953,
0.017408203333616257,
0.022094421088695526,
-0.013096130453050137,
0.083790123462677,
-0.051637496799230576,
0.17383694648742676,
-0.02117828093469143,
0.013044246472418308,
0.031646355986595154,
-0.06675675511360168,
0.08570489287376404,
-0.004371595103293657,
0.1615360975265503,
0.05770491808652878,
0.03981444239616394,
0.04186421260237694,
0.007805285509675741,
0.02978098951280117,
0.08543399721384048,
0.18806608021259308,
0.02664867974817753,
0.04906715080142021,
0.11033938825130463,
-0.02895006723701954,
-0.07164931297302246,
-0.10775303095579147,
0.00944455899298191,
-0.09272075444459915,
0.031113620847463608,
-0.06609469652175903,
0.01050913892686367,
0.2739485204219818,
-0.08509180694818497,
0.015162412077188492,
-0.007499439641833305,
-0.14192380011081696,
-0.16487930715084076,
-0.22994162142276764,
-0.0671997219324112,
-0.06496677547693253,
-0.013773269951343536,
-0.08966940641403198,
-0.04321632906794548,
0.0027271800208836794,
0.10083315521478653,
0.023279663175344467,
0.16351261734962463,
0.1212572231888771,
-0.05115815997123718,
0.02701401337981224,
-0.020120542496442795,
0.028157325461506844,
0.026780208572745323,
0.03745242580771446,
-0.023115435615181923,
-0.027073541656136513,
0.056156791746616364,
0.012039141729474068,
-0.04597243294119835,
0.0594601146876812,
-0.02975059673190117,
-0.011118380352854729,
-0.07099533081054688,
0.055244237184524536,
0.022167740389704704,
0.171714648604393,
0.03277220577001572,
-0.07720739394426346,
0.011585921980440617,
0.0818043127655983,
0.01659913919866085,
-0.17717915773391724,
-0.0999983623623848,
0.11838489770889282,
-0.0032008946873247623,
-0.022429516538977623,
-0.04069120064377785,
-0.031158803030848503,
0.023665541782975197,
0.30446359515190125,
0.24280162155628204,
-0.04391760379076004,
-0.013695635832846165,
0.001532545080408454,
-0.013816886581480503,
-0.07020191103219986,
0.15122857689857483,
0.07806984335184097,
0.2723786234855652,
-0.07068952918052673,
-0.02453000098466873,
-0.03425874561071396,
-0.039735134690999985,
-0.17680710554122925,
0.015464745461940765,
0.04077967256307602,
0.015617837198078632,
-0.010993641801178455,
0.13656654953956604,
-0.125101700425148,
0.05498756468296051,
-0.05917445197701454,
-0.02881438098847866,
-0.09383136034011841,
-0.03600337356328964,
-0.06137281283736229,
-0.03068672865629196,
0.021836623549461365,
-0.07959295064210892,
0.005945822689682245,
0.0821136012673378,
0.006126337219029665,
-0.10625365376472473,
-0.039143890142440796,
0.12480676919221878,
-0.005467269103974104,
0.11150722205638885,
0.0003527600201778114,
0.10792438685894012,
0.08663147687911987,
-0.03491337224841118,
-0.1509261280298233,
0.0805574432015419,
-0.008794606663286686,
0.04687689244747162,
-0.00482256431132555,
0.13169316947460175,
-0.07349132746458054,
-0.012981979176402092,
0.01187240518629551,
-0.042209792882204056,
0.013254962861537933,
-0.0018583128694444895,
0.004289577249437571,
-0.06296279281377792,
0.04549290984869003,
-0.06493671238422394,
0.11321235448122025,
0.07550594210624695,
-0.06039845570921898,
-0.019630076363682747,
-0.04544968158006668,
0.05144441872835159,
-0.025755438953638077,
0.07560083270072937,
0.009024427272379398,
-0.16820693016052246,
-0.001822927501052618,
-0.09810204058885574,
0.05329779535531998,
-0.2409355789422989,
-0.026119636371731758,
-0.02912810817360878,
-0.038481224328279495,
-0.05660835653543472,
0.15109959244728088,
0.10770346969366074,
0.03497900441288948,
-0.03462601825594902,
-0.07847955822944641,
-0.005572499707341194,
0.08645378053188324,
-0.17237228155136108,
-0.10173601657152176
] |
null | null |
transformers
|
## Sinhala Roberta model trained on MC4 Sinhala dataset (manually cleaned)
|
{"language": "si", "tags": ["fill-mask", "sinhala", "roberta"]}
|
fill-mask
|
flax-community/Sinhala-roberta
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"feature-extraction",
"fill-mask",
"sinhala",
"si",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"si"
] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #fill-mask #sinhala #si #endpoints_compatible #region-us
|
## Sinhala Roberta model trained on MC4 Sinhala dataset (manually cleaned)
|
[
"## Sinhala Roberta model trained on MC4 Sinhala dataset (manually cleaned)"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #fill-mask #sinhala #si #endpoints_compatible #region-us \n",
"## Sinhala Roberta model trained on MC4 Sinhala dataset (manually cleaned)"
] |
[
47,
20
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #fill-mask #sinhala #si #endpoints_compatible #region-us \n## Sinhala Roberta model trained on MC4 Sinhala dataset (manually cleaned)"
] |
[
-0.11889782547950745,
0.134704127907753,
-0.0041094277985394,
0.06529290229082108,
0.08009368926286697,
0.014261940494179726,
0.10686138272285461,
0.08905065804719925,
0.016241317614912987,
0.045083485543727875,
0.09481658041477203,
0.1301438808441162,
0.060283008962869644,
0.09354259073734283,
0.04032381996512413,
-0.298782080411911,
0.02824244648218155,
0.06718344241380692,
-0.18948614597320557,
0.12671278417110443,
0.030533527955412865,
-0.07189527899026871,
0.053270936012268066,
0.0031621241942048073,
-0.12288070470094681,
0.02777736820280552,
-0.05670987442135811,
-0.04370211437344551,
0.0898284763097763,
-0.040246278047561646,
0.15862181782722473,
0.0629049614071846,
0.06087753176689148,
-0.05891551449894905,
0.06339597702026367,
-0.05973318964242935,
-0.031288087368011475,
0.004358967300504446,
-0.003606150159612298,
0.012206390500068665,
0.1494440734386444,
0.06180388480424881,
0.019569404423236847,
-0.004978329408913851,
-0.10335119068622589,
-0.08519701659679413,
-0.016301490366458893,
0.021417269483208656,
0.07121802866458893,
0.07624593377113342,
-0.03569887951016426,
0.14448533952236176,
-0.11133918166160583,
0.06480778753757477,
0.09679603576660156,
-0.19926732778549194,
-0.04566067084670067,
0.12943901121616364,
0.10829945653676987,
0.0004847408563364297,
-0.04476436227560043,
0.08154232054948807,
0.0008574816747568548,
0.046570371836423874,
0.03295570984482765,
-0.08664217591285706,
0.035403378307819366,
0.019763199612498283,
-0.09046503156423569,
0.05313761904835701,
0.12426717579364777,
-0.02047576755285263,
0.03620176762342453,
-0.022691525518894196,
-0.09492623060941696,
-0.0016518003540113568,
-0.05908187851309776,
-0.006176912225782871,
-0.057378608733415604,
0.03092155233025551,
-0.033495642244815826,
-0.026058722287416458,
-0.07902040332555771,
-0.0302100982517004,
-0.08342207968235016,
0.13104158639907837,
0.036485444754362106,
0.04549732804298401,
-0.20245997607707977,
-0.0049170320853590965,
-0.0044986470602452755,
-0.12435141950845718,
0.02530953474342823,
-0.006323518697172403,
-0.05473100021481514,
0.03543027862906456,
0.03595883026719093,
-0.08826565742492676,
0.08835836499929428,
0.06182778626680374,
0.02765991911292076,
0.060793206095695496,
0.1062629446387291,
0.033694878220558167,
0.018299875780940056,
0.042594779282808304,
-0.13432706892490387,
-0.029753826558589935,
0.06071944907307625,
-0.028725016862154007,
0.0352933369576931,
-0.056766148656606674,
-0.050819601863622665,
-0.010943621397018433,
0.0009068981162272394,
0.06794725358486176,
-0.006194764748215675,
0.032107967883348465,
-0.03477035090327263,
-0.05986534804105759,
-0.057579342275857925,
-0.10638595372438431,
-0.0654626414179802,
0.0027054878883063793,
0.024953877553343773,
0.05429980158805847,
0.0160573348402977,
-0.022998154163360596,
-0.014404664747416973,
-0.08409779518842697,
-0.09017305076122284,
-0.026295892894268036,
-0.06903015077114105,
-0.08301356434822083,
0.006716337986290455,
-0.16133560240268707,
0.025187021121382713,
-0.12566307187080383,
-0.24313169717788696,
0.004871265962719917,
0.06275967508554459,
-0.06066959723830223,
-0.04593382403254509,
-0.014345632866024971,
-0.01919240690767765,
-0.006846349686384201,
-0.002545358380302787,
-0.012274404987692833,
-0.03654832765460014,
0.05440409108996391,
-0.02369103580713272,
0.13036544620990753,
-0.011585002765059471,
0.030600978061556816,
-0.02809872478246689,
0.08387042582035065,
-0.08898802846670151,
-0.004562157206237316,
-0.03550815209746361,
0.1476990133523941,
-0.07605641335248947,
-0.045015379786491394,
-0.12045090645551682,
0.019753532484173775,
0.006701502483338118,
0.2259465903043747,
-0.19676291942596436,
-0.08896343410015106,
0.18884210288524628,
-0.15294066071510315,
-0.14129240810871124,
0.09171837568283081,
0.027881111949682236,
0.1571510285139084,
0.024119846522808075,
0.11740164458751678,
0.0025074942968785763,
-0.04817097634077072,
0.06589291244745255,
0.019137227907776833,
-0.025654446333646774,
-0.11460810899734497,
0.06993112713098526,
-0.014039442874491215,
0.05137490853667259,
0.018975775688886642,
0.03070041537284851,
0.05456497520208359,
-0.08724754303693771,
-0.07647255063056946,
0.02728925086557865,
-0.10919532924890518,
0.019872009754180908,
0.06229482591152191,
0.10585597157478333,
0.0193458404392004,
0.010270395316183567,
-0.08084804564714432,
0.13916558027267456,
-0.06307155638933182,
0.03872993588447571,
-0.1781124621629715,
0.21129210293293,
-0.08307836204767227,
-0.015293648466467857,
-0.18010449409484863,
-0.051626235246658325,
-0.025337789207696915,
-0.05373605340719223,
-0.0032319880556315184,
-0.013956891372799873,
0.09977366030216217,
-0.019029896706342697,
-0.002242795191705227,
0.03292805328965187,
0.032042454928159714,
0.02683074399828911,
-0.03537299111485481,
-0.14146266877651215,
0.0014707219088450074,
-0.0816301479935646,
0.13471508026123047,
-0.11313381791114807,
0.026636334136128426,
0.0531218945980072,
0.11592719703912735,
0.03232257440686226,
-0.0398910753428936,
0.08384544402360916,
0.03236893564462662,
-0.010838283225893974,
-0.06434614211320877,
0.05773746594786644,
-0.008469609543681145,
-0.11576826870441437,
0.10838846862316132,
-0.08674409240484238,
0.18023839592933655,
0.11579207330942154,
-0.1419522613286972,
0.011200946755707264,
0.12777969241142273,
-0.005344362463802099,
0.012998813763260841,
-0.007877950556576252,
0.1134972795844078,
0.002431113040074706,
0.034593142569065094,
0.09840883314609528,
0.0027071149088442326,
0.013505551032721996,
0.030281279236078262,
-0.04062347859144211,
-0.036508169025182724,
0.06584907323122025,
0.21299199759960175,
-0.15696479380130768,
0.09778796136379242,
0.12335909903049469,
-0.07938723266124725,
0.19564618170261383,
0.0037249934393912554,
-0.03480003401637077,
-0.07171322405338287,
-0.03875324875116348,
0.0044226874597370625,
0.14496508240699768,
-0.2442111372947693,
-0.0416233167052269,
0.03519111126661301,
-0.09923592954874039,
0.06524814665317535,
-0.048767779022455215,
-0.07693590223789215,
-0.009913593530654907,
0.0363447405397892,
-0.09400954097509384,
0.15753541886806488,
-0.08354882895946503,
0.04544467106461525,
-0.0891096442937851,
-0.06933356821537018,
-0.004915636498481035,
-0.001340898685157299,
-0.04914255440235138,
0.22670160233974457,
-0.009579859673976898,
-0.20621436834335327,
-0.0583420991897583,
-0.12070167064666748,
0.06045471131801605,
-0.03884869068861008,
0.0020858864299952984,
-0.15282630920410156,
-0.0749569833278656,
-0.004201339557766914,
-0.045223087072372437,
-0.052897095680236816,
0.051907945424318314,
-0.05011487007141113,
0.0031741983257234097,
-0.05333009362220764,
-0.058892618864774704,
0.010593452490866184,
-0.04333196580410004,
-0.006283603608608246,
0.09562720358371735,
-0.17289447784423828,
0.09138628095388412,
0.15739238262176514,
0.0262365210801363,
0.07485947757959366,
0.049195289611816406,
0.17312493920326233,
-0.12023547291755676,
0.02519497461616993,
0.20248578488826752,
0.009935413487255573,
0.013834413141012192,
0.12912869453430176,
0.030980395153164864,
-0.07699227333068848,
-0.0510220006108284,
0.003167541231960058,
-0.12041620910167694,
-0.22679586708545685,
-0.06203895062208176,
-0.10222042351961136,
-0.016879847273230553,
0.01497834175825119,
0.024037931114435196,
-0.0196888018399477,
0.13195903599262238,
0.07526557147502899,
-0.06539709866046906,
-0.022551968693733215,
0.0029050472658127546,
-0.008047634735703468,
-0.020620504394173622,
0.08922428637742996,
-0.029789669439196587,
-0.05639893561601639,
0.004309793934226036,
0.00245048594661057,
0.20828284323215485,
0.07790084928274155,
0.09752187132835388,
0.10883110016584396,
0.2216818928718567,
0.13568800687789917,
0.09029741585254669,
0.01836758852005005,
-0.05028987303376198,
0.01244060043245554,
-0.01921161450445652,
0.007480258587747812,
0.0002829134464263916,
0.1556713730096817,
-0.07735789567232132,
0.0009623818332329392,
0.03799832612276077,
-0.034228257834911346,
0.10098342597484589,
0.07392014563083649,
-0.2252257913351059,
-0.01109706424176693,
0.04270102456212044,
0.04381660744547844,
-0.03512363135814667,
0.0256950706243515,
0.02992115169763565,
-0.05861290916800499,
0.049520548433065414,
-0.07315291464328766,
0.0674188882112503,
0.007982860319316387,
-0.008821167051792145,
-0.06152471899986267,
-0.08006273210048676,
-0.028139809146523476,
0.02157600224018097,
-0.08232879638671875,
0.3599734902381897,
0.03219761326909065,
0.01151630375534296,
-0.021119683980941772,
-0.029393067583441734,
0.07111438363790512,
0.12734289467334747,
0.18450985848903656,
0.008146564476191998,
-0.06648892164230347,
-0.06681109219789505,
-0.04916932433843613,
0.06249329820275307,
0.016253793612122536,
-0.004414071328938007,
0.055866532027721405,
0.029472310096025467,
0.015066840685904026,
-0.035742130130529404,
0.046534463763237,
-0.043737947940826416,
-0.06679918617010117,
0.05312502011656761,
-0.015412140637636185,
-0.1373506635427475,
-0.046395864337682724,
-0.12965255975723267,
-0.07177118211984634,
0.12172375619411469,
-0.031778983771800995,
-0.05831228196620941,
-0.10175937414169312,
0.06385692209005356,
0.1648208349943161,
-0.10283637046813965,
0.01703760400414467,
-0.025562388822436333,
0.06606566905975342,
-0.02474469318985939,
-0.06216638907790184,
0.057756032794713974,
-0.05471234768629074,
0.02617175132036209,
-0.019061904400587082,
0.07238300889730453,
0.051982540637254715,
0.011209381744265556,
0.0551329180598259,
0.016980692744255066,
-0.03615377098321915,
-0.06999021023511887,
0.037452954798936844,
-0.0396258570253849,
-0.06307748705148697,
0.10479952394962311,
-0.07176832109689713,
-0.050949595868587494,
-0.01193939708173275,
-0.06628666818141937,
0.17009666562080383,
0.13209135830402374,
-0.027438253164291382,
0.057983171194791794,
0.18648064136505127,
-0.07801990956068039,
-0.3103589713573456,
-0.10113193094730377,
-0.10834752023220062,
0.12031392753124237,
0.08207378536462784,
-0.13502618670463562,
0.109412781894207,
-0.02903139404952526,
-0.03723829612135887,
-0.08490332961082458,
-0.2093464583158493,
-0.11514116823673248,
0.20316459238529205,
0.1528758555650711,
0.3151754140853882,
-0.14638014137744904,
-0.03397654742002487,
-0.031097324565052986,
-0.05432821065187454,
-0.0662744790315628,
-0.19748620688915253,
0.08786238729953766,
-0.05793748050928116,
0.08136089891195297,
0.01878555677831173,
-0.0617356114089489,
0.13623258471488953,
0.012543899938464165,
0.005302608013153076,
-0.060530804097652435,
-0.06687206774950027,
0.17476892471313477,
0.01811995729804039,
-0.001542461570352316,
0.05708056315779686,
0.0011619015131145716,
-0.1239958181977272,
-0.014080187305808067,
-0.04194515943527222,
0.052421312779188156,
0.012977964244782925,
-0.005202123895287514,
-0.0031255169305950403,
0.09634389728307724,
0.04308238625526428,
0.029963893815875053,
0.12878476083278656,
-0.03225138038396835,
0.14044681191444397,
0.009481946006417274,
0.11651954054832458,
-0.06312000751495361,
-0.09386729449033737,
-0.04575645178556442,
-0.01686101034283638,
0.027282381430268288,
-0.0723397359251976,
0.005784713663160801,
0.07499668747186661,
0.07282031327486038,
0.04506130889058113,
0.08036227524280548,
-0.04839717224240303,
0.07035290449857712,
0.13281899690628052,
-0.032770365476608276,
-0.11207903176546097,
0.02570091187953949,
-0.12039843946695328,
-0.013834659941494465,
0.040184348821640015,
0.09500055760145187,
0.007773886434733868,
-0.01921439729630947,
-0.03809821605682373,
0.013519743457436562,
-0.06096974387764931,
0.1668614149093628,
0.06176910921931267,
0.03981170058250427,
-0.10004662722349167,
0.023983154445886612,
-0.0058291335590183735,
-0.11283904314041138,
-0.04208427295088768,
0.09994110465049744,
-0.12654583156108856,
-0.10817212611436844,
0.021267184987664223,
0.1008264422416687,
-0.05309409648180008,
-0.011761021800339222,
-0.16106054186820984,
-0.06616108864545822,
0.004971744026988745,
0.19930224120616913,
0.07186754792928696,
-0.0004884805530309677,
-0.07858436554670334,
-0.05175720155239105,
-0.07382810115814209,
0.037716094404459,
0.10535331815481186,
-0.041297685354948044,
-0.13507312536239624,
0.034777332097291946,
0.03336656466126442,
0.13769669830799103,
-0.09030330926179886,
-0.051474716514348984,
-0.05722793564200401,
0.0642000362277031,
-0.02891259454190731,
0.0002496601373422891,
-0.13187897205352783,
-0.06566847115755081,
-0.060341205447912216,
-0.0353117398917675,
-0.12799490988254547,
-0.0017092630732804537,
-0.09888647496700287,
0.06353273242712021,
-0.00030679936753585935,
0.010060286149382591,
0.025893516838550568,
-0.03509271889925003,
0.09732171148061752,
-0.05503243952989578,
0.06976063549518585,
0.08568760007619858,
-0.020462799817323685,
0.10011296719312668,
-0.024849088862538338,
-0.08723652362823486,
-0.0037074799183756113,
0.05236820504069328,
0.05468762665987015,
-0.07334531843662262,
0.0021012392826378345,
0.008076436817646027,
0.019307641312479973,
0.04231267422437668,
0.03069043532013893,
-0.015106016770005226,
-0.019367538392543793,
-0.09917421638965607,
-0.05930149555206299,
-0.038585115224123,
0.03975628316402435,
0.04590049758553505,
0.06395439058542252,
0.0737111046910286,
-0.048869337886571884,
0.05313047394156456,
-0.11848428100347519,
0.026592737063765526,
-0.03335591405630112,
-0.022202685475349426,
0.01752578653395176,
-0.02261650748550892,
0.045621950179338455,
-0.08262749761343002,
0.17171809077262878,
-0.01548893190920353,
-0.03322630375623703,
0.023618653416633606,
0.02516929619014263,
0.038929253816604614,
-0.023916393518447876,
0.2589389681816101,
0.09700000286102295,
0.014347751624882221,
0.0874834656715393,
0.03206909820437431,
-0.033425040543079376,
0.21692323684692383,
0.03874420002102852,
0.06476098299026489,
0.08347585052251816,
0.0798976942896843,
0.029416117817163467,
0.027887310832738876,
-0.0015876704128459096,
-0.026463935151696205,
-0.09563394635915756,
0.04737643897533417,
0.06969571113586426,
0.03506737947463989,
0.28865668177604675,
-0.0432295948266983,
0.026068054139614105,
-0.013470303267240524,
-0.10109660029411316,
-0.19195754826068878,
-0.17854708433151245,
-0.11387532204389572,
0.0019176010973751545,
0.05830909311771393,
-0.08118807524442673,
-0.06409001350402832,
0.17770828306674957,
0.0725400447845459,
0.01944030448794365,
0.11981761455535889,
0.05666306987404823,
-0.021702099591493607,
0.022796478122472763,
0.01767915114760399,
-0.04293195158243179,
0.06652723252773285,
0.02928199991583824,
-0.08870011568069458,
-0.034868475049734116,
-0.02568686753511429,
-0.06974823027849197,
-0.0816493108868599,
0.0849938914179802,
-0.06869114935398102,
-0.0855422392487526,
-0.07783118635416031,
0.012094194069504738,
0.013920838944613934,
0.06543175131082535,
0.0014083520509302616,
-0.030912963673472404,
-0.010837624780833721,
0.05874968320131302,
0.02493174560368061,
-0.02382059209048748,
-0.12619933485984802,
0.07371462881565094,
0.05212854966521263,
-0.007557783275842667,
0.015815164893865585,
0.05114275589585304,
0.050067443400621414,
0.2590988278388977,
0.2676178514957428,
-0.08446668833494186,
0.032138630747795105,
0.05876995995640755,
0.003982988186180592,
-0.06654351204633713,
0.03794892504811287,
0.026581501588225365,
0.1492457240819931,
-0.0950111597776413,
-0.08067169785499573,
-0.13668012619018555,
-0.06487240642309189,
-0.147722989320755,
0.011738330125808716,
0.07866017520427704,
-0.0015820873668417335,
-0.0334894135594368,
0.1679898500442505,
-0.14028197526931763,
-0.00989184807986021,
0.07200632989406586,
-0.1846097707748413,
-0.1513853371143341,
-0.09390281140804291,
-0.07308629900217056,
0.07570941746234894,
0.06737121939659119,
-0.10579492151737213,
-0.013526699505746365,
0.0012741273967549205,
0.08326379954814911,
-0.19820889830589294,
-0.08126797527074814,
0.18319514393806458,
0.03862430155277252,
0.09043054282665253,
-0.04093475267291069,
0.029396897181868553,
0.1113036498427391,
-0.0075909653678536415,
-0.012898304499685764,
0.07523516565561295,
0.01738610491156578,
0.0038304575718939304,
-0.03933921456336975,
0.1321166455745697,
-0.04699934273958206,
-0.06389233469963074,
0.04967916011810303,
0.02819708362221718,
0.04584478214383125,
-0.08916882425546646,
-0.04544083774089813,
-0.02282099239528179,
0.08528923243284225,
-0.052675437182188034,
0.08775508403778076,
0.08997409790754318,
-0.02673961967229843,
-0.025866176933050156,
-0.03229355439543724,
0.041365355253219604,
0.04551957547664642,
-0.03678300231695175,
-0.12546373903751373,
-0.14392608404159546,
-0.03981668874621391,
-0.20405495166778564,
-0.020536387339234352,
-0.19383279979228973,
-0.042131271213293076,
-0.08784805238246918,
-0.034462928771972656,
-0.0788469985127449,
0.08749853819608688,
0.09416396915912628,
0.034426264464855194,
0.0027873495128005743,
-0.00431898282840848,
0.008397703059017658,
0.045491259545087814,
-0.21413853764533997,
-0.11812637001276016
] |
null | null |
transformers
|
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Update:</b>
This model has been moved to <a href="https://huggingface.co/linhd-postdata/alberti-bert-base-multilingual-cased">linhd-postdata/alberti-bert-base-multilingual-cased</a>, where it will be maintained and updated.
</p>
</div>
# ALBERTI
ALBERTI is a set of two BERT-based multilingual model for poetry. One for verses and another one for stanzas. This model has been further trained with the PULPO corpus for verses using [Flax](https://github.com/google/flax), including training scripts.
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## PULPO
PULPO, the Prodigious Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 95M words.
The following corpora has been downloaded using the [Averell](https://github.com/linhd-postdata/averell/) tool, developed by the [POSTDATA](https://postdata.linhd.uned.es/) team:
### Spanish
- [Disco v3](https://github.com/pruizf/disco)
- [Corpus of Spanish Golden-Age Sonnets](https://github.com/bncolorado/CorpusSonetosSigloDeOro)
- [Corpus general de poesía lírica castellana del Siglo de Oro](https://github.com/bncolorado/CorpusGeneralPoesiaLiricaCastellanaDelSigloDeOro)
- [Gongocorpus](https://github.com/linhd-postdata/gongocorpus) - [source](http://obvil.sorbonne-universite.site/corpus/gongora/gongora_obra-poetica)
### English
- [Eighteenth-Century Poetry Archive (ECPA)](https://github.com/alhuber1502/ECPA)
- [For better for verse](https://github.com/waynegraham/for_better_for_verse)
### French
- [Métrique en Ligne](https://crisco2.unicaen.fr/verlaine/index.php?navigation=accueil) - [source](https://github.com/linhd-postdata/metrique-en-ligne)
### Italian
- [Biblioteca italiana](https://github.com/linhd-postdata/biblioteca_italiana) - [source](http://www.bibliotecaitaliana.it/)
### Czech
- [Corpus of Czech Verse](https://github.com/versotym/corpusCzechVerse)
### Portuguese
- [Stichotheque](https://gitlab.com/stichotheque/stichotheque-pt)
Also, we obtained the following corpora from these sources:
### Spanish
- [Poesi.as](https://github.com/linhd-postdata/poesi.as) - [source](http://www.poesi.as/)
### English
- [A Gutenberg Poetry Corpus](https://github.com/aparrish/gutenberg-poetry-corpus)
### Arabic
- [Arabic Poetry dataset](https://www.kaggle.com/ahmedabelal/arabic-poetry)
### Chinese
- [THU Chinese Classical Poetry Corpus](https://github.com/THUNLP-AIPoet/Datasets/tree/master/CCPC)
### Finnish
- [SKVR](https://github.com/sks190/SKVR)
### German
- [TextGrid Poetry Corpus](https://github.com/linhd-postdata/textgrid-poetry) - [source](https://textgrid.de/en/digitale-bibliothek)
- [German Rhyme Corpus](https://github.com/tnhaider/german-rhyme-corpus)
### Hungarian
- [verskorpusz](https://github.com/ELTE-DH/verskorpusz)
### Portuguese
- [Poems in Portuguese](https://www.kaggle.com/oliveirasp6/poems-in-portuguese)
### Russian
- [19 000 Russian poems](https://www.kaggle.com/grafstor/19-000-russian-poems)
## Team members
- Álvaro Pérez ([alvp](https://huggingface.co/alvp))
- Javier de la Rosa ([versae](https://huggingface.co/versae))
- Aitor Díaz ([aitordiaz](https://huggingface.co/aitordiaz))
- Elena González-Blanco
- Salvador Ros ([salva](https://huggingface.co/salva))
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
- [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/alberti-bert-base-multilingual-cased/)
## Acknowledgments
This project would not have been possible without the infrastructure and resources provided by HuggingFace and Google Cloud. Moreover, we want to thank POSTDATA Project (ERC-StG-679528) and the Computational Literary Studies Infrastructure (CLS INFRA No. 101004984) of the European Union's Horizon 2020 research and innovation programme for their support and time allowance.
|
{"language": "es", "license": "cc-by-4.0", "tags": ["multilingual", "bert"], "pipeline_tag": "fill-mask", "widget": [{"text": "\u00bfQu\u00e9 es la vida? Un [MASK]."}]}
|
fill-mask
|
flax-community/alberti-bert-base-multilingual-cased
|
[
"transformers",
"pytorch",
"jax",
"joblib",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"es",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #joblib #safetensors #bert #fill-mask #multilingual #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Update:</b>
This model has been moved to <a href="URL where it will be maintained and updated.
</p>
</div>
# ALBERTI
ALBERTI is a set of two BERT-based multilingual model for poetry. One for verses and another one for stanzas. This model has been further trained with the PULPO corpus for verses using Flax, including training scripts.
This is part of the
Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## PULPO
PULPO, the Prodigious Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 95M words.
The following corpora has been downloaded using the Averell tool, developed by the POSTDATA team:
### Spanish
- Disco v3
- Corpus of Spanish Golden-Age Sonnets
- Corpus general de poesía lírica castellana del Siglo de Oro
- Gongocorpus - source
### English
- Eighteenth-Century Poetry Archive (ECPA)
- For better for verse
### French
- Métrique en Ligne - source
### Italian
- Biblioteca italiana - source
### Czech
- Corpus of Czech Verse
### Portuguese
- Stichotheque
Also, we obtained the following corpora from these sources:
### Spanish
- URL - source
### English
- A Gutenberg Poetry Corpus
### Arabic
- Arabic Poetry dataset
### Chinese
- THU Chinese Classical Poetry Corpus
### Finnish
- SKVR
### German
- TextGrid Poetry Corpus - source
- German Rhyme Corpus
### Hungarian
- verskorpusz
### Portuguese
- Poems in Portuguese
### Russian
- 19 000 Russian poems
## Team members
- Álvaro Pérez (alvp)
- Javier de la Rosa (versae)
- Aitor Díaz (aitordiaz)
- Elena González-Blanco
- Salvador Ros (salva)
## Useful links
- Community Week timeline
- Community Week README
- Community Week thread
- Community Week channel
- Masked Language Modelling example scripts
- Model Repository
## Acknowledgments
This project would not have been possible without the infrastructure and resources provided by HuggingFace and Google Cloud. Moreover, we want to thank POSTDATA Project (ERC-StG-679528) and the Computational Literary Studies Infrastructure (CLS INFRA No. 101004984) of the European Union's Horizon 2020 research and innovation programme for their support and time allowance.
|
[
"# ALBERTI\n\nALBERTI is a set of two BERT-based multilingual model for poetry. One for verses and another one for stanzas. This model has been further trained with the PULPO corpus for verses using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## PULPO\n\nPULPO, the Prodigious Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 95M words.\n\nThe following corpora has been downloaded using the Averell tool, developed by the POSTDATA team:",
"### Spanish\n- Disco v3\n- Corpus of Spanish Golden-Age Sonnets\n- Corpus general de poesía lírica castellana del Siglo de Oro\n- Gongocorpus - source",
"### English\n- Eighteenth-Century Poetry Archive (ECPA)\n- For better for verse",
"### French\n- Métrique en Ligne - source",
"### Italian\n- Biblioteca italiana - source",
"### Czech\n- Corpus of Czech Verse",
"### Portuguese\n- Stichotheque\n\nAlso, we obtained the following corpora from these sources:",
"### Spanish \n- URL - source",
"### English\n- A Gutenberg Poetry Corpus",
"### Arabic\n- Arabic Poetry dataset",
"### Chinese\n- THU Chinese Classical Poetry Corpus",
"### Finnish\n- SKVR",
"### German\n- TextGrid Poetry Corpus - source\n- German Rhyme Corpus",
"### Hungarian\n- verskorpusz",
"### Portuguese\n- Poems in Portuguese",
"### Russian\n- 19 000 Russian poems",
"## Team members\n\n- Álvaro Pérez (alvp)\n- Javier de la Rosa (versae)\n- Aitor Díaz (aitordiaz)\n- Elena González-Blanco\n- Salvador Ros (salva)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository",
"## Acknowledgments\n\nThis project would not have been possible without the infrastructure and resources provided by HuggingFace and Google Cloud. Moreover, we want to thank POSTDATA Project (ERC-StG-679528) and the Computational Literary Studies Infrastructure (CLS INFRA No. 101004984) of the European Union's Horizon 2020 research and innovation programme for their support and time allowance."
] |
[
"TAGS\n#transformers #pytorch #jax #joblib #safetensors #bert #fill-mask #multilingual #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# ALBERTI\n\nALBERTI is a set of two BERT-based multilingual model for poetry. One for verses and another one for stanzas. This model has been further trained with the PULPO corpus for verses using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## PULPO\n\nPULPO, the Prodigious Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 95M words.\n\nThe following corpora has been downloaded using the Averell tool, developed by the POSTDATA team:",
"### Spanish\n- Disco v3\n- Corpus of Spanish Golden-Age Sonnets\n- Corpus general de poesía lírica castellana del Siglo de Oro\n- Gongocorpus - source",
"### English\n- Eighteenth-Century Poetry Archive (ECPA)\n- For better for verse",
"### French\n- Métrique en Ligne - source",
"### Italian\n- Biblioteca italiana - source",
"### Czech\n- Corpus of Czech Verse",
"### Portuguese\n- Stichotheque\n\nAlso, we obtained the following corpora from these sources:",
"### Spanish \n- URL - source",
"### English\n- A Gutenberg Poetry Corpus",
"### Arabic\n- Arabic Poetry dataset",
"### Chinese\n- THU Chinese Classical Poetry Corpus",
"### Finnish\n- SKVR",
"### German\n- TextGrid Poetry Corpus - source\n- German Rhyme Corpus",
"### Hungarian\n- verskorpusz",
"### Portuguese\n- Poems in Portuguese",
"### Russian\n- 19 000 Russian poems",
"## Team members\n\n- Álvaro Pérez (alvp)\n- Javier de la Rosa (versae)\n- Aitor Díaz (aitordiaz)\n- Elena González-Blanco\n- Salvador Ros (salva)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository",
"## Acknowledgments\n\nThis project would not have been possible without the infrastructure and resources provided by HuggingFace and Google Cloud. Moreover, we want to thank POSTDATA Project (ERC-StG-679528) and the Computational Literary Studies Infrastructure (CLS INFRA No. 101004984) of the European Union's Horizon 2020 research and innovation programme for their support and time allowance."
] |
[
66,
89,
62,
39,
23,
12,
8,
9,
22,
7,
9,
8,
11,
7,
16,
8,
13,
9,
44,
36,
89
] |
[
"passage: TAGS\n#transformers #pytorch #jax #joblib #safetensors #bert #fill-mask #multilingual #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# ALBERTI\n\nALBERTI is a set of two BERT-based multilingual model for poetry. One for verses and another one for stanzas. This model has been further trained with the PULPO corpus for verses using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## PULPO\n\nPULPO, the Prodigious Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 95M words.\n\nThe following corpora has been downloaded using the Averell tool, developed by the POSTDATA team:### Spanish\n- Disco v3\n- Corpus of Spanish Golden-Age Sonnets\n- Corpus general de poesía lírica castellana del Siglo de Oro\n- Gongocorpus - source### English\n- Eighteenth-Century Poetry Archive (ECPA)\n- For better for verse### French\n- Métrique en Ligne - source### Italian\n- Biblioteca italiana - source### Czech\n- Corpus of Czech Verse### Portuguese\n- Stichotheque\n\nAlso, we obtained the following corpora from these sources:### Spanish \n- URL - source### English\n- A Gutenberg Poetry Corpus### Arabic\n- Arabic Poetry dataset### Chinese\n- THU Chinese Classical Poetry Corpus### Finnish\n- SKVR### German\n- TextGrid Poetry Corpus - source\n- German Rhyme Corpus### Hungarian\n- verskorpusz### Portuguese\n- Poems in Portuguese### Russian\n- 19 000 Russian poems## Team members\n\n- Álvaro Pérez (alvp)\n- Javier de la Rosa (versae)\n- Aitor Díaz (aitordiaz)\n- Elena González-Blanco\n- Salvador Ros (salva)## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
-0.028932010754942894,
0.23802033066749573,
-0.007804574444890022,
0.059777673333883286,
0.03322666883468628,
0.011752432212233543,
0.14672045409679413,
0.0754505842924118,
0.04451770335435867,
0.11032755672931671,
-0.011263799853622913,
0.05391377955675125,
0.08540279418230057,
0.009969541803002357,
0.04826907068490982,
-0.24385899305343628,
0.01429043523967266,
-0.09202864021062851,
-0.04820070415735245,
0.1003635823726654,
0.13010165095329285,
0.003157644532620907,
0.0422838032245636,
-0.01221290323883295,
0.08588659763336182,
0.024503406137228012,
-0.07086604088544846,
-0.10322016477584839,
0.05607159808278084,
0.03843722864985466,
0.012440628372132778,
0.036972761154174805,
-0.0033252404537051916,
-0.17692020535469055,
-0.0006885628681629896,
0.041655268520116806,
-0.068792924284935,
-0.04860885068774223,
0.21413853764533997,
-0.0983310267329216,
0.1549609750509262,
-0.1965339332818985,
0.013208145275712013,
0.008542603813111782,
-0.14157524704933167,
-0.11056621372699738,
-0.10477381944656372,
0.05640535056591034,
0.09193398803472519,
0.07502293586730957,
-0.05469880998134613,
0.08600037544965744,
-0.14243559539318085,
0.013745691627264023,
0.10641877353191376,
-0.176685631275177,
-0.10607818514108658,
0.002799775218591094,
0.13336281478405,
0.0849623829126358,
-0.10708238184452057,
0.020017441362142563,
-0.013110874220728874,
0.031978946179151535,
-0.04314161092042923,
-0.05582782253623009,
0.025427740067243576,
-0.011478092521429062,
-0.10458537936210632,
-0.04764203354716301,
0.13630659878253937,
-0.028419073671102524,
-0.007688951678574085,
-0.09872470796108246,
0.0001434469741070643,
0.15404030680656433,
-0.08928286284208298,
0.002019650535658002,
0.03548647090792656,
-0.0025512054562568665,
0.13869448006153107,
0.04142256826162338,
-0.07255615293979645,
0.03298819810152054,
0.03905527666211128,
0.02804126776754856,
0.02523648738861084,
-0.04050255939364433,
-0.002609768183901906,
0.019390713423490524,
-0.03065033257007599,
-0.07948337495326996,
0.010627144947648048,
-0.01246078684926033,
-0.008424290455877781,
0.017177622765302658,
0.07279280573129654,
-0.004098991397768259,
0.07051999121904373,
0.2146139144897461,
-0.031044255942106247,
0.015136304311454296,
0.06797918677330017,
-0.033031150698661804,
0.07261865586042404,
0.04977688565850258,
-0.0030155791901052,
-0.13999593257904053,
-0.10201313346624374,
-0.01131745707243681,
-0.02940402925014496,
0.044654060155153275,
-0.013767320662736893,
-0.025171848013997078,
-0.038696132600307465,
0.05719322711229324,
0.13032597303390503,
0.04061443731188774,
-0.05979938805103302,
-0.005683451890945435,
0.19641748070716858,
-0.09954284876585007,
0.06650052964687347,
0.05882427468895912,
-0.020876608788967133,
0.06670466810464859,
-0.06935058534145355,
-0.04442346468567848,
-0.0945887491106987,
0.06317895650863647,
-0.0330563522875309,
-0.010899213142693043,
-0.012629122473299503,
-0.08865797519683838,
0.1092456579208374,
-0.14995461702346802,
-0.04999149218201637,
-0.11265263706445694,
0.013681423850357533,
-0.13821330666542053,
-0.02096693404018879,
-0.09993895143270493,
-0.06363923847675323,
-0.07536924630403519,
-0.04560317099094391,
0.03743737190961838,
-0.0007091581355780363,
-0.003524614730849862,
-0.10867568105459213,
0.049384962767362595,
-0.06105424463748932,
0.030968492850661278,
0.032231613993644714,
0.028168069198727608,
-0.02213345654308796,
0.01660933904349804,
-0.12429328262805939,
0.07862386107444763,
-0.10198600590229034,
0.03985023498535156,
-0.11168739199638367,
0.04166799411177635,
0.01818675547838211,
0.044187095016241074,
-0.04793502762913704,
0.1656903624534607,
-0.1489754170179367,
-0.0854029580950737,
0.14269399642944336,
-0.04551015421748161,
0.029076697304844856,
0.1394396275281906,
0.03918655589222908,
0.003027022583410144,
0.09055661410093307,
0.11198794096708298,
0.08555254340171814,
-0.09634597599506378,
-0.056918539106845856,
-0.021386196836829185,
0.056164465844631195,
0.1142026036977768,
0.06927705556154251,
-0.05600441247224808,
0.1867705136537552,
0.009591293521225452,
-0.056898538023233414,
-0.04621146246790886,
-0.013067398220300674,
-0.03892968222498894,
0.035365987569093704,
0.025558125227689743,
0.05481364205479622,
-0.014586458913981915,
-0.04512542113661766,
-0.08380109071731567,
-0.10255938023328781,
-0.056390926241874695,
0.007735781837254763,
0.07589419931173325,
0.040668413043022156,
-0.028433553874492645,
0.11545564234256744,
0.11365898698568344,
0.03286398574709892,
-0.08420879393815994,
0.030062202364206314,
0.06587397307157516,
0.0755016952753067,
0.053190458565950394,
-0.06900728493928909,
0.011957088485360146,
-0.03168593719601631,
-0.020788386464118958,
0.013586203567683697,
-0.002655383199453354,
-0.035259559750556946,
-0.052825678139925,
-0.14222979545593262,
0.05160459131002426,
-0.012187381274998188,
0.11307637393474579,
-0.1830262839794159,
-0.00024379367823712528,
0.13925425708293915,
0.09800100326538086,
-0.01827920600771904,
0.03399953618645668,
-0.004084978252649307,
0.09578871726989746,
-0.03134892135858536,
-0.007092085666954517,
0.06917119026184082,
-0.03677815943956375,
-0.03360658138990402,
0.1545969694852829,
-0.11626185476779938,
-0.1998073011636734,
0.06597032397985458,
-0.005785604938864708,
-0.05098298564553261,
0.04062560945749283,
-0.024857699871063232,
-0.03648844361305237,
-0.02615411765873432,
-0.039769284427165985,
0.08342422544956207,
-0.008724812418222427,
0.03934420645236969,
-0.1349732130765915,
-0.06716176122426987,
-0.03041367046535015,
-0.08256983011960983,
-0.09546718746423721,
0.21568629145622253,
-0.065496526658535,
0.06126781553030014,
0.1627131849527359,
0.07323521375656128,
0.0963180661201477,
0.25279226899147034,
0.017873944714665413,
-0.06457626074552536,
0.008079491555690765,
0.026717349886894226,
0.025854257866740227,
0.001321507734246552,
-0.07962514460086823,
-0.004925472661852837,
-0.0024797196965664625,
0.013928018510341644,
0.01657135598361492,
-0.10398522019386292,
-0.022361677139997482,
-0.04142341390252113,
-0.05530316010117531,
0.026396887376904488,
0.021690383553504944,
0.06374950706958771,
0.08476333320140839,
0.054389357566833496,
0.0016894311411306262,
-0.029916604980826378,
-0.062371108680963516,
-0.06899220496416092,
0.08345822989940643,
-0.1362912654876709,
-0.27310535311698914,
-0.07627370208501816,
0.007794292643666267,
0.018642611801624298,
0.026976879686117172,
0.07203932851552963,
-0.10356626659631729,
-0.05755273625254631,
-0.07163072377443314,
0.14528284966945648,
0.09419901669025421,
-0.12552198767662048,
-0.1091054305434227,
0.0821230411529541,
0.0018600127659738064,
-0.0750841274857521,
0.0033664170186966658,
0.04997636377811432,
-0.07746681571006775,
0.020919930189847946,
-0.07657850533723831,
0.08535083383321762,
0.03508387878537178,
0.11434660851955414,
-0.08101565390825272,
0.0029381755739450455,
0.24439294636249542,
-0.19987882673740387,
0.09962377697229385,
0.03234279900789261,
-0.03307580202817917,
0.03412578999996185,
0.1161884218454361,
-0.006666738539934158,
-0.03757190331816673,
0.009018984623253345,
0.09225917607545853,
0.003151681274175644,
-0.26554909348487854,
-0.13018681108951569,
-0.08769906312227249,
0.06678516417741776,
0.03772477060556412,
0.118763267993927,
-0.07330814749002457,
-0.007912589237093925,
-0.12981069087982178,
-0.019849639385938644,
0.1112077459692955,
0.05714784190058708,
0.12943409383296967,
0.03562869504094124,
-0.02032952569425106,
-0.07999952137470245,
-0.02883250266313553,
0.08288656920194626,
0.11840084195137024,
0.0673021525144577,
0.050627220422029495,
0.2152472287416458,
0.09327245503664017,
0.09309904277324677,
-0.012053624726831913,
-0.07839652895927429,
0.05034707114100456,
0.02666442282497883,
-0.036437880247831345,
-0.055853504687547684,
0.05639606714248657,
0.0407470278441906,
0.08860976994037628,
-0.14551055431365967,
-0.01057406235486269,
-0.12260069698095322,
0.13925105333328247,
0.21915216743946075,
0.006972102914005518,
-0.057321012020111084,
-0.012140977196395397,
0.06581763923168182,
-0.08663378655910492,
-0.03650207072496414,
-0.019350288435816765,
0.005104084499180317,
-0.17636503279209137,
0.10289264470338821,
0.05896556004881859,
0.11216434836387634,
-0.0968383327126503,
0.0006887612980790436,
-0.014379416592419147,
-0.02574997954070568,
-0.01505141332745552,
0.03896738216280937,
-0.2160194218158722,
0.19095122814178467,
0.009213591925799847,
0.06572066247463226,
-0.034628525376319885,
-0.00530807813629508,
-0.02492581121623516,
0.02153221145272255,
0.10510236024856567,
0.02800648659467697,
-0.01475515402853489,
0.03899717703461647,
-0.09683126211166382,
-0.027333546429872513,
0.022482357919216156,
-0.16041088104248047,
0.09662053734064102,
0.027048297226428986,
-0.020253144204616547,
-0.09334508329629898,
-0.03729794919490814,
-0.11915561556816101,
-0.19180309772491455,
0.028416704386472702,
-0.04311583936214447,
0.015205733478069305,
-0.03354407474398613,
-0.07304910570383072,
-0.1861220747232437,
0.00136915547773242,
-0.13591799139976501,
-0.050067584961652756,
-0.09260302037000656,
-0.07662998139858246,
0.06370384991168976,
-0.041435085237026215,
0.005125984083861113,
0.006669847760349512,
0.013190442696213722,
-0.049111589789390564,
0.035654179751873016,
0.044578537344932556,
-0.020462242886424065,
-0.17767293751239777,
-0.04030099883675575,
0.16934682428836823,
0.15017499029636383,
0.04865293577313423,
-0.0033311014994978905,
0.042145080864429474,
0.05157933384180069,
-0.014192863367497921,
-0.034516654908657074,
0.014642606489360332,
0.050101734697818756,
-0.0024174447171390057,
-0.0758405402302742,
-0.16469988226890564,
-0.15211239457130432,
-0.13203121721744537,
-0.010881501249969006,
0.18597720563411713,
0.03236831724643707,
0.2193082571029663,
0.11510574817657471,
-0.15335075557231903,
-0.22260916233062744,
-0.09399604052305222,
0.0805562287569046,
-0.01631500944495201,
0.012974328361451626,
-0.18381138145923615,
-0.012818297371268272,
0.05937761440873146,
0.009338407777249813,
0.0076401750557124615,
-0.3442589044570923,
-0.0831541121006012,
-0.0037894484121352434,
-0.06029760465025902,
-0.1496460735797882,
-0.15084053575992584,
-0.08812190592288971,
-0.026890356093645096,
-0.18461662530899048,
0.07961522042751312,
0.04709777235984802,
0.058484047651290894,
-0.016394540667533875,
-0.0017851629527285695,
-0.017647868022322655,
0.04813950136303902,
0.19872692227363586,
0.1363169401884079,
-0.02372814156115055,
-0.10768808424472809,
-0.007103246636688709,
0.030547691509127617,
0.000970355118624866,
0.01604289747774601,
-0.04656555876135826,
-0.12182141095399857,
-0.16458912193775177,
-0.047177646309137344,
-0.13244307041168213,
0.056344401091337204,
-0.132718026638031,
0.047808822244405746,
0.0006235567852854729,
0.10017621517181396,
-0.0019897210877388716,
-0.03045627288520336,
0.10019159317016602,
-0.08653640747070312,
0.11120416969060898,
-0.0023071037139743567,
0.18860751390457153,
0.11766467988491058,
-0.14132781326770782,
-0.032038431614637375,
0.05869145318865776,
0.06144196540117264,
-0.03233463689684868,
0.016305873170495033,
0.08857255429029465,
-0.047743722796440125,
0.11356087774038315,
-0.03829827159643173,
-0.1516803652048111,
-0.040650058537721634,
0.1121891587972641,
-0.0989600196480751,
-0.09920583665370941,
0.046142563223838806,
-0.11202096939086914,
-0.01071830466389656,
-0.11530322581529617,
0.13250146806240082,
0.03318284451961517,
-0.09563540667295456,
0.032944709062576294,
0.05125432834029198,
0.04604439064860344,
0.1151241585612297,
-0.015706023201346397,
0.023602427914738655,
-0.05248202010989189,
0.09206939488649368,
0.10433665663003922,
-0.17293491959571838,
-0.031766630709171295,
0.3057596683502197,
-0.03676779568195343,
-0.05254119634628296,
0.04822922870516777,
0.09972820430994034,
-0.0331735797226429,
-0.040647342801094055,
-0.019141683354973793,
-0.1438983976840973,
0.0740889236330986,
0.1181977391242981,
0.005312880501151085,
0.06015333905816078,
0.09574215859174728,
-0.005563918501138687,
0.027221661061048508,
0.07327999919652939,
0.13104575872421265,
-0.03482485190033913,
0.06331172585487366,
-0.0011780897621065378,
-0.0037129258271306753,
-0.013917269185185432,
-0.007099034730345011,
-0.02445138804614544,
-0.18578235805034637,
-0.007155338767915964,
-0.08117352426052094,
0.05185360088944435,
-0.1260260045528412,
0.0037016216665506363,
-0.0473456010222435,
0.02171720191836357,
-0.005024905316531658,
-0.047704074531793594,
-0.06539088487625122,
-0.06843305379152298,
-0.05561327934265137,
0.1801697015762329,
-0.06442771106958389,
-0.012504208832979202,
0.053175199776887894,
-0.050948575139045715,
0.033225685358047485,
0.01871616207063198,
-0.0645298957824707,
0.02031097374856472,
-0.23083430528640747,
0.0349711999297142,
-0.0722469687461853,
0.001155048026703298,
0.03279507905244827,
-0.005359243601560593,
-0.012606373988091946,
-0.01430578250437975,
0.025335466489195824,
0.04187122359871864,
-0.0429832860827446,
-0.06392347067594528,
0.011721767485141754,
0.004196975380182266,
-0.1549595147371292,
0.020952114835381508,
0.03954976052045822,
0.09274790436029434,
-0.03475228697061539,
0.06734288483858109,
-0.1234436109662056,
0.08137527853250504,
-0.06123128533363342,
-0.015723329037427902,
0.002264623763039708,
-0.049625102430582047,
-0.06432085484266281,
-0.0012476122938096523,
0.0899219885468483,
0.0725039467215538,
0.05522555112838745,
0.08715497702360153,
-0.0025417215656489134,
0.06704637408256531,
0.018097246065735817,
-0.09200218319892883,
0.06502734124660492,
-0.052827972918748856,
0.04067756235599518,
-0.02929912507534027,
-0.06311646103858948,
-0.05038275569677353,
-0.02476523071527481,
-0.02377396449446678,
0.07662627100944519,
0.16924092173576355,
0.3007716238498688,
0.08285348117351532,
0.05537280812859535,
-0.10058273375034332,
0.03428688272833824,
0.10404350608587265,
-0.05951208621263504,
0.09163835644721985,
-0.10899858921766281,
0.04285305738449097,
0.07372267544269562,
-0.15922854840755463,
0.11218168586492538,
-0.07097884267568588,
-0.056914955377578735,
-0.057745326310396194,
-0.17657358944416046,
-0.05277203768491745,
-0.03750219941139221,
0.0027641442138701677,
-0.04979734495282173,
0.08762382715940475,
0.06906934827566147,
0.025280971080064774,
-0.06544019281864166,
0.010566852986812592,
0.05745907127857208,
-0.15979641675949097,
0.12262212485074997,
0.01574733294546604,
0.15623188018798828,
-0.06838085502386093,
0.05083848536014557,
-0.0413857139647007,
0.06932243704795837,
-0.017676956951618195,
0.08043413609266281,
0.0263016689568758,
0.003920292481780052,
-0.0883978083729744,
-0.08847454935312271,
0.0511532761156559,
0.009157937951385975,
0.0014839525101706386,
0.2537342607975006,
0.022725600749254227,
-0.02029639296233654,
0.04635562375187874,
0.16090647876262665,
0.08782254904508591,
-0.07689034938812256,
-0.11798712611198425,
0.021516790613532066,
-0.027757884934544563,
0.003447924507781863,
0.012486420571804047,
-0.09559915959835052,
0.020173151046037674,
0.15770260989665985,
0.20842605829238892,
0.044251374900341034,
0.03070576675236225,
-0.07311374694108963,
0.02967868372797966,
0.11209859699010849,
0.03130754828453064,
0.0353156253695488,
0.2690206468105316,
-0.037660110741853714,
-0.033765487372875214,
-0.08539886772632599,
0.03683719411492348,
-0.1527591347694397,
0.14450593292713165,
0.008806080557405949,
0.009331267327070236,
-0.018356099724769592,
0.16366806626319885,
-0.11872156709432602,
-0.18998835980892181,
0.031828373670578,
-0.11451956629753113,
-0.12088775634765625,
-0.040087468922138214,
0.047218795865774155,
0.1109430268406868,
0.09614533185958862,
0.07308723032474518,
-0.07258766144514084,
0.13784576952457428,
0.01274607703089714,
0.0032576804514974356,
-0.05948006734251976,
0.09918061643838882,
-0.12235692888498306,
0.21287399530410767,
-0.030701415613293648,
0.00025404911139048636,
0.07071217149496078,
-0.0009816314559429884,
-0.06446848064661026,
0.013062814250588417,
0.07693769037723541,
-0.006088533438742161,
0.03894565999507904,
0.04615660756826401,
0.010599401779472828,
0.05969696491956711,
0.13522057235240936,
0.04528360068798065,
0.09906331449747086,
0.13662868738174438,
0.024422194808721542,
-0.08027106523513794,
0.19910851120948792,
-0.17122985422611237,
0.04085615649819374,
0.15438438951969147,
-0.02218439057469368,
-0.037346918135881424,
-0.012544526718556881,
-0.041663311421871185,
-0.11667957156896591,
0.05281118303537369,
0.021788062527775764,
-0.19981959462165833,
0.04931256175041199,
-0.04733644425868988,
0.08198385685682297,
-0.14435896277427673,
-0.0482509471476078,
-0.018522070720791817,
0.003916935995221138,
-0.10896652936935425,
0.05642592906951904,
0.0167230274528265,
-0.013753263279795647,
-0.025749413296580315,
-0.18775653839111328,
0.024178648367524147,
0.12056596577167511,
-0.025909902527928352,
0.0010247889440506697
] |
null | null |
transformers
|
# arabic-t5-small
This is a T5v1.1 (small) trained on the concatenation of the Arabic Billion Words corpus and the Arabic subsets of the mC4 and Oscar datasets.
The model could only be trained for about `10%` of the whole dataset due to time limitations. This is equivalent to `22'000` steps or about `4.3` Billion tokens.
## Training parameters
| | |
| :-------------------: | :-----------: |
| Training batch size | `384` |
| Evaluation batch size | `768` |
| learning rate | `1e-2` |
| dtype | `jnp.float32` |
## Preprocessing and the tokenizer
We tried to keep the preprocessing to a bare minimum. We only replaced URLs, emails and social media user mentions with fixed tokens.
Contrary to other pretrained Arabic LMs, we decided to not strip the Arabic diacritics and to keep them part of the vocabulary.
The tokenizer was trained on `5%` of the training set, with a vocabulary size of `64'000`.
For more details about preprocessing, check the [tokenizer code](https://huggingface.co/flax-community/arabic-t5-small/blob/main/t5_tokenizer_model.py)
## Data
The model was trained on the concatenation of the Arabic Billion Words corpus and the Arabic subsets of the mC4 and Oscar datasets.
A random `0.1%` subset of the data was reserved for evaluation and the rest for training.
## Results
| | |
| :-----------------: | :-----------: |
| Evaluation accuracy | `56.84%` |
| Evaluation Loss | `2.423` |
| Training Loss | `2.392` |
| Training Time | `22h 23m 51s` |
## Note for finetuning
This model was pretrained with dropout turned off, so the default `dropout_rate` in the model config is `0`.
To finetune the model dropout should be turned be back on, like this:
```python
model = T5ForConditionalGeneration.from_pretrained("flax-community/arabic-t5-small", dropout_rate=0.1)
```
or,
```python
model = AutoModelForSeq2SeqLM.from_pretrained("flax-community/arabic-t5-small", dropout_rate=0.1)
```
|
{"language": ["ar"], "datasets": ["mc4", "oscar", "arabic_billion_words"]}
|
text2text-generation
|
flax-community/arabic-t5-small
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"ar",
"dataset:mc4",
"dataset:oscar",
"dataset:arabic_billion_words",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ar"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #ar #dataset-mc4 #dataset-oscar #dataset-arabic_billion_words #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
arabic-t5-small
===============
This is a T5v1.1 (small) trained on the concatenation of the Arabic Billion Words corpus and the Arabic subsets of the mC4 and Oscar datasets.
The model could only be trained for about '10%' of the whole dataset due to time limitations. This is equivalent to '22'000' steps or about '4.3' Billion tokens.
Training parameters
-------------------
Preprocessing and the tokenizer
-------------------------------
We tried to keep the preprocessing to a bare minimum. We only replaced URLs, emails and social media user mentions with fixed tokens.
Contrary to other pretrained Arabic LMs, we decided to not strip the Arabic diacritics and to keep them part of the vocabulary.
The tokenizer was trained on '5%' of the training set, with a vocabulary size of '64'000'.
For more details about preprocessing, check the tokenizer code
Data
----
The model was trained on the concatenation of the Arabic Billion Words corpus and the Arabic subsets of the mC4 and Oscar datasets.
A random '0.1%' subset of the data was reserved for evaluation and the rest for training.
Results
-------
Note for finetuning
-------------------
This model was pretrained with dropout turned off, so the default 'dropout\_rate' in the model config is '0'.
To finetune the model dropout should be turned be back on, like this:
or,
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #ar #dataset-mc4 #dataset-oscar #dataset-arabic_billion_words #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #ar #dataset-mc4 #dataset-oscar #dataset-arabic_billion_words #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.056077130138874054,
0.10345280170440674,
-0.0061886366456747055,
0.0653994157910347,
0.09848019480705261,
0.0057668364606797695,
0.19325463473796844,
0.11269596964120865,
-0.023011185228824615,
-0.038192372769117355,
0.18698923289775848,
0.15319497883319855,
0.026050131767988205,
0.126801535487175,
-0.07352901995182037,
-0.16571618616580963,
0.031058283522725105,
-0.0035379482433199883,
-0.026707107201218605,
0.12835635244846344,
0.09546878188848495,
-0.07644160091876984,
0.08307701349258423,
-0.0750705897808075,
-0.17365725338459015,
0.04687431454658508,
0.05719425901770592,
-0.1745765209197998,
0.10214002430438995,
0.07395128160715103,
0.08409854769706726,
0.07919247448444366,
-0.0208037830889225,
-0.12177816033363342,
0.04385504871606827,
0.007215279154479504,
-0.08469442278146744,
0.03770936653017998,
0.067009337246418,
-0.10464246571063995,
0.11209485679864883,
-0.03401217982172966,
-0.0022306886967271566,
0.04921175539493561,
-0.11635191738605499,
-0.08442007005214691,
0.006109363399446011,
0.038137223571538925,
0.018798159435391426,
0.08606158196926117,
-0.022780753672122955,
0.15336111187934875,
-0.01777605153620243,
0.14804339408874512,
0.07477548718452454,
-0.31089675426483154,
-0.04656160622835159,
0.02114810049533844,
0.07046639919281006,
0.12241069227457047,
-0.030369499698281288,
0.09020134061574936,
0.04546249657869339,
0.011862849816679955,
0.07849728316068649,
-0.09097525477409363,
-0.1678406149148941,
0.01126391626894474,
-0.06422991305589676,
-0.027402350679039955,
0.25863879919052124,
0.00533245038241148,
0.044552527368068695,
-0.05389471724629402,
-0.0773797556757927,
-0.08760746568441391,
0.001903667114675045,
-0.007103781681507826,
-0.03500499203801155,
-0.00005584733298746869,
-0.03442251309752464,
-0.022949913516640663,
-0.13668343424797058,
0.02776562236249447,
-0.21442009508609772,
0.0905161127448082,
0.009844977408647537,
0.06554712355136871,
-0.16257770359516144,
0.02390192449092865,
0.10171784460544586,
-0.15349148213863373,
0.05928877741098404,
-0.0788179486989975,
-0.01621031016111374,
0.013602337799966335,
0.008820475079119205,
-0.1286490559577942,
0.09834592044353485,
-0.013798612169921398,
-0.0030664694495499134,
0.04350569471716881,
-0.06897011399269104,
0.08573950827121735,
0.00991023238748312,
0.016900362446904182,
-0.060190822929143906,
-0.05962903052568436,
0.02386479638516903,
-0.018487391993403435,
0.01995471492409706,
-0.03140510246157646,
-0.0937054380774498,
0.01202121190726757,
0.06861370801925659,
0.06796113401651382,
0.03458937257528305,
0.11539293825626373,
-0.06949887424707413,
0.019802045077085495,
-0.04106416180729866,
-0.11362356692552567,
-0.0191922839730978,
0.023060448467731476,
-0.03513150289654732,
0.03924855962395668,
0.026294386014342308,
-0.0012680733343586326,
-0.08769699186086655,
0.008871599100530148,
-0.07177799195051193,
0.02114778570830822,
0.008654593490064144,
-0.0794016420841217,
0.08943496644496918,
-0.09291765838861465,
0.007357694208621979,
-0.16534174978733063,
-0.1074507087469101,
0.019007841125130653,
0.031220311298966408,
-0.02007061056792736,
0.0055030533112585545,
-0.06038747355341911,
-0.023246120661497116,
0.04124826192855835,
-0.07118432223796844,
0.021354835480451584,
-0.06666870415210724,
0.11810874938964844,
-0.02437613345682621,
0.12519627809524536,
-0.10754310339689255,
0.01284908875823021,
-0.06415143609046936,
-0.03696800395846367,
-0.0371943861246109,
0.10922075808048248,
-0.019725577905774117,
0.136895552277565,
-0.04811027646064758,
0.008248409256339073,
-0.05603553354740143,
0.04076828435063362,
-0.04243629053235054,
0.2570038437843323,
-0.2331012338399887,
-0.09456290304660797,
0.28161314129829407,
-0.05160122364759445,
-0.20538845658302307,
0.1315983384847641,
-0.01373396534472704,
0.015035576187074184,
0.11721764504909515,
0.20081652700901031,
-0.10962582379579544,
-0.039841439574956894,
0.005351411644369364,
0.09445153176784515,
-0.03309471905231476,
-0.07542615383863449,
0.0735042542219162,
0.04555930942296982,
0.0012150153052061796,
0.03309415653347969,
0.20551615953445435,
0.09045037627220154,
-0.05768725275993347,
-0.07410749793052673,
0.0023970312904566526,
-0.04706592112779617,
0.05674544721841812,
0.034683868288993835,
0.07160384953022003,
-0.1044677197933197,
-0.05701242759823799,
-0.04628750681877136,
-0.017273778095841408,
-0.007637500762939453,
0.02587467059493065,
-0.09764166921377182,
0.04543067887425423,
-0.0497107096016407,
0.022380076348781586,
-0.13975827395915985,
-0.083564892411232,
-0.04823530092835426,
0.19347189366817474,
0.00933343730866909,
0.026770081371068954,
0.07740896195173264,
-0.007818010635674,
-0.05312827229499817,
0.012757486663758755,
0.1727019101381302,
0.023614533245563507,
-0.08343247324228287,
-0.1551341861486435,
0.11635887622833252,
-0.07520656287670135,
0.11387493461370468,
-0.13134512305259705,
0.023971060290932655,
0.07313845306634903,
0.13347937166690826,
0.0420958511531353,
0.004965425934642553,
0.0287115927785635,
0.024202562868595123,
-0.05516038462519646,
-0.021646512672305107,
0.06610818952322006,
-0.018163414672017097,
-0.16482652723789215,
0.16621489822864532,
-0.1716986745595932,
0.31405359506607056,
0.2141997516155243,
-0.11760256439447403,
-0.014994175173342228,
-0.0007747208583168685,
0.002661392791196704,
0.011292846873402596,
0.0829227939248085,
0.02140313759446144,
0.02637246809899807,
0.002787241945043206,
0.17931388318538666,
-0.0706126019358635,
-0.043347399681806564,
0.03296796232461929,
-0.05451764911413193,
-0.08487381041049957,
0.1322496235370636,
-0.011367986910045147,
-0.19331546127796173,
0.19139361381530762,
0.18306821584701538,
0.04066713899374008,
0.24073153734207153,
-0.02066691964864731,
0.005592272616922855,
0.044628631323575974,
0.013602273538708687,
-0.008493208326399326,
0.021710846573114395,
-0.15436352789402008,
-0.022426698356866837,
0.05350567772984505,
-0.027550114318728447,
0.015509311109781265,
-0.11870291084051132,
-0.07874540239572525,
-0.01899394765496254,
-0.027918094769120216,
0.01578117161989212,
0.05619549751281738,
0.02011745050549507,
0.17022642493247986,
-0.05143669247627258,
-0.05729573220014572,
0.07268572598695755,
-0.017444921657443047,
-0.10924658924341202,
0.21820656955242157,
-0.12029782682657242,
-0.32659488916397095,
-0.02647383138537407,
-0.12845991551876068,
-0.05575495585799217,
0.01819612830877304,
0.08490940928459167,
-0.1100233793258667,
0.009788654744625092,
-0.06400951743125916,
-0.0185957383364439,
-0.06402949243783951,
0.04173334687948227,
0.019243869930505753,
0.0007359545561484993,
-0.03679853677749634,
-0.07968448847532272,
-0.04039761796593666,
-0.025973783805966377,
-0.06716780364513397,
0.1683116853237152,
-0.09294871985912323,
0.05900203809142113,
0.10703220963478088,
-0.006411182228475809,
0.037586018443107605,
-0.06800583004951477,
0.13459354639053345,
-0.09193596243858337,
0.009634377434849739,
0.11194245517253876,
-0.026587624102830887,
0.0516485758125782,
0.1838919073343277,
0.009669039398431778,
-0.08028802275657654,
0.019471701234579086,
-0.008677898906171322,
-0.06609664112329483,
-0.2759992480278015,
-0.08776449412107468,
-0.08201237767934799,
0.08062584698200226,
-0.012633268721401691,
0.07299161702394485,
0.04529343172907829,
0.08298936486244202,
-0.026135964319109917,
-0.038978517055511475,
0.015804795548319817,
0.022112831473350525,
0.13510660827159882,
0.005466985981911421,
0.12330974638462067,
-0.08807395398616791,
-0.08873625844717026,
0.10361918061971664,
0.02628019079566002,
0.054070308804512024,
0.04401707276701927,
0.038072723895311356,
0.03343930467963219,
0.07033668458461761,
0.08506952226161957,
0.1071380227804184,
0.06022347882390022,
-0.05356607213616371,
-0.0028870597016066313,
-0.054422397166490555,
-0.040806252509355545,
0.030567381531000137,
-0.005026711151003838,
-0.06734131276607513,
-0.024119826033711433,
0.024651115760207176,
0.1251717060804367,
0.037667009979486465,
0.05133572593331337,
-0.29793229699134827,
0.035042181611061096,
0.07468715310096741,
-0.027313370257616043,
-0.08162450790405273,
0.04796750098466873,
0.08289304375648499,
-0.09197673201560974,
0.14987842738628387,
-0.04057604819536209,
0.06572991609573364,
-0.028714986518025398,
0.037735532969236374,
-0.0460171103477478,
-0.10817606747150421,
-0.012126151472330093,
0.05890670791268349,
-0.4185838997364044,
0.2245660126209259,
0.04018782451748848,
-0.038614556193351746,
-0.08954867720603943,
-0.011072085238993168,
-0.01354529894888401,
0.08942937105894089,
0.19884133338928223,
-0.0007658372633159161,
0.034459397196769714,
-0.045454878360033035,
-0.07367214560508728,
0.051566630601882935,
0.08111414313316345,
0.029968280345201492,
-0.017121531069278717,
0.03441924601793289,
0.007129904348403215,
-0.003963605035096407,
0.05286604166030884,
-0.08027233183383942,
-0.1814834326505661,
0.030616892501711845,
0.08794030547142029,
0.0017093258211389184,
-0.026946140453219414,
-0.09424091875553131,
-0.09697465598583221,
0.14847318828105927,
-0.10001308470964432,
-0.09227713197469711,
-0.1281464844942093,
-0.000682761543430388,
0.012999537400901318,
-0.075519859790802,
0.025736859068274498,
-0.07473636418581009,
-0.05778061971068382,
-0.0725993663072586,
-0.14965085685253143,
0.1533295065164566,
-0.07538031041622162,
-0.008565494790673256,
-0.0922912210226059,
0.139800563454628,
-0.053872015327215195,
0.04691017046570778,
-0.01057503093034029,
-0.014269938692450523,
-0.021305028349161148,
-0.03355514630675316,
0.002892642980441451,
-0.027478482574224472,
0.07810083031654358,
-0.0070772976614534855,
-0.06139012426137924,
-0.17104654014110565,
-0.0266717541962862,
-0.08227423578500748,
0.2632538974285126,
0.21675868332386017,
-0.051353879272937775,
0.1385011225938797,
0.16945096850395203,
-0.02899528481066227,
-0.3089495897293091,
-0.09022162854671478,
-0.04582960158586502,
-0.004462821874767542,
-0.009269651025533676,
-0.1400139182806015,
0.0598321333527565,
0.018172750249505043,
-0.019222920760512352,
0.09698494523763657,
-0.29094845056533813,
-0.10837949812412262,
0.12550297379493713,
0.07344233989715576,
0.30249887704849243,
-0.2056700587272644,
-0.0517362616956234,
-0.0703403502702713,
-0.07842288166284561,
0.16536714136600494,
-0.1336868405342102,
0.08632998913526535,
0.013515445403754711,
-0.007682634051889181,
0.010948711074888706,
-0.033712927252054214,
0.0902162417769432,
0.0020947500597685575,
0.02991529554128647,
-0.10856238007545471,
-0.03162696957588196,
0.08376430720090866,
0.021434642374515533,
-0.011622131802141666,
-0.15743370354175568,
-0.010879694484174252,
-0.14043663442134857,
-0.0403565913438797,
-0.06314333528280258,
0.04447795823216438,
0.01410774327814579,
-0.06787547469139099,
0.033661454916000366,
-0.03596093878149986,
0.03468723222613335,
-0.04549286141991615,
0.1633811742067337,
-0.038315776735544205,
0.14569580554962158,
0.18934689462184906,
0.16641440987586975,
-0.14991645514965057,
0.11212225258350372,
-0.05494368448853493,
-0.05016167461872101,
0.05394894629716873,
-0.1768813580274582,
0.0063366941176354885,
0.1086001992225647,
-0.04325897991657257,
0.08101912587881088,
0.052400171756744385,
-0.010709693655371666,
0.0027163776103407145,
0.19000540673732758,
-0.20011122524738312,
-0.10001558810472488,
-0.03183771297335625,
-0.0541752353310585,
0.03550182282924652,
0.03667493909597397,
0.12463271617889404,
0.0035104681737720966,
-0.005294531583786011,
-0.028508054092526436,
-0.00683542899787426,
-0.03180735930800438,
0.14666105806827545,
0.03342393413186073,
0.03992437198758125,
-0.13474677503108978,
0.1384848803281784,
0.02931954339146614,
-0.18919220566749573,
0.03275284171104431,
0.1991518884897232,
-0.1283937394618988,
-0.09843763709068298,
-0.006212539039552212,
0.13536149263381958,
-0.04460913687944412,
-0.03933706879615784,
-0.03620089218020439,
-0.1323901116847992,
0.020383836701512337,
0.2243909239768982,
0.028505776077508926,
0.08056123554706573,
-0.0006504441844299436,
-0.08534499257802963,
0.026273570954799652,
0.1060757115483284,
0.05035435035824776,
-0.031216571107506752,
-0.09966254979372025,
0.05128581076860428,
-0.03712555393576622,
0.1367906779050827,
-0.08431112766265869,
0.015420163981616497,
-0.11273765563964844,
0.047204356640577316,
-0.10319485515356064,
-0.01597456820309162,
-0.06619071960449219,
-0.03715604543685913,
-0.033440001308918,
-0.05312192812561989,
-0.041722699999809265,
-0.08069072663784027,
-0.07758817821741104,
0.0329672209918499,
0.0022123134694993496,
0.056570570915937424,
-0.09986171126365662,
-0.034668631851673126,
0.00906414445489645,
-0.010226898826658726,
0.15835118293762207,
0.09784619510173798,
-0.12065045535564423,
0.09815927594900131,
-0.1938869059085846,
-0.04275127500295639,
0.12213101238012314,
0.005343771539628506,
0.04383130744099617,
0.13961243629455566,
0.041177500039339066,
0.07871042191982269,
0.052588801831007004,
0.062047507613897324,
0.003988845739513636,
-0.08975749462842941,
-0.029270822182297707,
-0.04711006209254265,
-0.07811310142278671,
-0.048456959426403046,
0.01893714815378189,
0.07151465862989426,
-0.022510632872581482,
0.10141440480947495,
-0.07725042849779129,
0.010043151676654816,
-0.10397695004940033,
0.027392633259296417,
0.013250323943793774,
-0.1802021861076355,
-0.06665625423192978,
-0.05610711872577667,
0.07087371498346329,
-0.04868609458208084,
0.18995442986488342,
0.04293788596987724,
-0.12011135369539261,
0.054048802703619,
-0.0026813047006726265,
0.01774108223617077,
0.004110389389097691,
0.1925961673259735,
0.06732087582349777,
-0.049259088933467865,
-0.08320479840040207,
0.0036118116695433855,
0.06656556576490402,
0.1288345754146576,
0.09504957497119904,
0.13993392884731293,
0.010757391341030598,
0.0962083712220192,
-0.017283201217651367,
0.03260308504104614,
0.06286438554525375,
-0.05497503653168678,
-0.09210937470197678,
0.05836471542716026,
-0.026723604649305344,
-0.0047653717920184135,
0.2140708714723587,
-0.005536512937396765,
-0.02633260376751423,
-0.028106046840548515,
-0.0561763234436512,
-0.1526520997285843,
-0.1487417072057724,
-0.08713298290967941,
-0.027816234156489372,
0.026640914380550385,
-0.0915660485625267,
0.03764277696609497,
0.11356116086244583,
0.12613068521022797,
-0.051659371703863144,
0.12953661382198334,
0.09738845378160477,
-0.0819028913974762,
0.10522822290658951,
0.022443639114499092,
0.05048773065209389,
-0.010168501175940037,
0.0005539240082725883,
-0.05638273060321808,
0.0233291108161211,
-0.018581248819828033,
0.05961574241518974,
-0.028647752478718758,
0.0937914252281189,
-0.14518138766288757,
-0.11461243033409119,
-0.04231591150164604,
0.047350507229566574,
0.012494564987719059,
0.10842351615428925,
0.02877970226109028,
-0.02071402408182621,
0.05017022043466568,
0.18826086819171906,
-0.027585873380303383,
-0.11541294306516647,
-0.046631913632154465,
0.1074599027633667,
-0.017453860491514206,
0.07036121934652328,
-0.02309265546500683,
-0.022517815232276917,
-0.066242516040802,
0.24039305746555328,
0.34030386805534363,
-0.10424375534057617,
0.026544487103819847,
-0.035603344440460205,
0.03454338759183884,
0.015007191337645054,
0.09090442955493927,
0.056619346141815186,
0.20736609399318695,
-0.08175259828567505,
0.02576657012104988,
-0.029235854744911194,
-0.00836050882935524,
-0.09665464609861374,
0.08812843263149261,
0.05195168778300285,
-0.018032215535640717,
-0.03146660700440407,
0.08406053483486176,
-0.15963931381702423,
0.06563607603311539,
-0.060229863971471786,
-0.14938218891620636,
-0.07858911156654358,
-0.038333453238010406,
0.10341020673513412,
0.08309505134820938,
0.014596755616366863,
0.012804550118744373,
-0.0485445037484169,
-0.07462995499372482,
0.023630710318684578,
-0.2163694202899933,
0.03366304561495781,
0.05099138990044594,
-0.1202516183257103,
0.05233945697546005,
-0.018132369965314865,
0.07507676631212234,
0.08196963369846344,
0.023253509774804115,
-0.02841080166399479,
0.10668281465768814,
0.02417582832276821,
0.07432228326797485,
0.025868374854326248,
0.01915563829243183,
0.024512551724910736,
-0.05350081995129585,
0.07167709618806839,
0.052589405328035355,
0.03193596750497818,
-0.13286039233207703,
-0.03280302509665489,
-0.01477893628180027,
0.0619167722761631,
-0.028498299419879913,
0.056735869497060776,
0.09925387799739838,
-0.013139107264578342,
0.06333044916391373,
-0.06453437358140945,
-0.05039177089929581,
0.028739744797348976,
-0.10621169209480286,
-0.03961964324116707,
-0.14668145775794983,
-0.0469195581972599,
0.0447230227291584,
0.014550271444022655,
-0.22844938933849335,
0.03321922942996025,
-0.15211723744869232,
0.012249967083334923,
-0.1519257128238678,
0.0636110007762909,
0.14218676090240479,
0.026241805404424667,
-0.031069952994585037,
-0.07687295973300934,
0.06515756249427795,
0.08291216194629669,
-0.0891503244638443,
-0.08426491171121597
] |
null | null |
transformers
|
# bengali-t5-base
**bengali-t5-base** is a model trained on the Bengali portion of MT5 dataset. We used the `T5-base` model for this model.
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
The model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k steps).
## load tokenizer
```
>>> tokenizer = transformers.AutoTokenizer.from_pretrained("flax-community/bengali-t5-base")
>>> tokenizer.encode("আমি বাংলার গান গাই")
>>> tokenizer.decode([93, 1912, 814, 5995, 3, 1])
```
```
[93, 1912, 814, 5995, 3, 1]
'আমি বাংলার গান গাই </s>'
```
## load model
```
>>> config = T5Config.from_pretrained("flax-community/bengali-t5-base")
>>> model = FlaxT5ForConditionalGeneration.from_pretrained("flax-community/bengali-t5-base", config=config)
```
The model is trained on `de-noising` objectives followed by the script [here](https://huggingface.co/flax-community/bengali-t5-base/blob/main/run_t5_mlm_flax.py) and [here](https://huggingface.co/flax-community/bengali-t5-base/blob/main/run.sh). Currently This model doesn't have any generation capability. If you want this model to have generation capability, please do a finetuning on `prefix-LM` objective mentioned in the [paper](https://arxiv.org/abs/1910.10683).
See the tensorboard log in `Training metrics` tab.
Please note that we haven't finetuned the model in any downstream task.
## Proposal
- [Project Proposal](https://discuss.huggingface.co/t/pretrain-t5-from-scratch-in-bengali/7121)
## Participants
- [Ibraheem Muhammad Moosa](https://huggingface.co/ibraheemmoosa)
- [Tasnim Mohiuddin](https://huggingface.co/tasnim)
- [Khalid Saifullah](https://huggingface.co/khalidsaifullaah)
- [Tahsin Mayeesha](https://tahsin-mayeesha.github.io/)
- [M Saiful Bari](https://huggingface.co/sbmaruf)
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/roberta-base-als-demo)
|
{}
|
text2text-generation
|
flax-community/bengali-t5-base
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"mt5",
"text2text-generation",
"arxiv:1910.10683",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1910.10683"
] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# bengali-t5-base
bengali-t5-base is a model trained on the Bengali portion of MT5 dataset. We used the 'T5-base' model for this model.
Flax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.
The model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k steps).
## load tokenizer
## load model
The model is trained on 'de-noising' objectives followed by the script here and here. Currently This model doesn't have any generation capability. If you want this model to have generation capability, please do a finetuning on 'prefix-LM' objective mentioned in the paper.
See the tensorboard log in 'Training metrics' tab.
Please note that we haven't finetuned the model in any downstream task.
## Proposal
- Project Proposal
## Participants
- Ibraheem Muhammad Moosa
- Tasnim Mohiuddin
- Khalid Saifullah
- Tahsin Mayeesha
- M Saiful Bari
## Useful links
- Community Week timeline
- Community Week README
- Masked Language Modelling example scripts
- Model Repository
|
[
"# bengali-t5-base\n\nbengali-t5-base is a model trained on the Bengali portion of MT5 dataset. We used the 'T5-base' model for this model.\n\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.\n\nThe model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k steps).",
"## load tokenizer",
"## load model\n\n\n\nThe model is trained on 'de-noising' objectives followed by the script here and here. Currently This model doesn't have any generation capability. If you want this model to have generation capability, please do a finetuning on 'prefix-LM' objective mentioned in the paper. \n\nSee the tensorboard log in 'Training metrics' tab.\n\nPlease note that we haven't finetuned the model in any downstream task.",
"## Proposal\n- Project Proposal",
"## Participants\n- Ibraheem Muhammad Moosa\n- Tasnim Mohiuddin\n- Khalid Saifullah\n- Tahsin Mayeesha\n- M Saiful Bari",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# bengali-t5-base\n\nbengali-t5-base is a model trained on the Bengali portion of MT5 dataset. We used the 'T5-base' model for this model.\n\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.\n\nThe model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k steps).",
"## load tokenizer",
"## load model\n\n\n\nThe model is trained on 'de-noising' objectives followed by the script here and here. Currently This model doesn't have any generation capability. If you want this model to have generation capability, please do a finetuning on 'prefix-LM' objective mentioned in the paper. \n\nSee the tensorboard log in 'Training metrics' tab.\n\nPlease note that we haven't finetuned the model in any downstream task.",
"## Proposal\n- Project Proposal",
"## Participants\n- Ibraheem Muhammad Moosa\n- Tasnim Mohiuddin\n- Khalid Saifullah\n- Tahsin Mayeesha\n- M Saiful Bari",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
74,
91,
5,
101,
9,
33,
28
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# bengali-t5-base\n\nbengali-t5-base is a model trained on the Bengali portion of MT5 dataset. We used the 'T5-base' model for this model.\n\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.\n\nThe model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k steps).## load tokenizer## load model\n\n\n\nThe model is trained on 'de-noising' objectives followed by the script here and here. Currently This model doesn't have any generation capability. If you want this model to have generation capability, please do a finetuning on 'prefix-LM' objective mentioned in the paper. \n\nSee the tensorboard log in 'Training metrics' tab.\n\nPlease note that we haven't finetuned the model in any downstream task.## Proposal\n- Project Proposal## Participants\n- Ibraheem Muhammad Moosa\n- Tasnim Mohiuddin\n- Khalid Saifullah\n- Tahsin Mayeesha\n- M Saiful Bari## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
-0.10135100036859512,
0.06299371272325516,
-0.003112287260591984,
0.030606526881456375,
0.04393016919493675,
-0.03058154508471489,
0.1505696326494217,
0.07484731078147888,
0.02149173431098461,
0.07117962092161179,
0.07172670215368271,
0.00126067700330168,
0.12272528558969498,
0.19833430647850037,
0.040814194828271866,
-0.23579087853431702,
-0.0057878983207046986,
-0.05192660912871361,
-0.04949767887592316,
0.1272561401128769,
0.11829999834299088,
-0.030375061556696892,
0.0828578919172287,
-0.0302065871655941,
-0.04674661532044411,
0.043777428567409515,
-0.04774874076247215,
-0.07148618251085281,
0.026244647800922394,
0.07238117605447769,
0.056308452039957047,
0.03640774264931679,
0.025659818202257156,
-0.07331386208534241,
0.04482817277312279,
0.050644464790821075,
0.002082984894514084,
0.013844187371432781,
0.04645245522260666,
-0.0072157178074121475,
0.19208060204982758,
-0.04142897576093674,
-0.024251330643892288,
0.0705636665225029,
-0.11078251153230667,
0.04180125519633293,
-0.09609170258045197,
0.13755762577056885,
0.1635284572839737,
0.035617392510175705,
-0.028722302988171577,
0.11090898513793945,
-0.028939688578248024,
0.07207901030778885,
0.10363364219665527,
-0.12244519591331482,
-0.08615359663963318,
0.12700748443603516,
0.09113016724586487,
0.034949686378240585,
-0.07935772091150284,
-0.016971871256828308,
0.022695617750287056,
-0.0063230362720787525,
0.07249923795461655,
-0.0916263684630394,
0.10531464964151382,
-0.024187033995985985,
-0.0740540400147438,
0.005377548281103373,
0.1269085705280304,
0.06611582636833191,
-0.05613459274172783,
-0.11232263594865799,
-0.0758524015545845,
0.014689987525343895,
-0.01232778001576662,
-0.018948344513773918,
0.013707133941352367,
0.018542738631367683,
0.09408791363239288,
-0.07960006594657898,
-0.08618562668561935,
-0.043878063559532166,
0.012567874044179916,
0.02823571115732193,
0.022457672283053398,
0.01965535432100296,
-0.12932828068733215,
0.037843309342861176,
-0.06763309240341187,
-0.06870611757040024,
-0.04568479210138321,
-0.07138128578662872,
-0.047917965799570084,
-0.04217235743999481,
0.0896439403295517,
-0.0912257730960846,
-0.007006707135587931,
0.15373606979846954,
-0.03176160529255867,
0.05796068534255028,
0.046648427844047546,
0.019545257091522217,
0.06729277223348618,
0.06957893818616867,
-0.10802400857210159,
-0.08703158795833588,
0.10200688242912292,
0.07995609939098358,
0.014084733091294765,
-0.0044698696583509445,
-0.006507557351142168,
0.011177431792020798,
-0.011585764586925507,
0.07737023383378983,
0.02085035853087902,
0.059669021517038345,
-0.00764312082901597,
-0.04574781656265259,
0.19458070397377014,
-0.1642332375049591,
-0.011030163615942001,
-0.01567212864756584,
-0.07946833223104477,
0.04106833040714264,
0.03331021964550018,
-0.019612101837992668,
-0.08663950860500336,
-0.11828356236219406,
-0.05948491394519806,
-0.00802362896502018,
-0.10744648426771164,
-0.05722307041287422,
0.0347285121679306,
-0.06383836269378662,
-0.05960530787706375,
-0.14789749681949615,
-0.2458493411540985,
-0.02877812273800373,
0.03357571363449097,
-0.06991231441497803,
-0.024456528946757317,
-0.02372216060757637,
-0.0705190971493721,
-0.01820903830230236,
-0.039118316024541855,
0.06787487864494324,
-0.008919624611735344,
0.052193980664014816,
0.012940557673573494,
0.10507795214653015,
0.07935015112161636,
0.010792932473123074,
-0.04480629414319992,
0.07968911528587341,
-0.2584342658519745,
0.08656562119722366,
-0.02268378995358944,
0.08381294459104538,
-0.15657863020896912,
0.018184561282396317,
-0.01845886930823326,
0.00968126766383648,
0.035565316677093506,
0.1982596516609192,
-0.17321912944316864,
-0.01572500169277191,
0.1806538850069046,
-0.09431923925876617,
-0.0980122908949852,
0.10507095605134964,
0.025776557624340057,
0.14445240795612335,
0.06766734272241592,
0.07678934186697006,
0.049262139946222305,
-0.10422028601169586,
-0.0555850975215435,
-0.016444971784949303,
-0.06662195920944214,
-0.012320739217102528,
0.06370308250188828,
-0.0060241310857236385,
0.11815197765827179,
-0.019441548734903336,
0.024340886622667313,
0.04606098309159279,
-0.01952638290822506,
-0.019944658502936363,
0.025439810007810593,
-0.06261955201625824,
-0.011278013698756695,
-0.008380338549613953,
-0.0001478309277445078,
-0.008553638122975826,
-0.0755578801035881,
-0.027197619900107384,
0.08746187388896942,
-0.010814954526722431,
0.01677168719470501,
-0.14529937505722046,
0.06589924544095993,
-0.07864957302808762,
0.03506064414978027,
-0.1126355528831482,
0.008848024532198906,
0.010480611585080624,
-0.054434772580862045,
0.06776538491249084,
-0.08184818178415298,
0.05722673609852791,
0.03289084509015083,
-0.005787596572190523,
-0.026050208136439323,
0.0609985888004303,
-0.03858375549316406,
-0.07486740499734879,
-0.11642156541347504,
-0.08435497432947159,
-0.034976404160261154,
0.1125323474407196,
-0.2140975147485733,
0.05891578271985054,
0.02729332447052002,
0.1327735334634781,
0.019587798044085503,
-0.026282811537384987,
0.12905457615852356,
-0.043449386954307556,
0.004471866879612207,
-0.1065160259604454,
0.006355406250804663,
0.0015090685337781906,
-0.07329456508159637,
0.171070396900177,
-0.15064087510108948,
-0.10249850898981094,
0.043329235166311264,
0.021652650088071823,
-0.05011236295104027,
0.0052551343105733395,
-0.031872231513261795,
-0.05418728291988373,
-0.0010443473001942039,
-0.036329079419374466,
0.06250859051942825,
0.059375472366809845,
0.14778976142406464,
-0.11031734198331833,
-0.0627177432179451,
-0.012646094895899296,
-0.07110846787691116,
-0.025046637281775475,
0.07508250325918198,
0.06118981912732124,
-0.18203693628311157,
0.06147563457489014,
0.039422422647476196,
0.0806344747543335,
0.24908702075481415,
0.010114508680999279,
-0.07122602313756943,
-0.06075084209442139,
0.039028067141771317,
-0.02047247253358364,
0.07190480083227158,
-0.024686355143785477,
0.00042859683162532747,
0.010471268557012081,
0.02186710573732853,
0.00618338119238615,
-0.03098859265446663,
0.029796937480568886,
-0.009006194770336151,
-0.04556643217802048,
-0.006919967010617256,
0.08047503978013992,
-0.016950132325291634,
0.08588339388370514,
-0.020165832713246346,
0.09164663404226303,
-0.014474091120064259,
-0.02162807621061802,
-0.13275547325611115,
0.1623597890138626,
-0.060037385672330856,
-0.17773865163326263,
-0.08044406026601791,
0.017558110877871513,
-0.013293982483446598,
-0.029035266488790512,
-0.001451318385079503,
-0.06007246673107147,
-0.054192353039979935,
-0.11870509386062622,
0.051189862191677094,
0.04316931217908859,
-0.03699842095375061,
-0.10793979465961456,
-0.007042490877211094,
-0.0022204697597771883,
-0.13183696568012238,
-0.012456269934773445,
0.06802749633789062,
-0.11864782124757767,
0.06206154823303223,
-0.04928714036941528,
-0.01121513545513153,
0.07446561753749847,
-0.028231779113411903,
0.004023880232125521,
-0.0117374612018466,
0.16483183205127716,
-0.09243832528591156,
0.24977897107601166,
0.19707973301410675,
0.06321407854557037,
0.05852421373128891,
0.10292316973209381,
-0.028779776766896248,
-0.0634821206331253,
0.04109374061226845,
0.05027567222714424,
-0.04917517676949501,
-0.20747555792331696,
-0.005386835429817438,
-0.04925693944096565,
0.03955066204071045,
0.04783479869365692,
0.04397597163915634,
0.01647825539112091,
0.022937986999750137,
-0.09555245190858841,
0.04612760990858078,
0.03184313699603081,
0.06922174990177155,
-0.08555334806442261,
-0.014463784173130989,
0.0403626523911953,
-0.08825532346963882,
0.06824343651533127,
0.07699288427829742,
0.03221340477466583,
0.2487877607345581,
0.010175367817282677,
0.17877796292304993,
0.11504121124744415,
0.06277544051408768,
0.06938888877630234,
0.0379175990819931,
-0.03341785445809364,
-0.0012231420259922743,
-0.008844020776450634,
-0.06769515573978424,
-0.05844857171177864,
0.06604696065187454,
0.015769366174936295,
0.028418440371751785,
-0.0020450870506465435,
0.011759730987250805,
0.027433747425675392,
0.16321322321891785,
0.009098505601286888,
-0.18227113783359528,
-0.05450696498155594,
0.05440891906619072,
-0.12003528326749802,
-0.0651816725730896,
-0.005698258522897959,
0.12185122072696686,
-0.09045381844043732,
0.02101176045835018,
-0.030314644798636436,
0.10260753333568573,
-0.05321674793958664,
-0.04126160591840744,
-0.03024482913315296,
0.04913024231791496,
-0.027978897094726562,
0.0336223728954792,
-0.1954033076763153,
0.2151476889848709,
0.0407741479575634,
0.12802962958812714,
-0.04660481587052345,
0.005940515082329512,
0.049361374229192734,
0.040625378489494324,
0.07080628722906113,
0.026667622849345207,
0.010669133625924587,
-0.08133541792631149,
-0.1122504323720932,
0.007368898950517178,
0.03094632364809513,
-0.020788410678505898,
0.11273105442523956,
-0.0017664693295955658,
-0.001064773416146636,
-0.071465402841568,
0.015261306427419186,
-0.2115042358636856,
-0.0800199881196022,
0.02845810540020466,
-0.0048678601160645485,
0.016343170776963234,
-0.09363728016614914,
0.004736186936497688,
0.03751985728740692,
0.0645076185464859,
-0.12928448617458344,
-0.10824987292289734,
-0.09226667135953903,
0.020452622324228287,
0.054767727851867676,
-0.08113604038953781,
-0.029488516971468925,
-0.017025098204612732,
0.05712025240063667,
-0.00031471296097151935,
-0.032813385128974915,
0.02495480701327324,
-0.0746162161231041,
-0.07992768287658691,
-0.017467739060521126,
0.08505968004465103,
0.11361721903085709,
0.021474871784448624,
0.03010321967303753,
-0.041787996888160706,
0.06007281318306923,
-0.07902704924345016,
-0.06338521838188171,
0.08797196298837662,
0.04822567477822304,
0.008718234486877918,
-0.027862560003995895,
-0.000023086922738002613,
-0.03363087773323059,
-0.08058265596628189,
-0.00011218539293622598,
0.22182811796665192,
0.030087318271398544,
0.08450673520565033,
0.1897638440132141,
-0.08056269586086273,
-0.17978884279727936,
-0.08032593131065369,
-0.00788865890353918,
0.06977961957454681,
-0.01231318898499012,
-0.12295054644346237,
0.027418354526162148,
0.02911517024040222,
0.0073266709223389626,
-0.0020578326657414436,
-0.28083109855651855,
-0.12713655829429626,
0.028872642666101456,
0.05270233005285263,
0.0011281340848654509,
-0.17585468292236328,
-0.04903505742549896,
-0.014270375482738018,
-0.07660781592130661,
0.027534065768122673,
-0.14116021990776062,
0.06028144806623459,
-0.0005200376035645604,
0.06639958173036575,
-0.027643851935863495,
-0.041348859667778015,
0.09588882327079773,
0.01929113082587719,
0.03125263378024101,
-0.09362313151359558,
0.011756914667785168,
0.17822881042957306,
-0.07429930567741394,
0.11451665312051773,
-0.04823458939790726,
0.013846556656062603,
-0.26721447706222534,
-0.029099121689796448,
-0.0444960854947567,
0.05247543752193451,
-0.05598942190408707,
-0.025041364133358,
-0.002790080849081278,
0.13845382630825043,
0.04039191082119942,
0.02365754544734955,
-0.09324140101671219,
-0.10846293717622757,
0.01667007803916931,
0.13664142787456512,
0.19591191411018372,
-0.09132768958806992,
0.021590273827314377,
-0.03495772182941437,
0.03383302316069603,
0.06010200455784798,
-0.21406018733978271,
-0.025305025279521942,
0.02106870338320732,
0.04010907933115959,
0.0941476970911026,
-0.04103516787290573,
-0.10163132846355438,
0.0418078787624836,
0.057445257902145386,
-0.0017373841255903244,
-0.10099490731954575,
0.025208838284015656,
0.08038268983364105,
-0.08965891599655151,
-0.02445719577372074,
0.11406690627336502,
-0.08258412033319473,
-0.018125906586647034,
-0.012920670211315155,
0.03566374629735947,
-0.05149385333061218,
0.09922154247760773,
0.04392935708165169,
0.07617340236902237,
-0.03479943424463272,
0.11038050800561905,
0.05817297101020813,
-0.1076781377196312,
0.037286125123500824,
0.2016071230173111,
-0.08622386306524277,
-0.10912131518125534,
-0.09003980457782745,
0.10515134781599045,
0.016406642273068428,
-0.013087167404592037,
0.03435685113072395,
-0.013513830490410328,
0.04646967351436615,
0.11780157685279846,
-0.027394022792577744,
0.00018314736371394247,
-0.010470830835402012,
-0.0004068718699272722,
-0.07357828319072723,
0.07598695158958435,
0.10429420322179794,
-0.0069635785184800625,
-0.030917491763830185,
0.0015034443931654096,
0.03656463697552681,
0.046147048473358154,
-0.023874742910265923,
-0.06057636812329292,
-0.0990038812160492,
0.005455238278955221,
-0.06709296256303787,
0.002169287297874689,
-0.10933981835842133,
-0.029818110167980194,
-0.030184462666511536,
-0.007120373658835888,
-0.03741534799337387,
-0.014659087173640728,
-0.002055620774626732,
-0.00037883740151301026,
-0.04289259761571884,
0.11200963705778122,
-0.07840145379304886,
-0.026748480275273323,
0.042839232832193375,
-0.0663042962551117,
0.06697331368923187,
0.028782539069652557,
-0.06250973045825958,
0.011239810846745968,
-0.020143559202551842,
0.04675489291548729,
-0.10535289347171783,
0.0016638922970741987,
0.05525161698460579,
-0.10122091323137283,
0.017256679013371468,
-0.015231739729642868,
0.039867620915174484,
-0.0064982944168150425,
-0.015583072789013386,
-0.052932512015104294,
0.049866173416376114,
-0.06894596666097641,
0.005319708958268166,
-0.10149305313825607,
0.10120835155248642,
0.03796092048287392,
0.01508290134370327,
0.11180568486452103,
-0.07090029865503311,
-0.006351422052830458,
-0.15115901827812195,
0.015154119580984116,
0.02967224083840847,
-0.01723225973546505,
0.03852924332022667,
-0.05441739782691002,
0.041940852999687195,
-0.05172322317957878,
0.04317854717373848,
-0.002979700919240713,
-0.046267155557870865,
0.07165743410587311,
-0.1153009831905365,
-0.07360885292291641,
0.006392905954271555,
0.07979167252779007,
0.02326052449643612,
0.020598135888576508,
0.03110746666789055,
0.03089212439954281,
-0.01443913858383894,
-0.051439061760902405,
0.09265618771314621,
0.1869535595178604,
0.004083138424903154,
0.10731624066829681,
-0.017818287014961243,
-0.04404778778553009,
-0.048964936286211014,
0.024992629885673523,
-0.05606771633028984,
0.048479631543159485,
-0.07076666504144669,
0.113399438560009,
0.31258851289749146,
-0.10217747092247009,
0.061069414019584656,
0.0013164918636903167,
-0.06341933459043503,
-0.05518730729818344,
-0.27363893389701843,
-0.02705366723239422,
-0.02035645954310894,
0.0033419174142181873,
-0.06454037129878998,
0.05315769091248512,
0.08368545770645142,
0.0709753930568695,
0.013998867012560368,
0.14832328259944916,
0.010880254209041595,
-0.07712125778198242,
0.06482797861099243,
-0.011204736307263374,
0.03550159931182861,
-0.0102803660556674,
0.005824983585625887,
0.015886081382632256,
-0.01295024435967207,
0.03568103164434433,
0.08241426944732666,
0.0021238811314105988,
0.038204777985811234,
-0.0711512416601181,
-0.11331586539745331,
-0.019526777788996696,
0.03731658309698105,
-0.023339418694376945,
0.12483544647693634,
0.03647523745894432,
-0.07506443560123444,
-0.013660967350006104,
0.14174041152000427,
0.0000464232471131254,
-0.11935017257928848,
-0.12035933136940002,
0.08130640536546707,
0.0006279261433519423,
-0.03307824954390526,
-0.030383620411157608,
-0.07999702543020248,
0.0015420750714838505,
0.18347224593162537,
0.13278619945049286,
-0.0006668032729066908,
-0.0029907580465078354,
-0.06257670372724533,
-0.0026686382479965687,
-0.07873518764972687,
0.14739179611206055,
0.06612132489681244,
0.11558350920677185,
-0.04902782291173935,
0.07195338606834412,
-0.03527681529521942,
-0.06466067582368851,
-0.1322895586490631,
0.13644076883792877,
-0.014700209721922874,
0.030108457431197166,
-0.011936903931200504,
0.09308964014053345,
-0.0451495423913002,
-0.14585286378860474,
-0.0024707659613341093,
0.02328840270638466,
-0.06509453058242798,
-0.05288571119308472,
-0.0729680061340332,
0.045984841883182526,
0.06505240499973297,
0.01304400060325861,
0.001404748298227787,
0.11888251453638077,
0.04441309720277786,
-0.04025401175022125,
-0.07603447884321213,
0.11931557953357697,
-0.0554998554289341,
0.1858474761247635,
-0.021241873502731323,
0.055411748588085175,
0.10912490636110306,
-0.04672328382730484,
-0.15087442100048065,
0.04782150313258171,
0.03807239979505539,
0.021382620558142662,
-0.012961180880665779,
0.18894627690315247,
0.029813749715685844,
0.03710644692182541,
0.02764866128563881,
-0.046362221240997314,
-0.015095098875463009,
-0.05605969950556755,
0.06152239069342613,
-0.13836854696273804,
0.13109248876571655,
-0.04676421731710434,
0.15821446478366852,
0.06873581558465958,
-0.04255599528551102,
0.004363333806395531,
-0.08335275948047638,
0.020277297124266624,
0.029620777815580368,
-0.014144903048872948,
-0.01448743138462305,
-0.1855834275484085,
-0.016739660874009132,
-0.10173853486776352,
0.05610138922929764,
-0.23260053992271423,
-0.00853230245411396,
-0.0680553987622261,
0.013968282379209995,
-0.017599310725927353,
0.15136554837226868,
0.042956843972206116,
0.0636950433254242,
-0.0013329637004062533,
-0.14158311486244202,
-0.0310679841786623,
0.1273467093706131,
-0.15214958786964417,
-0.05133624002337456
] |
null | null |
transformers
|
## BERT base-uncased for in Swahili
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("flax-community/bert-base-uncased-swahili")
model = AutoModelForMaskedLM.from_pretrained("flax-community/bert-base-uncased-swahili")
print(round((model.num_parameters())/(1000*1000)),"Million Parameters")
110 Million Parameters
```
#### **Training Data**:
This model was trained on [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi)
#### **More Details**:
For more details and Demo please check [HF Swahili Space](https://huggingface.co/spaces/flax-community/Swahili)
|
{"language": "sw", "datasets": ["flax-community/swahili-safi"], "widget": [{"text": "Si kila mwenye makucha [MASK] simba."}]}
|
fill-mask
|
flax-community/bert-base-uncased-swahili
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"bert",
"fill-mask",
"sw",
"dataset:flax-community/swahili-safi",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sw"
] |
TAGS
#transformers #pytorch #jax #tensorboard #bert #fill-mask #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us
|
## BERT base-uncased for in Swahili
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
#### Training Data:
This model was trained on Swahili Safi
#### More Details:
For more details and Demo please check HF Swahili Space
|
[
"## BERT base-uncased for in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use",
"#### Training Data:\nThis model was trained on Swahili Safi",
"#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #bert #fill-mask #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"## BERT base-uncased for in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use",
"#### Training Data:\nThis model was trained on Swahili Safi",
"#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
67,
68,
4,
16,
18
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #bert #fill-mask #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n## BERT base-uncased for in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.## How to use#### Training Data:\nThis model was trained on Swahili Safi#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
-0.08946340531110764,
0.12039333581924438,
0.0024790733586996794,
0.0891512781381607,
0.1448570191860199,
0.01408599503338337,
0.10945817083120346,
0.06908778101205826,
0.00113571691326797,
-0.0045489114709198475,
0.11846794188022614,
-0.05787719786167145,
0.10643135756254196,
0.2694537043571472,
0.022198937833309174,
-0.25303328037261963,
-0.016748273745179176,
0.012871157377958298,
-0.13217514753341675,
0.12916937470436096,
0.11140522360801697,
-0.09306538850069046,
0.07759352028369904,
-0.021840814501047134,
-0.19227273762226105,
0.010133309289813042,
-0.06802891939878464,
-0.07467322051525116,
0.11606147140264511,
0.04367959871888161,
0.11482192575931549,
0.04202534258365631,
0.10722669214010239,
-0.05285555124282837,
0.05511269345879555,
0.03272010385990143,
-0.022162003442645073,
0.04231474548578262,
-0.003873613430187106,
0.03318382054567337,
0.11243882030248642,
-0.022189201787114143,
-0.016553739085793495,
0.06517550349235535,
-0.0971747413277626,
-0.16595233976840973,
-0.07584942132234573,
0.04104773700237274,
0.07712652534246445,
0.05586753413081169,
-0.007117698900401592,
0.13021814823150635,
-0.13025884330272675,
0.07418196648359299,
0.10704132914543152,
-0.1525191068649292,
-0.08898293226957321,
0.15227411687374115,
0.08937382698059082,
-0.021281538531184196,
-0.08313746005296707,
0.05633828416466713,
0.05262567102909088,
0.04528691619634628,
0.13086891174316406,
-0.05527569726109505,
-0.10408809781074524,
-0.01114412397146225,
-0.07834752649068832,
-0.019795041531324387,
0.25136813521385193,
0.0430033840239048,
-0.03775089234113693,
-0.07934454828500748,
-0.0014295103028416634,
0.12265530228614807,
-0.056568846106529236,
-0.020615369081497192,
0.04103356599807739,
-0.01145743764936924,
-0.049936987459659576,
-0.07334624975919724,
-0.09072369337081909,
-0.07741662114858627,
0.012220477685332298,
0.08435278385877609,
-0.024416757747530937,
0.051016293466091156,
-0.1451084315776825,
0.05917273089289665,
0.0011840463848784566,
-0.09024446457624435,
0.014835437759757042,
-0.010269846767187119,
-0.022937647998332977,
-0.0005044700228609145,
-0.0026038221549242735,
-0.13125106692314148,
0.08010534197092056,
0.013917764648795128,
-0.02447335608303547,
0.02808324061334133,
0.09758105129003525,
0.03914432227611542,
0.006764371879398823,
0.132829487323761,
-0.07635310292243958,
-0.1524469554424286,
0.06854447722434998,
-0.02036409080028534,
-0.051850784569978714,
-0.05636696517467499,
-0.0944092869758606,
-0.054740842431783676,
-0.03843550756573677,
-0.0023172686342149973,
0.01891280710697174,
0.07238335907459259,
0.017461001873016357,
-0.028494548052549362,
-0.011619980446994305,
-0.10273980349302292,
0.00344923441298306,
-0.004182233475148678,
-0.03924379497766495,
-0.022787019610404968,
0.1061733216047287,
0.012383614666759968,
-0.04210690036416054,
-0.05196511372923851,
-0.034447554498910904,
-0.0014089765027165413,
-0.06820902973413467,
-0.0357322059571743,
0.03790391609072685,
-0.12615123391151428,
0.008674506098031998,
-0.1696108877658844,
-0.2588454782962799,
-0.056548912078142166,
0.05635766312479973,
0.02037818729877472,
-0.058499936014413834,
-0.028247680515050888,
0.007712709717452526,
-0.008796434849500656,
-0.015515330247581005,
0.17150627076625824,
-0.0386270172894001,
0.05195920541882515,
-0.04814832657575607,
0.09618546813726425,
-0.05503794178366661,
0.05351352319121361,
-0.014347492717206478,
0.06492213159799576,
-0.13783162832260132,
0.08285507559776306,
-0.06951824575662613,
0.13108421862125397,
-0.08550287038087845,
0.003848252585157752,
0.010472162626683712,
0.010687116533517838,
0.06949552148580551,
0.2178834229707718,
-0.2459143102169037,
-0.04016350209712982,
0.16887757182121277,
-0.06512703001499176,
-0.134598508477211,
0.14563488960266113,
-0.041935332119464874,
0.18693073093891144,
0.026380373165011406,
0.1935790479183197,
0.045361343771219254,
-0.11002549529075623,
-0.01600562408566475,
0.07201769948005676,
-0.07172981649637222,
-0.10629487037658691,
0.054014820605516434,
0.05678340420126915,
-0.06319038569927216,
0.05295088514685631,
-0.060088999569416046,
0.1500885933637619,
-0.05235989764332771,
-0.06575051695108414,
-0.027987828478217125,
-0.15567903220653534,
0.0420740470290184,
0.057703543454408646,
0.07797446846961975,
-0.02087404392659664,
-0.013449243269860744,
-0.010839544236660004,
0.09229564666748047,
-0.059239864349365234,
0.020412033423781395,
-0.07144799083471298,
0.09870971739292145,
-0.04682672768831253,
0.02113613486289978,
-0.08636851608753204,
-0.03409482166171074,
-0.043143752962350845,
0.08149507641792297,
-0.02649639919400215,
0.022772403433918953,
0.10085848718881607,
-0.02165919914841652,
-0.033181529492139816,
-0.008088000118732452,
-0.007442002184689045,
0.019543521106243134,
-0.06279077380895615,
-0.08042356371879578,
-0.06734530627727509,
-0.061042048037052155,
0.13235822319984436,
-0.15471164882183075,
0.056395307183265686,
-0.07190918922424316,
0.13482782244682312,
-0.030192792415618896,
-0.0006736302748322487,
0.0723748654127121,
0.00589560903608799,
0.011564211919903755,
-0.07948371022939682,
0.06702736765146255,
0.04485730826854706,
-0.13880059123039246,
0.1221054196357727,
-0.008686190471053123,
0.11206983774900436,
0.10023721307516098,
-0.03652540594339371,
-0.037897516041994095,
0.020798176527023315,
-0.04093712568283081,
-0.03080730140209198,
-0.05229751765727997,
0.04557350277900696,
0.12487155199050903,
0.003241418395191431,
0.156646728515625,
-0.0568825788795948,
-0.022153465077280998,
0.023226173594594002,
-0.07207591831684113,
-0.020497117191553116,
0.0825086385011673,
-0.01979135349392891,
-0.051469750702381134,
0.09181836992502213,
0.14799466729164124,
-0.0005930476472713053,
0.2652079463005066,
-0.04491860792040825,
-0.024339603260159492,
-0.05615958571434021,
-0.05295630171895027,
-0.03690563887357712,
0.1490703970193863,
-0.18566232919692993,
-0.07404980063438416,
0.030550291761755943,
-0.010171135887503624,
0.022546567022800446,
-0.08832339197397232,
-0.04975533112883568,
0.001679154927842319,
-0.038769494742155075,
-0.027550160884857178,
0.07381013035774231,
-0.07956869900226593,
0.060295358300209045,
0.0024520347360521555,
-0.05314187332987785,
0.03898213058710098,
-0.001967744203284383,
-0.08524255454540253,
0.1613968014717102,
-0.02793698012828827,
-0.24896831810474396,
-0.017468372359871864,
-0.05766454339027405,
0.025223812088370323,
-0.026757357642054558,
0.00975741259753704,
-0.07093537598848343,
-0.01965690404176712,
-0.02265205606818199,
-0.06436906009912491,
0.005664272233843803,
0.006214524153620005,
0.021874381229281425,
-0.012197533622384071,
0.06107517331838608,
-0.09716924279928207,
-0.0027338427025824785,
-0.0017234538681805134,
-0.0149228535592556,
0.06891126930713654,
-0.07027367502450943,
0.11845625191926956,
0.07723578810691833,
-0.07312580198049545,
0.08955857157707214,
0.007022003643214703,
0.22660569846630096,
-0.12055781483650208,
0.09372688829898834,
0.18534523248672485,
-0.0064737736247479916,
0.008794323541224003,
0.03193725273013115,
0.01625816524028778,
-0.05075829103589058,
0.0028111874125897884,
-0.034882061183452606,
-0.12405886501073837,
-0.14975939691066742,
-0.06456620246171951,
-0.06124480813741684,
0.036162152886390686,
-0.01838630996644497,
0.045772258192300797,
-0.04999465122818947,
0.07071281969547272,
0.07106684148311615,
0.040604595094919205,
-0.029212357476353645,
0.003928390797227621,
-0.14304490387439728,
-0.049454882740974426,
0.05080730840563774,
-0.0764445811510086,
-0.021064499393105507,
0.04296683520078659,
0.028697961941361427,
0.061214692890644073,
-0.013880396261811256,
0.028234193101525307,
0.056513722985982895,
0.21036283671855927,
0.04545140638947487,
0.07869531214237213,
0.02978537045419216,
-0.05416949838399887,
-0.018855981528759003,
-0.0070121996104717255,
-0.06448468565940857,
0.03083346039056778,
0.08024424314498901,
0.0002391463058302179,
-0.05933232232928276,
-0.04477182775735855,
0.018131591379642487,
0.2133331298828125,
0.005490774288773537,
-0.18350601196289062,
-0.07342621684074402,
0.040527600795030594,
-0.020538942888379097,
-0.08746053278446198,
-0.01005615759640932,
0.1703028529882431,
-0.16488520801067352,
0.038049742579460144,
-0.02787417732179165,
0.11233637481927872,
0.04514573514461517,
0.010138222016394138,
-0.0648643746972084,
0.008246040903031826,
-0.028106996789574623,
0.08184102177619934,
-0.32051801681518555,
0.1766393482685089,
0.0054182917810976505,
0.09727810323238373,
-0.06991078704595566,
-0.01949421688914299,
0.09198936820030212,
0.1632884442806244,
0.1808215081691742,
0.05557151138782501,
0.06334500014781952,
-0.06658171117305756,
-0.08949305862188339,
0.031041353940963745,
-0.060866404324769974,
-0.008463025093078613,
0.02919181063771248,
0.04210071265697479,
-0.0017427644925191998,
-0.0008125033928081393,
0.145603209733963,
-0.13335685431957245,
-0.08010949939489365,
-0.0019380166195333004,
0.06584230810403824,
0.023960614576935768,
-0.05853812023997307,
-0.07919785380363464,
-0.17955459654331207,
0.07410051673650742,
0.11473514139652252,
-0.03820378705859184,
-0.1736125648021698,
-0.001419186475686729,
0.0014593831729143858,
-0.05427353456616402,
0.012472975067794323,
0.0307596605271101,
0.05335250496864319,
-0.05774865671992302,
-0.06024546176195145,
0.034025948494672775,
-0.10892681777477264,
-0.07325300574302673,
-0.014559956267476082,
0.013017338700592518,
0.08536390960216522,
0.03638475760817528,
0.06029839441180229,
-0.026627851650118828,
-0.061348866671323776,
-0.07926151901483536,
0.06337755918502808,
0.07197386771440506,
0.06101269647479057,
0.006221684627234936,
0.06054338812828064,
-0.014269715175032616,
0.034340448677539825,
-0.04183061048388481,
0.1571500599384308,
0.12565945088863373,
-0.09745333343744278,
0.1137179583311081,
0.15399561822414398,
-0.03715616464614868,
-0.3305499255657196,
-0.03529956564307213,
0.01916038803756237,
0.10481739044189453,
0.02328607439994812,
-0.08736556023359299,
0.05201796069741249,
-0.0586821511387825,
-0.01909354329109192,
-0.09484004229307175,
-0.15468613803386688,
-0.10529303550720215,
0.16222451627254486,
0.07965808361768723,
0.24248197674751282,
-0.08579663187265396,
0.005179419182240963,
-0.048291608691215515,
-0.13421842455863953,
0.07061251997947693,
-0.21297945082187653,
0.09386191517114639,
-0.03547525778412819,
0.06777245551347733,
-0.000717141549102962,
-0.07024569809436798,
0.1467820107936859,
-0.0038121857214719057,
0.03541889414191246,
-0.1093040406703949,
-0.03866119682788849,
0.13102594017982483,
-0.06961461901664734,
0.10919363796710968,
-0.04400407522916794,
0.012402978725731373,
-0.22112390398979187,
-0.018306978046894073,
-0.1074686050415039,
0.13641050457954407,
-0.026591498404741287,
-0.06507409363985062,
0.012350752018392086,
0.07985158264636993,
0.06988951563835144,
0.025197727605700493,
-0.0016180261736735702,
-0.048935893923044205,
0.0755792185664177,
0.17249982059001923,
0.14493654668331146,
-0.06423751264810562,
0.010315636172890663,
-0.024805935099720955,
-0.029843445867300034,
0.05322733893990517,
-0.18690520524978638,
0.0068448977544903755,
0.06094823777675629,
0.04868672415614128,
0.10146874934434891,
0.03965377062559128,
-0.08798098564147949,
0.010079878382384777,
0.05100098252296448,
-0.0896935760974884,
-0.15711183845996857,
-0.0824587419629097,
-0.116652712225914,
-0.08076436072587967,
0.027610797435045242,
0.07507298141717911,
-0.11632207781076431,
-0.0036837528459727764,
-0.0459231473505497,
-0.004130107816308737,
-0.10331346839666367,
0.12216546386480331,
0.12008205056190491,
0.04318958520889282,
-0.11030121892690659,
0.06792882829904556,
0.019861385226249695,
0.0023175713140517473,
0.03636440634727478,
0.08835793286561966,
-0.10501129925251007,
-0.06919120997190475,
0.06740057468414307,
0.2835409641265869,
0.026273906230926514,
-0.10820258408784866,
-0.08177311718463898,
-0.0994650200009346,
0.026983262971043587,
0.10934935510158539,
0.06614219397306442,
-0.006572495214641094,
-0.006270671263337135,
-0.027117570862174034,
-0.11429750174283981,
0.07818585634231567,
0.11720520257949829,
-0.017636282369494438,
-0.1006317213177681,
0.16906800866127014,
0.001574300811626017,
0.20082756876945496,
-0.08900009095668793,
-0.042223699390888214,
-0.12413115054368973,
0.0268320981413126,
-0.15034076571464539,
-0.022378552705049515,
-0.0610334649682045,
-0.0025941936764866114,
-0.028941098600625992,
-0.061248619109392166,
-0.019561413675546646,
0.04431098327040672,
-0.07030671834945679,
0.028073139488697052,
0.027905341237783432,
0.03763632848858833,
0.006199683528393507,
-0.05176825821399689,
0.027680911123752594,
-0.03231755644083023,
0.11572522670030594,
0.05304937809705734,
-0.018155906349420547,
0.014716114848852158,
-0.12318088114261627,
-0.022932138293981552,
-0.03381853550672531,
-0.031883347779512405,
0.09500040858983994,
-0.04006952792406082,
-0.017760440707206726,
-0.046658698469400406,
-0.014670263975858688,
0.007220597937703133,
0.12295868992805481,
-0.029013726860284805,
0.04262059926986694,
-0.03227917477488518,
-0.028304245322942734,
-0.07355202734470367,
0.09358975291252136,
0.04073897749185562,
0.14715836942195892,
0.051051173359155655,
-0.04403210058808327,
0.08173170685768127,
-0.13425195217132568,
-0.009510194882750511,
-0.020712250843644142,
-0.09971829503774643,
-0.04394392669200897,
-0.06061815842986107,
0.05308586359024048,
-0.03344830870628357,
0.0893317312002182,
0.11093073338270187,
-0.039653804153203964,
-0.0423470064997673,
-0.0018309070728719234,
-0.017507759854197502,
-0.044089384377002716,
0.17197903990745544,
-0.056086111813783646,
0.026371104642748833,
0.09551630169153214,
0.09016867727041245,
-0.0003714211634360254,
0.08386443555355072,
0.011735783889889717,
0.05315325781702995,
-0.024318207055330276,
0.1247960776090622,
-0.002826015930622816,
-0.04178156331181526,
-0.04132121801376343,
0.01683925837278366,
-0.054158903658390045,
0.10214050114154816,
-0.1195920780301094,
0.0074572134763002396,
0.10820247232913971,
-0.16130803525447845,
0.07354602217674255,
-0.01085592433810234,
-0.09703554958105087,
-0.11673394590616226,
-0.15951701998710632,
-0.040476344525814056,
-0.05729211866855621,
0.013546934351325035,
-0.045946281403303146,
-0.03109556995332241,
0.07534467428922653,
-0.0008574968087486923,
-0.012270194478332996,
0.10587330162525177,
0.008456260897219181,
-0.011196419596672058,
0.13394902646541595,
-0.024566013365983963,
-0.006724905222654343,
-0.12611360847949982,
0.007061454467475414,
-0.039142802357673645,
-0.0018897282425314188,
-0.015976836904883385,
-0.05123107507824898,
-0.012627741321921349,
0.05021252855658531,
0.02496728114783764,
-0.08735550940036774,
-0.013025535270571709,
0.0023054000921547413,
0.09322569519281387,
0.025845656171441078,
0.03277485445141792,
-0.03220919147133827,
-0.009655194357037544,
0.1968209594488144,
-0.04653112590312958,
-0.0403594933450222,
-0.1540369987487793,
-0.02113710157573223,
-0.05151668190956116,
-0.03160518407821655,
-0.0019094145391136408,
-0.04914777725934982,
0.028429489582777023,
0.2834510803222656,
0.19433510303497314,
-0.08959899097681046,
0.053089022636413574,
-0.016655171290040016,
-0.0024456086102873087,
-0.02598847635090351,
0.15454941987991333,
0.05592804402112961,
0.0078384755179286,
-0.07967427372932434,
0.011675064451992512,
-0.07092905789613724,
-0.10712641477584839,
-0.07263645529747009,
0.12938643991947174,
-0.0016721830470487475,
-0.021991925314068794,
-0.022928249090909958,
0.05188625678420067,
-0.10604758560657501,
-0.16250857710838318,
0.025092465803027153,
-0.12989073991775513,
-0.09360945969820023,
-0.06637309491634369,
-0.05354586988687515,
0.11501317471265793,
0.046203773468732834,
-0.020033586770296097,
0.10320527106523514,
0.1523895412683487,
-0.0008010550518520176,
-0.04474317282438278,
-0.005937253590673208,
0.1150059849023819,
-0.112123042345047,
0.16352702677249908,
-0.039133019745349884,
-0.019520875066518784,
0.05369730293750763,
-0.004581363871693611,
-0.11543447524309158,
0.05951419100165367,
-0.008134535513818264,
0.05736599490046501,
-0.00614487798884511,
0.1160077378153801,
-0.02882758341729641,
-0.01905844919383526,
-0.04466230794787407,
-0.11394072324037552,
0.053794100880622864,
0.014956437982618809,
-0.01723824255168438,
-0.0814233273267746,
0.10122842341661453,
-0.035988736897706985,
0.1240188404917717,
0.10059564560651779,
-0.05825747922062874,
-0.03125040605664253,
-0.09312505275011063,
0.04330461472272873,
0.018303776159882545,
-0.0124757569283247,
-0.05698184669017792,
-0.08560415357351303,
-0.03696465119719505,
-0.09306727349758148,
-0.044504519551992416,
-0.08565710484981537,
-0.024386927485466003,
-0.11371629685163498,
-0.08745020627975464,
0.03636547550559044,
0.07820387184619904,
0.13296586275100708,
0.027927592396736145,
-0.009530702605843544,
0.05206763371825218,
-0.012714603915810585,
0.0990452915430069,
-0.09606710821390152,
-0.051975447684526443
] |
null | null |
transformers
|
## Swahili News Classification with BERT
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
This [model](https://huggingface.co/flax-community/bert-base-uncased-swahili) was used as the base and fine-tuned for this task.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("flax-community/bert-swahili-news-classification")
model = AutoModelForSequenceClassification.from_pretrained("flax-community/bert-swahili-news-classification")
```
```
Eval metrics (10% valid set): {'accuracy': 0.9114740008594757}
```
|
{"language": "sw", "datasets": ["flax-community/swahili-safi"], "widget": [{"text": "Idris ameandika kwenye ukurasa wake wa Instagram akimkumbusha Diamond kutekeleza ahadi yake kumpigia Zari magoti kumuomba msamaha kama alivyowahi kueleza awali.Idris ameandika;"}]}
|
text-classification
|
flax-community/bert-swahili-news-classification
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"sw",
"dataset:flax-community/swahili-safi",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sw"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #bert #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us
|
## Swahili News Classification with BERT
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
This model was used as the base and fine-tuned for this task.
## How to use
|
[
"## Swahili News Classification with BERT\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.",
"## How to use"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #bert #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"## Swahili News Classification with BERT\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.",
"## How to use"
] |
[
72,
81,
4
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #bert #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n## Swahili News Classification with BERT\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.## How to use"
] |
[
-0.036465276032686234,
0.04085921123623848,
-0.0014215358532965183,
0.05784924328327179,
0.1319863498210907,
0.005672872066497803,
0.12665745615959167,
0.054094862192869186,
0.004339873790740967,
-0.029439225792884827,
0.12801587581634521,
0.029977472499012947,
0.047934453934431076,
0.2119990438222885,
-0.05605490878224373,
-0.23730099201202393,
0.012522260658442974,
0.02195126563310623,
-0.04520651325583458,
0.15356168150901794,
0.09896862506866455,
-0.0926579087972641,
0.08926068246364594,
-0.028521746397018433,
-0.1432461440563202,
0.02360549010336399,
-0.004936028737574816,
-0.09745097905397415,
0.11002181470394135,
0.0625719204545021,
0.10303345322608948,
0.07213206589221954,
0.04203486442565918,
-0.057530928403139114,
0.07012210786342621,
0.027083732187747955,
-0.07860518246889114,
0.053365349769592285,
0.07226311415433884,
0.011318901553750038,
0.09880074113607407,
-0.007016483694314957,
-0.014730623923242092,
0.05038769915699959,
-0.12450684607028961,
-0.0970928892493248,
-0.04683174192905426,
0.04182197153568268,
0.11302880942821503,
0.050454121083021164,
-0.014526240527629852,
0.19667400419712067,
-0.13167811930179596,
0.11011427640914917,
0.08017376065254211,
-0.2729009985923767,
-0.09775710850954056,
0.1736680269241333,
0.09097113460302353,
0.012987657450139523,
-0.04567398875951767,
0.06759291142225266,
0.03806760534644127,
0.010265516117215157,
0.09733718633651733,
-0.06543956696987152,
-0.09552769362926483,
-0.01850266009569168,
-0.10329601913690567,
-0.02520567923784256,
0.23615053296089172,
0.011440365575253963,
-0.007142391987144947,
-0.0729374885559082,
-0.0513293482363224,
0.1037011593580246,
-0.06373891979455948,
-0.04285454750061035,
0.01042048167437315,
0.02445903979241848,
-0.00559241184964776,
-0.023692933842539787,
-0.09935295581817627,
-0.08414412289857864,
-0.07328562438488007,
0.12523508071899414,
-0.011700576171278954,
0.0495537593960762,
-0.19337345659732819,
0.06690628826618195,
0.018726754933595657,
-0.09339939802885056,
0.042764052748680115,
-0.041330449283123016,
0.02098992094397545,
-0.023000722751021385,
-0.005062729120254517,
-0.16118134558200836,
0.11143471300601959,
0.014392689801752567,
-0.014838812872767448,
0.008498843759298325,
0.04615691304206848,
0.04371919110417366,
0.04186394438147545,
0.08714541792869568,
-0.069544218480587,
-0.10551956295967102,
0.10113392025232315,
-0.014976074919104576,
0.0015379288233816624,
-0.039957642555236816,
-0.07850681245326996,
-0.013529150746762753,
0.029879996553063393,
0.02173592336475849,
0.06373804807662964,
0.13683350384235382,
-0.0016918499022722244,
-0.03393113240599632,
0.057183071970939636,
-0.08173105865716934,
-0.006315502338111401,
0.02868485078215599,
-0.0003025415644515306,
-0.05218154937028885,
0.05593005567789078,
0.0011889649322256446,
-0.046402446925640106,
0.01955876313149929,
-0.046877648681402206,
-0.026856502518057823,
-0.03406066447496414,
-0.09619078785181046,
0.03219744935631752,
-0.15308107435703278,
-0.008697118610143661,
-0.18956148624420166,
-0.20528188347816467,
-0.06186920776963234,
0.009165662340819836,
0.017848428338766098,
-0.058283329010009766,
-0.043383385986089706,
0.003921829164028168,
0.014469699002802372,
-0.03559022396802902,
0.11736806482076645,
-0.06893319636583328,
0.06966351717710495,
-0.07003157585859299,
0.08285650610923767,
-0.07227739691734314,
0.028552526608109474,
-0.0473470501601696,
0.049222901463508606,
-0.09497834742069244,
0.06735298037528992,
-0.04204203188419342,
0.14035890996456146,
-0.06468552350997925,
0.03835270181298256,
-0.031977392733097076,
0.03642014041543007,
0.021403539925813675,
0.2103847712278366,
-0.20658919215202332,
-0.04689278081059456,
0.13412651419639587,
-0.10422254353761673,
-0.14216676354408264,
0.14454296231269836,
-0.0413619689643383,
0.1416071504354477,
0.0618157796561718,
0.2130119353532791,
0.04385434091091156,
-0.02778637781739235,
0.014076628722250462,
0.062264323234558105,
-0.11116056144237518,
-0.09990540146827698,
0.026983970776200294,
0.06452400982379913,
-0.16571903228759766,
0.034081168472766876,
-0.0657692402601242,
0.13010594248771667,
-0.054837580770254135,
-0.03854057565331459,
-0.017870714887976646,
-0.07433073222637177,
0.06517786532640457,
0.03053254261612892,
0.07017964124679565,
-0.049996983259916306,
-0.007997805252671242,
-0.028111763298511505,
0.026994328945875168,
-0.030034951865673065,
0.01645686663687229,
-0.10267452895641327,
0.1333857923746109,
-0.0624426044523716,
0.07268979400396347,
-0.11924686282873154,
-0.04463254660367966,
-0.02536497265100479,
0.1393454372882843,
-0.06244288384914398,
0.03938419371843338,
0.07313047349452972,
-0.07222528755664825,
-0.014944622293114662,
-0.01749229058623314,
0.08527369052171707,
0.019655125215649605,
-0.0905957892537117,
-0.09226106107234955,
-0.00809361133724451,
-0.0730648934841156,
0.040791288018226624,
-0.16021092236042023,
0.050121601670980453,
0.021913690492510796,
0.06944817304611206,
-0.004902157001197338,
0.03613722324371338,
0.07669580727815628,
0.02751087211072445,
-0.02688046358525753,
-0.027222556993365288,
0.10855285078287125,
0.036119330674409866,
-0.09759750962257385,
0.16704320907592773,
-0.10159022361040115,
0.18855783343315125,
0.13098138570785522,
-0.08257903903722763,
-0.027408037334680557,
0.05941883102059364,
-0.04221256822347641,
-0.0037755663506686687,
-0.05298887565732002,
0.04812351614236832,
0.08433444052934647,
-0.011716589331626892,
0.1615452766418457,
-0.0735657811164856,
-0.03257414326071739,
0.03359423950314522,
-0.054830390959978104,
-0.02184036560356617,
0.09807366132736206,
-0.0350443534553051,
-0.09126891940832138,
0.09172853827476501,
0.1531568318605423,
0.035901106894016266,
0.20619197189807892,
-0.01942518539726734,
0.021527329459786415,
0.0015289769507944584,
-0.014396059326827526,
-0.011914970353245735,
0.0356084406375885,
-0.18139491975307465,
-0.06793508678674698,
0.037195704877376556,
-0.012811534106731415,
0.009900381788611412,
-0.07459201663732529,
-0.041287802159786224,
-0.007863749749958515,
-0.016230644658207893,
-0.04633036255836487,
0.0840427428483963,
-0.04598158597946167,
0.10624345391988754,
0.0023127456661313772,
-0.10073550790548325,
0.04490162804722786,
-0.01883239857852459,
-0.11536732316017151,
0.16506235301494598,
0.00010933245357591659,
-0.3043373227119446,
-0.04130452126264572,
-0.07850439101457596,
-0.027761580422520638,
0.002045921515673399,
0.057573702186346054,
-0.047121379524469376,
-0.027822362259030342,
-0.08439596742391586,
-0.05737964063882828,
0.07381869852542877,
0.04515208303928375,
-0.007536957506090403,
0.006306985393166542,
0.03712492808699608,
-0.09295537322759628,
-0.00888750795274973,
-0.06204596161842346,
-0.03939355909824371,
0.12515614926815033,
-0.09961430728435516,
0.1209181696176529,
0.08987919241189957,
-0.05078936740756035,
0.06395602226257324,
-0.025666454806923866,
0.20813918113708496,
-0.08616932481527328,
0.07310210913419724,
0.18197894096374512,
-0.054642800241708755,
0.0004422562487889081,
0.09075737744569778,
0.01851680502295494,
-0.052840616554021835,
0.03811945766210556,
-0.04182405397295952,
-0.12206532061100006,
-0.17365723848342896,
-0.09494854509830475,
-0.047023456543684006,
0.08501014858484268,
0.0091794952750206,
0.07872044295072556,
-0.0430920347571373,
0.08385725319385529,
0.03223564848303795,
0.02730158157646656,
0.022592099383473396,
0.030104555189609528,
-0.028898421674966812,
-0.02362794242799282,
0.09648577123880386,
-0.08124275505542755,
-0.06261444091796875,
0.07755287736654282,
-0.020364727824926376,
0.026491032913327217,
0.029119249433279037,
-0.01377888210117817,
0.05759873986244202,
0.14798913896083832,
0.11978693306446075,
0.12567433714866638,
-0.014971606433391571,
-0.04022282361984253,
-0.02815433405339718,
-0.03345339000225067,
-0.06354429572820663,
0.013354451395571232,
0.02027379535138607,
-0.01492386870086193,
-0.0462380088865757,
-0.09701066464185715,
0.06130029633641243,
0.22837728261947632,
0.026324257254600525,
-0.21218165755271912,
-0.0272184070199728,
0.035534147173166275,
0.0025026986841112375,
-0.06433913856744766,
0.033930301666259766,
0.1325606256723404,
-0.09315935522317886,
0.05676548555493355,
-0.010540970601141453,
0.13345780968666077,
-0.003596162423491478,
0.030246714130043983,
-0.08897753059864044,
-0.07518087327480316,
-0.04696197435259819,
0.09126012772321701,
-0.28773564100265503,
0.19285671412944794,
0.00862449873238802,
0.04319536313414574,
-0.03754505515098572,
-0.05203118175268173,
0.07646622508764267,
0.2277470976114273,
0.17532283067703247,
0.028227852657437325,
0.13896328210830688,
-0.06846693903207779,
-0.15652067959308624,
0.047723524272441864,
-0.0523298941552639,
-0.03295375034213066,
0.03248050808906555,
0.024547116830945015,
-0.017021318897604942,
0.008083432912826538,
0.09250694513320923,
-0.1050875261425972,
-0.04418065398931503,
-0.027377258986234665,
0.09183211624622345,
0.033985305577516556,
-0.0272182859480381,
-0.08961684256792068,
-0.15945100784301758,
0.10198970884084702,
0.03990761563181877,
-0.05337009206414223,
-0.1518113911151886,
-0.04306076094508171,
-0.05490875989198685,
-0.038725901395082474,
-0.02384076453745365,
0.01271083764731884,
0.041482988744974136,
-0.04314514994621277,
-0.13167455792427063,
0.1016913503408432,
-0.11022788286209106,
-0.10500361770391464,
-0.029294144362211227,
0.03861277922987938,
0.03264518454670906,
-0.0016714464873075485,
0.07405674457550049,
-0.0002446306752972305,
-0.059058766812086105,
-0.08446356654167175,
0.023384064435958862,
0.07777770608663559,
0.0424858033657074,
-0.02027304284274578,
0.02335314452648163,
-0.127638578414917,
0.0068751126527786255,
0.005882981698960066,
0.1501135528087616,
0.14375406503677368,
-0.07639759033918381,
0.1165609210729599,
0.16317114233970642,
-0.036706481128931046,
-0.37260007858276367,
-0.09604170173406601,
-0.013638745062053204,
0.02277423068881035,
0.03680100664496422,
-0.0215726587921381,
0.09619345515966415,
-0.02377086877822876,
-0.03491402044892311,
-0.03414534032344818,
-0.15740735828876495,
-0.107062928378582,
0.16241343319416046,
0.08978401124477386,
0.4198669493198395,
-0.12952609360218048,
-0.038943737745285034,
-0.04390183463692665,
-0.100282222032547,
0.12536443769931793,
-0.19650791585445404,
0.06399422138929367,
-0.04327334091067314,
-0.0333062969148159,
-0.003476227866485715,
-0.05736258625984192,
0.10345780104398727,
-0.007361509837210178,
0.03907674923539162,
-0.1120791956782341,
-0.08843044936656952,
0.09040819108486176,
-0.06107384338974953,
0.08926159888505936,
-0.04413631185889244,
0.035104017704725266,
-0.20047658681869507,
-0.04167163744568825,
-0.08621972054243088,
0.11766468733549118,
-0.013650719076395035,
-0.03721684217453003,
0.028716476634144783,
0.010506538674235344,
0.03690958768129349,
0.02022051252424717,
0.14796870946884155,
-0.07364676892757416,
0.12261397391557693,
0.2014581859111786,
0.16521944105625153,
-0.10629205405712128,
0.03590817376971245,
-0.028217501938343048,
-0.05413830280303955,
0.08870375901460648,
-0.14919427037239075,
0.017646130174398422,
0.04901808127760887,
0.014080184511840343,
0.06344784051179886,
0.07324377447366714,
-0.035208724439144135,
0.008486451581120491,
0.07309947907924652,
-0.19356924295425415,
-0.12662401795387268,
-0.062100209295749664,
-0.09745794534683228,
-0.027373384684324265,
0.07695087045431137,
0.11800918728113174,
-0.06330305337905884,
-0.021894970908761024,
-0.013571924529969692,
0.002557060681283474,
-0.028903549537062645,
0.10399530827999115,
0.0879964753985405,
0.03270765766501427,
-0.1202884390950203,
0.04940864071249962,
0.04863358661532402,
-0.04720921069383621,
-0.0041243163868784904,
0.07315269112586975,
-0.15170425176620483,
-0.10316134989261627,
0.021246513351798058,
0.2558899521827698,
0.03938251733779907,
-0.10944638401269913,
-0.08586402982473373,
-0.12362834811210632,
0.05567745491862297,
0.21104653179645538,
0.10377523303031921,
0.044290874153375626,
-0.011336485855281353,
-0.06413880735635757,
-0.05565158277750015,
0.08673901855945587,
0.07720079272985458,
0.003006769809871912,
-0.1130743995308876,
0.08229602873325348,
0.010109156370162964,
0.1648648977279663,
-0.11853467673063278,
-0.0614740289747715,
-0.15515229105949402,
0.0215716902166605,
-0.08610844612121582,
0.02563522756099701,
-0.06445349752902985,
0.006052845157682896,
-0.037397127598524094,
-0.04267295449972153,
-0.04803907126188278,
0.0123744523152709,
-0.05621752887964249,
0.03142504021525383,
0.011167076416313648,
0.04842871055006981,
-0.03371520712971687,
-0.05354014039039612,
0.004193582572042942,
-0.02954239957034588,
0.13372087478637695,
0.03665982186794281,
-0.07093340158462524,
0.00944146141409874,
-0.19190959632396698,
-0.032570939511060715,
0.038528598845005035,
-0.020718930289149284,
0.06067877635359764,
-0.005238882265985012,
0.017320526763796806,
0.010439976118505001,
-0.04210112616419792,
0.03509841114282608,
0.1431676298379898,
-0.07038524001836777,
0.09429748356342316,
-0.03714035078883171,
-0.05173632502555847,
-0.028899166733026505,
0.01852983981370926,
0.04410918802022934,
0.128914013504982,
0.12331262230873108,
-0.07765872031450272,
0.030767207965254784,
-0.1271522045135498,
0.013153254054486752,
-0.020436178892850876,
-0.14862501621246338,
-0.12285630404949188,
-0.036250174045562744,
0.04588794708251953,
-0.02776012010872364,
0.11804327368736267,
0.15550149977207184,
-0.0737592950463295,
-0.010539546608924866,
0.0550367571413517,
0.003033484099432826,
-0.03836064785718918,
0.18063150346279144,
-0.029163092374801636,
0.0021628732793033123,
0.027352171018719673,
0.07139356434345245,
0.039897456765174866,
0.036890510469675064,
0.027144502848386765,
0.028362421318888664,
-0.04515313357114792,
0.11691299825906754,
-0.007321872282773256,
0.01632961444556713,
0.02288656495511532,
-0.05404994636774063,
-0.10768460482358932,
0.10471555590629578,
-0.05556277185678482,
0.010243714787065983,
0.14188295602798462,
-0.05553014203906059,
0.021669279783964157,
-0.0019557508639991283,
-0.033916495740413666,
-0.13881535828113556,
-0.16800148785114288,
-0.0973958969116211,
-0.07975638657808304,
-0.007097339723259211,
-0.06565582007169724,
-0.06747501343488693,
0.09410683065652847,
0.01549128070473671,
-0.01760842651128769,
0.052548691630363464,
-0.008081833831965923,
-0.003843169892206788,
0.16426235437393188,
-0.039134591817855835,
-0.01751861907541752,
-0.09155851602554321,
0.013332394883036613,
-0.018316691741347313,
0.008538063615560532,
-0.038153305649757385,
0.006391771603375673,
0.025029977783560753,
0.051387887448072433,
-0.03131508454680443,
-0.11027250438928604,
-0.026624932885169983,
0.028472328558564186,
0.03280697762966156,
0.05582492798566818,
0.04701948165893555,
-0.022026296705007553,
0.02877361699938774,
0.19503138959407806,
-0.004750572144985199,
-0.015652261674404144,
-0.08762577921152115,
0.07263687998056412,
-0.008483187295496464,
0.03506852686405182,
-0.009899855591356754,
-0.052806973457336426,
0.02613304927945137,
0.2792259156703949,
0.22864669561386108,
-0.05860413983464241,
0.06699611991643906,
-0.03160444647073746,
0.021103952080011368,
0.05058184266090393,
0.16493670642375946,
0.06251460313796997,
0.10048727691173553,
-0.057092249393463135,
-0.023737221956253052,
-0.06536965072154999,
-0.05509592592716217,
-0.05000920966267586,
0.05725080519914627,
0.03387409821152687,
-0.037431079894304276,
-0.04075869917869568,
0.0832119733095169,
-0.08889654278755188,
-0.06372726708650589,
0.014686351642012596,
-0.12074754387140274,
-0.06881437450647354,
-0.05624918267130852,
0.007394297514110804,
0.056583914905786514,
0.06229427829384804,
-0.0219507347792387,
0.066982701420784,
0.06520743668079376,
-0.01465557236224413,
-0.09053991734981537,
-0.0022534315939992666,
0.07832492142915726,
-0.10000677406787872,
0.15319345891475677,
-0.030711641535162926,
-0.013523038476705551,
0.040914662182331085,
-0.017165374010801315,
-0.11262644082307816,
0.11748486757278442,
-0.04240388795733452,
0.050710756331682205,
0.04301196709275246,
0.01751885376870632,
-0.011399876326322556,
-0.053784601390361786,
-0.007807601243257523,
-0.11034674942493439,
0.05557646602392197,
-0.03510304540395737,
-0.046236664056777954,
-0.0923270434141159,
0.1161661148071289,
-0.027702955529093742,
0.11273273080587387,
0.07263656705617905,
-0.041621457785367966,
-0.018163133412599564,
-0.06995165348052979,
0.0112150264903903,
-0.025875823572278023,
-0.04293796047568321,
-0.025932757183909416,
-0.10605653375387192,
-0.018670152872800827,
-0.08481491357088089,
-0.015497403219342232,
-0.16795530915260315,
0.02801290526986122,
-0.1602906882762909,
-0.04658925533294678,
-0.024240655824542046,
0.04116371273994446,
0.12200673669576645,
0.021243048831820488,
-0.01362629421055317,
0.009922366589307785,
0.02696274220943451,
0.11107712239027023,
-0.06794911623001099,
-0.10391346365213394
] |
null | null |
transformers
|
# NOTE: This repository is now superseded by https://huggingface.co/bertin-project/bertin-roberta-base-spanish. This model corresponds to the `beta` version of the model using stepwise over sampling trained for 200k steps with 128 sequence lengths. Version 1 is now available and should be used instead.
# BERTIN
BERTIN is a series of BERT-based models for Spanish. This one is a RoBERTa-large model trained from scratch on the Spanish portion of mC4 using [Flax](https://github.com/google/flax), including training scripts.
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Spanish mC4
The Spanish portion of mC4 containes about 416 million records and 235 billion words.
```bash
$ zcat c4/multilingual/c4-es*.tfrecord*.json.gz | wc -l
416057992
```
```bash
$ zcat c4/multilingual/c4-es*.tfrecord-*.json.gz | jq -r '.text | split(" ") | length' | paste -s -d+ - | bc
235303687795
```
## Team members
- Javier de la Rosa ([versae](https://huggingface.co/versae))
- Eduardo González ([edugp](https://huggingface.co/edugp))
- Paulo Villegas ([paulo](https://huggingface.co/paulo))
- Pablo González de Prado ([Pablogps](https://huggingface.co/Pablogps))
- Manu Romero ([mrm8488](https://huggingface.co/))
- María Grandury ([mariagrandury](https://huggingface.co/))
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
- [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/bertin-roberta-large-spanish/)
|
{"language": "es", "license": "cc-by-4.0", "tags": ["spanish", "roberta"], "pipeline_tag": "fill-mask", "widget": [{"text": "Fui a la librer\u00eda a comprar un <mask>."}]}
|
fill-mask
|
flax-community/bertin-roberta-large-spanish
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"roberta",
"fill-mask",
"spanish",
"es",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #safetensors #roberta #fill-mask #spanish #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
|
# NOTE: This repository is now superseded by URL This model corresponds to the 'beta' version of the model using stepwise over sampling trained for 200k steps with 128 sequence lengths. Version 1 is now available and should be used instead.
# BERTIN
BERTIN is a series of BERT-based models for Spanish. This one is a RoBERTa-large model trained from scratch on the Spanish portion of mC4 using Flax, including training scripts.
This is part of the
Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## Spanish mC4
The Spanish portion of mC4 containes about 416 million records and 235 billion words.
## Team members
- Javier de la Rosa (versae)
- Eduardo González (edugp)
- Paulo Villegas (paulo)
- Pablo González de Prado (Pablogps)
- Manu Romero (mrm8488)
- María Grandury (mariagrandury)
## Useful links
- Community Week timeline
- Community Week README
- Community Week thread
- Community Week channel
- Masked Language Modelling example scripts
- Model Repository
|
[
"# NOTE: This repository is now superseded by URL This model corresponds to the 'beta' version of the model using stepwise over sampling trained for 200k steps with 128 sequence lengths. Version 1 is now available and should be used instead.",
"# BERTIN\n\nBERTIN is a series of BERT-based models for Spanish. This one is a RoBERTa-large model trained from scratch on the Spanish portion of mC4 using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Spanish mC4\n\nThe Spanish portion of mC4 containes about 416 million records and 235 billion words.",
"## Team members\n\n- Javier de la Rosa (versae)\n- Eduardo González (edugp)\n- Paulo Villegas (paulo)\n- Pablo González de Prado (Pablogps)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #roberta #fill-mask #spanish #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# NOTE: This repository is now superseded by URL This model corresponds to the 'beta' version of the model using stepwise over sampling trained for 200k steps with 128 sequence lengths. Version 1 is now available and should be used instead.",
"# BERTIN\n\nBERTIN is a series of BERT-based models for Spanish. This one is a RoBERTa-large model trained from scratch on the Spanish portion of mC4 using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Spanish mC4\n\nThe Spanish portion of mC4 containes about 416 million records and 235 billion words.",
"## Team members\n\n- Javier de la Rosa (versae)\n- Eduardo González (edugp)\n- Paulo Villegas (paulo)\n- Pablo González de Prado (Pablogps)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
59,
59,
78,
24,
56,
36
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #roberta #fill-mask #spanish #es #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n# NOTE: This repository is now superseded by URL This model corresponds to the 'beta' version of the model using stepwise over sampling trained for 200k steps with 128 sequence lengths. Version 1 is now available and should be used instead.# BERTIN\n\nBERTIN is a series of BERT-based models for Spanish. This one is a RoBERTa-large model trained from scratch on the Spanish portion of mC4 using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## Spanish mC4\n\nThe Spanish portion of mC4 containes about 416 million records and 235 billion words.## Team members\n\n- Javier de la Rosa (versae)\n- Eduardo González (edugp)\n- Paulo Villegas (paulo)\n- Pablo González de Prado (Pablogps)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
-0.09594554454088211,
0.22462107241153717,
-0.0044957539066672325,
0.06707077473402023,
0.0845780000090599,
0.0241260826587677,
0.10612159222364426,
0.053667861968278885,
0.03853448107838631,
0.11176437884569168,
0.024852875620126724,
0.030377082526683807,
0.03953984007239342,
0.10114073008298874,
-0.010595676489174366,
-0.20410563051700592,
-0.009210796095430851,
-0.05923726037144661,
-0.1739017367362976,
0.07799958437681198,
0.048776496201753616,
-0.029602155089378357,
0.05636164918541908,
-0.04243317246437073,
-0.017506388947367668,
0.04605169594287872,
0.005953694693744183,
-0.15012560784816742,
0.1152677983045578,
0.023862449452280998,
0.022833343595266342,
0.030021950602531433,
0.030680779367685318,
0.0069435457699000835,
0.011588361114263535,
0.03734269738197327,
-0.029143130406737328,
0.0229036808013916,
0.08937390893697739,
0.03746538609266281,
0.10910098999738693,
-0.08960231393575668,
0.04421656206250191,
0.03861237317323685,
-0.1953241527080536,
-0.07113966345787048,
-0.11287489533424377,
-0.05831187963485718,
0.02765498496592045,
0.0732433944940567,
-0.0021534974221140146,
0.11625031381845474,
-0.09988447278738022,
-0.03468748927116394,
0.15092109143733978,
-0.179219588637352,
-0.08550155162811279,
0.1678423434495926,
0.09838832169771194,
0.026513921096920967,
0.008311538957059383,
0.09412667900323868,
0.0693717896938324,
-0.001860931166447699,
0.004240124952048063,
-0.06929505616426468,
0.09827152639627457,
-0.06401893496513367,
-0.06424783170223236,
-0.039732594043016434,
0.14831386506557465,
0.04382065311074257,
0.04811674356460571,
-0.08771409839391708,
-0.06456579267978668,
0.16468408703804016,
-0.0780717059969902,
0.006516187451779842,
0.047966368496418,
0.009016526862978935,
-0.020386097952723503,
0.030866213142871857,
-0.023114070296287537,
-0.03610572591423988,
-0.06248411163687706,
0.05212262272834778,
-0.019619999453425407,
0.028251612558960915,
-0.06638655811548233,
-0.037312332540750504,
-0.0564262792468071,
-0.09705058485269547,
0.005674232263118029,
0.009542104788124561,
0.058794695883989334,
-0.013305798172950745,
0.04176696017384529,
0.0028859400190413,
0.15405434370040894,
0.04681934043765068,
-0.12596149742603302,
0.0006185444653965533,
0.05230255052447319,
0.013482501730322838,
0.08103608340024948,
-0.008194198831915855,
-0.010532170534133911,
-0.12904495000839233,
0.015215989202260971,
-0.035006169229745865,
-0.04857326298952103,
0.0011022696271538734,
-0.061002571135759354,
0.013627712614834309,
-0.07805698364973068,
0.05709563195705414,
0.07590226083993912,
0.0668395534157753,
-0.020214026793837547,
-0.020834608003497124,
0.13698206841945648,
-0.08851446211338043,
0.04845082387328148,
0.01629052683711052,
-0.0674043744802475,
0.04318516328930855,
-0.09943055361509323,
-0.04555647075176239,
-0.03832104429602623,
-0.04111665487289429,
-0.05563686788082123,
-0.10791850090026855,
-0.058120500296354294,
-0.06439942866563797,
0.09778593480587006,
-0.13480380177497864,
-0.04876389726996422,
-0.16261093318462372,
-0.05853228271007538,
-0.14560498297214508,
-0.04206313192844391,
-0.08634872734546661,
-0.0648595541715622,
-0.0885465145111084,
0.045321885496377945,
-0.007539468351751566,
-0.024769151583313942,
0.09336815029382706,
-0.07730265706777573,
0.060543615370988846,
-0.014334140345454216,
0.08190441131591797,
-0.022866472601890564,
0.03373689576983452,
0.00011182578600710258,
0.04859283193945885,
-0.1399732381105423,
0.09205222129821777,
-0.02871154062449932,
0.06574448943138123,
-0.1291963756084442,
0.03348769247531891,
-0.04954734072089195,
0.07989040017127991,
-0.001837496878579259,
0.17629887163639069,
-0.09450604021549225,
-0.11565695703029633,
0.218408465385437,
-0.029351111501455307,
-0.0025815695989876986,
0.11086634546518326,
0.02067512646317482,
0.024051345884799957,
0.13095879554748535,
0.11186949163675308,
0.01181656401604414,
-0.07257441431283951,
0.04200121760368347,
0.014221204444766045,
0.0013008842943236232,
-0.09223104268312454,
0.017567986622452736,
-0.05717499926686287,
0.055148009210824966,
0.02266414277255535,
0.031056953594088554,
-0.009069081395864487,
-0.044324345886707306,
0.013909288682043552,
0.061046551913022995,
-0.03422432020306587,
0.0017529152100905776,
-0.050542257726192474,
0.015616587363183498,
-0.07175825536251068,
-0.0021869943011552095,
-0.018394993618130684,
0.027372226119041443,
0.04509615898132324,
-0.028996258974075317,
-0.037878505885601044,
0.13621459901332855,
0.04459303990006447,
0.032051462680101395,
-0.1281396448612213,
0.008049039170145988,
0.029891755431890488,
0.08929923176765442,
-0.010918817482888699,
0.03706960007548332,
0.011363568715751171,
-0.0462515763938427,
-0.015220360830426216,
0.04550400376319885,
0.005223620682954788,
-0.04529004544019699,
-0.030939606949687004,
-0.10633871704339981,
0.03455312177538872,
-0.039578601717948914,
-0.013844023458659649,
-0.18617813289165497,
-0.0037187468260526657,
0.08865012973546982,
0.036701783537864685,
-0.02470407821238041,
-0.013655858114361763,
0.0076142181642353535,
0.10827366262674332,
0.0036560017615556717,
-0.06909529119729996,
0.0912851095199585,
0.006240891292691231,
-0.041620898991823196,
0.13274537026882172,
-0.11604050546884537,
-0.0948544293642044,
0.09528480470180511,
0.0163487046957016,
-0.04146211966872215,
0.03813271224498749,
0.016033003106713295,
0.007020431570708752,
-0.13361774384975433,
-0.05323316156864166,
0.18378029763698578,
0.0031540566124022007,
0.1334574669599533,
-0.12377512454986572,
-0.048506855964660645,
0.04125101864337921,
-0.050044361501932144,
-0.06796523928642273,
0.11974669247865677,
-0.009085423313081264,
-0.06838621199131012,
0.11274238675832748,
-0.011575622484087944,
0.12643302977085114,
0.2195267677307129,
-0.0010322453454136848,
-0.019252708181738853,
0.03204026073217392,
-0.020292555913329124,
0.03879447281360626,
-0.03131890669465065,
-0.08000259846448898,
-0.061414994299411774,
-0.012712772935628891,
0.11019802838563919,
0.025651581585407257,
-0.09910909831523895,
-0.007391813676804304,
-0.03830369561910629,
-0.04207814857363701,
-0.030600138008594513,
0.03900764882564545,
0.03249329701066017,
0.12088571488857269,
0.1085505336523056,
-0.030517075210809708,
-0.016894076019525528,
-0.04534842446446419,
-0.07206390053033829,
0.135943204164505,
-0.042380254715681076,
-0.28465205430984497,
-0.10522810369729996,
-0.08377369493246078,
0.024642804637551308,
0.04090198129415512,
0.0482807382941246,
-0.08802764862775803,
-0.0013414988061413169,
-0.010352380573749542,
0.14085349440574646,
0.015800243243575096,
-0.06497672200202942,
0.0709662213921547,
0.10283178836107254,
-0.0035302205942571163,
-0.11200521886348724,
-0.035375434905290604,
-0.028095951303839684,
-0.13861864805221558,
0.04553148150444031,
-0.14494624733924866,
0.07348379492759705,
0.06937683373689651,
0.03509063646197319,
-0.016449863091111183,
-0.012426717206835747,
0.2084762006998062,
-0.13428160548210144,
0.03172038868069649,
0.1584029495716095,
0.07023163884878159,
-0.0059111956506967545,
0.051677584648132324,
0.03527207300066948,
-0.1207045316696167,
0.0028076819144189358,
0.0625518411397934,
-0.09238950908184052,
-0.23473109304904938,
-0.13441264629364014,
-0.055049341171979904,
0.011254673823714256,
0.13435208797454834,
0.07536958903074265,
-0.09090065211057663,
0.026045817881822586,
-0.012378687970340252,
0.019537344574928284,
0.0291312076151371,
0.004815489519387484,
0.07982718199491501,
0.0126087237149477,
-0.006710143759846687,
-0.020455991849303246,
-0.0297166109085083,
0.11861805617809296,
0.07496200501918793,
0.11852580308914185,
0.0560157336294651,
0.1427943855524063,
0.036405663937330246,
0.007597446441650391,
0.04758009314537048,
0.018714604899287224,
0.020739814266562462,
0.036602430045604706,
-0.07507546991109848,
-0.02492937631905079,
-0.015202596783638,
-0.033753279596567154,
0.0330948531627655,
-0.09170404821634293,
-0.09150933474302292,
-0.0509701669216156,
0.07562509179115295,
0.3154822289943695,
0.0618438720703125,
-0.1358751803636551,
-0.032584577798843384,
0.01373241562396288,
-0.038580115884542465,
-0.04999003931879997,
0.06372249126434326,
0.09485571086406708,
-0.1174577847123146,
0.10017462819814682,
0.08015920221805573,
0.1632431149482727,
-0.11665033549070358,
0.03716808184981346,
-0.05259544774889946,
0.015808144584298134,
-0.003912631887942553,
0.023146649822592735,
-0.20545101165771484,
0.3092474043369293,
0.009137015789747238,
0.052164897322654724,
-0.04546007141470909,
-0.028747277334332466,
-0.032812196761369705,
0.034394629299640656,
0.08976709842681885,
0.052041541785001755,
-0.05579356476664543,
-0.05342454835772514,
-0.14584054052829742,
0.006206751801073551,
-0.05490833893418312,
-0.1024777889251709,
0.03868236020207405,
0.043796606361866,
-0.0157203059643507,
-0.03461074084043503,
-0.11744201183319092,
-0.07585790753364563,
-0.10291554778814316,
-0.01260271668434143,
0.06638915836811066,
0.0084149781614542,
-0.04092301428318024,
-0.06701444834470749,
-0.14618456363677979,
0.03487132117152214,
0.04032284393906593,
-0.00020291056716814637,
-0.10221655666828156,
0.07341515272855759,
0.042426180094480515,
-0.04481625556945801,
0.006898688152432442,
0.04467351734638214,
0.017913684248924255,
-0.055248308926820755,
-0.036394838243722916,
0.12486761808395386,
-0.06761319190263748,
-0.1302689015865326,
-0.053341712802648544,
0.0994459018111229,
0.09728329628705978,
0.0012855773093178868,
0.0315823033452034,
0.03305688500404358,
0.09840019047260284,
-0.02341034822165966,
0.007428914308547974,
0.013174675405025482,
0.1235756129026413,
-0.03444002568721771,
-0.08143752068281174,
-0.08994561433792114,
-0.03821389004588127,
-0.12478380650281906,
0.05451573058962822,
0.17647124826908112,
-0.027021532878279686,
0.09555179625749588,
0.23156552016735077,
-0.10045663267374039,
-0.22094306349754333,
-0.07005707174539566,
0.05103402957320213,
-0.015448836609721184,
-0.0068877581506967545,
-0.17475439608097076,
0.02939850278198719,
0.18900130689144135,
-0.007460785564035177,
0.005481980741024017,
-0.3904845416545868,
-0.0699632465839386,
-0.039361439645290375,
-0.04050491005182266,
0.23023831844329834,
-0.10750597715377808,
-0.07912396639585495,
-0.045366235077381134,
-0.19156010448932648,
-0.0012331046164035797,
-0.04725025221705437,
0.07899762690067291,
0.010645651258528233,
0.06553005427122116,
-0.015091031789779663,
-0.0034225184936076403,
0.10863398015499115,
0.07231104373931885,
-0.010016237385571003,
-0.04815300181508064,
-0.13095945119857788,
0.06391353905200958,
-0.04047355800867081,
-0.026562893763184547,
0.01020439900457859,
-0.039184458553791046,
-0.10138040781021118,
-0.04856547713279724,
-0.05502844229340553,
0.08310846984386444,
-0.06148961931467056,
0.04104263707995415,
0.0704132691025734,
0.08524233847856522,
0.010627254843711853,
-0.029077589511871338,
0.08264776319265366,
-0.10949207097291946,
0.16291239857673645,
0.08633363246917725,
0.11466475576162338,
-0.05422058328986168,
-0.07880426943302155,
-0.0188906267285347,
0.013157797046005726,
0.12540270388126373,
-0.0809701681137085,
0.022488662973046303,
0.05673262104392052,
0.010022279806435108,
0.03393946960568428,
0.004496897105127573,
-0.09643994271755219,
0.046303484588861465,
0.07730027288198471,
-0.008593147620558739,
-0.10360110551118851,
0.04706566408276558,
-0.01018248125910759,
-0.005312568508088589,
-0.005697919055819511,
0.1361766904592514,
0.03283340111374855,
-0.07039794325828552,
-0.02363007329404354,
-0.014949736185371876,
-0.018217114731669426,
0.175882950425148,
-0.056675855070352554,
0.05161847546696663,
-0.09988489747047424,
0.0755954310297966,
0.07381198555231094,
-0.09479926526546478,
-0.004726650193333626,
0.17859293520450592,
-0.08410476893186569,
-0.07581695169210434,
0.07048718631267548,
0.1352929174900055,
0.018475113436579704,
-0.06750452518463135,
-0.049247778952121735,
-0.1261424720287323,
0.1002538800239563,
0.17761269211769104,
-0.005439082160592079,
0.051203858107328415,
0.048793841153383255,
-0.0063039083033800125,
-0.009983203373849392,
-0.04200303554534912,
0.06614697724580765,
-0.019650481641292572,
0.022776305675506592,
0.004659244790673256,
0.03236561268568039,
0.006111683323979378,
-0.048720721155405045,
-0.024170782417058945,
-0.2308676838874817,
0.017064962536096573,
-0.07500465959310532,
0.02620280347764492,
-0.09832901507616043,
0.013004036620259285,
-0.09945004433393478,
0.03519726172089577,
-0.0009607007377780974,
0.0037415684200823307,
-0.03322500362992287,
-0.011547855101525784,
-0.05728943273425102,
0.11476694792509079,
0.0001529980218037963,
0.0058410936035215855,
-0.001555918948724866,
-0.034040480852127075,
0.07072199881076813,
-0.014196200296282768,
-0.05030163377523422,
0.002468440681695938,
-0.1420975625514984,
-0.021817663684487343,
0.029685139656066895,
0.04747273772954941,
0.07759686559438705,
0.0012889369390904903,
0.03995470330119133,
-0.016636135056614876,
0.06967822462320328,
0.029682336375117302,
-0.06341833621263504,
-0.11454782634973526,
0.0399826355278492,
-0.017393724992871284,
-0.1579761952161789,
-0.004907107446342707,
-0.008641592226922512,
0.152485191822052,
0.02509956620633602,
0.04338371753692627,
-0.11359899491071701,
0.0410141795873642,
-0.0883566215634346,
-0.016097187995910645,
0.019732721149921417,
-0.0940583273768425,
-0.07119213789701462,
0.002418807940557599,
0.10453241318464279,
0.04142357409000397,
0.1578555703163147,
0.11292146891355515,
-0.010804702527821064,
0.0322638675570488,
-0.015788676217198372,
-0.11376211792230606,
0.007679708302021027,
0.07932580262422562,
0.07515089958906174,
0.011470576748251915,
-0.0025038793683052063,
0.06573034077882767,
0.042177025228738785,
0.031431201845407486,
0.11498235166072845,
0.1054820567369461,
0.24387453496456146,
0.03816436603665352,
0.08991311490535736,
-0.03122340515255928,
0.01366503071039915,
0.049746353179216385,
-0.04867333546280861,
0.0897512435913086,
-0.07226008921861649,
-0.09715083986520767,
0.07662343978881836,
-0.0971432775259018,
0.12684792280197144,
-0.05383250489830971,
0.021793747320771217,
-0.14510394632816315,
-0.1589089334011078,
-0.0686497837305069,
-0.10029982030391693,
0.020436692982912064,
-0.06900694221258163,
0.010830103419721127,
0.07630325108766556,
0.03622542694211006,
-0.008416670374572277,
-0.07077892124652863,
0.0068438989110291,
-0.11077046394348145,
0.09076648205518723,
-0.005878284107893705,
0.11366482079029083,
-0.11447048932313919,
0.028554080054163933,
-0.02367609553039074,
0.08039309829473495,
-0.004514680709689856,
0.0723271295428276,
0.011223256587982178,
0.03365296497941017,
-0.09032955765724182,
-0.08075332641601562,
0.004959084093570709,
0.002162423450499773,
-0.0003895988338626921,
0.12693679332733154,
0.038835663348436356,
-0.07592640817165375,
0.034893471747636795,
0.16442735493183136,
0.011986213736236095,
-0.09680086374282837,
-0.12634260952472687,
0.10011402517557144,
-0.05245067551732063,
0.02808748371899128,
-0.04333309084177017,
-0.07680799067020416,
-0.02031787298619747,
0.2539747357368469,
0.26162850856781006,
0.05263296887278557,
0.011155478656291962,
-0.0466558001935482,
0.007804470602422953,
0.08109918236732483,
0.14404074847698212,
0.0621907114982605,
0.18783578276634216,
0.017344804480671883,
-0.12032461911439896,
-0.012616196647286415,
0.04106970131397247,
-0.16677913069725037,
0.08295774459838867,
0.00446999492123723,
0.025730162858963013,
-0.015682244673371315,
0.139450341463089,
-0.044406939297914505,
-0.14920060336589813,
0.03033239208161831,
-0.1304500848054886,
-0.10758203268051147,
-0.06470204889774323,
-0.05661998316645622,
0.14219196140766144,
0.15578818321228027,
0.04932147264480591,
-0.04344011843204498,
0.017945140600204468,
-0.004111900459975004,
-0.08075612038373947,
-0.15884003043174744,
0.0745403990149498,
0.025981053709983826,
0.16730348765850067,
-0.04737389460206032,
-0.03036072663962841,
0.06624286621809006,
-0.01712251827120781,
-0.08995595574378967,
0.06794571131467819,
0.0396624356508255,
0.01913641393184662,
0.05383307859301567,
-0.09046609699726105,
-0.06929904967546463,
0.026063701137900352,
0.050328005105257034,
-0.07526461035013199,
0.025700148195028305,
0.08151748776435852,
0.08207274228334427,
-0.13202719390392303,
0.13228543102741241,
-0.07633373141288757,
0.08645544946193695,
0.11507939547300339,
-0.022965705022215843,
-0.02969326637685299,
-0.00872794259339571,
0.025201870128512383,
-0.02580832503736019,
0.019729487597942352,
-0.08521140366792679,
-0.26415368914604187,
-0.06293675303459167,
-0.07715417444705963,
-0.008818329311907291,
-0.17404842376708984,
0.02206694707274437,
-0.06715350598096848,
-0.036889657378196716,
-0.02598942257463932,
0.029915735125541687,
0.03314773365855217,
0.01705026999115944,
-0.018190445378422737,
-0.18659497797489166,
0.03747687116265297,
0.17091424763202667,
-0.10499207675457001,
-0.03309603035449982
] |
null | null |
transformers
|
# BigBird base model
BigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.
It is a pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this [paper](https://arxiv.org/abs/2007.14062) and first released in this [repository](https://github.com/google-research/bigbird).
Disclaimer: The team releasing BigBird did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
BigBird relies on **block sparse attention** instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.
## How to use `TODO: Update`
Here is how to use this model to get the features of a given text in Flax:
```python
from transformers import BigBirdTokenizer, FlaxBigBirdModel
model_id = "flax-community/bigband"
# by default its in `block_sparse` mode with num_random_blocks=3, block_size=64
model = FlaxBigBirdModel.from_pretrained(model_id)
# you can change `attention_type` to full attention like this:
model = FlaxBigBirdModel.from_pretrained(model_id, attention_type="original_full")
# you can change `block_size` & `num_random_blocks` like this:
model = FlaxBigBirdModel.from_pretrained(model_id, block_size=16, num_random_blocks=2)
tokenizer = BigBirdTokenizer.from_pretrained(model_id)
text = "Replace me by any text you'd like."
inputs = tokenizer(text, return_tensors="jax")
output = model(**inputs)
```
## Training Data `TODO: Update`
This model is pre-trained on four publicly available datasets: **Books**, **CC-News**, **Stories** and **Wikipedia**. It used same sentencepiece vocabulary as RoBERTa (which is in turn borrowed from GPT2).
## Training Procedure `TODO: Update`
Document longer than 4096 were split into multiple documents and documents that were much smaller than 4096 were joined. Following the original BERT training, 15% of tokens were masked and model is trained to predict the mask.
Model is warm started from RoBERTa’s checkpoint.
|
{"language": "en", "license": "apache-2.0", "datasets": ["bookcorpus", "wikipedia", "cc_news"]}
| null |
flax-community/bigband
|
[
"transformers",
"big_bird",
"pretraining",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"dataset:cc_news",
"arxiv:2007.14062",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2007.14062"
] |
[
"en"
] |
TAGS
#transformers #big_bird #pretraining #en #dataset-bookcorpus #dataset-wikipedia #dataset-cc_news #arxiv-2007.14062 #license-apache-2.0 #endpoints_compatible #region-us
|
# BigBird base model
BigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.
It is a pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.
Disclaimer: The team releasing BigBird did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
BigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.
## How to use 'TODO: Update'
Here is how to use this model to get the features of a given text in Flax:
## Training Data 'TODO: Update'
This model is pre-trained on four publicly available datasets: Books, CC-News, Stories and Wikipedia. It used same sentencepiece vocabulary as RoBERTa (which is in turn borrowed from GPT2).
## Training Procedure 'TODO: Update'
Document longer than 4096 were split into multiple documents and documents that were much smaller than 4096 were joined. Following the original BERT training, 15% of tokens were masked and model is trained to predict the mask.
Model is warm started from RoBERTa’s checkpoint.
|
[
"# BigBird base model\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.\n\nDisclaimer: The team releasing BigBird did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.",
"## How to use 'TODO: Update'\n\nHere is how to use this model to get the features of a given text in Flax:",
"## Training Data 'TODO: Update'\n\nThis model is pre-trained on four publicly available datasets: Books, CC-News, Stories and Wikipedia. It used same sentencepiece vocabulary as RoBERTa (which is in turn borrowed from GPT2).",
"## Training Procedure 'TODO: Update'\n\nDocument longer than 4096 were split into multiple documents and documents that were much smaller than 4096 were joined. Following the original BERT training, 15% of tokens were masked and model is trained to predict the mask.\n\nModel is warm started from RoBERTa’s checkpoint."
] |
[
"TAGS\n#transformers #big_bird #pretraining #en #dataset-bookcorpus #dataset-wikipedia #dataset-cc_news #arxiv-2007.14062 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# BigBird base model\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.\n\nDisclaimer: The team releasing BigBird did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.",
"## How to use 'TODO: Update'\n\nHere is how to use this model to get the features of a given text in Flax:",
"## Training Data 'TODO: Update'\n\nThis model is pre-trained on four publicly available datasets: Books, CC-News, Stories and Wikipedia. It used same sentencepiece vocabulary as RoBERTa (which is in turn borrowed from GPT2).",
"## Training Procedure 'TODO: Update'\n\nDocument longer than 4096 were split into multiple documents and documents that were much smaller than 4096 were joined. Following the original BERT training, 15% of tokens were masked and model is trained to predict the mask.\n\nModel is warm started from RoBERTa’s checkpoint."
] |
[
61,
142,
88,
29,
59,
71
] |
[
"passage: TAGS\n#transformers #big_bird #pretraining #en #dataset-bookcorpus #dataset-wikipedia #dataset-cc_news #arxiv-2007.14062 #license-apache-2.0 #endpoints_compatible #region-us \n# BigBird base model\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.\n\nDisclaimer: The team releasing BigBird did not write a model card for this model so this model card has been written by the Hugging Face team.## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.## How to use 'TODO: Update'\n\nHere is how to use this model to get the features of a given text in Flax:## Training Data 'TODO: Update'\n\nThis model is pre-trained on four publicly available datasets: Books, CC-News, Stories and Wikipedia. It used same sentencepiece vocabulary as RoBERTa (which is in turn borrowed from GPT2).## Training Procedure 'TODO: Update'\n\nDocument longer than 4096 were split into multiple documents and documents that were much smaller than 4096 were joined. Following the original BERT training, 15% of tokens were masked and model is trained to predict the mask.\n\nModel is warm started from RoBERTa’s checkpoint."
] |
[
-0.03998050093650818,
0.004309358075261116,
-0.004152616020292044,
0.08209539949893951,
0.03840269893407822,
-0.061329521238803864,
0.0004231430066283792,
0.06197258085012436,
-0.18036682903766632,
0.08425701409578323,
-0.024483697488904,
-0.06095736473798752,
0.14178194105625153,
0.06738550961017609,
0.055693913251161575,
-0.3134603798389435,
0.015007334761321545,
-0.06183047220110893,
-0.022123387083411217,
0.03979722410440445,
0.14218191802501678,
-0.09344282746315002,
0.0939038097858429,
0.04782473295927048,
-0.0014047796139493585,
0.03109341487288475,
-0.06673744320869446,
-0.03377002477645874,
0.09393101185560226,
0.039835717529058456,
0.015124603174626827,
0.03585255891084671,
0.03803567215800285,
-0.12030412256717682,
0.028710918501019478,
0.02934769168496132,
0.04135337471961975,
0.024431178346276283,
0.06510227173566818,
0.10231472551822662,
0.0928068682551384,
-0.17274612188339233,
0.060254018753767014,
0.03697061166167259,
-0.03214786574244499,
-0.0596809983253479,
-0.1363469958305359,
-0.005279373377561569,
-0.0063253301195800304,
0.09151125699281693,
0.020632630214095116,
-0.03670259192585945,
-0.03205683082342148,
0.01519684586673975,
0.16337841749191284,
-0.23822030425071716,
-0.036868866533041,
0.000818561005871743,
0.057732660323381424,
0.06261008977890015,
-0.1203400045633316,
-0.012624572031199932,
-0.018831118941307068,
0.05144289880990982,
0.13519199192523956,
-0.008649374358355999,
0.0553058385848999,
0.0014687767252326012,
-0.10732389241456985,
0.01030809711664915,
0.07195503264665604,
0.023058054968714714,
-0.09588956832885742,
-0.14550088346004486,
-0.024825630709528923,
0.02018820494413376,
-0.021867334842681885,
-0.0720643699169159,
0.12859879434108734,
0.026183752343058586,
0.11782677471637726,
-0.15299293398857117,
-0.13973291218280792,
0.02235819399356842,
-0.17079439759254456,
-0.010612362995743752,
0.011070676147937775,
0.0575748048722744,
-0.06923019140958786,
-0.01975904405117035,
-0.04437530040740967,
-0.08316874504089355,
-0.09065201878547668,
0.001387067255564034,
-0.004496142268180847,
-0.0011485874420031905,
-0.01835048943758011,
-0.17089320719242096,
-0.030604712665081024,
0.0872897207736969,
-0.0543009415268898,
0.019398458302021027,
-0.03936383128166199,
-0.00716107152402401,
0.017800897359848022,
0.13267318904399872,
-0.08171172440052032,
-0.1655731201171875,
0.11800198256969452,
-0.012497076764702797,
0.08482015877962112,
0.0019334221724420786,
-0.035575930029153824,
-0.015650825574994087,
-0.028281357139348984,
-0.006434958893805742,
0.024143123999238014,
0.013407825492322445,
-0.038817405700683594,
0.009674442000687122,
0.09541655331850052,
-0.12275507301092148,
0.01067365426570177,
0.0157249066978693,
-0.055377185344696045,
0.07275106012821198,
-0.008171689696609974,
-0.03938724845647812,
-0.03763354942202568,
0.029765594750642776,
-0.04247693717479706,
-0.08305535465478897,
-0.026336515322327614,
-0.09432656317949295,
0.025779323652386665,
-0.07387274503707886,
-0.03167375549674034,
-0.0883251205086708,
-0.21614648401737213,
-0.050860557705163956,
-0.05545646697282791,
-0.038237858563661575,
-0.04777224361896515,
-0.04377628490328789,
-0.07961572706699371,
0.0009872257942333817,
0.01498466357588768,
0.12458810955286026,
-0.050522331148386,
-0.022797059267759323,
-0.07215850055217743,
0.11119257658720016,
-0.046077240258455276,
0.000495188229251653,
-0.06332723051309586,
-0.023496439680457115,
-0.17422302067279816,
0.08142583072185516,
-0.023899991065263748,
-0.03370478376746178,
-0.07586953788995743,
-0.054968658834695816,
-0.05558178946375847,
0.037031497806310654,
0.02139057219028473,
0.1152440533041954,
-0.17402315139770508,
-0.004027487710118294,
0.04204443469643593,
-0.0862603634595871,
0.0469912551343441,
0.1283569037914276,
-0.00028307040338404477,
0.08722307533025742,
0.12733513116836548,
0.0626164972782135,
-0.009267560206353664,
-0.03764750808477402,
-0.08870899677276611,
0.03980511799454689,
-0.12674665451049805,
0.09364187717437744,
0.0704013854265213,
-0.019861888140439987,
0.023171816021203995,
0.020003747195005417,
0.030982395634055138,
-0.05511467158794403,
0.04764298349618912,
0.0064950548112392426,
0.015606473200023174,
-0.03297755867242813,
-0.024944264441728592,
-0.0917520821094513,
0.018838489428162575,
-0.07695036381483078,
-0.07375466823577881,
0.004102648235857487,
0.05790087953209877,
0.02862388826906681,
0.04780842736363411,
-0.05472353845834732,
0.040936633944511414,
-0.01179142203181982,
-0.026485500857234,
-0.11785805225372314,
-0.012964693829417229,
0.014818841591477394,
-0.0354146845638752,
0.08812239021062851,
0.0624270886182785,
0.055283840745687485,
0.0032510231249034405,
-0.026362944394350052,
0.022906353697180748,
-0.04334798827767372,
-0.02955683320760727,
-0.08247989416122437,
-0.05539974942803383,
0.002931553404778242,
-0.006550414487719536,
0.0855507031083107,
-0.04392480105161667,
0.0422666110098362,
0.14404866099357605,
0.01921566016972065,
0.03302609175443649,
-0.023660218343138695,
0.0597962960600853,
-0.055094148963689804,
0.021266764029860497,
-0.05017988383769989,
-0.025852341204881668,
0.09551296383142471,
0.000823424372356385,
0.06637609750032425,
-0.1481800377368927,
-0.17408481240272522,
0.05613584443926811,
0.09508980810642242,
-0.028234006837010384,
0.09036748111248016,
0.00876366812735796,
-0.04120999574661255,
-0.1308836191892624,
-0.09051880240440369,
0.24989068508148193,
0.030091866850852966,
0.10444051772356033,
-0.12082992494106293,
-0.006280655972659588,
-0.02879510261118412,
0.043193232268095016,
0.0001190877374028787,
0.04568370804190636,
-0.033338118344545364,
-0.11297444254159927,
0.046891532838344574,
-0.019832180812954903,
0.06580348312854767,
0.1479790210723877,
0.06451492756605148,
-0.05571793392300606,
-0.007930049672722816,
-0.03913649916648865,
-0.0208139531314373,
0.0644034892320633,
-0.04218051955103874,
0.05676598101854324,
0.04216240346431732,
0.04086267948150635,
0.047551002353429794,
-0.06974669545888901,
0.059469740837812424,
0.10315036028623581,
-0.04712929204106331,
-0.01116141676902771,
-0.07004962116479874,
0.032181840389966965,
0.08185926079750061,
0.09636719524860382,
0.008985067717730999,
-0.020214872434735298,
-0.04043368250131607,
-0.06796622276306152,
0.1312437206506729,
-0.09658816456794739,
-0.2520214021205902,
-0.15668964385986328,
0.05984259024262428,
-0.024868907406926155,
0.010960149578750134,
0.03759737312793732,
-0.013137544505298138,
-0.06716768443584442,
-0.14995895326137543,
0.13430297374725342,
0.0006996382726356387,
0.006679363548755646,
0.0027695600874722004,
-0.030953794717788696,
-0.040917154401540756,
-0.13118833303451538,
-0.011438123881816864,
-0.014418810606002808,
-0.08837822079658508,
-0.031967032700777054,
0.013730363920331001,
0.048640407621860504,
0.020348556339740753,
-0.01925012283027172,
-0.08820464462041855,
-0.055752601474523544,
0.1337357461452484,
-0.03400738164782524,
0.15520773828029633,
0.21288877725601196,
-0.005231432616710663,
0.07257477939128876,
0.0750589519739151,
-0.00966101884841919,
-0.0367821604013443,
0.014561963267624378,
0.04252813756465912,
-0.036181870847940445,
-0.19210946559906006,
0.008766775950789452,
-0.05765901505947113,
0.059678107500076294,
0.02407030574977398,
0.03743009269237518,
-0.12951849400997162,
0.024942414835095406,
-0.010708251036703587,
-0.00494421785697341,
0.0580461323261261,
0.0686730295419693,
0.1349480003118515,
-0.008695180527865887,
0.05761299654841423,
-0.062325846403837204,
0.007319980766624212,
0.17008638381958008,
-0.022988388314843178,
0.04439782723784447,
-0.050405103713274,
0.24860794842243195,
0.00048091987264342606,
-0.031050432473421097,
0.04578321427106857,
0.12924903631210327,
-0.07095488160848618,
0.01833021081984043,
-0.08429638296365738,
-0.03601619228720665,
0.005346170160919428,
0.02908572554588318,
-0.013742832466959953,
0.10968653112649918,
-0.06857980042695999,
0.06637535989284515,
0.010597867891192436,
0.24129149317741394,
0.0025299459230154753,
-0.08362983912229538,
-0.11687552183866501,
0.02440774254500866,
-0.10661953687667847,
-0.10076972842216492,
0.013023415580391884,
0.18858645856380463,
-0.13716059923171997,
0.062242764979600906,
0.05261685326695442,
0.06560017168521881,
-0.07156610488891602,
0.005581930745393038,
-0.10778375715017319,
0.160127192735672,
-0.03825533390045166,
0.04894736036658287,
-0.0592227503657341,
-0.020910851657390594,
0.016368430107831955,
0.09667401015758514,
-0.09548772126436234,
0.03192082792520523,
0.0474904328584671,
-0.04496224597096443,
0.08707734197378159,
-0.018148522824048996,
-0.12078734487295151,
0.06497698277235031,
-0.15265671908855438,
0.030638214200735092,
0.05688066780567169,
-0.05703544616699219,
0.0804499015212059,
-0.02442343533039093,
-0.004927807953208685,
-0.02799830585718155,
-0.07649415731430054,
0.02942764200270176,
-0.1790025234222412,
0.011377112939953804,
-0.033685874193906784,
0.005240584257990122,
-0.08129461109638214,
-0.03591761365532875,
-0.001459737541154027,
0.19355426728725433,
-0.0952000841498375,
-0.09005452692508698,
-0.11083915829658508,
0.01677919551730156,
0.03780081123113632,
-0.04422364383935928,
0.059320755302906036,
0.0261228047311306,
0.14695903658866882,
-0.026272663846611977,
-0.041723430156707764,
0.046172067523002625,
-0.02822634018957615,
-0.13421964645385742,
0.003955479711294174,
0.08802076429128647,
0.020455345511436462,
0.090706467628479,
0.014332220889627934,
0.048435278236866,
-0.04748617485165596,
-0.0774024948477745,
-0.024254852905869484,
0.058631882071495056,
0.08937188237905502,
0.08217842131853104,
-0.053021226078271866,
-0.08093149960041046,
-0.013010191731154919,
-0.05654202774167061,
0.19629469513893127,
0.18652603030204773,
-0.06176631152629852,
0.14229953289031982,
0.08207279443740845,
-0.06461956351995468,
-0.2437678873538971,
0.06209404021501541,
0.007636088877916336,
0.0701700896024704,
-0.0015511105302721262,
-0.16201910376548767,
0.11609043926000595,
0.123146653175354,
0.011066627688705921,
0.006654864642769098,
-0.215837299823761,
-0.08613494038581848,
-0.04335499927401543,
-0.020433560013771057,
0.08538389205932617,
-0.05818135291337967,
0.005063274875283241,
-0.0218810997903347,
-0.013238110579550266,
0.05136655271053314,
-0.05903313308954239,
0.09033146500587463,
-0.003760459367185831,
0.010972337797284126,
0.03614818677306175,
-0.013213957659900188,
0.11413780599832535,
-0.07662618905305862,
0.04898620769381523,
-0.061225149780511856,
0.014499744400382042,
0.08532530069351196,
-0.04648590832948685,
0.15525901317596436,
-0.04034143313765526,
0.008853958919644356,
-0.1300908327102661,
-0.0531095527112484,
-0.08022324740886688,
0.01154431514441967,
-0.03074934519827366,
-0.003576673334464431,
-0.12692484259605408,
0.03563763201236725,
0.11818836629390717,
-0.003157990286126733,
0.005425135139375925,
-0.07411082088947296,
-0.14422890543937683,
0.07608048617839813,
0.20943577587604523,
-0.09473633021116257,
-0.12926670908927917,
0.013432517647743225,
0.0029591370839625597,
0.009423709474503994,
-0.06303488463163376,
0.027379872277379036,
0.0819152444601059,
0.018017873167991638,
0.04872222617268562,
0.06427934020757675,
-0.09361246228218079,
0.02458195947110653,
0.05383015796542168,
-0.04532542824745178,
-0.211030513048172,
-0.010765200480818748,
0.035097863525152206,
-0.14417764544487,
-0.11629872769117355,
0.10924722999334335,
-0.01242755725979805,
-0.04095037654042244,
-0.0037596425972878933,
0.10222430527210236,
0.017167681828141212,
0.0825219452381134,
0.03624356538057327,
-0.013114295899868011,
-0.04748258367180824,
0.14803868532180786,
0.08240479230880737,
-0.0973753035068512,
0.029909875243902206,
0.16740764677524567,
-0.06751105189323425,
-0.025752505287528038,
0.09277813136577606,
0.05213272199034691,
0.02364491857588291,
0.00793228205293417,
-0.031188536435365677,
-0.14000201225280762,
0.05126103013753891,
0.14554287493228912,
-0.021348407492041588,
0.010822772979736328,
-0.053697459399700165,
0.05878352001309395,
-0.06650228053331375,
0.06382659822702408,
-0.019199779257178307,
0.03137870132923126,
0.028228944167494774,
0.12911279499530792,
-0.0511830635368824,
0.024669164791703224,
-0.011545880697667599,
0.012300629168748856,
-0.04444542154669762,
-0.06448433548212051,
-0.11029873043298721,
0.06453986465930939,
-0.0355340801179409,
-0.006856814958155155,
-0.019405104219913483,
0.03891550749540329,
-0.03345250338315964,
0.02664458565413952,
-0.017501480877399445,
-0.05269012972712517,
-0.05051754415035248,
0.017381200566887856,
-0.07359535992145538,
-0.012642919085919857,
0.04733378812670708,
-0.07943310588598251,
0.11082431674003601,
-0.014425125904381275,
0.011252391152083874,
-0.0409003421664238,
0.013211196288466454,
-0.07902147620916367,
-0.02217124029994011,
0.04554560035467148,
0.0026221724692732096,
-0.0737127959728241,
0.008031947538256645,
-0.021363189443945885,
-0.09072595834732056,
-0.03379838913679123,
0.05710197985172272,
-0.054643332958221436,
0.10680204629898071,
-0.007789496332406998,
0.0483306385576725,
-0.062381159514188766,
-0.030795587226748466,
0.038334961980581284,
-0.002086246619001031,
0.15539073944091797,
0.0007960943621583283,
0.015147020108997822,
-0.11079146713018417,
0.01789902150630951,
-0.04391756281256676,
-0.06434228271245956,
0.03232359513640404,
-0.043757013976573944,
0.08418458700180054,
-0.0020243129692971706,
0.05776338651776314,
-0.008638078346848488,
-0.08892956376075745,
0.059724148362874985,
0.07950317859649658,
0.012620477005839348,
-0.0035739101003855467,
0.08051400631666183,
-0.0028527581598609686,
-0.04847431555390358,
0.1278301328420639,
0.00986503716558218,
0.028196021914482117,
0.08995592594146729,
0.28314387798309326,
0.10287439823150635,
0.0933491587638855,
0.12179012596607208,
0.033070359379053116,
0.026755953207612038,
-0.1411571204662323,
0.04476655647158623,
0.009607748128473759,
0.005939675960689783,
-0.03337613120675087,
0.021302757784724236,
0.12762342393398285,
-0.16947656869888306,
0.13923196494579315,
-0.016157284379005432,
-0.016075730323791504,
-0.10716930776834488,
-0.08861727267503738,
-0.06847390532493591,
0.07250403612852097,
-0.05673306807875633,
-0.13244757056236267,
0.04196561872959137,
0.1254594922065735,
0.03947370499372482,
0.017685770988464355,
0.08961198478937149,
-0.2118363231420517,
-0.11609210073947906,
0.10762243717908859,
0.004705621860921383,
0.03923267498612404,
0.05215059965848923,
0.01813436485826969,
0.030508719384670258,
0.06523612886667252,
0.06779667735099792,
0.06941429525613785,
0.12531748414039612,
-0.04288644343614578,
-0.0586625300347805,
-0.04746316745877266,
-0.020164307206869125,
-0.016043515875935555,
0.025861706584692,
0.09762218594551086,
0.06512729078531265,
-0.059882402420043945,
0.02736531011760235,
0.21078933775424957,
-0.01797734759747982,
-0.1206410825252533,
-0.10054837167263031,
0.03374458849430084,
0.0017234269762411714,
0.018091073259711266,
0.004876758437603712,
-0.1222556009888649,
-0.0034722506534308195,
0.22182564437389374,
0.11689018458127975,
-0.09661375731229782,
0.00740685872733593,
-0.015324478037655354,
-0.004569467157125473,
0.02334650605916977,
0.030142204836010933,
0.07486093789339066,
0.2645607590675354,
-0.05704797804355621,
0.07228361815214157,
-0.010910981334745884,
0.012360013090074062,
-0.1731189638376236,
0.15266035497188568,
-0.07375927269458771,
0.06260450929403305,
-0.00929738488048315,
-0.0227797944098711,
0.045081183314323425,
-0.32871195673942566,
-0.0523805171251297,
-0.053583137691020966,
-0.11396634578704834,
0.02345357835292816,
-0.047771621495485306,
0.02969059720635414,
0.05879031494259834,
0.07832647114992142,
0.009538033045828342,
0.08694904297590256,
0.02277597039937973,
-0.10205870866775513,
-0.04144768416881561,
0.10572593659162521,
-0.04041872173547745,
0.17830407619476318,
0.019504209980368614,
-0.02148868516087532,
0.09323291480541229,
0.0021888900082558393,
-0.15664759278297424,
0.02434382773935795,
-0.01608542911708355,
-0.02831764705479145,
0.04716692119836807,
0.12277580797672272,
-0.02454213984310627,
0.0457940399646759,
0.07602827996015549,
-0.000871377473231405,
0.03677304461598396,
0.09806370735168457,
-0.06598702073097229,
-0.028112903237342834,
0.04004707559943199,
-0.11379840224981308,
0.12468834966421127,
0.1379505693912506,
-0.029803792014718056,
0.015103502199053764,
-0.016059929504990578,
-0.03205084055662155,
0.010409689508378506,
0.14435812830924988,
0.003034347901120782,
-0.09794607013463974,
-0.01998366229236126,
-0.09238483756780624,
0.07257638871669769,
-0.18607421219348907,
-0.08475860953330994,
0.0055253650061786175,
0.008919643238186836,
-0.0384846031665802,
0.027959944680333138,
0.0781250149011612,
-0.019367264583706856,
-0.05029425397515297,
0.01812361367046833,
-0.02973201684653759,
0.11308099329471588,
-0.10850893706083298,
-0.04642351716756821
] |
null | null |
transformers
|
# T5 model for sentence splitting in English
Sentence Split is the task of dividing a long sentence into multiple sentences.
E.g.:
```
Mary likes to play football in her freetime whenever she meets with her friends that are very nice people.
```
could be split into
```
Mary likes to play football in her freetime whenever she meets with her friends.
```
```
Her friends are very nice people.
```
## How to use it in your code:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("flax-community/byt5-base-wikisplit")
model = AutoModelForSeq2SeqLM.from_pretrained("flax-community/byt5-base-wikisplit")
complex_sentence = "This comedy drama is produced by Tidy , the company she co-founded in 2008 with her husband David Peet , who is managing director ."
sample_tokenized = tokenizer(complex_sentence, return_tensors="pt")
answer = model.generate(sample_tokenized['input_ids'], attention_mask = sample_tokenized['attention_mask'], max_length=256, num_beams=5)
gene_sentence = tokenizer.decode(answer[0], skip_special_tokens=True)
gene_sentence
"""
Output:
This comedy drama is produced by Tidy. She co-founded Tidy in 2008 with her husband David Peet, who is managing director.
"""
```
## Datasets:
[Wiki_Split](https://research.google/tools/datasets/wiki-split/)
## Current Basline from [paper](https://arxiv.org/abs/1907.12461)

## Our Results:
| Model | Exact | SARI | BLEU |
| --- | --- | --- | --- |
| [t5-base-wikisplit](https://huggingface.co/flax-community/t5-base-wikisplit) | 17.93 | 67.5438 | 76.9 |
| [t5-v1_1-base-wikisplit](https://huggingface.co/flax-community/t5-v1_1-base-wikisplit) | 18.1207 | 67.4873 | 76.9478 |
| [byt5-base-wikisplit](https://huggingface.co/flax-community/byt5-base-wikisplit) | 11.3582 | 67.2685 | 73.1682 |
| [t5-large-wikisplit](https://huggingface.co/flax-community/t5-large-wikisplit) | 18.6632 | 68.0501 | 77.1881 |
|
{"datasets": ["wiki_split"], "widget": [{"text": "Mary likes to play football in her freetime whenever she meets with her friends that are very nice people."}]}
|
text2text-generation
|
flax-community/byt5-base-wikisplit
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"dataset:wiki_split",
"arxiv:1907.12461",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.12461"
] |
[] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
T5 model for sentence splitting in English
==========================================
Sentence Split is the task of dividing a long sentence into multiple sentences.
E.g.:
could be split into
How to use it in your code:
---------------------------
Datasets:
---------
Wiki\_Split
Current Basline from paper
--------------------------
!baseline
Our Results:
------------
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.07338094711303711,
0.05236980319023132,
-0.005175611469894648,
0.061645518988370895,
0.11803211271762848,
0.0010036667808890343,
0.14466595649719238,
0.13494300842285156,
-0.014728834852576256,
-0.00039379746885970235,
0.1478293389081955,
0.22735001146793365,
0.01969420723617077,
0.10862782597541809,
-0.12394186854362488,
-0.21091672778129578,
0.027336005121469498,
0.06010023504495621,
-0.02085421420633793,
0.11969713121652603,
0.08562296628952026,
-0.08677488565444946,
0.08265271037817001,
-0.04077007994055748,
-0.19138042628765106,
0.035352710634469986,
0.08314473927021027,
-0.13246387243270874,
0.1249883696436882,
0.07944901287555695,
0.13529321551322937,
0.0702531561255455,
-0.03211889788508415,
-0.09502177685499191,
0.0498277023434639,
0.06919447332620621,
-0.06919827312231064,
0.09713546186685562,
0.11586253345012665,
-0.07274200022220612,
0.028588442131876945,
0.013642747886478901,
-0.007472454570233822,
0.04824034124612808,
-0.12488812953233719,
-0.03066110610961914,
-0.05273165926337242,
0.05260734632611275,
0.05290108919143677,
0.0855073556303978,
0.0018154196441173553,
0.17952771484851837,
-0.023473775014281273,
0.1551177054643631,
0.13132891058921814,
-0.32272928953170776,
-0.024491146206855774,
-0.003694209735840559,
0.035026900470256805,
0.06410683691501617,
-0.04377242177724838,
0.049198880791664124,
0.05745662376284599,
0.025242920964956284,
0.06704900413751602,
-0.059979576617479324,
-0.22280187904834747,
0.019698096439242363,
-0.0979870930314064,
-0.005887251812964678,
0.21699436008930206,
-0.03743954747915268,
0.05656393989920616,
-0.038152456283569336,
-0.13367706537246704,
-0.06207386404275894,
0.011280378326773643,
-0.03207117319107056,
-0.05520296096801758,
0.01362606231123209,
-0.0072768740355968475,
-0.05934611335396767,
-0.1445494443178177,
0.0008293665014207363,
-0.19558300077915192,
0.13219156861305237,
0.008869849145412445,
0.05364776402711868,
-0.2030775249004364,
0.0796913281083107,
0.06808062642812729,
-0.12932996451854706,
0.08119905740022659,
-0.06944086402654648,
0.0007992846658453345,
-0.02430855482816696,
-0.01138046570122242,
-0.22808612883090973,
0.07521477341651917,
0.04648159444332123,
-0.046756185591220856,
0.01691397652029991,
-0.10254329442977905,
0.07758224755525589,
0.012161178514361382,
0.02930506505072117,
-0.06814257055521011,
0.004056310281157494,
0.08120247721672058,
-0.042766232043504715,
0.02530044876039028,
-0.06055561453104019,
-0.11505191028118134,
-0.03841305151581764,
0.10057072341442108,
0.10439515113830566,
0.04927099868655205,
0.100662000477314,
-0.03627978265285492,
-0.024155516177415848,
0.048705972731113434,
-0.09515255689620972,
-0.007232715375721455,
-0.005700218956917524,
0.017725026234984398,
0.07869744300842285,
0.029112080112099648,
0.0085785873234272,
-0.12368462234735489,
0.05533348396420479,
-0.08892074972391129,
-0.007119117770344019,
-0.014179508201777935,
-0.11277101933956146,
0.05516720935702324,
-0.10022448003292084,
0.003747796406969428,
-0.17174750566482544,
-0.14907126128673553,
0.014853403903543949,
-0.006822616793215275,
-0.016083071008324623,
-0.023295698687434196,
-0.012550415471196175,
-0.04867753013968468,
0.04986819252371788,
-0.046065736562013626,
0.014855194836854935,
-0.04274686798453331,
0.09479490667581558,
-0.06348644196987152,
0.06320349872112274,
-0.1157352477312088,
0.03426981717348099,
-0.09535468369722366,
-0.012971480377018452,
-0.08641048520803452,
0.03501192480325699,
-0.012800343334674835,
0.11410906165838242,
-0.0546882189810276,
0.01033430639654398,
-0.08609599620103836,
0.02308066189289093,
0.022368106991052628,
0.1857706606388092,
-0.18233196437358856,
-0.06212804839015007,
0.22879788279533386,
-0.11661332100629807,
-0.18440662324428558,
0.12300577759742737,
-0.004945422988384962,
0.021788299083709717,
0.08567887544631958,
0.20415861904621124,
0.0406879298388958,
-0.037098322063684464,
0.024412313476204872,
0.08823653310537338,
-0.09990306943655014,
-0.08241552859544754,
-0.007248586975038052,
0.005754207260906696,
-0.08257292956113815,
0.023229708895087242,
0.1286030113697052,
0.07517500966787338,
-0.056915804743766785,
-0.034504860639572144,
-0.05817713215947151,
-0.04456400126218796,
0.08241263031959534,
0.017521284520626068,
0.1011369526386261,
-0.07564268261194229,
-0.023342382162809372,
-0.008083132095634937,
0.004866853356361389,
-0.036047179251909256,
0.023545870557427406,
-0.05396796017885208,
0.11295168101787567,
-0.08729083836078644,
0.03186345845460892,
-0.18296995759010315,
-0.1287127286195755,
-0.008331984281539917,
0.14957498013973236,
-0.02128562703728676,
0.050561681389808655,
0.07632532715797424,
0.0004522579547483474,
-0.02195757068693638,
-0.02735603041946888,
0.1790250986814499,
0.013333974406123161,
-0.107710100710392,
-0.13087595999240875,
0.07950708270072937,
-0.08264517784118652,
-0.027372440323233604,
-0.13616007566452026,
0.027752602472901344,
0.04606589674949646,
0.13955268263816833,
0.058554768562316895,
0.04738957807421684,
0.018366221338510513,
0.010039297863841057,
-0.0937027782201767,
-0.015217214822769165,
0.07326135039329529,
0.0020394320599734783,
-0.054991912096738815,
0.20630531013011932,
-0.16716495156288147,
0.28073689341545105,
0.18137657642364502,
-0.18724587559700012,
-0.02556230127811432,
-0.03455307334661484,
-0.02355809323489666,
-0.006965194828808308,
0.018724262714385986,
-0.027344223111867905,
0.01475986186414957,
-0.005464691668748856,
0.17642900347709656,
-0.07748309522867203,
-0.06028398871421814,
0.03139367327094078,
-0.04138759896159172,
-0.03696925938129425,
0.09169646352529526,
0.026378966867923737,
-0.23792244493961334,
0.16654108464717865,
0.18612296879291534,
0.015527904964983463,
0.18936285376548767,
-0.0227508544921875,
-0.03314882889389992,
0.04529380798339844,
0.025531603023409843,
0.0006247143028303981,
-0.010891961865127087,
-0.10976946353912354,
0.0038155654910951853,
0.06395594775676727,
0.003746860660612583,
0.05428347364068031,
-0.1340436041355133,
-0.04031582176685333,
-0.011365944519639015,
0.002202568342909217,
0.024568775668740273,
0.08743710070848465,
0.04098992422223091,
0.16444575786590576,
-0.05635596066713333,
-0.02550804801285267,
0.09139774739742279,
0.015463294461369514,
-0.11901985108852386,
0.2022557109594345,
-0.12500494718551636,
-0.3084281384944916,
-0.11232460290193558,
-0.10777301341295242,
-0.04777711629867554,
0.00957652647048235,
0.07737249881029129,
-0.09352370351552963,
-0.023762289434671402,
-0.05438217893242836,
0.020080503076314926,
-0.04356355965137482,
0.04445093497633934,
-0.05288013070821762,
0.031235884875059128,
-0.030847733840346336,
-0.09284704178571701,
-0.0181504525244236,
-0.016358232125639915,
0.0004006609960924834,
0.13696005940437317,
-0.05614817887544632,
0.08548697829246521,
0.16269785165786743,
-0.02332182042300701,
0.028047727420926094,
-0.05157468840479851,
0.1222914606332779,
-0.053942132741212845,
0.048012666404247284,
0.18206240236759186,
-0.09147791564464569,
0.06940625607967377,
0.11438941955566406,
0.0011529962066560984,
-0.05474792793393135,
0.022760022431612015,
-0.012095877900719643,
-0.0512198768556118,
-0.28689995408058167,
-0.07264361530542374,
-0.08793862909078598,
0.10003940016031265,
0.05317392572760582,
0.06388060748577118,
0.10668512433767319,
0.09085222333669662,
-0.008675566874444485,
0.023609139025211334,
-0.00025697052478790283,
0.04358412325382233,
0.11977392435073853,
-0.012200837954878807,
0.13162562251091003,
-0.07405488938093185,
-0.09832840412855148,
0.08952929079532623,
0.04351469874382019,
0.09759733825922012,
-0.00300502497702837,
0.06721054762601852,
0.013234642334282398,
0.13242566585540771,
0.11347690224647522,
0.1520964503288269,
0.009054568596184254,
-0.044978950172662735,
0.012890874408185482,
-0.03734273836016655,
0.00898769125342369,
-0.005036704242229462,
-0.03898394852876663,
-0.0649506077170372,
-0.06017064303159714,
-0.02058192901313305,
0.0860196053981781,
0.12279891967773438,
0.07173880934715271,
-0.27021241188049316,
-0.005500519648194313,
0.04464300721883774,
-0.027508936822414398,
-0.10704434663057327,
0.05649204179644585,
0.08693427592515945,
-0.061556797474622726,
0.060041237622499466,
-0.08712957799434662,
0.08457209169864655,
-0.03654101863503456,
0.04116462543606758,
-0.05682293325662613,
-0.010444516316056252,
-0.03165474161505699,
0.08030931651592255,
-0.2921738624572754,
0.20146377384662628,
0.027306577190756798,
-0.01798509806394577,
-0.09779224544763565,
-0.006698530167341232,
0.01340553816407919,
0.1275637149810791,
0.13927820324897766,
-0.0133895855396986,
-0.02238101325929165,
-0.029780272394418716,
-0.061492763459682465,
0.020492499694228172,
0.09360520541667938,
0.01670021377503872,
0.00672928337007761,
-0.02125849574804306,
-0.012689127586781979,
0.0205532219260931,
-0.0344410315155983,
-0.06164565682411194,
-0.15749318897724152,
0.055718716233968735,
0.047543421387672424,
0.0024468519259244204,
0.0020888452418148518,
-0.06662765145301819,
-0.11162037402391434,
0.21224962174892426,
-0.03680497035384178,
-0.0753173679113388,
-0.1353660672903061,
-0.0008355205063708127,
0.050424810498952866,
-0.07936268299818039,
0.03032037429511547,
-0.053634826093912125,
0.04757052659988403,
-0.06444023549556732,
-0.20583733916282654,
0.1315518319606781,
-0.10545729100704193,
-0.062348734587430954,
-0.07044865190982819,
0.13247324526309967,
-0.073125459253788,
-0.0066912295296788216,
0.010711180046200752,
-0.008428264409303665,
-0.056391745805740356,
-0.05111871287226677,
0.02988036908209324,
-0.015103430487215519,
0.06639178842306137,
0.014349693432450294,
-0.0819404348731041,
-0.13313286006450653,
-0.03246822953224182,
-0.013543092645704746,
0.2866136133670807,
0.15441663563251495,
-0.06156026944518089,
0.10973826050758362,
0.14139756560325623,
-0.05303136259317398,
-0.3197764456272125,
-0.032558053731918335,
-0.08678944408893585,
-0.02069464884698391,
-0.010134224779903889,
-0.09506291896104813,
0.07348696887493134,
0.01616155169904232,
-0.014265801757574081,
0.15962643921375275,
-0.23590165376663208,
-0.10536365211009979,
0.1434834748506546,
0.08675567060709,
0.2819352447986603,
-0.1513368934392929,
-0.0807584673166275,
-0.029798949137330055,
-0.060604389756917953,
0.18979039788246155,
-0.1669827699661255,
0.058036647737026215,
-0.012623646296560764,
0.03484512120485306,
0.0427178256213665,
-0.06368544697761536,
0.048793308436870575,
-0.06225757673382759,
0.044412802904844284,
-0.10677288472652435,
-0.03900426626205444,
0.08896504342556,
-0.024167977273464203,
0.04803543910384178,
-0.08413220196962357,
0.052813570946455,
-0.05302368476986885,
-0.026152292266488075,
-0.07266772538423538,
0.07183578610420227,
0.018651431426405907,
-0.06673292070627213,
0.007864897139370441,
-0.05193748697638512,
0.0060057127848267555,
-0.031623415648937225,
0.20847187936306,
-0.0027447545435279608,
0.17485590279102325,
0.17429852485656738,
0.14687882363796234,
-0.0721031129360199,
0.0471854992210865,
-0.04692225903272629,
-0.08132309466600418,
0.07041744142770767,
-0.14622721076011658,
0.05383867397904396,
0.09908200800418854,
-0.013255021534860134,
0.06402820348739624,
0.0880952924489975,
-0.02430414780974388,
-0.01674014702439308,
0.12154082953929901,
-0.2555880844593048,
-0.05532047525048256,
-0.05960423871874809,
-0.026522375643253326,
-0.0028198640793561935,
0.10372081398963928,
0.17931169271469116,
0.0019083989318460226,
-0.021008331328630447,
-0.0026551270857453346,
0.013659710064530373,
-0.012175462208688259,
0.11964628100395203,
0.0662432461977005,
0.021442679688334465,
-0.11497677117586136,
0.10449577122926712,
0.02479441836476326,
-0.16482695937156677,
0.03376784548163414,
0.16567827761173248,
-0.12903136014938354,
-0.13723048567771912,
0.022461630403995514,
0.14458757638931274,
-0.07465346157550812,
-0.047268327325582504,
-0.07569791376590729,
-0.10424289852380753,
0.05799311399459839,
0.2651767134666443,
0.014273155480623245,
0.06612372398376465,
-0.021680708974599838,
-0.07847931236028671,
-0.063763327896595,
0.07295253127813339,
0.02640877291560173,
0.04445111006498337,
-0.13670234382152557,
0.08770639449357986,
-0.04165012389421463,
0.140151247382164,
-0.08417423814535141,
0.012751626782119274,
-0.13740195333957672,
-0.0005291210254654288,
-0.127420574426651,
-0.009937186725437641,
-0.05506028234958649,
-0.046357303857803345,
-0.026412483304739,
-0.03102402202785015,
-0.034935109317302704,
-0.02987593039870262,
-0.08648815006017685,
0.03136339783668518,
-0.0014670882374048233,
0.013410832732915878,
-0.08660081773996353,
-0.030234888195991516,
0.01717529073357582,
-0.023060662671923637,
0.13938193023204803,
0.08977209776639938,
-0.09600434452295303,
0.10493538528680801,
-0.17707476019859314,
-0.0552794449031353,
0.09809296578168869,
0.003999986685812473,
0.05297010391950607,
0.07602854073047638,
0.014769944362342358,
0.06487877666950226,
0.00944036990404129,
0.04301472008228302,
0.057172201573848724,
-0.0873994454741478,
0.03727894648909569,
-0.02013500966131687,
-0.10111531615257263,
-0.06786772608757019,
-0.0015403148718178272,
0.036986369639635086,
-0.006898592691868544,
0.11576902866363525,
-0.06896188855171204,
0.06813156604766846,
-0.1182599812746048,
0.022778911516070366,
0.014255549758672714,
-0.1793513298034668,
-0.02848338522017002,
-0.013679343275725842,
0.05114399269223213,
-0.04211626946926117,
0.13735024631023407,
0.028221676126122475,
-0.03520238399505615,
0.023198885843157768,
0.01868673786520958,
-0.02852870151400566,
0.03549547865986824,
0.20274445414543152,
0.0019129723077639937,
-0.059410687536001205,
-0.1392960399389267,
0.04427557811141014,
0.029729081317782402,
0.06871139258146286,
0.13274386525154114,
0.06507600843906403,
-0.05738446116447449,
0.0837346613407135,
0.008914224803447723,
-0.014893037267029285,
-0.04280068352818489,
-0.12258296459913254,
-0.08002272248268127,
0.07441346347332001,
0.010468910448253155,
0.04265603795647621,
0.20412491261959076,
0.021482175216078758,
-0.006573429796844721,
-0.022675342857837677,
-0.07329250127077103,
-0.15661168098449707,
-0.18158775568008423,
-0.09043916314840317,
-0.04684808850288391,
-0.008073991164565086,
-0.09657162427902222,
0.03237428888678551,
0.01938605308532715,
0.08240722119808197,
-0.053265590220689774,
0.13500943779945374,
0.1344732642173767,
-0.08528965711593628,
0.08648896962404251,
0.018249986693263054,
0.02418271265923977,
-0.021345684304833412,
-0.008503647521138191,
-0.06356336176395416,
-0.011719719506800175,
-0.0481308251619339,
0.025781143456697464,
-0.01352389995008707,
0.0633128210902214,
-0.11427709460258484,
-0.11310508102178574,
-0.016812633723020554,
0.06109200417995453,
-0.012425419874489307,
0.10569681972265244,
0.0425439178943634,
-0.002620181068778038,
0.029810532927513123,
0.20343086123466492,
-0.05006860941648483,
-0.057429928332567215,
-0.06652054935693741,
0.17316362261772156,
-0.0034342515282332897,
0.06728605180978775,
-0.002846671501174569,
-0.029990751296281815,
-0.005258191842585802,
0.2990972697734833,
0.2896990478038788,
-0.07317861914634705,
0.048683229833841324,
-0.012601050548255444,
0.01733049936592579,
0.03297915682196617,
0.12614639103412628,
0.0733405128121376,
0.21644705533981323,
-0.06667109578847885,
-0.0034647569991648197,
-0.019315781071782112,
0.007342556957155466,
-0.08373739570379257,
0.11001256108283997,
0.04052578657865524,
-0.03297426924109459,
-0.01923089474439621,
0.0897778570652008,
-0.15276986360549927,
0.055688995867967606,
-0.07586761564016342,
-0.16297702491283417,
-0.0785914808511734,
-0.018106766045093536,
0.11700178682804108,
-0.026168759912252426,
0.05609210208058357,
-0.031701717525720596,
-0.031206881627440453,
0.023360813036561012,
-0.0016169640002772212,
-0.16799676418304443,
0.03854748606681824,
0.019569534808397293,
-0.13693948090076447,
0.03449268639087677,
-0.0138999093323946,
0.04424114525318146,
0.09772748500108719,
0.03210436552762985,
-0.076392263174057,
0.05268898233771324,
-0.005404002033174038,
0.002446267521008849,
0.03706282749772072,
0.048018958419561386,
0.029783587902784348,
-0.06579825282096863,
0.05730528384447098,
-0.0961698517203331,
0.0327741876244545,
-0.0616471953690052,
-0.044949181377887726,
-0.014629872515797615,
0.04106416925787926,
-0.020176447927951813,
0.09246360510587692,
0.09404154866933823,
-0.017245260998606682,
0.0356304831802845,
-0.0629025474190712,
-0.0320977084338665,
0.014623732306063175,
-0.06746526062488556,
-0.04896172508597374,
-0.1507195234298706,
-0.053244058042764664,
0.11636575311422348,
0.02065630443394184,
-0.25794005393981934,
0.012078147381544113,
-0.11400598287582397,
-0.014342037960886955,
-0.18351365625858307,
0.053148746490478516,
0.15849682688713074,
0.024671413004398346,
0.0011991227511316538,
0.012612523511052132,
0.040520574897527695,
0.07852623611688614,
-0.09458658844232559,
-0.08039067685604095
] |
null | null | null |
# Searching Reaction GIFs with CLIP

Reaction GIFs are an integral part of today's communication. They convey complex emotions with many levels, in a short compact format.
If a picture is worth a thousand words then a GIF is worth more.
We might even say that the level of complexity and expressiveness increases like:
`Emoji < Memes/Image < GIFs`
I think most people would agree it is not always easy to find the perfect reaction GIF.
Although we started out with the more ambitious goal of GIF/Image generation we later settled on first finetuning CLIP.
Which is needed to properly drive a generation model (like VQGAN).

Available CLIP models wouldn't be suitable to use without this finetuning as explained in the challenges below.
## 📝 Challenges
Classic (Image,Text) tasks like, image search, caption generation all focus on cases where the text is a description of the image.
This is mainly because large scale datasets available like COCO,WIT happen to be of that format.
So it is interesting to see if models can also capture some more higher level relations.
like sentiment-> image mapping, where there is great variation on both sides.
We can think of reaction gif/images to be sentiment like, in fact the dataset we use was also gathered for sentiment analysis.
There is no one correct reaction GIF, which also makes evaluation challenging.
# Dataset
We use the [Reaction GIF dataset](https://github.com/bshmueli/ReactionGIF) by Shmueli et al. Also available in [datasets](https://huggingface.co/datasets/julien-c/reactiongif)(without the gifs, but I already had those files, nice coincidence 😉)

We only use the tweet, GIF Response fields.
For this short experiment we only used the first frame of the GIFs to get generate the (text,image) pairs. Thought about using multiple frames from the GIF, but since there may not be much change between frames and didn't know how that would effect the already overfitting model. I decided to go with the simpler single frame version.
As opposed to other datasets, like COCO, we don't have multiple captions per image. Although same GIFs are used multiple times and we can modify the dataset that way we chose not to. Since we already had a very small dataset.
- Domain: Twitter
- Train size: 18,976
- Validation size: 351
# Model
We used the Hybrid CLIP model. Training script: [Hybrid CLIP](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py)
- Text Model: [twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
- Vision Model: [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32)
## Different models tried
`cardiffnlp/twitter-roberta-base-*` models performed best compared to other text models; like the plain `roberta-base`, `bert-base-cased`.
This makes sense since this model is already trained on twitter data.
And among `cardiffnlp/twitter-roberta-base-*` models `twitter-roberta-base-emoji` performed best. (As I had hoped)
This model is `cardiffnlp/twitter-roberta-base` further fine-tuned on emoji classification task, which we can say (emojis) is a parallel to GIFs.
Also tried `google/vit-base-patch32-384`, `google/vit-base-patch16-384` for the vision models, but results were inconclusive.
## Result Interpretation (Warning)
It would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features.
It more likely learned a mapping between sentence sentiment and image occurrence statistics. Because the set of gif images repeat across the dataset, although paired with different sentences.
That is not to say that learning such semantic relations isn't feasible with this model. And it is well worth working on in the future, with a larger and better constructed dataset.
### 📈 Training Logs
Training logs can be found [here](https://wandb.ai/cceyda/flax-clip?workspace=user-cceyda)
It was really easy to overfit since it was a tiny dataset. Used early stopping.
Other parameters:
```
--max_seq_length 128 \
--per_device_train_batch_size="32" \
--learning_rate="1e-5"
--warmup_steps="150"
```
# 💡 Future Potential
It is possible to generate a very large training set by scraping twitter.(Couldn't do during the event because of twitter rate limit)
I found it surprising how well the results turned out to be with so little data and training time. Although it is really hard to define what is an appropriate reaction image, and there are definite mistakes model makes.
(I also trained just a plain clip but didn't have time to prep demo for that 😅 which was also similarly over fitting and only had time to do a single run)
I will definitely be trying out training a similar model for emoji & meme data.
Training CLIP is just the first step, if we have a well trained CLIP generation is within reach 🚀
# How to use
The final model available [here](https://huggingface.co/ceyda/clip-reply)
```py
from model import FlaxHybridCLIP # see demo
from transformers import AutoTokenizer, CLIPProcessor
model = FlaxHybridCLIP.from_pretrained("ceyda/clip-reply")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
processor.tokenizer = AutoTokenizer.from_pretrained("cardiffnlp/twitter-roberta-base")
def query(image_paths,query_text):
images = [Image.open(im).convert("RGB") for im in image_paths]
inputs = processor(text=[query_text], images=images, return_tensors="jax", padding=True)
inputs["pixel_values"] = jnp.transpose(inputs["pixel_values"], axes=[0, 2, 3, 1])
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image.reshape(-1)
probs = jax.nn.softmax(logits_per_image)
```
# Created By
Ceyda Cinarel [@ceyda](https://huggingface.co/ceyda)
Made during the flax community [event](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104/58)
# TL;DR The task
Input: Some sentence (like a tweet)
Output: The most suitable reaction GIF image (Ranking)
Example:
- Input: I miss you
- Output: 
# Demo
https://huggingface.co/spaces/flax-community/clip-reply-demo
|
{}
| null |
flax-community/clip-reply
|
[
"tensorboard",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#tensorboard #has_space #region-us
|
# Searching Reaction GIFs with CLIP
!header gif
Reaction GIFs are an integral part of today's communication. They convey complex emotions with many levels, in a short compact format.
If a picture is worth a thousand words then a GIF is worth more.
We might even say that the level of complexity and expressiveness increases like:
'Emoji < Memes/Image < GIFs'
I think most people would agree it is not always easy to find the perfect reaction GIF.
Although we started out with the more ambitious goal of GIF/Image generation we later settled on first finetuning CLIP.
Which is needed to properly drive a generation model (like VQGAN).
!header gif
Available CLIP models wouldn't be suitable to use without this finetuning as explained in the challenges below.
## Challenges
Classic (Image,Text) tasks like, image search, caption generation all focus on cases where the text is a description of the image.
This is mainly because large scale datasets available like COCO,WIT happen to be of that format.
So it is interesting to see if models can also capture some more higher level relations.
like sentiment-> image mapping, where there is great variation on both sides.
We can think of reaction gif/images to be sentiment like, in fact the dataset we use was also gathered for sentiment analysis.
There is no one correct reaction GIF, which also makes evaluation challenging.
# Dataset
We use the Reaction GIF dataset by Shmueli et al. Also available in datasets(without the gifs, but I already had those files, nice coincidence )
!Dataset
We only use the tweet, GIF Response fields.
For this short experiment we only used the first frame of the GIFs to get generate the (text,image) pairs. Thought about using multiple frames from the GIF, but since there may not be much change between frames and didn't know how that would effect the already overfitting model. I decided to go with the simpler single frame version.
As opposed to other datasets, like COCO, we don't have multiple captions per image. Although same GIFs are used multiple times and we can modify the dataset that way we chose not to. Since we already had a very small dataset.
- Domain: Twitter
- Train size: 18,976
- Validation size: 351
# Model
We used the Hybrid CLIP model. Training script: Hybrid CLIP
- Text Model: twitter-roberta-base-emoji
- Vision Model: openai/clip-vit-base-patch32
## Different models tried
'cardiffnlp/twitter-roberta-base-*' models performed best compared to other text models; like the plain 'roberta-base', 'bert-base-cased'.
This makes sense since this model is already trained on twitter data.
And among 'cardiffnlp/twitter-roberta-base-*' models 'twitter-roberta-base-emoji' performed best. (As I had hoped)
This model is 'cardiffnlp/twitter-roberta-base' further fine-tuned on emoji classification task, which we can say (emojis) is a parallel to GIFs.
Also tried 'google/vit-base-patch32-384', 'google/vit-base-patch16-384' for the vision models, but results were inconclusive.
## Result Interpretation (Warning)
It would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features.
It more likely learned a mapping between sentence sentiment and image occurrence statistics. Because the set of gif images repeat across the dataset, although paired with different sentences.
That is not to say that learning such semantic relations isn't feasible with this model. And it is well worth working on in the future, with a larger and better constructed dataset.
### Training Logs
Training logs can be found here
It was really easy to overfit since it was a tiny dataset. Used early stopping.
Other parameters:
# Future Potential
It is possible to generate a very large training set by scraping twitter.(Couldn't do during the event because of twitter rate limit)
I found it surprising how well the results turned out to be with so little data and training time. Although it is really hard to define what is an appropriate reaction image, and there are definite mistakes model makes.
(I also trained just a plain clip but didn't have time to prep demo for that which was also similarly over fitting and only had time to do a single run)
I will definitely be trying out training a similar model for emoji & meme data.
Training CLIP is just the first step, if we have a well trained CLIP generation is within reach
# How to use
The final model available here
# Created By
Ceyda Cinarel @ceyda
Made during the flax community event
# TL;DR The task
Input: Some sentence (like a tweet)
Output: The most suitable reaction GIF image (Ranking)
Example:
- Input: I miss you
- Output: !hug
# Demo
URL
|
[
"# Searching Reaction GIFs with CLIP\n\n!header gif\n\nReaction GIFs are an integral part of today's communication. They convey complex emotions with many levels, in a short compact format.\n\nIf a picture is worth a thousand words then a GIF is worth more.\n\nWe might even say that the level of complexity and expressiveness increases like:\n\n'Emoji < Memes/Image < GIFs'\n\nI think most people would agree it is not always easy to find the perfect reaction GIF. \n\nAlthough we started out with the more ambitious goal of GIF/Image generation we later settled on first finetuning CLIP. \nWhich is needed to properly drive a generation model (like VQGAN). \n\n!header gif\n\nAvailable CLIP models wouldn't be suitable to use without this finetuning as explained in the challenges below.",
"## Challenges\n\nClassic (Image,Text) tasks like, image search, caption generation all focus on cases where the text is a description of the image.\nThis is mainly because large scale datasets available like COCO,WIT happen to be of that format. \nSo it is interesting to see if models can also capture some more higher level relations.\nlike sentiment-> image mapping, where there is great variation on both sides.\nWe can think of reaction gif/images to be sentiment like, in fact the dataset we use was also gathered for sentiment analysis.\nThere is no one correct reaction GIF, which also makes evaluation challenging.",
"# Dataset\n\nWe use the Reaction GIF dataset by Shmueli et al. Also available in datasets(without the gifs, but I already had those files, nice coincidence )\n!Dataset\nWe only use the tweet, GIF Response fields.\nFor this short experiment we only used the first frame of the GIFs to get generate the (text,image) pairs. Thought about using multiple frames from the GIF, but since there may not be much change between frames and didn't know how that would effect the already overfitting model. I decided to go with the simpler single frame version.\n\nAs opposed to other datasets, like COCO, we don't have multiple captions per image. Although same GIFs are used multiple times and we can modify the dataset that way we chose not to. Since we already had a very small dataset.\n\n- Domain: Twitter\n- Train size: 18,976\n- Validation size: 351",
"# Model\n\nWe used the Hybrid CLIP model. Training script: Hybrid CLIP \n\n- Text Model: twitter-roberta-base-emoji\n- Vision Model: openai/clip-vit-base-patch32",
"## Different models tried\n\n'cardiffnlp/twitter-roberta-base-*' models performed best compared to other text models; like the plain 'roberta-base', 'bert-base-cased'.\nThis makes sense since this model is already trained on twitter data.\n\nAnd among 'cardiffnlp/twitter-roberta-base-*' models 'twitter-roberta-base-emoji' performed best. (As I had hoped)\nThis model is 'cardiffnlp/twitter-roberta-base' further fine-tuned on emoji classification task, which we can say (emojis) is a parallel to GIFs. \n\nAlso tried 'google/vit-base-patch32-384', 'google/vit-base-patch16-384' for the vision models, but results were inconclusive.",
"## Result Interpretation (Warning)\n\nIt would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features. \n\nIt more likely learned a mapping between sentence sentiment and image occurrence statistics. Because the set of gif images repeat across the dataset, although paired with different sentences.\n\nThat is not to say that learning such semantic relations isn't feasible with this model. And it is well worth working on in the future, with a larger and better constructed dataset.",
"### Training Logs\n\nTraining logs can be found here\nIt was really easy to overfit since it was a tiny dataset. Used early stopping.\nOther parameters:",
"# Future Potential\n\nIt is possible to generate a very large training set by scraping twitter.(Couldn't do during the event because of twitter rate limit)\n\nI found it surprising how well the results turned out to be with so little data and training time. Although it is really hard to define what is an appropriate reaction image, and there are definite mistakes model makes.\n\n(I also trained just a plain clip but didn't have time to prep demo for that which was also similarly over fitting and only had time to do a single run)\n\nI will definitely be trying out training a similar model for emoji & meme data.\n\nTraining CLIP is just the first step, if we have a well trained CLIP generation is within reach",
"# How to use\nThe final model available here",
"# Created By\n\nCeyda Cinarel @ceyda\n\nMade during the flax community event",
"# TL;DR The task\n\nInput: Some sentence (like a tweet)\nOutput: The most suitable reaction GIF image (Ranking)\n\nExample: \n - Input: I miss you \n - Output: !hug",
"# Demo\n\nURL"
] |
[
"TAGS\n#tensorboard #has_space #region-us \n",
"# Searching Reaction GIFs with CLIP\n\n!header gif\n\nReaction GIFs are an integral part of today's communication. They convey complex emotions with many levels, in a short compact format.\n\nIf a picture is worth a thousand words then a GIF is worth more.\n\nWe might even say that the level of complexity and expressiveness increases like:\n\n'Emoji < Memes/Image < GIFs'\n\nI think most people would agree it is not always easy to find the perfect reaction GIF. \n\nAlthough we started out with the more ambitious goal of GIF/Image generation we later settled on first finetuning CLIP. \nWhich is needed to properly drive a generation model (like VQGAN). \n\n!header gif\n\nAvailable CLIP models wouldn't be suitable to use without this finetuning as explained in the challenges below.",
"## Challenges\n\nClassic (Image,Text) tasks like, image search, caption generation all focus on cases where the text is a description of the image.\nThis is mainly because large scale datasets available like COCO,WIT happen to be of that format. \nSo it is interesting to see if models can also capture some more higher level relations.\nlike sentiment-> image mapping, where there is great variation on both sides.\nWe can think of reaction gif/images to be sentiment like, in fact the dataset we use was also gathered for sentiment analysis.\nThere is no one correct reaction GIF, which also makes evaluation challenging.",
"# Dataset\n\nWe use the Reaction GIF dataset by Shmueli et al. Also available in datasets(without the gifs, but I already had those files, nice coincidence )\n!Dataset\nWe only use the tweet, GIF Response fields.\nFor this short experiment we only used the first frame of the GIFs to get generate the (text,image) pairs. Thought about using multiple frames from the GIF, but since there may not be much change between frames and didn't know how that would effect the already overfitting model. I decided to go with the simpler single frame version.\n\nAs opposed to other datasets, like COCO, we don't have multiple captions per image. Although same GIFs are used multiple times and we can modify the dataset that way we chose not to. Since we already had a very small dataset.\n\n- Domain: Twitter\n- Train size: 18,976\n- Validation size: 351",
"# Model\n\nWe used the Hybrid CLIP model. Training script: Hybrid CLIP \n\n- Text Model: twitter-roberta-base-emoji\n- Vision Model: openai/clip-vit-base-patch32",
"## Different models tried\n\n'cardiffnlp/twitter-roberta-base-*' models performed best compared to other text models; like the plain 'roberta-base', 'bert-base-cased'.\nThis makes sense since this model is already trained on twitter data.\n\nAnd among 'cardiffnlp/twitter-roberta-base-*' models 'twitter-roberta-base-emoji' performed best. (As I had hoped)\nThis model is 'cardiffnlp/twitter-roberta-base' further fine-tuned on emoji classification task, which we can say (emojis) is a parallel to GIFs. \n\nAlso tried 'google/vit-base-patch32-384', 'google/vit-base-patch16-384' for the vision models, but results were inconclusive.",
"## Result Interpretation (Warning)\n\nIt would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features. \n\nIt more likely learned a mapping between sentence sentiment and image occurrence statistics. Because the set of gif images repeat across the dataset, although paired with different sentences.\n\nThat is not to say that learning such semantic relations isn't feasible with this model. And it is well worth working on in the future, with a larger and better constructed dataset.",
"### Training Logs\n\nTraining logs can be found here\nIt was really easy to overfit since it was a tiny dataset. Used early stopping.\nOther parameters:",
"# Future Potential\n\nIt is possible to generate a very large training set by scraping twitter.(Couldn't do during the event because of twitter rate limit)\n\nI found it surprising how well the results turned out to be with so little data and training time. Although it is really hard to define what is an appropriate reaction image, and there are definite mistakes model makes.\n\n(I also trained just a plain clip but didn't have time to prep demo for that which was also similarly over fitting and only had time to do a single run)\n\nI will definitely be trying out training a similar model for emoji & meme data.\n\nTraining CLIP is just the first step, if we have a well trained CLIP generation is within reach",
"# How to use\nThe final model available here",
"# Created By\n\nCeyda Cinarel @ceyda\n\nMade during the flax community event",
"# TL;DR The task\n\nInput: Some sentence (like a tweet)\nOutput: The most suitable reaction GIF image (Ranking)\n\nExample: \n - Input: I miss you \n - Output: !hug",
"# Demo\n\nURL"
] |
[
14,
184,
136,
213,
45,
189,
111,
38,
155,
9,
18,
46,
3
] |
[
"passage: TAGS\n#tensorboard #has_space #region-us \n# Searching Reaction GIFs with CLIP\n\n!header gif\n\nReaction GIFs are an integral part of today's communication. They convey complex emotions with many levels, in a short compact format.\n\nIf a picture is worth a thousand words then a GIF is worth more.\n\nWe might even say that the level of complexity and expressiveness increases like:\n\n'Emoji < Memes/Image < GIFs'\n\nI think most people would agree it is not always easy to find the perfect reaction GIF. \n\nAlthough we started out with the more ambitious goal of GIF/Image generation we later settled on first finetuning CLIP. \nWhich is needed to properly drive a generation model (like VQGAN). \n\n!header gif\n\nAvailable CLIP models wouldn't be suitable to use without this finetuning as explained in the challenges below.## Challenges\n\nClassic (Image,Text) tasks like, image search, caption generation all focus on cases where the text is a description of the image.\nThis is mainly because large scale datasets available like COCO,WIT happen to be of that format. \nSo it is interesting to see if models can also capture some more higher level relations.\nlike sentiment-> image mapping, where there is great variation on both sides.\nWe can think of reaction gif/images to be sentiment like, in fact the dataset we use was also gathered for sentiment analysis.\nThere is no one correct reaction GIF, which also makes evaluation challenging.",
"passage: # Dataset\n\nWe use the Reaction GIF dataset by Shmueli et al. Also available in datasets(without the gifs, but I already had those files, nice coincidence )\n!Dataset\nWe only use the tweet, GIF Response fields.\nFor this short experiment we only used the first frame of the GIFs to get generate the (text,image) pairs. Thought about using multiple frames from the GIF, but since there may not be much change between frames and didn't know how that would effect the already overfitting model. I decided to go with the simpler single frame version.\n\nAs opposed to other datasets, like COCO, we don't have multiple captions per image. Although same GIFs are used multiple times and we can modify the dataset that way we chose not to. Since we already had a very small dataset.\n\n- Domain: Twitter\n- Train size: 18,976\n- Validation size: 351# Model\n\nWe used the Hybrid CLIP model. Training script: Hybrid CLIP \n\n- Text Model: twitter-roberta-base-emoji\n- Vision Model: openai/clip-vit-base-patch32## Different models tried\n\n'cardiffnlp/twitter-roberta-base-*' models performed best compared to other text models; like the plain 'roberta-base', 'bert-base-cased'.\nThis makes sense since this model is already trained on twitter data.\n\nAnd among 'cardiffnlp/twitter-roberta-base-*' models 'twitter-roberta-base-emoji' performed best. (As I had hoped)\nThis model is 'cardiffnlp/twitter-roberta-base' further fine-tuned on emoji classification task, which we can say (emojis) is a parallel to GIFs. \n\nAlso tried 'google/vit-base-patch32-384', 'google/vit-base-patch16-384' for the vision models, but results were inconclusive.## Result Interpretation (Warning)\n\nIt would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features. \n\nIt more likely learned a mapping between sentence sentiment and image occurrence statistics. Because the set of gif images repeat across the dataset, although paired with different sentences.\n\nThat is not to say that learning such semantic relations isn't feasible with this model. And it is well worth working on in the future, with a larger and better constructed dataset.### Training Logs\n\nTraining logs can be found here\nIt was really easy to overfit since it was a tiny dataset. Used early stopping.\nOther parameters:"
] |
[
-0.010785999707877636,
-0.01718013361096382,
-0.0035655153915286064,
0.01961938664317131,
0.06229366362094879,
0.02181297168135643,
0.05442313849925995,
0.124596007168293,
0.04412679374217987,
0.049360956996679306,
0.00624443031847477,
-0.08683639764785767,
0.10017473250627518,
0.05608975142240524,
0.05756299942731857,
-0.2926122844219208,
-0.0022832751274108887,
-0.041173696517944336,
0.08968512713909149,
0.08535633981227875,
0.10538475215435028,
-0.07544548809528351,
0.06756891310214996,
0.0882146880030632,
-0.09477046132087708,
-0.002077702432870865,
-0.02677886188030243,
-0.00240310188382864,
0.08408445119857788,
0.058212246745824814,
0.05669940263032913,
0.039556194096803665,
0.010847344994544983,
-0.15289603173732758,
0.037183962762355804,
0.10259157419204712,
0.02590201050043106,
0.021384473890066147,
0.10615953803062439,
-0.029328808188438416,
0.1376892626285553,
-0.029269594699144363,
0.0007869098335504532,
0.05759957432746887,
-0.09208603203296661,
-0.004482641816139221,
-0.07962961494922638,
0.03119315207004547,
0.05458257719874382,
0.029751822352409363,
-0.0520460307598114,
0.02244490198791027,
-0.03022838570177555,
0.018795378506183624,
0.11163531988859177,
-0.14088475704193115,
-0.04773540794849396,
0.10683905333280563,
0.05095938965678215,
0.02204727940261364,
-0.0928291380405426,
0.05086768418550491,
0.010477564297616482,
-0.006101364269852638,
-0.03264043480157852,
0.018625129014253616,
0.059547059237957,
-0.017378177493810654,
-0.10809700191020966,
-0.03351792320609093,
0.09883268177509308,
0.009875908493995667,
-0.09315014630556107,
-0.16911552846431732,
-0.018284745514392853,
0.04635167121887207,
0.0032014548778533936,
-0.02822982147336006,
-0.021486777812242508,
0.09183575958013535,
0.043642133474349976,
-0.10623930394649506,
-0.16091559827327728,
0.006597351282835007,
-0.08177618682384491,
0.03750273212790489,
-0.02812054380774498,
0.05369868874549866,
-0.01641572266817093,
0.10780791938304901,
-0.19500954449176788,
-0.044340454041957855,
-0.04881814122200012,
-0.0797656923532486,
-0.12027537822723389,
-0.056614525616168976,
-0.03430458530783653,
-0.125193253159523,
0.030431874096393585,
0.07196758687496185,
0.018782101571559906,
0.002494247630238533,
0.005187672097235918,
0.04161859303712845,
0.1519576609134674,
0.1314287930727005,
-0.08742386102676392,
0.005894238129258156,
0.04572571814060211,
0.006644001230597496,
-0.012760769575834274,
-0.0100743118673563,
-0.04172847419977188,
-0.010069618001580238,
-0.0428767055273056,
0.010577661916613579,
0.03329380229115486,
0.023363756015896797,
-0.08691219985485077,
-0.0364144891500473,
0.06846783310174942,
-0.06374490261077881,
0.008403641171753407,
-0.002847533207386732,
-0.06807585060596466,
0.08634568750858307,
0.023189883679151535,
0.02741212584078312,
-0.03002789616584778,
0.00774224940687418,
-0.03204087167978287,
-0.022490017116069794,
-0.050533637404441833,
-0.040053240954875946,
0.038066066801548004,
-0.03359391540288925,
-0.07336048781871796,
-0.08363023400306702,
-0.1584027111530304,
-0.07501272857189178,
0.02836015075445175,
-0.08878493309020996,
0.021164724603295326,
0.04982926324009895,
0.006098693236708641,
-0.041235338896512985,
0.0488872230052948,
0.016767285764217377,
-0.033687006682157516,
0.02828253246843815,
-0.09138359129428864,
0.05566271021962166,
0.04090302437543869,
-0.005866924300789833,
-0.0940207913517952,
0.01051278319209814,
-0.2789843678474426,
0.18995968997478485,
-0.024707559496164322,
0.01103509683161974,
-0.07833337038755417,
0.005411546677350998,
-0.053875748068094254,
0.00972285121679306,
-0.007657613605260849,
0.1267036646604538,
-0.1512884795665741,
-0.04079729691147804,
0.013435045257210732,
-0.0936090350151062,
0.0028463657945394516,
0.1761082261800766,
-0.049047768115997314,
0.09076102823019028,
0.10043665021657944,
0.11448504775762558,
0.07935861498117447,
-0.06811520457267761,
-0.04414213448762894,
-0.016231585294008255,
-0.09272728860378265,
0.16962683200836182,
0.02684323862195015,
0.0003692423924803734,
-0.013389941304922104,
0.0014101890847086906,
-0.052159491926431656,
0.03317791223526001,
-0.04745088517665863,
-0.020185980945825577,
-0.018577661365270615,
-0.014539152383804321,
-0.056550636887550354,
0.019794315099716187,
-0.03349579870700836,
-0.044422976672649384,
-0.10202820599079132,
-0.1557830572128296,
0.04803915694355965,
-0.06042606383562088,
-0.024402808398008347,
-0.06354188174009323,
0.11478223651647568,
-0.05549952760338783,
0.002961239777505398,
-0.09861835092306137,
-0.11988402903079987,
0.08587764203548431,
-0.03240208327770233,
0.09052582830190659,
0.02896895632147789,
0.033418454229831696,
0.04308789595961571,
-0.021211642771959305,
0.0009364724392071366,
0.01118807028979063,
-0.033336490392684937,
-0.036971449851989746,
-0.1397601217031479,
-0.03229261189699173,
-0.06245884299278259,
0.1372898817062378,
-0.064327672123909,
-0.010837914422154427,
0.16635334491729736,
0.165683314204216,
-0.007985404692590237,
-0.06655361503362656,
0.04923679679632187,
-0.07345680147409439,
-0.03076264262199402,
-0.0765475481748581,
-0.0007785321795381606,
0.02454320713877678,
-0.060375288128852844,
0.11154983192682266,
-0.09835547208786011,
-0.24860456585884094,
0.06085079908370972,
0.0010756938718259335,
-0.12408453226089478,
-0.052913956344127655,
-0.05504271388053894,
0.026604097336530685,
-0.1323927342891693,
-0.035724177956581116,
0.16118544340133667,
-0.008396010845899582,
0.03956034779548645,
-0.08651843667030334,
-0.03261340782046318,
0.025980323553085327,
-0.02328931912779808,
-0.01365850679576397,
0.0009504137560725212,
0.04028138890862465,
-0.2513464093208313,
0.046722061932086945,
-0.05383595451712608,
0.05069280415773392,
0.20386774837970734,
0.016968777403235435,
-0.09704568237066269,
-0.03104877471923828,
0.06354719400405884,
-0.023301634937524796,
0.06368289142847061,
-0.0859735757112503,
-0.008663276210427284,
0.03928188234567642,
-0.058087222278118134,
0.023039771243929863,
-0.13765722513198853,
0.07858022302389145,
0.04057232663035393,
-0.025652846321463585,
0.08287777006626129,
-0.03448238596320152,
0.015324822627007961,
0.08149100095033646,
0.034886933863162994,
0.03807087242603302,
-0.004627630114555359,
-0.014202571474015713,
-0.08762578666210175,
0.1342095136642456,
-0.035369303077459335,
-0.2725903391838074,
-0.07106806337833405,
0.0987953245639801,
-0.04664825275540352,
0.04737120866775513,
0.0029673799872398376,
-0.10981043428182602,
-0.0712570995092392,
-0.06449652463197708,
0.08202143013477325,
0.039602525532245636,
0.00874244887381792,
-0.08280253410339355,
-0.02126939967274666,
-0.019662126898765564,
-0.08428987860679626,
-0.021206168457865715,
-0.009171308018267155,
-0.09362338483333588,
0.05452610179781914,
-0.034384630620479584,
0.03947444260120392,
0.11641652882099152,
-0.04756835103034973,
0.013081474229693413,
-0.04969404637813568,
0.21704351902008057,
-0.15220907330513,
0.1657075583934784,
0.1138472855091095,
0.005707667209208012,
0.07658088207244873,
0.0967014878988266,
-0.008127718232572079,
-0.07045607268810272,
0.04180335998535156,
0.04496166482567787,
-0.06639696657657623,
-0.15041333436965942,
-0.030296871438622475,
-0.055526427924633026,
-0.004394161514937878,
0.06179368495941162,
0.04622187465429306,
0.025898661464452744,
0.0854133665561676,
-0.10967282205820084,
-0.014451449736952782,
0.047659218311309814,
0.06123865395784378,
0.0016128560528159142,
-0.019903158769011497,
0.07117296755313873,
-0.03523138910531998,
-0.06698920577764511,
0.1372564733028412,
0.029150400310754776,
0.13039571046829224,
-0.059869907796382904,
0.0843660980463028,
0.06227483972907066,
0.08403535932302475,
0.09173312783241272,
0.018833424896001816,
-0.052754439413547516,
-0.005886471830308437,
-0.041471078991889954,
-0.02939126081764698,
0.004267494194209576,
0.0620371550321579,
0.1301153302192688,
-0.10534641146659851,
-0.012524988502264023,
-0.007452534511685371,
0.10112319886684418,
0.20244435966014862,
0.039968326687812805,
-0.1654462218284607,
-0.05385482683777809,
-0.00034762080758810043,
-0.05176464468240738,
-0.09078419208526611,
-0.029119662940502167,
0.16417032480239868,
-0.10561016947031021,
0.07961221784353256,
-0.020880132913589478,
0.05706847831606865,
-0.14182528853416443,
-0.021665511652827263,
0.02295513078570366,
0.08420483767986298,
-0.04873129725456238,
0.07440462708473206,
-0.02498696744441986,
0.018398484215140343,
0.022832278162240982,
0.062257587909698486,
-0.11317487806081772,
0.025898098945617676,
0.07666633278131485,
0.048629481345415115,
0.10029882192611694,
0.01803617924451828,
0.02455219253897667,
-0.02239924669265747,
-0.01821139082312584,
0.016853751614689827,
0.12998878955841064,
-0.12665173411369324,
0.10523685812950134,
-0.011164450086653233,
-0.0015926705673336983,
-0.03368484973907471,
0.00409211590886116,
-0.10219479352235794,
-0.18608644604682922,
0.050607696175575256,
-0.06625015288591385,
0.06068824976682663,
-0.046578697860240936,
0.006723999045789242,
-0.03056676685810089,
0.21279926598072052,
0.07807841151952744,
-0.07978024333715439,
-0.1324400156736374,
0.09960499405860901,
0.11010925471782684,
0.005199423059821129,
0.012725484557449818,
-0.04008176550269127,
0.17961138486862183,
-0.06524740159511566,
-0.07320834696292877,
0.007255257107317448,
-0.08792482316493988,
-0.15136487782001495,
-0.049367763102054596,
0.11983130127191544,
0.05826697126030922,
0.007960060611367226,
0.059605203568935394,
0.0089827049523592,
-0.01709688827395439,
-0.08477722108364105,
0.07131902128458023,
0.02574346587061882,
-0.034381795674562454,
0.14025913178920746,
0.068916454911232,
0.0013698451220989227,
-0.10861004889011383,
-0.034349214285612106,
0.041960328817367554,
0.11557351797819138,
-0.05153869092464447,
0.0634969174861908,
0.12440413236618042,
-0.06670640408992767,
-0.17227046191692352,
0.026946060359477997,
0.05540510267019272,
0.032251954078674316,
-0.06729286164045334,
-0.16811351478099823,
0.07492692023515701,
0.09388293325901031,
0.006767127197235823,
0.002066388726234436,
-0.16928726434707642,
-0.07819725573062897,
-0.031132277101278305,
-0.022982969880104065,
0.08444003015756607,
-0.11762283742427826,
-0.023403547704219818,
0.006335537414997816,
0.04593215882778168,
0.18804538249969482,
-0.04351825639605522,
0.11293268948793411,
0.022310052067041397,
0.0035478519275784492,
0.059961337596178055,
-0.026378542184829712,
0.14023742079734802,
-0.06419447064399719,
0.05243431776762009,
-0.02977394126355648,
0.0578681081533432,
0.08785989880561829,
-0.026560671627521515,
0.09684423357248306,
0.08857867121696472,
0.015472298488020897,
-0.05722542479634285,
-0.079981230199337,
-0.08227874338626862,
0.05734523385763168,
0.013663684949278831,
-0.06177813187241554,
-0.14980727434158325,
0.05898803472518921,
0.02290448173880577,
0.008308260701596737,
-0.0928514301776886,
-0.05905831232666969,
0.01995798945426941,
0.12734319269657135,
0.11707448959350586,
-0.01722794771194458,
-0.10844534635543823,
0.0053363037295639515,
-0.0017130491323769093,
0.0808836817741394,
-0.1303364634513855,
0.03161727637052536,
0.08592866361141205,
-0.0028219837695360184,
0.15110698342323303,
0.015576595440506935,
-0.17036661505699158,
0.0036661606281995773,
0.0858176052570343,
-0.10901399701833725,
-0.23542733490467072,
-0.0624537467956543,
-0.026286417618393898,
-0.02153177000582218,
-0.06017148867249489,
0.11755447089672089,
-0.039907749742269516,
-0.036937564611434937,
0.0020690723322331905,
0.06299616396427155,
0.011437708511948586,
0.057381149381399155,
-0.000780746340751648,
-0.00442185252904892,
-0.09303607046604156,
0.09909860789775848,
0.0867719054222107,
-0.12539739906787872,
0.013230291195213795,
0.07646086812019348,
-0.07961967587471008,
-0.033704861998558044,
-0.054672323167324066,
0.0669788345694542,
-0.07641258835792542,
-0.0011387965641915798,
-0.00004541315138339996,
-0.1103447899222374,
-0.0009242696687579155,
0.08897903561592102,
-0.004445109516382217,
0.03073502890765667,
-0.018106287345290184,
0.04878499358892441,
-0.10384345054626465,
0.04915693402290344,
0.1075255498290062,
0.03201841190457344,
-0.03843894973397255,
0.04825282096862793,
0.004831725265830755,
-0.002014703117311001,
-0.032101672142744064,
-0.04034220799803734,
-0.042466796934604645,
-0.026527050882577896,
-0.06513690203428268,
0.01850753091275692,
-0.0489477664232254,
-0.03650607168674469,
0.07752813398838043,
0.0018368213204666972,
0.03734704852104187,
0.007100604474544525,
-0.06905502080917358,
-0.010917039588093758,
-0.027775850147008896,
0.036245882511138916,
-0.07594174891710281,
-0.013675147667527199,
0.07679570466279984,
-0.07936515659093857,
0.12170001864433289,
0.03201805055141449,
-0.025453876703977585,
-0.06861318647861481,
-0.18446049094200134,
-0.06863437592983246,
0.06522417068481445,
0.02609594538807869,
-0.026988187804818153,
-0.07193659245967865,
0.06301216036081314,
0.013072812929749489,
-0.0455801784992218,
-0.017408298328518867,
0.03691526874899864,
-0.08590956032276154,
0.0058473385870456696,
0.070905901491642,
0.013110917061567307,
-0.1166393980383873,
0.022900909185409546,
0.04985961318016052,
0.09568776935338974,
0.07220093905925751,
-0.03394833952188492,
0.03662428632378578,
-0.1347656399011612,
-0.015218785963952541,
-0.00541787501424551,
-0.012553824111819267,
0.025145452469587326,
-0.04463208466768265,
0.06256791204214096,
-0.0011569601483643055,
0.11236295849084854,
0.07963566482067108,
0.005418340675532818,
0.004603068809956312,
0.12328989058732986,
0.08127321302890778,
0.02526472508907318,
-0.017061224207282066,
0.004158295691013336,
-0.027253441512584686,
-0.04032640904188156,
0.00014494825154542923,
0.04749530553817749,
-0.08048777282238007,
0.06690964102745056,
-0.01703265681862831,
0.12784864008426666,
0.05265973508358002,
0.04668019711971283,
-0.038076795637607574,
0.034430790692567825,
0.041090868413448334,
-0.0638260468840599,
0.06364945322275162,
-0.043567001819610596,
-0.034287936985492706,
0.1615086793899536,
-0.11214999854564667,
0.11477600783109665,
-0.02284674160182476,
-0.06241074949502945,
-0.03859419375658035,
-0.23928365111351013,
-0.020892024040222168,
-0.06502465903759003,
0.013796819373965263,
-0.12148471176624298,
0.07680492848157883,
0.05615531653165817,
0.05418040230870247,
-0.029929086565971375,
0.10986311733722687,
-0.06401877850294113,
-0.1274094134569168,
0.1268656849861145,
0.014638607390224934,
0.051737669855356216,
0.05417703464627266,
-0.018817784264683723,
0.051786117255687714,
0.045723553746938705,
0.06508469581604004,
0.05505869537591934,
0.1091032475233078,
0.0025776969268918037,
-0.03129216283559799,
-0.08036398142576218,
-0.024482451379299164,
-0.07691391557455063,
-0.02917599491775036,
0.09108097851276398,
0.005676665343344212,
-0.007883935235440731,
0.013027079403400421,
0.15618562698364258,
-0.025912851095199585,
-0.033252738416194916,
-0.16738659143447876,
0.10275442898273468,
-0.10429639369249344,
-0.006294114515185356,
0.015177508816123009,
-0.1243162453174591,
0.02440815418958664,
0.18497809767723083,
0.043348610401153564,
-0.023075159639120102,
-0.03471991419792175,
0.005268535111099482,
-0.01386338472366333,
0.03496166318655014,
0.038961298763751984,
-0.005630346015095711,
0.17669691145420074,
-0.05489322543144226,
0.19467204809188843,
0.013144610449671745,
-0.06639421731233597,
-0.003712743055075407,
0.10850846767425537,
0.03079828992486,
0.02593166194856167,
-0.08156105875968933,
0.08479897677898407,
-0.033280711621046066,
-0.2736157178878784,
0.10671784728765488,
-0.04772694408893585,
-0.054745838046073914,
-0.008218583650887012,
-0.012565243057906628,
0.0014146938920021057,
0.09108493477106094,
-0.032005660235881805,
0.007302786223590374,
0.06147202104330063,
0.03164173662662506,
-0.13757410645484924,
-0.003871990367770195,
0.04363878443837166,
-0.1205800473690033,
0.11298410594463348,
0.0066864266991615295,
0.057607121765613556,
0.06805402785539627,
-0.04777023196220398,
-0.13477228581905365,
-0.0212206169962883,
-0.004924075677990913,
-0.07494910061359406,
-0.010193257592618465,
0.20819978415966034,
0.05122523009777069,
-0.0023159752599895,
0.04990692436695099,
-0.10939943045377731,
0.04323925822973251,
0.018643217161297798,
0.04171441122889519,
-0.06910369545221329,
0.04544536769390106,
-0.09485391527414322,
0.109459787607193,
0.10462044179439545,
-0.007152056321501732,
0.024875368922948837,
-0.053599003702402115,
-0.0064281583763659,
0.06611660122871399,
0.11164042353630066,
0.012959617190063,
-0.11117532104253769,
-0.027531053870916367,
0.04037082940340042,
0.07027890533208847,
-0.08770054578781128,
-0.06476748734712601,
0.054180245846509933,
0.014990940690040588,
-0.018873833119869232,
0.11756420880556107,
0.08249041438102722,
0.041677940636873245,
-0.01453726552426815,
-0.06876790523529053,
0.02946074679493904,
0.045206308364868164,
-0.027010269463062286,
0.003680045949295163
] |
null | null |
transformers
|
# Model Card: clip-rsicd
## Model Details
This model is a fine-tuned [CLIP by OpenAI](https://huggingface.co/openai/clip-vit-base-patch32). It is designed with an aim to improve zero-shot image classification, text-to-image and image-to-image retrieval specifically on remote sensing images.
### Model Date
July 2021
### Model Type
The base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
### Model Version
We release several checkpoints for `clip-rsicd` model. Refer to [our github repo](https://github.com/arampacha/CLIP-rsicd#evaluation-results) for performance metrics on zero-shot classification for each of those.
### Training
To reproduce the fine-tuning procedure one can use released [script](https://github.com/arampacha/CLIP-rsicd/blob/master/run_clip_flax_tv.py).
The model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.
Full log of the training run can be found on [WandB](https://wandb.ai/wandb/hf-flax-clip-rsicd/runs/2dj1exsw).
### Demo
Check out the model text-to-image and image-to-image capabilities using [this demo](https://huggingface.co/spaces/sujitpal/clip-rsicd-demo).
### Documents
- [Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/Fine_tuning_CLIP_with_HF_on_TPU.ipynb)
### Use with Transformers
```python
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("flax-community/clip-rsicd-v2")
processor = CLIPProcessor.from_pretrained("flax-community/clip-rsicd-v2")
url = "https://raw.githubusercontent.com/arampacha/CLIP-rsicd/master/data/stadium_1.jpg"
image = Image.open(requests.get(url, stream=True).raw)
labels = ["residential area", "playground", "stadium", "forest", "airport"]
inputs = processor(text=[f"a photo of a {l}" for l in labels], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
for l, p in zip(labels, probs[0]):
print(f"{l:<16} {p:.4f}")
```
[Try it on colab](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/clip_rsicd_zero_shot.ipynb)
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.
In addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found [here](https://github.com/arampacha/CLIP-rsicd#applications). In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.
We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The model was trained on publicly available remote sensing image captions datasets. Namely [RSICD](https://github.com/201528014227051/RSICD_optimal), [UCM](https://mega.nz/folder/wCpSzSoS#RXzIlrv--TDt3ENZdKN8JA) and [Sydney](https://mega.nz/folder/pG4yTYYA#4c4buNFLibryZnlujsrwEQ). More information on the datasets used can be found on [our project page](https://github.com/arampacha/CLIP-rsicd#dataset).
## Performance and Limitations
### Performance
| Model-name | k=1 | k=3 | k=5 | k=10 |
| -------------------------------- | ----- | ----- | ----- | ----- |
| original CLIP | 0.572 | 0.745 | 0.837 | 0.939 |
| clip-rsicd-v2 (this model) | **0.883** | **0.968** | **0.982** | **0.998** |
## Limitations
The model is fine-tuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to [CLIP model card](https://huggingface.co/openai/clip-vit-base-patch32#limitations) for details on those.
|
{"tags": ["vision"]}
|
zero-shot-image-classification
|
flax-community/clip-rsicd-v2
|
[
"transformers",
"pytorch",
"jax",
"clip",
"zero-shot-image-classification",
"vision",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us
|
Model Card: clip-rsicd
======================
Model Details
-------------
This model is a fine-tuned CLIP by OpenAI. It is designed with an aim to improve zero-shot image classification, text-to-image and image-to-image retrieval specifically on remote sensing images.
### Model Date
July 2021
### Model Type
The base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
### Model Version
We release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.
### Training
To reproduce the fine-tuning procedure one can use released script.
The model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.
Full log of the training run can be found on WandB.
### Demo
Check out the model text-to-image and image-to-image capabilities using this demo.
### Documents
* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU
### Use with Transformers
Try it on colab
Model Use
---------
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.
In addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.
We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
Data
----
The model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.
Performance and Limitations
---------------------------
### Performance
Limitations
-----------
The model is fine-tuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those.
|
[
"### Model Date\n\n\nJuly 2021",
"### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.",
"### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.",
"### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.",
"### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.",
"### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU",
"### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------",
"### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.",
"#### Primary intended uses\n\n\nThe primary intended users of these models are AI researchers.\n\n\nWe primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.\n\n\nData\n----\n\n\nThe model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.\n\n\nPerformance and Limitations\n---------------------------",
"### Performance\n\n\n\nLimitations\n-----------\n\n\nThe model is fine-tuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those."
] |
[
"TAGS\n#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us \n",
"### Model Date\n\n\nJuly 2021",
"### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.",
"### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.",
"### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.",
"### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.",
"### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU",
"### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------",
"### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.",
"#### Primary intended uses\n\n\nThe primary intended users of these models are AI researchers.\n\n\nWe primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.\n\n\nData\n----\n\n\nThe model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.\n\n\nPerformance and Limitations\n---------------------------",
"### Performance\n\n\n\nLimitations\n-----------\n\n\nThe model is fine-tuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those."
] |
[
41,
6,
72,
40,
70,
24,
32,
16,
149,
114,
46
] |
[
"passage: TAGS\n#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us \n### Model Date\n\n\nJuly 2021### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis."
] |
[
-0.08165604621171951,
0.094379723072052,
-0.002018871484324336,
0.06923872977495193,
0.06741392612457275,
0.02109570987522602,
0.1355014145374298,
0.07218223810195923,
0.027770821005105972,
0.0008421421516686678,
-0.00004227369572618045,
-0.052550602704286575,
0.0611286424100399,
0.132141575217247,
0.06712324917316437,
-0.17483830451965332,
0.006232121028006077,
-0.04879440367221832,
0.0027038666885346174,
0.08857088536024094,
0.08119340240955353,
-0.07309627532958984,
0.08379597961902618,
0.04483843967318535,
-0.12159107625484467,
-0.030570587143301964,
-0.06397073715925217,
-0.03443704545497894,
0.11774308234453201,
0.07971224933862686,
0.09109725058078766,
0.000400802178774029,
0.07225379347801208,
-0.08387213945388794,
0.03655797615647316,
0.09271373599767685,
0.002465922385454178,
0.08671756833791733,
0.050362054258584976,
0.0009538587182760239,
0.19687296450138092,
-0.04583581164479256,
0.03325600549578667,
0.058852680027484894,
-0.0717315524816513,
-0.0255725234746933,
-0.08458180725574493,
0.0487842783331871,
0.04445780813694,
0.07322708517313004,
0.00682765943929553,
0.1143336370587349,
-0.06306440383195877,
0.06345519423484802,
0.12800778448581696,
-0.12934815883636475,
-0.06369981169700623,
0.09033551067113876,
0.11699284613132477,
0.0538807176053524,
-0.06049701198935509,
0.017697619274258614,
0.027204807847738266,
0.016759993508458138,
0.0489618182182312,
-0.013401323929429054,
-0.07074661552906036,
-0.040461812168359756,
-0.1050950214266777,
-0.016101084649562836,
0.05165102705359459,
-0.043673958629369736,
-0.11023422330617905,
-0.1025400161743164,
0.023786911740899086,
-0.0801653116941452,
0.04704376682639122,
-0.004598706495016813,
0.015504658222198486,
0.001443632529117167,
0.01751425489783287,
-0.05674387887120247,
-0.12499616295099258,
-0.06572466343641281,
-0.0016760891303420067,
0.19935916364192963,
0.014462464489042759,
0.03712546452879906,
-0.04858627915382385,
0.1265409141778946,
0.03582543879747391,
-0.042939163744449615,
-0.07172362506389618,
-0.05375022813677788,
-0.1411871612071991,
-0.051585495471954346,
-0.05445589870214462,
-0.0641053169965744,
-0.012432277202606201,
0.09564007818698883,
-0.06627592444419861,
0.012735885567963123,
0.03280004858970642,
0.04918929934501648,
0.11117026209831238,
0.11233571916818619,
-0.03360825404524803,
0.0055211191065609455,
0.05821453407406807,
-0.08437852561473846,
-0.008845017291605473,
-0.02453591860830784,
-0.04261466860771179,
-0.03031347133219242,
0.032403767108917236,
0.0657460168004036,
-0.0036336020566523075,
-0.012533647008240223,
-0.039687808603048325,
-0.03814442828297615,
0.15946947038173676,
-0.051247432827949524,
0.036236509680747986,
0.026388105005025864,
-0.005543647799640894,
0.012234527617692947,
0.08448603749275208,
-0.035808365792036057,
-0.054002825170755386,
0.018780237063765526,
-0.07055316865444183,
0.005843064747750759,
-0.06482845544815063,
-0.11086374521255493,
0.025965768843889236,
-0.1290939599275589,
0.02257459983229637,
-0.19151782989501953,
-0.09852084517478943,
-0.03560423105955124,
0.04750274121761322,
-0.06537942588329315,
0.002705963561311364,
0.01612795889377594,
-0.05436870828270912,
-0.06117945909500122,
0.06385054439306259,
0.023157648742198944,
-0.028751954436302185,
0.017073580995202065,
-0.07087033987045288,
0.06778300553560257,
-0.016739388927817345,
-0.04087192937731743,
-0.07287441194057465,
-0.012616954743862152,
-0.11695373803377151,
0.0915067046880722,
-0.05020185187458992,
0.07298143953084946,
-0.09656257182359695,
-0.04685980826616287,
0.01454290933907032,
0.0007148655713535845,
0.051509540528059006,
0.13707168400287628,
-0.14493834972381592,
0.016973581165075302,
0.06941717118024826,
-0.14844371378421783,
-0.04700044170022011,
0.13403049111366272,
-0.027545897290110588,
0.11366541683673859,
0.050646498799324036,
0.029819870367646217,
0.13715383410453796,
-0.16814206540584564,
-0.009234601631760597,
-0.01177898794412613,
-0.09890224784612656,
-0.013659490272402763,
0.04084966331720352,
0.06585333496332169,
-0.05071514472365379,
-0.023620814085006714,
-0.08111082762479782,
0.053463369607925415,
-0.02843027003109455,
-0.05054890736937523,
-0.02367180585861206,
-0.05160989984869957,
-0.04972723126411438,
-0.01902354136109352,
0.01588946208357811,
-0.011323128826916218,
-0.06977099925279617,
-0.09296676516532898,
0.08322236686944962,
-0.03340248391032219,
0.0286115650087595,
-0.08171392977237701,
0.03777683898806572,
-0.0544268898665905,
-0.019230222329497337,
-0.1097058579325676,
-0.01100238598883152,
0.0329224094748497,
-0.08032408356666565,
0.06946392357349396,
0.03223885968327522,
0.0483759306371212,
0.07147734612226486,
-0.003984705079346895,
0.023225324228405952,
-0.053552400320768356,
0.0009091193787753582,
-0.05252484977245331,
-0.13454732298851013,
-0.06337014585733414,
-0.05032864212989807,
0.0506284162402153,
-0.10905864834785461,
-0.0007101760129444301,
0.03010779246687889,
0.0996529832482338,
0.04233585298061371,
-0.06995532661676407,
0.04382835328578949,
0.0011684909695759416,
-0.025583527982234955,
-0.08503325283527374,
0.008946186862885952,
0.0782303661108017,
-0.0820588544011116,
0.06920577585697174,
-0.01653180830180645,
-0.1277017891407013,
0.04264640808105469,
0.0649801567196846,
-0.15798220038414001,
-0.023488998413085938,
-0.013842049986124039,
-0.02350679598748684,
-0.0946100726723671,
-0.004299513064324856,
0.26139432191848755,
0.0070538329891860485,
0.027727628126740456,
-0.06500525027513504,
-0.03680505231022835,
0.037623703479766846,
-0.022192850708961487,
0.05409586429595947,
0.04126553609967232,
0.05698408931493759,
-0.15854573249816895,
-0.014898449182510376,
-0.06078911945223808,
-0.10957928746938705,
0.11787011474370956,
0.027048004791140556,
-0.08004547655582428,
-0.02343764528632164,
0.004581608809530735,
0.059530749917030334,
0.17655955255031586,
-0.0011810599826276302,
0.007413347717374563,
0.009499868378043175,
0.0001487391855334863,
0.024623053148388863,
-0.09815356135368347,
0.05033217370510101,
0.03022152930498123,
0.030936183407902718,
-0.05109291151165962,
-0.02740790881216526,
-0.056357815861701965,
0.03177327662706375,
0.028699176385998726,
0.027449917048215866,
-0.004491014406085014,
0.003486750414595008,
-0.1015700250864029,
0.194843590259552,
-0.052325375378131866,
-0.19160783290863037,
-0.09887855499982834,
0.1520724892616272,
0.06008546054363251,
-0.011511877179145813,
0.024637803435325623,
-0.03970732167363167,
-0.07362305372953415,
-0.06896757334470749,
0.033966176211833954,
-0.029247375205159187,
-0.013900483027100563,
-0.029506247490644455,
0.03303037956357002,
-0.011487691663205624,
-0.11733677983283997,
0.037504490464925766,
0.04754508286714554,
-0.013954322785139084,
0.07691586762666702,
0.010362106375396252,
0.04146198183298111,
0.07374085485935211,
-0.05853100121021271,
0.0010865925578400493,
-0.004031194839626551,
0.13325221836566925,
-0.09455248713493347,
0.09817247837781906,
0.10229724645614624,
-0.008562522009015083,
-0.0028998591005802155,
0.01725156605243683,
0.006123492494225502,
-0.05681459978222847,
0.03140075504779816,
-0.005163196008652449,
-0.07397118210792542,
-0.16032429039478302,
-0.038219649344682693,
-0.05476169288158417,
0.004379113204777241,
0.08179590106010437,
0.027540605515241623,
0.007200487423688173,
0.0802076980471611,
0.033872563391923904,
0.00897279754281044,
-0.008153279311954975,
0.11796298623085022,
-0.062997005879879,
0.018739450722932816,
0.023982588201761246,
-0.041952986270189285,
0.03272920101881027,
0.08189341425895691,
0.04557371139526367,
0.21238721907138824,
-0.0681299939751625,
0.11877286434173584,
0.05044279620051384,
-0.0019236538792029023,
0.07983534038066864,
0.0895635113120079,
-0.00614011799916625,
-0.007272453512996435,
-0.006878504995256662,
-0.03746551647782326,
-0.035272158682346344,
0.04795281961560249,
-0.032324858009815216,
-0.039895717054605484,
0.013404116965830326,
-0.019164463505148888,
0.02448236383497715,
0.15447621047496796,
0.022818585857748985,
-0.20052585005760193,
-0.13669458031654358,
-0.012964930385351181,
-0.02649284154176712,
-0.09236694872379303,
0.0018418183317407966,
0.13758209347724915,
-0.10495976358652115,
-0.05732748284935951,
-0.038618654012680054,
0.08009732514619827,
-0.1116734966635704,
-0.0549461655318737,
-0.008932595141232014,
0.10047998279333115,
-0.022903351113200188,
0.038142260164022446,
-0.06168820336461067,
0.04260191321372986,
-0.013135996647179127,
0.13060489296913147,
-0.08879850804805756,
0.012573190964758396,
0.046000681817531586,
0.05175285413861275,
0.1344618797302246,
0.041735030710697174,
0.003913787193596363,
-0.024797657504677773,
-0.09699543565511703,
0.05537624657154083,
-0.01033325307071209,
-0.07237470895051956,
0.07794631272554398,
-0.0009070865926332772,
0.020807180553674698,
-0.0425255186855793,
-0.08268056809902191,
-0.1289445459842682,
-0.1210329607129097,
0.035875141620635986,
-0.060113970190286636,
0.07461969554424286,
-0.04412256181240082,
-0.05740866810083389,
0.019580088555812836,
0.1732615828514099,
-0.0842205286026001,
-0.08146032691001892,
-0.12888559699058533,
-0.01649116910994053,
0.10238195210695267,
-0.006419336888939142,
0.05145445093512535,
0.028714075684547424,
0.1339661031961441,
-0.06264077126979828,
-0.05605432391166687,
0.016081953421235085,
-0.14134946465492249,
-0.12318762391805649,
-0.031188566237688065,
0.03662151098251343,
0.14662878215312958,
0.05233723670244217,
0.05323529615998268,
0.0210321843624115,
-0.05419425666332245,
-0.10927881300449371,
-0.05325360596179962,
0.0624130442738533,
-0.0013572723837569356,
0.015962406992912292,
0.008479185402393341,
0.050693340599536896,
-0.01639411225914955,
-0.040679238736629486,
0.054094262421131134,
0.1328376978635788,
-0.08458255976438522,
0.07936567813158035,
0.09735994786024094,
-0.05016939714550972,
-0.16769739985466003,
-0.007176279556006193,
0.06201664358377457,
0.09725513309240341,
-0.022156978026032448,
-0.16623662412166595,
0.06655817478895187,
0.01250290498137474,
0.00609692744910717,
0.11857347190380096,
-0.2157820165157318,
-0.0743076279759407,
0.0620870403945446,
0.07828540354967117,
0.06930931657552719,
-0.08429025113582611,
-0.0031932054553180933,
0.015026325359940529,
-0.08450428396463394,
0.16493871808052063,
-0.09580710530281067,
0.08134076744318008,
-0.01816626824438572,
-0.010419249534606934,
0.0257167536765337,
-0.056585222482681274,
0.129123717546463,
-0.03458808735013008,
0.0350693017244339,
-0.029030701145529747,
0.025632085278630257,
-0.01573733426630497,
-0.050955139100551605,
0.08457338809967041,
-0.015013479627668858,
0.08881301432847977,
-0.040270280092954636,
-0.04444221407175064,
-0.04362112283706665,
0.07670406997203827,
-0.011973385699093342,
-0.049374524503946304,
-0.10774920135736465,
0.12133744359016418,
0.019686223939061165,
-0.03302404284477234,
0.02985590137541294,
-0.0611346960067749,
0.014556159265339375,
0.07047434896230698,
0.140419140458107,
0.007506169844418764,
-0.0293577928096056,
0.02145625464618206,
-0.04557384178042412,
0.1374417543411255,
-0.11478488892316818,
0.0016024074284359813,
0.08989328891038895,
0.05637018382549286,
0.05173013359308243,
0.025201523676514626,
-0.12817779183387756,
0.030403699725866318,
0.059574007987976074,
-0.1177675724029541,
-0.043077949434518814,
-0.023087341338396072,
0.05079008266329765,
-0.06451214104890823,
0.061714544892311096,
0.15147921442985535,
-0.13751991093158722,
-0.041142094880342484,
0.018182015046477318,
0.05934203788638115,
-0.05270092934370041,
0.09684748202562332,
0.02589309774339199,
0.020000841468572617,
-0.0717339739203453,
0.08162085711956024,
0.020940328016877174,
-0.0035649475175887346,
0.02293143793940544,
0.13053813576698303,
-0.11858068406581879,
-0.07200486958026886,
-0.03170401230454445,
0.03420960158109665,
-0.0913470983505249,
-0.04476150497794151,
-0.04542236030101776,
-0.09540745615959167,
-0.018963715061545372,
0.10216791927814484,
0.019657833501696587,
0.04066433012485504,
-0.05798233672976494,
0.015478058718144894,
-0.15913230180740356,
0.01072231587022543,
0.022176673635840416,
0.04324747994542122,
-0.05257740244269371,
0.11355366557836533,
0.008928630501031876,
0.008805515244603157,
-0.03480823338031769,
-0.013491295278072357,
-0.11576254665851593,
-0.03213702514767647,
-0.08279824256896973,
0.02871152013540268,
-0.08102323859930038,
0.03206690400838852,
0.02760929800570011,
0.005282216239720583,
0.009035788476467133,
0.06360772997140884,
-0.02011934109032154,
-0.000861472450196743,
-0.04482170566916466,
0.04159453883767128,
-0.02699855901300907,
-0.0010839427122846246,
0.03479320555925369,
-0.0845985859632492,
0.1050947904586792,
0.017805805429816246,
-0.014082247391343117,
-0.001053922576829791,
-0.12450730055570602,
0.05968460068106651,
0.020398039370775223,
-0.03497787192463875,
0.0024624825455248356,
-0.23454467952251434,
0.001986681018024683,
-0.013728045858442783,
-0.031200844794511795,
-0.007169406861066818,
0.06972477585077286,
-0.0549420528113842,
0.009779154323041439,
0.027077464386820793,
0.03994419053196907,
-0.09030264616012573,
0.024777686223387718,
0.09206853061914444,
0.09627640247344971,
0.04005971550941467,
-0.03574775159358978,
0.06268057227134705,
-0.052438534796237946,
-0.01869697868824005,
0.007127527147531509,
0.03601418808102608,
-0.03860250115394592,
-0.029807619750499725,
0.06325230002403259,
0.011636345647275448,
0.07311875373125076,
-0.049460239708423615,
0.03193848952651024,
-0.006560776848345995,
0.0021599235478788614,
-0.06506115198135376,
0.007754203397780657,
0.07811011373996735,
-0.05612484738230705,
0.007975693792104721,
-0.04001295566558838,
0.002784647047519684,
-0.03228668123483658,
-0.09374448657035828,
0.1470770239830017,
0.04832354560494423,
-0.07924500852823257,
0.1095755323767662,
-0.031682003289461136,
-0.0677686408162117,
0.01601998880505562,
0.07229925692081451,
-0.019440781325101852,
0.0319715179502964,
-0.06565054506063461,
0.048858121037483215,
0.13944709300994873,
-0.09476131945848465,
0.05282260850071907,
0.02500375732779503,
-0.06039270758628845,
-0.004341831896454096,
-0.10328075289726257,
-0.005367676727473736,
-0.04586289823055267,
-0.013665057718753815,
-0.09178460389375687,
0.025734391063451767,
-0.004378582816570997,
0.04369098320603371,
-0.025132540613412857,
0.14245621860027313,
0.01731635071337223,
-0.08390769362449646,
0.06020486354827881,
0.00857515074312687,
0.028895333409309387,
-0.011541219428181648,
0.002176918089389801,
0.04098561778664589,
0.0602031946182251,
0.06602248549461365,
0.0162750743329525,
0.08212786912918091,
0.031620170921087265,
0.01765468902885914,
-0.07075997442007065,
-0.00397422444075346,
-0.020400287583470345,
0.0006022051675245166,
0.13234388828277588,
0.061688244342803955,
-0.015053289011120796,
0.012429962866008282,
0.17225739359855652,
-0.06037062779068947,
-0.08640988171100616,
-0.18487221002578735,
0.1689763069152832,
-0.054229333996772766,
-0.04684289172291756,
0.013183536939322948,
-0.07588431239128113,
0.008685620501637459,
0.16744431853294373,
0.06547968834638596,
-0.029318837448954582,
0.011060194112360477,
-0.0004177863011136651,
-0.01752387173473835,
-0.013301846571266651,
0.06500418484210968,
-0.04016253352165222,
0.1829679310321808,
-0.07491482049226761,
0.08402546495199203,
-0.03213977813720703,
-0.0038299316074699163,
-0.02111225388944149,
0.0105074942111969,
-0.04470853507518768,
-0.043518416583538055,
-0.04620896652340889,
0.07925743609666824,
-0.07301018387079239,
-0.2380715161561966,
0.11031787097454071,
-0.03217640891671181,
-0.06138316169381142,
-0.016566788777709007,
-0.047113895416259766,
-0.016958896070718765,
0.10003099590539932,
-0.0046598478220403194,
0.03572051227092743,
0.176324725151062,
0.008405164815485477,
-0.013441993854939938,
-0.039761655032634735,
0.011641082353889942,
-0.01925736479461193,
0.16098608076572418,
-0.012637246400117874,
0.06786992400884628,
0.07935784012079239,
0.03268999233841896,
-0.1909998208284378,
0.002425137208774686,
-0.022655731067061424,
-0.08239387720823288,
0.024543190374970436,
0.1349446177482605,
-0.017643984407186508,
0.020681381225585938,
0.03406824916601181,
-0.039677903056144714,
0.014209484681487083,
-0.030209800228476524,
0.06244268640875816,
-0.06989523768424988,
0.021881768479943275,
-0.06624830514192581,
0.19945316016674042,
0.1147623062133789,
-0.039893988519907,
-0.031115224584937096,
-0.03393222764134407,
0.07683145254850388,
0.04502282291650772,
-0.017079781740903854,
-0.009780394844710827,
-0.09681174159049988,
-0.013995782472193241,
-0.03594644367694855,
0.05861731618642807,
-0.15925118327140808,
-0.025849109515547752,
-0.00965851079672575,
-0.026120411232113838,
0.014367174357175827,
0.0375652015209198,
0.04227178171277046,
0.05193251371383667,
0.0036504012532532215,
-0.037234801799058914,
0.023534368723630905,
0.048096269369125366,
-0.072197325527668,
-0.06086375191807747
] |
null | null |
transformers
|
# Model Card: clip-rsicd
## Model Details
This model is a finetuned [CLIP by OpenAI](https://huggingface.co/openai/clip-vit-base-patch32). It is designed with an aim to improve zero-shot image classification, text-to-image and image-to-image retrieval specifically on remote sensing images.
### Model Date
July 2021
### Model Type
The base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
### Model Version
We release several checkpoints for `clip-rsicd` model. Refer to [our github repo](https://github.com/arampacha/CLIP-rsicd#evaluation-results) for performance metrics on zero-shot classification for each of those.
### Training
To reproduce the fine-tuning procedure one can use released [script](https://github.com/arampacha/CLIP-rsicd/blob/master/run_clip_flax_tv.py).
The model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.
Full log of the training run can be found on [WandB](https://wandb.ai/wandb/hf-flax-clip-rsicd/runs/1ts243k3).
### Demo
Check out the model text-to-image and image-to-image capabilities using [this demo](https://huggingface.co/spaces/sujitpal/clip-rsicd-demo).
### Documents
- [Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/Finetuning_CLIP_with_HF_and_jax.ipynb)
### Use with Transformers
```py
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("flax-community/clip-rsicd")
processor = CLIPProcessor.from_pretrained("flax-community/clip-rsicd")
url = "https://raw.githubusercontent.com/arampacha/CLIP-rsicd/master/data/stadium_1.jpg"
image = Image.open(requests.get(url, stream=True).raw)
labels = ["residential area", "playground", "stadium", "forrest", "airport"]
inputs = processor(text=[f"a photo of a {l}" for l in labels], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
for l, p in zip(labels, probs[0]):
print(f"{l:<16} {p:.4f}")
```
[Try it on colab](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/clip_rsicd_zero_shot.ipynb)
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.
In addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found [here](https://github.com/arampacha/CLIP-rsicd#applications). In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.
We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The model was trained on publicly available remote sensing image captions datasets. Namely [RSICD](https://github.com/201528014227051/RSICD_optimal), [UCM](https://mega.nz/folder/wCpSzSoS#RXzIlrv--TDt3ENZdKN8JA) and [Sydney](https://mega.nz/folder/pG4yTYYA#4c4buNFLibryZnlujsrwEQ). More information on the datasets used can be found on [our project page](https://github.com/arampacha/CLIP-rsicd#dataset).
## Performance and Limitations
### Performance
| Model-name | k=1 | k=3 | k=5 | k=10 |
| -------------------------------- | ----- | ----- | ----- | ----- |
| original CLIP | 0.572 | 0.745 | 0.837 | 0.939 |
| clip-rsicd (this model) | 0.843 | 0.958 | 0.977 | 0.993 |
## Limitations
The model is finetuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to [CLIP model card](https://huggingface.co/openai/clip-vit-base-patch32#limitations) for details on those.
|
{"tags": ["vision"]}
|
zero-shot-image-classification
|
flax-community/clip-rsicd
|
[
"transformers",
"pytorch",
"jax",
"clip",
"zero-shot-image-classification",
"vision",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us
|
Model Card: clip-rsicd
======================
Model Details
-------------
This model is a finetuned CLIP by OpenAI. It is designed with an aim to improve zero-shot image classification, text-to-image and image-to-image retrieval specifically on remote sensing images.
### Model Date
July 2021
### Model Type
The base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
### Model Version
We release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.
### Training
To reproduce the fine-tuning procedure one can use released script.
The model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.
Full log of the training run can be found on WandB.
### Demo
Check out the model text-to-image and image-to-image capabilities using this demo.
### Documents
* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU
### Use with Transformers
Try it on colab
Model Use
---------
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.
In addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.
We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
Data
----
The model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.
Performance and Limitations
---------------------------
### Performance
Limitations
-----------
The model is finetuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those.
|
[
"### Model Date\n\n\nJuly 2021",
"### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.",
"### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.",
"### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.",
"### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.",
"### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU",
"### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------",
"### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.",
"#### Primary intended uses\n\n\nThe primary intended users of these models are AI researchers.\n\n\nWe primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.\n\n\nData\n----\n\n\nThe model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.\n\n\nPerformance and Limitations\n---------------------------",
"### Performance\n\n\n\nLimitations\n-----------\n\n\nThe model is finetuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those."
] |
[
"TAGS\n#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us \n",
"### Model Date\n\n\nJuly 2021",
"### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.",
"### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.",
"### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.",
"### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.",
"### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU",
"### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------",
"### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.",
"#### Primary intended uses\n\n\nThe primary intended users of these models are AI researchers.\n\n\nWe primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.\n\n\nData\n----\n\n\nThe model was trained on publicly available remote sensing image captions datasets. Namely RSICD, UCM and Sydney. More information on the datasets used can be found on our project page.\n\n\nPerformance and Limitations\n---------------------------",
"### Performance\n\n\n\nLimitations\n-----------\n\n\nThe model is finetuned on RSI data but can contain some biases and limitations of the original CLIP model. Refer to CLIP model card for details on those."
] |
[
41,
6,
72,
40,
70,
24,
32,
16,
149,
114,
45
] |
[
"passage: TAGS\n#transformers #pytorch #jax #clip #zero-shot-image-classification #vision #endpoints_compatible #has_space #region-us \n### Model Date\n\n\nJuly 2021### Model Type\n\n\nThe base model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.### Model Version\n\n\nWe release several checkpoints for 'clip-rsicd' model. Refer to our github repo for performance metrics on zero-shot classification for each of those.### Training\n\n\nTo reproduce the fine-tuning procedure one can use released script.\nThe model was trained using batch size 1024, adafactor optimizer with linear warmup and decay with peak learning rate 1e-4 on 1 TPU-v3-8.\nFull log of the training run can be found on WandB.### Demo\n\n\nCheck out the model text-to-image and image-to-image capabilities using this demo.### Documents\n\n\n* Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU### Use with Transformers\n\n\nTry it on colab\n\n\nModel Use\n---------### Intended Use\n\n\nThe model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.\n\n\nIn addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found here. In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.\n\n\nWe also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis."
] |
[
-0.08165604621171951,
0.094379723072052,
-0.002018871484324336,
0.06923872977495193,
0.06741392612457275,
0.02109570987522602,
0.1355014145374298,
0.07218223810195923,
0.027770821005105972,
0.0008421421516686678,
-0.00004227369572618045,
-0.052550602704286575,
0.0611286424100399,
0.132141575217247,
0.06712324917316437,
-0.17483830451965332,
0.006232121028006077,
-0.04879440367221832,
0.0027038666885346174,
0.08857088536024094,
0.08119340240955353,
-0.07309627532958984,
0.08379597961902618,
0.04483843967318535,
-0.12159107625484467,
-0.030570587143301964,
-0.06397073715925217,
-0.03443704545497894,
0.11774308234453201,
0.07971224933862686,
0.09109725058078766,
0.000400802178774029,
0.07225379347801208,
-0.08387213945388794,
0.03655797615647316,
0.09271373599767685,
0.002465922385454178,
0.08671756833791733,
0.050362054258584976,
0.0009538587182760239,
0.19687296450138092,
-0.04583581164479256,
0.03325600549578667,
0.058852680027484894,
-0.0717315524816513,
-0.0255725234746933,
-0.08458180725574493,
0.0487842783331871,
0.04445780813694,
0.07322708517313004,
0.00682765943929553,
0.1143336370587349,
-0.06306440383195877,
0.06345519423484802,
0.12800778448581696,
-0.12934815883636475,
-0.06369981169700623,
0.09033551067113876,
0.11699284613132477,
0.0538807176053524,
-0.06049701198935509,
0.017697619274258614,
0.027204807847738266,
0.016759993508458138,
0.0489618182182312,
-0.013401323929429054,
-0.07074661552906036,
-0.040461812168359756,
-0.1050950214266777,
-0.016101084649562836,
0.05165102705359459,
-0.043673958629369736,
-0.11023422330617905,
-0.1025400161743164,
0.023786911740899086,
-0.0801653116941452,
0.04704376682639122,
-0.004598706495016813,
0.015504658222198486,
0.001443632529117167,
0.01751425489783287,
-0.05674387887120247,
-0.12499616295099258,
-0.06572466343641281,
-0.0016760891303420067,
0.19935916364192963,
0.014462464489042759,
0.03712546452879906,
-0.04858627915382385,
0.1265409141778946,
0.03582543879747391,
-0.042939163744449615,
-0.07172362506389618,
-0.05375022813677788,
-0.1411871612071991,
-0.051585495471954346,
-0.05445589870214462,
-0.0641053169965744,
-0.012432277202606201,
0.09564007818698883,
-0.06627592444419861,
0.012735885567963123,
0.03280004858970642,
0.04918929934501648,
0.11117026209831238,
0.11233571916818619,
-0.03360825404524803,
0.0055211191065609455,
0.05821453407406807,
-0.08437852561473846,
-0.008845017291605473,
-0.02453591860830784,
-0.04261466860771179,
-0.03031347133219242,
0.032403767108917236,
0.0657460168004036,
-0.0036336020566523075,
-0.012533647008240223,
-0.039687808603048325,
-0.03814442828297615,
0.15946947038173676,
-0.051247432827949524,
0.036236509680747986,
0.026388105005025864,
-0.005543647799640894,
0.012234527617692947,
0.08448603749275208,
-0.035808365792036057,
-0.054002825170755386,
0.018780237063765526,
-0.07055316865444183,
0.005843064747750759,
-0.06482845544815063,
-0.11086374521255493,
0.025965768843889236,
-0.1290939599275589,
0.02257459983229637,
-0.19151782989501953,
-0.09852084517478943,
-0.03560423105955124,
0.04750274121761322,
-0.06537942588329315,
0.002705963561311364,
0.01612795889377594,
-0.05436870828270912,
-0.06117945909500122,
0.06385054439306259,
0.023157648742198944,
-0.028751954436302185,
0.017073580995202065,
-0.07087033987045288,
0.06778300553560257,
-0.016739388927817345,
-0.04087192937731743,
-0.07287441194057465,
-0.012616954743862152,
-0.11695373803377151,
0.0915067046880722,
-0.05020185187458992,
0.07298143953084946,
-0.09656257182359695,
-0.04685980826616287,
0.01454290933907032,
0.0007148655713535845,
0.051509540528059006,
0.13707168400287628,
-0.14493834972381592,
0.016973581165075302,
0.06941717118024826,
-0.14844371378421783,
-0.04700044170022011,
0.13403049111366272,
-0.027545897290110588,
0.11366541683673859,
0.050646498799324036,
0.029819870367646217,
0.13715383410453796,
-0.16814206540584564,
-0.009234601631760597,
-0.01177898794412613,
-0.09890224784612656,
-0.013659490272402763,
0.04084966331720352,
0.06585333496332169,
-0.05071514472365379,
-0.023620814085006714,
-0.08111082762479782,
0.053463369607925415,
-0.02843027003109455,
-0.05054890736937523,
-0.02367180585861206,
-0.05160989984869957,
-0.04972723126411438,
-0.01902354136109352,
0.01588946208357811,
-0.011323128826916218,
-0.06977099925279617,
-0.09296676516532898,
0.08322236686944962,
-0.03340248391032219,
0.0286115650087595,
-0.08171392977237701,
0.03777683898806572,
-0.0544268898665905,
-0.019230222329497337,
-0.1097058579325676,
-0.01100238598883152,
0.0329224094748497,
-0.08032408356666565,
0.06946392357349396,
0.03223885968327522,
0.0483759306371212,
0.07147734612226486,
-0.003984705079346895,
0.023225324228405952,
-0.053552400320768356,
0.0009091193787753582,
-0.05252484977245331,
-0.13454732298851013,
-0.06337014585733414,
-0.05032864212989807,
0.0506284162402153,
-0.10905864834785461,
-0.0007101760129444301,
0.03010779246687889,
0.0996529832482338,
0.04233585298061371,
-0.06995532661676407,
0.04382835328578949,
0.0011684909695759416,
-0.025583527982234955,
-0.08503325283527374,
0.008946186862885952,
0.0782303661108017,
-0.0820588544011116,
0.06920577585697174,
-0.01653180830180645,
-0.1277017891407013,
0.04264640808105469,
0.0649801567196846,
-0.15798220038414001,
-0.023488998413085938,
-0.013842049986124039,
-0.02350679598748684,
-0.0946100726723671,
-0.004299513064324856,
0.26139432191848755,
0.0070538329891860485,
0.027727628126740456,
-0.06500525027513504,
-0.03680505231022835,
0.037623703479766846,
-0.022192850708961487,
0.05409586429595947,
0.04126553609967232,
0.05698408931493759,
-0.15854573249816895,
-0.014898449182510376,
-0.06078911945223808,
-0.10957928746938705,
0.11787011474370956,
0.027048004791140556,
-0.08004547655582428,
-0.02343764528632164,
0.004581608809530735,
0.059530749917030334,
0.17655955255031586,
-0.0011810599826276302,
0.007413347717374563,
0.009499868378043175,
0.0001487391855334863,
0.024623053148388863,
-0.09815356135368347,
0.05033217370510101,
0.03022152930498123,
0.030936183407902718,
-0.05109291151165962,
-0.02740790881216526,
-0.056357815861701965,
0.03177327662706375,
0.028699176385998726,
0.027449917048215866,
-0.004491014406085014,
0.003486750414595008,
-0.1015700250864029,
0.194843590259552,
-0.052325375378131866,
-0.19160783290863037,
-0.09887855499982834,
0.1520724892616272,
0.06008546054363251,
-0.011511877179145813,
0.024637803435325623,
-0.03970732167363167,
-0.07362305372953415,
-0.06896757334470749,
0.033966176211833954,
-0.029247375205159187,
-0.013900483027100563,
-0.029506247490644455,
0.03303037956357002,
-0.011487691663205624,
-0.11733677983283997,
0.037504490464925766,
0.04754508286714554,
-0.013954322785139084,
0.07691586762666702,
0.010362106375396252,
0.04146198183298111,
0.07374085485935211,
-0.05853100121021271,
0.0010865925578400493,
-0.004031194839626551,
0.13325221836566925,
-0.09455248713493347,
0.09817247837781906,
0.10229724645614624,
-0.008562522009015083,
-0.0028998591005802155,
0.01725156605243683,
0.006123492494225502,
-0.05681459978222847,
0.03140075504779816,
-0.005163196008652449,
-0.07397118210792542,
-0.16032429039478302,
-0.038219649344682693,
-0.05476169288158417,
0.004379113204777241,
0.08179590106010437,
0.027540605515241623,
0.007200487423688173,
0.0802076980471611,
0.033872563391923904,
0.00897279754281044,
-0.008153279311954975,
0.11796298623085022,
-0.062997005879879,
0.018739450722932816,
0.023982588201761246,
-0.041952986270189285,
0.03272920101881027,
0.08189341425895691,
0.04557371139526367,
0.21238721907138824,
-0.0681299939751625,
0.11877286434173584,
0.05044279620051384,
-0.0019236538792029023,
0.07983534038066864,
0.0895635113120079,
-0.00614011799916625,
-0.007272453512996435,
-0.006878504995256662,
-0.03746551647782326,
-0.035272158682346344,
0.04795281961560249,
-0.032324858009815216,
-0.039895717054605484,
0.013404116965830326,
-0.019164463505148888,
0.02448236383497715,
0.15447621047496796,
0.022818585857748985,
-0.20052585005760193,
-0.13669458031654358,
-0.012964930385351181,
-0.02649284154176712,
-0.09236694872379303,
0.0018418183317407966,
0.13758209347724915,
-0.10495976358652115,
-0.05732748284935951,
-0.038618654012680054,
0.08009732514619827,
-0.1116734966635704,
-0.0549461655318737,
-0.008932595141232014,
0.10047998279333115,
-0.022903351113200188,
0.038142260164022446,
-0.06168820336461067,
0.04260191321372986,
-0.013135996647179127,
0.13060489296913147,
-0.08879850804805756,
0.012573190964758396,
0.046000681817531586,
0.05175285413861275,
0.1344618797302246,
0.041735030710697174,
0.003913787193596363,
-0.024797657504677773,
-0.09699543565511703,
0.05537624657154083,
-0.01033325307071209,
-0.07237470895051956,
0.07794631272554398,
-0.0009070865926332772,
0.020807180553674698,
-0.0425255186855793,
-0.08268056809902191,
-0.1289445459842682,
-0.1210329607129097,
0.035875141620635986,
-0.060113970190286636,
0.07461969554424286,
-0.04412256181240082,
-0.05740866810083389,
0.019580088555812836,
0.1732615828514099,
-0.0842205286026001,
-0.08146032691001892,
-0.12888559699058533,
-0.01649116910994053,
0.10238195210695267,
-0.006419336888939142,
0.05145445093512535,
0.028714075684547424,
0.1339661031961441,
-0.06264077126979828,
-0.05605432391166687,
0.016081953421235085,
-0.14134946465492249,
-0.12318762391805649,
-0.031188566237688065,
0.03662151098251343,
0.14662878215312958,
0.05233723670244217,
0.05323529615998268,
0.0210321843624115,
-0.05419425666332245,
-0.10927881300449371,
-0.05325360596179962,
0.0624130442738533,
-0.0013572723837569356,
0.015962406992912292,
0.008479185402393341,
0.050693340599536896,
-0.01639411225914955,
-0.040679238736629486,
0.054094262421131134,
0.1328376978635788,
-0.08458255976438522,
0.07936567813158035,
0.09735994786024094,
-0.05016939714550972,
-0.16769739985466003,
-0.007176279556006193,
0.06201664358377457,
0.09725513309240341,
-0.022156978026032448,
-0.16623662412166595,
0.06655817478895187,
0.01250290498137474,
0.00609692744910717,
0.11857347190380096,
-0.2157820165157318,
-0.0743076279759407,
0.0620870403945446,
0.07828540354967117,
0.06930931657552719,
-0.08429025113582611,
-0.0031932054553180933,
0.015026325359940529,
-0.08450428396463394,
0.16493871808052063,
-0.09580710530281067,
0.08134076744318008,
-0.01816626824438572,
-0.010419249534606934,
0.0257167536765337,
-0.056585222482681274,
0.129123717546463,
-0.03458808735013008,
0.0350693017244339,
-0.029030701145529747,
0.025632085278630257,
-0.01573733426630497,
-0.050955139100551605,
0.08457338809967041,
-0.015013479627668858,
0.08881301432847977,
-0.040270280092954636,
-0.04444221407175064,
-0.04362112283706665,
0.07670406997203827,
-0.011973385699093342,
-0.049374524503946304,
-0.10774920135736465,
0.12133744359016418,
0.019686223939061165,
-0.03302404284477234,
0.02985590137541294,
-0.0611346960067749,
0.014556159265339375,
0.07047434896230698,
0.140419140458107,
0.007506169844418764,
-0.0293577928096056,
0.02145625464618206,
-0.04557384178042412,
0.1374417543411255,
-0.11478488892316818,
0.0016024074284359813,
0.08989328891038895,
0.05637018382549286,
0.05173013359308243,
0.025201523676514626,
-0.12817779183387756,
0.030403699725866318,
0.059574007987976074,
-0.1177675724029541,
-0.043077949434518814,
-0.023087341338396072,
0.05079008266329765,
-0.06451214104890823,
0.061714544892311096,
0.15147921442985535,
-0.13751991093158722,
-0.041142094880342484,
0.018182015046477318,
0.05934203788638115,
-0.05270092934370041,
0.09684748202562332,
0.02589309774339199,
0.020000841468572617,
-0.0717339739203453,
0.08162085711956024,
0.020940328016877174,
-0.0035649475175887346,
0.02293143793940544,
0.13053813576698303,
-0.11858068406581879,
-0.07200486958026886,
-0.03170401230454445,
0.03420960158109665,
-0.0913470983505249,
-0.04476150497794151,
-0.04542236030101776,
-0.09540745615959167,
-0.018963715061545372,
0.10216791927814484,
0.019657833501696587,
0.04066433012485504,
-0.05798233672976494,
0.015478058718144894,
-0.15913230180740356,
0.01072231587022543,
0.022176673635840416,
0.04324747994542122,
-0.05257740244269371,
0.11355366557836533,
0.008928630501031876,
0.008805515244603157,
-0.03480823338031769,
-0.013491295278072357,
-0.11576254665851593,
-0.03213702514767647,
-0.08279824256896973,
0.02871152013540268,
-0.08102323859930038,
0.03206690400838852,
0.02760929800570011,
0.005282216239720583,
0.009035788476467133,
0.06360772997140884,
-0.02011934109032154,
-0.000861472450196743,
-0.04482170566916466,
0.04159453883767128,
-0.02699855901300907,
-0.0010839427122846246,
0.03479320555925369,
-0.0845985859632492,
0.1050947904586792,
0.017805805429816246,
-0.014082247391343117,
-0.001053922576829791,
-0.12450730055570602,
0.05968460068106651,
0.020398039370775223,
-0.03497787192463875,
0.0024624825455248356,
-0.23454467952251434,
0.001986681018024683,
-0.013728045858442783,
-0.031200844794511795,
-0.007169406861066818,
0.06972477585077286,
-0.0549420528113842,
0.009779154323041439,
0.027077464386820793,
0.03994419053196907,
-0.09030264616012573,
0.024777686223387718,
0.09206853061914444,
0.09627640247344971,
0.04005971550941467,
-0.03574775159358978,
0.06268057227134705,
-0.052438534796237946,
-0.01869697868824005,
0.007127527147531509,
0.03601418808102608,
-0.03860250115394592,
-0.029807619750499725,
0.06325230002403259,
0.011636345647275448,
0.07311875373125076,
-0.049460239708423615,
0.03193848952651024,
-0.006560776848345995,
0.0021599235478788614,
-0.06506115198135376,
0.007754203397780657,
0.07811011373996735,
-0.05612484738230705,
0.007975693792104721,
-0.04001295566558838,
0.002784647047519684,
-0.03228668123483658,
-0.09374448657035828,
0.1470770239830017,
0.04832354560494423,
-0.07924500852823257,
0.1095755323767662,
-0.031682003289461136,
-0.0677686408162117,
0.01601998880505562,
0.07229925692081451,
-0.019440781325101852,
0.0319715179502964,
-0.06565054506063461,
0.048858121037483215,
0.13944709300994873,
-0.09476131945848465,
0.05282260850071907,
0.02500375732779503,
-0.06039270758628845,
-0.004341831896454096,
-0.10328075289726257,
-0.005367676727473736,
-0.04586289823055267,
-0.013665057718753815,
-0.09178460389375687,
0.025734391063451767,
-0.004378582816570997,
0.04369098320603371,
-0.025132540613412857,
0.14245621860027313,
0.01731635071337223,
-0.08390769362449646,
0.06020486354827881,
0.00857515074312687,
0.028895333409309387,
-0.011541219428181648,
0.002176918089389801,
0.04098561778664589,
0.0602031946182251,
0.06602248549461365,
0.0162750743329525,
0.08212786912918091,
0.031620170921087265,
0.01765468902885914,
-0.07075997442007065,
-0.00397422444075346,
-0.020400287583470345,
0.0006022051675245166,
0.13234388828277588,
0.061688244342803955,
-0.015053289011120796,
0.012429962866008282,
0.17225739359855652,
-0.06037062779068947,
-0.08640988171100616,
-0.18487221002578735,
0.1689763069152832,
-0.054229333996772766,
-0.04684289172291756,
0.013183536939322948,
-0.07588431239128113,
0.008685620501637459,
0.16744431853294373,
0.06547968834638596,
-0.029318837448954582,
0.011060194112360477,
-0.0004177863011136651,
-0.01752387173473835,
-0.013301846571266651,
0.06500418484210968,
-0.04016253352165222,
0.1829679310321808,
-0.07491482049226761,
0.08402546495199203,
-0.03213977813720703,
-0.0038299316074699163,
-0.02111225388944149,
0.0105074942111969,
-0.04470853507518768,
-0.043518416583538055,
-0.04620896652340889,
0.07925743609666824,
-0.07301018387079239,
-0.2380715161561966,
0.11031787097454071,
-0.03217640891671181,
-0.06138316169381142,
-0.016566788777709007,
-0.047113895416259766,
-0.016958896070718765,
0.10003099590539932,
-0.0046598478220403194,
0.03572051227092743,
0.176324725151062,
0.008405164815485477,
-0.013441993854939938,
-0.039761655032634735,
0.011641082353889942,
-0.01925736479461193,
0.16098608076572418,
-0.012637246400117874,
0.06786992400884628,
0.07935784012079239,
0.03268999233841896,
-0.1909998208284378,
0.002425137208774686,
-0.022655731067061424,
-0.08239387720823288,
0.024543190374970436,
0.1349446177482605,
-0.017643984407186508,
0.020681381225585938,
0.03406824916601181,
-0.039677903056144714,
0.014209484681487083,
-0.030209800228476524,
0.06244268640875816,
-0.06989523768424988,
0.021881768479943275,
-0.06624830514192581,
0.19945316016674042,
0.1147623062133789,
-0.039893988519907,
-0.031115224584937096,
-0.03393222764134407,
0.07683145254850388,
0.04502282291650772,
-0.017079781740903854,
-0.009780394844710827,
-0.09681174159049988,
-0.013995782472193241,
-0.03594644367694855,
0.05861731618642807,
-0.15925118327140808,
-0.025849109515547752,
-0.00965851079672575,
-0.026120411232113838,
0.014367174357175827,
0.0375652015209198,
0.04227178171277046,
0.05193251371383667,
0.0036504012532532215,
-0.037234801799058914,
0.023534368723630905,
0.048096269369125366,
-0.072197325527668,
-0.06086375191807747
] |
null | null |
transformers
|
# CLIP-Spanish
CLIP Spanish is a CLIP-like model for Spanish language. It is composed of [BERTIN](https://huggingface.co/bertin-project/bertin-roberta-base-spanish) as a language encoder and the ViT-B/32 image encoder from [CLIP](https://huggingface.co/openai/clip-vit-base-patch32). The model is implemented in [Flax](https://github.com/google/flax), including training scripts (see `training.md`).
This is part of the [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Spanish WIT
We used a subset of 141,230 Spanish captions from the [WIT dataset](https://github.com/google-research-datasets/wit) for training.
## Team members
- Eduardo González Ponferrada ([edugp](https://huggingface.co/edugp))
- Manu Romero ([mrm8488](https://huggingface.co/))
- María Grandury ([mariagrandury](https://huggingface.co/))
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
- [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
- [Hybrid CLIP example scripts](https://github.com/huggingface/transformers/tree/master/examples/research_projects/jax-projects/hybrid_clip)
- [Model Repository](https://huggingface.co/flax-community/bertin-roberta-large-spanish/)
|
{"language": "es", "license": "cc-by-4.0", "tags": ["spanish", "roberta", "vit"]}
| null |
flax-community/clip-spanish
|
[
"transformers",
"jax",
"hybrid-clip",
"spanish",
"roberta",
"vit",
"es",
"license:cc-by-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #jax #hybrid-clip #spanish #roberta #vit #es #license-cc-by-4.0 #endpoints_compatible #has_space #region-us
|
# CLIP-Spanish
CLIP Spanish is a CLIP-like model for Spanish language. It is composed of BERTIN as a language encoder and the ViT-B/32 image encoder from CLIP. The model is implemented in Flax, including training scripts (see 'URL').
This is part of the Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## Spanish WIT
We used a subset of 141,230 Spanish captions from the WIT dataset for training.
## Team members
- Eduardo González Ponferrada (edugp)
- Manu Romero (mrm8488)
- María Grandury (mariagrandury)
## Useful links
- Community Week timeline
- Community Week README
- Community Week thread
- Community Week channel
- Hybrid CLIP example scripts
- Model Repository
|
[
"# CLIP-Spanish\n\nCLIP Spanish is a CLIP-like model for Spanish language. It is composed of BERTIN as a language encoder and the ViT-B/32 image encoder from CLIP. The model is implemented in Flax, including training scripts (see 'URL').\nThis is part of the Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Spanish WIT\n\nWe used a subset of 141,230 Spanish captions from the WIT dataset for training.",
"## Team members\n\n- Eduardo González Ponferrada (edugp)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Hybrid CLIP example scripts\n- Model Repository"
] |
[
"TAGS\n#transformers #jax #hybrid-clip #spanish #roberta #vit #es #license-cc-by-4.0 #endpoints_compatible #has_space #region-us \n",
"# CLIP-Spanish\n\nCLIP Spanish is a CLIP-like model for Spanish language. It is composed of BERTIN as a language encoder and the ViT-B/32 image encoder from CLIP. The model is implemented in Flax, including training scripts (see 'URL').\nThis is part of the Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Spanish WIT\n\nWe used a subset of 141,230 Spanish captions from the WIT dataset for training.",
"## Team members\n\n- Eduardo González Ponferrada (edugp)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)",
"## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Hybrid CLIP example scripts\n- Model Repository"
] |
[
48,
97,
25,
31,
34
] |
[
"passage: TAGS\n#transformers #jax #hybrid-clip #spanish #roberta #vit #es #license-cc-by-4.0 #endpoints_compatible #has_space #region-us \n# CLIP-Spanish\n\nCLIP Spanish is a CLIP-like model for Spanish language. It is composed of BERTIN as a language encoder and the ViT-B/32 image encoder from CLIP. The model is implemented in Flax, including training scripts (see 'URL').\nThis is part of the Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## Spanish WIT\n\nWe used a subset of 141,230 Spanish captions from the WIT dataset for training.## Team members\n\n- Eduardo González Ponferrada (edugp)\n- Manu Romero (mrm8488)\n- María Grandury (mariagrandury)## Useful links\n\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Hybrid CLIP example scripts\n- Model Repository"
] |
[
-0.11662507057189941,
0.0716358944773674,
-0.0025835512205958366,
0.06878375262022018,
0.03649739548563957,
0.029078757390379906,
0.11297153681516647,
0.07208187133073807,
0.05941995978355408,
0.04945417493581772,
-0.015182389877736568,
0.05152309313416481,
0.06760290265083313,
0.11238609999418259,
-0.03684648126363754,
-0.20300273597240448,
-0.029907919466495514,
0.0063227578066289425,
-0.12623819708824158,
0.1026233434677124,
0.06193269044160843,
0.0016610536258667707,
0.0627155750989914,
0.025069762021303177,
-0.029676256701350212,
0.04017324000597,
0.0035554387141019106,
-0.12711820006370544,
0.1473473310470581,
0.10283825546503067,
0.07267449051141739,
0.03476574644446373,
-0.01530690211802721,
0.003321786178275943,
0.03617367893457413,
0.06711382418870926,
-0.034596577286720276,
0.014547217637300491,
0.12324535846710205,
0.09015853703022003,
0.07292994856834412,
-0.044280849397182465,
-0.0069193453527987,
0.038474954664707184,
-0.23108893632888794,
-0.061885979026556015,
-0.13066138327121735,
-0.06475163996219635,
0.000122873650980182,
0.06144716590642929,
0.017789559438824654,
0.20590226352214813,
-0.19485804438591003,
-0.021096020936965942,
0.14237116277217865,
-0.14061400294303894,
-0.07844166457653046,
0.1178235113620758,
0.20841550827026367,
0.0068499003536999226,
-0.014818248338997364,
0.0034307162277400494,
0.07134111225605011,
-0.02817625366151333,
-0.052430227398872375,
-0.06327831745147705,
-0.12853282690048218,
-0.07107994705438614,
-0.08549834787845612,
-0.007176711689680815,
0.20480790734291077,
0.02692246623337269,
0.04019688442349434,
0.01214703731238842,
-0.05248278006911278,
0.17247697710990906,
-0.08279486745595932,
-0.006503842305392027,
0.041914455592632294,
0.06344712525606155,
0.08253076672554016,
0.06517566740512848,
-0.046392571181058884,
-0.012194815091788769,
-0.08327969908714294,
0.14078861474990845,
-0.011469703167676926,
0.051275935024023056,
-0.18720796704292297,
-0.010760344564914703,
-0.030671173706650734,
-0.03400709107518196,
-0.00009422016591997817,
0.03247906267642975,
0.1294616013765335,
-0.05382101982831955,
0.08027112483978271,
-0.00936722569167614,
0.11442255973815918,
0.05765795335173607,
-0.13117115199565887,
-0.004263976123183966,
0.04830115661025047,
0.05749921500682831,
0.062041986733675,
-0.05068107694387436,
0.06695377826690674,
-0.08923956751823425,
0.03615691140294075,
-0.0344923734664917,
-0.0774519294500351,
0.016768639907240868,
-0.052188023924827576,
-0.01756170578300953,
-0.12925225496292114,
0.01901995576918125,
0.11721082776784897,
0.10535502433776855,
-0.020582467317581177,
-0.00816261861473322,
0.17072176933288574,
-0.06656574457883835,
0.07621309161186218,
0.022969922050833702,
-0.05948524549603462,
0.15727944672107697,
-0.06510220468044281,
-0.07013040035963058,
-0.03745643049478531,
-0.047783877700567245,
-0.07645959407091141,
-0.02732831984758377,
-0.06693204492330551,
-0.10132460296154022,
0.09749429672956467,
-0.16621831059455872,
-0.05576135590672493,
-0.18109872937202454,
-0.04767762869596481,
-0.12429738789796829,
-0.08360273391008377,
-0.0370858870446682,
-0.07995714247226715,
-0.07312264293432236,
0.02060297131538391,
0.001520837191492319,
-0.019858354702591896,
0.05080049857497215,
-0.12485847622156143,
0.08890603482723236,
-0.014324738644063473,
0.06799697130918503,
-0.08033017814159393,
0.0354602187871933,
0.007861165329813957,
0.06589989364147186,
-0.11857501417398453,
0.06132075935602188,
-0.012329488061368465,
0.07193406671285629,
-0.10330900549888611,
0.04696374386548996,
-0.07845398038625717,
0.08763935416936874,
-0.01056644693017006,
0.13496756553649902,
-0.1677253246307373,
-0.07636728882789612,
0.19480906426906586,
-0.06868826597929001,
-0.00825459510087967,
0.10664235800504684,
0.00793238915503025,
0.09304225444793701,
0.08222387731075287,
0.17752516269683838,
0.13484981656074524,
-0.14716953039169312,
0.05384692922234535,
0.07050897181034088,
-0.09825289994478226,
-0.11195246875286102,
-0.011358517222106457,
-0.005377460736781359,
0.008722102269530296,
-0.00019268906908109784,
-0.042339663952589035,
0.07083554565906525,
-0.063417948782444,
0.03474064916372299,
0.06063062325119972,
-0.0077901273034513,
0.035045135766267776,
-0.055184196680784225,
-0.010307174175977707,
-0.07599892467260361,
0.029947666451334953,
-0.00945013202726841,
0.02950785495340824,
0.07129491120576859,
0.044623471796512604,
-0.10550162196159363,
0.13460372388362885,
0.07511799037456512,
0.024236395955085754,
-0.055506665259599686,
0.04171352833509445,
0.013582227751612663,
0.20566579699516296,
-0.05044163763523102,
0.13310939073562622,
-0.006429793778806925,
-0.123469278216362,
0.01925770752131939,
0.04988507553935051,
-0.0004367211367934942,
-0.0348082110285759,
-0.03482355177402496,
-0.11729561537504196,
0.010285315103828907,
-0.027484139427542686,
-0.01714390702545643,
-0.21445389091968536,
-0.0031751145143061876,
0.13864731788635254,
0.021359475329518318,
-0.04782953858375549,
0.049772199243307114,
-0.018606198951601982,
0.11350339651107788,
-0.026647169142961502,
-0.043327782303094864,
0.11011186242103577,
0.01450210902839899,
0.007565150037407875,
0.20744672417640686,
-0.11384238302707672,
-0.12681974470615387,
0.07771801948547363,
-0.11277057230472565,
-0.06389831006526947,
0.050232868641614914,
0.012732543051242828,
-0.04438908025622368,
-0.13284361362457275,
-0.03287900611758232,
0.16578654944896698,
-0.032074764370918274,
0.11881458759307861,
-0.0933031290769577,
-0.024329129606485367,
0.021900935098528862,
-0.09940344095230103,
-0.010042404755949974,
0.08945595473051071,
-0.00841453392058611,
0.03758123144507408,
0.08243116736412048,
-0.02168942242860794,
0.13624383509159088,
0.2305465191602707,
-0.00549606466665864,
-0.00242498773150146,
0.07106811553239822,
0.014393859542906284,
0.043295372277498245,
-0.028460387140512466,
-0.08976775407791138,
-0.06297127902507782,
-0.039401549845933914,
0.11215562373399734,
0.024535659700632095,
-0.1580137014389038,
-0.03838970512151718,
-0.05399074777960777,
-0.0193515345454216,
-0.08243438601493835,
0.02891717664897442,
0.0370410680770874,
0.10215961188077927,
0.09484456479549408,
-0.03450646996498108,
0.04280268773436546,
-0.014601861126720905,
-0.09433569014072418,
0.11447632312774658,
-0.04656161367893219,
-0.27874624729156494,
-0.10992241650819778,
-0.10914186388254166,
0.11300209164619446,
-0.005199674516916275,
0.08765285462141037,
-0.03317746892571449,
0.012975319288671017,
0.006621840875595808,
0.20842982828617096,
0.0178791843354702,
-0.06118675321340561,
0.0020393612794578075,
0.08289474248886108,
0.025457410141825676,
-0.13892333209514618,
-0.0014445205451920629,
0.009937689639627934,
-0.06369521468877792,
0.08223283290863037,
-0.1503770649433136,
0.04369917884469032,
0.044913992285728455,
0.01258080918341875,
-0.026167215779423714,
0.0003137668827548623,
0.2827647626399994,
-0.14358295500278473,
0.08058807998895645,
0.19164007902145386,
-0.0012456460390239954,
0.008077869191765785,
0.010064972564578056,
0.03213686868548393,
-0.10405575484037399,
0.0038436902686953545,
0.052948206663131714,
-0.08156673610210419,
-0.2004447877407074,
-0.12256795167922974,
-0.11470822989940643,
0.061393335461616516,
0.06497672945261002,
0.05676569044589996,
-0.0775710940361023,
0.011029141955077648,
0.009496006183326244,
0.08197406679391861,
-0.014680294319987297,
0.008299596607685089,
0.16339510679244995,
0.026885515078902245,
-0.0545986033976078,
-0.06665460020303726,
-0.03365328535437584,
0.08566074073314667,
0.176785409450531,
0.02842302992939949,
0.03297966718673706,
0.10089097917079926,
0.07955561578273773,
0.06376708298921585,
0.1128372773528099,
-0.02224016562104225,
-0.0001811609254218638,
-0.013441530056297779,
-0.055480170994997025,
0.0016223995480686426,
-0.010543775744736195,
-0.05234668776392937,
0.009431062266230583,
-0.15137262642383575,
-0.025859246030449867,
-0.10281369835138321,
0.05022349953651428,
0.24205738306045532,
-0.0018427474424242973,
-0.12606240808963776,
-0.01438503060489893,
-0.016947515308856964,
-0.04807138442993164,
-0.03288744017481804,
0.05044437572360039,
0.13058002293109894,
-0.09545882046222687,
0.0687178447842598,
0.09376990795135498,
0.17390285432338715,
-0.13187827169895172,
0.005557223688811064,
-0.0711289569735527,
0.04442833736538887,
-0.03715689852833748,
0.03769411891698837,
-0.1911313384771347,
0.19662627577781677,
0.0021948295179754496,
0.0916551873087883,
-0.006969607900828123,
-0.05991686135530472,
-0.019467944279313087,
0.19513994455337524,
-0.013413160108029842,
0.10202311724424362,
0.039925072342157364,
-0.037250958383083344,
-0.11894173175096512,
-0.030550621449947357,
-0.11889740079641342,
0.001890769344754517,
0.02862115018069744,
0.022616706788539886,
0.005231320392340422,
-0.028561178594827652,
-0.10655384510755539,
-0.06218794360756874,
-0.06186775863170624,
0.025659672915935516,
0.14958159625530243,
0.07007907330989838,
-0.03279374539852142,
-0.05717708542943001,
-0.2445818930864334,
-0.04160568118095398,
-0.0003239930665586144,
0.02265467494726181,
-0.10445481538772583,
0.003985035233199596,
-0.031692083925008774,
-0.010853147134184837,
-0.04402203485369682,
0.10165349394083023,
-0.047324810177087784,
-0.05365728959441185,
-0.024843260645866394,
0.1461620032787323,
-0.08645801246166229,
-0.12328991293907166,
-0.08714395016431808,
0.04650600627064705,
0.10663724690675735,
-0.00675832899287343,
-0.0015249700518324971,
-0.03571445867419243,
0.06820442527532578,
0.00867420807480812,
-0.0670604482293129,
0.008167500607669353,
0.014942911453545094,
-0.10092131048440933,
0.006162363104522228,
-0.03849931061267853,
-0.004509848542511463,
-0.13043276965618134,
-0.01824602484703064,
0.07886017858982086,
-0.00912073440849781,
0.0873575434088707,
0.1486341506242752,
-0.0628993809223175,
-0.234224334359169,
-0.013087651692330837,
0.046946845948696136,
-0.04291645437479019,
0.02640223689377308,
-0.12467557191848755,
0.040570374578237534,
0.1430020034313202,
0.03355048596858978,
0.17604056000709534,
-0.40674611926078796,
-0.009059327654540539,
-0.09920381754636765,
-0.06102535128593445,
0.35769495368003845,
-0.05482093244791031,
-0.04843297228217125,
0.06373399496078491,
-0.3136785626411438,
0.054551515728235245,
-0.05215584486722946,
0.03131626918911934,
-0.02720065228641033,
0.0834946483373642,
-0.03663993254303932,
0.04675434157252312,
0.10260336846113205,
0.09110191464424133,
-0.028887327760457993,
-0.07285243272781372,
-0.10868851095438004,
0.06676197797060013,
-0.03455237299203873,
-0.03331488370895386,
-0.005614555440843105,
-0.0925503820180893,
-0.16536612808704376,
-0.025273658335208893,
-0.07041511684656143,
0.10546258091926575,
-0.051227867603302,
0.08633080124855042,
0.04417891055345535,
0.02900874800980091,
-0.008909173309803009,
-0.0027229285333305597,
0.10226434469223022,
-0.10844361782073975,
0.1582118272781372,
-0.011066357605159283,
0.13126882910728455,
-0.04592343047261238,
-0.18347018957138062,
-0.059679675847291946,
0.039155419915914536,
0.16946347057819366,
-0.059683170169591904,
-0.004049280192703009,
0.028266524896025658,
0.00013826011854689568,
0.060114551335573196,
0.03808262199163437,
-0.09444844722747803,
0.05749595910310745,
0.04881840944290161,
-0.01923949457705021,
-0.0004946365370415151,
0.055538829416036606,
0.058141421526670456,
0.025887012481689453,
-0.00943733286112547,
0.12534290552139282,
0.023749103769659996,
-0.05360545217990875,
-0.001448438735678792,
0.0022186648566275835,
-0.03283967822790146,
0.11594296246767044,
0.009961221367120743,
0.054580625146627426,
-0.09298619627952576,
0.028232712298631668,
0.0719107836484909,
-0.2042035311460495,
-0.05900302529335022,
0.23863685131072998,
-0.10012072324752808,
-0.09707805514335632,
0.09875668585300446,
0.12407761812210083,
-0.024887576699256897,
-0.14551086723804474,
-0.024589788168668747,
-0.1478053480386734,
0.09451521933078766,
0.23063518106937408,
-0.004662361927330494,
0.12412761151790619,
0.05945095419883728,
-0.029537774622440338,
-0.022002382203936577,
-0.07402137666940689,
0.0348958857357502,
0.0285186730325222,
0.040681611746549606,
0.03639357164502144,
0.033189963549375534,
0.06559115648269653,
-0.051049888134002686,
-0.038031142204999924,
-0.22587460279464722,
0.014445723034441471,
0.025308970361948013,
0.09226292371749878,
-0.08672010898590088,
0.05256836861371994,
-0.05210134759545326,
0.03284531086683273,
0.007848219946026802,
0.03772740066051483,
-0.03983034938573837,
0.004533989354968071,
-0.022342069074511528,
0.12480190396308899,
0.06225459277629852,
0.013742529787123203,
0.014925524592399597,
-0.021145202219486237,
0.04312992841005325,
-0.03668086603283882,
-0.08231090009212494,
-0.042444322258234024,
-0.13267342746257782,
-0.013247275725007057,
-0.0069767870008945465,
0.046696215867996216,
0.07545515894889832,
0.1266951560974121,
0.022267106920480728,
-0.03476245328783989,
0.07548818737268448,
0.006773049011826515,
0.0015528779476881027,
-0.1219729632139206,
0.04411061853170395,
-0.07031551748514175,
-0.17929181456565857,
0.010938795283436775,
-0.026489559561014175,
0.103431336581707,
0.026933562010526657,
0.0419834703207016,
-0.11560482531785965,
0.01255137100815773,
-0.017838129773736,
-0.014795597642660141,
0.04675830900669098,
-0.08611205220222473,
0.00697854720056057,
-0.012073568999767303,
0.08312594145536423,
0.050989314913749695,
0.07656611502170563,
0.16782307624816895,
-0.022560831159353256,
-0.0048640696331858635,
-0.041166484355926514,
-0.20288872718811035,
-0.015188337303698063,
0.10479060560464859,
-0.01957412250339985,
0.023959867656230927,
-0.08056451380252838,
0.02797788567841053,
0.03879464790225029,
-0.04333335533738136,
0.08490679413080215,
0.025994010269641876,
0.1228514239192009,
0.10025475174188614,
0.0038208512123674154,
-0.0698845311999321,
0.03053867071866989,
0.026932893320918083,
-0.07960700243711472,
0.09147162735462189,
-0.09878119081258774,
-0.02050112374126911,
0.05663498118519783,
0.022311391308903694,
0.12983815371990204,
-0.04518721252679825,
0.019985908642411232,
-0.11187174916267395,
-0.14036603271961212,
-0.012543809600174427,
-0.19241219758987427,
-0.009162374772131443,
-0.03599568083882332,
0.05570970103144646,
0.09014975279569626,
0.017560545355081558,
-0.03315422683954239,
-0.009862100705504417,
0.03792953118681908,
-0.15847031772136688,
0.02042911946773529,
-0.021156826987862587,
0.12016893178224564,
-0.17753784358501434,
0.024990491569042206,
0.01799466460943222,
0.047281019389629364,
-0.020444124937057495,
0.046846333891153336,
0.06577745079994202,
0.0282373558729887,
-0.06953427195549011,
-0.09281057119369507,
0.018594879657030106,
-0.013498609885573387,
-0.002859577303752303,
0.11706165224313736,
0.07362319529056549,
-0.016101891174912453,
0.0020284787751734257,
0.11396379768848419,
0.03168873488903046,
-0.08272771537303925,
-0.07259714603424072,
-0.0038821189664304256,
-0.06544893980026245,
0.009449413046240807,
-0.08273050934076309,
-0.07682503759860992,
0.00805057492107153,
0.3148549795150757,
0.1633726954460144,
0.05338068678975105,
0.02676681987941265,
-0.06072692945599556,
0.016585517674684525,
0.09422440826892853,
0.18593904376029968,
0.03341638669371605,
0.12254449725151062,
0.04089730605483055,
-0.23123514652252197,
-0.07565130293369293,
0.0680762305855751,
-0.1410110890865326,
-0.010492360219359398,
-0.03812844306230545,
-0.012471677735447884,
-0.008603869006037712,
0.11760371923446655,
-0.04622204601764679,
-0.18855002522468567,
0.09288503229618073,
-0.11364209651947021,
-0.029826024547219276,
-0.10796207189559937,
0.06865383684635162,
0.13652274012565613,
0.1318446695804596,
0.013322383165359497,
-0.03403320908546448,
0.07859551161527634,
-0.02470102347433567,
-0.07449662685394287,
-0.12600815296173096,
0.05770648270845413,
-0.09709706157445908,
0.09465032815933228,
-0.07134846597909927,
0.009476672857999802,
0.03268158808350563,
0.017695272341370583,
-0.0653095543384552,
0.0909164771437645,
0.01211506873369217,
0.03988857939839363,
0.008078349754214287,
-0.10328837484121323,
-0.020951278507709503,
-0.003175073768943548,
0.03254648298025131,
-0.17123402655124664,
0.010687963105738163,
0.17922544479370117,
0.09031867235898972,
-0.11911684274673462,
0.09751489758491516,
-0.0755813866853714,
0.08114843815565109,
0.07757222652435303,
-0.02892777882516384,
-0.09573069959878922,
-0.03265606611967087,
-0.04554244503378868,
-0.00684101739898324,
-0.014779645949602127,
-0.07129213958978653,
-0.21244420111179352,
-0.09373651444911957,
-0.19689799845218658,
0.02540581300854683,
-0.05646509677171707,
0.05205889046192169,
-0.09668271988630295,
-0.06506364792585373,
-0.026746811345219612,
-0.023812679573893547,
0.06722196936607361,
0.032656311988830566,
-0.009985494427382946,
-0.18592290580272675,
0.06311144679784775,
0.1530742347240448,
-0.050796449184417725,
-0.032061006873846054
] |
null | null |
transformers
|
# CLIP-Vision-BERT Multilingual Pre-trained Model
Pretrained CLIP-Vision-BERT pre-trained on translated [Conceptual-12M](https://github.com/google-research-datasets/conceptual-12m) image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using [mBART-50 one-to-many model](https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt) to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in
[this paper](https://arxiv.org/abs/1908.03557) and first released in
[this repository](https://github.com/uclanlp/visualbert). We trained CLIP-Vision-BERT model during community week hosted by Huggingface 🤗 using JAX/Flax.
This checkpoint is pre-trained for 60k steps.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations❗️
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - [VQA v2](https://visualqa.org/challenge.html). Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use❓
You can use this model directly with a pipeline for masked language modeling. You will need to clone the model from [here](https://github.com/gchhablani/multilingual-vqa). An example of usage is shown below:
```python
>>> from torchvision.io import read_image
>>> import numpy as np
>>> import os
>>> from transformers import CLIPProcessor, BertTokenizerFast
>>> from model.flax_clip_vision_bert.modeling_clip_vision_bert import FlaxCLIPVisionBertForMaskedLM
>>> image_path = os.path.join('images/val2014', os.listdir('images/val2014')[0])
>>> img = read_image(image_path)
>>> clip_processor = CLIPProcessor.from_pretrained('openai/clip-vit-base-patch32')
ftfy or spacy is not installed using BERT BasicTokenizer instead of ftfy.
>>> clip_outputs = clip_processor(images=img)
>>> clip_outputs['pixel_values'][0] = clip_outputs['pixel_values'][0].transpose(1,2,0) # Need to transpose images as model expected channel last images.
>>> tokenizer = BertTokenizerFast.from_pretrained('bert-base-multilingual-uncased')
>>> model = FlaxCLIPVisionBertForMaskedLM.from_pretrained('flax-community/clip-vision-bert-cc12m-60k')
>>> text = "Three teddy [MASK] in a showcase."
>>> tokens = tokenizer([text], return_tensors="np")
>>> pixel_values = np.concatenate([clip_outputs['pixel_values']])
>>> outputs = model(pixel_values=pixel_values, **tokens)
>>> indices = np.where(tokens['input_ids']==tokenizer.mask_token_id)
>>> preds = outputs.logits[indices][0]
>>> sorted_indices = np.argsort(preds)[::-1] # Get reverse sorted scores
/home/crocoder/anaconda3/lib/python3.8/site-packages/jax/_src/numpy/lax_numpy.py:4615: UserWarning: 'kind' argument to argsort is ignored.
warnings.warn("'kind' argument to argsort is ignored.")
>>> top_5_indices = sorted_indices[:5]
>>> top_5_tokens = tokenizer.convert_ids_to_tokens(top_5_indices)
>>> top_5_scores = preds[top_5_indices]
>>> print(dict(zip(top_5_tokens, top_5_scores)))
{'bears': 19.241959, 'bear': 17.700356, 'animals': 14.368396, 'girls': 14.343797, 'dolls': 14.274415}
```
## Training data 🏋🏻♂️
The CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.
The dataset captions and image urls can be downloaded from [flax-community/conceptual-12m-mbart-50-translated](https://huggingface.co/datasets/flax-community/conceptual-12m-mbart-50-multilingual).
## Data Cleaning 🧹
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
**Splits**
We used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.
## Training procedure 👨🏻💻
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with `[CLS]` and the end of one by `[SEP]`
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
The visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:
```
[CLS Emb] [Textual Embs] [SEP Emb] [Pad Embs] [Visual Embs]
```
A total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.
### Pretraining
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 60k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 5,000 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is the link to the main dashboard: [CLIP Vision BERT CC12M Pre-training Dashboard](https://huggingface.co/flax-community/multilingual-vqa-pt-ckpts/tensorboard)
#### **Pretraining Results 📊**
The model at this checkpoint reached **eval accuracy of 67.53%** and **with train loss at 1.793 and eval loss at 1.724**.
## Fine Tuning on downstream tasks
We performed fine-tuning on downstream tasks. We used the following datasets for visual question answering:
1. Multilingual of [Visual Question Answering (VQA) v2](https://visualqa.org/challenge.html) - We translated this dataset to the four languages using `Helsinki-NLP` Marian models. The translated data can be found at [flax-community/multilingual-vqa](https://huggingface.co/datasets/flax-community/multilingual-vqa).
The checkpoints for the fine-tuned model on this pre-trained checkpoint can be found [here](https://huggingface.co/flax-community/multilingual-vqa-pt-60k-ft/tensorboard).
The fine-tuned model achieves eval accuracy of 49% on our validation dataset.
## Team Members
- Gunjan Chhablani [@gchhablani](https://hf.co/gchhablani)
- Bhavitvya Malik[@bhavitvyamalik](https://hf.co/bhavitvyamalik)
## Acknowledgements
We thank [Nilakshan Kunananthaseelan](https://huggingface.co/knilakshan20) for helping us whenever he could get a chance. We also thank [Abheesht Sharma](https://huggingface.co/abheesht) for helping in the discussions in the initial phases. [Luke Melas](https://github.com/lukemelas) helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of [Patrick](https://huggingface.co/patrickvonplaten) and [Suraj](https://huggingface.co/valhalla) who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface 🤗 & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{}
|
fill-mask
|
flax-community/clip-vision-bert-cc12m-60k
|
[
"transformers",
"jax",
"clip-vision-bert",
"fill-mask",
"arxiv:1908.03557",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.03557"
] |
[] |
TAGS
#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #region-us
|
# CLIP-Vision-BERT Multilingual Pre-trained Model
Pretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in
this paper and first released in
this repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.
This checkpoint is pre-trained for 60k steps.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations️
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use
You can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:
## Training data ️
The CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.
The dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.
## Data Cleaning
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
Splits
We used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by '[MASK]'.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
The visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:
A total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.
### Pretraining
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 60k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 5,000 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard
#### Pretraining Results
The model at this checkpoint reached eval accuracy of 67.53% and with train loss at 1.793 and eval loss at 1.724.
## Fine Tuning on downstream tasks
We performed fine-tuning on downstream tasks. We used the following datasets for visual question answering:
1. Multilingual of Visual Question Answering (VQA) v2 - We translated this dataset to the four languages using 'Helsinki-NLP' Marian models. The translated data can be found at flax-community/multilingual-vqa.
The checkpoints for the fine-tuned model on this pre-trained checkpoint can be found here.
The fine-tuned model achieves eval accuracy of 49% on our validation dataset.
## Team Members
- Gunjan Chhablani @gchhablani
- Bhavitvya Malik@bhavitvyamalik
## Acknowledgements
We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=URL
|
[
"# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 60k steps.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.",
"## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.",
"### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 60k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 5,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard",
"#### Pretraining Results \n\nThe model at this checkpoint reached eval accuracy of 67.53% and with train loss at 1.793 and eval loss at 1.724.",
"## Fine Tuning on downstream tasks\nWe performed fine-tuning on downstream tasks. We used the following datasets for visual question answering:\n\n1. Multilingual of Visual Question Answering (VQA) v2 - We translated this dataset to the four languages using 'Helsinki-NLP' Marian models. The translated data can be found at flax-community/multilingual-vqa.\n\nThe checkpoints for the fine-tuned model on this pre-trained checkpoint can be found here.\nThe fine-tuned model achieves eval accuracy of 49% on our validation dataset.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
"TAGS\n#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #region-us \n",
"# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 60k steps.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.",
"## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.",
"### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 60k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 5,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard",
"#### Pretraining Results \n\nThe model at this checkpoint reached eval accuracy of 67.53% and with train loss at 1.793 and eval loss at 1.724.",
"## Fine Tuning on downstream tasks\nWe performed fine-tuning on downstream tasks. We used the following datasets for visual question answering:\n\n1. Multilingual of Visual Question Answering (VQA) v2 - We translated this dataset to the four languages using 'Helsinki-NLP' Marian models. The translated data can be found at flax-community/multilingual-vqa.\n\nThe checkpoints for the fine-tuned model on this pre-trained checkpoint can be found here.\nThe fine-tuned model achieves eval accuracy of 49% on our validation dataset.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
47,
161,
79,
143,
41,
97,
95,
3,
242,
139,
35,
145,
27,
156
] |
[
"passage: TAGS\n#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #region-us \n# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 60k steps.## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"passage: ## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.## Training procedure ### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 60k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 5,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard"
] |
[
-0.06224233657121658,
-0.03639001026749611,
-0.0017980743432417512,
0.056883059442043304,
0.03576077148318291,
-0.019745126366615295,
0.04706116393208504,
0.08834663778543472,
-0.06044050306081772,
0.09134592115879059,
0.006385468877851963,
-0.07834327220916748,
0.07939176261425018,
0.09348298609256744,
0.058192551136016846,
-0.258114218711853,
0.08648385107517242,
-0.03295322507619858,
-0.05991023778915405,
0.04749642312526703,
0.09749871492385864,
-0.09790989011526108,
0.08083476126194,
-0.014555828645825386,
-0.052434857934713364,
-0.02490832470357418,
-0.02576233074069023,
-0.016216089949011803,
0.10226660966873169,
0.09746471047401428,
0.07576867938041687,
-0.008046475239098072,
0.015643784776329994,
-0.0941551998257637,
0.009584993124008179,
0.0917171984910965,
-0.01644212007522583,
0.05333523452281952,
0.06960204988718033,
0.13614651560783386,
0.07860790938138962,
-0.07490747421979904,
0.03640320524573326,
0.06064886227250099,
-0.08599944412708282,
-0.09464459121227264,
-0.09256777912378311,
0.03127943351864815,
0.09056738018989563,
0.029800210148096085,
-0.019822560250759125,
0.09151168912649155,
-0.02298768237233162,
0.09940478950738907,
0.11412708461284637,
-0.21401694416999817,
-0.014504410326480865,
0.08106556534767151,
0.07418558746576309,
0.05396132916212082,
-0.05349528789520264,
0.038111425936222076,
0.01828288845717907,
0.04224763065576553,
0.041811756789684296,
-0.04498710483312607,
-0.06229855865240097,
-0.05756837874650955,
-0.13223573565483093,
-0.041489116847515106,
0.07873822748661041,
0.014484560117125511,
-0.06586810946464539,
-0.10215020179748535,
-0.0872734934091568,
0.021855972707271576,
-0.0029079997912049294,
-0.04586980864405632,
0.02725500799715519,
0.01244401279836893,
-0.02507813833653927,
-0.09417317807674408,
-0.06608974188566208,
-0.020992804318666458,
-0.028558827936649323,
0.01813420094549656,
0.03468389809131622,
0.04262075945734978,
-0.048007819801568985,
0.06747590750455856,
-0.10077033191919327,
-0.049936190247535706,
-0.05402403324842453,
-0.03400923311710358,
-0.06481894105672836,
-0.00809310283511877,
-0.08273313194513321,
-0.2014096975326538,
-0.07551160454750061,
0.06390900164842606,
0.0022183717228472233,
0.03807409852743149,
-0.04447091370820999,
0.05082211643457413,
0.021777009591460228,
0.11905097216367722,
-0.07981981337070465,
-0.044592954218387604,
0.05264774709939957,
-0.011851728893816471,
0.017199717462062836,
-0.026292884722352028,
-0.08292250335216522,
0.01763523928821087,
0.05958736687898636,
0.0046814908273518085,
0.012085079215466976,
0.04558572545647621,
-0.04510601982474327,
-0.04540916159749031,
0.138773113489151,
-0.10411231219768524,
0.03304556757211685,
-0.027364186942577362,
-0.04644329100847244,
0.022394755855202675,
0.09634214639663696,
-0.07045713067054749,
-0.09313398599624634,
0.0056343600153923035,
-0.056093037128448486,
-0.044380854815244675,
-0.13190314173698425,
-0.17189982533454895,
0.05756989121437073,
-0.06890810281038284,
-0.08037920296192169,
-0.10330149531364441,
-0.13879582285881042,
-0.025738311931490898,
0.041461069136857986,
0.0021307533606886864,
0.02048872411251068,
-0.02401045523583889,
-0.009632428176701069,
-0.034795332700014114,
0.019120093435049057,
0.05297701805830002,
0.02860230952501297,
0.0468304343521595,
-0.08274544775485992,
0.08988897502422333,
-0.0936344563961029,
0.023663274943828583,
-0.0614437498152256,
0.05079221352934837,
-0.2440757155418396,
0.10757262259721756,
-0.029086433351039886,
-0.06946012377738953,
-0.07925348728895187,
-0.05187195539474487,
-0.07730680704116821,
0.025914745405316353,
0.07009880244731903,
0.0815153419971466,
-0.2556537985801697,
-0.05834575369954109,
0.14889350533485413,
-0.12395407259464264,
0.0019399458542466164,
0.16232794523239136,
-0.03891507536172867,
0.053000353276729584,
0.10046470165252686,
0.11305607855319977,
0.02875874564051628,
-0.058885358273983,
-0.11428937315940857,
-0.013112642802298069,
0.009725856594741344,
0.0448763370513916,
0.024033192545175552,
-0.029963649809360504,
0.06382124125957489,
0.01267749723047018,
0.017307531088590622,
-0.007574490271508694,
-0.01685594953596592,
-0.028446141630411148,
0.008322825655341148,
-0.012243269942700863,
-0.0032736100256443024,
-0.02767643705010414,
0.00023806234821677208,
-0.019766196608543396,
-0.05927906930446625,
0.07870092988014221,
0.059622589498758316,
-0.07331810891628265,
0.08920861780643463,
-0.03894233703613281,
0.05288442224264145,
-0.08869120478630066,
0.03846556320786476,
-0.16538609564304352,
-0.0714639350771904,
0.06990815699100494,
-0.06538959592580795,
0.026527518406510353,
0.053994759917259216,
0.0453929677605629,
0.056329526007175446,
-0.06743362545967102,
0.02093968726694584,
-0.054193250834941864,
-0.04894164204597473,
-0.10011601448059082,
-0.06923840939998627,
-0.06531831622123718,
-0.04326172173023224,
-0.043988898396492004,
-0.03500932455062866,
0.013840002939105034,
0.08895625919103622,
0.10045422613620758,
0.05162092298269272,
-0.07405736297369003,
-0.001318060327321291,
0.04873741418123245,
-0.017856203019618988,
-0.04147172346711159,
0.004175080917775631,
0.02322598174214363,
-0.027417223900556564,
0.10955937206745148,
-0.1236409842967987,
-0.1259605437517166,
0.07115262001752853,
-0.024532154202461243,
-0.04361345246434212,
0.010985087603330612,
-0.013014490716159344,
-0.032907042652368546,
-0.05563358590006828,
-0.0842428058385849,
0.1622033566236496,
0.018638692796230316,
0.09699542075395584,
-0.10489007830619812,
0.008881031535565853,
0.01671387068927288,
0.018885647878050804,
-0.049128979444503784,
0.021843595430254936,
0.03679114580154419,
-0.12622585892677307,
0.05450018495321274,
0.039845146238803864,
0.029281433671712875,
0.15639981627464294,
0.00978642888367176,
-0.10400673747062683,
-0.005404800176620483,
0.00683504156768322,
0.031174009665846825,
0.099966861307621,
0.002994747832417488,
-0.02319509908556938,
0.041650429368019104,
0.05157531052827835,
0.03672981634736061,
-0.05762184038758278,
0.1123552918434143,
0.050801631063222885,
-0.020129818469285965,
0.04256737232208252,
-0.016437016427516937,
-0.040013387799263,
0.06615098565816879,
0.05073936656117439,
0.008734348230063915,
0.004450479056686163,
-0.015570749528706074,
-0.09643715620040894,
0.14082872867584229,
-0.11788240075111389,
-0.2684542238712311,
-0.17605789005756378,
-0.013013679534196854,
-0.034198492765426636,
0.01132575049996376,
0.007686079014092684,
-0.026847459375858307,
-0.0950978472828865,
-0.09540572762489319,
0.06571414321660995,
-0.025591515004634857,
0.0004628300666809082,
-0.02030426636338234,
-0.017431098967790604,
0.02986982651054859,
-0.1382589340209961,
0.027580510824918747,
0.017242271453142166,
-0.03208019956946373,
0.01040569506585598,
-0.014944793656468391,
0.11493975669145584,
0.07969989627599716,
-0.042839083820581436,
-0.01700945943593979,
-0.041868288069963455,
0.13931351900100708,
-0.06093735992908478,
0.15521563589572906,
0.0709138959646225,
-0.06666389107704163,
0.06074732914566994,
0.07566741108894348,
-0.016736449673771858,
-0.0547085702419281,
0.019357670098543167,
0.0462949313223362,
-0.0690096765756607,
-0.14466963708400726,
-0.07668112963438034,
-0.055861443281173706,
0.03252509981393814,
0.04833563417196274,
0.034634679555892944,
-0.04057218134403229,
0.00004766695201396942,
-0.06941278278827667,
0.11054296791553497,
0.001673462800681591,
0.08018004894256592,
0.029903586953878403,
-0.06603690981864929,
0.08145806193351746,
-0.04868081212043762,
0.018836675211787224,
0.10595621913671494,
0.005149985197931528,
0.16811814904212952,
-0.0685630589723587,
0.18595142662525177,
0.031469155102968216,
0.04706641286611557,
0.057301297783851624,
0.1146126389503479,
-0.07043477892875671,
0.04587354511022568,
-0.06302522122859955,
-0.047243937849998474,
-0.0047834268771111965,
0.04660390689969063,
0.008155428804457188,
-0.02873559296131134,
-0.04404324293136597,
0.02090458571910858,
0.050084710121154785,
0.24485015869140625,
0.06033940613269806,
-0.15580330789089203,
-0.08657471090555191,
0.009829851798713207,
-0.03170720115303993,
-0.09524939954280853,
-0.008957309648394585,
0.21009892225265503,
-0.0805651992559433,
0.002621148247271776,
-0.04077823832631111,
0.07830365002155304,
-0.06842266023159027,
0.014796976931393147,
-0.028488237410783768,
0.06344792246818542,
-0.01990802399814129,
0.054058171808719635,
-0.22446930408477783,
0.08975670486688614,
0.05418304353952408,
0.01975971832871437,
-0.06694082915782928,
0.014165830798447132,
0.005292857065796852,
-0.01493515633046627,
0.10131378471851349,
0.02578430064022541,
-0.038851067423820496,
-0.044318050146102905,
-0.11564157903194427,
0.018378768116235733,
0.11422303318977356,
0.02862970158457756,
0.06366315484046936,
0.013263115659356117,
0.00439880508929491,
0.01119746919721365,
0.05552634224295616,
-0.053813446313142776,
-0.14065545797348022,
0.018181268125772476,
0.0014705434441566467,
-0.00563383474946022,
-0.06503341346979141,
-0.045862000435590744,
-0.06032281741499901,
0.20433005690574646,
-0.0513896681368351,
0.0016388981603085995,
-0.09271393716335297,
0.03285536915063858,
0.0973876416683197,
-0.06779888272285461,
0.08613383769989014,
0.011270754039287567,
0.11847470700740814,
-0.10000485926866531,
-0.09743253886699677,
0.09679296612739563,
-0.07195059955120087,
-0.08908240497112274,
-0.0470069944858551,
0.033455200493335724,
0.048352450132369995,
0.056981004774570465,
-0.007469381205737591,
0.022600872442126274,
0.019833549857139587,
-0.03224394842982292,
-0.033850301057100296,
0.12550251185894012,
0.0191328302025795,
0.07064907252788544,
-0.13074418902397156,
-0.01658158376812935,
-0.0031350264325737953,
0.025932854041457176,
0.07816104590892792,
0.13245046138763428,
-0.04107914865016937,
0.094852976500988,
0.2065083533525467,
-0.07251547276973724,
-0.27751481533050537,
0.03387363255023956,
0.017486466094851494,
0.08600066602230072,
-0.02324676886200905,
-0.16583974659442902,
-0.017712866887450218,
0.030607687309384346,
-0.0015920638106763363,
0.061488956212997437,
-0.16701671481132507,
-0.11506108939647675,
0.0371774397790432,
0.03726092725992203,
0.13981401920318604,
-0.04082994535565376,
-0.021063242107629776,
-0.019522318616509438,
0.03834071010351181,
-0.02179044298827648,
-0.008236776106059551,
0.09839295595884323,
0.030262742191553116,
-0.05205149203538895,
0.02780080772936344,
-0.03624294698238373,
0.08376996964216232,
-0.03384189307689667,
0.048523854464292526,
-0.06857233494520187,
0.0009385645389556885,
0.07395784556865692,
-0.06383514404296875,
0.11529727280139923,
-0.003169400617480278,
0.053152427077293396,
-0.02972479909658432,
-0.03511601686477661,
-0.054853249341249466,
0.04325934499502182,
-0.03423672914505005,
-0.04303060099482536,
-0.08762607723474503,
0.0699392631649971,
0.037791907787323,
0.016653817147016525,
-0.001806151121854782,
-0.03717215359210968,
-0.07430513948202133,
0.17675301432609558,
0.08024735748767853,
-0.0013597614597529173,
-0.06263351440429688,
0.02792741172015667,
-0.036504194140434265,
0.13782042264938354,
-0.08194754272699356,
0.02338457852602005,
0.0666123479604721,
-0.01024193037301302,
0.10224120318889618,
0.018613435328006744,
-0.16126784682273865,
-0.005543314851820469,
0.03709197789430618,
-0.10431734472513199,
-0.1254461258649826,
-0.012523511424660683,
0.01939305290579796,
-0.13841982185840607,
-0.015458770096302032,
0.11953651905059814,
-0.0558944009244442,
-0.00533559825271368,
-0.0027165133506059647,
0.05205376446247101,
-0.018173260614275932,
0.12215091288089752,
0.031513143330812454,
0.012386171147227287,
-0.061724912375211716,
0.12610824406147003,
0.07010570168495178,
-0.17755497992038727,
0.08974488079547882,
0.0687808096408844,
-0.09471454471349716,
-0.036938004195690155,
0.039098531007766724,
0.13066546618938446,
0.056290458887815475,
-0.04385615140199661,
-0.07328785210847855,
-0.0472390353679657,
0.05642075091600418,
0.1564861685037613,
0.011912653222680092,
0.05021199584007263,
-0.046347759664058685,
-0.005123740062117577,
-0.06738587468862534,
0.07425874471664429,
0.006800150498747826,
0.03921607509255409,
0.015100512653589249,
0.10162301361560822,
-0.017664717510342598,
0.02741970494389534,
-0.026470815762877464,
-0.007582306861877441,
-0.07250314950942993,
-0.046174801886081696,
-0.08603422343730927,
-0.03795957192778587,
0.011265002191066742,
-0.04722652584314346,
-0.016365470364689827,
0.04290241003036499,
0.04143203794956207,
0.027073582634329796,
-0.03470133617520332,
-0.04756096005439758,
-0.05320277437567711,
0.009388087317347527,
-0.07713358104228973,
-0.006249767728149891,
0.03557003661990166,
-0.04603547602891922,
0.08168847113847733,
-0.037461861968040466,
-0.012268729507923126,
0.00848385039716959,
0.021015126258134842,
-0.028531713411211967,
0.03360586240887642,
0.030728381127119064,
0.014814251102507114,
-0.007117411121726036,
0.005127059295773506,
-0.0010821912437677383,
0.008417829871177673,
-0.02349218726158142,
0.05965918302536011,
-0.048958465456962585,
0.10208253562450409,
-0.024154219776391983,
-0.011447062715888023,
-0.03398525342345238,
0.09065668284893036,
-0.007744967471808195,
0.07979889214038849,
0.06199377775192261,
-0.05260595679283142,
0.028367487713694572,
-0.12889400124549866,
0.002247442491352558,
0.010609768331050873,
-0.012273399159312248,
0.018423045054078102,
-0.05966230481863022,
0.04323793202638626,
-0.02476249448955059,
0.05310653895139694,
0.024361934512853622,
-0.04265247657895088,
0.009108608588576317,
0.0015399623662233353,
-0.1139841228723526,
0.01324000395834446,
0.07931837439537048,
-0.009373782202601433,
-0.024287912994623184,
-0.013867834582924843,
0.06025433540344238,
0.036009885370731354,
0.15086886286735535,
0.12747971713542938,
0.1104891449213028,
0.08762014657258987,
0.11897119134664536,
0.02218470349907875,
-0.05799221619963646,
-0.03869252651929855,
-0.008173368871212006,
-0.007578827440738678,
0.10420569777488708,
-0.02691831812262535,
0.046734973788261414,
0.07650832086801529,
-0.17291824519634247,
0.12233531475067139,
0.06995514035224915,
-0.07255224138498306,
-0.05595342442393303,
-0.16829591989517212,
-0.04416542500257492,
-0.011325372382998466,
-0.0280262753367424,
-0.09885715693235397,
-0.011332860216498375,
0.03319850191473961,
0.01949486881494522,
-0.006601983681321144,
0.15618759393692017,
-0.20869280397891998,
-0.057700689882040024,
0.07707855105400085,
0.0006475616246461868,
0.015109998174011707,
0.047634415328502655,
-0.049610599875450134,
0.029071399942040443,
0.04986756294965744,
0.07594912499189377,
0.05243127420544624,
0.09491774439811707,
0.04540059715509415,
0.026124410331249237,
-0.06193998456001282,
-0.022664828225970268,
-0.02527138963341713,
0.03702358528971672,
0.21208593249320984,
0.03268017619848251,
-0.06465303152799606,
-0.011026229709386826,
0.042088259011507034,
-0.05659440532326698,
-0.052971888333559036,
-0.11069266498088837,
0.090375155210495,
-0.01529177837073803,
0.003987589851021767,
-0.03217663988471031,
-0.09266173094511032,
-0.00931971799582243,
0.2138015627861023,
0.15861903131008148,
-0.0098072849214077,
-0.01909048855304718,
0.0034168825950473547,
0.0018117954023182392,
0.019617531448602676,
0.10156311094760895,
0.02250329591333866,
0.19635672867298126,
-0.021399935707449913,
0.025960175320506096,
-0.0021583863999694586,
-0.024900099262595177,
-0.04428880661725998,
0.05399544537067413,
-0.015925835818052292,
-0.01959003135561943,
-0.052391067147254944,
0.01806621253490448,
0.023295052349567413,
-0.20789453387260437,
0.02809814177453518,
-0.08877003192901611,
-0.04578360170125961,
-0.013137191534042358,
-0.01746857538819313,
0.013138911686837673,
0.0848299041390419,
-0.003839008044451475,
0.07133971899747849,
0.1844903528690338,
-0.0014874082989990711,
-0.08127818256616592,
-0.10427787899971008,
0.04337054118514061,
-0.11517445743083954,
0.14289245009422302,
0.04578109085559845,
0.03763694688677788,
0.07863463461399078,
0.017249567434191704,
-0.04582884907722473,
0.014754034578800201,
-0.024657225236296654,
-0.03089302033185959,
-0.033495448529720306,
0.11645835638046265,
-0.04540422186255455,
0.10685975104570389,
0.006712391972541809,
-0.036535535007715225,
0.050322480499744415,
-0.012713548727333546,
-0.0221809521317482,
-0.10242266952991486,
0.07165132462978363,
-0.07681736350059509,
0.1510014533996582,
0.1754378080368042,
0.019642001017928123,
-0.003063492476940155,
-0.043706879019737244,
0.00026840437203645706,
0.019278928637504578,
0.020322147756814957,
-0.007478747982531786,
-0.10266947746276855,
-0.009848973713815212,
-0.13945183157920837,
-0.0007372843101620674,
-0.22793883085250854,
-0.07440342009067535,
0.030564144253730774,
-0.06057685613632202,
-0.043134655803442,
0.08156614750623703,
0.0553184449672699,
0.011161191388964653,
-0.015462442301213741,
0.02761613205075264,
0.032513026148080826,
0.04464603587985039,
-0.08267176151275635,
-0.06815007328987122
] |
null | null |
transformers
|
# CLIP-Vision-BERT Multilingual Pre-trained Model
Pretrained CLIP-Vision-BERT pre-trained on translated [Conceptual-12M](https://github.com/google-research-datasets/conceptual-12m) image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using [mBART-50 one-to-many model](https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt) to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in
[this paper](https://arxiv.org/abs/1908.03557) and first released in
[this repository](https://github.com/uclanlp/visualbert). We trained CLIP-Vision-BERT model during community week hosted by Huggingface 🤗 using JAX/Flax.
This checkpoint is pre-trained for 70k steps.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations❗️
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - [VQA v2](https://visualqa.org/challenge.html). Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use❓
You can use this model directly with a pipeline for masked language modeling. You will need to clone the model from [here](https://github.com/gchhablani/multilingual-vqa). An example of usage is shown below:
```python
>>> from torchvision.io import read_image
>>> import numpy as np
>>> import os
>>> from transformers import CLIPProcessor, BertTokenizerFast
>>> from model.flax_clip_vision_bert.modeling_clip_vision_bert import FlaxCLIPVisionBertForMaskedLM
>>> image_path = os.path.join('images/val2014', os.listdir('images/val2014')[0])
>>> img = read_image(image_path)
>>> clip_processor = CLIPProcessor.from_pretrained('openai/clip-vit-base-patch32')
ftfy or spacy is not installed using BERT BasicTokenizer instead of ftfy.
>>> clip_outputs = clip_processor(images=img)
>>> clip_outputs['pixel_values'][0] = clip_outputs['pixel_values'][0].transpose(1,2,0) # Need to transpose images as model expected channel last images.
>>> tokenizer = BertTokenizerFast.from_pretrained('bert-base-multilingual-uncased')
>>> model = FlaxCLIPVisionBertForMaskedLM.from_pretrained('flax-community/clip-vision-bert-cc12m-70k')
>>> text = "Three teddy [MASK] in a showcase."
>>> tokens = tokenizer([text], return_tensors="np")
>>> pixel_values = np.concatenate([clip_outputs['pixel_values']])
>>> outputs = model(pixel_values=pixel_values, **tokens)
>>> indices = np.where(tokens['input_ids']==tokenizer.mask_token_id)
>>> preds = outputs.logits[indices][0]
>>> sorted_indices = np.argsort(preds)[::-1] # Get reverse sorted scores
>>> top_5_indices = sorted_indices[:5]
>>> top_5_tokens = tokenizer.convert_ids_to_tokens(top_5_indices)
>>> top_5_scores = preds[top_5_indices]
>>> print(dict(zip(top_5_tokens, top_5_scores)))
{'bears': 19.400345, 'bear': 17.866995, 'animals': 14.453735, 'dogs': 14.427426, 'girls': 14.097499}
```
## Training data 🏋🏻♂️
The CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.
The dataset captions and image urls can be downloaded from [flax-community/conceptual-12m-mbart-50-translated](https://huggingface.co/datasets/flax-community/conceptual-12m-mbart-50-multilingual).
## Data Cleaning 🧹
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
**Splits**
We used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.
## Training procedure 👨🏻💻
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with `[CLS]` and the end of one by `[CLS]`
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
The visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:
```
[CLS Emb] [Textual Embs] [SEP Emb] [Pad Embs] [Visual Embs]
```
A total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.
### Pretraining
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 70k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 1,000 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is the link to the main dashboard: [CLIP Vision BERT CC12M Pre-training Dashboard](https://huggingface.co/flax-community/multilingual-vqa-pt-ckpts/tensorboard)
#### **Pretraining Results 📊**
The model at this checkpoint reached **eval accuracy of 67.85%** and **with train loss at 1.756 and eval loss at 1.706**.
## Team Members
- Gunjan Chhablani [@gchhablani](https://hf.co/gchhablani)
- Bhavitvya Malik[@bhavitvyamalik](https://hf.co/bhavitvyamalik)
## Acknowledgements
We thank [Nilakshan Kunananthaseelan](https://huggingface.co/knilakshan20) for helping us whenever he could get a chance. We also thank [Abheesht Sharma](https://huggingface.co/abheesht) for helping in the discussions in the initial phases. [Luke Melas](https://github.com/lukemelas) helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of [Patrick](https://huggingface.co/patrickvonplaten) and [Suraj](https://huggingface.co/valhalla) who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface 🤗 & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{}
|
fill-mask
|
flax-community/clip-vision-bert-cc12m-70k
|
[
"transformers",
"jax",
"clip-vision-bert",
"fill-mask",
"arxiv:1908.03557",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.03557"
] |
[] |
TAGS
#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# CLIP-Vision-BERT Multilingual Pre-trained Model
Pretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in
this paper and first released in
this repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.
This checkpoint is pre-trained for 70k steps.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations️
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use
You can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:
## Training data ️
The CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.
The dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.
## Data Cleaning
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
Splits
We used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[CLS]'
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by '[MASK]'.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
The visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:
A total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.
### Pretraining
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 70k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 1,000 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard
#### Pretraining Results
The model at this checkpoint reached eval accuracy of 67.85% and with train loss at 1.756 and eval loss at 1.706.
## Team Members
- Gunjan Chhablani @gchhablani
- Bhavitvya Malik@bhavitvyamalik
## Acknowledgements
We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=URL
|
[
"# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 70k steps.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.",
"## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[CLS]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.",
"### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 70k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 1,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard",
"#### Pretraining Results \n\nThe model at this checkpoint reached eval accuracy of 67.85% and with train loss at 1.756 and eval loss at 1.706.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
"TAGS\n#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 70k steps.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.",
"## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[CLS]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.",
"### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 70k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 1,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard",
"#### Pretraining Results \n\nThe model at this checkpoint reached eval accuracy of 67.85% and with train loss at 1.756 and eval loss at 1.706.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
51,
161,
79,
143,
41,
97,
95,
3,
242,
138,
35,
27,
156
] |
[
"passage: TAGS\n#transformers #jax #clip-vision-bert #fill-mask #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# CLIP-Vision-BERT Multilingual Pre-trained Model\n\nPretrained CLIP-Vision-BERT pre-trained on translated Conceptual-12M image-text pairs using a masked language modeling (MLM) objective. 10M cleaned image-text pairs are translated using mBART-50 one-to-many model to 2.5M examples each in English, French, German and Spanish. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. We trained CLIP-Vision-BERT model during community week hosted by Huggingface using JAX/Flax.\n\nThis checkpoint is pre-trained for 70k steps.## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.## Intended uses & limitations️\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nNote that this model is primarily aimed at being fine-tuned on tasks such as visuo-linguistic sequence classification or visual question answering. We used this model to fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since Conceptual-12M is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.### How to use\nYou can use this model directly with a pipeline for masked language modeling. You will need to clone the model from here. An example of usage is shown below:",
"passage: ## Training data ️\nThe CLIP-Vision-BERT model was pre-trained on a translated version of the Conceptual-12m dataset in four languages using mBART-50: English, French, German and Spanish, with 2.5M image-text pairs in each.\n\nThe dataset captions and image urls can be downloaded from flax-community/conceptual-12m-mbart-50-translated.## Data Cleaning \n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.\n\nSplits\nWe used 99% of the 10M examples as a train set, and the remaining ~ 100K examples as our validation set.## Training procedure ### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[CLS]'\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.\n\n\nThe visual embeddings are taken from the CLIP-Vision model and combined with the textual embeddings inside the BERT embedding layer. The padding is done in the middle. Here is an example of what the embeddings look like:\n\n\n\nA total length of 128 tokens, including the visual embeddings, is used. The texts are truncated or padded accordingly.### Pretraining\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 70k steps with a per device batch size of 64 and a max sequence length of 128. The optimizer used is Adafactor with a learning rate of 1e-4, learning rate warmup for 1,000 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is the link to the main dashboard: CLIP Vision BERT CC12M Pre-training Dashboard"
] |
[
-0.05398024991154671,
-0.027453403919935226,
-0.0022550637368112803,
0.06123983487486839,
0.0275641530752182,
-0.024025239050388336,
0.050493232905864716,
0.09186726063489914,
-0.08675774186849594,
0.09872176498174667,
-0.00022458238527178764,
-0.09333770722150803,
0.08624029159545898,
0.10301575064659119,
0.06654731929302216,
-0.2756156027317047,
0.08681128174066544,
-0.034284934401512146,
-0.06526166945695877,
0.04132753610610962,
0.10688968747854233,
-0.0911937803030014,
0.0805739313364029,
-0.009045196697115898,
-0.06717847287654877,
-0.029276104643940926,
-0.01722383312880993,
-0.01830008253455162,
0.09608106315135956,
0.09420455992221832,
0.07556036114692688,
0.008155972696840763,
0.018199434503912926,
-0.08143278956413269,
0.011606655083596706,
0.09662898629903793,
-0.01826005056500435,
0.06248572841286659,
0.07462576031684875,
0.12842945754528046,
0.06947789341211319,
-0.07160844653844833,
0.03880111500620842,
0.06115959212183952,
-0.08912796527147293,
-0.09769152104854584,
-0.10502121597528458,
0.04171816259622574,
0.10132860392332077,
0.011353589594364166,
-0.013260258361697197,
0.09087903797626495,
-0.019378148019313812,
0.09087496250867844,
0.10534712672233582,
-0.2272244542837143,
-0.01646442711353302,
0.07528513669967651,
0.0709211677312851,
0.051912181079387665,
-0.04940463602542877,
0.04882044345140457,
0.011683233082294464,
0.03465742990374565,
0.029548633843660355,
-0.03268405795097351,
-0.05986552685499191,
-0.06644180417060852,
-0.12468083202838898,
-0.03434507176280022,
0.06923411041498184,
0.014267075806856155,
-0.0664345920085907,
-0.08966612815856934,
-0.08894887566566467,
0.01564713567495346,
-0.0013372646644711494,
-0.03994777798652649,
0.018975013867020607,
0.017704248428344727,
-0.03240462765097618,
-0.10521887987852097,
-0.06925448775291443,
-0.026490822434425354,
-0.033919449895620346,
0.012769607827067375,
0.031883466988801956,
0.05157434195280075,
-0.039719924330711365,
0.07943175733089447,
-0.09329593181610107,
-0.04796803742647171,
-0.050630904734134674,
-0.021607093513011932,
-0.07349428534507751,
-0.00931127741932869,
-0.07815530896186829,
-0.19606739282608032,
-0.08017608523368835,
0.0560927540063858,
0.03041115775704384,
0.032183002680540085,
-0.06964127719402313,
0.05714493617415428,
0.020693248137831688,
0.10302965342998505,
-0.05945727229118347,
-0.03983594477176666,
0.05541185662150383,
-0.016193008050322533,
0.024900704622268677,
-0.02588672935962677,
-0.08876815438270569,
0.013146859593689442,
0.07217329740524292,
0.010692737065255642,
0.01962946727871895,
0.02633638307452202,
-0.03613024204969406,
-0.04295172914862633,
0.14900904893875122,
-0.1024075448513031,
0.04544053599238396,
-0.030607610940933228,
-0.03621296212077141,
0.02705916203558445,
0.08536211401224136,
-0.05341921001672745,
-0.09158513695001602,
-0.01181894913315773,
-0.05953051894903183,
-0.03140433505177498,
-0.12124958634376526,
-0.15875113010406494,
0.06047984957695007,
-0.061441753059625626,
-0.08752977848052979,
-0.09455403685569763,
-0.1337094008922577,
-0.023081572726368904,
0.03845519945025444,
-0.010374510660767555,
-0.0007804252672940493,
-0.012255149893462658,
-0.011703207157552242,
-0.03689317777752876,
0.027414526790380478,
0.0647447407245636,
0.035828229039907455,
0.04643355682492256,
-0.08850908279418945,
0.09247446060180664,
-0.09668594598770142,
0.021025748923420906,
-0.05487833172082901,
0.05341105908155441,
-0.28197330236434937,
0.10670284926891327,
-0.02238975465297699,
-0.07826217263936996,
-0.08128838241100311,
-0.0607253722846508,
-0.06955993175506592,
0.011575091630220413,
0.05401676893234253,
0.06160449981689453,
-0.28113919496536255,
-0.05417555943131447,
0.14416146278381348,
-0.15521767735481262,
0.016594506800174713,
0.16532588005065918,
-0.033850498497486115,
0.043403249233961105,
0.09361220896244049,
0.106314517557621,
0.052379537373781204,
-0.040719904005527496,
-0.09622320532798767,
-0.02653137780725956,
0.003397200256586075,
0.04797486588358879,
0.033708952367305756,
-0.02816321887075901,
0.05457691103219986,
0.005163731053471565,
0.007953677326440811,
-0.020736008882522583,
-0.016612477600574493,
-0.03348708152770996,
0.00401926226913929,
-0.002621866762638092,
-0.01873316988348961,
-0.0348336286842823,
0.008942937478423119,
-0.026631053537130356,
-0.05371464043855667,
0.060827791690826416,
0.060921818017959595,
-0.06594043970108032,
0.10508062690496445,
-0.02964630536735058,
0.05269509553909302,
-0.08468713611364365,
0.02589450404047966,
-0.15392009913921356,
-0.08303756266832352,
0.078203946352005,
-0.07463382184505463,
0.015741724520921707,
0.05312477797269821,
0.03935490548610687,
0.07488188147544861,
-0.07515707612037659,
0.026068879291415215,
-0.054865095764398575,
-0.05045931413769722,
-0.10316324234008789,
-0.07058010995388031,
-0.06961901485919952,
-0.04161923751235008,
-0.03033526986837387,
-0.02697773464024067,
0.008189593441784382,
0.10077110677957535,
0.1280922293663025,
0.05162038281559944,
-0.08191254734992981,
-0.008496914058923721,
0.03169664740562439,
-0.013327857479453087,
-0.04191560298204422,
-0.0006479118019342422,
0.028049681335687637,
-0.014528285712003708,
0.11871658265590668,
-0.12630301713943481,
-0.11571574211120605,
0.06753726303577423,
0.0035357214510440826,
-0.038383159786462784,
0.0034697894006967545,
-0.021707238629460335,
-0.03688512742519379,
-0.06397418677806854,
-0.07435903698205948,
0.149183988571167,
0.011554989963769913,
0.09450780600309372,
-0.10329170525074005,
0.009478026069700718,
0.016271043568849564,
0.016764719039201736,
-0.05629540979862213,
0.014380921609699726,
0.02212977409362793,
-0.13769307732582092,
0.05266854912042618,
0.036535605788230896,
0.03653959557414055,
0.1390247792005539,
0.005597282201051712,
-0.10275798290967941,
-0.018138643354177475,
0.007755388971418142,
0.03008434921503067,
0.09687595069408417,
0.023552175611257553,
-0.012996773235499859,
0.04195582494139671,
0.06070800870656967,
0.03343924880027771,
-0.07165408134460449,
0.11248327791690826,
0.03673110902309418,
-0.030407927930355072,
0.06349609792232513,
-0.022067813202738762,
-0.03779204934835434,
0.07058215886354446,
0.05089280381798744,
0.006322110071778297,
0.01110265776515007,
-0.015671221539378166,
-0.09711333364248276,
0.12470951676368713,
-0.12299579381942749,
-0.2644904851913452,
-0.1850694864988327,
0.00980117917060852,
-0.03255673125386238,
0.015933692455291748,
0.006540533620864153,
-0.018849950283765793,
-0.08896990120410919,
-0.10235947370529175,
0.07062093913555145,
-0.044123873114585876,
0.0051975250244140625,
-0.004393577575683594,
-0.0055385613813996315,
0.03307802975177765,
-0.1273449808359146,
0.030027130618691444,
0.01407019142061472,
-0.035209499299526215,
0.009576455689966679,
0.008091882802546024,
0.09515006840229034,
0.07916795462369919,
-0.04296329244971275,
-0.013404474593698978,
-0.03820691257715225,
0.13324952125549316,
-0.05344593524932861,
0.1658756136894226,
0.07129064202308655,
-0.0745786726474762,
0.07159954309463501,
0.07366131246089935,
-0.02036226913332939,
-0.044005218893289566,
0.02090662159025669,
0.04174860939383507,
-0.06830985844135284,
-0.1228044182062149,
-0.07938329875469208,
-0.04190897196531296,
0.02059212327003479,
0.042671848088502884,
0.04094722494482994,
-0.0382964201271534,
-0.00693678529933095,
-0.07239042222499847,
0.09304255992174149,
0.010033468715846539,
0.074547678232193,
0.020424356684088707,
-0.06741376221179962,
0.078260138630867,
-0.048904433846473694,
0.015125520527362823,
0.11088357865810394,
0.017712317407131195,
0.16046270728111267,
-0.0591668076813221,
0.15825232863426208,
0.028891922906041145,
0.04531759396195412,
0.06459153443574905,
0.10673131048679352,
-0.05396309867501259,
0.046914633363485336,
-0.0664331316947937,
-0.041598305106163025,
0.016695639118552208,
0.038008950650691986,
0.023212648928165436,
-0.03234202414751053,
-0.051023513078689575,
0.03363644331693649,
0.047747209668159485,
0.2362431436777115,
0.09180110692977905,
-0.1500311642885208,
-0.0837397351861,
0.02368799038231373,
-0.04877427965402603,
-0.09974703192710876,
-0.017528150230646133,
0.21252737939357758,
-0.0906403660774231,
-0.004505399148911238,
-0.03235464543104172,
0.06882259249687195,
-0.07716867327690125,
0.006265748757869005,
-0.03773633390665054,
0.04708070307970047,
-0.014989808201789856,
0.051193125545978546,
-0.1873476505279541,
0.09170354902744293,
0.04740634933114052,
0.015868766233325005,
-0.057453691959381104,
0.020343761891126633,
-0.003954372368752956,
-0.013224650174379349,
0.09865228831768036,
0.03123975545167923,
-0.09459465742111206,
-0.01766085997223854,
-0.126119464635849,
0.014618882909417152,
0.11870899796485901,
0.004076622426509857,
0.06043972074985504,
0.008631858043372631,
-0.000008853385224938393,
0.013614955358207226,
0.05925868824124336,
-0.06276525557041168,
-0.12131313979625702,
0.036684490740299225,
-0.0076995305716991425,
-0.016061954200267792,
-0.07244443893432617,
-0.05587906390428543,
-0.05748140811920166,
0.2198752760887146,
-0.045990586280822754,
-0.0010821856558322906,
-0.09274668991565704,
0.03909902647137642,
0.09956613183021545,
-0.06481007486581802,
0.08263947069644928,
0.00894048810005188,
0.12962603569030762,
-0.08980691432952881,
-0.09700819849967957,
0.08175969123840332,
-0.06383001059293747,
-0.1065749078989029,
-0.05132722482085228,
0.06088695675134659,
0.02884763665497303,
0.05674022063612938,
-0.0011605434119701385,
0.028770871460437775,
0.03373478725552559,
-0.031315334141254425,
-0.02695949375629425,
0.10350236296653748,
0.01641424372792244,
0.06372840702533722,
-0.12334302812814713,
-0.0012445282191038132,
0.002495860680937767,
0.029648642987012863,
0.10059099644422531,
0.15816155076026917,
-0.04651238024234772,
0.11431502550840378,
0.18474805355072021,
-0.0698201060295105,
-0.2700251042842865,
0.023525457829236984,
0.01659621298313141,
0.07966172695159912,
-0.03698232024908066,
-0.15422336757183075,
-0.0026683062314987183,
0.039631206542253494,
-0.014228052459657192,
0.04646765813231468,
-0.1635374277830124,
-0.11006931215524673,
0.02807443030178547,
0.031238028779625893,
0.16171416640281677,
-0.04192107170820236,
-0.030884088948369026,
-0.02079227939248085,
0.036677319556474686,
0.005537338554859161,
-0.007772423326969147,
0.09020009636878967,
0.028494708240032196,
-0.07034413516521454,
0.0248187817633152,
-0.03384453430771828,
0.10329791903495789,
-0.024849623441696167,
0.03839161992073059,
-0.07234392315149307,
0.006140273064374924,
0.07176561653614044,
-0.06829988211393356,
0.11456549167633057,
0.01841139607131481,
0.051044512540102005,
-0.029026057571172714,
-0.02408302202820778,
-0.05717175453901291,
0.039960622787475586,
-0.03950893133878708,
-0.03345419093966484,
-0.08679091185331345,
0.07197010517120361,
0.03141238912940025,
0.016497556120157242,
-0.014544511213898659,
-0.013798348605632782,
-0.06615320593118668,
0.17560753226280212,
0.060620617121458054,
0.02021058276295662,
-0.06207740306854248,
0.028026893734931946,
-0.03765709698200226,
0.12582112848758698,
-0.07147687673568726,
0.03843742236495018,
0.06604300439357758,
-0.021807961165905,
0.0988604798913002,
0.020561452955007553,
-0.16362935304641724,
-0.00693088723346591,
0.04738009721040726,
-0.10600744932889938,
-0.1401180773973465,
-0.02211666852235794,
0.00593886524438858,
-0.1355404108762741,
-0.02336890436708927,
0.12883694469928741,
-0.05048824101686478,
-0.00993817113339901,
-0.0018259286880493164,
0.06210190802812576,
-0.024432366713881493,
0.11360036581754684,
0.03295673802495003,
0.01234852708876133,
-0.06818147003650665,
0.1395614743232727,
0.08109870553016663,
-0.1567729264497757,
0.07657775282859802,
0.08367551863193512,
-0.09736108779907227,
-0.03944526985287666,
0.03847993165254593,
0.11642934381961823,
0.061701662838459015,
-0.03808709233999252,
-0.07248709350824356,
-0.05291295051574707,
0.051462046802043915,
0.15038876235485077,
0.016656406223773956,
0.05271691828966141,
-0.03325501084327698,
-0.0041345469653606415,
-0.05735253542661667,
0.06674272567033768,
0.020378155633807182,
0.040684863924980164,
0.015313010662794113,
0.12694679200649261,
-0.022952403873205185,
0.047665372490882874,
-0.026171911507844925,
-0.007085304707288742,
-0.06974778324365616,
-0.05657137185335159,
-0.0668497085571289,
-0.045228198170661926,
0.01746002398431301,
-0.048936836421489716,
-0.002437396440654993,
0.025394445285201073,
0.041087694466114044,
0.024551521986722946,
-0.03683791309595108,
-0.05504922568798065,
-0.07245498895645142,
0.005920875817537308,
-0.07207601517438889,
-0.013053510338068008,
0.05063280463218689,
-0.04582544416189194,
0.08779065310955048,
-0.030525147914886475,
0.0034756280947476625,
0.011585483327507973,
0.03702608495950699,
-0.033195581287145615,
0.03310443088412285,
0.019550886005163193,
0.01794128306210041,
-0.02549724467098713,
0.0017298292368650436,
-0.0010784715414047241,
0.022767001762986183,
-0.015641845762729645,
0.07425720989704132,
-0.047486141324043274,
0.08097359538078308,
-0.04526369646191597,
-0.02653207816183567,
-0.030215375125408173,
0.07996524125337601,
-0.0030143382027745247,
0.08866637945175171,
0.05468704178929329,
-0.04844578728079796,
0.036316197365522385,
-0.13481970131397247,
0.007727702613919973,
0.006647712085396051,
-0.015918679535388947,
0.030940886586904526,
-0.03943201154470444,
0.03834867104887962,
-0.017159568145871162,
0.03400688245892525,
0.034004196524620056,
-0.04422689601778984,
0.009821271523833275,
0.018081597983837128,
-0.12736213207244873,
0.019652308896183968,
0.07124647498130798,
0.0014422750100493431,
-0.027055997401475906,
-0.007047743070870638,
0.0436539500951767,
0.03675270080566406,
0.11182314157485962,
0.13358017802238464,
0.09861211478710175,
0.08010964095592499,
0.126968652009964,
0.03202097862958908,
-0.06162497028708458,
-0.058122679591178894,
0.004653142765164375,
-0.01353652123361826,
0.0960075855255127,
-0.023320458829402924,
0.03218178078532219,
0.07495348155498505,
-0.17147983610630035,
0.10886634886264801,
0.07824290543794632,
-0.07229262590408325,
-0.06604702025651932,
-0.18085892498493195,
-0.053675830364227295,
-0.006848620250821114,
-0.030011650174856186,
-0.09438654780387878,
-0.004357006400823593,
0.02924858033657074,
0.012814557179808617,
0.0030748024582862854,
0.14251478016376495,
-0.20544779300689697,
-0.05148910731077194,
0.06918097287416458,
0.00534472893923521,
0.028571290895342827,
0.06606179475784302,
-0.042677562683820724,
0.022489361464977264,
0.07477852702140808,
0.07502271234989166,
0.05121894180774689,
0.10941524058580399,
0.04738936200737953,
0.017042772844433784,
-0.07331173866987228,
-0.026010384783148766,
-0.04155803099274635,
0.03207356855273247,
0.2076655626296997,
0.041645802557468414,
-0.05340280383825302,
-0.013690153136849403,
0.04899130016565323,
-0.054403871297836304,
-0.051380377262830734,
-0.11810892820358276,
0.11175505071878433,
-0.02255885675549507,
0.00684838742017746,
-0.023491069674491882,
-0.09308294951915741,
-0.019753845408558846,
0.2003631889820099,
0.16178184747695923,
-0.001440062653273344,
-0.016940902918577194,
0.016423381865024567,
-0.005043430253863335,
0.029352230951189995,
0.09468148648738861,
0.02385684661567211,
0.16603590548038483,
-0.02412695251405239,
0.035780735313892365,
-0.005644273012876511,
-0.025737252086400986,
-0.06483645737171173,
0.05125819146633148,
-0.027547240257263184,
-0.0035405922681093216,
-0.049539875239133835,
0.027789462357759476,
0.02857702225446701,
-0.20552608370780945,
0.026229336857795715,
-0.08926866203546524,
-0.05284613370895386,
0.0010915454477071762,
-0.013351298868656158,
-0.0030099451541900635,
0.08741801977157593,
-0.00452790129929781,
0.07381396740674973,
0.19373735785484314,
0.003306555561721325,
-0.0815664678812027,
-0.0913861095905304,
0.050293851643800735,
-0.13466587662696838,
0.1526409387588501,
0.050512004643678665,
0.03798474371433258,
0.07306944578886032,
0.01361321285367012,
-0.060695040971040726,
0.011376645416021347,
-0.028477484360337257,
-0.024071846157312393,
-0.03837273269891739,
0.14800354838371277,
-0.043207261711359024,
0.08938580751419067,
0.015693150460720062,
-0.038474515080451965,
0.036426473408937454,
-0.007932299748063087,
-0.015909578651189804,
-0.08973710983991623,
0.07325950264930725,
-0.08246010541915894,
0.14312127232551575,
0.17106860876083374,
0.015231815166771412,
0.004850779660046101,
-0.0417427197098732,
-0.01524864137172699,
0.01239006407558918,
0.029233332723379135,
0.010623076930642128,
-0.10584702342748642,
-0.009432002902030945,
-0.15233080089092255,
0.0013638073578476906,
-0.2193625122308731,
-0.08501635491847992,
0.02627389132976532,
-0.06319930404424667,
-0.054690610617399216,
0.08979416638612747,
0.051649801433086395,
-0.002580491825938225,
-0.016911929473280907,
0.022932644933462143,
0.029056353494524956,
0.02989247627556324,
-0.0857112780213356,
-0.06406505405902863
] |
null | null |
transformers
|
# CLIP-Vision-BERT Multilingual VQA Model
Fine-tuned CLIP-Vision-BERT on translated [VQAv2](https://visualqa.org/challenge.html) image-text pairs using sequence classification objective. We translate the dataset to three other languages other than English: French, German, and Spanish using the [MarianMT Models](https://huggingface.co/transformers/model_doc/marian.html). This model is based on the VisualBERT which was introduced in
[this paper](https://arxiv.org/abs/1908.03557) and first released in
[this repository](https://github.com/uclanlp/visualbert). The output is 3129 class logits, the same classes as used by VisualBERT authors.
The initial weights are loaded from the Conceptual-12M 60k [checkpoints](https://huggingface.co/flax-community/clip-vision-bert-cc12m-60k).
We trained the CLIP-Vision-BERT VQA model during community week hosted by Huggingface 🤗 using JAX/Flax.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from the CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations❗️
This model is fine-tuned on a multi-translated version of the visual question answering task - [VQA v2](https://visualqa.org/challenge.html). Since VQAv2 is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use❓
You can use this model directly on visual question answering. You will need to clone the model from [here](https://github.com/gchhablani/multilingual-vqa). An example of usage is shown below:
```python
>>> from torchvision.io import read_image
>>> import numpy as np
>>> import os
>>> from transformers import CLIPProcessor, BertTokenizerFast
>>> from model.flax_clip_vision_bert.modeling_clip_vision_bert import FlaxCLIPVisionBertForSequenceClassification
>>> image_path = os.path.join('images/val2014', os.listdir('images/val2014')[0])
>>> img = read_image(image_path)
>>> clip_processor = CLIPProcessor.from_pretrained('openai/clip-vit-base-patch32')
ftfy or spacy is not installed using BERT BasicTokenizer instead of ftfy.
>>> clip_outputs = clip_processor(images=img)
>>> clip_outputs['pixel_values'][0] = clip_outputs['pixel_values'][0].transpose(1,2,0) # Need to transpose images as model expected channel last images.
>>> tokenizer = BertTokenizerFast.from_pretrained('bert-base-multilingual-uncased')
>>> model = FlaxCLIPVisionBertForSequenceClassification.from_pretrained('flax-community/clip-vision-bert-vqa-ft-6k')
>>> text = "Are there teddy bears in the image?"
>>> tokens = tokenizer([text], return_tensors="np")
>>> pixel_values = np.concatenate([clip_outputs['pixel_values']])
>>> outputs = model(pixel_values=pixel_values, **tokens)
>>> preds = outputs.logits[0]
>>> sorted_indices = np.argsort(preds)[::-1] # Get reverse sorted scores
>>> top_5_indices = sorted_indices[:5]
>>> top_5_tokens = list(map(model.config.id2label.get,top_5_indices))
>>> top_5_scores = preds[top_5_indices]
>>> print(dict(zip(top_5_tokens, top_5_scores)))
{'yes': 15.809224, 'no': 7.8785815, '<unk>': 4.622649, 'very': 4.511462, 'neither': 3.600822}
```
## Training data 🏋🏻♂️
The CLIP-Vision-BERT model was fine-tuned on the translated version of the VQAv2 dataset in four languages using Marian: English, French, German and Spanish. Hence, the dataset is four times the original English questions.
The dataset questions and image URLs/paths can be downloaded from [flax-community/multilingual-vqa](https://huggingface.co/datasets/flax-community/multilingual-vqa).
## Data Cleaning 🧹
Though the original dataset contains 443,757 train and 214,354 validation image-question pairs. We only use the `multiple_choice_answer`. The answers which are not present in the 3129 classes are mapped to the `<unk>` label.
**Splits**
We use the original train-val splits from the VQAv2 dataset. After translation, we get 1,775,028 train image-text pairs, and 857,416 validation image-text pairs.
## Training procedure 👨🏻💻
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with `[CLS]` and the end of one by `[SEP]`.
### Fine-tuning
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 6k steps with a per device batch size of 128 and a max sequence length of 128. The optimizer used is AdamW with a learning rate of 5e-5, learning rate warmup for 1600 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is link to main dashboard: [CLIP Vision BERT VQAv2 Fine-tuning Dashboard](https://huggingface.co/flax-community/multilingual-vqa-pt-60k-ft/tensorboard)
#### **Fine-tuning Results 📊**
The model at this checkpoint reached **eval accuracy of 0.49** on our multilingual VQAv2 dataset.
## Team Members
- Gunjan Chhablani [@gchhablani](https://hf.co/gchhablani)
- Bhavitvya Malik[@bhavitvyamalik](https://hf.co/bhavitvyamalik)
## Acknowledgements
We thank [Nilakshan Kunananthaseelan](https://huggingface.co/knilakshan20) for helping us whenever he could get a chance. We also thank [Abheesht Sharma](https://huggingface.co/abheesht) for helping in the discussions in the initial phases. [Luke Melas](https://github.com/lukemelas) helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of [Patrick](https://huggingface.co/patrickvonplaten) and [Suraj](https://huggingface.co/valhalla) who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface 🤗 & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{}
|
text-classification
|
flax-community/clip-vision-bert-vqa-ft-6k
|
[
"transformers",
"jax",
"clip-vision-bert",
"text-classification",
"arxiv:1908.03557",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.03557"
] |
[] |
TAGS
#transformers #jax #clip-vision-bert #text-classification #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# CLIP-Vision-BERT Multilingual VQA Model
Fine-tuned CLIP-Vision-BERT on translated VQAv2 image-text pairs using sequence classification objective. We translate the dataset to three other languages other than English: French, German, and Spanish using the MarianMT Models. This model is based on the VisualBERT which was introduced in
this paper and first released in
this repository. The output is 3129 class logits, the same classes as used by VisualBERT authors.
The initial weights are loaded from the Conceptual-12M 60k checkpoints.
We trained the CLIP-Vision-BERT VQA model during community week hosted by Huggingface using JAX/Flax.
## Model description
CLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from the CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.
## Intended uses & limitations️
This model is fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since VQAv2 is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use
You can use this model directly on visual question answering. You will need to clone the model from here. An example of usage is shown below:
## Training data ️
The CLIP-Vision-BERT model was fine-tuned on the translated version of the VQAv2 dataset in four languages using Marian: English, French, German and Spanish. Hence, the dataset is four times the original English questions.
The dataset questions and image URLs/paths can be downloaded from flax-community/multilingual-vqa.
## Data Cleaning
Though the original dataset contains 443,757 train and 214,354 validation image-question pairs. We only use the 'multiple_choice_answer'. The answers which are not present in the 3129 classes are mapped to the '<unk>' label.
Splits
We use the original train-val splits from the VQAv2 dataset. After translation, we get 1,775,028 train image-text pairs, and 857,416 validation image-text pairs.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'.
### Fine-tuning
The checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 6k steps with a per device batch size of 128 and a max sequence length of 128. The optimizer used is AdamW with a learning rate of 5e-5, learning rate warmup for 1600 steps, and linear decay of the learning rate after.
We tracked experiments using TensorBoard. Here is link to main dashboard: CLIP Vision BERT VQAv2 Fine-tuning Dashboard
#### Fine-tuning Results
The model at this checkpoint reached eval accuracy of 0.49 on our multilingual VQAv2 dataset.
## Team Members
- Gunjan Chhablani @gchhablani
- Bhavitvya Malik@bhavitvyamalik
## Acknowledgements
We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.
This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.
Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.
<img src=URL
|
[
"# CLIP-Vision-BERT Multilingual VQA Model\n\nFine-tuned CLIP-Vision-BERT on translated VQAv2 image-text pairs using sequence classification objective. We translate the dataset to three other languages other than English: French, German, and Spanish using the MarianMT Models. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. The output is 3129 class logits, the same classes as used by VisualBERT authors. \n\nThe initial weights are loaded from the Conceptual-12M 60k checkpoints.\n\nWe trained the CLIP-Vision-BERT VQA model during community week hosted by Huggingface using JAX/Flax.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from the CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nThis model is fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since VQAv2 is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly on visual question answering. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was fine-tuned on the translated version of the VQAv2 dataset in four languages using Marian: English, French, German and Spanish. Hence, the dataset is four times the original English questions.\n\nThe dataset questions and image URLs/paths can be downloaded from flax-community/multilingual-vqa.",
"## Data Cleaning \n\nThough the original dataset contains 443,757 train and 214,354 validation image-question pairs. We only use the 'multiple_choice_answer'. The answers which are not present in the 3129 classes are mapped to the '<unk>' label.\n\nSplits\nWe use the original train-val splits from the VQAv2 dataset. After translation, we get 1,775,028 train image-text pairs, and 857,416 validation image-text pairs.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'.",
"### Fine-tuning\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 6k steps with a per device batch size of 128 and a max sequence length of 128. The optimizer used is AdamW with a learning rate of 5e-5, learning rate warmup for 1600 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is link to main dashboard: CLIP Vision BERT VQAv2 Fine-tuning Dashboard",
"#### Fine-tuning Results \n\nThe model at this checkpoint reached eval accuracy of 0.49 on our multilingual VQAv2 dataset.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
"TAGS\n#transformers #jax #clip-vision-bert #text-classification #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# CLIP-Vision-BERT Multilingual VQA Model\n\nFine-tuned CLIP-Vision-BERT on translated VQAv2 image-text pairs using sequence classification objective. We translate the dataset to three other languages other than English: French, German, and Spanish using the MarianMT Models. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. The output is 3129 class logits, the same classes as used by VisualBERT authors. \n\nThe initial weights are loaded from the Conceptual-12M 60k checkpoints.\n\nWe trained the CLIP-Vision-BERT VQA model during community week hosted by Huggingface using JAX/Flax.",
"## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from the CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.",
"## Intended uses & limitations️\nThis model is fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since VQAv2 is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.",
"### How to use\nYou can use this model directly on visual question answering. You will need to clone the model from here. An example of usage is shown below:",
"## Training data ️\nThe CLIP-Vision-BERT model was fine-tuned on the translated version of the VQAv2 dataset in four languages using Marian: English, French, German and Spanish. Hence, the dataset is four times the original English questions.\n\nThe dataset questions and image URLs/paths can be downloaded from flax-community/multilingual-vqa.",
"## Data Cleaning \n\nThough the original dataset contains 443,757 train and 214,354 validation image-question pairs. We only use the 'multiple_choice_answer'. The answers which are not present in the 3129 classes are mapped to the '<unk>' label.\n\nSplits\nWe use the original train-val splits from the VQAv2 dataset. After translation, we get 1,775,028 train image-text pairs, and 857,416 validation image-text pairs.",
"## Training procedure ",
"### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a shared vocabulary size of approximately 110,000. The beginning of a new document is marked with '[CLS]' and the end of one by '[SEP]'.",
"### Fine-tuning\nThe checkpoint of the model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 6k steps with a per device batch size of 128 and a max sequence length of 128. The optimizer used is AdamW with a learning rate of 5e-5, learning rate warmup for 1600 steps, and linear decay of the learning rate after.\n\nWe tracked experiments using TensorBoard. Here is link to main dashboard: CLIP Vision BERT VQAv2 Fine-tuning Dashboard",
"#### Fine-tuning Results \n\nThe model at this checkpoint reached eval accuracy of 0.49 on our multilingual VQAv2 dataset.",
"## Team Members\n - Gunjan Chhablani @gchhablani\n - Bhavitvya Malik@bhavitvyamalik",
"## Acknowledgements\n We thank Nilakshan Kunananthaseelan for helping us whenever he could get a chance. We also thank Abheesht Sharma for helping in the discussions in the initial phases. Luke Melas helped us get the CC-12M data on our TPU-VMs and we are very grateful to him.\n\n This project would not be possible without the help of Patrick and Suraj who met with us frequently and helped review our approach and guided us throughout the project.\n\n Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week and for answering our queries on the Slack channel, and for providing us with the TPU-VMs.\n\n<img src=URL"
] |
[
51,
170,
80,
74,
36,
92,
116,
3,
59,
140,
35,
27,
156
] |
[
"passage: TAGS\n#transformers #jax #clip-vision-bert #text-classification #arxiv-1908.03557 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# CLIP-Vision-BERT Multilingual VQA Model\n\nFine-tuned CLIP-Vision-BERT on translated VQAv2 image-text pairs using sequence classification objective. We translate the dataset to three other languages other than English: French, German, and Spanish using the MarianMT Models. This model is based on the VisualBERT which was introduced in\nthis paper and first released in\nthis repository. The output is 3129 class logits, the same classes as used by VisualBERT authors. \n\nThe initial weights are loaded from the Conceptual-12M 60k checkpoints.\n\nWe trained the CLIP-Vision-BERT VQA model during community week hosted by Huggingface using JAX/Flax.## Model description\nCLIP-Vision-BERT is a modified BERT model which takes in visual embeddings from the CLIP-Vision transformer and concatenates them with BERT textual embeddings before passing them to the self-attention layers of BERT. This is done for deep cross-modal interaction between the two modes.## Intended uses & limitations️\nThis model is fine-tuned on a multi-translated version of the visual question answering task - VQA v2. Since VQAv2 is a dataset scraped from the internet, it will involve some biases which will also affect all fine-tuned versions of this model.### How to use\nYou can use this model directly on visual question answering. You will need to clone the model from here. An example of usage is shown below:## Training data ️\nThe CLIP-Vision-BERT model was fine-tuned on the translated version of the VQAv2 dataset in four languages using Marian: English, French, German and Spanish. Hence, the dataset is four times the original English questions.\n\nThe dataset questions and image URLs/paths can be downloaded from flax-community/multilingual-vqa."
] |
[
-0.05904626101255417,
0.041233282536268234,
-0.0028373657260090113,
0.06107411906123161,
0.055644817650318146,
-0.023833414539694786,
0.10133976489305496,
0.08485843986272812,
-0.023028068244457245,
0.07106523215770721,
-0.011045021004974842,
-0.008250607177615166,
0.10103298723697662,
0.11664120852947235,
0.050960130989551544,
-0.25092577934265137,
0.04088365286588669,
-0.06266438961029053,
-0.058570120483636856,
0.028937097638845444,
0.1094188541173935,
-0.12781284749507904,
0.10938511788845062,
0.0018063858151435852,
-0.00008240523311542347,
0.026769839227199554,
-0.05552758276462555,
-0.024512169882655144,
0.09202500432729721,
0.13671916723251343,
0.0623396560549736,
0.017677169293165207,
0.0586390383541584,
-0.14148534834384918,
0.026657966896891594,
0.08327233791351318,
0.012563660740852356,
0.08033081144094467,
0.12158891558647156,
0.08811227977275848,
-0.04953625425696373,
-0.06864926218986511,
0.06471826881170273,
0.079397052526474,
-0.08573142439126968,
-0.21770407259464264,
-0.13727819919586182,
0.048546578735113144,
0.12197940796613693,
0.016246043145656586,
-0.0018510675290599465,
0.051707468926906586,
0.01571185514330864,
0.05308682471513748,
0.160925030708313,
-0.2019345462322235,
-0.030225060880184174,
0.0416782908141613,
0.054448965936899185,
0.11466251313686371,
-0.07188218832015991,
0.043578434735536575,
0.03583763912320137,
0.02597646601498127,
0.004548323806375265,
-0.04709895700216293,
-0.06203809008002281,
-0.022121211513876915,
-0.137260302901268,
-0.0064679342322051525,
0.087547168135643,
-0.010853235609829426,
-0.07905346155166626,
-0.14413486421108246,
-0.04583127796649933,
0.060338977724313736,
-0.02347688004374504,
-0.08932995796203613,
0.05899985507130623,
-0.03264923766255379,
0.08951669186353683,
-0.10507237166166306,
-0.08922784775495529,
0.0058649019338190556,
0.020216580480337143,
0.003306517843157053,
0.03663311526179314,
0.04756154865026474,
-0.011150414124131203,
0.06563960760831833,
0.0005445959395729005,
-0.07085824757814407,
-0.042710285633802414,
-0.046480610966682434,
-0.08521399646997452,
-0.035753555595874786,
0.002443225122988224,
-0.13071952760219574,
-0.04048196226358414,
0.11094058305025101,
0.028779564425349236,
0.05284924432635307,
-0.06957407295703888,
0.014180256053805351,
0.10021057724952698,
0.14008581638336182,
-0.06154981628060341,
-0.05163581669330597,
0.05058003216981888,
-0.04622358828783035,
0.011019036173820496,
-0.05011271685361862,
-0.07165330648422241,
0.037966206669807434,
-0.0006206819089129567,
0.0562540739774704,
0.020141638815402985,
-0.01672086864709854,
-0.013262050226330757,
-0.07637016475200653,
0.16043917834758759,
-0.10404486954212189,
0.07933137565851212,
0.04016350209712982,
-0.02619522251188755,
0.09468530118465424,
0.07040870189666748,
-0.007034523878246546,
-0.09256139397621155,
-0.01856415718793869,
-0.0423186793923378,
-0.005786890629678965,
-0.08018050342798233,
-0.16914455592632294,
0.05495725944638252,
-0.010358347557485104,
-0.0885145291686058,
-0.09655644744634628,
-0.09034767746925354,
-0.07811222970485687,
0.02891717664897442,
-0.053796570748090744,
0.005127925891429186,
-0.01358383521437645,
0.04232875630259514,
-0.005021532066166401,
0.02064083144068718,
0.011479928158223629,
0.0144645469263196,
0.006737875286489725,
-0.0800158903002739,
0.0759296864271164,
-0.01166699081659317,
0.04475560784339905,
-0.025953426957130432,
0.007352745626121759,
-0.2138533741235733,
0.08731885999441147,
-0.06960714608430862,
-0.04044108837842941,
-0.11433061212301254,
-0.05773381516337395,
0.1093965694308281,
0.0188016127794981,
0.04593731090426445,
0.10335312783718109,
-0.2504590153694153,
-0.03147535026073456,
0.029414404183626175,
-0.12456979602575302,
-0.02892782725393772,
0.1534411907196045,
-0.031317390501499176,
0.07123396545648575,
0.07588206976652145,
0.10027788579463959,
0.13640663027763367,
-0.08404994010925293,
-0.04377291724085808,
-0.021298624575138092,
0.020471803843975067,
0.06392098218202591,
0.060425274074077606,
-0.031976692378520966,
0.07034855335950851,
-0.017602358013391495,
-0.04226411506533623,
-0.033610761165618896,
-0.023155024275183678,
-0.029410524293780327,
0.05328774452209473,
0.013105220161378384,
-0.041142333298921585,
-0.04066353291273117,
-0.0021240722853690386,
-0.010226132348179817,
0.005336263217031956,
0.07563355565071106,
0.04106771945953369,
-0.07161399722099304,
0.06196855753660202,
-0.018702581524848938,
0.027146749198436737,
-0.0781058594584465,
0.019729366526007652,
-0.07609886676073074,
-0.10567186772823334,
0.06538499891757965,
-0.04085184633731842,
0.03180462494492531,
0.09411687403917313,
0.029624979943037033,
0.06740181893110275,
0.01721804216504097,
-0.02499609813094139,
-0.001207477180287242,
-0.03524750471115112,
-0.06521894037723541,
-0.11961112916469574,
-0.08346956223249435,
-0.04500062018632889,
-0.002294739941135049,
-0.1214388832449913,
-0.008124012500047684,
0.02793595939874649,
0.1114722266793251,
0.0458444245159626,
-0.05327213555574417,
0.026235204190015793,
0.013927594758570194,
0.002042690757662058,
-0.047782644629478455,
0.02345617674291134,
0.0049601043574512005,
0.018939882516860962,
0.10654903948307037,
-0.08355060964822769,
-0.09283040463924408,
0.0217842236161232,
-0.05348001793026924,
-0.04885738343000412,
0.00472688116133213,
-0.010985344648361206,
-0.028550224378705025,
-0.11091802269220352,
-0.047713808715343475,
0.2203895002603531,
0.036054059863090515,
0.1273680180311203,
-0.08629095554351807,
-0.024673737585544586,
0.004640655592083931,
-0.030888717621564865,
-0.03853778913617134,
0.06316972523927689,
-0.02648599073290825,
-0.10596415400505066,
0.05928480997681618,
0.054907217621803284,
0.07844170928001404,
0.13637885451316833,
0.018296586349606514,
-0.04957873001694679,
-0.018567169085144997,
0.024903498589992523,
0.02364404872059822,
0.07554879784584045,
0.04741059988737106,
-0.00016811695240903646,
0.05459287390112877,
0.06871648877859116,
-0.017571724951267242,
-0.058546386659145355,
0.08469795435667038,
0.04231618344783783,
-0.023626431822776794,
0.0703599825501442,
-0.007443892303854227,
-0.0026871117297559977,
0.06335681676864624,
0.07852943986654282,
0.07065641134977341,
-0.028371768072247505,
-0.03771781548857689,
-0.1413789987564087,
0.14826545119285583,
-0.14831317961215973,
-0.2980491816997528,
-0.16307467222213745,
-0.06546463817358017,
-0.04335135221481323,
-0.01110832765698433,
0.02554319053888321,
-0.07337743043899536,
-0.11646757274866104,
-0.08476291596889496,
0.11686643958091736,
-0.008352821692824364,
-0.04744361340999603,
-0.02470707707107067,
-0.039640873670578,
0.02698204666376114,
-0.1627562940120697,
0.027539832517504692,
0.017450371757149696,
-0.04077601805329323,
-0.03073905222117901,
0.06485315412282944,
0.05605287849903107,
0.0353495329618454,
-0.00205473811365664,
0.01441239658743143,
-0.027516428381204605,
0.17122553288936615,
-0.06781087815761566,
0.12675261497497559,
0.19006764888763428,
-0.056656498461961746,
0.05127062648534775,
0.06930696219205856,
0.018189070746302605,
-0.039777543395757675,
-0.004603209905326366,
0.05534103885293007,
-0.03944683447480202,
-0.13831977546215057,
-0.05950856953859329,
-0.037500008940696716,
0.06677091866731644,
0.06057978421449661,
0.035883110016584396,
-0.08905622363090515,
-0.014502306468784809,
-0.07341127842664719,
0.03001219406723976,
0.03079262189567089,
0.09971731156110764,
0.084505595266819,
-0.061717402189970016,
0.058532025665044785,
-0.07596463710069656,
0.00416961032897234,
0.12394265830516815,
0.04265204071998596,
0.09005311131477356,
-0.022602681070566177,
0.11686109751462936,
0.050459735095500946,
0.04224827513098717,
0.045809246599674225,
0.09562420845031738,
-0.06364844739437103,
0.0007157403160817921,
-0.05051342025399208,
-0.06392604857683182,
0.024873577058315277,
0.048851024359464645,
-0.006546864286065102,
-0.04057350382208824,
-0.046020686626434326,
-0.06134166195988655,
0.02512173354625702,
0.17842234671115875,
0.07949104905128479,
-0.1150662899017334,
-0.04514691233634949,
0.04371838644146919,
-0.05237554386258125,
-0.11966627091169357,
-0.0019249648321419954,
0.11578067392110825,
-0.1551292985677719,
0.038132261484861374,
-0.01522514782845974,
0.09479133039712906,
-0.16200439631938934,
-0.017089344561100006,
-0.009280732832849026,
0.016110043972730637,
-0.023811688646674156,
0.06067582219839096,
-0.12699659168720245,
-0.017880843952298164,
0.030857698991894722,
0.008820063434541225,
-0.023849166929721832,
0.0418681763112545,
-0.013972424902021885,
-0.0006397086544893682,
0.08990184962749481,
0.03272426500916481,
0.04385320469737053,
-0.05700577050447464,
-0.060081709176301956,
0.029356699436903,
0.04890058934688568,
-0.032774265855550766,
0.04925636947154999,
-0.03661129251122475,
-0.004962712526321411,
-0.05232755094766617,
0.03150971233844757,
-0.08225473016500473,
-0.14134997129440308,
0.040759917348623276,
-0.03162342682480812,
0.06901074200868607,
-0.03391489014029503,
-0.06007923185825348,
-0.1350894719362259,
0.2095581591129303,
-0.09250044077634811,
-0.046120785176754,
-0.1388053297996521,
0.06318152695894241,
0.06533422321081161,
-0.0457288883626461,
0.06947243958711624,
-0.018177630379796028,
0.1474529355764389,
-0.08561272919178009,
-0.09171437472105026,
0.04052197188138962,
-0.11377216875553131,
-0.13902708888053894,
-0.019879624247550964,
0.07770226150751114,
0.056057367473840714,
0.01041698083281517,
0.0031054106075316668,
0.018735013902187347,
-0.01805996708571911,
-0.05616248399019241,
-0.0220172218978405,
0.15907685458660126,
0.00041543570114299655,
0.04717981070280075,
-0.061723995953798294,
-0.06857626885175705,
-0.0097038047388196,
-0.010886399075388908,
0.11755672097206116,
0.1533997803926468,
-0.04677918925881386,
0.1800367385149002,
0.14096291363239288,
-0.08672501146793365,
-0.25755575299263,
-0.007429208140820265,
0.04875991865992546,
0.04947032779455185,
-0.014059104025363922,
-0.1643896847963333,
0.07912582904100418,
0.01030078437179327,
0.00047307851491495967,
0.008611013181507587,
-0.16473139822483063,
-0.09523925930261612,
0.04165182635188103,
0.048530273139476776,
0.11065540462732315,
-0.038228198885917664,
-0.03392626345157623,
-0.028054654598236084,
-0.03979174420237541,
0.04766816645860672,
0.004402841441333294,
0.042826198041439056,
0.013328510336577892,
-0.03264494985342026,
0.029762551188468933,
-0.00819161906838417,
0.10411015152931213,
-0.056266408413648605,
0.04445858299732208,
-0.06048247590661049,
-0.012642163783311844,
-0.042403530329465866,
-0.06510596722364426,
0.10444578528404236,
0.04905744269490242,
0.036745451390743256,
-0.11055052280426025,
-0.004354450851678848,
-0.05322287231683731,
0.0409095473587513,
-0.07350348681211472,
-0.016842566430568695,
-0.08228020370006561,
0.06650570034980774,
0.055559396743774414,
0.021680094301700592,
0.05840156227350235,
-0.06669754534959793,
-0.07038931548595428,
0.15883073210716248,
0.0982440635561943,
0.02068599872291088,
-0.09121613949537277,
-0.008872206322848797,
-0.03077160008251667,
0.09241451323032379,
-0.08114916831254959,
0.0661550834774971,
0.07013573497533798,
-0.02643844112753868,
0.0957399532198906,
0.010681956075131893,
-0.128798246383667,
-0.020813511684536934,
0.03110462985932827,
-0.11524441838264465,
-0.1154395118355751,
-0.04512026533484459,
0.06327791512012482,
-0.1041283905506134,
0.0342925526201725,
0.20261450111865997,
-0.017648905515670776,
-0.055061761289834976,
0.008539698086678982,
0.06274738162755966,
-0.0019327045883983374,
0.07498905062675476,
0.034888315945863724,
0.022361813113093376,
-0.08002564311027527,
0.10239861160516739,
0.07658718526363373,
-0.06147973611950874,
0.04665869474411011,
0.17313119769096375,
-0.0785578191280365,
-0.0522390715777874,
0.04795025661587715,
0.22224612534046173,
-0.0850643590092659,
-0.040913358330726624,
-0.04761254042387009,
-0.07927127927541733,
0.029027743265032768,
0.1408691555261612,
-0.002603090601041913,
0.0016944584203884006,
-0.024518325924873352,
-0.028526751324534416,
-0.041104841977357864,
0.10336917638778687,
0.0017345895757898688,
0.01968233287334442,
-0.03154944255948067,
0.07465790957212448,
-0.015704136341810226,
0.0618002824485302,
-0.020896419882774353,
-0.05489978939294815,
-0.10345982760190964,
-0.048548102378845215,
-0.06805511564016342,
-0.0445161908864975,
0.0003181815263815224,
-0.01978406310081482,
-0.01349837239831686,
0.02168242074549198,
0.05343468487262726,
0.02543005533516407,
-0.016075270250439644,
-0.030212143436074257,
-0.06051890552043915,
0.08613307774066925,
-0.14099174737930298,
-0.029219677671790123,
0.015196631662547588,
-0.048468515276908875,
0.11486797779798508,
-0.06485840678215027,
-0.005368074402213097,
-0.03142890706658363,
0.005071860738098621,
-0.036633577197790146,
-0.0026133188512176275,
0.026231110095977783,
0.058276668190956116,
-0.10917867720127106,
-0.010668789967894554,
-0.03620327636599541,
-0.06470093131065369,
-0.01776670105755329,
0.09688715636730194,
-0.027585966512560844,
0.09061696380376816,
-0.03442685678601265,
-0.03755198046565056,
-0.04777326062321663,
0.09576816111803055,
-0.02063330076634884,
0.1008356511592865,
0.01966904103755951,
-0.05882404372096062,
0.08607039600610733,
-0.1123979464173317,
-0.026467399671673775,
0.016834061592817307,
-0.016411980614066124,
-0.019295886158943176,
-0.06789803504943848,
0.0323895663022995,
-0.005571484100073576,
0.045443613082170486,
0.07232026010751724,
0.03151499107480049,
0.01169374119490385,
-0.011353552341461182,
-0.07754883915185928,
0.03246977925300598,
0.023743009194731712,
-0.06475518643856049,
0.013139469549059868,
-0.0032284390181303024,
0.07191196084022522,
-0.010834219865500927,
0.05612047389149666,
0.04814831539988518,
0.08037756383419037,
0.06816335022449493,
0.07414495944976807,
0.02186228521168232,
-0.12479104101657867,
-0.009900061413645744,
-0.035315148532390594,
0.010093915276229382,
0.077330581843853,
-0.07133356481790543,
0.020593443885445595,
0.08282587677240372,
-0.1677633672952652,
0.09033031016588211,
0.04219304397702217,
-0.04932725057005882,
-0.08198972791433334,
-0.1470738798379898,
-0.050127241760492325,
-0.07073758542537689,
-0.03352795168757439,
-0.12599529325962067,
0.06379424035549164,
-0.022324740886688232,
0.053281527012586594,
0.00605346355587244,
0.09501399099826813,
-0.1437830924987793,
-0.09753008931875229,
0.03078550286591053,
0.03475000336766243,
0.09229818731546402,
0.03314097970724106,
0.04274686053395271,
0.01643657498061657,
0.06160523369908333,
0.04164599999785423,
0.06427188962697983,
0.12198591232299805,
0.03905685991048813,
-0.0019244550494477153,
-0.1063113808631897,
0.00029047252610325813,
-0.05218003690242767,
0.03311421349644661,
0.24150453507900238,
0.054005663841962814,
-0.028851937502622604,
-0.02116651087999344,
0.10534882545471191,
-0.06755919009447098,
-0.007839839905500412,
-0.12125590443611145,
0.1390482783317566,
-0.013237381353974342,
0.03952397406101227,
-0.036896172910928726,
-0.06728999316692352,
-0.039782870560884476,
0.2014600932598114,
0.14628906548023224,
-0.022081103175878525,
-0.013864169828593731,
0.01037165243178606,
-0.0026796588208526373,
0.025197817012667656,
0.07543739676475525,
0.029517846181988716,
0.2289751023054123,
-0.058860067278146744,
0.06708637624979019,
-0.016431612893939018,
-0.028809864073991776,
-0.08597572892904282,
0.11467666178941727,
-0.006390264257788658,
0.01511731930077076,
0.0008995045791380107,
0.07938332855701447,
0.0502215176820755,
-0.2667470872402191,
0.04359393194317818,
-0.07375025749206543,
-0.057975880801677704,
-0.04181600734591484,
0.0005165791953913867,
-0.018079724162817,
0.059994298964738846,
0.012685563415288925,
0.023890553042292595,
0.27383723855018616,
0.013445324264466763,
0.0017234808765351772,
-0.10167383402585983,
0.05645688623189926,
-0.26281842589378357,
0.14994382858276367,
0.03181366249918938,
0.002169019775465131,
0.07473476231098175,
0.00003260175071773119,
-0.07842762768268585,
-0.026011846959590912,
-0.0125251654535532,
-0.044250812381505966,
-0.03755966201424599,
0.12905146181583405,
-0.020860405638813972,
0.14305543899536133,
0.019085491076111794,
-0.1267625093460083,
0.04013865441083908,
0.011865195818245411,
-0.0490073636174202,
-0.05267176032066345,
0.09721733629703522,
-0.0938158929347992,
0.15418373048305511,
0.11420433223247528,
-0.01599404774606228,
-0.01862214133143425,
-0.04590167850255966,
0.009951570071280003,
0.023962387815117836,
0.0600757859647274,
0.045043233782052994,
-0.10906560719013214,
0.01270876545459032,
-0.1304759532213211,
0.05187822878360748,
-0.1523142158985138,
-0.051988210529088974,
0.004975336138159037,
-0.0370643176138401,
-0.025397969409823418,
0.07574335485696793,
0.0027326210401952267,
-0.01503707841038704,
-0.01546475850045681,
-0.03694984316825867,
0.029349999502301216,
0.07538453489542007,
-0.03242548182606697,
-0.03569618985056877
] |
null | null |
transformers
|
# CLIP-Vision-Marian Seq2Seq Encoder-Decoder Model
Pretrained CLIP-Vision-Marian pre-trained on a subset of Spanish-translated Conceptual-12M image-text pairs using a seq2seq model training objective. 2.5M cleaned English image-text pairs are translated using Spanish Marian Model. We trained CLIP-Vision-Marian model during community week hosted by Huggingface 🤗 using JAX/Flax.
## Model description
CLIP-Vision-Marian is a modified transformers model which takes in visual embeddings from CLIP-Vision transformer and feeds into the `encoder_hidden_states` of a Marian decoder. This is done for deep cross-modal interaction via `cross-attention` between the two modes. The decoder then predicts logits for the `input_ids` provided and can be used for generation.
## Intended uses & limitations❗️
You can use the raw model for encoder-decoder network where you want the encoder to encode images and the decoder to decode text.
Note that this model is primarily aimed at being fine-tuned on tasks like Spanish image captioning.
### How to use❓
You will need to clone the model from [here](https://github.com/bhavitvyamalik/spanish-image-captioning). An example of usage is shown below:
```python
>>> from torchvision.io import read_image
>>> import numpy as np
>>> import wget
>>> import os
>>> from transformers import CLIPProcessor, MarianTokenizer
>>> from models.flax_clip_vision_marian.modeling_clip_vision_marian import FlaxCLIPVisionMarianMT
img = wget.download("https://huggingface.co/streamlitiframe/flax-community/spanish-image-captioning/+/media/55a8898e61131569cc0ed4e72a8b3092969d63c2dff4f47ed9ef0d89.jpeg")
>>> img = read_image(img) # reading image
>>> clip_processor = CLIPProcessor.from_pretrained('flax-community/clip-vit-base-patch32_marian')
>>> clip_outputs = clip_processor(images=img)
>>> clip_outputs['pixel_values'][0] = clip_outputs['pixel_values'][0].transpose(1,2,0) # Need to transpose images as model expected channel last images.
>>> tokenizer = MarianTokenizer.from_pretrained('Helsinki-NLP/opus-mt-en-es')
>>> model = FlaxCLIPVisionMarianMT.from_pretrained('flax-community/clip-vit-base-patch32_marian-es')
>>> output_ids = model.generate(batch["pixel_values"], early_stopping=True, num_beams=4, max_length=64).sequences
>>> output_string = tokenizer.batch_decode(output_ids.reshape(-1, 64), skip_special_tokens=True, max_length=64)
>>> output_string
# Sopa de avena en un tazón blanco con arándanos frescos
```
## Training data 🏋🏻♂️
The Spanish image captioning model was trained on a subset of Conceptual 12M dataset by Google:
<br>
<br>
[Conceptual 12M](https://github.com/google-research-datasets/conceptual-12m), Introduced by Changpinyo et al. in [Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts](https://arxiv.org/abs/2102.08981).
### Please update the dataset link here
The translated dataset can be downloaded from [conceptual-12m-multilingual-marian-es](https://huggingface.co/datasets/flax-community/conceptual-12m-multilingual-marian-es). We do not provide images as we do not own any of them. One can download images from the `image_url` section of the original Conceptual 12M dataset.
## Data Cleaning 🧹
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs, out of which we took only 2.5M image, caption pairs.
#### **Train set:**
Total data: <br>
2475000 captions <br>
2475000 images <br>
#### **Validation set**
Total data: <br>
25000 captions <br>
25000 images <br>
## Training procedure 👨🏻💻
### Training
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments using Tensorboard which can be found in `Training Metrics` tab.
#### **Pretraining Results 📊**
Our model reached **eval loss of ~3.1** around ~20K steps. Here are the BLEU^ scores for different languages:
|Language |BLEU-1|BLEU-2|BLEU-3|BLEU-4|
|--------------------------|------|------|------|------|
|Spanish | 0.2015| 0.1348| 0.09982| 0.0748|
^BLEU scores are out of 1
## **App Demo**
You can try out our model on 🤗 Huggingface's spaces 🪐 :
[Streamlit app of Spanish Image Captioning model on Huggingface Spaces](https://huggingface.co/spaces/flax-community/spanish-image-captioning)
## Team Members
- Bhavitvya Malik [@bhavitvyamalik](https://github.com/bhavitvyamalik)
- Gunjan Chhablani [@gchhablani](https://github.com/gchhablani)
## Credits
Thanks to Huggingface 🤗 & Google JAX/Flax team for such a wonderful community week. Big thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) and [@patil-suraj](https://github.com/patil-suraj) for helping us with our solution during the community week.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{}
| null |
flax-community/clip-vit-base-patch32_marian-es
|
[
"transformers",
"jax",
"tensorboard",
"clip-vision-marian",
"arxiv:2102.08981",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2102.08981"
] |
[] |
TAGS
#transformers #jax #tensorboard #clip-vision-marian #arxiv-2102.08981 #endpoints_compatible #region-us
|
CLIP-Vision-Marian Seq2Seq Encoder-Decoder Model
================================================
Pretrained CLIP-Vision-Marian pre-trained on a subset of Spanish-translated Conceptual-12M image-text pairs using a seq2seq model training objective. 2.5M cleaned English image-text pairs are translated using Spanish Marian Model. We trained CLIP-Vision-Marian model during community week hosted by Huggingface using JAX/Flax.
Model description
-----------------
CLIP-Vision-Marian is a modified transformers model which takes in visual embeddings from CLIP-Vision transformer and feeds into the 'encoder\_hidden\_states' of a Marian decoder. This is done for deep cross-modal interaction via 'cross-attention' between the two modes. The decoder then predicts logits for the 'input\_ids' provided and can be used for generation.
Intended uses & limitations️
----------------------------
You can use the raw model for encoder-decoder network where you want the encoder to encode images and the decoder to decode text.
Note that this model is primarily aimed at being fine-tuned on tasks like Spanish image captioning.
### How to use
You will need to clone the model from here. An example of usage is shown below:
Training data ️
----------------
The Spanish image captioning model was trained on a subset of Conceptual 12M dataset by Google:
Conceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.
### Please update the dataset link here
The translated dataset can be downloaded from conceptual-12m-multilingual-marian-es. We do not provide images as we do not own any of them. One can download images from the 'image\_url' section of the original Conceptual 12M dataset.
Data Cleaning
-------------
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs, out of which we took only 2.5M image, caption pairs.
#### Train set:
Total data:
2475000 captions
2475000 images
#### Validation set
Total data:
25000 captions
25000 images
Training procedure
--------------------
### Training
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments using Tensorboard which can be found in 'Training Metrics' tab.
#### Pretraining Results
Our model reached eval loss of ~3.1 around ~20K steps. Here are the BLEU^ scores for different languages:
^BLEU scores are out of 1
App Demo
--------
You can try out our model on Huggingface's spaces :
Streamlit app of Spanish Image Captioning model on Huggingface Spaces
Team Members
------------
* Bhavitvya Malik @bhavitvyamalik
* Gunjan Chhablani @gchhablani
Credits
-------
Thanks to Huggingface & Google JAX/Flax team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.
<img src=URL
|
[
"### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Spanish image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.",
"### Please update the dataset link here\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian-es. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs, out of which we took only 2.5M image, caption pairs.",
"#### Train set:\n\n\nTotal data: \n\n2475000 captions \n\n2475000 images",
"#### Validation set\n\n\nTotal data: \n\n25000 captions \n\n25000 images \n\n\n\nTraining procedure \n--------------------",
"### Training\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments using Tensorboard which can be found in 'Training Metrics' tab.",
"#### Pretraining Results\n\n\nOur model reached eval loss of ~3.1 around ~20K steps. Here are the BLEU^ scores for different languages:\n\n\n\n^BLEU scores are out of 1\n\n\nApp Demo\n--------\n\n\nYou can try out our model on Huggingface's spaces :\nStreamlit app of Spanish Image Captioning model on Huggingface Spaces\n\n\nTeam Members\n------------\n\n\n* Bhavitvya Malik @bhavitvyamalik\n* Gunjan Chhablani @gchhablani\n\n\nCredits\n-------\n\n\nThanks to Huggingface & Google JAX/Flax team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.\n\n\n<img src=URL"
] |
[
"TAGS\n#transformers #jax #tensorboard #clip-vision-marian #arxiv-2102.08981 #endpoints_compatible #region-us \n",
"### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Spanish image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.",
"### Please update the dataset link here\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian-es. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs, out of which we took only 2.5M image, caption pairs.",
"#### Train set:\n\n\nTotal data: \n\n2475000 captions \n\n2475000 images",
"#### Validation set\n\n\nTotal data: \n\n25000 captions \n\n25000 images \n\n\n\nTraining procedure \n--------------------",
"### Training\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments using Tensorboard which can be found in 'Training Metrics' tab.",
"#### Pretraining Results\n\n\nOur model reached eval loss of ~3.1 around ~20K steps. Here are the BLEU^ scores for different languages:\n\n\n\n^BLEU scores are out of 1\n\n\nApp Demo\n--------\n\n\nYou can try out our model on Huggingface's spaces :\nStreamlit app of Spanish Image Captioning model on Huggingface Spaces\n\n\nTeam Members\n------------\n\n\n* Bhavitvya Malik @bhavitvyamalik\n* Gunjan Chhablani @gchhablani\n\n\nCredits\n-------\n\n\nThanks to Huggingface & Google JAX/Flax team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.\n\n\n<img src=URL"
] |
[
39,
98,
149,
15,
20,
145,
165
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #clip-vision-marian #arxiv-2102.08981 #endpoints_compatible #region-us \n### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Spanish image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.### Please update the dataset link here\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian-es. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs, out of which we took only 2.5M image, caption pairs.#### Train set:\n\n\nTotal data: \n\n2475000 captions \n\n2475000 images#### Validation set\n\n\nTotal data: \n\n25000 captions \n\n25000 images \n\n\n\nTraining procedure \n--------------------### Training\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments using Tensorboard which can be found in 'Training Metrics' tab."
] |
[
-0.11512580513954163,
0.2171517014503479,
0.0004548192082438618,
0.08665686100721359,
0.11913765221834183,
0.02738533541560173,
-0.009680497460067272,
0.1345434933900833,
-0.0884966179728508,
0.12005037814378738,
0.022210557013750076,
-0.02558297850191593,
0.07695647329092026,
0.12488462775945663,
0.015873638913035393,
-0.18869240581989288,
0.022496357560157776,
-0.04847327619791031,
-0.03220825642347336,
0.06607277691364288,
0.08152087777853012,
-0.07561127841472626,
0.07841970771551132,
-0.05445050075650215,
-0.10692624747753143,
-0.00542283384129405,
-0.04268767684698105,
-0.016053643077611923,
0.08307446539402008,
0.07413272559642792,
0.03434264287352562,
0.0055524264462292194,
0.03529508784413338,
-0.12839767336845398,
0.002098826924338937,
0.060662008821964264,
-0.015776006504893303,
0.04792247712612152,
0.09348637610673904,
-0.0013579617952927947,
0.10757332295179367,
-0.08403438329696655,
0.013085492886602879,
0.03354328125715256,
-0.09721090644598007,
-0.07577356696128845,
-0.14047573506832123,
0.04739832133054733,
0.13473311066627502,
0.10213652998209,
-0.030774952843785286,
0.056265104562044144,
-0.009167841635644436,
0.07093990594148636,
0.14064893126487732,
-0.17136742174625397,
-0.04447026923298836,
0.05618532374501228,
0.02736198529601097,
0.07250818610191345,
-0.05360942333936691,
0.02526049315929413,
0.08170104771852493,
0.015302510932087898,
-0.04020519182085991,
-0.010616343468427658,
-0.15930141508579254,
-0.031090829521417618,
-0.09867104142904282,
-0.011484162881970406,
0.22038044035434723,
0.01990996114909649,
-0.07702925056219101,
-0.06323760002851486,
-0.02640407159924507,
-0.10737472772598267,
0.02853436768054962,
0.014382434077560902,
0.004276973195374012,
0.008392494171857834,
-0.04391493648290634,
-0.06413330882787704,
-0.10733640938997269,
-0.03369618207216263,
-0.049174897372722626,
-0.06621178984642029,
0.027894625440239906,
0.045494627207517624,
-0.004483655095100403,
0.08061469346284866,
-0.12060032039880753,
-0.03714289143681526,
-0.0011095996014773846,
-0.013803710229694843,
-0.07995140552520752,
-0.03447848930954933,
-0.08509228378534317,
-0.17152784764766693,
-0.05107912793755531,
0.03908756747841835,
-0.08737358450889587,
0.053530316799879074,
-0.022943222895264626,
0.04580691456794739,
0.05519937723875046,
0.11231666803359985,
-0.05964198708534241,
-0.03669294714927673,
0.02412361092865467,
-0.029101042076945305,
0.06928043812513351,
-0.0036561505403369665,
-0.0689375251531601,
-0.014264828525483608,
0.0837448462843895,
0.06324681639671326,
0.019086312502622604,
0.0036527833435684443,
-0.07941737025976181,
-0.029535513371229172,
0.12101933360099792,
-0.09055949002504349,
0.08855420351028442,
-0.0065453737042844296,
-0.06009894609451294,
0.03757757693529129,
0.09350068867206573,
-0.028104064986109734,
-0.06326504051685333,
0.021966680884361267,
-0.07073108851909637,
-0.011804179288446903,
-0.0881476178765297,
-0.08540567755699158,
0.08226492255926132,
-0.10956087708473206,
-0.06778041273355484,
-0.08632643520832062,
-0.14753475785255432,
-0.05910103768110275,
0.021773187443614006,
-0.09650950878858566,
0.008281252346932888,
-0.025838693603873253,
-0.05161796510219574,
0.011617355048656464,
0.03545387089252472,
0.10310833901166916,
0.0068740639835596085,
0.0658845528960228,
-0.025682218372821808,
0.0768967792391777,
-0.025026215240359306,
0.022860709577798843,
-0.08891195058822632,
0.03960848227143288,
-0.1920936554670334,
0.08777650445699692,
-0.05508267879486084,
0.004332053009420633,
-0.13547880947589874,
-0.02093668095767498,
-0.09738736599683762,
0.013236098922789097,
0.09666237980127335,
0.12606105208396912,
-0.25496557354927063,
-0.0059407358057796955,
0.17391769587993622,
-0.14514628052711487,
-0.0641523227095604,
0.1277359277009964,
0.0021945531480014324,
0.06254766881465912,
0.05854623019695282,
0.06123512238264084,
0.0032411585561931133,
-0.07999628782272339,
-0.12686188519001007,
-0.10216794162988663,
0.027263158932328224,
-0.012394461780786514,
0.029835252091288567,
-0.05897114798426628,
0.06365368515253067,
0.008596899919211864,
0.031589049845933914,
0.014800493605434895,
-0.018654050305485725,
-0.06535294651985168,
-0.020475436002016068,
-0.07033368945121765,
-0.023151226341724396,
0.05882034823298454,
0.02791946940124035,
-0.04993481934070587,
-0.08747754245996475,
-0.053679268807172775,
0.08109254390001297,
-0.052239418029785156,
0.06058954447507858,
-0.021940015256404877,
0.07889871299266815,
-0.13264532387256622,
0.008356424048542976,
-0.14259073138237,
-0.09406287223100662,
0.04760906472802162,
-0.05668850988149643,
0.019230619072914124,
-0.0851387158036232,
0.0482274666428566,
0.033395152539014816,
-0.087002694606781,
-0.013800548389554024,
0.01002868078649044,
-0.02383526973426342,
-0.10763512551784515,
-0.13885608315467834,
-0.015752092003822327,
-0.036402732133865356,
0.1360877901315689,
-0.1636124700307846,
-0.020118722692131996,
0.10332423448562622,
0.14337660372257233,
0.020552195608615875,
-0.10037164390087128,
0.036346711218357086,
0.046032700687646866,
-0.03883391618728638,
-0.0934830904006958,
0.010835958644747734,
0.027275005355477333,
-0.05451720952987671,
0.06906861811876297,
-0.1635788232088089,
-0.186257004737854,
0.10579761117696762,
-0.032906632870435715,
-0.1515836864709854,
-0.009234962053596973,
-0.008792937733232975,
-0.06220915541052818,
-0.09233250468969345,
-0.07704625278711319,
0.014751448296010494,
0.018853746354579926,
0.060247644782066345,
-0.06876453757286072,
-0.01669851876795292,
0.038940705358982086,
-0.005051791202276945,
-0.11911946535110474,
0.0735996812582016,
0.11235292255878448,
-0.17684617638587952,
0.07232679426670074,
0.033128462731838226,
-0.06829963624477386,
0.10575058311223984,
0.0035204074811190367,
-0.12307910621166229,
-0.04318011552095413,
0.05399981886148453,
0.04442638158798218,
0.16071513295173645,
-0.025916121900081635,
-0.00029643630841746926,
0.017416557297110558,
-0.02378053404390812,
0.07096315920352936,
-0.1163744106888771,
0.04823589324951172,
0.03544209524989128,
0.01087136846035719,
0.04101456701755524,
-0.046884529292583466,
-0.05257732793688774,
0.07249239087104797,
-0.004515236243605614,
0.0642409473657608,
0.04390985146164894,
-0.008330164477229118,
-0.1206144168972969,
0.15159006416797638,
-0.10096515715122223,
-0.18974293768405914,
-0.12614621222019196,
0.11617111414670944,
-0.035358984023332596,
-0.004958237521350384,
0.016354624181985855,
-0.0896429717540741,
-0.09226122498512268,
-0.08731075376272202,
-0.0376443937420845,
-0.05003213882446289,
-0.03455257788300514,
-0.023214377462863922,
-0.007412445731461048,
0.03615989163517952,
-0.07934066653251648,
0.029432836920022964,
0.03812836855649948,
-0.04487071931362152,
0.03167781978845596,
-0.041642073541879654,
0.14709942042827606,
0.09783606976270676,
-0.03596562519669533,
0.02840244211256504,
0.0059735155664384365,
0.12148741632699966,
-0.0704459473490715,
0.10895370692014694,
0.06662001460790634,
0.011842872947454453,
0.06032181903719902,
0.04533493146300316,
-0.0081710834056139,
-0.05172199010848999,
0.022111130878329277,
0.0877281054854393,
-0.05772717669606209,
-0.23331597447395325,
-0.06693155318498611,
-0.037798814475536346,
-0.03516281023621559,
0.04844961315393448,
0.06137099489569664,
0.004177821334451437,
0.07528350502252579,
-0.04855680838227272,
0.05982566624879837,
-0.022786637768149376,
0.07109581679105759,
0.018135258927941322,
-0.008043638430535793,
0.03619422763586044,
-0.0897921621799469,
0.026800811290740967,
0.13185729086399078,
0.020256176590919495,
0.24312560260295868,
-0.04616990685462952,
0.15358299016952515,
0.007994875311851501,
0.09883999824523926,
-0.03496652469038963,
0.058242976665496826,
-0.05241689458489418,
0.010885568335652351,
-0.0123027004301548,
-0.07925675064325333,
0.004246579948812723,
0.08810058981180191,
0.048051100224256516,
-0.031841494143009186,
-0.07274780422449112,
0.035382963716983795,
0.03942485526204109,
0.1290690004825592,
0.03431176021695137,
-0.21055833995342255,
-0.04119153693318367,
0.022567549720406532,
0.061276353895664215,
-0.06645902246236801,
-0.009224696084856987,
0.15098577737808228,
-0.060774095356464386,
0.03951714560389519,
-0.04595436528325081,
0.07985863089561462,
-0.11836494505405426,
-0.053560771048069,
0.05066588148474693,
0.003378952154889703,
-0.008557607419788837,
0.0721115991473198,
-0.14734920859336853,
0.09977930784225464,
0.04359520971775055,
0.08027403056621552,
-0.07293035835027695,
0.0024269409477710724,
-0.0037551599089056253,
-0.05210462212562561,
0.10960520058870316,
0.032462235540151596,
-0.21415556967258453,
-0.06664547324180603,
-0.08048306405544281,
-0.019045019522309303,
0.09007512032985687,
0.02274719439446926,
0.08416780829429626,
0.008513533510267735,
-0.0025485148653388023,
0.019991688430309296,
0.08642510324716568,
-0.10738499462604523,
-0.18706199526786804,
0.027197394520044327,
0.007440653163939714,
-0.04523901641368866,
-0.033330101519823074,
-0.05722171440720558,
-0.07206602394580841,
0.22300732135772705,
-0.04615310952067375,
-0.055384520441293716,
-0.12531140446662903,
0.0252486951649189,
0.1506236493587494,
-0.04703943803906441,
0.04060744494199753,
0.04000277444720268,
0.07869775593280792,
-0.02592538855969906,
-0.0697120651602745,
0.11420806497335434,
-0.06567493081092834,
-0.12673112750053406,
-0.08781762421131134,
0.07926703244447708,
0.020080504938960075,
0.030909862369298935,
-0.025571566075086594,
0.04343966022133827,
0.020382195711135864,
-0.06906771659851074,
0.07292747497558594,
0.11563286185264587,
0.06758516281843185,
0.07882378995418549,
-0.08087034523487091,
0.004912137519568205,
-0.013307983987033367,
-0.10158658027648926,
0.07208605855703354,
0.20629528164863586,
-0.07082804292440414,
0.04357603192329407,
0.10779634863138199,
-0.09705273061990738,
-0.20497097074985504,
0.06749678403139114,
0.007517458871006966,
0.010137063451111317,
-0.0008372208685614169,
-0.14181052148342133,
0.00919802300632,
0.08955220133066177,
-0.02437361143529415,
0.07311771810054779,
-0.283462256193161,
-0.11075551062822342,
0.009278441779315472,
0.07465815544128418,
0.11440572142601013,
-0.1537967324256897,
-0.053479671478271484,
-0.04212411493062973,
-0.09262304753065109,
0.11423894017934799,
-0.030839666724205017,
0.07802997529506683,
0.02264309488236904,
0.035549264401197433,
0.0485835038125515,
-0.06907229870557785,
0.12265818566083908,
-0.034193411469459534,
0.0560920275747776,
-0.0516589879989624,
-0.012490778230130672,
0.13245941698551178,
-0.05577755719423294,
0.11081145703792572,
0.014848204329609871,
0.049571339040994644,
-0.06799247860908508,
-0.005544560495764017,
-0.04681482911109924,
0.018563581630587578,
-0.06833839416503906,
-0.005364688578993082,
-0.03870534151792526,
0.08846527338027954,
0.06345660239458084,
-0.011715398170053959,
-0.011008004657924175,
0.040373850613832474,
0.003641418181359768,
0.19703970849514008,
0.052785392850637436,
0.12611910700798035,
-0.05804847553372383,
0.0015702182427048683,
-0.05008372291922569,
0.08333197981119156,
-0.09911522269248962,
0.038969602435827255,
0.09477318823337555,
0.034681227058172226,
0.12446889281272888,
0.010499485768377781,
-0.1402415633201599,
0.03119257092475891,
0.08178071677684784,
-0.09030325710773468,
-0.13085037469863892,
-0.03561224415898323,
-0.005126108415424824,
-0.06727674603462219,
0.0048195491544902325,
0.07870784401893616,
-0.02850450575351715,
-0.01581016182899475,
-0.015498344786465168,
0.11525698751211166,
0.009285405278205872,
0.15798301994800568,
0.03795468434691429,
-0.019715458154678345,
-0.06990032643079758,
0.19086714088916779,
0.11235854774713516,
-0.23806419968605042,
0.11027015000581741,
0.11514408886432648,
-0.03226381167769432,
-0.04540308937430382,
0.06922844797372818,
0.0784517377614975,
-0.13123276829719543,
-0.06403391063213348,
-0.02963755838572979,
-0.02745615318417549,
-0.007639707997441292,
0.024069437757134438,
-0.014866521582007408,
0.06723421812057495,
0.03587699308991432,
-0.0045939297415316105,
-0.13914665579795837,
0.11207867413759232,
0.1040513888001442,
0.0682787373661995,
-0.10038639605045319,
0.06744351983070374,
-0.019101940095424652,
-0.03248000517487526,
-0.002339730504900217,
0.01972278766334057,
-0.09554585814476013,
-0.030711695551872253,
-0.08629399538040161,
0.016519568860530853,
-0.027210228145122528,
-0.041473764926195145,
-0.01558131817728281,
0.005930806044489145,
-0.005576055962592363,
0.022384434938430786,
-0.04848635569214821,
-0.05945831537246704,
-0.02331935055553913,
0.0208051186054945,
-0.1201360672712326,
-0.0359143503010273,
0.018165355548262596,
-0.046511854976415634,
0.05849922075867653,
0.05951273441314697,
0.024685731157660484,
-0.011105158366262913,
-0.03845197334885597,
0.044401973485946655,
0.018131574615836143,
-0.028297096490859985,
0.005270914640277624,
-0.11885756999254227,
-0.007057724054902792,
0.01849033497273922,
-0.03829976171255112,
0.0033748589921742678,
0.05562407523393631,
-0.11620911210775375,
0.017869435250759125,
-0.0683516263961792,
-0.058318424969911575,
-0.058932092040777206,
0.11445360630750656,
0.10616174340248108,
0.025374555960297585,
0.07723607867956161,
-0.08347165584564209,
0.009944822639226913,
-0.11727119237184525,
0.01543593779206276,
0.004959359765052795,
0.01199094858020544,
-0.05669950693845749,
0.04347306489944458,
0.061979588121175766,
-0.07316923886537552,
0.07075528800487518,
-0.020922835916280746,
0.027188366279006004,
0.0019234680803492665,
0.03420785814523697,
-0.0732407346367836,
0.011717907153069973,
0.11456357687711716,
-0.06359846144914627,
-0.026720307767391205,
-0.04369919002056122,
0.06059100851416588,
0.04505178704857826,
0.0914045050740242,
0.12279170006513596,
0.06626559048891068,
0.06385062634944916,
0.12891331315040588,
0.00024964011390693486,
-0.09999050945043564,
-0.08521926403045654,
0.14683887362480164,
-0.026612721383571625,
0.08313759416341782,
-0.06817308068275452,
0.06312953680753708,
0.12718935310840607,
-0.15892837941646576,
0.07558934390544891,
-0.031899988651275635,
-0.07521150261163712,
-0.0439206101000309,
-0.21521440148353577,
-0.0630631148815155,
-0.035323821008205414,
-0.0009925247868523002,
-0.09154481440782547,
0.05916503444314003,
0.03829430043697357,
0.007157017942517996,
-0.05458421632647514,
0.1418626755475998,
-0.07948511093854904,
-0.04945475980639458,
0.04181702062487602,
0.04402267560362816,
0.0034843184985220432,
0.04407315328717232,
0.025158554315567017,
0.07741734385490417,
0.042367976158857346,
0.0660046860575676,
0.07904002815485,
0.09430854022502899,
0.04784448444843292,
0.048650920391082764,
-0.0799732506275177,
-0.04641280323266983,
0.006867039017379284,
0.04477351903915405,
0.1382569670677185,
0.039911676198244095,
0.00924990139901638,
-0.03442307189106941,
0.1633511185646057,
-0.09521014243364334,
0.055472176522016525,
-0.10466790199279785,
0.11167213320732117,
-0.011450793594121933,
-0.018400732427835464,
0.019672565162181854,
-0.18057391047477722,
0.06258589029312134,
0.12381797283887863,
0.09632710367441177,
-0.011485099792480469,
-0.05103876069188118,
0.01687389798462391,
-0.011971309781074524,
-0.02860884554684162,
0.05187936872243881,
0.010874197818338871,
0.19609050452709198,
-0.06149458512663841,
0.09486793726682663,
-0.022795813158154488,
-0.015940049663186073,
-0.0746331512928009,
0.14330236613750458,
-0.012913218699395657,
0.008808992803096771,
-0.053847167640924454,
0.04908031225204468,
-0.0159365925937891,
-0.1692819893360138,
0.03458582982420921,
-0.14441736042499542,
-0.1142013892531395,
0.013617773540318012,
0.08849319070577621,
0.00013324109022505581,
0.06836199015378952,
-0.02195816859602928,
0.04938690736889839,
0.11569096148014069,
0.012725135311484337,
-0.10875346511602402,
-0.08527263253927231,
-0.01525450125336647,
-0.08188479393720627,
0.19555853307247162,
0.03095901943743229,
0.029310103505849838,
0.09232024103403091,
0.01060358714312315,
-0.13416413962841034,
-0.009048886597156525,
0.013820113614201546,
-0.0683005079627037,
0.03812532499432564,
0.1922616809606552,
-0.02838238514959812,
0.11579868942499161,
0.07983796298503876,
0.02660144865512848,
0.017429348081350327,
0.03126739338040352,
0.023232830688357353,
-0.09340322017669678,
0.05839146673679352,
-0.08765536546707153,
0.1878437101840973,
0.16236095130443573,
-0.03162793442606926,
0.008997177705168724,
-0.06327765434980392,
-0.0025009787641465664,
0.014429324306547642,
0.10180819034576416,
0.03864823281764984,
-0.14323481917381287,
0.04094228148460388,
0.0015484921168535948,
0.10171794891357422,
-0.19449585676193237,
-0.05661934241652489,
0.015423339791595936,
-0.051016051322221756,
-0.0796804279088974,
0.09083335101604462,
0.11851276457309723,
0.0295079555362463,
-0.036702677607536316,
-0.013651754707098007,
0.006001051515340805,
0.061806242913007736,
-0.07166867703199387,
-0.05518596991896629
] |
null | null |
transformers
|
# CLIP-Vision-mBART50 Seq2Seq Encoder-Decoder Model
Pretrained CLIP-Vision-mBART50 pre-trained on subset of translated Conceptual-12M image-text pairs using a seq2seq model training objective. 2.5M cleaned English image-text pairs are translated using Marian Model for respective languages to 2.5M examples each in English, French, German and Spanish. We trained CLIP-Vision-mBART50 model during community week hosted by Huggingface 🤗 using JAX/Flax.
## Model description
CLIP-Vision-mBART50 is a modified transformers model which takes in visual embeddings from CLIP-Vision transformer and feeds into the `encoder_hidden_states` of a mBART50 decoder. This is done for deep cross-modal interaction via `cross-attention` between the two modes. The decoder then predicts logits for the `input_ids` provided and can be used for generation.
## Intended uses & limitations❗️
You can use the raw model for encoder decoder network where you want the encoder to encode images and decoder to decode text.
Note that this model is primarily aimed at being fine-tuned on tasks like multi-lingual/mono-lingual image captioning.
### How to use❓
You will need to clone the model from [here](https://github.com/gchhablani/multilingual-image-captioning). An example of usage is shown below:
```python
from torchvision.io import read_image
import numpy as np
import os, wget
from transformers import CLIPProcessor, MBart50TokenizerFast
from model.flax_clip_vision_mbart.modeling_clip_vision_mbart import FlaxCLIPVisionMBartForConditionalGeneration
img = wget("http://images.cocodataset.org/val2017/000000397133.jpg")
img = read_image(img) # reading image
clip_processor = CLIPProcessor.from_pretrained('openai/clip-vit-base-patch32')
clip_outputs = clip_processor(images=img)
clip_outputs['pixel_values'][0] = clip_outputs['pixel_values'][0].transpose(1,2,0) # Need to transpose images as model expected channel last images.
tokenizer = MBart50TokenizerFast.from_pretrained('facebook/mbart-large-50"')
model = FlaxCLIPVisionBertForMaskedLM.from_pretrained('flax-community/clip-vit-base-patch32_mbart-large-50')
output_ids = model.generate(batch["pixel_values"], forced_bos_token_id=tokenizer.lang_code_to_id["es_XX"], num_beams=4, max_length=64).sequences # "es_XX is the language code in which you want the translation
# en_XX: English, fr_XX: French, es_XX: Spanish, de_DE: Deutsch
output_string = tokenizer.batch_decode(output_ids.reshape(-1, 64), skip_special_tokens=True, max_length=64)
output_string # Un restaurante u otro lugar para comer en el Hotel
```
## Training data 🏋🏻♂️
The Multi-lingual image captioning model was trained on a subset of Conceptual 12M dataset by Google:
<br>
<br>
[Conceptual 12M](https://github.com/google-research-datasets/conceptual-12m), Introduced by Changpinyo et al. in [Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts](https://arxiv.org/abs/2102.08981).
The translated dataset can be downloaded from [conceptual-12m-multilingual-marian](https://huggingface.co/datasets/flax-community/conceptual-12m-multilingual-marian). We do not provide images as we do not own any of them. One can download images from the `image_url` section of the original Conceptual 12M dataset.
## Data Cleaning 🧹
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
#### **Train set:**
Total data: 10010625 captions, 2502656 images <br>
Language-wise captions distribution: <br>
English: 2502656<br>
Spanish: 2502656<br>
Deutsch: 2502656<br>
French: 2502656<br>
#### **Validation set**
Total data: 110592 captions, 27648 images <br>
Language-wise captions distribution: <br>
English: 27648<br>
Spanish: 27648<br>
Deutsch: 27648<br>
French: 27648<br>
## Training procedure 👨🏻💻
### Training
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments using Tensorboard which can be found in `Training Metrics` tab. BLEU scores for languages other than English might be wrongly tracked but the model gives good performance in other languages too as evident from the evaluation scores.
#### **Pretraining Results 📊**
Our model reached **eval loss of ~2.6** around ~60k steps. Here are the BLEU scores (out of 1) for different languages:
|Language |BLEU-1|BLEU-2|BLEU-3|BLEU-4|
|--------------------------|------|------|------|------|
|English | 0.13083| 0.08887| 0.06681 | 0.04899|
|Spanish | 0.15981| 0.09858| 0.06918| 0.04776|
|German | 0.14234| 0.09817| 0.07405| 0.0515|
|French | 0.13021| 0.08862| 0.06598| 0.04647|
Model used: ckpt-51999/
In order to reproduce the results, one can use the [evaluation script](https://github.com/gchhablani/multilingual-image-captioning/blob/main/evaluation.py) available in this project's repository.
## **App Demo**
You can try out our model on 🤗 Huggingface's spaces 🪐 :
[Streamlit app of Multi-lingual Image Captioning model on Huggingface Spaces](https://huggingface.co/spaces/flax-community/multilingual-image-captioning)
## Team Members
- Bhavitvya Malik [@bhavitvyamalik](https://github.com/bhavitvyamalik)
- Gunjan Chhablani [@gchhablani](https://github.com/gchhablani)
## Credits
Thanks to Huggingface 🤗 & Google JAX/FLAX team for such a wonderful community week. Big thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) and [@patil-suraj](https://github.com/patil-suraj) for helping us with our solution during the community week.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{}
|
text2text-generation
|
flax-community/clip-vit-base-patch32_mbart-large-50
|
[
"transformers",
"jax",
"tensorboard",
"clip-vision-mbart",
"text2text-generation",
"arxiv:2102.08981",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2102.08981"
] |
[] |
TAGS
#transformers #jax #tensorboard #clip-vision-mbart #text2text-generation #arxiv-2102.08981 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
CLIP-Vision-mBART50 Seq2Seq Encoder-Decoder Model
=================================================
Pretrained CLIP-Vision-mBART50 pre-trained on subset of translated Conceptual-12M image-text pairs using a seq2seq model training objective. 2.5M cleaned English image-text pairs are translated using Marian Model for respective languages to 2.5M examples each in English, French, German and Spanish. We trained CLIP-Vision-mBART50 model during community week hosted by Huggingface using JAX/Flax.
Model description
-----------------
CLIP-Vision-mBART50 is a modified transformers model which takes in visual embeddings from CLIP-Vision transformer and feeds into the 'encoder\_hidden\_states' of a mBART50 decoder. This is done for deep cross-modal interaction via 'cross-attention' between the two modes. The decoder then predicts logits for the 'input\_ids' provided and can be used for generation.
Intended uses & limitations️
----------------------------
You can use the raw model for encoder decoder network where you want the encoder to encode images and decoder to decode text.
Note that this model is primarily aimed at being fine-tuned on tasks like multi-lingual/mono-lingual image captioning.
### How to use
You will need to clone the model from here. An example of usage is shown below:
Training data ️
----------------
The Multi-lingual image captioning model was trained on a subset of Conceptual 12M dataset by Google:
Conceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.
The translated dataset can be downloaded from conceptual-12m-multilingual-marian. We do not provide images as we do not own any of them. One can download images from the 'image\_url' section of the original Conceptual 12M dataset.
Data Cleaning
-------------
Though the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.
#### Train set:
Total data: 10010625 captions, 2502656 images
Language-wise captions distribution:
English: 2502656
Spanish: 2502656
Deutsch: 2502656
French: 2502656
#### Validation set
Total data: 110592 captions, 27648 images
Language-wise captions distribution:
English: 27648
Spanish: 27648
Deutsch: 27648
French: 27648
Training procedure
--------------------
### Training
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments using Tensorboard which can be found in 'Training Metrics' tab. BLEU scores for languages other than English might be wrongly tracked but the model gives good performance in other languages too as evident from the evaluation scores.
#### Pretraining Results
Our model reached eval loss of ~2.6 around ~60k steps. Here are the BLEU scores (out of 1) for different languages:
Model used: ckpt-51999/
In order to reproduce the results, one can use the evaluation script available in this project's repository.
App Demo
--------
You can try out our model on Huggingface's spaces :
Streamlit app of Multi-lingual Image Captioning model on Huggingface Spaces
Team Members
------------
* Bhavitvya Malik @bhavitvyamalik
* Gunjan Chhablani @gchhablani
Credits
-------
Thanks to Huggingface & Google JAX/FLAX team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.
<img src=URL
|
[
"### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Multi-lingual image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.",
"#### Train set:\n\n\nTotal data: 10010625 captions, 2502656 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 2502656 \n\nSpanish: 2502656 \n\nDeutsch: 2502656 \n\nFrench: 2502656",
"#### Validation set\n\n\nTotal data: 110592 captions, 27648 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 27648 \n\nSpanish: 27648 \n\nDeutsch: 27648 \n\nFrench: 27648 \n\n\n\nTraining procedure \n--------------------",
"### Training\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments using Tensorboard which can be found in 'Training Metrics' tab. BLEU scores for languages other than English might be wrongly tracked but the model gives good performance in other languages too as evident from the evaluation scores.",
"#### Pretraining Results\n\n\nOur model reached eval loss of ~2.6 around ~60k steps. Here are the BLEU scores (out of 1) for different languages:\n\n\n\nModel used: ckpt-51999/\n\n\nIn order to reproduce the results, one can use the evaluation script available in this project's repository.\n\n\nApp Demo\n--------\n\n\nYou can try out our model on Huggingface's spaces :\nStreamlit app of Multi-lingual Image Captioning model on Huggingface Spaces\n\n\nTeam Members\n------------\n\n\n* Bhavitvya Malik @bhavitvyamalik\n* Gunjan Chhablani @gchhablani\n\n\nCredits\n-------\n\n\nThanks to Huggingface & Google JAX/FLAX team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.\n\n\n<img src=URL"
] |
[
"TAGS\n#transformers #jax #tensorboard #clip-vision-mbart #text2text-generation #arxiv-2102.08981 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Multi-lingual image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.",
"#### Train set:\n\n\nTotal data: 10010625 captions, 2502656 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 2502656 \n\nSpanish: 2502656 \n\nDeutsch: 2502656 \n\nFrench: 2502656",
"#### Validation set\n\n\nTotal data: 110592 captions, 27648 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 27648 \n\nSpanish: 27648 \n\nDeutsch: 27648 \n\nFrench: 27648 \n\n\n\nTraining procedure \n--------------------",
"### Training\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments using Tensorboard which can be found in 'Training Metrics' tab. BLEU scores for languages other than English might be wrongly tracked but the model gives good performance in other languages too as evident from the evaluation scores.",
"#### Pretraining Results\n\n\nOur model reached eval loss of ~2.6 around ~60k steps. Here are the BLEU scores (out of 1) for different languages:\n\n\n\nModel used: ckpt-51999/\n\n\nIn order to reproduce the results, one can use the evaluation script available in this project's repository.\n\n\nApp Demo\n--------\n\n\nYou can try out our model on Huggingface's spaces :\nStreamlit app of Multi-lingual Image Captioning model on Huggingface Spaces\n\n\nTeam Members\n------------\n\n\n* Bhavitvya Malik @bhavitvyamalik\n* Gunjan Chhablani @gchhablani\n\n\nCredits\n-------\n\n\nThanks to Huggingface & Google JAX/FLAX team for such a wonderful community week. Big thanks to @patrickvonplaten and @patil-suraj for helping us with our solution during the community week.\n\n\n<img src=URL"
] |
[
58,
225,
45,
44,
180,
194
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #clip-vision-mbart #text2text-generation #arxiv-2102.08981 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou will need to clone the model from here. An example of usage is shown below:\n\n\nTraining data ️\n----------------\n\n\nThe Multi-lingual image captioning model was trained on a subset of Conceptual 12M dataset by Google:\n \n\n \n\nConceptual 12M, Introduced by Changpinyo et al. in Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts.\n\n\nThe translated dataset can be downloaded from conceptual-12m-multilingual-marian. We do not provide images as we do not own any of them. One can download images from the 'image\\_url' section of the original Conceptual 12M dataset.\n\n\nData Cleaning\n-------------\n\n\nThough the original dataset contains 12M image-text pairs, a lot of the URLs are invalid now, and in some cases, images are corrupt or broken. We remove such examples from our data, which leaves us with approximately 10M image-text pairs.#### Train set:\n\n\nTotal data: 10010625 captions, 2502656 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 2502656 \n\nSpanish: 2502656 \n\nDeutsch: 2502656 \n\nFrench: 2502656#### Validation set\n\n\nTotal data: 110592 captions, 27648 images \n\n\n\nLanguage-wise captions distribution: \n\nEnglish: 27648 \n\nSpanish: 27648 \n\nDeutsch: 27648 \n\nFrench: 27648 \n\n\n\nTraining procedure \n--------------------"
] |
[
-0.10636124759912491,
0.24041569232940674,
0.0008337743347510695,
0.060662057250738144,
0.08579476177692413,
0.010858100838959217,
-0.011597934179008007,
0.12098675221204758,
-0.11851979047060013,
0.10764048248529434,
0.024308066815137863,
-0.019221916794776917,
0.05122186616063118,
0.16968998312950134,
0.00814968254417181,
-0.2309093326330185,
0.0727696418762207,
-0.030044855549931526,
0.00028964364901185036,
0.062208570539951324,
0.08817959576845169,
-0.09661398828029633,
0.09823010861873627,
-0.07111933827400208,
-0.11929287761449814,
0.018006103113293648,
-0.03171190246939659,
-0.06311985105276108,
0.11838690936565399,
0.07587692886590958,
0.05828379467129707,
0.04425850138068199,
0.019836537539958954,
-0.1160198375582695,
0.0006269309669733047,
0.045378394424915314,
-0.009685751050710678,
0.03604608774185181,
0.09155703336000443,
-0.036776527762413025,
0.12200116366147995,
-0.11294412612915039,
0.016485244035720825,
0.03870777040719986,
-0.11265622079372406,
-0.06694228947162628,
-0.1346973478794098,
0.02775825560092926,
0.1381545513868332,
0.105918750166893,
-0.03114485926926136,
0.027280176058411598,
-0.061750128865242004,
0.07208792865276337,
0.15146775543689728,
-0.1432289332151413,
-0.04099838435649872,
0.0992451086640358,
0.014657405205070972,
0.040709830820560455,
-0.07384297996759415,
0.06805974990129471,
0.07977832108736038,
0.010198055766522884,
-0.05781228840351105,
-0.04006221517920494,
-0.19345177710056305,
-0.01895230822265148,
-0.06461382657289505,
-0.04172215238213539,
0.21304106712341309,
0.03452407941222191,
-0.07208084315061569,
-0.11669184267520905,
-0.03129298612475395,
-0.05533309653401375,
-0.0346391536295414,
0.043818067759275436,
0.016577575355768204,
-0.006408494431525469,
-0.09623108804225922,
-0.04902665317058563,
-0.08899781107902527,
-0.02328674867749214,
-0.09346826374530792,
-0.08437233418226242,
0.004071870353072882,
0.028673309832811356,
0.0007038956391625106,
0.06945240497589111,
-0.14449265599250793,
-0.055623095482587814,
-0.009781227447092533,
-0.03139133006334305,
-0.04413606598973274,
-0.020090408623218536,
-0.07172694057226181,
-0.1344362050294876,
0.01123298890888691,
0.05878261476755142,
-0.09649262577295303,
0.03877682983875275,
-0.10598866641521454,
0.02095513977110386,
0.06887315958738327,
0.10802364349365234,
-0.08531136065721512,
-0.0364929735660553,
0.005452828016132116,
-0.03796238452196121,
0.06275533884763718,
0.004061599262058735,
-0.06995908915996552,
-0.010960073210299015,
0.04510157182812691,
0.07334627956151962,
0.022634034976363182,
0.0047838413156569,
-0.06020662561058998,
-0.030367126688361168,
0.14769142866134644,
-0.12406840175390244,
0.08055287599563599,
-0.017342787235975266,
-0.047874972224235535,
0.06814032047986984,
0.029875801876187325,
-0.014579283073544502,
-0.07482191920280457,
-0.0024296606425195932,
-0.06743253767490387,
-0.025160759687423706,
-0.092178575694561,
-0.14835524559020996,
0.06831401586532593,
-0.10543256998062134,
-0.05717955157160759,
-0.09055397659540176,
-0.17826831340789795,
-0.04727089777588844,
0.02978130616247654,
-0.09951256960630417,
0.005978696513921022,
-0.040382541716098785,
-0.0679134875535965,
0.04109014570713043,
0.038496457040309906,
0.057820066809654236,
-0.02583669126033783,
0.04078439995646477,
-0.05442638695240021,
0.09853613376617432,
-0.04612904414534569,
0.037477266043424606,
-0.1186520978808403,
0.050519850105047226,
-0.21548675000667572,
0.11368103325366974,
-0.07183880358934402,
0.015081474557518959,
-0.1336183398962021,
-0.04563717544078827,
-0.08047496527433395,
0.04401226341724396,
0.058897715061903,
0.16976512968540192,
-0.26535019278526306,
-0.03608808293938637,
0.24122682213783264,
-0.15595678985118866,
-0.03552782163023949,
0.11752396076917648,
-0.005683457478880882,
0.06104382127523422,
0.08414305001497269,
0.10229607671499252,
-0.06286401301622391,
-0.07542358338832855,
-0.10941465198993683,
-0.06830106675624847,
0.038607705384492874,
0.03234775364398956,
0.038444746285676956,
-0.10300251096487045,
0.09997838735580444,
0.029391156509518623,
0.009574487805366516,
0.03289353474974632,
-0.033275097608566284,
-0.07225104421377182,
-0.012745632790029049,
-0.052028585225343704,
-0.012514086440205574,
0.053996745496988297,
0.029400845989584923,
-0.053362224251031876,
-0.05557447671890259,
-0.015621333383023739,
0.07911068201065063,
-0.07439349591732025,
0.0358315035700798,
-0.05370563641190529,
0.04193601757287979,
-0.09544386714696884,
0.0021594963036477566,
-0.15284781157970428,
-0.14603127539157867,
0.019262364134192467,
0.018038831651210785,
-0.0492141917347908,
-0.08629916608333588,
0.03463014215230942,
0.020793600007891655,
-0.0762253999710083,
-0.02636312134563923,
-0.02327456884086132,
-0.019893838092684746,
-0.03786034882068634,
-0.13753701746463776,
-0.004905600566416979,
-0.05223908647894859,
0.13558612763881683,
-0.17699526250362396,
-0.01830950379371643,
0.10789204388856888,
0.16368967294692993,
0.03828712925314903,
-0.06656711548566818,
0.014516165480017662,
0.07113759964704514,
-0.06781264394521713,
-0.1132919043302536,
0.015475356951355934,
-0.00963868573307991,
0.0060660457238554955,
0.044262297451496124,
-0.07351189851760864,
-0.11414440721273422,
0.14538805186748505,
0.00005488097667694092,
-0.1251043826341629,
0.03895067423582077,
-0.03567289188504219,
-0.028893496841192245,
-0.1217065379023552,
-0.053949274122714996,
-0.048484042286872864,
0.038418788462877274,
0.07424427568912506,
-0.06555058807134628,
0.025205880403518677,
0.03552123159170151,
0.008572027087211609,
-0.1475502848625183,
0.07314632087945938,
0.12695619463920593,
-0.13861225545406342,
0.09536959230899811,
0.02241193689405918,
-0.06584431976079941,
0.16611947119235992,
-0.02100914530456066,
-0.1280423253774643,
-0.001294572139158845,
0.05400340259075165,
0.03201579302549362,
0.19197066128253937,
-0.003975903615355492,
-0.03576413169503212,
0.045653652399778366,
-0.03324383124709129,
0.0689731240272522,
-0.08962050825357437,
0.02816970832645893,
0.017406105995178223,
-0.027595914900302887,
0.013228797353804111,
0.009538020007312298,
-0.02544286474585533,
0.10492600500583649,
-0.014167473651468754,
-0.011557653546333313,
0.016300136223435402,
-0.011269607581198215,
-0.09209371358156204,
0.17192231118679047,
-0.10976044833660126,
-0.2683355510234833,
-0.0851878896355629,
0.06421453505754471,
-0.09199346601963043,
0.00270442059263587,
0.008669182658195496,
-0.11581956595182419,
-0.1302543729543686,
-0.11464585363864899,
-0.012704108841717243,
-0.03456682711839676,
-0.07651662081480026,
-0.03948316350579262,
0.008926520124077797,
-0.015014476142823696,
-0.11466909945011139,
0.010427349247038364,
0.05797800049185753,
-0.05776480585336685,
0.05643586441874504,
-0.0840209499001503,
0.18158406019210815,
0.11948049813508987,
-0.006984866689890623,
0.030582182109355927,
-0.003675559302791953,
0.14017480611801147,
-0.09438156336545944,
0.08782172948122025,
0.05392151698470116,
0.06160726770758629,
0.036911871284246445,
0.13373494148254395,
-0.010645011439919472,
-0.07935922592878342,
0.02115863747894764,
0.11039150506258011,
-0.057217132300138474,
-0.16799192130565643,
-0.11803113669157028,
-0.04209019988775253,
0.0051798769272863865,
0.07235170155763626,
0.05956877022981644,
0.06788671761751175,
0.042157549411058426,
-0.06966181099414825,
0.007221199572086334,
0.004682246595621109,
0.08053679019212723,
0.045967649668455124,
-0.026469407603144646,
0.07003439962863922,
-0.08844251930713654,
-0.009574859403073788,
0.1164531409740448,
-0.003060995601117611,
0.2336834818124771,
-0.025431202724575996,
0.1972099393606186,
0.019599387422204018,
0.08736859261989594,
-0.003562557278200984,
0.08588964492082596,
-0.026586972177028656,
-0.0107677998021245,
-0.010544201359152794,
-0.06468512117862701,
0.02680879272520542,
0.06484189629554749,
0.01140793226659298,
-0.06009019538760185,
-0.08258573710918427,
0.004727949853986502,
0.04169218987226486,
0.16518276929855347,
0.057589516043663025,
-0.2553133964538574,
-0.05411095172166824,
0.04987538605928421,
0.062023453414440155,
-0.11732760071754456,
-0.011042661964893341,
0.14313927292823792,
-0.09785540401935577,
0.062014058232307434,
-0.0034884773194789886,
0.10893566161394119,
-0.07685813307762146,
-0.039318256080150604,
-0.014731625095009804,
-0.0620645247399807,
-0.027458569034934044,
0.08548643440008163,
-0.18040980398654938,
0.1369219422340393,
0.03236835449934006,
0.05160629376769066,
-0.06868185102939606,
0.0001817595330066979,
-0.016986124217510223,
0.03543706238269806,
0.15350127220153809,
0.024862807244062424,
-0.19580379128456116,
-0.0636771097779274,
-0.06284599006175995,
-0.01257659774273634,
0.041114263236522675,
0.02382407709956169,
0.08127123862504959,
0.018707508221268654,
-0.02617335319519043,
0.007063291966915131,
0.08318056911230087,
-0.17623168230056763,
-0.19334623217582703,
0.00924764946103096,
-0.012387542054057121,
0.05001002550125122,
-0.02700493112206459,
-0.055775441229343414,
-0.04968271404504776,
0.2593529522418976,
0.008146279491484165,
-0.06695244461297989,
-0.16232597827911377,
0.07151448726654053,
0.16007696092128754,
-0.033658336848020554,
0.0759621411561966,
0.02770434319972992,
0.13839469850063324,
-0.03582974523305893,
-0.08512883633375168,
0.11648508161306381,
-0.10838760435581207,
-0.0689644068479538,
-0.04787374660372734,
0.06270170956850052,
0.015314134769141674,
0.03449471294879913,
0.010933647863566875,
0.03231058642268181,
0.04751237854361534,
-0.07737044990062714,
0.04339851811528206,
0.12315405160188675,
0.07722751796245575,
0.05893060564994812,
-0.14842894673347473,
-0.06778975576162338,
-0.023152153939008713,
-0.08112456649541855,
0.16840164363384247,
0.15743671357631683,
-0.07060038298368454,
0.09148410707712173,
0.04432934522628784,
-0.06422995775938034,
-0.21462565660476685,
0.05659276992082596,
-0.007334174122661352,
0.059875763952732086,
-0.024165187031030655,
-0.19936701655387878,
0.01705436035990715,
0.09647837281227112,
-0.010842343792319298,
0.010947623290121555,
-0.335891991853714,
-0.10859254747629166,
0.07658489048480988,
0.04039373993873596,
0.11755330860614777,
-0.1383959800004959,
-0.033956531435251236,
-0.09351598471403122,
-0.11149836331605911,
0.14737296104431152,
-0.040543027222156525,
0.0330212377011776,
0.03841289132833481,
0.03543837368488312,
0.04271552339196205,
-0.046222198754549026,
0.15111687779426575,
0.031366534531116486,
0.07338591665029526,
-0.04758766293525696,
-0.04341191425919533,
0.1941787302494049,
-0.018257517367601395,
0.13912822306156158,
0.09925659745931625,
0.045961517840623856,
-0.1030919998884201,
-0.014887060038745403,
-0.09098123759031296,
0.05425216630101204,
-0.07140398770570755,
-0.01403326541185379,
-0.03896163031458855,
0.08619881421327591,
0.08839021623134613,
0.006773773115128279,
0.0403621643781662,
0.018793419003486633,
0.00035942616523243487,
0.12453392893075943,
0.04875742271542549,
0.05476287007331848,
-0.03683815523982048,
0.030201392248272896,
-0.04121613875031471,
0.07113534212112427,
-0.07585235685110092,
0.06376983225345612,
0.08708513528108597,
0.046424590051174164,
0.13470804691314697,
0.021515369415283203,
-0.09198929369449615,
0.013044352643191814,
0.09453091770410538,
-0.09329129010438919,
-0.10576188564300537,
-0.029300060123205185,
0.006767651531845331,
-0.029848692938685417,
-0.02162237837910652,
0.090767502784729,
-0.020159143954515457,
-0.010661722160875797,
-0.02074890397489071,
0.09391652792692184,
0.010138840414583683,
0.14155180752277374,
0.018637949600815773,
-0.016808630898594856,
-0.10452248156070709,
0.1125321313738823,
0.10480299592018127,
-0.27857857942581177,
0.07262696325778961,
0.04069189727306366,
-0.08234740048646927,
-0.03953225165605545,
0.09067758917808533,
0.09553643316030502,
-0.1916435956954956,
-0.07224888354539871,
-0.052461642771959305,
-0.06403619796037674,
0.004880309104919434,
0.029760751873254776,
0.006429470144212246,
0.07259947061538696,
0.052317988127470016,
-0.06586417555809021,
-0.10669592767953873,
0.09286549687385559,
0.09939488768577576,
0.040930502116680145,
-0.08454813063144684,
0.08456429094076157,
-0.012833856977522373,
-0.0298533346503973,
-0.01802215538918972,
-0.001035313936881721,
-0.09221292287111282,
-0.0343475304543972,
-0.13102464377880096,
0.01154373399913311,
-0.0718502625823021,
-0.04742240905761719,
-0.03688977286219597,
0.007779599633067846,
-0.010825465433299541,
-0.026651229709386826,
-0.06846949458122253,
-0.05853809043765068,
-0.03618898242712021,
0.048778608441352844,
-0.1279529631137848,
-0.005663662683218718,
0.005972082726657391,
-0.053682103753089905,
0.059811223298311234,
0.07319184392690659,
0.04445362091064453,
-0.0016232075868174434,
0.02508533000946045,
-0.03244064375758171,
0.030781108886003494,
-0.001133341109380126,
0.01976814866065979,
-0.08722209185361862,
-0.024418124929070473,
0.053122926503419876,
-0.05648161470890045,
0.011371073313057423,
-0.009956670925021172,
-0.11509037017822266,
0.01202709786593914,
-0.0688612088561058,
-0.0699734315276146,
-0.04534946754574776,
0.15138640999794006,
0.13706046342849731,
0.014428422786295414,
0.08265326172113419,
-0.08345669507980347,
-0.011093581095337868,
-0.09633681178092957,
-0.006967692170292139,
-0.03828173875808716,
0.022024240344762802,
-0.038620397448539734,
0.058648884296417236,
0.0584820881485939,
-0.05249783396720886,
0.13772563636302948,
0.0074775791727006435,
0.026570545509457588,
0.013775733299553394,
0.015978341922163963,
-0.030081264674663544,
0.05473314970731735,
0.21253423392772675,
0.010268310084939003,
-0.00738935824483633,
0.0017168114427477121,
0.05225721374154091,
0.046726733446121216,
0.15055330097675323,
0.1412390172481537,
0.13624869287014008,
0.10943448543548584,
0.1082853451371193,
-0.011710552498698235,
-0.1451820433139801,
-0.01992691680788994,
0.16464725136756897,
-0.006661756429821253,
0.10588563978672028,
-0.0680142343044281,
0.015427732840180397,
0.15603478252887726,
-0.18596816062927246,
0.0936521366238594,
0.02251424267888069,
-0.0594419427216053,
-0.04865329712629318,
-0.20099040865898132,
-0.08246077597141266,
-0.06774844974279404,
-0.0012874299427494407,
-0.13151589035987854,
0.044529255479574203,
0.06923369318246841,
0.03657438978552818,
-0.02427530474960804,
0.15949703752994537,
-0.10849688947200775,
-0.12168404459953308,
0.07642487436532974,
0.022228527814149857,
0.01567290723323822,
0.00504531292244792,
0.05074943229556084,
0.044242341071367264,
0.06870639324188232,
0.04297833889722824,
0.08945406973361969,
0.09630994498729706,
0.056952837854623795,
-0.001302700606174767,
-0.04621507599949837,
-0.04682543873786926,
0.035360388457775116,
0.04581466689705849,
0.0810740739107132,
0.06603839993476868,
-0.022260792553424835,
-0.000558471423573792,
0.18020153045654297,
-0.06590058654546738,
-0.015593044459819794,
-0.12642902135849,
0.06752173602581024,
0.031877387315034866,
0.009722886607050896,
0.0008600617293268442,
-0.1617230325937271,
0.036978837102651596,
0.12226427346467972,
0.14716404676437378,
-0.013839945197105408,
-0.05005435645580292,
-0.022926239296793938,
-0.006133877672255039,
0.004837559070438147,
0.013559049926698208,
0.0006835529347881675,
0.2660164535045624,
-0.0796692743897438,
0.00250691594555974,
-0.04175247624516487,
-0.0412558875977993,
-0.11382383853197098,
0.15912611782550812,
0.027141844853758812,
-0.04575684666633606,
-0.05887747183442116,
0.0922580361366272,
-0.03749774396419525,
-0.07119260728359222,
0.02262192592024803,
-0.16783292591571808,
-0.11860811710357666,
9.149154607257515e-7,
0.014561000280082226,
0.034379903227090836,
0.06209573522210121,
-0.034061871469020844,
0.02951529249548912,
0.15282873809337616,
0.010321732610464096,
-0.11264002323150635,
-0.09113336354494095,
0.02381102368235588,
-0.03932524845004082,
0.12169942259788513,
0.012090827338397503,
0.04836026579141617,
0.10766057670116425,
-0.008959400467574596,
-0.09514553099870682,
-0.026394043117761612,
0.03848278149962425,
0.00393427861854434,
0.04597669094800949,
0.14902761578559875,
-0.0417708195745945,
0.10667283833026886,
0.0730668306350708,
-0.058270085602998734,
0.036672960966825485,
0.0761079415678978,
-0.009898620657622814,
-0.1008010134100914,
0.07111503928899765,
-0.10636680573225021,
0.1492205262184143,
0.17382998764514923,
-0.03638331964612007,
0.028791027143597603,
-0.05526787042617798,
0.05704766511917114,
0.024459341540932655,
0.08621568977832794,
0.0147524643689394,
-0.175906702876091,
0.03591781109571457,
-0.07639826089143753,
0.04281579703092575,
-0.18832515180110931,
-0.03694495931267738,
0.03594785928726196,
-0.030648209154605865,
-0.07964298129081726,
0.08877182006835938,
0.10505177080631256,
-0.0036369306035339832,
-0.03618199750781059,
0.037873394787311554,
0.027683816850185394,
0.07610773295164108,
-0.026552285999059677,
-0.05160633474588394
] |
null | null |
transformers
|
# Tokenizer
We trained our tokenizer using [sentencepiece](https://github.com/google/sentencepiece)'s unigram tokenizer. Then loaded the tokenizer as MT5TokenizerFast.
## Model
We used [MT5-base](https://huggingface.co/google/mt5-base) model.
## Datasets
We used [Code Search Net](https://huggingface.co/datasets/code_search_net)'s dataset and some scrapped data from internet to train the model. We maintained a list of datasets where each dataset had codes of same language.
## Plots
### Train loss

### Evaluation loss

### Evaluation accuracy

### Learning rate

## Fine tuning (WIP)
We fine tuned the model with [CodeXGLUE code-to-code-trans dataset](https://huggingface.co/datasets/code_x_glue_cc_code_to_code_trans), and scrapper data.
|
{}
|
text2text-generation
|
flax-community/code-mt5-base
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"mt5",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Tokenizer
We trained our tokenizer using sentencepiece's unigram tokenizer. Then loaded the tokenizer as MT5TokenizerFast.
## Model
We used MT5-base model.
## Datasets
We used Code Search Net's dataset and some scrapped data from internet to train the model. We maintained a list of datasets where each dataset had codes of same language.
## Plots
### Train loss
!train loss
### Evaluation loss
!eval loss
### Evaluation accuracy
!eval accuracy
### Learning rate
!learning rate
## Fine tuning (WIP)
We fine tuned the model with CodeXGLUE code-to-code-trans dataset, and scrapper data.
|
[
"# Tokenizer\n\nWe trained our tokenizer using sentencepiece's unigram tokenizer. Then loaded the tokenizer as MT5TokenizerFast.",
"## Model\n\nWe used MT5-base model.",
"## Datasets\n\nWe used Code Search Net's dataset and some scrapped data from internet to train the model. We maintained a list of datasets where each dataset had codes of same language.",
"## Plots",
"### Train loss\n\n!train loss",
"### Evaluation loss\n\n!eval loss",
"### Evaluation accuracy\n\n!eval accuracy",
"### Learning rate\n\n!learning rate",
"## Fine tuning (WIP)\n\nWe fine tuned the model with CodeXGLUE code-to-code-trans dataset, and scrapper data."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Tokenizer\n\nWe trained our tokenizer using sentencepiece's unigram tokenizer. Then loaded the tokenizer as MT5TokenizerFast.",
"## Model\n\nWe used MT5-base model.",
"## Datasets\n\nWe used Code Search Net's dataset and some scrapped data from internet to train the model. We maintained a list of datasets where each dataset had codes of same language.",
"## Plots",
"### Train loss\n\n!train loss",
"### Evaluation loss\n\n!eval loss",
"### Evaluation accuracy\n\n!eval accuracy",
"### Learning rate\n\n!learning rate",
"## Fine tuning (WIP)\n\nWe fine tuned the model with CodeXGLUE code-to-code-trans dataset, and scrapper data."
] |
[
61,
38,
9,
45,
3,
8,
9,
13,
7,
34
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Tokenizer\n\nWe trained our tokenizer using sentencepiece's unigram tokenizer. Then loaded the tokenizer as MT5TokenizerFast.## Model\n\nWe used MT5-base model.## Datasets\n\nWe used Code Search Net's dataset and some scrapped data from internet to train the model. We maintained a list of datasets where each dataset had codes of same language.## Plots### Train loss\n\n!train loss### Evaluation loss\n\n!eval loss### Evaluation accuracy\n\n!eval accuracy### Learning rate\n\n!learning rate## Fine tuning (WIP)\n\nWe fine tuned the model with CodeXGLUE code-to-code-trans dataset, and scrapper data."
] |
[
-0.08443059027194977,
0.19534079730510712,
-0.005031400825828314,
0.07987762242555618,
0.14204607903957367,
-0.013494964689016342,
0.10435084253549576,
0.14674198627471924,
-0.15407156944274902,
-0.007771635428071022,
0.09007688611745834,
0.15952740609645844,
0.06073661893606186,
0.24470233917236328,
-0.059444360435009,
-0.21870632469654083,
0.041185639798641205,
0.010356497019529343,
0.015863317996263504,
0.14615057408809662,
0.11801892518997192,
-0.09057687222957611,
0.10095064342021942,
0.011009261943399906,
-0.14505282044410706,
-0.003689456032589078,
-0.024289272725582123,
-0.06597547978162766,
0.08098236471414566,
0.006820821203291416,
0.11347607523202896,
0.038497816771268845,
-0.004973567556589842,
-0.11171431839466095,
0.042462605983018875,
0.09583905339241028,
0.018115123733878136,
0.07531334459781647,
0.046077895909547806,
-0.06500650197267532,
0.07711740583181381,
-0.034796904772520065,
-0.009751269593834877,
0.020885949954390526,
-0.14083416759967804,
-0.15446536242961884,
-0.09841442853212357,
-0.02371395379304886,
0.0891227200627327,
0.0705329105257988,
-0.020083626732230186,
0.23067310452461243,
-0.03454308584332466,
0.12225981056690216,
0.08152653276920319,
-0.3444311320781708,
-0.0610303096473217,
0.09022175520658493,
0.10395291447639465,
0.0616544708609581,
-0.07186390459537506,
0.011793016456067562,
0.043589748442173004,
0.04058678448200226,
0.07139834761619568,
-0.013129680417478085,
-0.1355099380016327,
0.04733122140169144,
-0.15456782281398773,
-0.04585230350494385,
0.23392899334430695,
-0.08469508588314056,
-0.03214622288942337,
-0.0828346237540245,
-0.08973007649183273,
-0.12185973674058914,
0.003714227583259344,
-0.046420514583587646,
-0.04560287296772003,
0.0645226463675499,
-0.08968378603458405,
-0.03156289830803871,
-0.08696002513170242,
-0.04867684096097946,
-0.08345336467027664,
0.00891526136547327,
0.023960668593645096,
0.051816899329423904,
-0.14597941935062408,
0.13300561904907227,
0.027479225769639015,
-0.1089407354593277,
0.013012847863137722,
-0.11467213183641434,
-0.037110134959220886,
-0.04106545448303223,
-0.027523664757609367,
-0.18684139847755432,
0.10992615669965744,
0.07779762148857117,
0.07794655114412308,
0.04885539412498474,
-0.07545287162065506,
0.05389377474784851,
0.06975416839122772,
0.09048782289028168,
-0.039803192019462585,
-0.07408208400011063,
0.03792658448219299,
0.029632568359375,
-0.006916956510394812,
-0.033972010016441345,
-0.03565693274140358,
0.06666528433561325,
0.1149846538901329,
0.1212974339723587,
0.06068570166826248,
0.10277765244245529,
-0.06512387841939926,
-0.03196275979280472,
0.047075241804122925,
-0.14449431002140045,
0.001924540614709258,
0.022079093381762505,
-0.032948192209005356,
0.02414558455348015,
-0.017032425850629807,
-0.021734656766057014,
-0.08893753588199615,
-0.022777123376727104,
-0.05800533667206764,
-0.02575000934302807,
-0.13757723569869995,
-0.15249784290790558,
0.012316125445067883,
-0.11825864017009735,
-0.042443495243787766,
-0.06494633853435516,
-0.20400404930114746,
-0.018088696524500847,
0.058406051248311996,
-0.016361644491553307,
-0.0017887413268908858,
-0.09537529200315475,
-0.08155156672000885,
0.00006431186193367466,
-0.016619199886918068,
-0.05228499695658684,
-0.0497446283698082,
0.08100467175245285,
-0.05843686684966087,
0.08366816490888596,
-0.025979988276958466,
0.025479350239038467,
-0.14039571583271027,
0.03558885306119919,
-0.09778109192848206,
0.1383599191904068,
-0.02291090600192547,
0.014035847969353199,
-0.1036277487874031,
-0.022444356232881546,
-0.03919648751616478,
-0.003127909731119871,
0.06600166857242584,
0.12916089594364166,
-0.37937766313552856,
-0.023228703066706657,
0.22074785828590393,
-0.1612432599067688,
-0.11504531651735306,
0.15961533784866333,
-0.06246617063879967,
0.08928108960390091,
0.06648322194814682,
0.2115505337715149,
0.13343241810798645,
0.012406461872160435,
-0.02742714434862137,
-0.041693348437547684,
-0.14531919360160828,
-0.0013164289994165301,
0.036582596600055695,
0.02109997533261776,
0.01507928129285574,
0.0035688262432813644,
0.024491187185049057,
-0.023464245721697807,
-0.0958096981048584,
-0.1023099347949028,
-0.01001953985542059,
-0.02013869397342205,
0.07002487033605576,
0.003592174733057618,
0.012350695207715034,
-0.07423220574855804,
-0.023497063666582108,
-0.02932923659682274,
0.07177424430847168,
-0.12712202966213226,
0.02600059099495411,
-0.150688037276268,
0.07226692140102386,
-0.031418439000844955,
0.025701776146888733,
-0.1626954823732376,
-0.07272467762231827,
0.029426047578454018,
0.04219513013958931,
-0.021753700450062752,
0.07885954529047012,
0.05976729467511177,
0.019066154956817627,
0.023237744346261024,
0.0014414405450224876,
0.050318919122219086,
0.01237405464053154,
-0.09715699404478073,
-0.16823579370975494,
-0.03623681515455246,
-0.10776754468679428,
0.06842001527547836,
-0.1315472573041916,
0.042692672461271286,
0.09266066551208496,
0.0984324961900711,
0.029471151530742645,
-0.03329148516058922,
0.01616191864013672,
0.019583703950047493,
-0.047890812158584595,
-0.0636308416724205,
0.051423124969005585,
0.04187668859958649,
-0.03555896878242493,
0.08286980539560318,
-0.1523398905992508,
-0.006764908786863089,
0.1429980993270874,
-0.0357048474252224,
-0.01548777800053358,
-0.030884481966495514,
-0.04845666512846947,
-0.026613391935825348,
-0.03693350777029991,
-0.009286987595260143,
0.09838554263114929,
0.03154657036066055,
0.11445146799087524,
-0.07042653858661652,
-0.0173859391361475,
-0.014607255347073078,
-0.053730133920907974,
-0.0149020841345191,
0.06932718306779861,
-0.09939606487751007,
-0.17672200500965118,
0.13334055244922638,
0.09140469878911972,
0.022422125563025475,
0.1804630160331726,
-0.05318945646286011,
-0.013873429968953133,
-0.040475282818078995,
0.02198445051908493,
0.024762744084000587,
0.030299853533506393,
-0.07457105070352554,
-0.02256513386964798,
0.04668000340461731,
0.004277159925550222,
0.07113649696111679,
-0.11507944762706757,
0.02817014418542385,
-0.0029505486600100994,
-0.027880685403943062,
-0.005778371822088957,
0.026034004986286163,
0.018317583948373795,
0.05812472850084305,
-0.03248698264360428,
-0.07918258756399155,
0.07215990871191025,
-0.039159707725048065,
-0.06888353079557419,
0.16149474680423737,
-0.13534466922283173,
-0.2062402069568634,
-0.10612966865301132,
-0.02328876405954361,
-0.05587035417556763,
0.01238726545125246,
0.05905066803097725,
-0.029862843453884125,
-0.062402598559856415,
-0.07516879588365555,
-0.03374659642577171,
-0.02025553584098816,
-0.012558327056467533,
0.014902237802743912,
0.009555396623909473,
-0.01645829528570175,
-0.15602846443653107,
0.004320922773331404,
-0.0023281502071768045,
-0.04811419919133186,
0.127595916390419,
-0.061830632388591766,
0.07473165541887283,
0.1206841766834259,
-0.05678581818938255,
0.06450524181127548,
-0.01952129416167736,
0.23192322254180908,
-0.04319746419787407,
0.00651609105989337,
0.12688960134983063,
-0.10041951388120651,
0.07690855115652084,
0.07254036515951157,
0.01612524688243866,
-0.05520031973719597,
0.027303067967295647,
-0.004600635729730129,
-0.04810453951358795,
-0.23429271578788757,
-0.08197027444839478,
-0.041375916451215744,
0.06693512946367264,
-0.002576736733317375,
0.03729618713259697,
0.004713507369160652,
0.034898728132247925,
-0.012530721724033356,
-0.02879214845597744,
0.05684427544474602,
0.05281127244234085,
0.03612837195396423,
-0.01239404920488596,
0.11279863119125366,
-0.055408164858818054,
-0.08646832406520844,
0.05909309163689613,
0.026136867702007294,
0.2622450590133667,
0.03219641372561455,
0.11182607710361481,
0.10447826981544495,
0.0879659354686737,
0.11053498089313507,
0.12076325714588165,
-0.0417792908847332,
0.043995581567287445,
-0.03190728276968002,
-0.05805237963795662,
-0.05025374889373779,
-0.024908607825636864,
-0.028960343450307846,
-0.03968542441725731,
-0.02133391611278057,
0.05185893177986145,
0.14604246616363525,
0.24346373975276947,
0.06828697770833969,
-0.28810766339302063,
0.0034058671444654465,
0.01974010467529297,
0.020958522334694862,
-0.06656140834093094,
0.02041003480553627,
0.1077260822057724,
-0.09351477026939392,
-0.019390391185879707,
-0.1225244402885437,
0.10800351947546005,
-0.10667560994625092,
0.028423380106687546,
-0.0029755551367998123,
0.07498032599687576,
-0.027905700728297234,
0.09917668253183365,
-0.24907955527305603,
0.17365814745426178,
0.04215449467301369,
0.10645385086536407,
-0.0805848017334938,
0.004496187902987003,
0.052215930074453354,
0.043495070189237595,
0.07573012262582779,
0.0012961471220478415,
-0.10737157613039017,
-0.044492822140455246,
-0.14735928177833557,
0.023655883967876434,
0.0884440541267395,
0.016164032742381096,
0.10815591365098953,
-0.016028687357902527,
0.0008343605441041291,
-0.018230265006422997,
-0.004705915693193674,
-0.05013024061918259,
-0.11480727791786194,
0.03702549263834953,
0.024821113795042038,
0.09150078147649765,
-0.040820274502038956,
-0.047968700528144836,
-0.04856543987989426,
0.2275710552930832,
-0.03479211404919624,
-0.11734140664339066,
-0.10095621645450592,
0.012949494644999504,
0.08505026251077652,
-0.02310076169669628,
-0.005808919668197632,
0.024713966995477676,
0.12088758498430252,
-0.003620603121817112,
-0.1418217569589615,
0.05597279220819473,
-0.07752403616905212,
-0.11132018268108368,
-0.04025542736053467,
0.06617223471403122,
0.06031107157468796,
0.0011551246279850602,
0.04996303468942642,
0.011908584274351597,
-0.06804091483354568,
-0.09158063679933548,
-0.047092974185943604,
0.10060397535562515,
0.009428899735212326,
0.06926139444112778,
-0.033958800137043,
-0.11518801003694534,
-0.08427202701568604,
0.028262097388505936,
0.1778896003961563,
0.10667040199041367,
-0.0794314369559288,
0.07647896558046341,
0.1125529333949089,
-0.10450460016727448,
-0.29796648025512695,
-0.03990066051483154,
-0.038612283766269684,
-0.002638026839122176,
0.003489436348900199,
-0.1311216503381729,
0.02643515355885029,
0.08145584911108017,
-0.011684723198413849,
0.06872949749231339,
-0.31925418972969055,
-0.051366306841373444,
0.0863228365778923,
0.005014366004616022,
0.13381610810756683,
-0.11105784773826599,
-0.06597713381052017,
-0.0009186672978103161,
-0.07406552881002426,
0.17195400595664978,
-0.20883987843990326,
0.08117808401584625,
-0.006416249088943005,
-0.020993247628211975,
0.04028703272342682,
-0.07512016594409943,
0.08817154914140701,
0.007914367131888866,
0.09096527099609375,
-0.02023376151919365,
0.02606305293738842,
0.1752844750881195,
-0.030284296721220016,
0.10239898413419724,
-0.03566570580005646,
0.05760262534022331,
-0.11560411751270294,
0.002246791496872902,
-0.045607417821884155,
0.017740434035658836,
-0.0024791958276182413,
-0.027493489906191826,
-0.07062622904777527,
0.02690735086798668,
0.049187928438186646,
0.015227148309350014,
0.09645798802375793,
0.026141677051782608,
0.07785607129335403,
0.17702381312847137,
0.1263917237520218,
-0.03371606022119522,
0.003352608997374773,
-0.003542947582900524,
-0.017388543114066124,
0.07574480772018433,
-0.05856877565383911,
0.08318185806274414,
0.1339300125837326,
-0.03326026350259781,
0.10484468936920166,
0.11194062232971191,
-0.10659131407737732,
0.049330566078424454,
0.07324870675802231,
-0.1915927678346634,
-0.09443682432174683,
-0.04143937677145004,
-0.017978493124246597,
-0.06463519483804703,
0.10104518383741379,
0.139608696103096,
-0.025601817294955254,
-0.017344383522868156,
-0.002567066578194499,
0.05866282060742378,
-0.013906183652579784,
0.17988812923431396,
0.03488198295235634,
0.05043752118945122,
-0.13234630227088928,
0.11430815607309341,
0.05866885185241699,
-0.0783943235874176,
0.03819817304611206,
0.04197532683610916,
-0.11023896932601929,
-0.07320669293403625,
0.07903541624546051,
0.2152107208967209,
0.008477858267724514,
-0.08644372224807739,
-0.08435609936714172,
-0.06405887007713318,
-0.007566025946289301,
0.14678670465946198,
0.07603347301483154,
0.15632538497447968,
-0.02182132750749588,
-0.05463571846485138,
-0.13117560744285583,
0.09597649425268173,
0.12085815519094467,
0.030288007110357285,
-0.09021034836769104,
0.18134312331676483,
-0.05268378555774689,
0.06545262783765793,
-0.058518003672361374,
-0.06030036136507988,
-0.1177343875169754,
0.02244478277862072,
-0.012995798140764236,
0.013204497285187244,
-0.09056037664413452,
-0.043260980397462845,
0.01912078447639942,
0.04704175516963005,
-0.02571842446923256,
0.04160723090171814,
-0.06530970335006714,
-0.0005123052978888154,
-0.015401914715766907,
0.013965222053229809,
-0.12859021127223969,
-0.02235925942659378,
0.0009305202402174473,
-0.06725683808326721,
0.0739743709564209,
0.06855922937393188,
-0.05841061845421791,
0.012096397578716278,
-0.04514659568667412,
-0.05561657249927521,
0.04333017021417618,
0.021595973521471024,
0.04254414886236191,
-0.03638730198144913,
0.002093619666993618,
0.04687810316681862,
-0.025281870737671852,
0.04596804827451706,
0.16949544847011566,
-0.08116243779659271,
0.010709390975534916,
0.018696382641792297,
0.018567996099591255,
-0.04304985702037811,
0.013967168517410755,
0.12261401116847992,
0.10951469093561172,
0.2029418647289276,
-0.06362012028694153,
0.030916403979063034,
-0.13458897173404694,
0.012179221026599407,
0.021944278851151466,
-0.12529368698596954,
0.03740260377526283,
0.007433364633470774,
0.0653425008058548,
-0.08764268457889557,
0.12314275652170181,
0.08405761420726776,
-0.05849522724747658,
-0.015079878270626068,
-0.026743082329630852,
-0.06366702169179916,
0.0244049821048975,
0.10308502614498138,
0.00910432543605566,
-0.025977149605751038,
-0.025866657495498657,
0.005289101041853428,
-0.0020910457242280245,
0.0640159621834755,
0.11494389921426773,
0.05659603327512741,
0.03873150050640106,
0.11238358169794083,
-0.013587092980742455,
0.0027940168511122465,
-0.08528200536966324,
-0.05911759287118912,
-0.04454708471894264,
0.11163190752267838,
0.00015802869165781885,
0.011768867261707783,
0.1487976610660553,
-0.07947058230638504,
0.0694582536816597,
-0.0294052567332983,
-0.10668199509382248,
-0.12424186617136002,
-0.12939317524433136,
-0.08238060027360916,
-0.05038332939147949,
-0.021601859480142593,
-0.15386340022087097,
0.06245499476790428,
0.08708994835615158,
0.037422798573970795,
-0.06306082010269165,
0.08044818043708801,
0.0022125630639493465,
-0.032376937568187714,
0.052532486617565155,
-0.04182620346546173,
0.03418416157364845,
0.0004111334856133908,
0.008612613193690777,
-0.00456601195037365,
0.04465104267001152,
0.03615124896168709,
0.028838953003287315,
0.13025783002376556,
0.04978537932038307,
-0.11404748260974884,
-0.11926745623350143,
-0.030951594933867455,
-0.013724680058658123,
0.0037255408242344856,
0.027608340606093407,
0.07829877734184265,
-0.05914350226521492,
0.019801445305347443,
0.1419885754585266,
-0.035136301070451736,
-0.08752157539129257,
-0.16624511778354645,
0.22812803089618683,
0.028380082920193672,
-0.0017488504527136683,
0.014288924634456635,
-0.0747644230723381,
0.004400002304464579,
0.21557554602622986,
0.2255437970161438,
-0.03406525403261185,
0.00947039294987917,
-0.029488015919923782,
0.0031052895355969667,
0.007487536408007145,
0.11188389360904694,
0.04709536209702492,
0.10350335389375687,
-0.06903088837862015,
0.03026852011680603,
-0.10424342006444931,
-0.03773979842662811,
-0.09374181926250458,
0.0610492043197155,
0.022275254130363464,
-0.03219896927475929,
-0.020885663107037544,
0.11570118367671967,
-0.06984852254390717,
0.050298016518354416,
-0.04845767840743065,
-0.11172231286764145,
-0.11414166539907455,
-0.0028348970226943493,
0.09803199768066406,
-0.011875278316438198,
0.0499970018863678,
-0.0017581909196451306,
0.06467391550540924,
0.11241217702627182,
-0.004451544024050236,
-0.07626131176948547,
-0.037428487092256546,
0.06678551435470581,
-0.20594480633735657,
0.1299852430820465,
0.008281075395643711,
0.021523693576455116,
0.0927819088101387,
-0.014324144460260868,
-0.1231352686882019,
0.0741763561964035,
-0.037431709468364716,
0.028465205803513527,
0.020901000127196312,
0.08156523108482361,
0.017256246879696846,
0.017331372946500778,
0.03055640123784542,
-0.0019218150991946459,
0.002739823656156659,
0.0013064799131825566,
-0.0006791729829274118,
-0.02611679583787918,
0.007097761146724224,
-0.02731398679316044,
0.10058293491601944,
0.13048972189426422,
-0.03788222372531891,
0.09456552565097809,
-0.046389881521463394,
0.01571892574429512,
-0.00022471150441560894,
0.012468473985791206,
-0.04462043195962906,
-0.22642503678798676,
-0.054213959723711014,
0.03080148994922638,
0.017240753397345543,
-0.14908888936042786,
-0.010259578935801983,
-0.0977635383605957,
-0.03986950218677521,
-0.09400957822799683,
0.08291714638471603,
0.10183282941579819,
0.03949887678027153,
-0.020333291962742805,
0.12753169238567352,
-0.02644526958465576,
0.09130588918924332,
-0.11089538037776947,
-0.08928156644105911
] |
null | null |
transformers
|
## DALL·E mini - Generate images from text
<img style="text-align:center; display:block;" src="https://raw.githubusercontent.com/borisdayma/dalle-mini/main/img/logo.png" width="200">
* [Technical Report](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA)
* [Demo](https://huggingface.co/spaces/flax-community/dalle-mini)
### Model Description
This is an attempt to replicate OpenAI's [DALL·E](https://openai.com/blog/dall-e/), a model capable of generating arbitrary images from a text prompt that describes the desired result.

This model's architecture is a simplification of the original, and leverages previous open source efforts and available pre-trained models. Results have lower quality than OpenAI's, but the model can be trained and used on less demanding hardware. Our training was performed on a single TPU v3-8 for a few days.
### Components of the Architecture
The system relies on the Flax/JAX infrastructure, which are ideal for TPU training. TPUs are not required, both Flax and JAX run very efficiently on GPU backends.
The main components of the architecture include:
* An encoder, based on [BART](https://arxiv.org/abs/1910.13461). The encoder transforms a sequence of input text tokens to a sequence of image tokens. The input tokens are extracted from the text prompt by using the model's tokenizer. The image tokens are a fixed-length sequence, and they represent indices in a VQGAN-based pre-trained codebook.
* A decoder, which converts the image tokens to image pixels. As mentioned above, the decoder is based on a [VQGAN model](https://compvis.github.io/taming-transformers/).
The model definition we use for the encoder can be downloaded from our [Github repo](https://github.com/borisdayma/dalle-mini). The encoder is represented by the class `CustomFlaxBartForConditionalGeneration`.
To use the decoder, you need to follow the instructions in our accompanying VQGAN model in the hub, [flax-community/vqgan_f16_16384](https://huggingface.co/flax-community/vqgan_f16_16384).
### How to Use
The easiest way to get familiar with the code and the models is to follow the inference notebook we provide in our [github repo](https://github.com/borisdayma/dalle-mini/blob/main/dev/inference/inference_pipeline.ipynb). For your convenience, you can open it in Google Colaboratory: [](https://colab.research.google.com/github/borisdayma/dalle-mini/blob/main/dev/inference/inference_pipeline.ipynb)
If you just want to test the trained model and see what it comes up with, please visit [our demo](https://huggingface.co/spaces/flax-community/dalle-mini), available in 🤗 Spaces.
### Additional Details
Our [report](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA) contains more details about how the model was trained and shows many examples that demonstrate its capabilities.
|
{"language": ["en"], "pipeline_tag": "text-to-image", "inference": false}
|
text-to-image
|
flax-community/dalle-mini
|
[
"transformers",
"jax",
"bart",
"text2text-generation",
"text-to-image",
"en",
"arxiv:1910.13461",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1910.13461"
] |
[
"en"
] |
TAGS
#transformers #jax #bart #text2text-generation #text-to-image #en #arxiv-1910.13461 #autotrain_compatible #has_space #region-us
|
## DALL·E mini - Generate images from text
<img style="text-align:center; display:block;" src="URL width="200">
* Technical Report
* Demo
### Model Description
This is an attempt to replicate OpenAI's DALL·E, a model capable of generating arbitrary images from a text prompt that describes the desired result.
!DALL·E mini demo screenshot
This model's architecture is a simplification of the original, and leverages previous open source efforts and available pre-trained models. Results have lower quality than OpenAI's, but the model can be trained and used on less demanding hardware. Our training was performed on a single TPU v3-8 for a few days.
### Components of the Architecture
The system relies on the Flax/JAX infrastructure, which are ideal for TPU training. TPUs are not required, both Flax and JAX run very efficiently on GPU backends.
The main components of the architecture include:
* An encoder, based on BART. The encoder transforms a sequence of input text tokens to a sequence of image tokens. The input tokens are extracted from the text prompt by using the model's tokenizer. The image tokens are a fixed-length sequence, and they represent indices in a VQGAN-based pre-trained codebook.
* A decoder, which converts the image tokens to image pixels. As mentioned above, the decoder is based on a VQGAN model.
The model definition we use for the encoder can be downloaded from our Github repo. The encoder is represented by the class 'CustomFlaxBartForConditionalGeneration'.
To use the decoder, you need to follow the instructions in our accompanying VQGAN model in the hub, flax-community/vqgan_f16_16384.
### How to Use
The easiest way to get familiar with the code and the models is to follow the inference notebook we provide in our github repo. For your convenience, you can open it in Google Colaboratory: :
dataset = load_dataset('wiki40b', 'da', beam_runner='DirectRunner', split="train")
#dataset = load_dataset('wiki40b', 'sv', beam_runner='DirectRunner')
dataset = dataset.remove_columns(['wikidata_id', 'version_id'])
filtered_dataset = dataset.map(filter_wikipedia)
# filtered_dataset[:3]
# print(filtered_dataset[:3])
return filtered_dataset
def filter_wikipedia(batch):
batch["text"] = " ".join(batch["text"].split("\
_START_SECTION_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_PARAGRAPH_\
"))
batch["text"] = " ".join(batch["text"].split("_NEWLINE_"))
batch["text"] = " ".join(batch["text"].split("\xa0"))
return batch
```
## Training script
The following training script was used to train the model.
```bash
./run_clm_flax.py --output_dir="${MODEL_DIR}" --model_type="gpt2" --config_name="${MODEL_DIR}" --tokenizer_name="${MODEL_DIR}" --dataset_name="wiki40b" --dataset_config_name="da" --do_train --do_eval --block_size="512" --per_device_train_batch_size="64" --per_device_eval_batch_size="64" --learning_rate="5e-3" --warmup_steps="1000" --adam_beta1="0.9" --adam_beta2="0.98" --weight_decay="0.01" --overwrite_output_dir --num_train_epochs="20" --logging_steps="500" --save_steps="1000" --eval_steps="2500" --push_to_hub
```
|
{"language": "da", "widget": [{"text": "Jeg elsker livet"}]}
|
text-generation
|
flax-community/dansk-gpt-wiki
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"da",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"da"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #da #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# GPT2-svenska-wikipedia
A Danish GPT2 style model trained using Flax CLM pipeline on the Danish
part of the wiki40b dataset.
URL
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
## Training script
The following training script was used to train the model.
|
[
"# GPT2-svenska-wikipedia\nA Danish GPT2 style model trained using Flax CLM pipeline on the Danish\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #da #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# GPT2-svenska-wikipedia\nA Danish GPT2 style model trained using Flax CLM pipeline on the Danish\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
65,
36,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6,
40,
14
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #da #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# GPT2-svenska-wikipedia\nA Danish GPT2 style model trained using Flax CLM pipeline on the Danish\npart of the wiki40b dataset.\n\nURL## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.## Gpt models## Swedish Gpt\nURL## Swedish gpt wiki\nURL# Nordic gpt wiki\nURL## Dansk gpt wiki\nURL## Norsk gpt wiki\nURL## Roberta models## Nordic Roberta Wiki\nURL## Swe Roberta Wiki Oscar\nURL## Roberta Swedish Scandi\nURL## Roberta Swedish\nURL## Swedish T5 model\nURL## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.## Training script\nThe following training script was used to train the model."
] |
[
-0.10442894697189331,
0.1917644888162613,
-0.0004395497089717537,
0.06838972121477127,
0.06474887579679489,
0.007901407778263092,
0.05846614018082619,
0.1281903237104416,
-0.0011678418377414346,
0.08568926900625229,
0.10892844200134277,
0.010485264472663403,
0.09907342493534088,
0.19684983789920807,
0.0624983087182045,
-0.27216750383377075,
0.09265615046024323,
-0.04856860265135765,
-0.0171491838991642,
0.09944237023591995,
0.08602414280176163,
-0.07655508071184158,
0.06036799028515816,
-0.06723115593194962,
-0.03111443482339382,
-0.029951242730021477,
-0.0320124477148056,
-0.06728045642375946,
0.12424478679895401,
0.05704249441623688,
0.057087935507297516,
0.08915428817272186,
0.10356791317462921,
-0.10427065193653107,
0.025494083762168884,
0.047065168619155884,
-0.010190745815634727,
0.040228649973869324,
0.043649643659591675,
0.005209963768720627,
0.19517195224761963,
-0.015421287156641483,
0.06492217630147934,
0.008788553066551685,
-0.08649256825447083,
-0.25419002771377563,
-0.08516636490821838,
0.053026918321847916,
0.0741591677069664,
0.11736723780632019,
-0.056327685713768005,
0.09536328911781311,
-0.11941419541835785,
0.09173054993152618,
0.11088334769010544,
-0.2077416628599167,
-0.05269761383533478,
0.12286999821662903,
0.08226216584444046,
0.04841964319348335,
-0.06890638172626495,
0.07205337285995483,
-0.009062780998647213,
0.05661541596055031,
0.05277857929468155,
-0.01418642420321703,
-0.0329299233853817,
-0.002275902545079589,
-0.12461984157562256,
-0.013663622550666332,
0.12069496512413025,
0.006402742117643356,
-0.053008243441581726,
-0.1491561233997345,
-0.05709843710064888,
-0.05213348567485809,
0.0020196987316012383,
0.006439597811549902,
-0.004066852852702141,
-0.019839828833937645,
-0.04595367982983589,
-0.071523517370224,
-0.08656153082847595,
-0.0837455615401268,
0.02806946448981762,
0.09876451641321182,
0.04331888630986214,
0.004320234060287476,
-0.007746770046651363,
0.15504656732082367,
-0.05372128263115883,
-0.13846781849861145,
-0.07763773202896118,
-0.035569991916418076,
-0.059716738760471344,
-0.02573036402463913,
0.018414724618196487,
-0.1476997435092926,
0.022589435800909996,
0.24243445694446564,
0.04156731441617012,
0.024598000571131706,
0.0342404879629612,
0.0015439579728990793,
0.03912605717778206,
0.09766335040330887,
-0.14616720378398895,
-0.07663322240114212,
0.01820758357644081,
-0.016880270093679428,
0.03771764412522316,
-0.03880443423986435,
-0.004643955733627081,
-0.01361777912825346,
0.060962479561567307,
0.10222209244966507,
0.04170769825577736,
0.03747353330254555,
-0.03662123531103134,
0.0019256898667663336,
0.11125494539737701,
-0.1477615386247635,
-0.001100832363590598,
0.0015265680849552155,
-0.026983672752976418,
-0.0049362764693796635,
0.07354781776666641,
-0.002783397678285837,
-0.07075904309749603,
0.15237396955490112,
-0.050138603895902634,
-0.03311370685696602,
-0.021166151389479637,
-0.1260220855474472,
0.03770620375871658,
-0.09591178596019745,
-0.0017835614271461964,
-0.08256757259368896,
-0.21542687714099884,
-0.0736478790640831,
0.07221115380525589,
-0.0650501549243927,
-0.005099554546177387,
-0.02330847829580307,
-0.0397099032998085,
0.03650268539786339,
-0.03216552734375,
0.11250945180654526,
-0.070347361266613,
0.07217290997505188,
-0.14410115778446198,
0.09567779302597046,
-0.006350885611027479,
0.01511884480714798,
-0.15524142980575562,
0.021111881360411644,
-0.20415394008159637,
0.028724275529384613,
-0.1386163830757141,
0.0004232146020513028,
-0.12957918643951416,
-0.06964924186468124,
-0.025436706840991974,
0.028614135459065437,
0.041912518441677094,
0.17370416224002838,
-0.18577922880649567,
-0.015117579139769077,
0.2768486738204956,
-0.16888107359409332,
0.035769909620285034,
0.1094496101140976,
0.03335719555616379,
0.101406030356884,
0.08363547176122665,
0.159769669175148,
0.022485192865133286,
-0.11762772500514984,
-0.02906414307653904,
-0.005924279801547527,
-0.02180028147995472,
0.05496617406606674,
0.08184922486543655,
-0.08271701633930206,
0.06214413046836853,
0.028847292065620422,
-0.08927243202924728,
0.048168011009693146,
-0.030133355408906937,
-0.055224206298589706,
0.010380011051893234,
-0.05656540021300316,
-0.03979109227657318,
0.05536530539393425,
-0.002900312654674053,
-0.049533501267433167,
-0.1445804089307785,
-0.09597811847925186,
0.11808058619499207,
-0.11390998214483261,
0.03812626749277115,
-0.08294174075126648,
0.055691126734018326,
0.019041916355490685,
0.0019938054028898478,
-0.10752927511930466,
-0.15101851522922516,
-0.035698387771844864,
-0.05442503094673157,
-0.09293670207262039,
-0.024627041071653366,
0.06690871715545654,
0.11179246753454208,
-0.04416882246732712,
-0.06160171702504158,
-0.0316159762442112,
-0.0019868281669914722,
-0.0005232756375335157,
-0.1743594855070114,
-0.020927950739860535,
-0.06493383646011353,
0.15649357438087463,
-0.1551266610622406,
0.013356686569750309,
0.1133868545293808,
0.1598353087902069,
0.02221827208995819,
-0.09093328565359116,
0.056738074868917465,
0.00590905174612999,
-0.02007799968123436,
-0.13108474016189575,
0.022609028965234756,
-0.021217677742242813,
-0.03196704760193825,
0.05952807888388634,
-0.037439968436956406,
-0.05783243849873543,
0.1117769256234169,
0.17463251948356628,
-0.12819397449493408,
0.14270968735218048,
-0.047586843371391296,
-0.01923808455467224,
-0.08733513951301575,
-0.005108518060296774,
-0.007096718531101942,
0.049523741006851196,
0.07837183028459549,
-0.054008036851882935,
-0.004762960597872734,
0.01342392060905695,
-0.005082240328192711,
-0.035007063299417496,
0.13996680080890656,
0.12877784669399261,
-0.15459118783473969,
0.09427634626626968,
-0.022607730701565742,
-0.007161322981119156,
0.2613813579082489,
0.03374370187520981,
-0.07969194650650024,
0.012094736099243164,
0.016194360330700874,
0.024612925946712494,
0.17941556870937347,
0.01111508160829544,
0.032757434993982315,
0.04057715833187103,
-0.012614318169653416,
0.018643038347363472,
-0.060746971517801285,
-0.056657448410987854,
-0.0036195581778883934,
-0.07167027145624161,
0.03457867354154587,
0.08515913039445877,
-0.08015209436416626,
0.07100770622491837,
-0.01995674893260002,
-0.040274478495121,
0.009191444143652916,
0.009088651277124882,
-0.10378462076187134,
0.21462838351726532,
-0.055664077401161194,
-0.21887047588825226,
-0.11658867448568344,
0.04575997591018677,
0.028319600969552994,
-0.01122508104890585,
0.0911271721124649,
-0.07571329921483994,
-0.1538407951593399,
-0.12133006751537323,
0.09223376959562302,
0.032393697649240494,
-0.04943026602268219,
-0.11349521577358246,
-0.02840125933289528,
-0.02440822683274746,
-0.11458753794431686,
0.01697581261396408,
0.04040432348847389,
-0.04312547296285629,
0.08498303592205048,
-0.01434603612869978,
0.11931727826595306,
0.07322274893522263,
0.029137879610061646,
-0.0019131462322548032,
0.03603420779109001,
0.21126247942447662,
-0.12125106155872345,
0.12794898450374603,
0.0864373967051506,
-0.023813961073756218,
0.022230731323361397,
0.12946683168411255,
0.009787265211343765,
-0.06819162517786026,
-0.003897150279954076,
0.04028557986021042,
-0.0853402316570282,
-0.18019388616085052,
-0.10283792018890381,
0.016683224588632584,
0.045937612652778625,
0.09730260074138641,
0.1029784083366394,
-0.06418642401695251,
0.06061159074306488,
-0.08292365819215775,
-0.16528372466564178,
0.0705915167927742,
0.06634912639856339,
-0.059093035757541656,
-0.029170939698815346,
0.08997227251529694,
-0.05173597112298012,
0.0471927747130394,
0.09434913843870163,
-0.048383843153715134,
0.139265239238739,
-0.0372646301984787,
0.07121100276708603,
0.06251967698335648,
0.08051685243844986,
0.06512964516878128,
0.10147979110479355,
0.03911616653203964,
-0.026416659355163574,
0.02013547718524933,
-0.057830121368169785,
-0.0027381950058043003,
0.03495441749691963,
-0.06045623868703842,
-0.0902869775891304,
-0.016195926815271378,
-0.02712355926632881,
-0.011006949469447136,
0.19261932373046875,
0.0785258412361145,
-0.23113244771957397,
-0.09771818667650223,
0.03949354588985443,
-0.02578446827828884,
-0.09915609657764435,
-0.027628200128674507,
0.10141332447528839,
-0.17224092781543732,
0.0487365648150444,
-0.04765211045742035,
0.06048692390322685,
-0.03624820336699486,
-0.04583127796649933,
0.03442361578345299,
0.004743302706629038,
-0.048784710466861725,
0.09262380003929138,
-0.18065506219863892,
0.11935180425643921,
-0.0021023170556873083,
0.10931910574436188,
-0.05777353420853615,
0.015436340123414993,
-0.009445494040846825,
0.11234977096319199,
0.2650388181209564,
0.023378757759928703,
-0.11134767532348633,
-0.10794910788536072,
-0.14470966160297394,
0.04094201326370239,
-0.023590635508298874,
-0.08017010241746902,
0.06661759316921234,
0.01399284042418003,
-0.0075989956967532635,
-0.055554814636707306,
-0.0016108459094539285,
-0.1497485339641571,
-0.0881190299987793,
0.012782230973243713,
-0.059367477893829346,
0.12314727157354355,
-0.06753811985254288,
-0.08943050354719162,
-0.12582311034202576,
0.22936664521694183,
-0.09914164245128632,
-0.13120386004447937,
-0.16417266428470612,
0.07516112178564072,
0.12436223030090332,
-0.08014902472496033,
0.04069673642516136,
-0.004755989648401737,
0.07208109647035599,
-0.0723646953701973,
-0.04399311915040016,
0.07524705678224564,
-0.08463463187217712,
-0.16987651586532593,
-0.0037125074304640293,
0.0866568386554718,
0.12821345031261444,
0.050694357603788376,
0.012111467309296131,
0.07251134514808655,
-0.004758452996611595,
-0.12838183343410492,
0.04350357502698898,
0.1526186764240265,
0.028849760070443153,
-0.0315731056034565,
-0.09836072474718094,
-0.02813267707824707,
-0.011169902049005032,
-0.09103117138147354,
0.1499803513288498,
0.23528867959976196,
-0.08134229481220245,
0.1301604062318802,
0.1147809699177742,
-0.03232301026582718,
-0.3109701871871948,
-0.05487603321671486,
-0.03145872801542282,
0.09032902121543884,
0.030071493238210678,
-0.2319856882095337,
0.08019369840621948,
0.1292642205953598,
-0.029988588765263557,
0.07897980511188507,
-0.2481507509946823,
-0.12273109704256058,
0.12212523818016052,
0.09229366481304169,
0.031060634180903435,
-0.06682257354259491,
-0.012562855146825314,
0.01845264993607998,
-0.12150464951992035,
0.07243383675813675,
-0.1042991355061531,
0.0873798280954361,
0.004236552398651838,
-0.013544227927923203,
0.021779019385576248,
-0.0892423540353775,
0.1535140872001648,
0.0031915465369820595,
0.012287987396121025,
-0.09472259879112244,
0.11871015280485153,
0.08461513370275497,
-0.023376349359750748,
0.17917400598526,
-0.030390040948987007,
0.01911988854408264,
-0.0966237336397171,
-0.07391339540481567,
-0.10263770818710327,
0.12396547943353653,
-0.07714705914258957,
-0.07507020235061646,
-0.05595705658197403,
0.1317191869020462,
0.05750114843249321,
-0.013385792262852192,
0.08806896209716797,
-0.10141994804143906,
0.039572373032569885,
0.03627084568142891,
0.06531474739313126,
0.017144877463579178,
-0.035491008311510086,
0.0003015191468875855,
-0.05281142517924309,
0.07009223103523254,
-0.07741914689540863,
0.0280869472771883,
0.083922378718853,
0.03356841951608658,
0.05907505005598068,
-0.01287965476512909,
-0.14333993196487427,
-0.019804026931524277,
0.06431560218334198,
-0.24224233627319336,
-0.07402479648590088,
-0.011733977124094963,
-0.10963243991136551,
0.009318260475993156,
-0.06167633831501007,
0.12847799062728882,
-0.08092684298753738,
0.001273197354748845,
-0.004954502452164888,
0.054213590919971466,
0.00047437241300940514,
0.19426698982715607,
0.04178156703710556,
0.05120858550071716,
-0.11665104329586029,
0.07354001700878143,
0.043804194778203964,
-0.10445984452962875,
0.05438494309782982,
0.14549770951271057,
-0.1579185575246811,
-0.09142013639211655,
-0.014059034176170826,
0.10324274748563766,
-0.056572265923023224,
-0.06773556768894196,
-0.10099457949399948,
-0.04086127132177353,
0.02564224787056446,
0.03090059570968151,
0.039786506444215775,
0.06480345875024796,
0.04301874339580536,
-0.05868589133024216,
-0.091306172311306,
0.07309716194868088,
0.057999830693006516,
0.006916113663464785,
-0.05082086846232414,
0.1409994512796402,
0.007754852529615164,
0.0030036591924726963,
-0.04713134095072746,
0.030928028747439384,
-0.0551762655377388,
-0.024965612217783928,
-0.01761176995933056,
-0.038638658821582794,
-0.07356736063957214,
-0.02154308557510376,
-0.05822142958641052,
-0.03975890576839447,
0.026550646871328354,
0.01365229394286871,
-0.06131862476468086,
-0.04937337338924408,
-0.07841071486473083,
-0.02020721323788166,
-0.11281610280275345,
-0.011955590918660164,
-0.0032835700549185276,
-0.05955974757671356,
0.08832723647356033,
0.0024548079818487167,
0.03261295706033707,
0.0676470696926117,
-0.02750006876885891,
-0.015439784154295921,
-0.015271415933966637,
-0.04094456508755684,
0.011141671799123287,
-0.03990544378757477,
-0.05488545820116997,
-0.016086852177977562,
-0.051026467233896255,
0.039271943271160126,
0.02186604216694832,
-0.11158379167318344,
0.06102575361728668,
0.022518035024404526,
-0.026079528033733368,
-0.0353250727057457,
0.0965714380145073,
0.05455698445439339,
0.04872472211718559,
0.1322832703590393,
-0.0892353281378746,
0.06344886124134064,
-0.13345041871070862,
0.011524343863129616,
-0.0032917612697929144,
-0.0054399059154093266,
-0.00016596901696175337,
0.0521550290286541,
0.049828190356492996,
-0.046202804893255234,
0.0752551481127739,
0.0650484636425972,
-0.04035073518753052,
0.0613517202436924,
-0.017934827134013176,
0.025259600952267647,
0.025759709998965263,
0.16695928573608398,
0.025766246020793915,
0.010201459750533104,
0.01159648410975933,
0.005777313373982906,
-0.022483987733721733,
0.015046432614326477,
0.18459585309028625,
0.12029698491096497,
0.12288021296262741,
0.06926961243152618,
-0.08629557490348816,
-0.1278420388698578,
-0.09179136902093887,
0.06984985619783401,
-0.017891349270939827,
0.07583830505609512,
-0.000019979346689069644,
0.04809357225894928,
0.16169200837612152,
-0.1639455407857895,
0.011247253976762295,
0.06508822739124298,
-0.07695282250642776,
-0.1557794064283371,
-0.2637002170085907,
-0.10936007648706436,
0.0346008762717247,
0.050973352044820786,
-0.08909965306520462,
0.02538185752928257,
0.0677010715007782,
0.07445359230041504,
-0.009897601790726185,
0.15321911871433258,
-0.013653484173119068,
-0.09409651905298233,
0.04020807519555092,
0.04119391366839409,
0.0009370249463245273,
-0.002514091320335865,
0.0009296134812757373,
-0.002980021992698312,
0.0734277069568634,
0.0005003988044336438,
0.02262115105986595,
0.01035626232624054,
0.04361575096845627,
0.003972812555730343,
-0.05884844809770584,
-0.028169384226202965,
0.08458422869443893,
0.05304623022675514,
0.022306017577648163,
0.06032465770840645,
-0.013037110678851604,
0.002695416333153844,
0.19025249779224396,
0.006187037099152803,
-0.07540639489889145,
-0.10108859091997147,
0.13549235463142395,
0.001540078315883875,
0.028819525614380836,
0.014818303287029266,
-0.11728914082050323,
0.0447869747877121,
0.14556178450584412,
0.17155680060386658,
0.021370695903897285,
0.022040309384465218,
-0.022434895858168602,
-0.008317066356539726,
0.029440244659781456,
0.04326783865690231,
0.033254921436309814,
0.1950271725654602,
-0.11068494617938995,
0.034280452877283096,
-0.07505794614553452,
-0.0766461193561554,
-0.14917203783988953,
0.06626231968402863,
0.007696954999119043,
-0.012331647798418999,
-0.10777319967746735,
0.10813845694065094,
-0.058033429086208344,
-0.12399951368570328,
0.051740147173404694,
-0.07642901688814163,
-0.15654879808425903,
-0.037028804421424866,
0.007855943404138088,
0.024080265313386917,
0.030779492110013962,
-0.014354490675032139,
0.035425834357738495,
0.09081336110830307,
0.03337118774652481,
-0.0818508043885231,
-0.07215186208486557,
0.06646218150854111,
0.0037262679543346167,
0.19863559305667877,
0.0033725055400282145,
0.0198218934237957,
0.10643421858549118,
-0.048412952572107315,
-0.13383343815803528,
0.08169130980968475,
0.019712980836629868,
-0.04897308722138405,
0.03344950079917908,
0.14807935059070587,
-0.03855908289551735,
0.0904049500823021,
0.01329517550766468,
-0.08637357503175735,
0.010816031135618687,
0.032730549573898315,
-0.07680831104516983,
-0.0500972680747509,
0.10798493772745132,
-0.0736960768699646,
0.12727446854114532,
0.16865099966526031,
-0.045544303953647614,
-0.0012279629008844495,
-0.10060425847768784,
0.10806328058242798,
0.02248958684504032,
0.017035692930221558,
0.042087312787771225,
-0.19331015646457672,
0.00776625657454133,
-0.13020315766334534,
0.02436113730072975,
-0.1695788949728012,
-0.014149797149002552,
-0.07937423884868622,
-0.013723393902182579,
-0.06692516058683395,
0.08161991089582443,
0.02573283761739731,
0.021573232486844063,
-0.018781770020723343,
0.035407934337854385,
0.03347700461745262,
0.06945782154798508,
-0.1413159817457199,
-0.07505985349416733
] |
null | null |
transformers
|
# ft5 with re-zero
|
{}
|
text2text-generation
|
flax-community/ft5-rezero-base-openwebtext
|
[
"transformers",
"jax",
"tensorboard",
"t5",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #tensorboard #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# ft5 with re-zero
|
[
"# ft5 with re-zero"
] |
[
"TAGS\n#transformers #jax #tensorboard #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# ft5 with re-zero"
] |
[
51,
8
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# ft5 with re-zero"
] |
[
-0.012450553476810455,
0.01002016756683588,
-0.005105717107653618,
0.044242702424526215,
0.15167059004306793,
-0.008238161914050579,
0.13620339334011078,
0.15643177926540375,
0.0473761148750782,
0.006062797270715237,
0.14798040688037872,
0.16747118532657623,
0.008906958624720573,
0.11115997284650803,
-0.14426550269126892,
-0.16901570558547974,
0.017602214589715004,
0.07130070775747299,
-0.0686260238289833,
0.054050978273153305,
0.04678112268447876,
-0.04895378276705742,
0.08338356018066406,
-0.07778255641460419,
-0.22926588356494904,
0.09097471833229065,
0.09413070976734161,
-0.09430661052465439,
0.042682498693466187,
0.10565764456987381,
0.0943092554807663,
0.07909487187862396,
-0.04791668802499771,
-0.104819156229496,
0.021063024178147316,
0.08700275421142578,
-0.10302042216062546,
0.062344763427972794,
0.09417199343442917,
-0.05630471184849739,
-0.0205696988850832,
0.01060042530298233,
-0.028722509741783142,
0.08109166473150253,
-0.14246688783168793,
0.00986654032021761,
-0.029017940163612366,
0.002376703079789877,
0.10473170876502991,
0.08167161047458649,
-0.014126010239124298,
0.0947146937251091,
-0.04961723834276199,
0.12173805385828018,
0.11604997515678406,
-0.3295152187347412,
-0.018971364945173264,
0.1242387518286705,
-0.00973921176046133,
0.09753599762916565,
-0.03130262717604637,
0.07138344645500183,
0.04287748411297798,
-0.00028028516680933535,
0.05332814157009125,
-0.09077892452478409,
-0.1271962672472,
0.04904800280928612,
-0.11214374750852585,
0.02287723496556282,
0.2077551931142807,
-0.0011167684569954872,
0.06791484355926514,
0.012432181276381016,
-0.1343178153038025,
-0.16398437321186066,
-0.03169296681880951,
-0.06483426690101624,
-0.025331277400255203,
0.06709783524274826,
-0.013686955906450748,
-0.09471040964126587,
-0.11198338866233826,
-0.013314210809767246,
-0.17662550508975983,
0.06627663224935532,
-0.0251226294785738,
0.049895454198122025,
-0.24870145320892334,
0.03503550961613655,
-0.09448899328708649,
-0.10470369458198547,
0.04028904810547829,
-0.06300441920757294,
-0.06566118448972702,
-0.02931038849055767,
-0.07122427970170975,
-0.28275933861732483,
0.0835907980799675,
0.06264980137348175,
0.0847625806927681,
0.0730312243103981,
-0.09978272020816803,
0.06295575946569443,
-0.041262026876211166,
0.059209033846855164,
-0.025528794154524803,
-0.09725282341241837,
0.08586066216230392,
-0.07928218692541122,
0.0070459963753819466,
-0.09517520666122437,
-0.17597970366477966,
-0.038122761994600296,
0.06975141912698746,
0.06897636502981186,
0.01525881141424179,
0.08921101689338684,
-0.022281622514128685,
-0.03407770395278931,
-0.06778652966022491,
-0.07930444926023483,
0.010431147180497646,
-0.022243203595280647,
0.05356144160032272,
0.143950954079628,
0.03478217124938965,
-0.015758683905005455,
-0.12371703237295151,
-0.018749691545963287,
-0.08807345479726791,
-0.01648515835404396,
-0.06390588730573654,
-0.1125875785946846,
0.011590898036956787,
-0.07241132110357285,
-0.004962345119565725,
-0.13957418501377106,
-0.11105787009000778,
0.023959940299391747,
0.022648682817816734,
-0.06074725091457367,
0.0038143680430948734,
-0.05766667425632477,
-0.10098093003034592,
0.06469385325908661,
-0.030536772683262825,
0.01040810439735651,
0.007803921587765217,
0.0369647815823555,
0.029486464336514473,
0.08133773505687714,
-0.14570103585720062,
0.05344480276107788,
-0.06343793869018555,
-0.014569141902029514,
-0.09716138988733292,
0.08356304466724396,
-0.03767526522278786,
0.08688769489526749,
-0.06208600848913193,
0.04451890289783478,
-0.09101156145334244,
0.012590009719133377,
0.0044935839250683784,
0.18380513787269592,
-0.16905933618545532,
-0.10291796177625656,
0.19015467166900635,
-0.050481684505939484,
-0.1266878992319107,
0.0787988230586052,
0.015844549983739853,
-0.020550144836306572,
0.06000574678182602,
0.17997899651527405,
0.12590070068836212,
0.004425962921231985,
0.09740015864372253,
0.10495001822710037,
-0.0895497277379036,
-0.09677066653966904,
-0.03514380753040314,
-0.009672735817730427,
-0.11757894605398178,
0.04089861735701561,
0.08878865092992783,
0.06006423383951187,
-0.029787002131342888,
-0.03631995618343353,
-0.05987454205751419,
-0.008576060645282269,
0.054878294467926025,
-0.015506378374993801,
0.1326911747455597,
-0.0491911955177784,
-0.009160473942756653,
-0.0077646574936807156,
-0.041738759726285934,
-0.06811388581991196,
0.018774831667542458,
-0.04988180473446846,
0.12459465861320496,
-0.17038042843341827,
0.06254256516695023,
-0.1820097118616104,
-0.13748866319656372,
-0.0030453393701463938,
0.09814386069774628,
0.013142899610102177,
0.19354109466075897,
0.09194980561733246,
-0.04097858816385269,
-0.018863115459680557,
0.017914047464728355,
0.12408512830734253,
0.019159475341439247,
-0.10454361140727997,
-0.11270541697740555,
0.05548940971493721,
-0.07836911827325821,
-0.01223201584070921,
-0.13155189156532288,
0.022785838693380356,
0.06638793647289276,
0.11944834142923355,
0.09057970345020294,
0.030722880735993385,
-0.024205993860960007,
0.016437632963061333,
-0.07270310819149017,
-0.022297833114862442,
0.07764139771461487,
-0.029217379167675972,
-0.09441903978586197,
0.1297520101070404,
-0.18168558180332184,
0.24661770462989807,
0.17764708399772644,
-0.1651378720998764,
-0.01175185851752758,
0.016268666833639145,
0.014945173636078835,
0.01822991669178009,
0.04484265297651291,
-0.0026481037493795156,
0.019496411085128784,
-0.014368920587003231,
0.15835721790790558,
-0.028535883873701096,
-0.04489121958613396,
0.04500041902065277,
-0.04160468652844429,
-0.05241953954100609,
0.046650346368551254,
0.09979421645402908,
-0.2863733172416687,
0.16746675968170166,
0.18638455867767334,
-0.038180314004421234,
0.175103560090065,
0.00733009772375226,
-0.05625568702816963,
0.01930863969027996,
0.03289065137505531,
0.017437048256397247,
-0.013862128369510174,
-0.16985838115215302,
-0.02239537239074707,
0.0472259446978569,
0.01228394266217947,
0.0572483129799366,
-0.08170381933450699,
0.006838924251496792,
0.0016971207223832607,
-0.03908146917819977,
0.02802344784140587,
0.08804327249526978,
0.024458272382616997,
0.11014988273382187,
-0.05085598677396774,
-0.030034910887479782,
0.1252107322216034,
-0.0033784620463848114,
-0.11366860568523407,
0.161285862326622,
-0.10452907532453537,
-0.24711227416992188,
-0.14115194976329803,
-0.11168985813856125,
-0.11322533339262009,
0.03160865604877472,
0.06201733648777008,
-0.08083675056695938,
-0.06535609066486359,
-0.0454414002597332,
0.04289838299155235,
-0.0915130004286766,
0.08000680804252625,
-0.0386219285428524,
0.04811352863907814,
0.03146340698003769,
-0.06926322728395462,
-0.029767269268631935,
-0.010043775662779808,
-0.02919011563062668,
0.0987498089671135,
-0.07727168500423431,
0.0696345642209053,
0.17099103331565857,
-0.020536361262202263,
0.07369808852672577,
-0.023292699828743935,
0.1649109423160553,
-0.052707262337207794,
0.042307350784540176,
0.12674686312675476,
-0.030202537775039673,
0.04489509016275406,
0.14565208554267883,
-0.006537990178912878,
-0.09166841953992844,
0.01748575083911419,
0.044924501329660416,
-0.0732402503490448,
-0.22533130645751953,
-0.008945069275796413,
-0.15080563724040985,
0.04696042835712433,
0.029048355296254158,
0.056868601590394974,
0.18480588495731354,
0.07360073179006577,
0.02540757693350315,
0.12746880948543549,
0.009746668860316277,
0.05766009911894798,
0.19483698904514313,
0.03442097455263138,
0.14972369372844696,
-0.08240489661693573,
-0.1059584841132164,
0.07039497047662735,
0.04257863387465477,
0.1608802229166031,
0.07129873335361481,
0.0938829556107521,
-0.000855565071105957,
0.064007468521595,
0.12858539819717407,
0.1261087954044342,
0.04501264914870262,
-0.049999646842479706,
-0.008922254666686058,
-0.012065965682268143,
0.02840089052915573,
0.003993184305727482,
0.07401012629270554,
-0.10394517332315445,
-0.05543533340096474,
-0.0341152586042881,
0.08064452558755875,
0.11258301138877869,
0.1099352166056633,
-0.2488292008638382,
0.041466787457466125,
0.07335662096738815,
-0.02728121168911457,
-0.07425063103437424,
0.0511484332382679,
0.07116973400115967,
-0.02847745083272457,
0.0797237977385521,
-0.10459525883197784,
0.09644999355077744,
-0.0041105980053544044,
0.09210468083620071,
-0.013870254158973694,
-0.003950837999582291,
0.011779977940022945,
0.07106295228004456,
-0.3926906883716583,
0.12961620092391968,
0.038564711809158325,
-0.034051597118377686,
-0.09003295749425888,
0.017697716131806374,
-0.010120529681444168,
0.06084585189819336,
0.1309186965227127,
-0.029265908524394035,
-0.042422596365213394,
-0.045788511633872986,
0.002329497365280986,
0.017521042376756668,
0.1442725658416748,
-0.03625551238656044,
0.04014407843351364,
-0.06464094668626785,
-0.046327292919158936,
0.04036812111735344,
0.1076783686876297,
-0.0670877993106842,
-0.17997360229492188,
0.10412005335092545,
0.07403630763292313,
-0.09777013212442398,
0.030975984409451485,
-0.011591276153922081,
-0.049945395439863205,
0.19864556193351746,
-0.10438976436853409,
-0.031126931309700012,
-0.1574111133813858,
0.021915780380368233,
0.029050463810563087,
-0.0638049766421318,
0.03719526156783104,
-0.04233332350850105,
0.06412006169557571,
-0.023719359189271927,
-0.28894931077957153,
0.1659318506717682,
-0.08732134103775024,
-0.0011043898994103074,
-0.058639880269765854,
0.10036395490169525,
-0.09419014304876328,
-0.006063111592084169,
0.0018240568460896611,
-0.02257826365530491,
0.008345387876033783,
-0.07243555784225464,
0.04458596929907799,
-0.004865644965320826,
-0.01380909699946642,
-0.04280102625489235,
-0.061959583312273026,
-0.01949715055525303,
-0.006522068753838539,
0.05854003503918648,
0.26053813099861145,
0.19094857573509216,
-0.07046184688806534,
0.10331079363822937,
0.015115353278815746,
-0.07916447520256042,
-0.30130934715270996,
0.006407111417502165,
-0.0762399360537529,
0.033577851951122284,
-0.005639339797198772,
-0.09753832966089249,
0.04400763288140297,
-0.011851766146719456,
0.005915287882089615,
0.1685095727443695,
-0.2290038764476776,
-0.08559000492095947,
0.14617273211479187,
0.07145655155181885,
0.3782743215560913,
-0.13370221853256226,
-0.09678332507610321,
-0.033252518624067307,
-0.07090536504983902,
0.191123366355896,
-0.20400583744049072,
0.12136851996183395,
-0.00536561943590641,
0.03725595399737358,
0.04755859822034836,
-0.02425713650882244,
0.03809238597750664,
-0.009950484149158001,
0.0706709697842598,
-0.062315043061971664,
-0.01396976225078106,
0.1039649024605751,
-0.045920949429273605,
0.021619003266096115,
-0.0604158490896225,
0.018052227795124054,
-0.022697096690535545,
-0.027451427653431892,
-0.061624255031347275,
0.05186109617352486,
0.014949934557080269,
-0.05289789289236069,
0.02490413933992386,
-0.0841994509100914,
0.027499601244926453,
-0.045288313180208206,
0.27332496643066406,
-0.060748469084501266,
0.14449504017829895,
0.2728237807750702,
0.09904138743877411,
-0.07880443334579468,
0.02782917022705078,
-0.002732630353420973,
-0.08148589730262756,
0.08447695523500443,
-0.12405486404895782,
0.09502054005861282,
0.07408098131418228,
-0.0532219335436821,
0.07632952183485031,
0.12532316148281097,
-0.012218167074024677,
-0.038226135075092316,
0.1281377524137497,
-0.2135605663061142,
-0.07788293808698654,
-0.08988291025161743,
-0.13000603020191193,
0.01848607324063778,
0.11109772324562073,
0.21180514991283417,
-0.016922954469919205,
0.019237903878092766,
0.012952523306012154,
-0.013395342044532299,
-0.06987715512514114,
0.09169802814722061,
0.051633480936288834,
0.025154100731015205,
-0.1334446221590042,
0.1446632742881775,
0.0244985893368721,
-0.15531937777996063,
0.0443216934800148,
0.13682565093040466,
-0.10776600986719131,
-0.11974819004535675,
0.051268864423036575,
0.25226539373397827,
-0.08999589085578918,
0.022702302783727646,
-0.10836378484964371,
-0.12416452169418335,
0.06726289540529251,
0.2592410147190094,
0.0023883595131337643,
0.06893462687730789,
-0.047304291278123856,
-0.0419231578707695,
-0.09438636153936386,
0.03528326004743576,
-0.02014424465596676,
0.04600946605205536,
-0.1435299515724182,
0.030752411112189293,
-0.09947630017995834,
0.08183056861162186,
-0.10575226694345474,
0.00803126860409975,
-0.19470255076885223,
-0.030386686325073242,
-0.08909276872873306,
-0.07978025078773499,
-0.022160068154335022,
-0.03943657502532005,
0.02197776362299919,
-0.009307507425546646,
-0.08378671854734421,
-0.011580646969377995,
-0.10646329820156097,
-0.017702657729387283,
-0.023726804181933403,
0.02027595043182373,
-0.03872314468026161,
-0.011341974139213562,
-0.013945244252681732,
-0.048257436603307724,
0.0968920961022377,
0.08979521691799164,
-0.07111192494630814,
0.12085144221782684,
-0.10293027758598328,
-0.08507532626390457,
0.08908753097057343,
0.012381341308355331,
0.10402107238769531,
0.08316975831985474,
0.009072848595678806,
0.04717995598912239,
0.04036860540509224,
0.03428684175014496,
-0.013278319500386715,
-0.06412605941295624,
-0.04100468009710312,
-0.13509073853492737,
-0.09086556732654572,
-0.06650182604789734,
0.0028690043836832047,
0.07056383043527603,
0.045381098985672,
0.11737452447414398,
-0.07940109074115753,
0.07953799515962601,
-0.11547932773828506,
0.008090074174106121,
0.039384737610816956,
-0.1471603363752365,
-0.008111333474516869,
-0.056680019944906235,
0.04436380788683891,
-0.050773099064826965,
0.13052168488502502,
-0.013747586868703365,
0.055408213287591934,
0.03800252079963684,
-0.0676802471280098,
0.012069220654666424,
0.03902303799986839,
0.20642247796058655,
0.07916021347045898,
-0.07772717624902725,
-0.12193324416875839,
0.08530380576848984,
0.050753992050886154,
0.09678727388381958,
0.16342586278915405,
-0.007957999594509602,
-0.15251223742961884,
0.15144208073616028,
-0.029896581545472145,
-0.040141742676496506,
-0.019390322268009186,
0.0603080615401268,
-0.053049709647893906,
0.11077675223350525,
0.007579098455607891,
-0.03293795883655548,
0.2328585684299469,
-0.03749518096446991,
0.027370570227503777,
-0.013457322493195534,
-0.07951778918504715,
-0.19364744424819946,
-0.21517354249954224,
-0.1158929094672203,
-0.14795775711536407,
-0.0017908893059939146,
-0.10840464383363724,
0.06699177622795105,
0.010462510399520397,
0.07582442462444305,
-0.04500090703368187,
0.12179731577634811,
0.09927964955568314,
-0.07044163346290588,
0.10122120380401611,
-0.027595167979598045,
0.007914931513369083,
0.011126129887998104,
0.03175385668873787,
-0.05334576964378357,
-0.000013859498722013086,
-0.04233742877840996,
0.027559086680412292,
0.010637626051902771,
0.07835662364959717,
-0.10694456845521927,
-0.1266128122806549,
-0.0029600670095533133,
0.05210348963737488,
-0.04487842321395874,
0.14545442163944244,
0.023626068606972694,
-0.06753438711166382,
0.0158789474517107,
0.1958116888999939,
-0.13098692893981934,
0.0185219906270504,
-0.08037156611680984,
0.15507417917251587,
0.08613711595535278,
0.14009729027748108,
-0.0522591657936573,
-0.04851428419351578,
-0.04392050951719284,
0.2955577075481415,
0.2226712703704834,
-0.08703265339136124,
0.03166377916932106,
0.03884054347872734,
0.006308406125754118,
0.04890824109315872,
0.10490585118532181,
0.022273128852248192,
0.14725688099861145,
-0.04259074479341507,
-0.0037894027773290873,
0.035418301820755005,
0.006689508445560932,
-0.07052598893642426,
0.21667629480361938,
0.05601946637034416,
-0.055686768144369125,
0.0013945968821644783,
0.05824372544884682,
-0.18017014861106873,
0.12284059077501297,
-0.03251292183995247,
-0.15737491846084595,
-0.0746217668056488,
-0.043899163603782654,
0.09531744569540024,
-0.04453791677951813,
0.07359686493873596,
-0.045404739677906036,
-0.0373583622276783,
0.04180978611111641,
0.007193507626652718,
-0.1759449541568756,
-0.004022958688437939,
-0.02474690042436123,
-0.053975727409124374,
-0.01795913279056549,
-0.002180757001042366,
-0.04692763090133667,
0.10377572476863861,
0.03689120337367058,
-0.05532299727201462,
0.08206989616155624,
-0.030784621834754944,
-0.038117893040180206,
0.026496244594454765,
0.10140126943588257,
-0.017855744808912277,
0.019898932427167892,
0.05161544308066368,
-0.1252136528491974,
0.022998975589871407,
-0.07568781077861786,
-0.040944162756204605,
-0.0017976396484300494,
-0.04117339849472046,
-0.05209726095199585,
0.0697779431939125,
0.11606485396623611,
0.0007421497139148414,
0.04292290285229683,
-0.056881383061409,
-0.00158159458078444,
0.03275347501039505,
-0.00518295681104064,
-0.09733352065086365,
-0.07068007439374924,
-0.03721720725297928,
0.08572234213352203,
-0.01403079740703106,
-0.25289416313171387,
0.010430986993014812,
-0.09012753516435623,
0.024060817435383797,
-0.21012261509895325,
0.09072824567556381,
0.16973567008972168,
0.01712414249777794,
-0.02585640735924244,
-0.12070532888174057,
0.0505758561193943,
0.08724182844161987,
-0.11548294872045517,
-0.0925588384270668
] |
null | null | null |
# GPT-2 GERMAN
## Model description
See [Open AI's model card](https://github.com/openai/gpt-2/blob/master/model_card.md) and [Huggingface's model card](https://huggingface.co/gpt2) for the original model.
## Intended uses & limitations
#### How to use
```python
def foo(bar)
bar +=1
return bar
```
#### Limitations and bias
On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ?? https://dl.acm.org/doi/10.1145/3442188.3445922
```
@inproceedings{10.1145/3442188.3445922,
author = {Bender, Emily M. and Gebru, Timnit and McMillan-Major, Angelina and Shmitchell, Shmargaret},
title = {On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ??},
year = {2021},
isbn = {9781450383097},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3442188.3445922},
doi = {10.1145/3442188.3445922},
abstract = {The past 3 years of work in NLP have been characterized by the development and deployment of ever larger language models, especially for English. BERT, its variants, GPT-2/3, and others, most recently Switch-C, have pushed the boundaries of the possible both through architectural innovations and through sheer size. Using these pretrained models and the methodology of fine-tuning them for specific tasks, researchers have extended the state of the art on a wide array of tasks as measured by leaderboards on specific benchmarks for English. In this paper, we take a step back and ask: How big is too big? What are the possible risks associated with this technology and what paths are available for mitigating those risks? We provide recommendations including weighing the environmental and financial costs first, investing resources into curating and carefully documenting datasets rather than ingesting everything on the web, carrying out pre-development exercises evaluating how the planned approach fits into research and development goals and supports stakeholder values, and encouraging research directions beyond ever larger language models.},
booktitle = {Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency},
pages = {610?623},
numpages = {14},
location = {Virtual Event, Canada},
series = {FAccT '21}
}
```
## Training data
https://huggingface.co/datasets/german-nlp-group/german_common_crawl
```json
{'url': 'http://my-shop.ru/shop/books/545473.html',
'date_download': '2016-10-20T19:38:58Z',
'digest': 'sha1:F62EMGYLZDIKF4UL5JZYU47KWGGUBT7T',
'length': 1155,
'nlines': 4,
'source_domain': 'my-shop.ru',
'title': 'Grammatikalische Liebeslieder. Methodische Vorschläge',
'raw_content': 'Grammatikalische Liebeslieder. [....]',
'cc_segment': 'crawl-data/CC-MAIN-2016-44/segments/1476988717783.68/wet/CC-MAIN-20161020183837-00354-ip-10-171-6-4.ec2.internal.warc.wet.gz',
'original_nlines': 99,
'original_length': 2672,
'language': 'de',
'language_score': 1.0,
'perplexity': 283.0,
'bucket': 'head'}"
```
## Training procedure
TODO (See [training](training.md))
## Eval results
TODO: Self-BLEU, Diversity, and other metrics from https://arxiv.org/abs/1904.09751
```
@inproceedings{DBLP:conf/iclr/HoltzmanBDFC20,
author = {Ari Holtzman and
Jan Buys and
Li Du and
Maxwell Forbes and
Yejin Choi},
title = {The Curious Case of Neural Text Degeneration},
booktitle = {8th International Conference on Learning Representations, {ICLR} 2020,
Addis Ababa, Ethiopia, April 26-30, 2020},
publisher = {OpenReview.net},
year = {2020},
url = {https://openreview.net/forum?id=rygGQyrFvH},
timestamp = {Thu, 21 Jan 2021 17:36:46 +0100},
biburl = {https://dblp.org/rec/conf/iclr/HoltzmanBDFC20.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### BibTeX entry and citation info
Does the Huggingface hub generate DOIs? Otherwise maybe Kaggle or Zenodo to generate one.
```bibtex
@inproceedings{...,
year={2021}
}
```
|
{"language": [], "tags": [], "datasets": [], "metrics": []}
| null |
flax-community/gpt-2-german
|
[
"arxiv:1904.09751",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1904.09751"
] |
[] |
TAGS
#arxiv-1904.09751 #region-us
|
# GPT-2 GERMAN
## Model description
See Open AI's model card and Huggingface's model card for the original model.
## Intended uses & limitations
#### How to use
#### Limitations and bias
On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ?? URL
## Training data
URL
## Training procedure
TODO (See training)
## Eval results
TODO: Self-BLEU, Diversity, and other metrics from URL
### BibTeX entry and citation info
Does the Huggingface hub generate DOIs? Otherwise maybe Kaggle or Zenodo to generate one.
|
[
"# GPT-2 GERMAN",
"## Model description\n\nSee Open AI's model card and Huggingface's model card for the original model.",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\nOn the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ?? URL",
"## Training data\n\nURL",
"## Training procedure\n\nTODO (See training)",
"## Eval results\n\nTODO: Self-BLEU, Diversity, and other metrics from URL",
"### BibTeX entry and citation info\n\nDoes the Huggingface hub generate DOIs? Otherwise maybe Kaggle or Zenodo to generate one."
] |
[
"TAGS\n#arxiv-1904.09751 #region-us \n",
"# GPT-2 GERMAN",
"## Model description\n\nSee Open AI's model card and Huggingface's model card for the original model.",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\nOn the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ?? URL",
"## Training data\n\nURL",
"## Training procedure\n\nTODO (See training)",
"## Eval results\n\nTODO: Self-BLEU, Diversity, and other metrics from URL",
"### BibTeX entry and citation info\n\nDoes the Huggingface hub generate DOIs? Otherwise maybe Kaggle or Zenodo to generate one."
] |
[
15,
6,
23,
9,
5,
29,
4,
9,
21,
33
] |
[
"passage: TAGS\n#arxiv-1904.09751 #region-us \n# GPT-2 GERMAN## Model description\n\nSee Open AI's model card and Huggingface's model card for the original model.## Intended uses & limitations#### How to use#### Limitations and bias\n\nOn the Dangers of Stochastic Parrots: Can Language Models Be Too Big? ?? URL## Training data\n\nURL## Training procedure\n\nTODO (See training)## Eval results\n\nTODO: Self-BLEU, Diversity, and other metrics from URL### BibTeX entry and citation info\n\nDoes the Huggingface hub generate DOIs? Otherwise maybe Kaggle or Zenodo to generate one."
] |
[
-0.0615273155272007,
-0.007313461974263191,
0.0013385816710069776,
0.13735422492027283,
0.0796113908290863,
0.019052481278777122,
0.15015879273414612,
0.07047519832849503,
0.13053306937217712,
0.11106620728969574,
0.08727765083312988,
0.010229001753032207,
0.16592589020729065,
0.2591893672943115,
-0.010318899527192116,
-0.14357566833496094,
0.03793254494667053,
0.002497639274224639,
0.03944487124681473,
0.07144356518983841,
0.04508340731263161,
-0.013817310333251953,
0.1349225491285324,
0.05551930516958237,
-0.12784738838672638,
-0.022006873041391373,
-0.051980599761009216,
0.005952855572104454,
0.09132055193185806,
0.05245822295546532,
0.012559153139591217,
-0.0018330372404307127,
0.03919850289821625,
-0.10392529517412186,
0.04022630676627159,
0.0017997550312429667,
-0.04326936975121498,
0.0937461256980896,
0.04810844734311104,
-0.029979078099131584,
0.16220876574516296,
0.05044511333107948,
-0.01974579319357872,
0.04809314385056496,
-0.058029040694236755,
-0.182713121175766,
-0.13960106670856476,
0.04870738834142685,
0.00853446964174509,
0.02174733392894268,
-0.02575591206550598,
0.15190377831459045,
0.007998568937182426,
0.055763114243745804,
0.24335837364196777,
-0.173324316740036,
-0.024185627698898315,
0.1039319708943367,
0.043009404093027115,
0.06696134060621262,
-0.11540333181619644,
0.06823479384183884,
0.1931840479373932,
0.027137335389852524,
0.035367656499147415,
-0.007877313531935215,
0.07558035105466843,
-0.05437333509325981,
-0.04930407181382179,
-0.06176142022013664,
0.12990505993366241,
0.09275847673416138,
-0.10391756892204285,
-0.13294661045074463,
-0.024131428450345993,
0.009853175841271877,
-0.024793438613414764,
-0.04549495130777359,
0.028728758916258812,
-0.024349594488739967,
0.1346530020236969,
-0.13957586884498596,
-0.030226605013012886,
-0.08275027573108673,
-0.033531300723552704,
0.09648442268371582,
0.008198142051696777,
0.06369493901729584,
-0.011781764216721058,
0.08480923622846603,
-0.09031224995851517,
-0.07657529413700104,
-0.02088351920247078,
-0.029060443863272667,
0.03994115814566612,
0.0740518569946289,
0.08632077276706696,
0.1600443720817566,
0.08520636707544327,
0.11971550434827805,
0.007298401091247797,
-0.00792024563997984,
-0.11778482794761658,
0.04026380926370621,
0.04164351895451546,
0.010183272883296013,
-0.101790651679039,
-0.2117646336555481,
0.12368952482938766,
0.04183894023299217,
0.05683860182762146,
-0.07930265367031097,
-0.10684078186750412,
-0.022524377331137657,
0.01559212151914835,
0.00668623112142086,
0.05078893154859543,
-0.004496654495596886,
-0.05519072711467743,
-0.030784238129854202,
0.15946967899799347,
-0.03913330286741257,
0.008503559045493603,
-0.0047934711910784245,
-0.034583963453769684,
-0.1189628541469574,
0.047417815774679184,
0.006021485663950443,
-0.00899256020784378,
0.0005664477939717472,
-0.07735690474510193,
0.0023001322988420725,
-0.004216824192553759,
-0.09398947656154633,
-0.016844451427459717,
-0.1967717707157135,
0.027181895449757576,
-0.10186231881380081,
-0.09740326553583145,
-0.04151901975274086,
-0.050467949360609055,
-0.09290636330842972,
-0.04214520379900932,
-0.043508440256118774,
-0.0545395091176033,
-0.03198091313242912,
-0.013754095882177353,
-0.0034811946097761393,
-0.06336826831102371,
-0.008724955841898918,
0.03602449968457222,
0.07674078643321991,
-0.17464394867420197,
-0.004218891263008118,
-0.1106700673699379,
0.05720652639865875,
-0.04748481139540672,
0.07006251811981201,
-0.08390457183122635,
0.07505714893341064,
-0.054294850677251816,
-0.050934962928295135,
-0.08737489581108093,
0.03696518763899803,
0.11384282261133194,
0.18882815539836884,
-0.08763917535543442,
-0.03404107317328453,
0.18301993608474731,
-0.08724985271692276,
-0.21126608550548553,
0.23011289536952972,
-0.04473230242729187,
0.10504323244094849,
0.008296755142509937,
0.2421063780784607,
-0.17577077448368073,
-0.0687340646982193,
0.010640090331435204,
-0.037031907588243484,
-0.07187698781490326,
0.10122909396886826,
0.07351099699735641,
0.004811746999621391,
-0.15630798041820526,
0.025205515325069427,
0.023566391319036484,
-0.03920276090502739,
-0.0055669154971838,
-0.059159960597753525,
-0.008913385681807995,
-0.04816567525267601,
-0.007659668568521738,
-0.004828778561204672,
-0.011745394207537174,
-0.1275216042995453,
-0.025457138195633888,
-0.12240266054868698,
0.06065322086215019,
-0.01707310788333416,
-0.01050497405230999,
-0.07847024500370026,
0.1632673293352127,
-0.1805860549211502,
0.0016977330669760704,
-0.055206798017024994,
0.01597318984568119,
-0.0152533408254385,
0.01783110946416855,
0.20908218622207642,
0.15426430106163025,
0.05131182819604874,
0.053698278963565826,
-0.005057551432400942,
0.03829057514667511,
0.0018704673275351524,
-0.017876505851745605,
-0.06622776389122009,
-0.13176947832107544,
0.046231746673583984,
-0.07805264741182327,
0.11577048152685165,
-0.26685279607772827,
0.03367131948471069,
-0.028911767527461052,
0.06317819654941559,
0.007511650212109089,
-0.005182601977139711,
0.03188398852944374,
-0.048285890370607376,
-0.11367446929216385,
-0.04071496054530144,
0.03699343279004097,
-0.019336480647325516,
-0.08084288239479065,
0.06790851801633835,
-0.09761576354503632,
0.05097245052456856,
0.1641952246427536,
0.11456990242004395,
-0.10931207239627838,
-0.08896319568157196,
-0.03795510157942772,
-0.011096668429672718,
-0.029702816158533096,
-0.023986678570508957,
0.0847550630569458,
-0.020725755020976067,
0.09981757402420044,
-0.15090979635715485,
0.021450132131576538,
0.06520796567201614,
-0.05447668582201004,
-0.023943791165947914,
0.11084440350532532,
0.02197759412229061,
-0.058556538075208664,
0.13007758557796478,
0.01040195394307375,
0.0084295105189085,
0.12975968420505524,
0.08169806003570557,
-0.027619989588856697,
-0.0368029847741127,
0.029578395187854767,
0.013593576848506927,
0.19756990671157837,
-0.1382168084383011,
0.011919694021344185,
0.08126244693994522,
0.019289348274469376,
0.046791914850473404,
-0.15534578263759613,
-0.034443337470293045,
0.027828902006149292,
-0.03884653374552727,
-0.02984420396387577,
0.03786667436361313,
-0.023462189361453056,
0.1205328106880188,
-0.011300834827125072,
-0.1112012043595314,
-0.020766668021678925,
-0.023032287135720253,
-0.08943802118301392,
0.0943119004368782,
0.024556560441851616,
-0.1753796935081482,
-0.10362266004085541,
-0.08131541311740875,
-0.1534072309732437,
0.038822855800390244,
0.05042276903986931,
0.000044596785301109776,
0.009271041490137577,
-0.11306650936603546,
0.06524155288934708,
0.06901051104068756,
-0.03860187530517578,
0.0011086123995482922,
0.0003084224881604314,
0.0019076383905485272,
-0.10558811575174332,
-0.03743143379688263,
-0.0492374412715435,
-0.028478063642978668,
0.012477044947445393,
-0.040854837745428085,
0.15423554182052612,
0.0724068284034729,
-0.03449731320142746,
-0.00446506729349494,
-0.019012892618775368,
0.2277919352054596,
-0.09535664319992065,
-0.01193995214998722,
0.051583074033260345,
0.021551521494984627,
0.05446036532521248,
0.14346228539943695,
0.019723491743206978,
-0.1502808928489685,
-0.0017087316373363137,
-0.010990873910486698,
-0.1223813146352768,
-0.1651594042778015,
-0.04586692899465561,
-0.09014174342155457,
0.15835490822792053,
0.03190642595291138,
0.027713680639863014,
0.06814558058977127,
0.14213120937347412,
0.0043530771508812904,
0.05312841758131981,
-0.01395496353507042,
0.08386363089084625,
-0.16172277927398682,
0.009463843889534473,
0.07664010673761368,
-0.06851817667484283,
-0.052658528089523315,
0.17077039182186127,
0.12841463088989258,
0.1267048567533493,
-0.037181273102760315,
0.05185195058584213,
0.057499852031469345,
0.11652328073978424,
-0.019985804334282875,
0.09585732221603394,
-0.07254979014396667,
-0.03557087481021881,
-0.1079048216342926,
-0.02351101115345955,
-0.049940403550863266,
0.11951381713151932,
0.016034167259931564,
0.014312171377241611,
-0.017635101452469826,
-0.09875805675983429,
0.027105357497930527,
0.018382620066404343,
0.08068439364433289,
-0.21344716846942902,
-0.004636561963707209,
0.05236741527915001,
0.028988808393478394,
-0.10640978068113327,
0.009559329599142075,
0.10332489013671875,
-0.15565186738967896,
0.053725648671388626,
-0.02468544989824295,
0.14447949826717377,
0.02822292223572731,
-0.023314598947763443,
-0.22089970111846924,
0.14204828441143036,
-0.07992756366729736,
0.09514915198087692,
-0.0983634740114212,
0.16675136983394623,
-0.03472417965531349,
-0.018783601000905037,
-0.06434781849384308,
-0.014770586974918842,
0.09986118972301483,
0.02359624393284321,
0.1816229671239853,
-0.01268516480922699,
0.0029887601267546415,
-0.088689424097538,
-0.09405598789453506,
0.015331070870161057,
-0.004178889561444521,
-0.10188952833414078,
0.018566472455859184,
0.031046802178025246,
0.06077233701944351,
-0.052184317260980606,
0.105653777718544,
-0.027488525956869125,
-0.1110723614692688,
-0.024003874510526657,
-0.03276364877820015,
-0.07989748567342758,
0.028123991563916206,
-0.05410342290997505,
-0.07003218680620193,
0.16247698664665222,
0.043968796730041504,
-0.007006493397057056,
-0.07549247145652771,
-0.007990486919879913,
0.02054648846387863,
-0.032779186964035034,
-0.01576763205230236,
-0.060513436794281006,
0.060660187155008316,
-0.0690687894821167,
0.027428055182099342,
0.10815267264842987,
-0.08366639167070389,
-0.1419813632965088,
-0.02893362194299698,
0.0996866226196289,
0.013459976762533188,
0.06959778070449829,
0.054818470031023026,
-0.0185378510504961,
-0.03648071363568306,
-0.15017107129096985,
0.052042070776224136,
0.03448861464858055,
0.012853681109845638,
0.14796803891658783,
0.024968314915895462,
-0.09653866291046143,
-0.1361819952726364,
-0.012032479979097843,
0.09773163497447968,
0.3275943696498871,
-0.08383790403604507,
0.10751868039369583,
0.10428992658853531,
-0.03093513660132885,
-0.20344896614551544,
0.04390130937099457,
-0.056543804705142975,
0.08688804507255554,
0.03736278414726257,
0.031856782734394073,
0.048899758607149124,
0.012554321438074112,
-0.047871384769678116,
0.19709327816963196,
-0.014922070316970348,
-0.1397671401500702,
0.08882538229227066,
0.07156115770339966,
0.12991201877593994,
-0.05424424633383751,
-0.03142813220620155,
-0.06848365068435669,
-0.16016845405101776,
0.15275180339813232,
-0.007818770594894886,
0.07359126955270767,
-0.008170545101165771,
0.07236790657043457,
0.04660983756184578,
-0.001189286122098565,
0.19008080661296844,
-0.03001892752945423,
0.08723669499158859,
-0.11282587796449661,
0.015242602676153183,
0.023817503824830055,
-0.04101724550127983,
0.15003588795661926,
-0.1261814534664154,
-0.03751746937632561,
-0.18024809658527374,
-0.033872004598379135,
-0.05983702465891838,
0.08672402054071426,
0.03700598329305649,
-0.04141679406166077,
-0.16610656678676605,
0.03741436079144478,
0.042572394013404846,
0.04012202471494675,
0.07729552686214447,
-0.04401257261633873,
0.009955408051609993,
0.18346738815307617,
0.1422954499721527,
-0.07308397442102432,
0.04637899622321129,
-0.006032247096300125,
-0.025310765951871872,
0.024610623717308044,
-0.13521677255630493,
-0.005547603126615286,
0.09029781818389893,
0.03986776992678642,
0.12001711130142212,
0.029457902535796165,
-0.07723468542098999,
0.08340901136398315,
0.08758412301540375,
-0.07268709689378738,
-0.23661142587661743,
0.00916360318660736,
-0.07005476206541061,
-0.03294514864683151,
-0.004297344945371151,
0.08143734931945801,
-0.08562745153903961,
-0.019404994323849678,
-0.020241020247340202,
0.06525929272174835,
-0.08342979103326797,
0.05806625634431839,
0.0760752335190773,
0.011297203600406647,
-0.12455639988183975,
0.030693601816892624,
-0.011816889978945255,
0.02789953351020813,
0.05880115181207657,
0.046606939285993576,
-0.03698459267616272,
0.010402155108749866,
-0.04584058001637459,
0.17926537990570068,
-0.019361257553100586,
-0.022734837606549263,
-0.049434561282396317,
-0.07011078298091888,
-0.0826050341129303,
-0.10472015291452408,
0.1004265770316124,
-0.04377485066652298,
0.008546605706214905,
-0.03431607782840729,
-0.1080731749534607,
0.06705006211996078,
0.12528572976589203,
0.06384610384702682,
-0.05851346626877785,
0.015147238969802856,
0.042507752776145935,
0.1298564374446869,
-0.0771026685833931,
0.012153125368058681,
-0.08545613288879395,
-0.059560876339673996,
-0.19716039299964905,
0.09864151477813721,
-0.07355573773384094,
0.0075392406433820724,
-0.031559720635414124,
-0.1305818259716034,
-0.08434312045574188,
0.0036343869287520647,
-0.11798001825809479,
0.0045177629217505455,
0.030182162299752235,
0.05495878681540489,
-0.022603781893849373,
-0.01338922418653965,
0.10506151616573334,
-0.0363936647772789,
0.09615621715784073,
0.03715628385543823,
-0.0462641566991806,
-0.04695168882608414,
-0.12332118302583694,
0.020165573805570602,
0.02168024331331253,
0.027502160519361496,
0.050973545759916306,
-0.09502466022968292,
0.012937570922076702,
-0.010885577648878098,
0.011742116883397102,
-0.0022520655766129494,
0.13940908014774323,
-0.0817524790763855,
-0.05019867792725563,
-0.06443089246749878,
-0.13512097299098969,
0.03671657294034958,
0.013573580421507359,
0.04042911157011986,
0.08674488216638565,
0.0750020369887352,
0.03087877295911312,
0.08938464522361755,
-0.15858596563339233,
-0.028448564931750298,
-0.020343933254480362,
-0.0775911808013916,
-0.01587028056383133,
-0.04901017248630524,
0.036437515169382095,
0.019298911094665527,
0.23676101863384247,
0.07345101237297058,
-0.1072017177939415,
-0.01653912290930748,
0.06418772786855698,
0.07538670301437378,
-0.06232123821973801,
0.048320744186639786,
-0.06257680058479309,
0.0014136536046862602,
-0.042273297905921936,
0.030526405200362206,
0.02491527795791626,
0.034580048173666,
0.13717129826545715,
0.06817874312400818,
0.03478464111685753,
0.0471242219209671,
0.07345611602067947,
0.007038656622171402,
-0.12264557927846909,
0.014856589958071709,
0.053769782185554504,
0.02500816620886326,
-0.006302153691649437,
0.10737179964780807,
0.18494611978530884,
-0.10896258801221848,
0.07121558487415314,
0.05992770567536354,
-0.060736242681741714,
-0.10917935520410538,
-0.15473484992980957,
-0.04672292247414589,
-0.11680105328559875,
0.05232001468539238,
-0.07473397254943848,
-0.0489221028983593,
0.1115892231464386,
0.032176725566387177,
-0.05882527679204941,
0.11408397555351257,
-0.15876688063144684,
-0.03101435676217079,
0.019256483763456345,
0.01984790712594986,
-0.018903527408838272,
0.006956128403544426,
-0.06741981953382492,
-0.010729395784437656,
0.0131441755220294,
0.07366885989904404,
0.024882834404706955,
0.007669712882488966,
-0.01864996738731861,
-0.026938028633594513,
-0.03951505199074745,
-0.03808249533176422,
0.01839412935078144,
0.07079754024744034,
0.19194746017456055,
0.07762755453586578,
-0.02966027520596981,
0.0075242952443659306,
0.13616211712360382,
-0.011568953283131123,
0.06867700815200806,
-0.14665727317333221,
0.12837588787078857,
-0.1462894082069397,
0.018767330795526505,
-0.053039733320474625,
-0.06276916712522507,
-0.026140335947275162,
0.2130872905254364,
0.24641399085521698,
-0.14172543585300446,
-0.0344369113445282,
-0.07636016607284546,
0.018808690831065178,
-0.0338565818965435,
0.051797956228256226,
0.006534097250550985,
0.17812320590019226,
-0.03085123747587204,
0.04026588052511215,
-0.07033983618021011,
-0.021822648122906685,
0.0024310285225510597,
0.05304570123553276,
-0.002342051127925515,
-0.044436264783144,
-0.09950900077819824,
0.03470265120267868,
-0.08912867307662964,
-0.2446901500225067,
-0.03190965577960014,
-0.12678396701812744,
-0.13127902150154114,
0.03018614649772644,
-0.1116478443145752,
0.06962817162275314,
0.0410492941737175,
0.0016771478112787008,
0.05312911421060562,
0.007008448708802462,
-0.05058242008090019,
-0.12699906527996063,
0.004553555510938168,
0.020441517233848572,
-0.09741820394992828,
0.2466343492269516,
-0.03076198324561119,
0.02748330682516098,
0.08989518880844116,
-0.002295783720910549,
-0.06492459028959274,
0.0014107741881161928,
0.010861679911613464,
0.12047328799962997,
0.017710018903017044,
0.11040321737527847,
-0.010541419498622417,
-0.19564981758594513,
0.11731131374835968,
-0.05422791838645935,
-0.01878412254154682,
-0.008628811687231064,
-0.04025043547153473,
-0.08823791891336441,
0.10101131349802017,
-0.13892345130443573,
0.09201979637145996,
0.13079363107681274,
-0.07193031907081604,
0.004823529627174139,
-0.007419866509735584,
0.04593943804502487,
0.02840237319469452,
0.023428063839673996,
0.025857919827103615,
-0.18548916280269623,
-0.047061264514923096,
0.06465096771717072,
-0.05940580368041992,
-0.2723681628704071,
-0.07588718831539154,
-0.12385575473308563,
-0.02111378125846386,
0.015272402204573154,
0.07324295490980148,
0.053178541362285614,
0.0029860094655305147,
-0.03070693463087082,
-0.16060830652713776,
-0.00997542217373848,
0.08997302502393723,
0.0005292888963595033,
-0.08454624563455582
] |
null | null |
transformers
|
# Spanish GPT-2
GPT-2 model trained from scratch on the Spanish portion of [OSCAR](https://huggingface.co/datasets/viewer/?dataset=oscar).
The model is trained with Flax and using TPUs sponsored by Google since this is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
organised by HuggingFace.
## Model description
The model used for training is [OpenAI's GPT-2](https://openai.com/blog/better-language-models/), introduced in the paper ["Language Models are Unsupervised Multitask Learners"](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.
This model is available in the 🤗 [Model Hub](https://huggingface.co/gpt2).
## Training data
Spanish portion of OSCAR or **O**pen **S**uper-large **C**rawled **A**LMAnaCH co**R**pus, a huge multilingual corpus obtained by language classification and filtering of the [Common Crawl](https://commoncrawl.org/) corpus using the [goclassy](https://github.com/pjox/goclassy) architecture.
This corpus is available in the 🤗 [Datasets](https://huggingface.co/datasets/oscar) library.
## Team members
- Manuel Romero ([mrm8488](https://huggingface.co/mrm8488))
- María Grandury ([mariagrandury](https://huggingface.co/mariagrandury))
- Pablo González de Prado ([Pablogps](https://huggingface.co/Pablogps))
- Daniel Vera ([daveni](https://huggingface.co/daveni))
- Sri Lakshmi ([srisweet](https://huggingface.co/srisweet))
- José Posada ([jdposa](https://huggingface.co/jdposa))
- Santiago Hincapie ([shpotes](https://huggingface.co/shpotes))
- Jorge ([jorgealro](https://huggingface.co/jorgealro))
|
{"language": "es", "tags": ["text-generation"], "datasets": ["oscar"], "widgets": [{"text": "\u00c9rase un vez "}, {"text": "Frase: Esta pel\u00edcula es muy agradable. Sentimiento: positivo Frase: Odiaba esta pel\u00edcula, apesta. Sentimiento: negativo Frase: Esta pel\u00edcula fue bastante mala. Sentimiento: "}]}
|
text-generation
|
flax-community/gpt-2-spanish
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"es",
"dataset:oscar",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #es #dataset-oscar #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Spanish GPT-2
GPT-2 model trained from scratch on the Spanish portion of OSCAR.
The model is trained with Flax and using TPUs sponsored by Google since this is part of the
Flax/Jax Community Week
organised by HuggingFace.
## Model description
The model used for training is OpenAI's GPT-2, introduced in the paper "Language Models are Unsupervised Multitask Learners" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.
This model is available in the Model Hub.
## Training data
Spanish portion of OSCAR or Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.
This corpus is available in the Datasets library.
## Team members
- Manuel Romero (mrm8488)
- María Grandury (mariagrandury)
- Pablo González de Prado (Pablogps)
- Daniel Vera (daveni)
- Sri Lakshmi (srisweet)
- José Posada (jdposa)
- Santiago Hincapie (shpotes)
- Jorge (jorgealro)
|
[
"# Spanish GPT-2\n\nGPT-2 model trained from scratch on the Spanish portion of OSCAR.\nThe model is trained with Flax and using TPUs sponsored by Google since this is part of the\nFlax/Jax Community Week\norganised by HuggingFace.",
"## Model description\n\nThe model used for training is OpenAI's GPT-2, introduced in the paper \"Language Models are Unsupervised Multitask Learners\" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.\n\nThis model is available in the Model Hub.",
"## Training data\n\nSpanish portion of OSCAR or Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\nThis corpus is available in the Datasets library.",
"## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n- Pablo González de Prado (Pablogps)\n- Daniel Vera (daveni)\n- Sri Lakshmi (srisweet)\n- José Posada (jdposa)\n- Santiago Hincapie (shpotes)\n- Jorge (jorgealro)"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #es #dataset-oscar #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Spanish GPT-2\n\nGPT-2 model trained from scratch on the Spanish portion of OSCAR.\nThe model is trained with Flax and using TPUs sponsored by Google since this is part of the\nFlax/Jax Community Week\norganised by HuggingFace.",
"## Model description\n\nThe model used for training is OpenAI's GPT-2, introduced in the paper \"Language Models are Unsupervised Multitask Learners\" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.\n\nThis model is available in the Model Hub.",
"## Training data\n\nSpanish portion of OSCAR or Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\nThis corpus is available in the Datasets library.",
"## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n- Pablo González de Prado (Pablogps)\n- Daniel Vera (daveni)\n- Sri Lakshmi (srisweet)\n- José Posada (jdposa)\n- Santiago Hincapie (shpotes)\n- Jorge (jorgealro)"
] |
[
71,
59,
77,
66,
74
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #es #dataset-oscar #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Spanish GPT-2\n\nGPT-2 model trained from scratch on the Spanish portion of OSCAR.\nThe model is trained with Flax and using TPUs sponsored by Google since this is part of the\nFlax/Jax Community Week\norganised by HuggingFace.## Model description\n\nThe model used for training is OpenAI's GPT-2, introduced in the paper \"Language Models are Unsupervised Multitask Learners\" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.\n\nThis model is available in the Model Hub.## Training data\n\nSpanish portion of OSCAR or Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\nThis corpus is available in the Datasets library.## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n- Pablo González de Prado (Pablogps)\n- Daniel Vera (daveni)\n- Sri Lakshmi (srisweet)\n- José Posada (jdposa)\n- Santiago Hincapie (shpotes)\n- Jorge (jorgealro)"
] |
[
-0.08581376820802689,
0.20349448919296265,
-0.0028119084890931845,
0.10605213791131973,
0.08406227827072144,
0.04694223031401634,
0.12459323555231094,
0.07259975373744965,
0.013289224356412888,
0.0066816252656280994,
-0.010013481602072716,
0.027387995272874832,
0.09275960177183151,
0.03953907638788223,
0.016000093892216682,
-0.3230883479118347,
0.031049367040395737,
-0.10468686372041702,
-0.1447124034166336,
0.01199741568416357,
0.0721343383193016,
-0.02347734197974205,
0.042876534163951874,
0.011750560253858566,
-0.01723131351172924,
-0.037224289029836655,
-0.04494553804397583,
-0.1248062252998352,
0.0918501541018486,
0.07808350026607513,
0.09120365977287292,
0.04358311742544174,
0.08885406702756882,
-0.0632382184267044,
0.018515488132834435,
0.041755396872758865,
-0.016202479600906372,
0.030759677290916443,
0.11498170346021652,
-0.02521234191954136,
0.1294296681880951,
-0.04800783470273018,
0.10703996568918228,
0.036216527223587036,
-0.1629379689693451,
-0.24830256402492523,
-0.03152797743678093,
-0.0361645482480526,
0.011924020014703274,
0.015649043023586273,
-0.004645294509828091,
0.023295925930142403,
-0.11339112371206284,
-0.0006689610308967531,
0.010521305724978447,
-0.19920094311237335,
-0.10748548805713654,
0.15256816148757935,
0.0628146082162857,
0.08658189326524734,
-0.03896034508943558,
0.014022162184119225,
-0.01039239950478077,
-0.01832038164138794,
0.03261250630021095,
-0.03454424813389778,
-0.05585051327943802,
-0.027928045019507408,
-0.08815491944551468,
0.00030310876900330186,
0.16955049335956573,
-0.039649493992328644,
-0.05785929039120674,
-0.15669526159763336,
-0.007880224846303463,
0.16108843684196472,
0.0004160947573836893,
-0.01559588871896267,
0.05736315995454788,
0.02685578167438507,
0.04696947708725929,
-0.0455460287630558,
-0.0556781142950058,
-0.03044680505990982,
-0.08988868445158005,
0.1090216115117073,
0.02349202148616314,
0.048574697226285934,
0.04310585930943489,
0.04346827417612076,
-0.07619049400091171,
-0.08436139672994614,
-0.016681885346770287,
-0.08280283212661743,
-0.11235809326171875,
-0.017008550465106964,
-0.01579204574227333,
-0.00407884968444705,
0.03387424722313881,
0.11248766630887985,
-0.033175062388181686,
-0.003787450259551406,
0.07002469152212143,
-0.01399989053606987,
0.07745593041181564,
0.06482324004173279,
-0.0272087249904871,
-0.078855000436306,
0.0025121886283159256,
-0.04711375758051872,
-0.03968556597828865,
-0.041017141193151474,
-0.06511721014976501,
-0.01946113258600235,
-0.09408403187990189,
0.010765272192656994,
0.04668974503874779,
0.04525558650493622,
-0.036472052335739136,
-0.08379275351762772,
0.13518008589744568,
-0.07363267987966537,
0.04629439115524292,
0.036992207169532776,
-0.06922747939825058,
-0.01818523369729519,
-0.02797316387295723,
0.05159718543291092,
-0.003810324240475893,
-0.0012709521688520908,
-0.005336797796189785,
-0.041499435901641846,
-0.0779593288898468,
-0.05269964784383774,
0.04912747070193291,
-0.15956291556358337,
-0.04022616893053055,
-0.1286933869123459,
-0.12765933573246002,
-0.10930077731609344,
0.057544149458408356,
-0.09855952858924866,
-0.05353827401995659,
-0.06868594139814377,
0.01685425080358982,
-0.01446295902132988,
0.00955737754702568,
0.009015495888888836,
-0.0340576097369194,
0.0180846955627203,
-0.06845107674598694,
0.07894144207239151,
-0.04769004136323929,
0.016635088250041008,
-0.03911624103784561,
-0.012986524030566216,
-0.1423560380935669,
0.17130206525325775,
-0.08684761822223663,
0.07971609383821487,
-0.10153021663427353,
-0.01061856746673584,
-0.0228628758341074,
0.06708318740129471,
-0.04142502322793007,
0.1455240249633789,
-0.12425137311220169,
-0.03320436552166939,
0.2366091012954712,
-0.07764685899019241,
0.0017414763569831848,
0.12787087261676788,
-0.059249505400657654,
0.08205432444810867,
0.12029854953289032,
0.15042869746685028,
0.12130211293697357,
-0.032402947545051575,
0.04993096739053726,
-0.03238106146454811,
0.09879796206951141,
0.05782110244035721,
0.15268222987651825,
-0.05318209156394005,
-0.0002179859729949385,
0.01106696855276823,
-0.0808219239115715,
0.002524295123293996,
-0.008666852489113808,
-0.07809083163738251,
0.03579877316951752,
-0.053017131984233856,
0.010584428906440735,
0.024478411301970482,
0.07285112887620926,
-0.029836945235729218,
-0.05803059786558151,
-0.017691809684038162,
0.0547228567302227,
-0.0035681293811649084,
-0.06821414828300476,
-0.10027900338172913,
0.04815972223877907,
0.005879320204257965,
-0.01445982325822115,
-0.08009262382984161,
-0.06190018728375435,
0.058853041380643845,
-0.0419517457485199,
0.03415711224079132,
0.025642363354563713,
0.026221035048365593,
0.0637294203042984,
-0.043383967131376266,
0.029108375310897827,
0.025265971198678017,
-0.039539072662591934,
0.041348446160554886,
-0.15919174253940582,
0.004998833406716585,
-0.0580468513071537,
0.09670407325029373,
-0.18631808459758759,
0.042988669127225876,
0.02409188449382782,
0.04898049682378769,
0.0493970662355423,
-0.040260475128889084,
-0.01711668074131012,
0.0848078653216362,
0.018243947997689247,
-0.1002831980586052,
0.07155481725931168,
0.05223054438829422,
-0.06201979145407677,
0.04289384186267853,
-0.09252582490444183,
0.03671404719352722,
0.04497847706079483,
0.001534061273559928,
-0.07079765200614929,
-0.021146070212125778,
-0.017831793054938316,
0.028933390974998474,
-0.13158439099788666,
0.03458705171942711,
0.21630176901817322,
-0.049660492688417435,
0.081627756357193,
-0.1592322587966919,
-0.04493802413344383,
-0.0003160674241371453,
-0.03651032969355583,
0.014562401920557022,
0.07740147411823273,
0.08709047734737396,
-0.09779715538024902,
0.07727380841970444,
-0.08033875375986099,
0.0002756361209321767,
0.2643145024776459,
-0.014508482068777084,
-0.005279350560158491,
-0.028873734176158905,
0.0038221231661736965,
0.025674663484096527,
0.06397193670272827,
-0.09319734573364258,
-0.002706749364733696,
0.02135714329779148,
0.05279774218797684,
0.030160577967762947,
-0.12266146391630173,
-0.04327830672264099,
0.03249349445104599,
-0.030366500839591026,
-0.025353088974952698,
0.029405297711491585,
-0.03141985088586807,
0.08526419848203659,
0.034431226551532745,
0.030502857640385628,
-0.005718111526221037,
-0.04584760591387749,
-0.05951331928372383,
0.13264839351177216,
-0.06629066914319992,
-0.1884477287530899,
-0.06939247995615005,
0.0577835887670517,
-0.005672776605933905,
0.09226284176111221,
0.030038651078939438,
-0.0813601091504097,
-0.020016472786664963,
-0.019935427233576775,
0.05119393765926361,
0.04951176047325134,
-0.0710020363330841,
-0.038623660802841187,
0.06114552542567253,
-0.027787134051322937,
-0.08511176705360413,
-0.00478448998183012,
0.021263543516397476,
-0.2214142084121704,
0.026802683249115944,
-0.07544686645269394,
0.045618169009685516,
0.06109807267785072,
0.07176244258880615,
-0.04441458359360695,
-0.033223628997802734,
0.18901148438453674,
-0.10859332233667374,
0.06695079058408737,
0.09564860165119171,
0.004661217797547579,
0.011530531570315361,
0.1382414549589157,
0.05334780737757683,
-0.08869608491659164,
-0.05283636972308159,
-0.009474975988268852,
-0.07815229892730713,
-0.2657947540283203,
-0.11522005498409271,
-0.06385444104671478,
-0.004005900118499994,
0.06166751682758331,
0.030530041083693504,
0.046779800206422806,
0.11133866757154465,
-0.05417633056640625,
0.08491099625825882,
0.08507024496793747,
0.021153053268790245,
0.0627107247710228,
0.03536134213209152,
0.07625210285186768,
-0.0714949518442154,
-0.04770369455218315,
0.13006263971328735,
0.07416889071464539,
0.2386978566646576,
0.022522874176502228,
0.11066234111785889,
0.04747564345598221,
0.06735778599977493,
0.028027577325701714,
-0.01922052912414074,
0.009405803866684437,
0.019977163523435593,
-0.07898899167776108,
-0.03055851347744465,
-0.054452572017908096,
0.05639803782105446,
0.041763786226511,
-0.08494883030653,
-0.0433751679956913,
-0.04257358983159065,
0.0561649426817894,
0.1772211194038391,
-0.04777010157704353,
-0.16392900049686432,
-0.04805680364370346,
0.09135924279689789,
-0.013181649148464203,
-0.05868911370635033,
0.028682595118880272,
0.1341603547334671,
-0.15319670736789703,
0.10587248206138611,
0.019820263609290123,
0.07808882743120193,
-0.10501202195882797,
-0.0334634967148304,
0.023674987256526947,
0.0042823185212910175,
-0.005882661323994398,
0.06292495131492615,
-0.15194237232208252,
0.16090945899486542,
-0.008728299289941788,
0.08166923373937607,
-0.08132001012563705,
0.023958995938301086,
0.029792116954922676,
0.06273050606250763,
0.1278265118598938,
0.027555162087082863,
-0.09628978371620178,
-0.016037030145525932,
-0.07377590984106064,
0.05282776430249214,
-0.015275749377906322,
-0.13567106425762177,
0.03415049612522125,
-0.0048623825423419476,
0.046812403947114944,
-0.025768134742975235,
0.03196209669113159,
-0.07330086827278137,
-0.18803676962852478,
-0.004981011152267456,
-0.10125482082366943,
0.04266669228672981,
-0.07020974904298782,
-0.059688396751880646,
-0.14190314710140228,
0.05914592370390892,
-0.04654572531580925,
-0.09403172880411148,
-0.13135674595832825,
-0.013693173415958881,
0.05220610275864601,
-0.03796633332967758,
0.030455147847533226,
0.02136884443461895,
0.05772582069039345,
0.007184205576777458,
-0.009464990347623825,
0.0913596823811531,
-0.08518435060977936,
-0.15159964561462402,
-0.069967120885849,
0.026486394926905632,
0.1062653511762619,
0.057099923491477966,
0.04471057280898094,
0.049439843744039536,
0.05575511232018471,
-0.08864123374223709,
0.0112423961982131,
0.16869224607944489,
0.011853095144033432,
0.06212722510099411,
-0.08233927935361862,
-0.12367670983076096,
-0.0011305223451927304,
-0.06932862848043442,
0.08846313506364822,
0.1925262212753296,
-0.05122902989387512,
0.18499311804771423,
0.07039127498865128,
-0.1387869119644165,
-0.15617065131664276,
0.021325990557670593,
0.0813407450914383,
0.015095951966941357,
-0.02554934471845627,
-0.30751001834869385,
0.05035940930247307,
0.13149957358837128,
-0.05960795655846596,
-0.01804090477526188,
-0.2588927745819092,
-0.09965500235557556,
0.020164193585515022,
0.013203208334743977,
0.11396792531013489,
-0.15587303042411804,
-0.06093811243772507,
-0.05492193251848221,
-0.026267584413290024,
0.1662207692861557,
-0.058072369545698166,
0.07966978847980499,
0.044258344918489456,
-0.08457118272781372,
0.026792919263243675,
-0.004242481663823128,
0.18229104578495026,
0.023616524413228035,
0.02508770115673542,
-0.08094153553247452,
0.013376755639910698,
0.07403207570314407,
0.005350886844098568,
-0.026012679561972618,
0.05476899817585945,
-0.019004501402378082,
-0.1843721568584442,
-0.0846758708357811,
-0.03302508965134621,
0.04367263615131378,
-0.061319928616285324,
-0.05447809025645256,
0.009263809770345688,
0.12834465503692627,
0.0989648625254631,
0.004710824694484472,
-0.014811361208558083,
-0.07094460725784302,
0.14118193089962006,
-0.03306742385029793,
0.19564548134803772,
0.08035185933113098,
0.02172521874308586,
-0.03752654418349266,
0.025301244109869003,
0.0944376140832901,
-0.13105222582817078,
0.023670721799135208,
0.060120146721601486,
-0.01831284910440445,
0.09384405612945557,
0.014506512321531773,
-0.06758061796426773,
0.03904205933213234,
0.11221195012331009,
-0.029126854613423347,
-0.11780526489019394,
-0.003825004445388913,
-0.009396499022841454,
0.025883672758936882,
-0.0433025024831295,
0.07934949547052383,
0.02033628523349762,
-0.06430833041667938,
-0.011140462011098862,
0.035438284277915955,
0.024490484967827797,
0.08309975266456604,
0.014969877898693085,
-0.00784971658140421,
-0.08209255337715149,
0.08345545083284378,
0.1167791560292244,
-0.09216082096099854,
0.00855324137955904,
0.1023736521601677,
-0.07205777615308762,
-0.0311566349118948,
0.02754448726773262,
0.12126237154006958,
-0.06333798170089722,
-0.057128142565488815,
-0.02535221353173256,
-0.1234959065914154,
0.04245632141828537,
0.01186565775424242,
0.018859267234802246,
-0.003969037439674139,
0.01888575777411461,
0.053999077528715134,
0.02684050053358078,
0.036904796957969666,
0.03417638689279556,
-0.03167794644832611,
-0.04459042102098465,
0.07424654066562653,
0.006957765202969313,
-0.0839967355132103,
-0.026586640626192093,
-0.02012261375784874,
-0.206068217754364,
-0.021286940202116966,
-0.10101762413978577,
0.06771905720233917,
-0.05312614142894745,
0.062219418585300446,
-0.03362555429339409,
-0.04871365800499916,
-0.0031219033990055323,
-0.02801486663520336,
-0.07435862720012665,
-0.012578736990690231,
-0.04429472237825394,
0.09779670089483261,
-0.018261773511767387,
-0.014913467690348625,
0.058290932327508926,
-0.05649174377322197,
0.07385855168104172,
-0.05366313084959984,
-0.010680054314434528,
-0.00101659435313195,
-0.16120821237564087,
0.05083601176738739,
-0.025666894391179085,
0.024784715846180916,
-0.005912973545491695,
-0.09800872951745987,
0.04197404906153679,
0.01158995646983385,
0.01646103337407112,
0.022676460444927216,
0.009646055288612843,
-0.05264457315206528,
0.08969759196043015,
0.06568018347024918,
-0.11284815520048141,
-0.010631063021719456,
0.015083873644471169,
0.08144308626651764,
0.005150857847183943,
0.06580381840467453,
-0.059181563556194305,
0.06911759078502655,
-0.1552352011203766,
-0.030294671654701233,
0.025558531284332275,
-0.004289531614631414,
-0.04606057330965996,
-0.026672301813960075,
0.08264906704425812,
0.05119964852929115,
0.23261381685733795,
0.12091314792633057,
-0.03404032066464424,
-0.01401401124894619,
0.0284965168684721,
0.06587091833353043,
0.05197756737470627,
0.16896554827690125,
0.10311392694711685,
0.00631956709548831,
0.03265092149376869,
0.02983243577182293,
-0.017747966572642326,
0.03868868201971054,
0.08265041559934616,
0.13026684522628784,
0.09882751107215881,
-0.006064887158572674,
0.08879191428422928,
-0.05860405042767525,
-0.002957339398562908,
0.09236951172351837,
0.026808490976691246,
0.023480921983718872,
-0.08539506793022156,
-0.05595891550183296,
0.03273043408989906,
-0.2203119397163391,
0.12279742956161499,
-0.004220022354274988,
-0.031168827787041664,
-0.15701082348823547,
-0.2290637195110321,
-0.05505435913801193,
-0.09522191435098648,
0.011058492586016655,
-0.11124995350837708,
-0.005867119412869215,
0.048920098692178726,
0.02851010486483574,
0.009379609487950802,
0.01995563693344593,
-0.14487183094024658,
-0.05591920390725136,
0.08606904000043869,
0.0017946150619536638,
0.09608621895313263,
-0.060886770486831665,
-0.028237618505954742,
-0.03499370440840721,
0.12592138350009918,
0.009899990633130074,
0.021496692672371864,
-0.0148627245798707,
-0.0011206858325749636,
-0.02906862273812294,
-0.022244544699788094,
-0.0021861058194190264,
-0.007581891492009163,
0.0019043812062591314,
0.1308014690876007,
0.036193329840898514,
-0.026348156854510307,
0.06068038567900658,
0.25662192702293396,
0.021293140947818756,
-0.08005592226982117,
-0.21297478675842285,
-0.05627792701125145,
-0.0578218549489975,
0.007570677436888218,
0.06405455619096756,
-0.05622975155711174,
0.03288064897060394,
0.15393869578838348,
0.27578896284103394,
-0.01777981035411358,
-0.003985388204455376,
0.02544020302593708,
-0.011510205455124378,
0.08582200109958649,
0.14939111471176147,
-0.011460654437541962,
0.2694258987903595,
-0.06377308070659637,
-0.006723115686327219,
0.0370841845870018,
0.017022080719470978,
-0.05600334331393242,
0.13201116025447845,
-0.007057103794068098,
-0.004745468497276306,
-0.04311118274927139,
0.11826274544000626,
-0.1301669180393219,
-0.12035748362541199,
0.07287628948688507,
-0.07435256242752075,
-0.15567147731781006,
-0.05905849486589432,
-0.015094482339918613,
0.05519106611609459,
0.06919954717159271,
0.0600559338927269,
-0.07085563242435455,
0.0830993726849556,
0.017106657847762108,
-0.08745324611663818,
-0.06653290241956711,
0.08737030625343323,
-0.0254578348249197,
0.2206806242465973,
-0.017539912834763527,
0.06290564686059952,
0.07293595373630524,
0.0015852432698011398,
-0.08818794786930084,
-0.0326823852956295,
0.020508846268057823,
-0.00662947166711092,
0.030155809596180916,
0.0026394270826131105,
-0.030325746163725853,
0.05822593718767166,
0.12079109251499176,
-0.020442964509129524,
0.06213906407356262,
0.07141581922769547,
-0.026123570278286934,
-0.10603293776512146,
0.11302373558282852,
-0.16876089572906494,
0.10990569740533829,
0.1915210336446762,
-0.0332079753279686,
-0.013907489366829395,
-0.03492112457752228,
0.06333200633525848,
0.026799539104104042,
0.04606498405337334,
-0.06605419516563416,
-0.19985507428646088,
-0.04403885826468468,
-0.04738820716738701,
0.06399159878492355,
-0.12844717502593994,
-0.056783244013786316,
-0.0742567777633667,
-0.05996149033308029,
-0.03561786189675331,
0.05420669913291931,
0.07522274553775787,
0.038633693009614944,
-0.03878968581557274,
-0.11525064706802368,
-0.022741274908185005,
0.06170530989766121,
-0.054840441793203354,
-0.0376613475382328
] |
null | null |
transformers
|
# GPT2-Tamil
This repository is created as part of the Flax/Jax community week by Huggingface. The aim of this project is to pretrain a language model using GPT-2 specifically for Tamil language.
## Setup:
To setup the project, run the following command,
```python
pip install -r requirements.txt
```
## Model:
Pretrained model on Tamil language using a causal language modeling (CLM) objective.
## Dataset Used:
The GTP-2 model is trained on [oscar dataset - ta](https://huggingface.co/datasets/oscar)
## Intended uses & limitations:
You can use the raw model for next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=gpt) to look for fine-tuned versions on a task that interests you.
## How to pretrain the model:
To perform training, do the following steps,
- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.)
```python
>>> export MODEL_DIR=<model_dir>
```
- Create the config.json by running the following command,
```python
>>> python src/create_config.py
```
- Create the tokenizer by running the following command,
```python
>>> python src/train_tokenizer.py
```
- Once the config and tokenizer is created, run the following script to start training the flax model
```python
>>> python scripts/train_gpt2-oscar-tamil.sh
```
## How to use:
To perform language generation using the model, pipeline can be used directly.
- First convert the flax model to pytorch using the following command,
```python
python src/convert_flax_to_pytorch.py
```
- Use the following snippet to perform language generation,
```python
>>> from transformers import AutoTokenizer, AutoModelWithLMHead, pipeline
>>> model_name = 'abinayam/gpt-2-tamil'
>>> model = AutoModelWithLMHead.from_pretrained(model_name)
>>> tokenizer = AutoTokenizer.from_pretrained(model_name)
>>> set_seed(42)
>>> input_text = "ஒரு ஊரிலே ஒரு காக்கைக்கு"
>>> max_len = 300
>>> no_seq = 5
>>> generator = pipeline('text-generation', model=model, tokenizer=tokenizer)
>>> sequence = generator(input_text, max_length=max_len, num_return_sequences=no_seq)
```
|
{"language": "ta", "datasets": ["oscar", "IndicNLP"], "widget": [{"text": "\u0b92\u0bb0\u0bc1 \u0b8a\u0bb0\u0bbf\u0bb2\u0bc7 \u0b92\u0bb0\u0bc1 \u0b95\u0bbe\u0b95\u0bcd\u0b95\u0bc8\u0b95\u0bcd\u0b95\u0bc1"}]}
|
text-generation
|
flax-community/gpt-2-tamil
|
[
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"ta",
"dataset:oscar",
"dataset:IndicNLP",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ta"
] |
TAGS
#transformers #pytorch #tensorboard #gpt2 #text-generation #ta #dataset-oscar #dataset-IndicNLP #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# GPT2-Tamil
This repository is created as part of the Flax/Jax community week by Huggingface. The aim of this project is to pretrain a language model using GPT-2 specifically for Tamil language.
## Setup:
To setup the project, run the following command,
## Model:
Pretrained model on Tamil language using a causal language modeling (CLM) objective.
## Dataset Used:
The GTP-2 model is trained on oscar dataset - ta
## Intended uses & limitations:
You can use the raw model for next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.
## How to pretrain the model:
To perform training, do the following steps,
- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.)
- Create the URL by running the following command,
- Create the tokenizer by running the following command,
- Once the config and tokenizer is created, run the following script to start training the flax model
## How to use:
To perform language generation using the model, pipeline can be used directly.
- First convert the flax model to pytorch using the following command,
- Use the following snippet to perform language generation,
|
[
"# GPT2-Tamil\n\nThis repository is created as part of the Flax/Jax community week by Huggingface. The aim of this project is to pretrain a language model using GPT-2 specifically for Tamil language.",
"## Setup:\nTo setup the project, run the following command,",
"## Model:\nPretrained model on Tamil language using a causal language modeling (CLM) objective.",
"## Dataset Used:\nThe GTP-2 model is trained on oscar dataset - ta",
"## Intended uses & limitations:\nYou can use the raw model for next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.",
"## How to pretrain the model:\nTo perform training, do the following steps,\n\n- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.)\n\n- Create the URL by running the following command,\n\n- Create the tokenizer by running the following command,\n\n- Once the config and tokenizer is created, run the following script to start training the flax model",
"## How to use:\nTo perform language generation using the model, pipeline can be used directly.\n\n- First convert the flax model to pytorch using the following command,\n\n- Use the following snippet to perform language generation,"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #gpt2 #text-generation #ta #dataset-oscar #dataset-IndicNLP #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# GPT2-Tamil\n\nThis repository is created as part of the Flax/Jax community week by Huggingface. The aim of this project is to pretrain a language model using GPT-2 specifically for Tamil language.",
"## Setup:\nTo setup the project, run the following command,",
"## Model:\nPretrained model on Tamil language using a causal language modeling (CLM) objective.",
"## Dataset Used:\nThe GTP-2 model is trained on oscar dataset - ta",
"## Intended uses & limitations:\nYou can use the raw model for next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.",
"## How to pretrain the model:\nTo perform training, do the following steps,\n\n- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.)\n\n- Create the URL by running the following command,\n\n- Create the tokenizer by running the following command,\n\n- Once the config and tokenizer is created, run the following script to start training the flax model",
"## How to use:\nTo perform language generation using the model, pipeline can be used directly.\n\n- First convert the flax model to pytorch using the following command,\n\n- Use the following snippet to perform language generation,"
] |
[
71,
49,
14,
23,
21,
61,
90,
48
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #gpt2 #text-generation #ta #dataset-oscar #dataset-IndicNLP #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# GPT2-Tamil\n\nThis repository is created as part of the Flax/Jax community week by Huggingface. The aim of this project is to pretrain a language model using GPT-2 specifically for Tamil language.## Setup:\nTo setup the project, run the following command,## Model:\nPretrained model on Tamil language using a causal language modeling (CLM) objective.## Dataset Used:\nThe GTP-2 model is trained on oscar dataset - ta## Intended uses & limitations:\nYou can use the raw model for next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.## How to pretrain the model:\nTo perform training, do the following steps,\n\n- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.)\n\n- Create the URL by running the following command,\n\n- Create the tokenizer by running the following command,\n\n- Once the config and tokenizer is created, run the following script to start training the flax model## How to use:\nTo perform language generation using the model, pipeline can be used directly.\n\n- First convert the flax model to pytorch using the following command,\n\n- Use the following snippet to perform language generation,"
] |
[
-0.08765310049057007,
0.07973501831293106,
-0.004143529571592808,
0.05111448094248772,
0.10331767797470093,
-0.009939090348780155,
0.0639137476682663,
0.1130661889910698,
0.06542520970106125,
0.04617184400558472,
-0.009757885709404945,
0.09630995988845825,
0.11595374345779419,
0.13158221542835236,
0.06677475571632385,
-0.2725779712200165,
-0.017010802403092384,
-0.07602621614933014,
-0.0038878056220710278,
0.08875314146280289,
0.06367167085409164,
-0.038060225546360016,
0.07677025347948074,
-0.012518945150077343,
-0.046119339764118195,
0.0035133131314069033,
-0.028694525361061096,
-0.029408590868115425,
0.012008982710540295,
0.043906185775995255,
0.034649066627025604,
-0.027360310778021812,
0.11488272994756699,
-0.08032875508069992,
0.030499693006277084,
0.051734667271375656,
0.02827615477144718,
0.017346905544400215,
0.07321222871541977,
0.016892386600375175,
0.20168165862560272,
0.02618539147078991,
0.026245027780532837,
0.08335798233747482,
-0.07197275757789612,
0.03405122086405754,
-0.0628843903541565,
0.06892526894807816,
0.11288125067949295,
0.12418428808450699,
-0.03103150799870491,
0.13959790766239166,
0.025192521512508392,
0.07411418855190277,
0.031115638092160225,
-0.15142078697681427,
-0.0657486617565155,
0.20773853361606598,
0.10824737697839737,
0.08236760646104813,
-0.06263713538646698,
-0.03228500485420227,
-0.0652308464050293,
0.0637541264295578,
0.022925965487957,
-0.04673029109835625,
0.01823299750685692,
-0.04818572849035263,
-0.07384078949689865,
-0.0011186719639226794,
0.037753209471702576,
-0.06999709457159042,
-0.05917920172214508,
-0.16691839694976807,
-0.09662746638059616,
-0.015622399747371674,
-0.026194363832473755,
0.03692621365189552,
0.01705646887421608,
0.08203435689210892,
0.1716890186071396,
-0.1595478653907776,
-0.10964803397655487,
-0.0419270284473896,
0.08924534171819687,
-0.021655144169926643,
0.02908685989677906,
0.021212350577116013,
-0.1456533521413803,
0.1408030390739441,
0.04017288237810135,
-0.10409069061279297,
-0.05241340398788452,
-0.06666816025972366,
-0.08078603446483612,
-0.004613424651324749,
0.08845142275094986,
-0.0772758424282074,
-0.043830569833517075,
0.12884047627449036,
0.007360076531767845,
0.055426910519599915,
-0.030770620331168175,
0.026260381564497948,
0.050642795860767365,
0.17099224030971527,
-0.09294678270816803,
0.027271568775177002,
0.11808914691209793,
0.018917344510555267,
0.03815740719437599,
-0.030563345178961754,
-0.04284978285431862,
0.026384087279438972,
-0.09757830202579498,
0.11452154815196991,
0.08010923117399216,
0.024403398856520653,
0.004275782965123653,
-0.10666561126708984,
0.20064039528369904,
-0.14071787893772125,
0.023977700620889664,
0.010184280574321747,
-0.026420677080750465,
0.04856845363974571,
0.061408936977386475,
0.006211225874722004,
-0.08182104676961899,
-0.05242350324988365,
-0.05731545016169548,
0.04286700487136841,
-0.11784853786230087,
-0.07863843441009521,
-0.008412024937570095,
-0.11726980656385422,
-0.052976254373788834,
-0.06241361424326897,
-0.23174110054969788,
-0.053469542413949966,
0.0541769303381443,
-0.09032141417264938,
-0.02244049310684204,
-0.0034295320510864258,
-0.041672058403491974,
-0.059468068182468414,
0.013691617175936699,
-0.06719864159822464,
-0.02188638225197792,
0.007882557809352875,
-0.05408201739192009,
0.05903053283691406,
0.06750437617301941,
-0.0004586566938087344,
-0.09670484066009521,
0.04306181147694588,
-0.2212809920310974,
0.10743804275989532,
-0.04022166505455971,
0.11214029788970947,
-0.06379367411136627,
0.030764497816562653,
0.04563196375966072,
0.0005614357069134712,
0.006424011662602425,
0.14849725365638733,
-0.2166641801595688,
0.03289229795336723,
0.20419862866401672,
-0.09648824483156204,
-0.06968393921852112,
0.011716403067111969,
0.035902708768844604,
0.22665172815322876,
0.05791686475276947,
0.11399389803409576,
0.06498657166957855,
-0.08872063457965851,
-0.03371163457632065,
-0.039685703814029694,
-0.12666068971157074,
-0.025333533063530922,
0.042217426002025604,
-0.004202407784759998,
0.10032375901937485,
0.007230745628476143,
-0.08278045058250427,
0.04972410947084427,
0.023848501965403557,
-0.024939870461821556,
0.025816109031438828,
-0.03912261500954628,
-0.13795171678066254,
-0.007471967022866011,
0.013260276056826115,
0.06694324314594269,
-0.06121663376688957,
-0.010689201764762402,
0.09188879281282425,
-0.08934809267520905,
0.03191768378019333,
-0.12877844274044037,
0.050640303641557693,
-0.016511790454387665,
0.0287657268345356,
-0.1548362523317337,
-0.05441378802061081,
0.05051897093653679,
-0.005666383076459169,
0.10857371985912323,
-0.09860020875930786,
0.05379162356257439,
0.09141473472118378,
0.052888356149196625,
-0.03269293159246445,
0.049375224858522415,
-0.040363963693380356,
-0.05838490277528763,
-0.09412137418985367,
-0.05768749862909317,
-0.04256141558289528,
0.08246739953756332,
-0.1921803057193756,
0.07187636196613312,
-0.03924904391169548,
0.10058972239494324,
0.029821248725056648,
-0.06399335712194443,
0.09551259875297546,
-0.06677057594060898,
0.005721021443605423,
-0.10018738359212875,
0.011638103984296322,
0.038046110421419144,
-0.003319726325571537,
0.09349442273378372,
-0.10764316469430923,
-0.15219572186470032,
0.03910268843173981,
0.0438811220228672,
-0.012213108129799366,
0.06551527976989746,
-0.03943900763988495,
-0.019516900181770325,
-0.01998479664325714,
0.008404368534684181,
0.08121073246002197,
0.0896507203578949,
0.13193868100643158,
-0.05179407075047493,
-0.02927377074956894,
-0.014979206025600433,
-0.045310474932193756,
0.037175241857767105,
0.03739039972424507,
0.16457536816596985,
-0.0892530158162117,
0.04896363615989685,
-0.020867131650447845,
0.0871218591928482,
0.20071923732757568,
0.02620841935276985,
-0.06288527697324753,
-0.015637099742889404,
-0.004937977995723486,
0.005349785089492798,
0.04646649211645126,
0.030727563425898552,
0.04312746226787567,
0.0073588392697274685,
-0.017594117671251297,
0.04139305651187897,
-0.07366028428077698,
0.0055899168364703655,
0.0065386597998440266,
-0.0017938509117811918,
0.003986717201769352,
0.07275363057851791,
-0.02685227058827877,
0.020688576623797417,
-0.028673211112618446,
0.10520675778388977,
-0.05333235114812851,
-0.019370175898075104,
-0.13196229934692383,
0.12184815108776093,
-0.11523357778787613,
-0.21404188871383667,
-0.16109104454517365,
-0.019836366176605225,
-0.014788292348384857,
-0.0061699338257312775,
0.021095754578709602,
-0.058978866785764694,
-0.0860360711812973,
-0.12682029604911804,
0.07518172264099121,
0.06380685418844223,
-0.08585051447153091,
-0.10315817594528198,
0.03264796361327171,
-0.03871818631887436,
-0.1346500813961029,
-0.010440242476761341,
0.040221430361270905,
-0.14463146030902863,
0.06921076029539108,
0.021266596391797066,
-0.008397637866437435,
0.10617236793041229,
0.02202395722270012,
0.014759776182472706,
0.026613885536789894,
0.21026350557804108,
-0.0692010223865509,
0.19406726956367493,
0.20292413234710693,
-0.017885442823171616,
0.03744096681475639,
0.13650234043598175,
0.03019779734313488,
-0.06368667632341385,
0.022590497508645058,
0.004574528429657221,
-0.03034602291882038,
-0.21784010529518127,
-0.04662013053894043,
-0.03608348220586777,
0.07495788484811783,
0.043008387088775635,
0.08255618810653687,
0.013817062601447105,
0.031904436647892,
-0.04929690435528755,
-0.008664817549288273,
0.035447072237730026,
0.09819097071886063,
-0.040026646107435226,
-0.0264591071754694,
0.0392611101269722,
-0.05913001671433449,
0.09731753915548325,
0.06820076704025269,
0.028156915679574013,
0.1983204185962677,
0.01958135887980461,
0.14513148367404938,
0.058870092034339905,
0.047280579805374146,
0.06509490311145782,
0.07064639776945114,
-0.03959456831216812,
0.022248009219765663,
0.030835960060358047,
-0.08175718039274216,
-0.020036332309246063,
0.07154904305934906,
-0.03461660444736481,
-0.07562471926212311,
0.050803862512111664,
0.0029417455662041903,
0.014415142126381397,
0.13324709236621857,
-0.044061191380023956,
-0.11867797374725342,
-0.0616232194006443,
0.054426711052656174,
-0.035204533487558365,
-0.1056298166513443,
-0.024536650627851486,
0.056609004735946655,
-0.12124932557344437,
-0.031117113307118416,
-0.046537093818187714,
0.12517091631889343,
-0.07846534252166748,
-0.01377104315906763,
0.03694486618041992,
0.06843739748001099,
0.03393857926130295,
0.033435236662626266,
0.002445058198645711,
0.07722825556993484,
0.028169037774205208,
0.0861191600561142,
-0.03265482559800148,
0.04047217220067978,
0.013415545225143433,
0.05540627986192703,
0.09704745560884476,
0.011490101926028728,
0.023721246048808098,
-0.15656475722789764,
-0.13430792093276978,
0.002700175391510129,
-0.008954004384577274,
-0.10178691893815994,
0.11513018608093262,
0.005911920685321093,
-0.02240670844912529,
-0.0807233452796936,
-0.13986824452877045,
-0.1628437340259552,
-0.10479048639535904,
0.061076629906892776,
-0.04319367930293083,
0.0023102809209376574,
-0.02981620840728283,
-0.021979348734021187,
0.04260401427745819,
0.167690247297287,
-0.12422405183315277,
-0.09339725226163864,
-0.07680583745241165,
-0.00959866214543581,
0.08871908485889435,
-0.0772814080119133,
0.04411178454756737,
-0.03941059485077858,
0.07115373760461807,
-0.012510634958744049,
-0.07919806241989136,
0.07379274070262909,
-0.07276596128940582,
-0.05717930942773819,
0.013144148513674736,
0.09057590365409851,
0.11961855739355087,
-0.013869157060980797,
0.008284836076200008,
0.02598918229341507,
0.007006226107478142,
-0.10560265183448792,
-0.07742764055728912,
0.2421223670244217,
-0.005019946023821831,
-0.0012774444185197353,
-0.0855267122387886,
-0.020401563495397568,
-0.05338781327009201,
-0.024405280128121376,
0.0775957852602005,
0.17461811006069183,
-0.01730378530919552,
0.13696065545082092,
0.08235771954059601,
-0.10278390347957611,
-0.1943415105342865,
-0.006363380700349808,
0.001141347805969417,
0.08156995475292206,
0.09644729644060135,
-0.23006945848464966,
0.11534254997968674,
-0.018630193546414375,
-0.025854548439383507,
0.09171786159276962,
-0.21592749655246735,
-0.11023242026567459,
0.02171340212225914,
0.09198886156082153,
-0.10520993918180466,
-0.07780470699071884,
-0.02449544332921505,
0.034467779099941254,
-0.13371466100215912,
0.07178972661495209,
-0.04982054606080055,
0.05879107117652893,
-0.017210520803928375,
-0.003556276438757777,
0.02531224675476551,
-0.05277348309755325,
0.07101094722747803,
-0.028854863718152046,
0.0033960253931581974,
-0.059863459318876266,
0.05365867540240288,
0.14590144157409668,
-0.02260155975818634,
0.18376056849956512,
0.008556613698601723,
-0.008155571296811104,
-0.1503789722919464,
-0.044187963008880615,
-0.022822272032499313,
-0.002353082410991192,
-0.01604776829481125,
-0.048948049545288086,
-0.01592399924993515,
0.10168354213237762,
0.0018807578599080443,
0.025688733905553818,
-0.052830036729574203,
-0.09314236789941788,
-0.08631844073534012,
0.11728548258543015,
0.09732431173324585,
-0.0561382956802845,
-0.03948234021663666,
-0.021172424778342247,
0.028947198763489723,
0.03457833454012871,
-0.11079586297273636,
-0.013136336579918861,
0.05380282551050186,
-0.009045368991792202,
0.057584747672080994,
-0.007458267733454704,
-0.1193079948425293,
0.004117010626941919,
0.062448665499687195,
-0.018069932237267494,
-0.11678998172283173,
-0.03805205225944519,
0.04156675562262535,
-0.048156313598155975,
-0.05712684988975525,
0.1245126947760582,
-0.05514278635382652,
-0.04510060325264931,
0.018474848940968513,
0.04525786265730858,
-0.029429282993078232,
0.010374880395829678,
-0.033775631338357925,
-0.003451526863500476,
-0.05032944679260254,
0.04650165140628815,
0.0645221695303917,
-0.039105549454689026,
-0.007212015800178051,
0.2762940526008606,
-0.12109122425317764,
-0.10373013466596603,
-0.07951390743255615,
0.044967297464609146,
-0.018050679937005043,
0.0619623064994812,
0.04171239957213402,
0.02029765211045742,
0.005990279838442802,
0.004528080578893423,
-0.0060142893344163895,
-0.013731005601584911,
-0.13310004770755768,
-0.0028562908992171288,
-0.06720077991485596,
0.06429815292358398,
0.07791078835725784,
-0.012585900723934174,
-0.10200998187065125,
-0.019483568146824837,
0.08256214112043381,
0.025840019807219505,
-0.00006910072988830507,
-0.09727487713098526,
-0.052757520228624344,
0.004935103934258223,
-0.045787449926137924,
-0.009890063665807247,
-0.11068258434534073,
-0.04178566113114357,
-0.014041099697351456,
0.008487862534821033,
0.017944028601050377,
0.016887212172150612,
-0.02162250690162182,
-0.07258336991071701,
-0.10214659571647644,
0.07027596235275269,
-0.10031840950250626,
-0.0035435473546385765,
0.033195286989212036,
-0.07568352669477463,
0.08468469977378845,
0.07287558168172836,
-0.05836079642176628,
0.04320187866687775,
0.03064851462841034,
-0.015327835455536842,
-0.0321352519094944,
0.015837552025914192,
0.0547281950712204,
-0.07682769000530243,
0.006999557837843895,
-0.02779209055006504,
-0.023091303184628487,
-0.023095794022083282,
0.09299132227897644,
-0.0824119970202446,
0.042509328573942184,
0.017696557566523552,
-0.004899656865745783,
-0.05540139228105545,
0.061575572937726974,
0.009892397560179234,
0.062298476696014404,
0.07460251450538635,
-0.023371659219264984,
0.08601175248622894,
-0.12804053723812103,
-0.024064766243100166,
0.003523177234455943,
0.029608190059661865,
0.012039534747600555,
-0.09522908926010132,
0.03046543151140213,
-0.022458795458078384,
0.0031711494084447622,
0.06769255548715591,
0.02289557456970215,
0.027516204863786697,
0.028533799573779106,
0.004535578191280365,
0.022788023576140404,
0.06874729692935944,
0.03619074821472168,
0.05485707148909569,
0.039765894412994385,
0.00863058865070343,
-0.03582260385155678,
-0.07720077782869339,
0.07458125799894333,
0.04174230992794037,
0.05660328269004822,
0.021451428532600403,
0.03893415257334709,
-0.019237549975514412,
-0.20137062668800354,
-0.06833428144454956,
-0.07030774652957916,
0.02187860943377018,
-0.11637836694717407,
0.08476795256137848,
0.17469099164009094,
-0.0740051344037056,
0.05789609253406525,
0.01342127937823534,
-0.0866328775882721,
-0.08847235143184662,
-0.30806660652160645,
-0.012346352450549603,
-0.00931899156421423,
-0.013988617807626724,
-0.0529644675552845,
0.06289557367563248,
-0.023916341364383698,
0.060633569955825806,
-0.017606889829039574,
0.16038700938224792,
-0.012100807391107082,
-0.07617177814245224,
-0.024175507947802544,
0.052239399403333664,
0.04730261489748955,
0.0015920978039503098,
0.09050780534744263,
-0.047651637345552444,
0.04430469870567322,
0.04112248495221138,
0.08332120627164841,
0.006359877996146679,
0.04239910840988159,
-0.07663305103778839,
-0.07589365541934967,
-0.021594056859612465,
0.046024855226278305,
-0.08881045132875443,
0.18444976210594177,
0.05267764627933502,
-0.049871399998664856,
-0.006832963787019253,
0.1931026577949524,
-0.027103478088974953,
-0.12606696784496307,
-0.1027316004037857,
0.18018829822540283,
0.020815987139940262,
-0.034056734293699265,
-0.020903291180729866,
-0.08932187408208847,
-0.054996274411678314,
0.23823998868465424,
0.19697166979312897,
-0.07909152656793594,
0.0013785568298771977,
-0.005025795660912991,
0.005091013386845589,
-0.11435899883508682,
0.1477862000465393,
0.053842008113861084,
0.23983658850193024,
-0.07375600188970566,
0.08029833436012268,
-0.07352178543806076,
-0.11158429831266403,
-0.14852114021778107,
-0.03322542831301689,
-0.03127041459083557,
-0.0033996053971350193,
-0.014092529192566872,
0.11108075082302094,
-0.09057334065437317,
0.0013873669086024165,
-0.05001789331436157,
0.0012617401080206037,
-0.06690371781587601,
-0.024754410609602928,
-0.00896836444735527,
-0.021319663152098656,
0.03276953101158142,
0.0053852819837629795,
-0.01137364562600851,
0.25067028403282166,
0.03313642740249634,
-0.037042200565338135,
-0.03820100054144859,
0.10496579110622406,
-0.1297653764486313,
0.16994695365428925,
-0.025327645242214203,
0.04497484862804413,
0.048540256917476654,
0.02440996654331684,
-0.17052502930164337,
0.021521607413887978,
-0.001909191021695733,
-0.054113492369651794,
-0.050285764038562775,
0.16990065574645996,
-0.02005746029317379,
0.01249334029853344,
-0.009096963331103325,
-0.11268570274114609,
0.007985247299075127,
-0.0692426860332489,
0.05244547873735428,
-0.06801220029592514,
0.06788362562656403,
-0.07340109348297119,
0.14661669731140137,
0.07869730144739151,
-0.026008345186710358,
0.004299812018871307,
-0.060865528881549835,
0.015094306319952011,
0.014246049337089062,
0.037039242684841156,
-0.03271612897515297,
-0.14838209748268127,
-0.03411760926246643,
-0.06242649629712105,
0.08122749626636505,
-0.13618223369121552,
-0.06018882989883423,
-0.010809700936079025,
0.011643500067293644,
-0.05943470820784569,
0.13648119568824768,
0.025340860709547997,
0.021847333759069443,
0.007281192112714052,
-0.004243006929755211,
-0.007057529408484697,
0.07582427561283112,
-0.13268126547336578,
-0.10707169026136398
] |
null | null |
transformers
|
# GPT-Code-Clippy-1.3B-APPS-all
## Model Description
GPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B fine-tuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the [Automated Programming Progress Standard (APPS) dataset](https://github.com/hendrycks/apps). The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found [here](https://huggingface.co/flax-community/gpt-neo-1.3B-apps).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_apps.py).
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
```
python run_clm_apps.py \
--output_dir ./gpt-neo-125M-apps \
--model_name_or_path EleutherAI/gpt-neo-125B \
--dataset_name ./apps.py \
--dataset_config_name formatted \
--do_train --do_eval \
--block_size="1024" \
--per_device_train_batch_size="3" \
--per_device_eval_batch_size="3" \
--preprocessing_num_workers="16" \
--learning_rate="8e-5" \
--warmup_steps="800" \
--adam_beta1="0.9" \
--adam_beta2="0.98" \
--weight_decay="0.1" \
--overwrite_output_dir \
--num_train_epochs="5" \
--logging_steps="50" \
--eval_steps="2000" \
--report_to="wandb" \
--dtype="bfloat16" \
--save_strategy epoch \
--gradient_accumulation_steps 1 \
--all_data true \
```
## Intended Use and Limitations
The model is fine-tuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-1.3B-apps-all-2")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-1.3B-apps-all-2")
prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
ANSWER:
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
5. **Biases:** The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt formatting is different from that used in APPS dataset.
This model is finetuned GPT-Neo and might have inhereted biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{"language": ["en", "python"], "license": "mit", "tags": ["gpt_neo", "code_synthesis"], "datasets": ["apps"]}
|
text-generation
|
flax-community/gpt-neo-1.3B-apps-all-2
|
[
"transformers",
"pytorch",
"jax",
"gpt_neo",
"text-generation",
"code_synthesis",
"dataset:apps",
"arxiv:2107.03374",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[
"en",
"python"
] |
TAGS
#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Code-Clippy-1.3B-APPS-all
## Model Description
GPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B fine-tuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.
## Training procedure
The training script used to train this model can be found here.
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is fine-tuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt formatting is different from that used in APPS dataset.
This model is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Code-Clippy-1.3B-APPS-all",
"## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B fine-tuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is fine-tuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt formatting is different from that used in APPS dataset.\n\nThis model is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Code-Clippy-1.3B-APPS-all",
"## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B fine-tuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is fine-tuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt formatting is different from that used in APPS dataset.\n\nThis model is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
67,
17,
48,
155,
60,
30,
35,
372,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Code-Clippy-1.3B-APPS-all## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B fine-tuned on APPS dataset. This model is specialized to solve programming tasks.## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is fine-tuned to solve programming problems given a text description and optional starter code.### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.14273281395435333,
0.10677190124988556,
-0.002678460907191038,
0.08031666278839111,
0.13762599229812622,
0.014188260771334171,
0.1228620857000351,
0.11081216484308243,
-0.054524753242731094,
0.034604642540216446,
-0.0006751533364877105,
0.06216733902692795,
0.040691737085580826,
0.23761025071144104,
0.014954568818211555,
-0.20529058575630188,
-0.03334565833210945,
-0.020321786403656006,
0.023620598018169403,
0.09494251757860184,
0.09984104335308075,
-0.06573794782161713,
0.040495291352272034,
0.02891131304204464,
-0.13574537634849548,
-0.03925266116857529,
0.0025653685443103313,
-0.02877664752304554,
0.1024722158908844,
0.02339329943060875,
0.09803079068660736,
0.010775593109428883,
0.01811816915869713,
-0.12956146895885468,
0.023878086358308792,
0.10011447966098785,
0.03262361139059067,
0.0992514118552208,
0.08960925042629242,
0.0018856121459975839,
0.12058500945568085,
0.022767752408981323,
0.07941068708896637,
0.06534995138645172,
-0.09710265696048737,
-0.05730351433157921,
-0.1085885539650917,
-0.03778519481420517,
0.15289990603923798,
0.10540520399808884,
-0.016818488016724586,
0.02708199992775917,
-0.055859267711639404,
0.04299980401992798,
0.049083467572927475,
-0.2051471471786499,
-0.0511329360306263,
0.059734441339969635,
-0.005103083793073893,
0.04440887272357941,
-0.059472836554050446,
-0.031384680420160294,
0.0032932746689766645,
0.04474113509058952,
0.0977749302983284,
-0.007803426589816809,
0.011875065043568611,
-0.036469414830207825,
-0.14236925542354584,
-0.037766676396131516,
0.07445282489061356,
-0.0342104472219944,
-0.06206788867712021,
-0.21225827932357788,
-0.04230071231722832,
-0.1311744898557663,
0.04282861575484276,
0.030204955488443375,
-0.023315785452723503,
0.04722050204873085,
-0.015545216389000416,
-0.04957685247063637,
-0.12287846207618713,
-0.06309417635202408,
-0.02952028252184391,
0.04803857207298279,
0.0160296019166708,
0.02256038412451744,
-0.03147667273879051,
0.1802442967891693,
-0.05396193265914917,
-0.042874958366155624,
-0.0521644651889801,
-0.024032533168792725,
-0.1460360586643219,
-0.06645771861076355,
-0.04206347092986107,
-0.1848977953195572,
-0.05791206657886505,
0.2041635513305664,
-0.062273722141981125,
0.04635864496231079,
0.04828597977757454,
0.018965275958180428,
0.05603800341486931,
0.1397586613893509,
0.05256759375333786,
0.018136074766516685,
0.007691127713769674,
0.07588347047567368,
-0.011861072853207588,
-0.029073847457766533,
-0.019226601347327232,
-0.02418406307697296,
0.012224060483276844,
0.09798923134803772,
0.030758017674088478,
-0.0050570047460496426,
-0.07343687117099762,
-0.06169131025671959,
0.14988332986831665,
-0.121541827917099,
0.017040999606251717,
0.00995294377207756,
-0.013265335932374,
-0.0623963288962841,
0.01379519235342741,
-0.02134871669113636,
-0.06535424292087555,
0.040515076369047165,
-0.06257054209709167,
-0.014366311952471733,
-0.12576042115688324,
-0.03865000233054161,
-0.004689633846282959,
-0.11024950444698334,
-0.054619334638118744,
-0.06293599307537079,
-0.2266329675912857,
-0.05384978652000427,
0.0471101775765419,
-0.028714260086417198,
-0.004094389732927084,
-0.033664967864751816,
0.0021563423797488213,
-0.012751496396958828,
0.014962633140385151,
0.11665029078722,
-0.027293726801872253,
0.025282735005021095,
-0.015999063849449158,
0.03378445282578468,
0.08282986283302307,
0.04670242965221405,
-0.08209481835365295,
0.039079517126083374,
-0.09685422480106354,
0.11631183326244354,
0.025441179051995277,
-0.031907446682453156,
-0.04903583973646164,
0.008557070046663284,
-0.05577528849244118,
-0.013409066945314407,
0.029033692553639412,
0.09541160613298416,
-0.20065781474113464,
-0.0635780319571495,
0.06864345818758011,
-0.12058285623788834,
0.015560325235128403,
0.07143498957157135,
-0.0372004397213459,
0.06363169848918915,
0.04943261295557022,
0.1283768117427826,
0.17067810893058777,
0.0030094098765403032,
-0.05762675032019615,
-0.025583140552043915,
-0.03655235469341278,
-0.004746618680655956,
0.07278437912464142,
-0.033371854573488235,
0.05999230220913887,
0.03861093148589134,
-0.044295135885477066,
0.06712639331817627,
-0.03808922693133354,
-0.05484658107161522,
-0.005803016945719719,
-0.06083152815699577,
-0.00024820739054121077,
-0.02125815860927105,
0.020950350910425186,
-0.02647750824689865,
-0.10142093151807785,
-0.0019251714693382382,
0.16089047491550446,
-0.0253450945019722,
-0.004807327874004841,
-0.08900772780179977,
0.053305428475141525,
-0.06623654812574387,
0.0015108442166820168,
-0.21295849978923798,
-0.01414626557379961,
0.06361320614814758,
-0.071061871945858,
0.035055987536907196,
-0.029651155695319176,
0.03788052126765251,
0.09272748976945877,
-0.0234658345580101,
-0.06274158507585526,
-0.06651168316602707,
-0.03012757934629917,
-0.10887852311134338,
-0.08897776156663895,
-0.07662812620401382,
-0.06689481437206268,
0.15961864590644836,
-0.16337288916110992,
0.04495425522327423,
0.03104882687330246,
0.10330960154533386,
0.05241170898079872,
-0.10851394385099411,
0.037427064031362534,
0.042053140699863434,
-0.07967028766870499,
-0.07510367780923843,
0.023701319471001625,
0.049367886036634445,
0.021404467523097992,
-0.0062267533503472805,
-0.14820438623428345,
-0.14539048075675964,
0.04219789430499077,
0.10547268390655518,
-0.10034274309873581,
-0.020575786009430885,
-0.062253065407276154,
-0.010788646526634693,
-0.14484357833862305,
0.012473536655306816,
0.17858467996120453,
0.04146581515669823,
0.136201411485672,
-0.04897409304976463,
-0.03383270651102066,
-0.015953605994582176,
0.029530851170420647,
0.006852713413536549,
0.06113060936331749,
-0.019345931708812714,
-0.022376874461770058,
0.049754731357097626,
0.042402204126119614,
-0.003579911543056369,
0.19375649094581604,
0.012626484967768192,
-0.09337303042411804,
-0.05022003874182701,
0.040100958198308945,
0.03327171504497528,
0.05002843588590622,
-0.020463740453124046,
0.009344778954982758,
0.037111811339855194,
0.06938938051462173,
0.06809867918491364,
-0.14226582646369934,
0.01720910333096981,
0.04252370074391365,
-0.001428301096893847,
-0.0022408561781048775,
-0.0030717342160642147,
-0.03905632719397545,
0.06353288888931274,
0.011169022880494595,
0.032975733280181885,
0.008820530027151108,
-0.02003580518066883,
-0.10928075015544891,
0.1613069623708725,
-0.09371398389339447,
-0.20998966693878174,
-0.17562144994735718,
0.1328878551721573,
-0.024699900299310684,
0.013892651535570621,
0.03588314354419708,
-0.0950411856174469,
-0.02046622335910797,
-0.0993708074092865,
-0.000257799489190802,
-0.012587286531925201,
-0.03744756430387497,
0.0054810745641589165,
0.02038566954433918,
-0.015620668418705463,
-0.11628107726573944,
0.013669874519109726,
0.025441640987992287,
-0.066507987678051,
0.08265811949968338,
0.005856927018612623,
0.0429905466735363,
0.20897294580936432,
0.0021045661997050047,
0.014651957899332047,
-0.008705230429768562,
0.14843305945396423,
-0.09942132234573364,
0.07252703607082367,
0.18022273480892181,
0.04201897233724594,
0.05588738992810249,
-0.00844270084053278,
-0.010390592738986015,
-0.0573209710419178,
0.050437480211257935,
0.023404378443956375,
-0.07563459128141403,
-0.2325546145439148,
-0.02590298466384411,
-0.05158548802137375,
-0.0021823374554514885,
0.04375581443309784,
0.0384252667427063,
0.007146723568439484,
0.022243985906243324,
-0.03706194460391998,
-0.04068060964345932,
0.042226724326610565,
0.08962447941303253,
-0.0077789113856852055,
-0.005540628917515278,
0.082195945084095,
-0.051625654101371765,
0.02897605486214161,
0.11458152532577515,
0.002261155052110553,
0.21656325459480286,
-0.04424074664711952,
0.2037288397550583,
0.01303861290216446,
0.06768914312124252,
0.04876647889614105,
0.07560517638921738,
-0.016323575749993324,
0.011866561137139797,
-0.003412832273170352,
-0.032987888902425766,
-0.018704097718000412,
0.03804541751742363,
-0.05125417187809944,
-0.013559275306761265,
-0.07688476890325546,
0.02353942207992077,
0.038040824234485626,
0.20917221903800964,
0.06338870525360107,
-0.18052458763122559,
-0.10932701081037521,
-0.00544615276157856,
-0.052258312702178955,
-0.07316070050001144,
-0.024643171578645706,
0.10595834255218506,
-0.12750625610351562,
-0.020271115005016327,
-0.055007170885801315,
0.0954354926943779,
-0.07387156039476395,
-0.04272795841097832,
-0.0012637916952371597,
0.10492374747991562,
-0.02724853716790676,
0.07166968286037445,
-0.16704200208187103,
0.008755424059927464,
-0.004949367605149746,
0.12556837499141693,
-0.04537171497941017,
0.036315713077783585,
0.007166476920247078,
0.02773214876651764,
0.057597849518060684,
0.014793766662478447,
-0.0910128504037857,
-0.10806006193161011,
-0.050251998007297516,
-0.016844481229782104,
0.034655697643756866,
-0.08176307380199432,
0.05035964772105217,
-0.025056611746549606,
0.01667844131588936,
-0.005516772624105215,
-0.036735936999320984,
-0.035429514944553375,
-0.14577941596508026,
0.043614741414785385,
-0.023747678846120834,
0.004866376984864473,
-0.07608669996261597,
-0.049016207456588745,
0.06554926186800003,
0.1779865026473999,
-0.10166460275650024,
-0.07629912346601486,
-0.1233343556523323,
-0.03624624013900757,
0.11107515543699265,
-0.06536483764648438,
0.035992421209812164,
0.007138180546462536,
0.17558589577674866,
-0.04075012728571892,
-0.09148919582366943,
0.018461717292666435,
-0.0714031383395195,
-0.14983254671096802,
-0.06443782150745392,
0.0954739898443222,
0.03678155317902565,
0.04299149289727211,
-0.023624122142791748,
0.05130067840218544,
-0.0730249360203743,
-0.05707879737019539,
0.028012141585350037,
0.13927005231380463,
0.03519819304347038,
0.08005421608686447,
-0.0676000714302063,
-0.04031942039728165,
-0.05255309119820595,
-0.0889444574713707,
0.07489985227584839,
0.19877025485038757,
-0.07778728753328323,
0.0546279139816761,
0.07895655184984207,
-0.06184389069676399,
-0.1664312779903412,
0.0510074719786644,
0.0371791236102581,
0.06541188806295395,
0.013734188862144947,
-0.2439824640750885,
0.0037404117174446583,
0.12078744173049927,
-0.007534359581768513,
0.1158212348818779,
-0.31347227096557617,
-0.12299846857786179,
0.02814410626888275,
0.012851373292505741,
0.04254337400197983,
-0.06918681412935257,
-0.015577871352434158,
-0.0005575526738539338,
-0.013658281415700912,
0.09232111275196075,
-0.10015233606100082,
0.13118644058704376,
-0.042623888701200485,
-0.016353381797671318,
0.033176399767398834,
-0.05666529759764671,
0.08595716208219528,
-0.0001365762873319909,
0.09475526213645935,
-0.001181398518383503,
0.009998181834816933,
-0.0006303906557150185,
-0.08068783581256866,
0.09168022871017456,
-0.06044035777449608,
0.09728547185659409,
-0.10069833695888519,
-0.07417865842580795,
-0.06564510613679886,
0.041644372045993805,
-0.008330230601131916,
-0.0746774971485138,
-0.050994955003261566,
0.052710503339767456,
0.004363781772553921,
-0.01659102737903595,
-0.005602833349257708,
0.0022691029589623213,
0.02206789143383503,
0.21237823367118835,
0.053826041519641876,
-0.06116657704114914,
-0.09142737090587616,
-0.013721066527068615,
0.01509146112948656,
0.06041580066084862,
-0.14850416779518127,
-0.012590729631483555,
0.11539158970117569,
0.06234106048941612,
0.1383085995912552,
0.04120301082730293,
-0.11894701421260834,
0.0224306657910347,
0.06330214440822601,
-0.05190059915184975,
-0.19004592299461365,
-0.0406125970184803,
0.018256599083542824,
-0.09102193266153336,
0.020178381353616714,
0.08501578122377396,
-0.024302829056978226,
-0.0381612591445446,
0.00484340637922287,
0.03775493800640106,
-0.01796565018594265,
0.19738759100437164,
-0.03341322019696236,
0.026440341025590897,
-0.05031165853142738,
0.12102626264095306,
0.08318217843770981,
-0.08892817050218582,
-0.013077208772301674,
0.11547231674194336,
-0.14303912222385406,
-0.06896660476922989,
0.012612049467861652,
0.11642783135175705,
-0.08485183119773865,
0.01627020351588726,
-0.0726831704378128,
-0.008996224030852318,
0.04598144069314003,
0.011498905718326569,
0.007918525487184525,
0.060509104281663895,
-0.052256837487220764,
0.003522729268297553,
-0.07539186626672745,
0.05497565120458603,
0.04006843641400337,
0.07255703955888748,
-0.04903731495141983,
0.17617659270763397,
-0.001556776580400765,
0.02348143793642521,
-0.010246416553854942,
-0.06424638628959656,
-0.016719138249754906,
0.0002723872603382915,
-0.08197299391031265,
-0.05544222891330719,
-0.08499657362699509,
-0.049699850380420685,
-0.013827641494572163,
0.004654868971556425,
0.04039609059691429,
0.049356911331415176,
-0.010722177103161812,
-0.05233564227819443,
-0.0854889526963234,
0.0393480509519577,
-0.07505904883146286,
0.012069028802216053,
0.01992933079600334,
-0.09331121295690536,
0.10114870965480804,
0.030496759340167046,
0.02817460335791111,
0.010161902755498886,
0.009525786153972149,
-0.007147971075028181,
-0.04026627913117409,
0.007921152748167515,
0.01048071775585413,
-0.13524392247200012,
0.020453820005059242,
-0.010230053216218948,
-0.025735823437571526,
0.02025480382144451,
0.08487734198570251,
-0.13002550601959229,
0.03205602988600731,
0.01862054504454136,
-0.04504701495170593,
-0.11029611527919769,
0.07484480738639832,
0.1090247854590416,
0.09211642295122147,
0.17475108802318573,
-0.025718115270137787,
0.058290448039770126,
-0.18699555099010468,
-0.027897106483578682,
0.003040294861420989,
0.000254409562330693,
-0.006433193106204271,
-0.01268967054784298,
0.028949163854122162,
-0.05516091734170914,
0.05416712537407875,
-0.006877392530441284,
-0.03532344475388527,
0.010828412137925625,
-0.0015567017253488302,
-0.00914262980222702,
0.035969045013189316,
0.09830629080533981,
0.02706218883395195,
-0.04153280705213547,
-0.009725079871714115,
0.04440382868051529,
-0.006325186230242252,
0.05240844190120697,
0.0605156384408474,
0.10946802794933319,
0.028503742069005966,
0.0363682322204113,
-0.020127730444073677,
-0.08958824723958969,
-0.1911063939332962,
0.019174326211214066,
-0.07199908047914505,
0.04044799879193306,
-0.06573611497879028,
0.10794739425182343,
0.15590298175811768,
-0.1594500094652176,
0.14015036821365356,
-0.032921791076660156,
-0.07283271104097366,
-0.05172578990459442,
-0.14445257186889648,
-0.03435717895627022,
-0.05099967122077942,
-0.023385779932141304,
-0.0674566701054573,
0.03932773694396019,
0.0904558077454567,
0.004205403383821249,
-0.028828054666519165,
0.15385043621063232,
-0.016731252893805504,
-0.053075160831213,
0.028033863753080368,
-0.01160366740077734,
0.022542452439665794,
0.005644276272505522,
0.028841055929660797,
0.03354303538799286,
-0.052049268037080765,
0.10590295493602753,
0.020375356078147888,
0.01657966524362564,
0.008789570070803165,
0.01631726324558258,
-0.04983332008123398,
0.004419615492224693,
-0.027232076972723007,
-0.016213690862059593,
0.10058244317770004,
0.053125977516174316,
-0.03355678915977478,
0.018811699002981186,
0.2522522211074829,
-0.026006372645497322,
-0.046825334429740906,
-0.14173051714897156,
0.22957448661327362,
0.019409365952014923,
-0.021070754155516624,
0.04150278493762016,
-0.10099142789840698,
0.00970644224435091,
0.13749615848064423,
0.16719530522823334,
0.0030391342006623745,
-0.027689998969435692,
0.017171012237668037,
-0.016729798167943954,
-0.03609129413962364,
0.060425709933042526,
0.03901882842183113,
0.15517999231815338,
-0.03782590851187706,
0.1483771800994873,
-0.0031721561681479216,
-0.043328724801540375,
-0.028131628409028053,
0.09657491743564606,
-0.04766802117228508,
0.03922031447291374,
-0.048878781497478485,
0.036216799169778824,
0.04071715101599693,
-0.12235160171985626,
0.01545628160238266,
-0.06914623081684113,
-0.11260509490966797,
-0.014882875606417656,
-0.026785383000969887,
-0.09215909987688065,
0.11302294582128525,
-0.00009434120875084773,
0.03109079785645008,
0.05256331339478493,
-0.046501774340867996,
0.015216883271932602,
-0.10475703328847885,
0.09106092154979706,
-0.0938221663236618,
0.10776746273040771,
-0.010017292574048042,
0.02522251382470131,
0.1101120114326477,
0.0453871451318264,
-0.14090488851070404,
0.07256068289279938,
-0.01006445474922657,
0.06106787547469139,
0.05579341575503349,
0.12135329842567444,
-0.03615346923470497,
0.08329521119594574,
0.019288906827569008,
-0.06993631273508072,
0.030991530045866966,
-0.04686261713504791,
0.03449873626232147,
-0.10380654036998749,
0.03362045809626579,
-0.03164474666118622,
0.16768866777420044,
0.14632847905158997,
-0.03388601914048195,
0.0004760843585245311,
-0.04164659604430199,
-0.005002291407436132,
0.024432921782135963,
0.15976928174495697,
-0.03631402924656868,
-0.17922508716583252,
0.032606758177280426,
0.08000072091817856,
0.023792164400219917,
-0.27914002537727356,
-0.05894938483834267,
0.002739256015047431,
-0.10244294255971909,
-0.012769494205713272,
0.10594038665294647,
0.08448127657175064,
0.05519518256187439,
-0.03839181363582611,
-0.01291861105710268,
0.003485296620056033,
0.079434834420681,
-0.08225949108600616,
-0.07286618649959564
] |
null | null |
transformers
|
# GPT-Neo-1.3B-APPS-all
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the [Automated Programming Progress Standard (APPS) dataset](https://github.com/hendrycks/apps). The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found [here](https://huggingface.co/flax-community/gpt-neo-1.3B-apps).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_apps.py).
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
```
python run_clm_apps.py \
--output_dir ./gpt-neo-1.3B-apps \
--model_name_or_path EleutherAI/gpt-neo-1.3B \
--dataset_name ./apps.py \
--dataset_config_name formatted \
--do_train --do_eval \
--block_size="1024" \
--per_device_train_batch_size="3" \
--per_device_eval_batch_size="3" \
--preprocessing_num_workers="16" \
--learning_rate="8e-5" \
--warmup_steps="800" \
--adam_beta1="0.9" \
--adam_beta2="0.98" \
--weight_decay="0.1" \
--overwrite_output_dir \
--num_train_epochs="5" \
--logging_steps="50" \
--eval_steps="2000" \
--report_to="wandb" \
--dtype="bfloat16" \
--save_strategy epoch \
--gradient_accumulation_steps 1 \
--all_data true \
```
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-code-clippy-1.3B-apps-alldata")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-code-clippy-1.3B-apps-alldata")
prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
ANSWER:
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
5. **Biases:** The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{"language": ["en", "python"], "license": "mit", "tags": ["gpt_neo", "code_synthesis"], "datasets": ["apps"]}
|
text-generation
|
flax-community/gpt-neo-1.3B-apps-all
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"code_synthesis",
"dataset:apps",
"arxiv:2107.03374",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[
"en",
"python"
] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# GPT-Neo-1.3B-APPS-all
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.
## Training procedure
The training script used to train this model can be found here.
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-1.3B-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# GPT-Neo-1.3B-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
75,
42,
47,
155,
60,
29,
35,
374,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# GPT-Neo-1.3B-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-Neo-1.3B-APPS-all is a GPT-Neo-1.3B finetuned on APPS dataset. This model is specialized to solve programming tasks.## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.09109753370285034,
0.10306718945503235,
-0.0009862456936389208,
0.039701059460639954,
0.12898723781108856,
0.0311789158731699,
0.08318812400102615,
0.0676882192492485,
-0.017481379210948944,
0.05616083741188049,
-0.0016083711525425315,
0.03330676257610321,
0.07710828632116318,
0.11112198978662491,
0.06545431911945343,
-0.14288610219955444,
0.00013229966862127185,
-0.05018991231918335,
0.01578151062130928,
0.11121077090501785,
0.05236242339015007,
-0.052969809621572495,
0.0662565529346466,
0.030399588868021965,
-0.14484035968780518,
-0.03146995231509209,
-0.022824732586741447,
-0.023960163816809654,
0.12515679001808167,
0.05650999769568443,
0.09034426510334015,
0.023657646030187607,
0.03200971707701683,
-0.15069985389709473,
0.009527459740638733,
0.08684969693422318,
0.03671151027083397,
0.08115197718143463,
0.07665100693702698,
0.0723719596862793,
0.1188095286488533,
-0.05321438983082771,
0.06603015959262848,
0.07182151824235916,
-0.04529223218560219,
-0.07442474365234375,
-0.12247096002101898,
0.010782051831483841,
0.16349098086357117,
0.10142897069454193,
-0.019314752891659737,
0.10320272296667099,
-0.030753416940569878,
0.0530790276825428,
0.08847108483314514,
-0.19349101185798645,
-0.03935951739549637,
0.09837464988231659,
-0.010903137736022472,
0.05974556878209114,
-0.06073673814535141,
-0.02628885954618454,
0.044348832219839096,
0.03785772621631622,
0.027386615052819252,
-0.026056721806526184,
0.06493768841028214,
-0.024509625509381294,
-0.1087954044342041,
-0.039345309138298035,
0.08654813468456268,
0.01405009999871254,
-0.09085400402545929,
-0.19293512403964996,
-0.030439011752605438,
-0.02667866088449955,
0.0762983113527298,
0.0016743758460506797,
-0.024853410199284554,
0.018847407773137093,
-0.014949730597436428,
-0.06566429883241653,
-0.12551259994506836,
-0.04045538976788521,
-0.02406059205532074,
0.07128652185201645,
-0.01066780835390091,
0.05160066485404968,
0.002883500186726451,
0.1213088259100914,
-0.009002165868878365,
-0.043501634150743484,
-0.0447663739323616,
-0.05630146339535713,
-0.1718982309103012,
-0.04704217240214348,
-0.01848074421286583,
-0.1850515604019165,
-0.03694448247551918,
0.027769003063440323,
-0.0247258972376585,
0.07313702255487442,
0.041690632700920105,
0.012271372601389885,
0.060385581105947495,
0.19779779016971588,
-0.006485213525593281,
-0.04593583941459656,
0.006288908421993256,
0.00728259002789855,
0.03273028880357742,
-0.04244595021009445,
-0.0412239208817482,
-0.02360360510647297,
-0.0038250330835580826,
0.09079806506633759,
0.021177805960178375,
-0.006675473414361477,
-0.09822699427604675,
-0.061713363975286484,
0.17482176423072815,
-0.09037089347839355,
0.011339946649968624,
0.0020466914866119623,
0.00412004766985774,
0.01038543414324522,
0.03589566424489021,
-0.005173909943550825,
-0.07065911591053009,
0.0265053603798151,
-0.0507977120578289,
-0.06899569928646088,
-0.0865686908364296,
-0.049470581114292145,
0.02808380499482155,
-0.07494238764047623,
-0.034811049699783325,
-0.0644606500864029,
-0.23129083216190338,
-0.04605890065431595,
0.02708631008863449,
-0.012994232587516308,
-0.037256475538015366,
-0.002506363671272993,
-0.012736998498439789,
-0.030158882960677147,
-0.0022136650513857603,
0.06197734549641609,
-0.003206519642844796,
0.033507201820611954,
-0.05443194881081581,
0.037289127707481384,
0.041677236557006836,
0.022445369511842728,
-0.06831317394971848,
0.049965109676122665,
-0.09702980518341064,
0.12772011756896973,
-0.017808113247156143,
0.004389775916934013,
-0.06313509494066238,
-0.037714675068855286,
0.01727404072880745,
-0.018382873386144638,
0.041504498571157455,
0.10080435872077942,
-0.10260755568742752,
-0.05613110959529877,
0.06797772645950317,
-0.08850810676813126,
-0.04104034602642059,
0.1203608363866806,
-0.03736807778477669,
0.08817069977521896,
0.0898713693022728,
0.0595894381403923,
0.10300268977880478,
0.024203097447752953,
-0.08140082657337189,
-0.023159891366958618,
-0.016686148941516876,
0.10384079068899155,
0.05330989509820938,
-0.015853283926844597,
-0.00412364536896348,
0.049803733825683594,
-0.02620655484497547,
0.059216611087322235,
-0.00867196824401617,
-0.04755774512887001,
-0.049324993044137955,
-0.10712611675262451,
0.08464821428060532,
-0.019737495109438896,
-0.004606155678629875,
-0.04493597522377968,
-0.11288784444332123,
0.0009010741487145424,
0.146650493144989,
-0.04373950511217117,
-0.026331160217523575,
-0.06226615235209465,
-0.05433390289545059,
-0.06349834054708481,
0.01591314561665058,
-0.18715457618236542,
-0.043102510273456573,
0.05590386316180229,
-0.07570759952068329,
0.003013324923813343,
0.0020672588143497705,
0.03183997794985771,
0.0757642537355423,
-0.039006058126688004,
-0.03316837176680565,
-0.10251787304878235,
-0.020520342513918877,
-0.03148946166038513,
-0.05630788952112198,
-0.06661191582679749,
-0.04515679180622101,
0.18955256044864655,
-0.12830840051174164,
0.003442042740061879,
0.02325756847858429,
0.09176155924797058,
0.048175957053899765,
-0.09645270556211472,
0.03132886439561844,
0.019254807382822037,
-0.038283850997686386,
-0.07477736473083496,
-0.020142503082752228,
0.04203592240810394,
-0.06756359338760376,
-0.00047642449499107897,
-0.13054341077804565,
-0.09265109896659851,
0.031343527138233185,
0.0952741727232933,
-0.0978621318936348,
0.04937553405761719,
-0.04810253158211708,
-0.037459440529346466,
-0.11141956597566605,
-0.06201641634106636,
0.2371186465024948,
0.05707244947552681,
0.0664549246430397,
-0.06334022432565689,
-0.012303068302571774,
0.001978788059204817,
0.01108546368777752,
0.003835276933386922,
0.0879940614104271,
-0.06337936222553253,
-0.08355860412120819,
0.02641329914331436,
0.023732800036668777,
0.017036844044923782,
0.18538396060466766,
0.018744608387351036,
-0.08802806586027145,
-0.06528890877962112,
0.018745513632893562,
0.017703739926218987,
0.0464654304087162,
-0.12441358715295792,
-0.013617116957902908,
0.042454298585653305,
0.004666951484978199,
0.04419075325131416,
-0.12138639390468597,
0.027984656393527985,
0.044691573828458786,
0.007752048783004284,
-0.036901235580444336,
-0.026572605594992638,
-0.012205026112496853,
0.0541972778737545,
0.013842265121638775,
0.09493357688188553,
0.01950785517692566,
-0.04126577079296112,
-0.14490322768688202,
0.14220939576625824,
-0.10095521807670593,
-0.27694278955459595,
-0.17642411589622498,
0.11017849296331406,
-0.05975190922617912,
-0.01694238744676113,
0.01763296127319336,
-0.07613144814968109,
-0.06688933819532394,
-0.07142564654350281,
0.07185393571853638,
-0.0037957599852234125,
-0.07456785440444946,
0.12665070593357086,
-0.014437969774007797,
-0.04318452626466751,
-0.1130942776799202,
0.01813996024429798,
0.059826381504535675,
-0.09167930483818054,
0.04421711713075638,
-0.05944986268877983,
0.06658875942230225,
0.20137624442577362,
-0.01227433979511261,
-0.029025662690401077,
-0.010893231257796288,
0.1564595252275467,
-0.12628738582134247,
0.09237070381641388,
0.16902785003185272,
0.003214989323168993,
0.0531110055744648,
0.03461436927318573,
-0.0011975753586739302,
-0.0366082526743412,
0.08955930918455124,
0.019099345430731773,
-0.08160320669412613,
-0.24050511419773102,
-0.06452091038227081,
-0.04649657383561134,
0.017210254445672035,
-0.023129722103476524,
0.02593817748129368,
0.03643691912293434,
0.02971504256129265,
-0.04000743478536606,
-0.026711633428931236,
0.02878410741686821,
0.1022297814488411,
0.029999496415257454,
-0.024526340886950493,
0.04185697063803673,
-0.07475853711366653,
0.003639706177636981,
0.13422858715057373,
0.03362660855054855,
0.21660244464874268,
-0.039957884699106216,
0.23317083716392517,
0.0030423523858189583,
0.04580722376704216,
-0.013446017168462276,
0.07724554091691971,
0.01682286523282528,
0.0014953743666410446,
-0.013737735338509083,
-0.045546092092990875,
-0.06651417911052704,
0.07502907514572144,
0.0003136073355562985,
-0.004577521234750748,
-0.004956603981554508,
0.02145964279770851,
0.002133042784407735,
0.22462375462055206,
0.00664042541757226,
-0.1266920566558838,
-0.12036509066820145,
0.04289411008358002,
-0.021394800394773483,
-0.0989239513874054,
-0.02427142672240734,
0.10589373111724854,
-0.15334296226501465,
0.022173840552568436,
-0.0471218042075634,
0.08546663075685501,
-0.0735178291797638,
-0.00912483874708414,
-0.07285585254430771,
0.052784405648708344,
-0.005090190563350916,
0.10739771276712418,
-0.10843449831008911,
0.0030372529290616512,
0.01448032632470131,
0.09843456745147705,
-0.08967975527048111,
0.03352271765470505,
0.00214638514444232,
0.028314078226685524,
0.08547144383192062,
0.022682934999465942,
-0.002854512305930257,
-0.09144678711891174,
-0.05385374650359154,
0.005568825174123049,
0.027546081691980362,
-0.028856806457042694,
0.033022087067365646,
0.005443322937935591,
0.014496451243758202,
-0.0216436218470335,
-0.008773721754550934,
-0.05885826796293259,
-0.15287478268146515,
0.08679254353046417,
-0.030856451019644737,
-0.02154955454170704,
-0.10541902482509613,
-0.06434699892997742,
0.035559285432100296,
0.24775876104831696,
0.03398987278342247,
-0.08148061484098434,
-0.10515611618757248,
-0.12306578457355499,
0.14911434054374695,
-0.052665550261735916,
0.023202620446681976,
0.027994750067591667,
0.17121724784374237,
-0.036964450031518936,
-0.0524187907576561,
0.01284013781696558,
-0.049578964710235596,
-0.1656380146741867,
-0.034616775810718536,
0.05120838060975075,
0.05345014110207558,
0.06457959115505219,
0.005804389249533415,
-0.011660521849989891,
-0.060871727764606476,
-0.10059169679880142,
0.001732859993353486,
0.16112247109413147,
-0.02503078617155552,
0.08385109156370163,
0.021161368116736412,
-0.053034160286188126,
-0.037632428109645844,
-0.10091054439544678,
0.09015938639640808,
0.2069787085056305,
-0.09339972585439682,
0.09804368764162064,
0.12748725712299347,
-0.04799709841609001,
-0.18849961459636688,
0.03871222585439682,
0.07976475358009338,
0.10797668248414993,
0.02965184487402439,
-0.17429094016551971,
0.04466519504785538,
0.08015821874141693,
-0.019663888961076736,
0.06762798875570297,
-0.31531769037246704,
-0.10421055555343628,
0.03349757939577103,
0.05397627130150795,
0.03668748214840889,
-0.07778811454772949,
-0.0005850635352544487,
-0.06408414989709854,
-0.14463134109973907,
0.061985209584236145,
-0.11267103254795074,
0.10950601100921631,
-0.023968759924173355,
0.07223561406135559,
0.05362110584974289,
-0.0784522145986557,
0.13497895002365112,
-0.00806827936321497,
0.06050942465662956,
-0.014830375090241432,
0.02154826745390892,
0.10701165348291397,
-0.07474692165851593,
0.1521790623664856,
-0.012281167320907116,
0.09188804775476456,
-0.09976016730070114,
-0.0599641315639019,
-0.07593440264463425,
0.010240641422569752,
-0.030976897105574608,
-0.0909239798784256,
-0.07729228585958481,
0.04944116249680519,
0.041510701179504395,
-0.003998658154159784,
-0.05242793634533882,
0.00846041552722454,
-0.00387901091016829,
0.2131943702697754,
0.12315946817398071,
-0.07369071990251541,
-0.08401212096214294,
-0.00529926922172308,
-0.014629296027123928,
0.02718825452029705,
-0.16048943996429443,
0.01827366277575493,
0.10954569280147552,
0.07369250059127808,
0.12224005162715912,
-0.015028714202344418,
-0.17562703788280487,
0.044989943504333496,
0.018489807844161987,
-0.06540680676698685,
-0.19447384774684906,
-0.06702222675085068,
0.10809063166379929,
-0.12792269885540009,
0.0430850051343441,
0.09104674309492111,
-0.012003853917121887,
-0.03978142887353897,
-0.00011674483539536595,
0.039744969457387924,
-0.0007951711304485798,
0.1392614245414734,
-0.04527654871344566,
0.04227141663432121,
-0.06488898396492004,
0.10772735625505447,
0.08206671476364136,
-0.03809676691889763,
0.013529501855373383,
0.060372494161129,
-0.11283469200134277,
-0.020867081359028816,
-0.02657659724354744,
0.03668792545795441,
-0.04360373690724373,
-0.027242792770266533,
-0.03720307722687721,
-0.08021099865436554,
0.0008924367721192539,
0.02494141459465027,
-0.01436197105795145,
0.08714467287063599,
-0.056429672986269,
-0.00048462231643497944,
-0.08802551031112671,
0.04865601286292076,
0.04834503307938576,
0.06171642616391182,
-0.05939064174890518,
0.07518336176872253,
0.006647672038525343,
0.046942636370658875,
-0.01943930797278881,
-0.07253105938434601,
-0.029503662139177322,
-0.007860051468014717,
-0.2013603001832962,
-0.024022560566663742,
-0.03814282640814781,
-0.02127748541533947,
-0.010891266167163849,
0.01276896521449089,
0.009406226687133312,
0.06294871866703033,
-0.032561179250478745,
-0.053006261587142944,
-0.06032485142350197,
0.029499098658561707,
-0.06694580614566803,
-0.019145723432302475,
0.05835586041212082,
-0.07511408627033234,
0.10191838443279266,
-0.016862904652953148,
0.00826170388609171,
0.02487213909626007,
-0.06028437241911888,
0.06199520081281662,
0.016311518847942352,
0.02409982867538929,
-0.002877210732549429,
-0.1450313925743103,
-0.025562649592757225,
0.002267559990286827,
-0.028746163472533226,
0.017855120822787285,
0.003130322555080056,
-0.0759127140045166,
-0.0030556104611605406,
-0.018799562007188797,
-0.0818556547164917,
-0.09053660184144974,
0.07309932261705399,
0.09629475325345993,
0.038311854004859924,
0.12443523854017258,
-0.014133205637335777,
0.059256941080093384,
-0.1550992876291275,
-0.04490477219223976,
0.013951096683740616,
0.03686833381652832,
0.0002532445068936795,
-0.031184272840619087,
0.0157132800668478,
-0.0388515442609787,
0.09106650203466415,
-0.07765288650989532,
0.039703112095594406,
-0.013352597132325172,
0.05541875958442688,
-0.010245748795568943,
0.003684304654598236,
0.09107965975999832,
-0.03782506287097931,
-0.02180706523358822,
0.00936594232916832,
0.01653384603559971,
-0.04031283035874367,
-0.009430844336748123,
0.03966904804110527,
0.00432485481724143,
0.04917820915579796,
0.037547871470451355,
0.0337417796254158,
-0.03229609876871109,
-0.1215788945555687,
0.0017844719113782048,
-0.07757412642240524,
0.040998343378305435,
-0.05859959498047829,
0.1353026181459427,
0.1284678429365158,
-0.12838999927043915,
0.12174368649721146,
-0.030268432572484016,
-0.10330798476934433,
-0.03341119736433029,
-0.09164173901081085,
-0.02678562141954899,
-0.053055696189403534,
-0.010931222699582577,
-0.10119101405143738,
0.020573321729898453,
0.07202272117137909,
-0.01226524543017149,
-0.07283291965723038,
0.16851240396499634,
-0.03833819180727005,
-0.08604191988706589,
0.06347674131393433,
-0.0022353900130838156,
0.03208397701382637,
0.023144444450736046,
0.05488226190209389,
0.044581908732652664,
0.04384150728583336,
0.13646675646305084,
0.06749077141284943,
0.02771204151213169,
-0.022790975868701935,
0.02089175209403038,
-0.04551674798130989,
0.006809967570006847,
-0.005242563784122467,
-0.005344736389815807,
0.13062290847301483,
0.035979751497507095,
-0.021937459707260132,
-0.016619233414530754,
0.1759619414806366,
-0.0678405836224556,
0.004531380720436573,
-0.10899197310209274,
0.11967407166957855,
-0.03158053383231163,
-0.0014256536960601807,
0.09109058976173401,
-0.11746953427791595,
0.002865656279027462,
0.0993892252445221,
0.12101412564516068,
0.03656255826354027,
-0.03687766566872597,
-0.006490505766123533,
-0.0019281140994280577,
-0.06705580651760101,
0.11184734851121902,
0.040816593915224075,
0.15594135224819183,
-0.045357123017311096,
0.152760311961174,
0.007539886981248856,
-0.02335571125149727,
-0.027386851608753204,
0.07078386843204498,
-0.0426618717610836,
0.01708219386637211,
-0.05595370754599571,
0.05375117063522339,
-0.013965994119644165,
-0.14939439296722412,
-0.0077634272165596485,
-0.10556891560554504,
-0.1222316175699234,
0.05527680367231369,
-0.0341506227850914,
-0.06387742608785629,
0.13982848823070526,
0.018752846866846085,
0.04044695198535919,
0.19510671496391296,
-0.04252142831683159,
-0.054549988359212875,
-0.042205777019262314,
0.08250657469034195,
-0.11925873160362244,
0.1645852029323578,
-0.0018103872425854206,
0.06231091544032097,
0.09694501757621765,
-0.006597927305847406,
-0.11670252680778503,
0.042058613151311874,
0.008704930543899536,
0.003548684297129512,
0.030892647802829742,
0.13920404016971588,
-0.028389379382133484,
0.11063269525766373,
0.04376670718193054,
-0.040879350155591965,
-0.0003946129581891,
0.007751054596155882,
0.11025071889162064,
-0.12132662534713745,
0.042649466544389725,
-0.07949498295783997,
0.152430921792984,
0.10250302404165268,
-0.03703659027814865,
-0.006770214531570673,
-0.022085584700107574,
-0.003430768847465515,
-0.0012179104378446937,
0.18647295236587524,
-0.006362691521644592,
-0.14060330390930176,
0.021388566121459007,
0.01857704669237137,
0.059886060655117035,
-0.2881450057029724,
-0.0903504267334938,
0.013237319886684418,
-0.06741130352020264,
0.0076643433421850204,
0.0981188639998436,
0.01824825629591942,
0.018602872267365456,
-0.0198491420596838,
-0.09957103431224823,
0.004705023020505905,
0.08950688689947128,
-0.07203125953674316,
-0.07369688898324966
] |
null | null |
transformers
|
# GPT-Neo-1.3B-APPS
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-Neo-1.3B-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the [Automated Programming Progress Standard (APPS) dataset](https://github.com/hendrycks/apps). The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found [here](https://huggingface.co/flax-community/gpt-neo-125M-apps).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_apps.py).
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
```bash
python run_clm_apps.py \
--output_dir $HOME/gpt-neo-1.3B-apps \
--model_name_or_path EleutherAI/gpt-neo-1.3B \
--dataset_name $HOME/gpt-code-clippy/data_processing/apps.py \
--dataset_config_name formatted \
--do_train --do_eval \
--block_size="1024" \
--per_device_train_batch_size="3" \
--per_device_eval_batch_size="3" \
--preprocessing_num_workers="16" \
--learning_rate="8e-5" \
--warmup_steps="800" \
--adam_beta1="0.9" \
--adam_beta2="0.98" \
--weight_decay="0.1" \
--overwrite_output_dir \
--num_train_epochs="5" \
--logging_steps="50" \
--eval_steps="2000" \
--report_to="wandb" \
--dtype="bfloat16" \
--save_strategy epoch \
--gradient_accumulation_steps 1 \
```
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-code-clippy-1.3B-apps")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-code-clippy-1.3B-apps")
prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
ANSWER:
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
5. **Biases:** The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{"language": ["en", "python"], "license": "mit", "tags": ["gpt_neo", "code_synthesis"], "datasets": ["apps"]}
|
text-generation
|
flax-community/gpt-neo-1.3B-apps
|
[
"transformers",
"pytorch",
"jax",
"gpt_neo",
"text-generation",
"code_synthesis",
"dataset:apps",
"arxiv:2107.03374",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[
"en",
"python"
] |
TAGS
#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Neo-1.3B-APPS
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-Neo-1.3B-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.
## Training procedure
The training script used to train this model can be found here.
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-1.3B-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-1.3B-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Neo-1.3B-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-1.3B-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
67,
40,
45,
155,
60,
29,
35,
374,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Neo-1.3B-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-Neo-1.3B-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.09672321379184723,
0.055259644985198975,
0.00007472636934835464,
0.038469795137643814,
0.10989425331354141,
0.06601018458604813,
0.0944913923740387,
0.08564076572656631,
-0.017872517928481102,
0.08410871028900146,
0.051931850612163544,
-0.027649521827697754,
0.04922640323638916,
0.11738719046115875,
0.030339980497956276,
-0.2528131604194641,
0.03015778958797455,
-0.023934122174978256,
0.042734913527965546,
0.11451050639152527,
0.08862712979316711,
-0.05779260769486427,
0.05510187894105911,
0.03815875202417374,
-0.1012309193611145,
-0.004332404583692551,
-0.02209271676838398,
-0.028667349368333817,
0.12722384929656982,
0.07971997559070587,
0.04534963518381119,
0.04724535346031189,
0.015348846092820168,
-0.15853911638259888,
0.021035892888903618,
0.10107484459877014,
0.0336182676255703,
0.07683517038822174,
0.0714799165725708,
0.08147794753313065,
0.23358900845050812,
-0.0445757731795311,
0.0659390240907669,
0.0752381682395935,
-0.04071758687496185,
-0.0747222751379013,
-0.1310960203409195,
-0.012192904949188232,
0.16815991699695587,
0.0959087386727333,
-0.014830952510237694,
0.11710815131664276,
-0.10230732709169388,
0.03704738989472389,
0.13417267799377441,
-0.2656692564487457,
-0.059447407722473145,
0.12246578186750412,
0.05510268360376358,
0.07487808167934418,
-0.0443088673055172,
-0.005660451017320156,
0.05082729086279869,
0.05879886820912361,
0.04582211747765541,
-0.05431466922163963,
0.1010245606303215,
-0.011662265285849571,
-0.1285243034362793,
-0.03740529343485832,
0.027122145518660545,
0.022255128249526024,
-0.07312827557325363,
-0.21417184174060822,
-0.027984993532299995,
-0.0642024576663971,
0.04957535117864609,
-0.04022900015115738,
-0.014780941419303417,
0.009395496919751167,
0.016130952164530754,
-0.07209065556526184,
-0.09622719883918762,
-0.0528208427131176,
-0.03234141692519188,
0.0605403371155262,
0.021387645974755287,
0.051939960569143295,
0.00222966936416924,
0.10080455988645554,
-0.10862293094396591,
-0.03393160179257393,
-0.027684738859534264,
-0.058708555996418,
-0.12908950448036194,
-0.03157470375299454,
-0.021709725260734558,
-0.24361130595207214,
-0.030256755650043488,
0.10823240131139755,
-0.08316366374492645,
0.09110593050718307,
0.1141175627708435,
0.009084410034120083,
0.046919792890548706,
0.2040177285671234,
-0.06627896428108215,
-0.07018224149942398,
0.033891692757606506,
0.059777628630399704,
0.011522236280143261,
-0.037410248070955276,
-0.06775671243667603,
-0.06281286478042603,
-0.010399620048701763,
0.10696610063314438,
0.039341434836387634,
0.00591633515432477,
-0.06654611229896545,
-0.04295111075043678,
0.14530505239963531,
-0.07271343469619751,
0.008709950372576714,
-0.02149864472448826,
-0.04095405712723732,
-0.01145907212048769,
0.006062701810151339,
-0.027688300237059593,
-0.08357038348913193,
0.08881884813308716,
-0.053236499428749084,
-0.0765584334731102,
-0.09498776495456696,
-0.0708351582288742,
0.03313362970948219,
-0.08014713227748871,
-0.052400797605514526,
-0.05001608654856682,
-0.21331685781478882,
-0.04824105277657509,
0.01602020300924778,
0.002520320937037468,
-0.05185180529952049,
0.027671417221426964,
0.01505889743566513,
-0.02199072390794754,
-0.0287321787327528,
0.08410762995481491,
-0.029805993661284447,
0.0015842511784285307,
-0.044215790927410126,
0.05100023001432419,
0.033811986446380615,
0.017425863072276115,
-0.06987616419792175,
0.08882152289152145,
-0.1176256313920021,
0.10901100188493729,
-0.014257008209824562,
-0.05971353501081467,
-0.0775546059012413,
-0.07464553415775299,
0.04743395745754242,
0.013630503788590431,
0.033421024680137634,
0.09338401257991791,
-0.08492252975702286,
-0.03916330263018608,
-0.005789459217339754,
-0.10072516649961472,
-0.05672210082411766,
0.15725469589233398,
-0.01998773030936718,
0.016646556556224823,
0.08469413220882416,
0.06078384816646576,
0.17268332839012146,
-0.017429418861865997,
-0.08325856178998947,
0.023810364305973053,
-0.06138761341571808,
0.07779652625322342,
0.06941764056682587,
0.004456620663404465,
0.02530856616795063,
0.0471712127327919,
-0.05211849883198738,
0.05356301739811897,
-0.023204702883958817,
-0.05133671686053276,
-0.04984167963266373,
-0.08117088675498962,
0.0661729946732521,
-0.006831538397818804,
-0.004647739231586456,
-0.04462960734963417,
-0.11092092096805573,
-0.0014376718318089843,
0.14443011581897736,
-0.03014390729367733,
-0.02105404995381832,
-0.07348328083753586,
-0.016420630738139153,
-0.04122287407517433,
0.028902567923069,
-0.20410379767417908,
-0.061099644750356674,
0.030982283875346184,
-0.05130607634782791,
0.026987450197339058,
-0.013651498593389988,
0.02157941274344921,
0.07104673981666565,
-0.05711829662322998,
-0.04777909070253372,
-0.06717213243246078,
-0.017832333222031593,
-0.08050719648599625,
-0.05021382123231888,
-0.10328080505132675,
-0.020081663504242897,
0.0975428894162178,
-0.1326477825641632,
0.008532647974789143,
0.05790620669722557,
0.10167157649993896,
0.03468775749206543,
-0.0672864094376564,
0.05753886327147484,
0.060416292399168015,
-0.024318167939782143,
-0.09008125215768814,
-0.013587276451289654,
0.00786164216697216,
-0.07889143377542496,
0.007926063612103462,
-0.13415968418121338,
-0.08164457231760025,
0.009809133596718311,
0.13663025200366974,
-0.10226424038410187,
0.00798483844846487,
-0.05502479523420334,
-0.06334418058395386,
-0.10184882581233978,
-0.04921886324882507,
0.23618118464946747,
0.0655190646648407,
0.08749793469905853,
-0.08483567088842392,
0.04528137668967247,
-0.030666999518871307,
0.010904466733336449,
0.004464269615709782,
0.11295638978481293,
-0.033643078058958054,
-0.05348670482635498,
0.00360488030128181,
0.012023134157061577,
-0.012394017539918423,
0.1641639769077301,
0.007668924052268267,
-0.09757309406995773,
-0.05391395092010498,
0.020388294011354446,
0.02212676778435707,
0.03143412247300148,
-0.05482974648475647,
0.0011178560089319944,
0.0516737699508667,
0.016139045357704163,
0.07083672285079956,
-0.12849898636341095,
0.027496647089719772,
0.042499132454395294,
-0.0005527749308384955,
0.00037320618866942823,
0.004382091108709574,
0.019391706213355064,
0.04962674528360367,
0.03535397723317146,
0.04395429790019989,
0.0059653399512171745,
-0.033005714416503906,
-0.12026870995759964,
0.20263734459877014,
-0.08934367448091507,
-0.24666038155555725,
-0.1441311538219452,
0.1140521690249443,
-0.06969977915287018,
-0.06728630512952805,
0.0003952224797103554,
-0.06984733045101166,
-0.06712618470191956,
-0.07562226057052612,
0.12414808571338654,
-0.053235460072755814,
-0.019836073741316795,
-0.0044481270015239716,
-0.01197222899645567,
-0.05965128168463707,
-0.12133298814296722,
0.013919471763074398,
0.0314270555973053,
-0.05467968434095383,
0.05888203904032707,
-0.07212761044502258,
0.07642216235399246,
0.19095146656036377,
-0.03045758418738842,
-0.025420833379030228,
-0.04012705013155937,
0.16073475778102875,
-0.11801574379205704,
0.07858330011367798,
0.24686166644096375,
0.054012276232242584,
0.0612851046025753,
0.021657321602106094,
0.00037999029154889286,
-0.02540738694369793,
0.056753627955913544,
0.023157740011811256,
-0.11024734377861023,
-0.2479182481765747,
-0.0974479541182518,
-0.05576562136411667,
0.0631626695394516,
0.006960912607610226,
-0.007855917327105999,
-0.008846843615174294,
0.04764333739876747,
-0.050409287214279175,
0.02455051802098751,
-0.00398296071216464,
0.0771978422999382,
0.084317646920681,
0.0025250075850635767,
0.04553705081343651,
-0.07768785208463669,
0.02586391381919384,
0.13986250758171082,
0.02544017694890499,
0.22929416596889496,
-0.08006059378385544,
0.2801438570022583,
0.011231697164475918,
0.07589162886142731,
0.018174877390265465,
0.06997503340244293,
-0.007731886114925146,
-0.030881768092513084,
-0.011384891346096992,
-0.054628144949674606,
-0.029512455686926842,
0.08669735491275787,
0.008257466368377209,
0.0032828261610120535,
-0.08015762269496918,
0.07498034834861755,
-0.01834552362561226,
0.22601842880249023,
-0.02797611430287361,
-0.18191638588905334,
-0.12529361248016357,
0.01513440441340208,
-0.013403920456767082,
-0.07999008893966675,
-0.005510103888809681,
0.11507049202919006,
-0.17573319375514984,
0.017955556511878967,
-0.08115648478269577,
0.11107683926820755,
-0.08979180455207825,
-0.004067819565534592,
-0.07051001489162445,
0.16555973887443542,
-0.01743101142346859,
0.10926204174757004,
-0.14136235415935516,
-0.006117776967585087,
0.04348663613200188,
0.10166490823030472,
-0.08481686562299728,
0.03438287228345871,
0.03569003939628601,
0.06677720695734024,
0.06450313329696655,
-0.009025760926306248,
-0.07583846896886826,
-0.08584807813167572,
-0.05725570395588875,
-0.00455422792583704,
0.043226007372140884,
0.04783380776643753,
0.033341143280267715,
0.019305571913719177,
0.029899071902036667,
-0.01915702037513256,
-0.06299905478954315,
-0.08312881737947464,
-0.16860951483249664,
0.07974162697792053,
0.038781993091106415,
-0.057071294635534286,
-0.114678755402565,
-0.04971540346741676,
0.1133333370089531,
0.2875441014766693,
0.06216837465763092,
-0.09130751341581345,
-0.12396277487277985,
-0.04480832815170288,
0.20862209796905518,
-0.09056631475687027,
0.004901569802314043,
0.0022912719286978245,
0.169391930103302,
-0.05553830415010452,
-0.07151523232460022,
0.033827487379312515,
-0.05245792120695114,
-0.15498705208301544,
-0.035899244248867035,
0.042045317590236664,
0.011256679892539978,
0.08218935877084732,
-0.009294329211115837,
-0.020440010353922844,
-0.05892304331064224,
-0.12027473747730255,
-0.02898118458688259,
0.09806238114833832,
-0.013567156158387661,
0.09954170137643814,
0.020496303215622902,
-0.027356265112757683,
-0.003874445566907525,
-0.07368165254592896,
0.11970572173595428,
0.1737726628780365,
-0.10962175577878952,
0.09192746877670288,
0.08927882462739944,
0.001375352032482624,
-0.20786017179489136,
0.07154681533575058,
0.0907839685678482,
0.11187632381916046,
0.047200608998537064,
-0.16374708712100983,
0.02426912821829319,
0.11622770875692368,
-0.006781185511499643,
0.08823941648006439,
-0.38555654883384705,
-0.11770569533109665,
0.033483292907476425,
0.028746308758854866,
0.0339454784989357,
-0.08961253613233566,
-0.014928212389349937,
-0.07213635742664337,
-0.11616092920303345,
0.030229663476347923,
-0.10982710123062134,
0.10241786390542984,
-0.052270062267780304,
0.03857825696468353,
0.05898413434624672,
-0.05706198513507843,
0.15183161199092865,
-0.042261283844709396,
0.05847727134823799,
0.008582926355302334,
-0.000029923165129730478,
0.07001791149377823,
-0.09043379127979279,
0.1742801070213318,
-0.09478723257780075,
0.09918317198753357,
-0.15588544309139252,
-0.056147120893001556,
-0.06569436192512512,
0.02365017868578434,
-0.01508229598402977,
-0.0902889296412468,
-0.10741882771253586,
0.038924168795347214,
0.01915738545358181,
-0.014603436924517155,
-0.019153285771608353,
-0.002236182801425457,
-0.007184006739407778,
0.2155996412038803,
0.13647998869419098,
-0.1661803424358368,
-0.14595988392829895,
0.023751044645905495,
0.004440132528543472,
0.05339356139302254,
-0.18928587436676025,
-0.01827460341155529,
0.11369132250547409,
0.05517519637942314,
0.10766049474477768,
-0.00035629779449664056,
-0.1501559317111969,
0.04624956101179123,
0.03390626236796379,
-0.04477774351835251,
-0.22145234048366547,
-0.05808059498667717,
0.083637535572052,
-0.12241513282060623,
0.023398984223604202,
0.07772763818502426,
-0.03861723467707634,
-0.028256621211767197,
0.009093948639929295,
0.013935692608356476,
0.014185568317770958,
0.16906540095806122,
-0.018363961949944496,
0.06761116534471512,
-0.06523291766643524,
0.09211672097444534,
0.07523063570261002,
-0.12523621320724487,
-0.014110325835645199,
0.10888907313346863,
-0.10123597085475922,
-0.03990233317017555,
-0.0059182713739573956,
0.013306901790201664,
-0.07318214327096939,
-0.043295908719301224,
-0.06346414983272552,
-0.09578795731067657,
0.009931723587214947,
-0.006071762181818485,
-0.008314031176269054,
0.0414569266140461,
-0.06135740131139755,
-0.008514724671840668,
-0.10659448802471161,
0.03423621878027916,
0.05460566654801369,
0.08629174530506134,
-0.06786422431468964,
0.1004471629858017,
0.0177261084318161,
0.061712414026260376,
-0.024603065103292465,
-0.0373145155608654,
-0.03331589698791504,
-0.008946327492594719,
-0.20547254383563995,
-0.0405886285007,
-0.0632924810051918,
-0.02324260212481022,
-0.013354682363569736,
0.0479341559112072,
-0.012451106682419777,
0.06479788571596146,
-0.024692779406905174,
-0.06482298672199249,
-0.07438943535089493,
0.04004959389567375,
-0.05483225733041763,
0.0013064516242593527,
0.025137273594737053,
-0.06120416149497032,
0.09380534291267395,
-0.04313916712999344,
0.022508734837174416,
-0.005490412935614586,
-0.033369142562150955,
0.018280794844031334,
0.01419833954423666,
0.02976207435131073,
0.015383352525532246,
-0.08087090402841568,
-0.0010698011610656977,
-0.012127094902098179,
-0.03337530791759491,
0.02785990573465824,
-0.033488061279058456,
-0.11812227219343185,
0.007901354692876339,
-0.05819922313094139,
-0.0363357849419117,
-0.09439845383167267,
0.07501298934221268,
0.10860484838485718,
0.04881853610277176,
0.15319597721099854,
-0.029754767194390297,
0.06199188902974129,
-0.16111138463020325,
-0.06054558977484703,
-0.017501331865787506,
0.026544200256466866,
-0.012080644257366657,
-0.0199598278850317,
0.03890810161828995,
-0.05756152421236038,
0.12097007781267166,
-0.029017392545938492,
0.03747802972793579,
0.008489408530294895,
0.08406227082014084,
0.0698620080947876,
0.00890191737562418,
0.059511568397283554,
-0.011077259667217731,
-0.023815855383872986,
0.016456734389066696,
0.030858395621180534,
-0.03932800516486168,
0.06243110075592995,
0.0391971692442894,
0.0006559428293257952,
0.07430294156074524,
0.09119842201471329,
-0.00033613861887715757,
0.011093392968177795,
-0.13279467821121216,
-0.0028338769916445017,
-0.04618353769183159,
0.058021675795316696,
-0.018989408388733864,
0.09288723021745682,
0.17503781616687775,
-0.14538303017616272,
0.13368166983127594,
-0.01183820329606533,
-0.09677499532699585,
-0.058060817420482635,
-0.03513362631201744,
-0.04503932222723961,
-0.08330386877059937,
-0.00919194333255291,
-0.11273080110549927,
0.008742687292397022,
0.09925375133752823,
-0.0036374598275870085,
-0.03989405930042267,
0.16482087969779968,
-0.077020563185215,
-0.08802201598882675,
0.0831880271434784,
-0.0011982303112745285,
0.06955691426992416,
0.08796711266040802,
0.035576339811086655,
0.0662149116396904,
-0.08414784073829651,
0.10877427458763123,
0.038320332765579224,
0.023433728143572807,
-0.0075521692633628845,
-0.007027889601886272,
-0.04810281842947006,
-0.009682826697826385,
0.020977484062314034,
0.03125809133052826,
0.14744192361831665,
0.03397723659873009,
-0.07092809677124023,
-0.03875034302473068,
0.18642359972000122,
-0.04105431213974953,
0.004623702727258205,
-0.1142655536532402,
0.23157553374767303,
-0.026973634958267212,
-0.008345064707100391,
0.04541181027889252,
-0.0884285643696785,
-0.007349404972046614,
0.09444604068994522,
0.1505618393421173,
-0.014472650364041328,
-0.0542389377951622,
-0.030962856486439705,
-0.016510874032974243,
-0.07943298667669296,
0.16083773970603943,
0.033012811094522476,
0.233517587184906,
-0.04745858907699585,
0.13759593665599823,
-0.03121001459658146,
-0.040281638503074646,
-0.09818694740533829,
0.094979427754879,
0.013905170373618603,
0.03053838014602661,
-0.032664962112903595,
0.056393299251794815,
0.0361006036400795,
-0.08529216051101685,
0.025378096848726273,
-0.13265790045261383,
-0.12616059184074402,
0.027470631524920464,
-0.050479017198085785,
-0.03504566848278046,
0.14849264919757843,
0.0035977063234895468,
0.033994026482105255,
0.03439784795045853,
-0.05625960975885391,
-0.08994065970182419,
-0.08785955607891083,
0.10081616789102554,
0.000015078614524099976,
0.1739780306816101,
-0.0023975465446710587,
0.08826234191656113,
0.10611429065465927,
-0.013341872952878475,
-0.13970844447612762,
0.04896804690361023,
0.038434941321611404,
-0.04324830323457718,
0.04783615097403526,
0.07323591411113739,
-0.03262630105018616,
0.09559215605258942,
0.03181084245443344,
-0.056154560297727585,
0.01654157228767872,
-0.013102990575134754,
0.07981216162443161,
-0.103840172290802,
0.0480886735022068,
-0.05667920783162117,
0.1461239457130432,
0.14356523752212524,
-0.04610546678304672,
-0.031217923387885094,
-0.03523964807391167,
0.0014242925681173801,
0.024543488398194313,
0.19709496200084686,
-0.020043229684233665,
-0.11562889069318771,
0.04214571416378021,
0.08671002835035324,
0.03072328492999077,
-0.2629697620868683,
-0.09498247504234314,
-0.01807684637606144,
-0.05547308549284935,
0.0017131214262917638,
0.11543623358011246,
0.05391562357544899,
0.010218851268291473,
-0.01733064465224743,
-0.17619067430496216,
0.01605742610991001,
0.07154128700494766,
-0.08214739710092545,
-0.09083642810583115
] |
null | null |
transformers
|
# GPT-Neo-125M-APPS-all
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-Neo-125M-APPS-all is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the [Automated Programming Progress Standard (APPS) dataset](https://github.com/hendrycks/apps). The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found [here](https://huggingface.co/flax-community/gpt-neo-125M-apps).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_apps.py).
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
```bash
python run_clm_apps.py \
--output_dir $HOME/gpt-neo-125M-apps \
--model_name_or_path EleutherAI/gpt-neo-125B \
--dataset_name $HOME/gpt-code-clippy/data_processing/apps.py \
--dataset_config_name formatted \
--do_train --do_eval \
--block_size="1024" \
--per_device_train_batch_size="16" \
--per_device_eval_batch_size="16" \
--preprocessing_num_workers="16" \
--learning_rate="8e-5" \
--warmup_steps="800" \
--adam_beta1="0.9" \
--adam_beta2="0.98" \
--weight_decay="0.1" \
--overwrite_output_dir \
--num_train_epochs="5" \
--logging_steps="50" \
--eval_steps="2000" \
--report_to="wandb" \
--dtype="bfloat16" \
--save_strategy epoch \
--gradient_accumulation_steps 2 \
--all_data true \
```
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-code-clippy-125M-apps-alldata")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-code-clippy-125M-apps-alldata")
prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
ANSWER:
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
5. **Biases:** The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{"language": ["en", "python"], "license": "mit", "tags": ["gpt_neo", "code_synthesis"], "datasets": ["apps"]}
|
text-generation
|
flax-community/gpt-neo-125M-apps-all
|
[
"transformers",
"pytorch",
"jax",
"gpt_neo",
"text-generation",
"code_synthesis",
"dataset:apps",
"arxiv:2107.03374",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[
"en",
"python"
] |
TAGS
#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# GPT-Neo-125M-APPS-all
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-Neo-125M-APPS-all is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
This model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.
## Training procedure
The training script used to train this model can be found here.
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-125M-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-APPS-all is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# GPT-Neo-125M-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-APPS-all is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
71,
42,
47,
155,
60,
29,
35,
374,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# GPT-Neo-125M-APPS-all\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-Neo-125M-APPS-all is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.\n\nThis model is fine-tuned using most of the APPS dataset including both train and test split to explore the impact of this training task on model performance on other code synthesis evaluation metrics. A model fine-tuned on train set only can be found here.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.08529836684465408,
0.08706918358802795,
-0.00013262030552141368,
0.03385673090815544,
0.11245820671319962,
0.05771276727318764,
0.08569382131099701,
0.0728083997964859,
-0.0006422942969948053,
0.06994225084781647,
-0.003400209592655301,
0.01925697736442089,
0.06690365821123123,
0.1270759552717209,
0.05862472951412201,
-0.18981434404850006,
0.0016305380268022418,
-0.050033826380968094,
0.031324028968811035,
0.11259379237890244,
0.06842007488012314,
-0.049724698066711426,
0.06665173172950745,
0.029437389224767685,
-0.11042749881744385,
-0.040245868265628815,
-0.027618998661637306,
-0.009231001138687134,
0.12324374914169312,
0.06578299403190613,
0.06921646744012833,
0.03170096129179001,
0.027728432789444923,
-0.15041840076446533,
0.012688610702753067,
0.09307334572076797,
0.03576497361063957,
0.06312728673219681,
0.07694831490516663,
0.060012467205524445,
0.16489019989967346,
-0.021346796303987503,
0.0634537860751152,
0.08203213661909103,
-0.03889608010649681,
-0.06933913379907608,
-0.13020671904087067,
-0.006198444869369268,
0.1549144685268402,
0.08556520193815231,
-0.011185890063643456,
0.14484842121601105,
-0.05752620846033096,
0.05258560925722122,
0.10747851431369781,
-0.20367015898227692,
-0.04259549081325531,
0.1278296858072281,
0.028801018372178078,
0.05828350782394409,
-0.04612131416797638,
-0.011751621961593628,
0.03434722498059273,
0.05717744678258896,
0.004382703918963671,
-0.032082583755254745,
0.04038841649889946,
-0.035498764365911484,
-0.1076355054974556,
-0.056203052401542664,
0.06318612396717072,
0.0341891385614872,
-0.09040755778551102,
-0.21645018458366394,
-0.03017815575003624,
0.019496863707900047,
0.05657336488366127,
-0.003383377566933632,
-0.029255995526909828,
0.02130085974931717,
0.00808977335691452,
-0.07556638866662979,
-0.11366786062717438,
-0.05429850518703461,
-0.005610556807368994,
0.05712210759520531,
0.01053941436111927,
0.04523789882659912,
0.008200895972549915,
0.11910708993673325,
-0.055671922862529755,
-0.021539662033319473,
-0.0487939715385437,
-0.06812819093465805,
-0.18061502277851105,
-0.040216751396656036,
-0.02197587676346302,
-0.21645745635032654,
-0.035825569182634354,
0.07075759768486023,
-0.012303906492888927,
0.06161687895655632,
0.05533160641789436,
0.0005149308708496392,
0.061479538679122925,
0.17960891127586365,
-0.029533449560403824,
-0.044667888432741165,
0.0012843952281400561,
0.008288010954856873,
0.016377167776226997,
-0.04836907237768173,
-0.03385612741112709,
-0.03904768079519272,
-0.004013101104646921,
0.09765412658452988,
0.024941543117165565,
0.008577431552112103,
-0.10435963422060013,
-0.04695089906454086,
0.20966200530529022,
-0.07137510925531387,
0.0026529263705015182,
-0.0042270636186003685,
0.013846872374415398,
-0.028690334409475327,
0.03291584923863411,
-0.004825749900192022,
-0.07068101316690445,
0.03715239837765694,
-0.04418020695447922,
-0.0800061896443367,
-0.08424711972475052,
-0.054171960800886154,
0.025029823184013367,
-0.03934215381741524,
-0.04019177705049515,
-0.06042397394776344,
-0.24053923785686493,
-0.03872467577457428,
0.01995784230530262,
-0.00689965533092618,
-0.05631281062960625,
0.028251888230443,
-0.0005279004108160734,
-0.04461938142776489,
-0.007012460846453905,
0.07835538685321808,
-0.0077603585086762905,
0.0314452089369297,
-0.05879063531756401,
0.028678543865680695,
0.02789493463933468,
0.01924617402255535,
-0.046458013355731964,
0.07200900465250015,
-0.1391977071762085,
0.12849654257297516,
-0.024936819449067116,
-0.0460720919072628,
-0.059081483632326126,
-0.045108553022146225,
0.029216310009360313,
-0.007622346747666597,
0.008603433147072792,
0.07217960059642792,
-0.09452910721302032,
-0.03727392852306366,
0.057944636791944504,
-0.07843641936779022,
-0.043586499989032745,
0.13689197599887848,
-0.03485700860619545,
0.0779009535908699,
0.0884815901517868,
0.08655121922492981,
0.16271133720874786,
0.007203286979347467,
-0.0841655507683754,
0.0006103997002355754,
-0.03197438269853592,
0.11213574558496475,
0.06456870585680008,
-0.0035854631569236517,
0.016458304598927498,
0.05954086035490036,
-0.061221178621053696,
0.04888264089822769,
-0.010406285524368286,
-0.05211498215794563,
-0.04822578653693199,
-0.12483685463666916,
0.07281865179538727,
-0.024345526471734047,
0.012987641617655754,
-0.030991412699222565,
-0.08853814750909805,
-0.046107638627290726,
0.160883367061615,
-0.048455268144607544,
-0.026298943907022476,
-0.06287786364555359,
-0.06158106401562691,
-0.015205254778265953,
0.023727472871541977,
-0.21443060040473938,
-0.0705394521355629,
0.06663563847541809,
-0.10153290629386902,
0.004042719956487417,
-0.017418378964066505,
0.023400407284498215,
0.08743824809789658,
-0.04321993142366409,
-0.0194473247975111,
-0.09344258159399033,
-0.022240249440073967,
-0.049518704414367676,
-0.03383469954133034,
-0.10705425590276718,
-0.04244742542505264,
0.20895762741565704,
-0.1304527223110199,
0.007607095409184694,
0.0029733155388385057,
0.1210760772228241,
0.04257480427622795,
-0.09477967768907547,
0.04964723438024521,
0.03100544959306717,
-0.03275458887219429,
-0.06682267040014267,
-0.023108212277293205,
0.027786199003458023,
-0.05471168830990791,
0.015009121038019657,
-0.12207139283418655,
-0.09994857013225555,
-0.004231433849781752,
0.11741386353969574,
-0.10569331794977188,
0.05377592518925667,
-0.04185560718178749,
-0.05683521553874016,
-0.12095634639263153,
-0.08278866112232208,
0.19610010087490082,
0.059811729937791824,
0.07179838418960571,
-0.06739403307437897,
0.010584491305053234,
-0.01101680751889944,
0.03857790678739548,
0.014839152805507183,
0.10507132858037949,
-0.05435137823224068,
-0.04160019010305405,
0.015559175983071327,
0.0017942142440006137,
0.024129442870616913,
0.1820874959230423,
0.0135804433375597,
-0.08076640963554382,
-0.05521196126937866,
0.030179092660546303,
0.01945897564291954,
0.016580965369939804,
-0.08648920804262161,
-0.002859397791326046,
0.06017245352268219,
-0.0018291157903149724,
0.04350461810827255,
-0.11223971843719482,
0.032067447900772095,
0.03884858638048172,
-0.0019782197196036577,
-0.045679111033678055,
-0.00620981864631176,
0.008395751006901264,
0.04805564135313034,
-0.003368682460859418,
0.09010308235883713,
0.0029738126322627068,
-0.031840115785598755,
-0.1389113962650299,
0.15709060430526733,
-0.08558604121208191,
-0.23912647366523743,
-0.17929965257644653,
0.10979052633047104,
-0.0732136219739914,
-0.04641677811741829,
-0.00037182282540015876,
-0.02381189912557602,
-0.05983852595090866,
-0.084141805768013,
0.07075469195842743,
-0.013958548195660114,
-0.06961481273174286,
0.10164235532283783,
-0.010434228926897049,
-0.05547262728214264,
-0.12817437946796417,
0.023738481104373932,
0.07032828778028488,
-0.06291414052248001,
0.05897216126322746,
-0.07368164509534836,
0.08412298560142517,
0.18173252046108246,
-0.010035950690507889,
-0.03006460703909397,
-0.021794019266963005,
0.18088723719120026,
-0.12016312777996063,
0.0829901471734047,
0.1979425847530365,
0.01921214908361435,
0.051836125552654266,
0.007943115197122097,
-0.0032452887389808893,
-0.03035898506641388,
0.07682032883167267,
0.0241234190762043,
-0.08190830796957016,
-0.2176879346370697,
-0.09240247309207916,
-0.05979319289326668,
0.04387935996055603,
-0.011706297285854816,
0.008816721849143505,
0.029707444831728935,
0.02742101438343525,
-0.05707920342683792,
0.005615511443465948,
0.028656164184212685,
0.0843396857380867,
0.02956405282020569,
-0.020643845200538635,
0.03196690231561661,
-0.07429847866296768,
0.0062717897817492485,
0.13493531942367554,
0.052175123244524,
0.21369647979736328,
-0.06221771985292435,
0.25509729981422424,
0.000030004122891114093,
0.0574396587908268,
-0.012214905582368374,
0.08022845536470413,
0.014289082027971745,
0.0039914981462061405,
-0.021752670407295227,
-0.05120493099093437,
-0.049894142895936966,
0.07424701005220413,
0.03376871719956398,
-0.05445673689246178,
-0.019308339804410934,
0.027268392965197563,
-0.022067254409193993,
0.2234979271888733,
-0.011347458697855473,
-0.126239612698555,
-0.10575906932353973,
0.039870429784059525,
-0.019449006766080856,
-0.09930151700973511,
-0.029262011870741844,
0.08278175443410873,
-0.18249323964118958,
0.03554123640060425,
-0.05918446183204651,
0.08882979303598404,
-0.0698060616850853,
-0.017575854435563087,
-0.1104896068572998,
0.07205643504858017,
0.003880417672917247,
0.1214277520775795,
-0.08582502603530884,
-0.0063941399566829205,
0.014152409508824348,
0.09959983080625534,
-0.10321073234081268,
0.025734422728419304,
0.004028302151709795,
0.029634937644004822,
0.0817566066980362,
0.008717832155525684,
-0.026950042694807053,
-0.07521384954452515,
-0.05966586992144585,
0.0026376796886324883,
0.02284950390458107,
-0.011401712894439697,
0.04146363213658333,
0.004640035796910524,
0.007869374938309193,
-0.020311811938881874,
-0.036465417593717575,
-0.08890882134437561,
-0.1731438934803009,
0.09352856129407883,
0.006442395504564047,
-0.033237989991903305,
-0.10407949984073639,
-0.04919667914509773,
0.06045035272836685,
0.2257232964038849,
0.05915128439664841,
-0.06700525432825089,
-0.0986635610461235,
-0.10309028625488281,
0.1734948307275772,
-0.06770937889814377,
0.007111037150025368,
0.0033812634646892548,
0.17822039127349854,
-0.06864967942237854,
-0.0588407889008522,
-0.013591251336038113,
-0.05256495624780655,
-0.18329907953739166,
-0.02299269288778305,
0.025511443614959717,
0.02856910042464733,
0.07187527418136597,
0.004420738201588392,
-0.03508396074175835,
-0.06072196736931801,
-0.0998079851269722,
-0.018217092379927635,
0.14240191876888275,
0.0029372849967330694,
0.11440406739711761,
0.014280804432928562,
-0.04625503346323967,
-0.02816252037882805,
-0.11144264042377472,
0.09470102190971375,
0.24572566151618958,
-0.10762128233909607,
0.11106428503990173,
0.12786994874477386,
-0.021584507077932358,
-0.2066819816827774,
-0.00032879901118576527,
0.09836683422327042,
0.09733037650585175,
0.08402454107999802,
-0.167952299118042,
0.01575964316725731,
0.09519825130701065,
-0.020208103582262993,
0.045345745980739594,
-0.3499942719936371,
-0.11387605220079422,
0.03590938448905945,
0.03801422938704491,
0.04218022897839546,
-0.08088557422161102,
0.0010107214329764247,
-0.07801788300275803,
-0.19679947197437286,
0.06086857244372368,
-0.0857759565114975,
0.10715048760175705,
-0.03768925741314888,
0.051260121166706085,
0.048658113926649094,
-0.07492569833993912,
0.15118858218193054,
-0.013553271070122719,
0.05555086210370064,
-0.0010767413768917322,
0.018187817186117172,
0.1411348134279251,
-0.07811827957630157,
0.16246068477630615,
-0.03564074635505676,
0.09271182119846344,
-0.12208093702793121,
-0.05225842818617821,
-0.09030471742153168,
0.005619751289486885,
-0.014936278574168682,
-0.07881863415241241,
-0.09287302196025848,
0.048416994512081146,
0.03234037756919861,
-0.00914599560201168,
-0.045179229229688644,
0.026627900078892708,
-0.004882235545665026,
0.20725876092910767,
0.108131542801857,
-0.10347551107406616,
-0.10689812898635864,
-0.022326456382870674,
-0.014439142309129238,
0.019238684326410294,
-0.12266964465379715,
0.013967466540634632,
0.1058112233877182,
0.07674622535705566,
0.12635193765163422,
-0.010040607303380966,
-0.16892074048519135,
0.058858927339315414,
0.027272773906588554,
-0.06293267011642456,
-0.24742765724658966,
-0.06374902278184891,
0.09306526184082031,
-0.14022070169448853,
0.02532755583524704,
0.0985281765460968,
-0.011530610732734203,
-0.04180034250020981,
-0.0028890366666018963,
0.03343891352415085,
0.010034458711743355,
0.15928779542446136,
-0.046090900897979736,
0.04340161383152008,
-0.07030190527439117,
0.0884515568614006,
0.10154376924037933,
-0.031101329252123833,
-0.015981055796146393,
0.048204097896814346,
-0.11432597786188126,
-0.027391763404011726,
-0.026025960221886635,
0.02097375877201557,
-0.07343824952840805,
-0.03131839632987976,
-0.03930974379181862,
-0.06371519714593887,
0.008974961936473846,
0.005736187566071749,
-0.01769387163221836,
0.08828277140855789,
-0.05035540834069252,
-0.017494067549705505,
-0.08086579293012619,
0.04039061442017555,
0.05371936410665512,
0.08767037838697433,
-0.08212529867887497,
0.08815266937017441,
0.0037001874297857285,
0.08009877055883408,
-0.014573607593774796,
-0.07134100049734116,
-0.029437800869345665,
-0.011640976183116436,
-0.20658180117607117,
-0.028351884335279465,
-0.05978234484791756,
-0.01791686750948429,
-0.005232963711023331,
0.03184434771537781,
0.006561203394085169,
0.06448746472597122,
-0.04435103014111519,
-0.06760762631893158,
-0.07854465395212173,
0.061305563896894455,
-0.07636762410402298,
-0.02361544780433178,
0.058352693915367126,
-0.06904271245002747,
0.09177889674901962,
-0.023083128035068512,
0.007509832736104727,
0.010279892943799496,
-0.020313680171966553,
0.028198285028338432,
0.016719015315175056,
0.04869049787521362,
-0.0072661577723920345,
-0.10051232576370239,
-0.03293803706765175,
-0.0007895882590673864,
-0.02040827088057995,
0.03449603542685509,
0.018670693039894104,
-0.08598236739635468,
0.0285344235599041,
-0.024595050141215324,
-0.08147714287042618,
-0.08596546202898026,
0.07482045143842697,
0.08268830925226212,
0.052724942564964294,
0.142689511179924,
0.0005422275862656534,
0.035440586507320404,
-0.16451619565486908,
-0.0492171049118042,
-0.00520381098613143,
0.021419361233711243,
0.03017849288880825,
-0.02536534145474434,
0.011067084968090057,
-0.029934512451291084,
0.12052442133426666,
-0.0680486187338829,
0.05013558268547058,
-0.02341054566204548,
0.07650188356637955,
0.027553386986255646,
-0.010575748048722744,
0.06316670030355453,
-0.024428032338619232,
-0.015307670459151268,
0.01983642391860485,
0.02550486847758293,
-0.0401015467941761,
0.02405429631471634,
0.03142831102013588,
0.0032549204770475626,
0.0345003604888916,
0.037575095891952515,
0.041443388909101486,
-0.0123159009963274,
-0.1419076919555664,
-0.03172152861952782,
-0.07688254117965698,
0.05646468698978424,
-0.06037426367402077,
0.08334548026323318,
0.12336421012878418,
-0.13070222735404968,
0.13800181448459625,
-0.030693091452121735,
-0.09962526708841324,
-0.030492650344967842,
-0.08607538789510727,
-0.03607730194926262,
-0.05662979930639267,
-0.016234872862696648,
-0.08885746449232101,
0.035125020891427994,
0.10719431191682816,
-0.025573940947651863,
-0.055326882749795914,
0.15030522644519806,
-0.0843106061220169,
-0.08601586520671844,
0.04285457357764244,
-0.016162842512130737,
0.03823075070977211,
0.06278274208307266,
0.05010378733277321,
0.05310562252998352,
0.020842211320996284,
0.1306726038455963,
0.05566374957561493,
0.01140678022056818,
-0.03601854294538498,
0.003985134419053793,
-0.05031540244817734,
-0.003970445599406958,
-0.003939857240766287,
-0.025142544880509377,
0.12917128205299377,
0.04726588726043701,
-0.03382275253534317,
-0.0098460977897048,
0.17832691967487335,
-0.05236455425620079,
-0.029141465201973915,
-0.13649965822696686,
0.16423971951007843,
-0.02025877684354782,
-0.014762485399842262,
0.1017392948269844,
-0.10404789447784424,
-0.007275199983268976,
0.09371091425418854,
0.14538584649562836,
0.011005876585841179,
-0.039260946214199066,
-0.006000793073326349,
-0.003616784233599901,
-0.05377636477351189,
0.16492106020450592,
0.025097783654928207,
0.18164442479610443,
-0.051896970719099045,
0.17205701768398285,
-0.008697287179529667,
-0.02298404648900032,
-0.07315001636743546,
0.10676087439060211,
-0.04505614936351776,
0.006430608686059713,
-0.03682626038789749,
0.06523076444864273,
0.021396508440375328,
-0.11747048795223236,
-0.04159221798181534,
-0.11562991142272949,
-0.12828151881694794,
0.041589438915252686,
-0.060121480375528336,
-0.06336283683776855,
0.12974558770656586,
-0.0033862648997455835,
0.04081329330801964,
0.1652889996767044,
-0.04695437476038933,
-0.0630057230591774,
-0.05034637451171875,
0.10681301355361938,
-0.07820762693881989,
0.1742704212665558,
-0.00026007965789176524,
0.0750512108206749,
0.08695591986179352,
-0.023396406322717667,
-0.10216627269983292,
0.04241592809557915,
0.028658395633101463,
0.0201867762953043,
0.046308353543281555,
0.1346665620803833,
-0.04083281755447388,
0.09452527016401291,
0.011249717324972153,
-0.06146743893623352,
0.014368835836648941,
0.04326718673110008,
0.10829085856676102,
-0.12210295349359512,
0.04914740473031998,
-0.06934626400470734,
0.1456352174282074,
0.102010577917099,
-0.03048485331237316,
-0.008956978097558022,
-0.017591584473848343,
-0.007942602038383484,
0.008590027689933777,
0.20961394906044006,
-0.02793371118605137,
-0.14514899253845215,
0.009005681611597538,
0.03329315409064293,
0.06566118448972702,
-0.26295632123947144,
-0.09119002521038055,
0.005157948937267065,
-0.06878263503313065,
0.031307753175497055,
0.11358322948217392,
0.04314373433589935,
0.0012093271361663938,
-0.010456527583301067,
-0.14533565938472748,
-0.0019835690036416054,
0.07233863323926926,
-0.06678199023008347,
-0.07796080410480499
] |
null | null |
transformers
|
# GPT-Neo-125M-APPS
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-Neo-125M-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the [Automated Programming Progress Standard (APPS) dataset](https://github.com/hendrycks/apps). The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_apps.py).
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
```bash
python run_clm_apps.py \
--output_dir $HOME/gpt-neo-125M-apps \
--model_name_or_path EleutherAI/gpt-neo-125M \
--dataset_name $HOME/gpt-code-clippy/data_processing/apps.py \
--dataset_config_name formatted \
--do_train --do_eval \
--block_size="1024" \
--per_device_train_batch_size="16" \
--per_device_eval_batch_size="16" \
--preprocessing_num_workers="16" \
--learning_rate="8e-5" \
--warmup_steps="800" \
--adam_beta1="0.9" \
--adam_beta2="0.98" \
--weight_decay="0.1" \
--overwrite_output_dir \
--num_train_epochs="5" \
--logging_steps="50" \
--eval_steps="2000" \
--report_to="wandb" \
--dtype="bfloat16" \
--save_strategy epoch \
--gradient_accumulation_steps 2 \
```
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-apps")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-apps")
prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
ANSWER:
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
5. **Biases:** The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{"language": ["en", "code"], "license": "mit", "tags": ["gpt_neo", "code_synthesis"], "datasets": ["apps"], "language_details": "python code"}
|
text-generation
|
flax-community/gpt-neo-125M-apps
|
[
"transformers",
"pytorch",
"jax",
"gpt_neo",
"text-generation",
"code_synthesis",
"en",
"code",
"dataset:apps",
"arxiv:2107.03374",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[
"en",
"code"
] |
TAGS
#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #en #code #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Neo-125M-APPS
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-Neo-125M-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.
## Training data
The model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.
## Training procedure
The training script used to train this model can be found here.
Training is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is finetuned to solve programming problems given a text description and optional starter code.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt
formatting is different from that used in APPS dataset.
GPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-125M-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #en #code #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Neo-125M-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.",
"## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n\n5. Biases: The model is trained on data containing prompt questions formatted in specific way. The performance of the model can be worse if the prompt \n\nformatting is different from that used in APPS dataset.\n\nGPT-CC is finetuned GPT-Neo and might have inhereted biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
71,
40,
45,
98,
60,
29,
35,
374,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt_neo #text-generation #code_synthesis #en #code #dataset-apps #arxiv-2107.03374 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Neo-125M-APPS\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-Neo-125M-APPS is a GPT-Neo-125M finetuned on APPS dataset. This model is specialized to solve programming tasks.## Training data\n\nThe model is trained on the Automated Programming Progress Standard (APPS) dataset. The dataset consists of 10,000 coding problems in total, with 131,836 test cases for checking solutions and 232,444 ground-truth solutions written by humans. Problems can be complicated, as the average length of a problem is 293.2 words. The data are split evenly into training and test sets, with 5,000 problems each.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTraining is done for 5 epochs using AdamW optimizer and leaner decay learning rate schedule with 800 warmup steps. To reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is finetuned to solve programming problems given a text description and optional starter code.### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.06992904096841812,
0.1491709053516388,
0.0010916645405814052,
0.01891256310045719,
0.0981321632862091,
0.0556841678917408,
0.06904952973127365,
0.06566659361124039,
0.004111138172447681,
0.07161754369735718,
0.08209739625453949,
-0.024675682187080383,
0.0757320448756218,
0.07509433478116989,
0.011069661937654018,
-0.23845864832401276,
0.014820362441241741,
0.006789070088416338,
0.06077350303530693,
0.08277985453605652,
0.026550445705652237,
-0.042878635227680206,
0.04728071391582489,
0.004376884084194899,
-0.07412861287593842,
-0.034455228596925735,
-0.0565495565533638,
0.014538655988872051,
0.14003923535346985,
0.054551661014556885,
0.07956064492464066,
0.03781777247786522,
0.018086111173033714,
-0.1380043476819992,
0.02149512991309166,
0.07265560328960419,
0.0035313854459673166,
0.07474563270807266,
0.017051037400960922,
0.13709187507629395,
0.24404995143413544,
-0.0859919935464859,
0.04058559611439705,
0.04079078510403633,
-0.0652613490819931,
-0.05081523209810257,
-0.0918295830488205,
-0.03626874461770058,
0.14823296666145325,
0.09152760356664658,
-0.0143924281001091,
0.060666635632514954,
-0.08570820093154907,
0.038362011313438416,
0.15601521730422974,
-0.2931704521179199,
-0.06445423513650894,
0.06080528721213341,
0.009259955026209354,
0.0649694949388504,
-0.030710993334650993,
0.002345602959394455,
0.042001787573099136,
0.042902443557977676,
0.04640055447816849,
-0.046632811427116394,
0.1253119558095932,
-0.04207553714513779,
-0.11701355129480362,
-0.058795999735593796,
-0.02975187450647354,
0.028940096497535706,
-0.04256580024957657,
-0.21626845002174377,
-0.07173586636781693,
-0.030521001666784286,
0.03437168896198273,
0.00706858467310667,
-0.0208806861191988,
0.01741483248770237,
0.0022072605788707733,
-0.11223281174898148,
-0.0766439363360405,
-0.0489940345287323,
0.016809597611427307,
0.07990572601556778,
0.027901513502001762,
0.03518301993608475,
0.03538317233324051,
0.11339928954839706,
-0.045514728873968124,
-0.035351283848285675,
-0.007841745391488075,
0.014138719998300076,
-0.1342928409576416,
-0.022822625935077667,
-0.018382426351308823,
-0.11896641552448273,
-0.015802955254912376,
0.09396914392709732,
-0.005311727058142424,
0.07899531722068787,
0.13653677701950073,
0.0022386435884982347,
-0.0038078632205724716,
0.16924038529396057,
-0.004859431646764278,
-0.1327028125524521,
0.04109124094247818,
0.07923737168312073,
0.04519364982843399,
-0.05253461003303528,
-0.09058528393507004,
-0.12858538329601288,
0.02787737175822258,
0.10276196151971817,
0.019149050116539,
0.0002449089952278882,
-0.06154245138168335,
-0.017634468153119087,
0.1866401731967926,
-0.04334266856312752,
-0.02473309077322483,
-0.027426809072494507,
-0.02599315345287323,
-0.004592886194586754,
0.016082698479294777,
-0.016480347141623497,
-0.10441923141479492,
0.046259019523859024,
-0.04285286366939545,
-0.08072242885828018,
-0.07134262472391129,
-0.05098230764269829,
-0.008664521388709545,
-0.13384875655174255,
-0.03309095650911331,
-0.018999464809894562,
-0.21262745559215546,
-0.05842875689268112,
0.03780614584684372,
0.01191189419478178,
-0.10304515808820724,
-0.03850962966680527,
0.01952115260064602,
-0.013899401761591434,
-0.03664921224117279,
0.06565830856561661,
-0.040865037590265274,
0.019714511930942535,
-0.05412300303578377,
0.07972355931997299,
0.013200271874666214,
0.0050217523239552975,
-0.048385050147771835,
0.058637164533138275,
-0.08966906368732452,
0.07333631813526154,
0.033381178975105286,
-0.0024599661119282246,
-0.07394620031118393,
-0.05773596465587616,
0.0732361301779747,
0.015240929089486599,
0.04947706311941147,
0.09892840683460236,
-0.026741620153188705,
-0.04354660585522652,
0.017455020919442177,
-0.07801850885152817,
-0.03759433701634407,
0.1741618514060974,
-0.0013655296061187983,
0.017139939591288567,
0.10241521149873734,
0.039655499160289764,
0.11337354779243469,
-0.023208778351545334,
-0.07540325820446014,
0.0018610297702252865,
-0.02023547701537609,
0.03958531841635704,
0.03168841823935509,
-0.0006541580078192055,
-0.04589258134365082,
0.033140629529953,
-0.055437758564949036,
0.06852249801158905,
-0.009210489690303802,
-0.056170690804719925,
-0.027541454881429672,
-0.09551199525594711,
0.01054308470338583,
-0.011750862933695316,
-0.014067789539694786,
-0.01847914792597294,
-0.08646441996097565,
-0.03484106808900833,
0.13046817481517792,
-0.07017945498228073,
-0.01811027340590954,
-0.018896516412496567,
-0.058719735592603683,
-0.054400209337472916,
0.016897736117243767,
-0.14623194932937622,
-0.08837248384952545,
0.03271472826600075,
-0.01986435241997242,
0.032307133078575134,
-0.0016927241813391447,
0.04315861314535141,
0.08388853073120117,
-0.055210188031196594,
-0.07777037471532822,
-0.049518976360559464,
-0.0318896546959877,
-0.08239526301622391,
-0.017721008509397507,
-0.08826497942209244,
-0.012512043118476868,
0.11544565856456757,
-0.15632674098014832,
-0.003916282672435045,
0.006010614801198244,
0.08690373599529266,
0.03462144359946251,
-0.06762336939573288,
0.06737430393695831,
0.05555329844355583,
0.015723703429102898,
-0.09056475013494492,
0.03215401992201805,
0.047246675938367844,
-0.13913799822330475,
-0.003452281467616558,
-0.1564241349697113,
-0.06298857927322388,
0.018463067710399628,
0.13920696079730988,
-0.0891522616147995,
-0.03439263254404068,
-0.05441410839557648,
-0.07810910046100616,
-0.08134762942790985,
-0.021156469359993935,
0.27327707409858704,
0.03825859725475311,
0.09852911531925201,
-0.0603068470954895,
0.01834389939904213,
-0.042626406997442245,
0.02991638146340847,
0.0335124172270298,
0.11669366806745529,
-0.04996361583471298,
-0.0928783044219017,
-0.008756876923143864,
-0.02661554142832756,
-0.05466283857822418,
0.18767379224300385,
0.038728829473257065,
-0.09674066305160522,
-0.02613695338368416,
0.05412956327199936,
0.040739160031080246,
0.04534156620502472,
-0.0837167501449585,
-0.03716695308685303,
0.022419825196266174,
-0.034522317349910736,
0.0363885872066021,
-0.1518213450908661,
0.02276754565536976,
0.007091254461556673,
-0.03262024000287056,
0.028900716453790665,
-0.005891725420951843,
-0.03620624914765358,
0.04969671741127968,
0.06333449482917786,
0.008553535677492619,
0.023969385772943497,
-0.03613932058215141,
-0.09845592826604843,
0.1766194999217987,
-0.035420335829257965,
-0.2107766717672348,
-0.11556412279605865,
0.14030739665031433,
-0.022750373929739,
-0.050422146916389465,
-0.009527736343443394,
-0.0801418349146843,
-0.07102010399103165,
-0.07001877576112747,
0.08770447969436646,
-0.030395299196243286,
-0.014627317897975445,
0.010390537790954113,
-0.03556039184331894,
-0.05573675036430359,
-0.08516987413167953,
0.010962697677314281,
-0.005650852341204882,
-0.09754800796508789,
0.04785683751106262,
-0.10255942493677139,
0.07147065550088882,
0.13699989020824432,
-0.006729146931320429,
-0.014743231236934662,
-0.026153506711125374,
0.16425497829914093,
-0.10981101542711258,
0.0793471708893776,
0.2022465467453003,
0.03272442892193794,
0.07867854833602905,
0.03136879950761795,
0.013442947529256344,
-0.04399712383747101,
0.07508990913629532,
0.03809153661131859,
-0.09880327433347702,
-0.1759541630744934,
-0.07739555090665817,
-0.07007046788930893,
0.04317017272114754,
0.09002646803855896,
0.01373276486992836,
-0.0015151900006458163,
0.042858757078647614,
-0.08652319759130478,
0.05656150355935097,
0.02010495401918888,
0.09765077382326126,
0.032755449414253235,
0.03660425543785095,
0.01476698275655508,
-0.058927521109580994,
-0.028315972536802292,
0.09672141075134277,
0.06126052886247635,
0.22382107377052307,
-0.07778143137693405,
0.35006919503211975,
-0.054763518273830414,
0.061803337186574936,
0.026799015700817108,
0.07412031292915344,
-0.02384117990732193,
-0.031468819826841354,
-0.017811957746744156,
-0.03823705390095711,
-0.04772133380174637,
0.09329114854335785,
0.042269039899110794,
-0.04132944718003273,
-0.05552830547094345,
0.05140732601284981,
-0.029303230345249176,
0.25593069195747375,
-0.008465824648737907,
-0.16421960294246674,
-0.09280338138341904,
0.010611974634230137,
-0.03521692752838135,
-0.1077490895986557,
0.004534334409981966,
0.10458863526582718,
-0.1346714347600937,
0.001304497360251844,
-0.06714566051959991,
0.10813382267951965,
-0.18961329758167267,
0.003085099160671234,
-0.09955231100320816,
0.16939345002174377,
-0.022963471710681915,
0.08363272249698639,
-0.17079955339431763,
0.012309310957789421,
0.036214809864759445,
0.15407301485538483,
-0.07869335263967514,
-0.00039931893115863204,
0.038731418550014496,
-0.024363871663808823,
0.09615139663219452,
0.008309847675263882,
-0.04668976366519928,
-0.05331140384078026,
-0.1036350429058075,
0.0368962325155735,
0.006718801334500313,
0.007250256836414337,
0.05209096521139145,
0.011027081869542599,
0.02055869624018669,
-0.013531062752008438,
-0.13676543533802032,
-0.04920794069766998,
-0.17242582142353058,
0.046312253922224045,
0.03518737852573395,
-0.021901940926909447,
-0.07735209167003632,
-0.03487279266119003,
0.1004282534122467,
0.28171712160110474,
0.0639231875538826,
-0.11223733425140381,
-0.10823845863342285,
-0.08610507100820541,
0.13731153309345245,
-0.0963699147105217,
0.012070611119270325,
0.026459388434886932,
0.11813542991876602,
-0.08429446816444397,
-0.06219886243343353,
0.048813510686159134,
-0.026701582595705986,
-0.15278682112693787,
-0.024736313149333,
0.025445539504289627,
0.057769011706113815,
0.08427819609642029,
-0.018963707610964775,
0.018555762246251106,
-0.08600874990224838,
-0.0970689058303833,
-0.000263077556155622,
0.09107483923435211,
0.02815127931535244,
0.07849722355604172,
0.04789160564541817,
0.028829604387283325,
0.00494666863232851,
-0.07649417966604233,
0.1342357099056244,
0.22192136943340302,
-0.06014605611562729,
0.05971473455429077,
0.14941629767417908,
-0.0034114732407033443,
-0.2221837192773819,
-0.0036134342662990093,
0.07458854466676712,
0.11003247648477554,
-0.055211178958415985,
-0.23601160943508148,
0.039909664541482925,
0.08466263860464096,
-0.005418670829385519,
0.10093316435813904,
-0.3230835497379303,
-0.09627887606620789,
0.06356801837682724,
0.10749837011098862,
0.15968167781829834,
-0.05245160311460495,
0.007589810062199831,
-0.07308537513017654,
-0.10046747326850891,
0.0714033842086792,
-0.14610794186592102,
0.10970300436019897,
-0.04911639541387558,
0.044937748461961746,
0.0488625168800354,
-0.07195820659399033,
0.13142521679401398,
-0.035367172211408615,
0.04683296009898186,
0.007248190697282553,
-0.04326877370476723,
-0.03187263011932373,
-0.10584418475627899,
0.16294583678245544,
-0.13643261790275574,
0.06572843343019485,
-0.11620679497718811,
-0.06787577271461487,
-0.06296255439519882,
0.042621832340955734,
-0.015279717743396759,
-0.09722878783941269,
-0.08767729252576828,
0.04223964735865593,
0.05237460508942604,
0.0344657227396965,
0.04438202083110809,
-0.005190740805119276,
0.000723170000128448,
0.11598760634660721,
0.14484402537345886,
-0.11798448115587234,
-0.05800231918692589,
0.002134578302502632,
0.002676208969205618,
0.07513096928596497,
-0.09316086024045944,
-0.03062618337571621,
0.12945856153964996,
0.06563296914100647,
0.09555204957723618,
-0.002186746336519718,
-0.11361690610647202,
0.04138561710715294,
-0.004296287894248962,
-0.12637746334075928,
-0.1955147683620453,
-0.07294848561286926,
-0.030433952808380127,
-0.09093125909566879,
0.01298513263463974,
0.08530958741903305,
-0.03667454048991203,
-0.025066930800676346,
0.021229125559329987,
0.00973379984498024,
0.016396233811974525,
0.12739793956279755,
-0.014759759418666363,
0.056987445801496506,
-0.06903073191642761,
0.08311004936695099,
0.06866126507520676,
-0.07730355858802795,
0.02742251567542553,
0.13535647094249725,
-0.11770221590995789,
-0.026928408071398735,
-0.02595292590558529,
0.07849607616662979,
-0.1056116446852684,
-0.038938652724027634,
-0.11018375307321548,
-0.0558541975915432,
0.016169343143701553,
0.017747854813933372,
-0.01488357875496149,
0.04144929721951485,
-0.052775416523218155,
0.01636924035847187,
-0.1081000417470932,
0.016177212819457054,
0.018852589651942253,
0.06771740317344666,
-0.030226897448301315,
0.07333192974328995,
0.02028009667992592,
0.03964266926050186,
-0.013978628441691399,
-0.0551234595477581,
-0.07717026770114899,
-0.0008992682560347021,
-0.15079282224178314,
-0.026208601891994476,
-0.09323904663324356,
0.006766495294868946,
-0.02420121803879738,
0.048512209206819534,
0.03634239360690117,
0.06263454258441925,
-0.020733237266540527,
-0.04651520401239395,
-0.05691835656762123,
0.007150980643928051,
-0.02820209413766861,
0.0023678052239120007,
0.013406764715909958,
-0.07675730437040329,
0.09002180397510529,
-0.05582514777779579,
0.00393498782068491,
0.005646142177283764,
0.030050253495573997,
0.027797479182481766,
-0.004947698675096035,
0.00710650160908699,
0.005676960572600365,
-0.036335211247205734,
-0.05117340758442879,
-0.014509878121316433,
-0.012713533826172352,
0.006856723688542843,
0.033174239099025726,
-0.08384768664836884,
-0.0031822898890823126,
-0.06132020428776741,
-0.028270918875932693,
-0.10681407153606415,
0.0858122855424881,
0.10911529511213303,
0.03478822484612465,
0.1071079820394516,
-0.03440508991479874,
0.04622545465826988,
-0.1337853968143463,
-0.04613794758915901,
0.031236503273248672,
-0.012422137893736362,
-0.014151033014059067,
-0.02587469294667244,
0.05165360867977142,
-0.04618840292096138,
0.08750727027654648,
-0.09875089675188065,
0.06355253607034683,
-0.018601516261696815,
0.04784621298313141,
0.017024612054228783,
-0.014349749311804771,
0.11230603605508804,
0.005445121321827173,
-0.04798659309744835,
0.017740363255143166,
0.01956036128103733,
-0.033895090222358704,
0.09557601064443588,
0.07031838595867157,
-0.09110577404499054,
0.1371896117925644,
0.09135850518941879,
-0.022071152925491333,
0.002656262833625078,
-0.19423285126686096,
0.02992398664355278,
0.004863940179347992,
0.06998544931411743,
0.0031070851255208254,
0.1468285322189331,
0.20471635460853577,
-0.1355564445257187,
0.07315569370985031,
0.03248504921793938,
-0.10989082604646683,
-0.048795972019433975,
-0.12421828508377075,
-0.051139019429683685,
-0.0456269197165966,
0.009937197901308537,
-0.09925932437181473,
-0.014728080481290817,
0.10731731355190277,
0.01153347734361887,
-0.021708862856030464,
0.16062307357788086,
0.04898356273770332,
-0.05715429037809372,
0.09581750631332397,
-0.003990155179053545,
0.08195789158344269,
0.08170413225889206,
-0.026708370074629784,
0.050169967114925385,
-0.042595602571964264,
0.09072326123714447,
0.017668046057224274,
0.00541808782145381,
0.009654877707362175,
0.06525108963251114,
-0.054834168404340744,
-0.015816757455468178,
0.03177128732204437,
0.04942101240158081,
0.18359887599945068,
0.05287652835249901,
-0.0743042379617691,
-0.026793546974658966,
0.10808596760034561,
-0.07694163173437119,
0.0008752704598009586,
-0.10089709609746933,
0.26347485184669495,
-0.05051647871732712,
-0.022609423846006393,
0.07357930392026901,
-0.06746088713407516,
0.03565450757741928,
0.11756359785795212,
0.07775232940912247,
-0.034371595829725266,
-0.04118192940950394,
-0.019379911944270134,
-0.017424149438738823,
-0.06480013579130173,
0.1543188989162445,
0.04595346748828888,
0.237518310546875,
-0.0650172233581543,
0.04387280344963074,
-0.04426233842968941,
-0.06504306197166443,
-0.1113748848438263,
0.0617508590221405,
0.009298423305153847,
0.04646015912294388,
-0.053596675395965576,
0.03819199278950691,
-0.007739389315247536,
-0.13366252183914185,
0.0043511479161679745,
-0.06620508432388306,
-0.1245928406715393,
0.01701255701482296,
-0.04372372850775719,
-0.02700941637158394,
0.13471953570842743,
-0.009685556404292583,
0.04534535109996796,
0.054113518446683884,
-0.046328190714120865,
-0.08187931776046753,
-0.06543459743261337,
0.06537099182605743,
-0.0025255728978663683,
0.15774434804916382,
0.011350334621965885,
0.04926464706659317,
0.0884709358215332,
-0.008784869685769081,
-0.11728563159704208,
0.07929709553718567,
0.04001031816005707,
-0.07667512446641922,
0.02468985505402088,
0.108611099421978,
-0.05230610817670822,
0.07714977860450745,
0.036403268575668335,
-0.028772542253136635,
0.011945566162467003,
0.05069001764059067,
0.06016329303383827,
-0.1005532518029213,
0.027985818684101105,
-0.048934563994407654,
0.16081207990646362,
0.13263535499572754,
-0.014681489206850529,
-0.05103083699941635,
-0.024429893121123314,
-0.022139962762594223,
0.016764048486948013,
0.1580863893032074,
-0.03228812664747238,
-0.07854965329170227,
0.0017642505699768662,
0.0727657899260521,
0.0350569523870945,
-0.1908522993326187,
-0.08588787913322449,
-0.05437057465314865,
-0.03849346190690994,
-0.02117602527141571,
0.06504832953214645,
0.04110940545797348,
-0.0055043078027665615,
-0.017554577440023422,
-0.11340124160051346,
-0.0034682133700698614,
0.0774151086807251,
-0.11373643577098846,
-0.11017575860023499
] |
null | null |
transformers
|
# Model Card for gpt-neo-125M-code-clippy-dedup-2048
# Model Details
## Model Description
More information needed
- **Developed by:** Flax Community
- **Shared by [Optional]:** Hugging Face
- **Model type:** Text Generation
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Related Models:**
- **Parent Model:** GPT-Neo
- **Resources for more information:**
- [GitHub Repo](https://github.com/CodedotAl/gpt-code-clippy)
# Uses
## Direct Use
This model can be used for the task of Text Generation
## Downstream Use [Optional]
More information needed
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
The model creators note in the GitHub Repo](https://github.com/CodedotAl/gpt-code-clippy):
> **ISSUE : Wrong Filenames in the Dataset**
We recently came to know about a bug which happened during the scraping of the dataset. We found out that the file names are obsolete/misleading.[Refer this [issue](https://github.com/CodedotAl/gpt-code-clippy/issues/71)] We thank Naman for pointing out the issue.
This might have two implications
- Since the filtering for the training dataset is done using the file extension, we might have had wrong datapoints in the dataset while training and we might have missed a lot of right datapoints that belong to the languages of choice.
# Training Details
## Training Data
The model creators note in the GitHub Repo](https://github.com/CodedotAl/gpt-code-clippy):
> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
The model creators note in the GitHub Repo](https://github.com/CodedotAl/gpt-code-clippy):
> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048. The choice of relatively large batch size and low LR with long warmup are made to avoid agressive updates and preserve the knowledge contained in pretrained GPTNeo weights.
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
The model creators note in the GitHub Repo](https://github.com/CodedotAl/gpt-code-clippy):
> The models are also evaluated on the [APPS](https://github.com/hendrycks/apps) and [HumanEval](https://github.com/openai/human-eval) datasets.
### Factors
More information needed
### Metrics
More information needed
## Results
| Model | pass@1 | pass@2 | pass@5 | pass@10 |
| --------------------------------- | :---------: | :---------: | :---------: | :---------: |
| gpt-neo-125M-apps | 0.06% | 0.12% | 0.30% | 0.61% |
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
GPTNeoForCausalLM
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
**BibTeX:**
More information needed
**APA:**
More information needed
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Flax Community in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup-2048")
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup-2048")
```
</details>
|
{"tags": ["flax"]}
|
text-generation
|
flax-community/gpt-neo-125M-code-clippy-dedup-2048
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"flax",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1910.09700"
] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #flax #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
Model Card for gpt-neo-125M-code-clippy-dedup-2048
==================================================
Model Details
=============
Model Description
-----------------
More information needed
* Developed by: Flax Community
* Shared by [Optional]: Hugging Face
* Model type: Text Generation
* Language(s) (NLP): More information needed
* License: More information needed
* Related Models:
+ Parent Model: GPT-Neo
* Resources for more information:
+ GitHub Repo
Uses
====
Direct Use
----------
This model can be used for the task of Text Generation
Downstream Use [Optional]
-------------------------
More information needed
Out-of-Scope Use
----------------
The model should not be used to intentionally create hostile or alienating environments for people.
Bias, Risks, and Limitations
============================
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
Recommendations
---------------
The model creators note in the GitHub Repo](URL
>
> ISSUE : Wrong Filenames in the Dataset
> We recently came to know about a bug which happened during the scraping of the dataset. We found out that the file names are obsolete/misleading.Refer this [issue] We thank Naman for pointing out the issue.
> This might have two implications
> - Since the filtering for the training dataset is done using the file extension, we might have had wrong datapoints in the dataset while training and we might have missed a lot of right datapoints that belong to the languages of choice.
>
>
>
Training Details
================
Training Data
-------------
The model creators note in the GitHub Repo](URL
>
> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048.
>
>
>
Training Procedure
------------------
### Preprocessing
More information needed
### Speeds, Sizes, Times
The model creators note in the GitHub Repo](URL
>
> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048. The choice of relatively large batch size and low LR with long warmup are made to avoid agressive updates and preserve the knowledge contained in pretrained GPTNeo weights.
>
>
>
Evaluation
==========
Testing Data, Factors & Metrics
-------------------------------
### Testing Data
The model creators note in the GitHub Repo](URL
>
> The models are also evaluated on the APPS and HumanEval datasets.
>
>
>
### Factors
More information needed
### Metrics
More information needed
Results
-------
Model Examination
=================
More information needed
Environmental Impact
====================
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
* Hardware Type: More information needed
* Hours used: More information needed
* Cloud Provider: More information needed
* Compute Region: More information needed
* Carbon Emitted: More information needed
Technical Specifications [optional]
===================================
Model Architecture and Objective
--------------------------------
GPTNeoForCausalLM
Compute Infrastructure
----------------------
More information needed
### Hardware
More information needed
### Software
More information needed
BibTeX:
More information needed
APA:
More information needed
Glossary [optional]
===================
More information needed
More Information [optional]
===========================
More information needed
Model Card Authors [optional]
=============================
Flax Community in collaboration with Ezi Ozoani and the Hugging Face team
Model Card Contact
==================
More information needed
How to Get Started with the Model
=================================
Use the code below to get started with the model.
Click to expand
|
[
"### Preprocessing\n\n\nMore information needed",
"### Speeds, Sizes, Times\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048. The choice of relatively large batch size and low LR with long warmup are made to avoid agressive updates and preserve the knowledge contained in pretrained GPTNeo weights.\n> \n> \n> \n\n\nEvaluation\n==========\n\n\nTesting Data, Factors & Metrics\n-------------------------------",
"### Testing Data\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> The models are also evaluated on the APPS and HumanEval datasets.\n> \n> \n>",
"### Factors\n\n\nMore information needed",
"### Metrics\n\n\nMore information needed\n\n\nResults\n-------\n\n\n\nModel Examination\n=================\n\n\nMore information needed\n\n\nEnvironmental Impact\n====================\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n\n* Hardware Type: More information needed\n* Hours used: More information needed\n* Cloud Provider: More information needed\n* Compute Region: More information needed\n* Carbon Emitted: More information needed\n\n\nTechnical Specifications [optional]\n===================================\n\n\nModel Architecture and Objective\n--------------------------------\n\n\nGPTNeoForCausalLM\n\n\nCompute Infrastructure\n----------------------\n\n\nMore information needed",
"### Hardware\n\n\nMore information needed",
"### Software\n\n\nMore information needed\n\n\nBibTeX:\nMore information needed\n\n\nAPA:\nMore information needed\n\n\nGlossary [optional]\n===================\n\n\nMore information needed\n\n\nMore Information [optional]\n===========================\n\n\nMore information needed\n\n\nModel Card Authors [optional]\n=============================\n\n\nFlax Community in collaboration with Ezi Ozoani and the Hugging Face team\n\n\nModel Card Contact\n==================\n\n\nMore information needed\n\n\nHow to Get Started with the Model\n=================================\n\n\nUse the code below to get started with the model.\n\n\n\n Click to expand"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #flax #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Preprocessing\n\n\nMore information needed",
"### Speeds, Sizes, Times\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048. The choice of relatively large batch size and low LR with long warmup are made to avoid agressive updates and preserve the knowledge contained in pretrained GPTNeo weights.\n> \n> \n> \n\n\nEvaluation\n==========\n\n\nTesting Data, Factors & Metrics\n-------------------------------",
"### Testing Data\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> The models are also evaluated on the APPS and HumanEval datasets.\n> \n> \n>",
"### Factors\n\n\nMore information needed",
"### Metrics\n\n\nMore information needed\n\n\nResults\n-------\n\n\n\nModel Examination\n=================\n\n\nMore information needed\n\n\nEnvironmental Impact\n====================\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n\n* Hardware Type: More information needed\n* Hours used: More information needed\n* Cloud Provider: More information needed\n* Compute Region: More information needed\n* Carbon Emitted: More information needed\n\n\nTechnical Specifications [optional]\n===================================\n\n\nModel Architecture and Objective\n--------------------------------\n\n\nGPTNeoForCausalLM\n\n\nCompute Infrastructure\n----------------------\n\n\nMore information needed",
"### Hardware\n\n\nMore information needed",
"### Software\n\n\nMore information needed\n\n\nBibTeX:\nMore information needed\n\n\nAPA:\nMore information needed\n\n\nGlossary [optional]\n===================\n\n\nMore information needed\n\n\nMore Information [optional]\n===========================\n\n\nMore information needed\n\n\nModel Card Authors [optional]\n=============================\n\n\nFlax Community in collaboration with Ezi Ozoani and the Hugging Face team\n\n\nModel Card Contact\n==================\n\n\nMore information needed\n\n\nHow to Get Started with the Model\n=================================\n\n\nUse the code below to get started with the model.\n\n\n\n Click to expand"
] |
[
58,
8,
178,
43,
7,
128,
6,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #flax #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n### Preprocessing\n\n\nMore information needed### Speeds, Sizes, Times\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> For fine-tuning GPTNeo-125M on CodeClippy dataset we used AdamW optimizer (beta1=0.9, beta2=0.95) with GPT3-like learning rate schedule (4k warmup steps from 0 to 5e-5 followed by 50k cosine decay steps to 5e-6), weight decay 0.1 and batch size 1024, sequence length 2048. The choice of relatively large batch size and low LR with long warmup are made to avoid agressive updates and preserve the knowledge contained in pretrained GPTNeo weights.\n> \n> \n> \n\n\nEvaluation\n==========\n\n\nTesting Data, Factors & Metrics\n-------------------------------### Testing Data\n\n\nThe model creators note in the GitHub Repo](URL\n\n\n\n> \n> The models are also evaluated on the APPS and HumanEval datasets.\n> \n> \n>### Factors\n\n\nMore information needed### Metrics\n\n\nMore information needed\n\n\nResults\n-------\n\n\n\nModel Examination\n=================\n\n\nMore information needed\n\n\nEnvironmental Impact\n====================\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n\n* Hardware Type: More information needed\n* Hours used: More information needed\n* Cloud Provider: More information needed\n* Compute Region: More information needed\n* Carbon Emitted: More information needed\n\n\nTechnical Specifications [optional]\n===================================\n\n\nModel Architecture and Objective\n--------------------------------\n\n\nGPTNeoForCausalLM\n\n\nCompute Infrastructure\n----------------------\n\n\nMore information needed### Hardware\n\n\nMore information needed"
] |
[
-0.06902392953634262,
0.25850000977516174,
-0.0010514562018215656,
0.041680194437503815,
0.12298145890235901,
0.03527222201228142,
-0.004988578613847494,
0.1306505799293518,
0.01981133595108986,
0.10907943546772003,
0.03908917307853699,
0.0016277910908684134,
0.11276300251483917,
0.07220980525016785,
0.08137251436710358,
-0.2228354662656784,
-0.01878666877746582,
-0.07378353178501129,
-0.005745616275817156,
0.10829509794712067,
0.09748156368732452,
-0.09257544577121735,
0.11357107758522034,
-0.056127045303583145,
-0.00898764468729496,
0.023911362513899803,
-0.10664860904216766,
-0.020326601341366768,
0.06712844967842102,
0.03372735530138016,
-0.01688193529844284,
0.01589043252170086,
0.06803130358457565,
-0.29690247774124146,
0.004406870808452368,
0.07201306521892548,
0.0013024164363741875,
0.07481607794761658,
0.04814464598894119,
-0.02657252363860607,
0.12458666414022446,
-0.19151446223258972,
0.07530228793621063,
0.04599045589566231,
-0.04124720022082329,
-0.18055734038352966,
-0.0965275913476944,
0.06567687541246414,
0.07005785405635834,
0.08311620354652405,
-0.020296266302466393,
0.15644493699073792,
-0.0624999962747097,
0.017630532383918762,
0.18108317255973816,
-0.15584537386894226,
-0.03305536508560181,
0.024484608322381973,
0.11584978550672531,
0.12185804545879364,
-0.10765023529529572,
0.04277576878666878,
0.03175225481390953,
0.008358487859368324,
0.06562118977308273,
-0.016611400991678238,
-0.014822488650679588,
0.05088556930422783,
-0.11864973604679108,
-0.040789976716041565,
0.13671375811100006,
0.04252353683114052,
-0.047586552798748016,
-0.19437634944915771,
0.003692100988700986,
-0.12292003631591797,
0.009392336010932922,
-0.07537064701318741,
0.028388071805238724,
-0.03264065086841583,
-0.008126986213028431,
-0.07439187169075012,
-0.07939860969781876,
-0.023527484387159348,
0.04824131354689598,
0.11141529679298401,
0.015630658715963364,
-0.035281140357255936,
0.07196104526519775,
0.10372456908226013,
-0.015504855662584305,
-0.10449367016553879,
-0.07599813491106033,
-0.01281009055674076,
-0.10573314875364304,
-0.01695777289569378,
0.03919345140457153,
0.022894885390996933,
0.04475754499435425,
0.1301025152206421,
-0.08151007443666458,
0.03399362415075302,
0.05595266446471214,
-0.03361134976148605,
0.05483431741595268,
0.15882804989814758,
-0.09905585646629333,
-0.18753595650196075,
-0.041838616132736206,
0.010485891252756119,
0.022048847749829292,
-0.03079248033463955,
-0.04626616835594177,
-0.027739498764276505,
0.11227120459079742,
0.15212562680244446,
0.06442154198884964,
-0.033953942358493805,
-0.07426928728818893,
-0.031247898936271667,
0.20026355981826782,
-0.09478387981653214,
0.05754261836409569,
-0.03763655200600624,
-0.01954013668000698,
-0.017032446339726448,
-0.006731875240802765,
0.023783046752214432,
-0.050242628902196884,
0.08271338045597076,
-0.05906171724200249,
-0.07602746039628983,
-0.03128243237733841,
-0.022144651040434837,
0.06976665556430817,
-0.10565777122974396,
0.026775605976581573,
-0.050845928490161896,
-0.12608414888381958,
-0.06424090266227722,
0.021312927827239037,
-0.06315163522958755,
-0.07532811909914017,
0.049930769950151443,
-0.021193211898207664,
-0.008386272005736828,
-0.026105040684342384,
0.10873115062713623,
-0.03058205358684063,
0.04651474952697754,
-0.050305016338825226,
0.020655035972595215,
0.07921697944402695,
0.02403504215180874,
-0.10266675800085068,
0.030592501163482666,
-0.1361379623413086,
0.07920397818088531,
-0.09687554091215134,
0.033069394528865814,
-0.18162238597869873,
-0.03804803267121315,
-0.01045150775462389,
-0.03422922641038895,
0.044886380434036255,
0.16173478960990906,
-0.12639300525188446,
-0.0318070687353611,
0.01362755335867405,
-0.041659463196992874,
-0.11030905693769455,
0.1179179921746254,
0.022684331983327866,
0.07213632762432098,
0.0641912892460823,
0.004974879324436188,
0.1227339655160904,
-0.1503733992576599,
-0.11776382476091385,
-0.008955773897469044,
0.032651908695697784,
0.10716955363750458,
0.09666412323713303,
-0.0436861626803875,
-0.004399103112518787,
0.009358073584735394,
-0.05955296754837036,
-0.01347169280052185,
-0.031736187636852264,
-0.09832818061113358,
-0.018569812178611755,
-0.07814276218414307,
0.04332222416996956,
0.02727035991847515,
-0.061276573687791824,
-0.023804761469364166,
-0.16431225836277008,
-0.05637865141034126,
0.10891550034284592,
-0.020929358899593353,
0.013575357384979725,
-0.04416176304221153,
-0.0011295621516183019,
-0.06716438382863998,
-0.006119932979345322,
-0.1355774700641632,
-0.07823217660188675,
0.09037423133850098,
-0.14562758803367615,
0.01754545047879219,
-0.07737221568822861,
0.06591472774744034,
0.023557694628834724,
-0.06033046916127205,
-0.024658609181642532,
-0.004771885462105274,
-0.00008672929834574461,
-0.08226852864027023,
-0.15540781617164612,
-0.04073311388492584,
-0.026364605873823166,
0.1855400651693344,
-0.10527238994836807,
-0.008203092031180859,
0.10935363918542862,
0.09311401098966599,
-0.003379492089152336,
-0.07680399715900421,
0.026608554646372795,
-0.031398575752973557,
-0.00033706860267557204,
-0.08247758448123932,
-0.022577036172151566,
-0.029373155906796455,
-0.10246855765581131,
0.016899101436138153,
-0.12945087254047394,
0.008303917944431305,
0.05860164389014244,
0.2131458967924118,
-0.10887045413255692,
-0.0448700413107872,
-0.0162888765335083,
-0.06607848405838013,
-0.10871089994907379,
-0.05261349305510521,
0.2414751648902893,
0.05149716138839722,
-0.008957833983004093,
-0.06217283010482788,
-0.09171448647975922,
0.02689201571047306,
0.07061348855495453,
-0.059409890323877335,
0.11917539685964584,
0.012316270731389523,
-0.08765605092048645,
0.08187272399663925,
0.10469730198383331,
0.05435915291309357,
0.09625640511512756,
0.013554775156080723,
-0.09705065935850143,
-0.04949482902884483,
0.003928336780518293,
0.005563135724514723,
0.13039332628250122,
-0.0902286171913147,
0.05039897561073303,
0.04636665806174278,
-0.05871200188994408,
0.023017825558781624,
-0.06682619452476501,
0.016371730715036392,
0.030987447127699852,
-0.02749689109623432,
0.022199207916855812,
-0.04735637083649635,
-0.005014351103454828,
0.06465834379196167,
0.041558608412742615,
0.08488088846206665,
0.009054552763700485,
0.0013686121674254537,
-0.1241740956902504,
0.18375469744205475,
-0.06120958551764488,
-0.25505518913269043,
-0.09869502484798431,
0.11031486093997955,
-0.018056394532322884,
-0.051499418914318085,
0.024304592981934547,
-0.1035524383187294,
-0.11112692207098007,
-0.09456536173820496,
0.0705026239156723,
0.0351642481982708,
-0.027510320767760277,
-0.03727906942367554,
-0.023673970252275467,
0.0028270436450839043,
-0.11133338510990143,
0.05399997532367706,
0.05707305669784546,
-0.04272370785474777,
-0.04414268955588341,
0.05924025550484657,
0.16926488280296326,
0.09829508513212204,
-0.02316628210246563,
-0.022763265296816826,
0.020635657012462616,
0.1979093998670578,
-0.15437158942222595,
0.0946744754910469,
0.12171237170696259,
0.024388989433646202,
0.044310588389635086,
0.11506225913763046,
0.002920267404988408,
-0.042950116097927094,
0.03689896687865257,
0.04093964397907257,
-0.05117766559123993,
-0.25720885396003723,
-0.08547571301460266,
0.002467881888151169,
-0.07673091441392899,
0.11519506573677063,
0.07878408581018448,
0.027632230892777443,
0.07959001511335373,
-0.10824804753065109,
-0.007020674645900726,
-0.01578904688358307,
0.08186609297990799,
-0.04696250706911087,
0.004859701730310917,
0.029810098931193352,
-0.0343080498278141,
-0.043175190687179565,
0.12598462402820587,
0.07304012775421143,
0.10750333964824677,
-0.01307182852178812,
0.22872963547706604,
-0.016697173938155174,
0.09946630895137787,
-0.05311477184295654,
0.047661036252975464,
0.016503214836120605,
0.013633260503411293,
0.010755266062915325,
-0.08601009845733643,
0.009176649153232574,
0.11362144351005554,
0.12351101636886597,
-0.008272645995020866,
-0.02501034364104271,
-0.032169874757528305,
0.04369311034679413,
0.22483286261558533,
-0.03359556570649147,
-0.19548100233078003,
-0.04589085280895233,
0.08781348168849945,
0.0029763237107545137,
-0.1180950254201889,
-0.02386927604675293,
0.09113804996013641,
-0.17084921896457672,
0.08467284590005875,
-0.03745298460125923,
0.0632677972316742,
-0.1251678615808487,
-0.0367431640625,
-0.004364544525742531,
0.07216504961252213,
-0.0033246269449591637,
0.10934868454933167,
-0.1331341713666916,
0.054099760949611664,
0.05020970478653908,
0.0667661502957344,
-0.13908906280994415,
0.041249893605709076,
-0.009701062925159931,
0.017364220693707466,
0.20886258780956268,
0.013211973942816257,
-0.005065592937171459,
-0.07563067972660065,
-0.14507004618644714,
0.038170225918293,
0.09437420964241028,
-0.16102777421474457,
0.057245105504989624,
-0.00635388121008873,
-0.051113057881593704,
-0.016821935772895813,
-0.14294305443763733,
-0.15280203521251678,
-0.17305344343185425,
0.06873077154159546,
-0.06026868894696236,
0.06467661261558533,
-0.05659647285938263,
-0.06709560006856918,
-0.0845889300107956,
0.21327455341815948,
-0.10788208991289139,
-0.08737125247716904,
-0.16170115768909454,
-0.06195589154958725,
0.22159989178180695,
-0.08852693438529968,
0.008955122902989388,
0.00992373377084732,
0.15846151113510132,
-0.02009538933634758,
-0.031830109655857086,
0.07426890730857849,
-0.06400527060031891,
-0.19254904985427856,
-0.027783535420894623,
0.12642014026641846,
0.0994110107421875,
0.07684290409088135,
-0.005238574463874102,
0.011601505801081657,
-0.04660005494952202,
-0.13604381680488586,
0.0026110284961760044,
0.09851688146591187,
0.01861107535660267,
0.07615815103054047,
0.003223554929718375,
-0.008948046714067459,
-0.060514792799949646,
-0.059864312410354614,
0.1262809932231903,
0.14603284001350403,
-0.09118600189685822,
0.13290388882160187,
0.19034820795059204,
-0.04692106321454048,
-0.2262585461139679,
0.029256610199809074,
0.029163794592022896,
0.013068603351712227,
0.06004635989665985,
-0.16501784324645996,
0.08906617760658264,
0.06800132244825363,
-0.02676623873412609,
0.08254092186689377,
-0.21288655698299408,
-0.1364743411540985,
0.059164535254240036,
0.04443509131669998,
-0.13338010013103485,
-0.1401216685771942,
-0.06598179787397385,
-0.07546354830265045,
-0.10274896025657654,
0.11543220281600952,
-0.03146786242723465,
0.03368253633379936,
-0.0007217558450065553,
0.025072338059544563,
0.04215727746486664,
-0.05624206364154816,
0.17917212843894958,
0.006575004663318396,
0.005093173589557409,
-0.022377291694283485,
0.004944578744471073,
-0.008889109827578068,
-0.05751923844218254,
0.16960108280181885,
-0.030525533482432365,
0.03374987468123436,
-0.09749867022037506,
-0.017888663336634636,
-0.060239724814891815,
0.09638748317956924,
-0.0666923001408577,
-0.0881214439868927,
-0.09178753197193146,
0.07627913355827332,
0.07123062759637833,
-0.037018466740846634,
0.08691100776195526,
0.006677316967397928,
0.04630540683865547,
0.1340072602033615,
0.15835312008857727,
-0.00558676989749074,
-0.06926646083593369,
0.02276790887117386,
-0.0278428103774786,
0.06012799218297005,
-0.1625259667634964,
0.0514281764626503,
0.1054174154996872,
0.048552822321653366,
0.07753602415323257,
-0.05810833349823952,
-0.19084228575229645,
-0.041502371430397034,
0.029753385111689568,
-0.0782974362373352,
-0.18899200856685638,
-0.07653672993183136,
-0.014108337461948395,
-0.181844100356102,
0.011713456362485886,
0.055633366107940674,
-0.039229754358530045,
-0.055262137204408646,
-0.01572604477405548,
0.08404391258955002,
0.019657189026474953,
0.16536782681941986,
0.03998865559697151,
0.06900475919246674,
-0.11413436383008957,
0.12062270939350128,
0.10692349821329117,
-0.13860225677490234,
0.040669605135917664,
0.10548500716686249,
-0.018051108345389366,
-0.0678890123963356,
0.05437832325696945,
0.010917813517153263,
-0.054566510021686554,
-0.0493670329451561,
0.008743071928620338,
-0.0806960016489029,
0.029794421046972275,
0.05486367642879486,
0.01568991132080555,
0.005705934017896652,
0.046353574842214584,
-0.0006787711172364652,
-0.13426409661769867,
0.09071499109268188,
0.00939217396080494,
0.04099133238196373,
-0.08902384340763092,
-0.05705404654145241,
0.03978010267019272,
0.07320205122232437,
-0.027969056740403175,
-0.008424319326877594,
-0.04757411777973175,
-0.02741681970655918,
-0.07767712324857712,
-0.038988981395959854,
-0.07636858522891998,
0.039908427745103836,
-0.02913646586239338,
-0.059068407863378525,
-0.014102891087532043,
0.031500134617090225,
-0.05641327053308487,
-0.0823209136724472,
-0.02842877246439457,
0.09349021315574646,
-0.1325652301311493,
-0.00829353928565979,
0.0802997499704361,
-0.07571764290332794,
0.11779844760894775,
-0.04383621737360954,
0.031836774200201035,
0.001472738804295659,
-0.1052306666970253,
0.05967564880847931,
0.0035749359522014856,
0.02798052504658699,
0.03946385905146599,
-0.19736632704734802,
-0.018438685685396194,
-0.02895042486488819,
-0.006221041548997164,
0.021944954991340637,
0.0227967556566,
-0.12475648522377014,
0.03887622430920601,
-0.06905654072761536,
-0.036156702786684036,
-0.04761165753006935,
0.05052747577428818,
0.07947216182947159,
-0.01008516363799572,
0.1083543598651886,
0.015581084415316582,
0.05307576060295105,
-0.17031735181808472,
-0.003028085920959711,
-0.03305860608816147,
-0.0006057447171770036,
-0.01319602970033884,
0.03023608773946762,
0.0708450898528099,
0.015279432758688927,
0.197479709982872,
-0.05579487979412079,
0.17120729386806488,
0.02907119132578373,
0.11002619564533234,
-0.01314159482717514,
-0.0093075605109334,
0.03153800964355469,
0.02516327239573002,
-0.018224792554974556,
0.046457577496767044,
-0.005188462324440479,
0.013922406360507011,
-0.036711037158966064,
0.045975737273693085,
0.06167056784033775,
0.08457361906766891,
0.07320133596658707,
0.04129171743988991,
-0.11365478485822678,
-0.06267516314983368,
0.10621189326047897,
-0.029450779780745506,
0.022826001048088074,
-0.02527705952525139,
0.09959393739700317,
0.10762710124254227,
-0.19094784557819366,
0.055459991097450256,
-0.022614656016230583,
-0.07507072389125824,
-0.06779372692108154,
-0.10895144939422607,
-0.08597350120544434,
-0.04104642942547798,
-0.005920099094510078,
-0.08254820108413696,
0.0243919026106596,
0.07722833752632141,
-0.007958026602864265,
-0.0077959815971553326,
0.1336459517478943,
-0.02862609550356865,
-0.0460626482963562,
0.0716748982667923,
0.0733041986823082,
0.03333514928817749,
0.027904029935598373,
0.039418261498212814,
0.04528409615159035,
0.06504001468420029,
0.10817820578813553,
0.007739676162600517,
-0.02664804644882679,
0.061994969844818115,
0.008652088232338428,
-0.11597058176994324,
-0.026937570422887802,
-0.006764541380107403,
0.0009793252684175968,
0.132936492562294,
0.03134274110198021,
0.039079152047634125,
-0.03936850652098656,
0.1997174471616745,
-0.07746405154466629,
0.00942681822925806,
-0.11122337728738785,
0.08116061985492706,
0.004380878526717424,
0.002572523895651102,
0.0688793733716011,
-0.1295156031847,
0.017473958432674408,
0.11452420055866241,
0.09075339138507843,
-0.006069826427847147,
-0.0038712865207344294,
0.002218770096078515,
0.010686246678233147,
-0.045032840222120285,
0.05160694569349289,
0.08108475059270859,
0.13068027794361115,
-0.12151911854743958,
0.11061259359121323,
-0.0024570825044065714,
-0.05965461581945419,
-0.060339245945215225,
0.11933796852827072,
-0.03319857269525528,
0.005642722826451063,
-0.016058899462223053,
0.12317777425050735,
-0.05841779336333275,
-0.2160889357328415,
0.031158629804849625,
-0.15758182108402252,
-0.1919030398130417,
-0.005568451713770628,
0.07282783091068268,
-0.002515231491997838,
0.018897265195846558,
0.05871739238500595,
-0.0004721604927908629,
0.1200605183839798,
0.03982795774936676,
-0.12922810018062592,
-0.07422270625829697,
0.03942730650305748,
0.021976888179779053,
0.273159921169281,
0.009020775556564331,
0.04640574753284454,
0.09371907263994217,
-0.06641900539398193,
-0.19817803800106049,
0.012715699151158333,
0.0936765968799591,
-0.1125083938241005,
0.11298087239265442,
0.18970218300819397,
-0.015532770194113255,
0.14452104270458221,
0.08585330098867416,
0.024552982300519943,
0.02099645510315895,
-0.015442773699760437,
0.04395151138305664,
-0.09650767594575882,
0.04220343381166458,
-0.07783208042383194,
0.1682998687028885,
0.09182087332010269,
-0.046140871942043304,
-0.0244373120367527,
-0.07990799099206924,
0.0413392074406147,
0.0085303271189332,
0.1851179450750351,
0.023614218458533287,
-0.14530067145824432,
0.015666166320443153,
-0.002007139613851905,
0.10019169002771378,
-0.1772831827402115,
-0.03526252508163452,
0.06838259100914001,
-0.026799021288752556,
-0.04108450561761856,
0.10324981063604355,
0.013517793267965317,
-0.00019531967700459063,
-0.012466290034353733,
-0.17373360693454742,
-0.0014677123399451375,
0.0952497124671936,
-0.1921997368335724,
-0.023379389196634293
] |
null | null |
transformers
|
# GPT-Code-Clippy-125M-from-Scratch
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-CC-125M-from-Scratch is a [GPT-Neo-125M model](https://huggingface.co/EleutherAI/gpt-neo-125M) pretrained from scratch using causal language modeling on the [Code Clippy Post-deduplication dataset](https://the-eye.eu/public/AI/training_data/code_clippy_data/code_clippy_dedup_data/). The deduplication script can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/data_processing/deduplication/deduplication.py). Code Clippy was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's [Codex paper](https://arxiv.org/abs/2107.03374), we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens `["\t\t", " ", " ", " "]` and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the `\t` token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/utilities/add_new_tokens.py).
## Training data
[Code Clippy Deduplicated dataset](https://the-eye.eu/public/AI/training_data/code_clippy_data/code_clippy_dedup_data/).
Python script of the dataset can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/data_processing/code_clippy.py)
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/deprecated/run_clm_streaming_flax_v2.py).
```bash
./run_clm_streaming_flax_v2.py \
--output_dir $HOME/gpt-neo-125M-code-clippy-from-scratch \
--tokenizer_name="EleutherAI/gpt-neo-125M" \
--model_name_or_path="EleutherAI/gpt-neo-125M" \
--dataset_name $HOME/gpt-code-clippy/data_processing/code_clippy.py \
--data_dir /home/shared/code_clippy_data \
--do_train --do_eval \
--block_size="2048" \
--per_device_train_batch_size="8" \
--per_device_eval_batch_size="16" \
--preprocessing_num_workers="8" \
--learning_rate="3e-5" \
--max_steps 100000 \
--warmup_steps 2500\
--decap_steps 25000 \
--adam_beta1="0.9" \
--adam_beta2="0.95" \
--weight_decay="0.1" \
--overwrite_output_dir \
--logging_steps="50" \
--eval_steps="500" \
--push_to_hub="False" \
--report_to="all" \
--dtype="bfloat16" \
--skip_memory_metrics="True" \
--save_steps="500" \
--save_total_limit 10 \
--report_to="wandb" \
--run_name="gpt-neo-125M-code-clippy-dedup-scratch"
```
## Intended Use and Limitations
The model is pre-trained and not finetuned for any particular use or a particular programming language. Due to time constraints, this model was only pre-trained on 1% of the Code Clippy Dataset.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
3. **Security implications:** No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. **Legal implications:** No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. **Biases:** The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup-scratch")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup-scratch")
prompt = """def greet(name):
'''A function to greet user. Given a user name it should say hello'''
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of the quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Evaluation results
Below is a table containing the base model we started from, and the model's performance on the [HumanEval Benchmark](https://github.com/openai/human-eval).
| Model | Dataset Used | pass@1 | pass@2 | pass@5 | pass@10 |
| --- | --- | :---------: | :---------: | :---------: | :---------: |
| [gpt-neo-125M (**trained from scratch**)](https://huggingface.co/flax-community/gpt-neo-125M-code-clippy-dedup-scratch) | [Code Clippy Data (Deduplicated)](https://the-eye.eu/public/AI/training_data/code_clippy_data/code_clippy_dedup_data/) (~1% of the data) | 0.00% | 0.00% | 0.00% | 0.00% |
|
{}
|
text-generation
|
flax-community/gpt-neo-125M-code-clippy-dedup-scratch
|
[
"transformers",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"arxiv:2107.03374",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[] |
TAGS
#transformers #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us
|
GPT-Code-Clippy-125M-from-Scratch
=================================
>
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
>
>
>
Model Description
-----------------
GPT-CC-125M-from-Scratch is a GPT-Neo-125M model pretrained from scratch using causal language modeling on the Code Clippy Post-deduplication dataset. The deduplication script can be found here. Code Clippy was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's Codex paper, we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens '["\t\t", " ", " ", " "]' and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the '\t' token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found here.
Training data
-------------
Code Clippy Deduplicated dataset.
Python script of the dataset can be found here
Training procedure
------------------
The training script used to train this model can be found here.
Intended Use and Limitations
----------------------------
The model is pre-trained and not finetuned for any particular use or a particular programming language. Due to time constraints, this model was only pre-trained on 1% of the Code Clippy Dataset.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O\*NET OnLine, developers don't just write software.
3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of the quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
Evaluation results
------------------
Below is a table containing the base model we started from, and the model's performance on the HumanEval Benchmark.
|
[
"### How to use\n\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\n\nThe model is intended to be used for research purposes and comes with no guarantees of the quality of generated code.\n\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.\n\n\nEvaluation results\n------------------\n\n\nBelow is a table containing the base model we started from, and the model's performance on the HumanEval Benchmark."
] |
[
"TAGS\n#transformers #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\n\nThe model is intended to be used for research purposes and comes with no guarantees of the quality of generated code.\n\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.\n\n\nEvaluation results\n------------------\n\n\nBelow is a table containing the base model we started from, and the model's performance on the HumanEval Benchmark."
] |
[
50,
35,
107
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:### Limitations and Biases\n\n\nThe model is intended to be used for research purposes and comes with no guarantees of the quality of generated code.\n\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.\n\n\nEvaluation results\n------------------\n\n\nBelow is a table containing the base model we started from, and the model's performance on the HumanEval Benchmark."
] |
[
-0.04982401430606842,
0.12461167573928833,
0.0011030802270397544,
0.062371645122766495,
0.11708802729845047,
0.028051303699612617,
0.05691538751125336,
0.0686730146408081,
-0.023152802139520645,
-0.012124218046665192,
0.16866113245487213,
0.09218860417604446,
0.04683380201458931,
0.13011370599269867,
0.05219282954931259,
-0.25188761949539185,
0.03654835745692253,
0.06046942621469498,
0.0681920051574707,
0.12408428639173508,
0.04297024756669998,
-0.05213690921664238,
0.13839136064052582,
0.06999904662370682,
-0.1906624585390091,
-0.007352942600846291,
0.0388372503221035,
-0.04407823830842972,
0.10784777253866196,
0.0856216698884964,
0.05823375657200813,
0.03638691082596779,
-0.0011738124303519726,
-0.07617467641830444,
0.04269148409366608,
0.04348161071538925,
-0.008373089134693146,
0.10403432697057724,
0.04577116295695305,
-0.026953421533107758,
0.3109353184700012,
0.07208210229873657,
-0.0036586711648851633,
0.028763186186552048,
-0.12258587032556534,
-0.1118481457233429,
-0.019982576370239258,
-0.030185503885149956,
0.05180653929710388,
0.09534066170454025,
-0.031132448464632034,
0.1966259926557541,
-0.10669945925474167,
0.04910556599497795,
0.1650254875421524,
-0.25271686911582947,
-0.0267866849899292,
0.10391955077648163,
0.050833284854888916,
0.042968448251485825,
0.018434815108776093,
0.06810952723026276,
0.061392735689878464,
0.0290894266217947,
0.11242254078388214,
-0.09485631436109543,
-0.003501884639263153,
0.018032049760222435,
-0.14634111523628235,
0.009392115287482738,
0.16097915172576904,
-0.024878527969121933,
-0.0401880219578743,
-0.09091779589653015,
-0.02792225405573845,
-0.02333983965218067,
0.004000332672148943,
-0.07974317669868469,
-0.0505412220954895,
0.02391985058784485,
0.017033126205205917,
-0.11498710513114929,
-0.06259676814079285,
-0.13986890017986298,
-0.08974575251340866,
0.04390808939933777,
0.03680109232664108,
0.05003955215215683,
-0.04436694458127022,
0.18048864603042603,
-0.037656210362911224,
-0.009822061285376549,
-0.03923817351460457,
-0.12564785778522491,
-0.03928857296705246,
-0.029431724920868874,
-0.03844837844371796,
0.010627135634422302,
0.03958388417959213,
0.08487062901258469,
0.1471543312072754,
0.03204521909356117,
0.05044472590088844,
0.039353761821985245,
0.027176667004823685,
0.06401807069778442,
-0.10689861327409744,
-0.0199932511895895,
0.09577159583568573,
-0.07989342510700226,
-0.02047106623649597,
-0.0602673701941967,
-0.10898184776306152,
-0.0537012442946434,
0.03446134179830551,
0.09295128285884857,
0.004323620814830065,
0.12368215620517731,
-0.04487576708197594,
-0.035890717059373856,
0.14295199513435364,
-0.04137076810002327,
-0.01989290863275528,
-0.04523938521742821,
-0.044223491102457047,
-0.023385995998978615,
0.0014120598789304495,
-0.013724522665143013,
-0.1186976507306099,
-0.05108215659856796,
-0.11303123086690903,
-0.07692129909992218,
-0.09638135135173798,
-0.09319943934679031,
-0.022486528381705284,
-0.07397224009037018,
0.016483640298247337,
-0.1410246342420578,
-0.2516677975654602,
-0.018089545890688896,
0.02899324893951416,
-0.07952980697154999,
-0.026238588616251945,
-0.004777689930051565,
0.051176246255636215,
0.015310057438910007,
-0.06416080892086029,
0.06559018790721893,
-0.08001264929771423,
0.017302628606557846,
0.010142963379621506,
0.06875523179769516,
-0.09324423968791962,
0.0020720672328025103,
-0.1090090349316597,
0.01839890517294407,
-0.11206100881099701,
0.07608848065137863,
-0.01936190575361252,
-0.0026087495498359203,
-0.09142883867025375,
-0.04335444048047066,
-0.10295417159795761,
0.05550578236579895,
0.0015903523890301585,
0.1282227337360382,
-0.12249965220689774,
-0.07703318446874619,
0.18403486907482147,
-0.08645911514759064,
-0.08782176673412323,
0.14601506292819977,
-0.0422024130821228,
0.057578619569540024,
0.13612306118011475,
0.11215732246637344,
0.0497099943459034,
-0.009542398154735565,
0.02439585141837597,
0.07198906689882278,
-0.12280280143022537,
-0.047083545476198196,
0.05968470871448517,
0.03384969383478165,
-0.12402957677841187,
0.03826485574245453,
-0.07742432504892349,
0.09557636082172394,
-0.029431844130158424,
-0.01730716973543167,
-0.010793434455990791,
-0.0444820262491703,
-0.005315158981829882,
-0.04765445366501808,
0.04503204673528671,
-0.006822690367698669,
-0.07580171525478363,
-0.06461279094219208,
0.061308830976486206,
-0.07095387578010559,
-0.011261344887316227,
-0.0979149267077446,
0.11097745597362518,
-0.1105596199631691,
0.03175891563296318,
-0.15272361040115356,
-0.07720918208360672,
0.012408917769789696,
-0.02753182128071785,
0.06640321016311646,
0.13645039498806,
0.02117595262825489,
0.011120171286165714,
-0.00973377376794815,
0.07806045562028885,
0.1179327443242073,
-0.019754773005843163,
-0.013055824674665928,
-0.10712528973817825,
0.047834884375333786,
-0.02907276339828968,
0.06670328229665756,
-0.1797296553850174,
-0.02421076036989689,
0.07215389609336853,
0.05148261785507202,
-0.00196053390391171,
-0.03349084407091141,
0.03693043813109398,
-0.007827033288776875,
-0.0307395551353693,
-0.05734424665570259,
0.040560390800237656,
0.024881592020392418,
-0.10014699399471283,
0.1429440975189209,
-0.15983137488365173,
0.11858123540878296,
0.14481495320796967,
-0.08692798763513565,
-0.09422577917575836,
0.024270284920930862,
0.005015181843191385,
-0.016864079982042313,
-0.014847075566649437,
0.020305795595049858,
0.22933784127235413,
-0.015542488545179367,
0.0977073535323143,
-0.076424241065979,
-0.021679336205124855,
0.036355093121528625,
-0.046534303575754166,
0.04546280950307846,
0.04866460710763931,
0.12134609371423721,
-0.15191927552223206,
0.048586975783109665,
-0.07216182351112366,
-0.05566521733999252,
0.15130378305912018,
0.08112228661775589,
-0.06799453496932983,
-0.007611072156578302,
-0.03624589741230011,
0.019929753616452217,
0.05414922535419464,
-0.11173835396766663,
-0.07829399406909943,
0.05626935884356499,
0.01326249074190855,
0.06784533709287643,
-0.1390819102525711,
-0.015447807498276234,
0.015032056719064713,
-0.025212882086634636,
-0.01937931217253208,
0.04925219342112541,
-0.0872991681098938,
0.09392938017845154,
0.030761843547225,
0.015213673934340477,
0.03280004486441612,
0.004135115072131157,
-0.13945087790489197,
0.19668203592300415,
-0.05117088556289673,
-0.26709678769111633,
-0.1266784369945526,
-0.08155964314937592,
-0.0075772604905068874,
0.06968554854393005,
-0.014818022958934307,
-0.05185972526669502,
-0.06200641766190529,
-0.052946995943784714,
0.04919394105672836,
-0.02444637008011341,
0.010644171386957169,
-0.0550616979598999,
-0.07287932187318802,
-0.022106138989329338,
-0.06460476666688919,
-0.01885250210762024,
-0.003606514073908329,
-0.04922928288578987,
0.04950476810336113,
-0.09977429360151291,
0.1304299682378769,
0.19114546477794647,
0.02664201706647873,
0.032507412135601044,
-0.03992992267012596,
0.2661977708339691,
-0.10839122533798218,
-0.017057131975889206,
0.1343584656715393,
-0.001485227607190609,
-0.012777460739016533,
0.014188898727297783,
0.021967897191643715,
-0.09235820174217224,
0.02555072493851185,
0.011768576689064503,
-0.11180971562862396,
-0.15083196759223938,
-0.05048220232129097,
-0.06792791187763214,
-0.03548922389745712,
0.06509891152381897,
0.06458724290132523,
0.07189030200242996,
0.14373920857906342,
-0.024955879896879196,
0.08372537791728973,
-0.011036686599254608,
0.09303531050682068,
0.08279481530189514,
-0.0027636864688247442,
0.09730643033981323,
-0.07963919639587402,
-0.06657480448484421,
0.12708419561386108,
0.013028617948293686,
0.1861792355775833,
-0.013549180701375008,
0.19774633646011353,
0.054556671530008316,
0.0590565986931324,
0.07599002867937088,
0.045630987733602524,
0.01730351895093918,
-0.053357187658548355,
-0.04166550189256668,
-0.033905159682035446,
-0.046297837048769,
0.041132163256406784,
-0.028144383803009987,
-0.10938892513513565,
0.024337638169527054,
-0.012256751768290997,
0.020882030948996544,
0.03338741511106491,
0.03558152914047241,
-0.3530227541923523,
0.0023789014667272568,
-0.0038479138165712357,
0.03531654551625252,
-0.1449362188577652,
0.050247594714164734,
-0.0037128892727196217,
-0.12733548879623413,
0.043193038552999496,
-0.04282392933964729,
0.1086033433675766,
-0.18693624436855316,
0.018778912723064423,
-0.020523235201835632,
0.05031665787100792,
-0.05285447835922241,
0.13989056646823883,
-0.26521846652030945,
0.12359707057476044,
0.01369450893253088,
0.03674991428852081,
-0.11818373203277588,
-0.027076121419668198,
0.019150976091623306,
0.08404026180505753,
0.14244431257247925,
0.00522225396707654,
-0.05572011321783066,
-0.03582051023840904,
-0.049295127391815186,
0.05207674950361252,
-0.0036756491754204035,
0.0046544382348656654,
0.021604804322123528,
-0.014760310761630535,
0.04061722010374069,
-0.0033396396320313215,
-0.09004256129264832,
-0.10131488740444183,
-0.12735538184642792,
0.06238773092627525,
0.04341275990009308,
-0.037333324551582336,
0.015027103945612907,
-0.030399050563573837,
-0.0044261133298277855,
0.21022358536720276,
-0.05697695538401604,
-0.07851144671440125,
-0.11250338703393936,
-0.005497790407389402,
0.05383885279297829,
-0.06994752585887909,
0.036060214042663574,
-0.001552209840156138,
0.14021052420139313,
0.034468911588191986,
-0.12900091707706451,
0.10368780791759491,
-0.07584353536367416,
-0.04486474767327309,
-0.010340730659663677,
-0.004351216834038496,
0.05517619103193283,
0.018405286595225334,
0.02160249836742878,
-0.0038303944747895002,
-0.1084815114736557,
-0.15613697469234467,
0.012246654368937016,
0.039770450443029404,
0.04760615900158882,
0.039029791951179504,
0.08873279392719269,
0.06880868971347809,
-0.01735822670161724,
-0.03377166762948036,
0.11290353536605835,
0.14050686359405518,
-0.05642479285597801,
0.05356490612030029,
0.10639313608407974,
-0.10533708333969116,
-0.23408791422843933,
0.028276951983571053,
-0.009111626073718071,
0.03289563208818436,
-0.02404065802693367,
-0.15864934027194977,
0.0494200624525547,
0.04358582943677902,
-0.01007565576583147,
0.10112103074789047,
-0.16729490458965302,
-0.12229638546705246,
0.19520005583763123,
0.1112215593457222,
0.27746760845184326,
-0.07242925465106964,
-0.03525136783719063,
-0.11819703876972198,
-0.16590054333209991,
0.11027170717716217,
-0.06462565809488297,
0.07875996828079224,
-0.032777708023786545,
0.057151179760694504,
-0.00039508947520516813,
-0.05916463956236839,
0.13122960925102234,
-0.05562226101756096,
0.05951801314949989,
-0.09766920655965805,
0.005454581696540117,
0.058176007121801376,
-0.008417028933763504,
0.09700743108987808,
0.008181070908904076,
0.018386712297797203,
-0.022509869188070297,
-0.11973372846841812,
-0.06075918301939964,
0.04777967557311058,
0.05137236788868904,
-0.08680342882871628,
-0.10125882178544998,
0.019046884030103683,
-0.0010470798006281257,
-0.031782954931259155,
-0.0996030643582344,
-0.11479778587818146,
0.004405694082379341,
0.04540354013442993,
0.13397745788097382,
-0.07331821322441101,
-0.05164793133735657,
0.02795352227985859,
-0.02097421884536743,
0.0999162420630455,
-0.0629260316491127,
0.0030638480093330145,
0.08943963795900345,
-0.038323551416397095,
0.10802988708019257,
0.04267215356230736,
-0.08586188405752182,
0.02088591456413269,
0.08054778724908829,
-0.14883461594581604,
-0.03921623155474663,
-0.06626064330339432,
-0.12335752695798874,
-0.0074244155548512936,
0.02809000201523304,
0.081679567694664,
-0.05873503163456917,
-0.020941991358995438,
0.030759911984205246,
-0.037139639258384705,
-0.07002711296081543,
0.13300837576389313,
0.043367356061935425,
0.01915425807237625,
-0.12290026247501373,
0.012203271500766277,
0.002446091966703534,
0.05400940403342247,
0.0410032793879509,
-0.07460840791463852,
-0.14336170256137848,
-0.05766255781054497,
0.0071412366814911366,
0.16500309109687805,
-0.16562020778656006,
-0.08736205101013184,
-0.0998532772064209,
-0.08681567013263702,
0.04179825261235237,
0.07608331739902496,
0.06992907077074051,
0.0994713306427002,
-0.06714017689228058,
-0.012979092076420784,
-0.10359429568052292,
-0.01696649193763733,
-0.0013212385820224881,
0.04711909592151642,
-0.16401804983615875,
0.09686154127120972,
0.029640430584549904,
0.038929834961891174,
-0.10689231008291245,
-0.03309190273284912,
-0.11427082866430283,
0.02674579806625843,
-0.15328669548034668,
0.01483964454382658,
-0.12951727211475372,
-0.006221224553883076,
0.008532254956662655,
0.014892352744936943,
-0.07387015223503113,
0.010134459473192692,
-0.07908188551664352,
0.035375673323869705,
0.01625036634504795,
0.017485912889242172,
-0.05851754546165466,
0.03371661156415939,
0.041356410831213,
0.0212689358741045,
0.07480593770742416,
0.02703455649316311,
-0.03907964006066322,
0.05213148519396782,
-0.05553916469216347,
0.10409138351678848,
0.05638662353157997,
0.02676946111023426,
-0.011015484109520912,
-0.0288899764418602,
0.0365716889500618,
0.033201269805431366,
0.002826189622282982,
0.04698214307427406,
0.008615036495029926,
-0.07754083722829819,
0.020043237134814262,
-0.009477066807448864,
-0.016135573387145996,
-0.027202872559428215,
0.10134255886077881,
0.08132897317409515,
0.08264103531837463,
0.07478371262550354,
-0.02649582363665104,
-0.05947237089276314,
-0.09773792326450348,
-0.0044805388897657394,
0.002583104884251952,
-0.06974083185195923,
0.0175645612180233,
-0.06821232289075851,
0.055757757276296616,
-0.0010065250098705292,
0.397316575050354,
0.034512586891651154,
-0.05740971118211746,
-0.031021077185869217,
0.13582250475883484,
0.1281132847070694,
-0.0346662700176239,
0.13918624818325043,
0.01992734707891941,
-0.0333266519010067,
0.0003150832490064204,
0.1331082135438919,
0.04242301359772682,
0.07943552732467651,
0.151805117726326,
-0.0857272818684578,
0.0226778294891119,
0.056251849979162216,
-0.03027511015534401,
-0.015933413058519363,
-0.025304337963461876,
-0.03408728167414665,
-0.03223341330885887,
0.0854395180940628,
-0.004088954068720341,
0.04497918859124184,
0.20290814340114594,
-0.05227649211883545,
0.0014430714072659612,
0.028952885419130325,
-0.09508655965328217,
-0.18547672033309937,
-0.3663075268268585,
-0.0528230182826519,
-0.12485682219266891,
-0.010025369003415108,
-0.10852992534637451,
-0.01564982160925865,
0.07882732152938843,
0.05011393129825592,
-0.07252057641744614,
0.18134474754333496,
-0.01138327457010746,
-0.11280287057161331,
-0.017878619953989983,
-0.04779085889458656,
-0.016877910122275352,
-0.0679129883646965,
0.018354425206780434,
0.008680243976414204,
-0.0020304424688220024,
0.05872272700071335,
-0.03456166759133339,
-0.004155855160206556,
0.04311847314238548,
-0.049609970301389694,
-0.01563304290175438,
-0.047310296446084976,
0.0715450718998909,
0.019366132095456123,
0.09154410660266876,
0.006844990886747837,
-0.09419115632772446,
-0.0035350017715245485,
0.2322339564561844,
-0.01532791182398796,
0.02861894480884075,
-0.12107358872890472,
0.19612646102905273,
-0.03946033492684364,
0.013596433214843273,
-0.004163803998380899,
-0.006530884187668562,
0.0781666561961174,
0.2963137626647949,
0.23716574907302856,
-0.07088463008403778,
-0.03508524224162102,
-0.022808615118265152,
0.00676605012267828,
0.017933718860149384,
0.18202117085456848,
-0.025506669655442238,
0.1766042411327362,
-0.056858427822589874,
-0.03162926062941551,
-0.060703158378601074,
0.005342759657651186,
-0.03359796106815338,
0.024478735402226448,
0.10318952798843384,
-0.007570519112050533,
-0.04979401081800461,
0.08128665387630463,
-0.14702191948890686,
0.0386342816054821,
-0.1039862409234047,
0.012983978725969791,
-0.0871114507317543,
0.008977305144071579,
-0.10331504791975021,
-0.036718059331178665,
0.054518163204193115,
-0.06210926175117493,
0.014186550863087177,
0.040627460926771164,
0.04015996307134628,
-0.17750205099582672,
-0.1055787205696106,
0.0671641007065773,
0.14901135861873627,
0.15098285675048828,
-0.03023662604391575,
0.09998057782649994,
0.09390410035848618,
-0.002500222995877266,
-0.10021895170211792,
0.074752077460289,
-0.051759250462055206,
-0.031880199909210205,
0.08563017100095749,
0.02189648151397705,
-0.00499369902536273,
-0.032393813133239746,
0.027185877785086632,
0.013490586541593075,
0.03393448889255524,
-0.14166803658008575,
0.031782038509845734,
-0.09156700223684311,
0.021178919821977615,
-0.1204659715294838,
0.1573212891817093,
0.09794078022241592,
-0.0212294552475214,
0.009313941933214664,
-0.03962743654847145,
0.061228327453136444,
0.0021899868734180927,
0.09551022946834564,
0.005098983645439148,
-0.1424284726381302,
-0.045397404581308365,
0.12189766019582748,
-0.010709892027080059,
-0.17912867665290833,
-0.012089205905795097,
-0.05982019379734993,
-0.050422124564647675,
0.004860581364482641,
0.08077164739370346,
0.015363498590886593,
0.07344423979520798,
-0.055414069443941116,
-0.044121839106082916,
0.00730146886780858,
0.06039445474743843,
-0.14172768592834473,
-0.11229853332042694
] |
null | null |
transformers
|
# GPT-Neo-125M-Code-Clippy-Dedup
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
PT-Neo-125M-Code-Clippy-Dedup is a [GPT-Neo-125M model](https://huggingface.co/EleutherAI/gpt-neo-125M) finetuned using causal language modeling on our deduplicated version of the Code Clippy Data dataset, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages.
## Training data
[Code Clippy Data dataset](https://huggingface.co/datasets/code_search_net).
## Training procedure
In this model's training we tried to stabilize the training by limiting the types of files we were using to train to only those that contained file extensions for popular programming languages as our dataset contains other types of files as well such as `.txt` or project configuration files. We used the following extensions to filter by:
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_streaming_filter_flax.py).
```bash
./run_clm_streaming_filter_flax.py \
--output_dir $HOME/gpt-neo-125M-code-clippy-dedup \
--model_name_or_path="EleutherAI/gpt-neo-125M" \
--dataset_name $HOME/gpt-code-clippy/data_processing/code_clippy_filter.py \
--data_dir $HOME/code_clippy_data/code_clippy_dedup_data \
--text_column_name="text" \
--do_train --do_eval \
--block_size="2048" \
--per_device_train_batch_size="8" \
--per_device_eval_batch_size="16" \
--preprocessing_num_workers="8" \
--learning_rate="1e-4" \
--max_steps 100000 \
--warmup_steps 2000 \
--decay_steps 30000 \
--adam_beta1="0.9" \
--adam_beta2="0.95" \
--weight_decay="0.1" \
--overwrite_output_dir \
--logging_steps="25" \
--eval_steps="500" \
--push_to_hub="False" \
--report_to="all" \
--dtype="bfloat16" \
--skip_memory_metrics="True" \
--save_steps="500" \
--save_total_limit 10 \
--gradient_accumulation_steps 16 \
--report_to="wandb" \
--run_name="gpt-neo-125M-code-clippy-dedup-filtered-no-resize-2048bs" \
--max_eval_samples 2000 \
--save_optimizer true
```
## Intended Use and Limitations
The model is finetuned text file from github repositories (mostly programming languages but also markdown and other project related files).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy-dedup")
prompt = """def greet(name):
'''A function to greet user. Given a user name it should say hello'''
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
3. **Security implications:** No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. **Legal implications:** No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. **Biases:** The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy-Dedup is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{}
|
text-generation
|
flax-community/gpt-neo-125M-code-clippy-dedup
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"arxiv:2107.03374",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Neo-125M-Code-Clippy-Dedup
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
PT-Neo-125M-Code-Clippy-Dedup is a GPT-Neo-125M model finetuned using causal language modeling on our deduplicated version of the Code Clippy Data dataset, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages.
## Training data
Code Clippy Data dataset.
## Training procedure
In this model's training we tried to stabilize the training by limiting the types of files we were using to train to only those that contained file extensions for popular programming languages as our dataset contains other types of files as well such as '.txt' or project configuration files. We used the following extensions to filter by:
The training script used to train this model can be found here.
## Intended Use and Limitations
The model is finetuned text file from github repositories (mostly programming languages but also markdown and other project related files).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy-Dedup is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-125M-Code-Clippy-Dedup\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nPT-Neo-125M-Code-Clippy-Dedup is a GPT-Neo-125M model finetuned using causal language modeling on our deduplicated version of the Code Clippy Data dataset, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages.",
"## Training data\n\nCode Clippy Data dataset.",
"## Training procedure\n\nIn this model's training we tried to stabilize the training by limiting the types of files we were using to train to only those that contained file extensions for popular programming languages as our dataset contains other types of files as well such as '.txt' or project configuration files. We used the following extensions to filter by:\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned text file from github repositories (mostly programming languages but also markdown and other project related files).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.\n4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under \"fair use.\" However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.\n5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.\n\nGPT-Neo-125M-Code-Clippy-Dedup is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Neo-125M-Code-Clippy-Dedup\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nPT-Neo-125M-Code-Clippy-Dedup is a GPT-Neo-125M model finetuned using causal language modeling on our deduplicated version of the Code Clippy Data dataset, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages.",
"## Training data\n\nCode Clippy Data dataset.",
"## Training procedure\n\nIn this model's training we tried to stabilize the training by limiting the types of files we were using to train to only those that contained file extensions for popular programming languages as our dataset contains other types of files as well such as '.txt' or project configuration files. We used the following extensions to filter by:\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned text file from github repositories (mostly programming languages but also markdown and other project related files).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.\n4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under \"fair use.\" However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.\n5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.\n\nGPT-Neo-125M-Code-Clippy-Dedup is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
54,
47,
93,
10,
89,
40,
35,
730,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Neo-125M-Code-Clippy-Dedup\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nPT-Neo-125M-Code-Clippy-Dedup is a GPT-Neo-125M model finetuned using causal language modeling on our deduplicated version of the Code Clippy Data dataset, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages.## Training data\n\nCode Clippy Data dataset.## Training procedure\n\nIn this model's training we tried to stabilize the training by limiting the types of files we were using to train to only those that contained file extensions for popular programming languages as our dataset contains other types of files as well such as '.txt' or project configuration files. We used the following extensions to filter by:\n\nThe training script used to train this model can be found here.## Intended Use and Limitations\n\nThe model is finetuned text file from github repositories (mostly programming languages but also markdown and other project related files).### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.09227306395769119,
0.17333479225635529,
-0.001869536587037146,
0.03495488315820694,
0.18438754975795746,
-0.022954029962420464,
0.09805796295404434,
0.1209714338183403,
-0.09287101030349731,
0.0030299730133265257,
-0.01142056006938219,
0.04023738205432892,
0.07618790864944458,
0.10013394057750702,
0.0449039526283741,
-0.26997941732406616,
0.03279971331357956,
-0.031837861984968185,
-0.022618427872657776,
0.14926990866661072,
0.1256462037563324,
-0.03985190764069557,
0.08511076122522354,
0.08342617750167847,
-0.18889616429805756,
0.027956664562225342,
0.03174930438399315,
-0.019825344905257225,
0.08540763705968857,
0.05865826830267906,
0.13966713845729828,
0.02309718355536461,
0.07195524871349335,
-0.15577034652233124,
0.0192522294819355,
0.14734135568141937,
0.03390684351325035,
0.06775666028261185,
0.11847483366727829,
-0.012563948519527912,
0.1761300414800644,
-0.011495503596961498,
0.047052592039108276,
0.08806435018777847,
-0.11816489696502686,
-0.09255897253751755,
-0.07433409243822098,
0.018105609342455864,
0.11622777581214905,
0.14438748359680176,
-0.010066450573503971,
0.07000312954187393,
-0.06805428862571716,
0.03762426599860191,
0.07869181036949158,
-0.16832493245601654,
-0.027940696105360985,
0.22578543424606323,
0.09963949769735336,
0.014436690136790276,
-0.07932907342910767,
-0.015072439797222614,
-0.006628791335970163,
-0.010323921218514442,
0.09415441006422043,
-0.04085710272192955,
-0.022039130330085754,
-0.062395717948675156,
-0.11059415340423584,
-0.0714661255478859,
0.1642305850982666,
-0.08563747256994247,
-0.07498931139707565,
-0.16287849843502045,
-0.022531306371092796,
0.049052804708480835,
0.01805625855922699,
-0.023771321401000023,
0.0029958949889987707,
0.0534898117184639,
0.09822090715169907,
-0.11396079510450363,
-0.13095475733280182,
-0.08553431928157806,
0.08584021776914597,
0.14122231304645538,
0.049114104360342026,
0.030204197391867638,
-0.06537039577960968,
0.19531847536563873,
-0.08389480412006378,
-0.044610753655433655,
0.005989320110529661,
-0.10082387179136276,
-0.09561430662870407,
-0.060267213732004166,
-0.09808127582073212,
-0.1327374130487442,
-0.0474068820476532,
0.22354593873023987,
-0.0546577014029026,
0.037407711148262024,
0.09495023638010025,
0.04820557311177254,
0.0617523193359375,
0.09849603474140167,
-0.05071801692247391,
0.04590201750397682,
0.08256502449512482,
0.08170093595981598,
-0.08089902997016907,
0.0036535707768052816,
-0.06696110963821411,
-0.07260586321353912,
-0.02173181064426899,
0.08152561634778976,
0.040227364748716354,
0.07866420596837997,
-0.015849104151129723,
-0.07269757241010666,
0.14571432769298553,
-0.12734070420265198,
0.002186946338042617,
-0.030545812100172043,
-0.059065550565719604,
-0.005839272867888212,
0.071794793009758,
-0.027508946135640144,
-0.11932995915412903,
0.026676004752516747,
-0.052716486155986786,
0.020482582971453667,
-0.1540563553571701,
-0.12812383472919464,
-0.00861465372145176,
-0.07220545411109924,
-0.012286794371902943,
-0.10491123795509338,
-0.2530747950077057,
-0.06979898363351822,
0.07022582739591599,
0.009485149756073952,
-0.007154020946472883,
0.02351435273885727,
-0.03348420187830925,
-0.03898650407791138,
0.014978213235735893,
-0.08709031343460083,
-0.031735025346279144,
0.008570387959480286,
-0.07920144498348236,
0.0008326813112944365,
0.01956995576620102,
0.02138933725655079,
-0.08090556412935257,
0.027022816240787506,
-0.20672711730003357,
0.08800633996725082,
-0.021237721666693687,
-0.034018032252788544,
-0.10186343640089035,
-0.06093893572688103,
-0.03283996507525444,
0.04722926765680313,
0.03191657364368439,
0.11756186187267303,
-0.1802540272474289,
-0.011344081722199917,
0.17154666781425476,
-0.15848785638809204,
0.05115441977977753,
0.08338173478841782,
-0.04920851066708565,
0.13805879652500153,
0.08206811547279358,
0.09350790083408356,
0.22379052639007568,
-0.03595786541700363,
-0.027651537209749222,
-0.020465265959501266,
-0.10969821363687515,
0.0405023992061615,
0.002978342119604349,
-0.01832294650375843,
-0.012099024839699268,
0.016196241602301598,
-0.11610953509807587,
0.026080230250954628,
-0.008965028449892998,
-0.04370732977986336,
-0.0017772435676306486,
-0.024526840075850487,
-0.01214727945625782,
0.00023811864957679063,
-0.004562084097415209,
0.008880948647856712,
-0.10308537632226944,
0.04237796366214752,
0.13814325630664825,
-0.07129485160112381,
0.01603386178612709,
-0.06904754787683487,
0.01018030010163784,
-0.03337688371539116,
0.01467599906027317,
-0.20677469670772552,
-0.1420891135931015,
0.023402702063322067,
-0.10995883494615555,
0.049618691205978394,
0.018225405365228653,
0.057966120541095734,
0.04431980475783348,
-0.0366012379527092,
-0.033557966351509094,
-0.09360476583242416,
-0.01680784672498703,
-0.009367715567350388,
-0.08015147596597672,
-0.08395782858133316,
-0.05451664328575134,
0.10975892841815948,
-0.1959528774023056,
0.08853641897439957,
0.05862337350845337,
0.047079410403966904,
0.017755886539816856,
-0.10737164318561554,
0.0587114617228508,
0.0037176888436079025,
-0.035153213888406754,
-0.10595091432332993,
0.008783298544585705,
0.05639401078224182,
0.010455990210175514,
0.02519253082573414,
-0.12580126523971558,
-0.06336785852909088,
0.06732800602912903,
0.029541267082095146,
-0.1184963807463646,
-0.07328499853610992,
-0.037326544523239136,
-0.051184795796871185,
-0.04698048532009125,
-0.060465920716524124,
0.17123715579509735,
0.011602607555687428,
0.09836441278457642,
-0.08345343172550201,
-0.001544470782391727,
-0.03385031968355179,
-0.04253342002630234,
0.06374292820692062,
-0.01527367439121008,
0.1175517812371254,
-0.033696498721838,
0.08135022968053818,
-0.03566039726138115,
-0.02909179776906967,
0.13215170800685883,
0.03182803466916084,
-0.07598098367452621,
0.019940409809350967,
0.06881905347108841,
0.0358678475022316,
0.035708360373973846,
0.02773378975689411,
-0.0147831030189991,
0.01656031236052513,
0.009574424475431442,
0.06963171064853668,
-0.14984746277332306,
0.025033462792634964,
0.050551217049360275,
-0.03379854932427406,
-0.013634881004691124,
0.02196214720606804,
-0.031057698652148247,
0.019358303397893906,
-0.002265993971377611,
0.028361603617668152,
0.030726104974746704,
-0.034680306911468506,
-0.07937108725309372,
0.16440361738204956,
-0.13148915767669678,
-0.2852363586425781,
-0.18551428616046906,
0.06996386498212814,
-0.039771053940057755,
0.03190012276172638,
0.004497145768254995,
-0.05112852901220322,
-0.08209504187107086,
-0.027687696740031242,
0.05872926861047745,
-0.06153874099254608,
-0.07753138989210129,
-0.14374735951423645,
0.03491063788533211,
0.054706837981939316,
-0.15081243216991425,
0.010942891240119934,
0.08747688680887222,
-0.13805073499679565,
0.045729879289865494,
0.02517174556851387,
0.03284110501408577,
0.11457963287830353,
-0.03292421996593475,
-0.002307147951796651,
-0.01062388438731432,
0.2076716125011444,
-0.0719214454293251,
0.07846908271312714,
0.20206235349178314,
-0.028262034058570862,
0.06683845818042755,
0.018224207684397697,
0.022791916504502296,
-0.0747460424900055,
0.03433673456311226,
0.041362762451171875,
-0.06199267506599426,
-0.17523188889026642,
-0.06757217645645142,
-0.03630223497748375,
-0.044398024678230286,
0.09496498852968216,
0.03690990060567856,
0.09927523881196976,
0.07543078809976578,
-0.059031952172517776,
-0.0020028180442750454,
0.06794257462024689,
0.13604381680488586,
-0.0366845466196537,
-0.0008419682271778584,
0.07260558009147644,
-0.012260187417268753,
0.027081508189439774,
0.06206337362527847,
0.09668227285146713,
0.19804102182388306,
-0.0013632059562951326,
0.1605909764766693,
0.0841803103685379,
0.06410674750804901,
0.062162019312381744,
0.11633573472499847,
-0.013622192665934563,
0.009755924344062805,
0.019603459164500237,
-0.04994228109717369,
-0.042786553502082825,
0.06723562628030777,
-0.04839691147208214,
-0.07053609192371368,
-0.05757267400622368,
-0.05678892135620117,
0.004127498250454664,
0.1220860704779625,
-0.03125085309147835,
-0.2624552845954895,
-0.043688464909791946,
-0.04353483021259308,
-0.037506815046072006,
-0.09092392772436142,
0.003771695541217923,
0.12636201083660126,
-0.1422712355852127,
-0.06429637968540192,
-0.08612534403800964,
0.12733037769794464,
-0.09929974377155304,
-0.010647207498550415,
0.04866916313767433,
0.20373167097568512,
-0.026528391987085342,
0.10758952051401138,
-0.0870225727558136,
0.05444188416004181,
0.016783304512500763,
0.08159735798835754,
-0.038077760487794876,
0.0892554521560669,
0.05090086907148361,
0.0864168107509613,
0.08413170278072357,
0.010442960076034069,
-0.0590212456882,
-0.0688551515340805,
-0.056807342916727066,
0.0024555986747145653,
0.03881630301475525,
-0.10498562455177307,
0.0727965459227562,
-0.019505394622683525,
0.02617296390235424,
0.003830019850283861,
-0.08891434967517853,
-0.15499456226825714,
-0.1850118339061737,
0.0769445151090622,
-0.007426415104418993,
0.03551821410655975,
-0.10521166026592255,
0.0005527834873646498,
0.028607070446014404,
0.2965182960033417,
0.06755475699901581,
-0.10812784731388092,
-0.1295715719461441,
-0.027395304292440414,
0.11573619395494461,
-0.019439755007624626,
0.11250828206539154,
0.04498111084103584,
0.12397529929876328,
-0.022888753563165665,
-0.0830373466014862,
0.015048287808895111,
-0.11832033097743988,
-0.038893282413482666,
-0.026907537132501602,
0.09997878223657608,
0.08094628155231476,
-0.030165329575538635,
0.013938935473561287,
-0.03422551229596138,
-0.01299833320081234,
-0.09115391969680786,
-0.08033610880374908,
0.18510670959949493,
0.09031739085912704,
0.07199858874082565,
-0.07863689959049225,
-0.002264620503410697,
-0.06522099673748016,
0.016995809972286224,
0.03004845418035984,
0.1084076538681984,
-0.05544157326221466,
0.05211371183395386,
0.14551641047000885,
-0.04921402409672737,
-0.16436836123466492,
0.0695103332400322,
0.011790831573307514,
0.029796462506055832,
-0.019021466374397278,
-0.28649473190307617,
0.011566927656531334,
0.07384175807237625,
-0.01926037110388279,
0.13920137286186218,
-0.33859607577323914,
-0.08586602658033371,
0.023937268182635307,
0.06409240514039993,
-0.028711620718240738,
-0.0832940861582756,
-0.01588127203285694,
-0.031882353127002716,
-0.13637493550777435,
0.11447188258171082,
-0.07302737236022949,
0.08583422750234604,
-0.013482606038451195,
-0.00666744913905859,
0.05435347557067871,
-0.05644648149609566,
0.12029682099819183,
-0.007459865417331457,
0.06551793217658997,
-0.04788379743695259,
0.08593744039535522,
0.12157763540744781,
-0.049268681555986404,
0.13780491054058075,
-0.05793698504567146,
0.05963748320937157,
-0.1329418271780014,
-0.09672363102436066,
-0.06667108833789825,
0.0716249868273735,
-0.014769432134926319,
-0.1038515716791153,
-0.07300969213247299,
0.06516250222921371,
0.02559446170926094,
0.02452770434319973,
-0.16974349319934845,
-0.0473935529589653,
-0.005406726151704788,
0.1508367657661438,
0.10718774050474167,
0.04917566850781441,
-0.18171708285808563,
0.032526738941669464,
0.024426864460110664,
0.14121349155902863,
-0.08493395149707794,
0.011512069962918758,
0.08542437851428986,
0.0011591416550800204,
0.11465083807706833,
0.052563074976205826,
-0.11800707876682281,
-0.052795927971601486,
0.04902273416519165,
-0.12756755948066711,
-0.14054378867149353,
-0.06348059326410294,
0.053057052195072174,
-0.039308156818151474,
-0.011564112268388271,
0.09600749611854553,
-0.0777973011136055,
-0.029350459575653076,
0.0030490632634609938,
0.038628630340099335,
-0.005987202748656273,
0.13051314651966095,
-0.006921989377588034,
0.032634276896715164,
-0.0564153753221035,
0.15059657394886017,
0.05816395953297615,
-0.07165024429559708,
-0.029622556641697884,
0.0802474394440651,
-0.13239893317222595,
-0.06038547679781914,
-0.05311495065689087,
0.004143347032368183,
-0.04210377484560013,
-0.06510062515735626,
-0.00011955611262237653,
-0.018047520890831947,
-0.03971050679683685,
-0.011814010329544544,
0.03934382647275925,
0.05242914333939552,
-0.04941093921661377,
0.017682502046227455,
-0.1973084956407547,
0.0009958116570487618,
0.11822930723428726,
0.05730350688099861,
-0.1288137286901474,
0.145906463265419,
0.03247995302081108,
-0.05641869083046913,
-0.0016394556732848287,
-0.0065970756113529205,
-0.09480094164609909,
-0.0016384467016905546,
-0.04012085124850273,
-0.022214563563466072,
-0.08264322578907013,
-0.020031610503792763,
0.03870413079857826,
0.052615854889154434,
0.047043848782777786,
0.04764112830162048,
-0.05684557557106018,
-0.06856109946966171,
-0.04499254375696182,
0.050790935754776,
-0.058398980647325516,
0.003955602180212736,
0.03580040857195854,
-0.1052856370806694,
0.0866253674030304,
0.054308757185935974,
-0.0192415788769722,
0.01890038512647152,
-0.060131363570690155,
-0.004103487823158503,
0.04114197939634323,
0.016205815598368645,
-0.026252111420035362,
-0.12413373589515686,
0.04930141195654869,
-0.028938956558704376,
-0.04079984501004219,
-0.05029315873980522,
0.0006346033187583089,
-0.12856072187423706,
0.05037518963217735,
0.056967947632074356,
0.012345095165073872,
-0.06243005022406578,
0.02512451261281967,
0.15029890835285187,
0.08748460561037064,
0.1338673233985901,
-0.023768257349729538,
0.042128872126340866,
-0.12332470715045929,
-0.00917199905961752,
0.006025342270731926,
-0.014555190689861774,
0.07923571020364761,
-0.026790674775838852,
0.04655644670128822,
-0.03407461941242218,
0.09437015652656555,
0.003359035588800907,
-0.010814301669597626,
-0.03144378215074539,
-0.022946756333112717,
0.06105007603764534,
0.025354720652103424,
0.05665731430053711,
0.025343190878629684,
-0.02370433881878853,
-0.04512177035212517,
-0.00333619792945683,
-0.00018165000074077398,
-0.13033895194530487,
0.09508990496397018,
0.0962594524025917,
-0.03015371784567833,
0.0551675483584404,
-0.0060743289068341255,
-0.09477587789297104,
-0.10518501698970795,
0.0031772449146956205,
-0.011656361632049084,
0.047512754797935486,
-0.09197711944580078,
0.06309688091278076,
0.18859460949897766,
-0.08814536780118942,
0.1467161625623703,
0.05004866048693657,
-0.07670155167579651,
-0.0786866545677185,
-0.22100719809532166,
-0.012213087640702724,
-0.04929851368069649,
-0.015532064251601696,
-0.07592516392469406,
0.03486744686961174,
0.03272336348891258,
0.017543498426675797,
-0.06875038892030716,
0.23304176330566406,
-0.015774240717291832,
-0.11850442737340927,
0.019389558583498,
-0.02249687723815441,
0.03707537800073624,
0.0009860643185675144,
-0.016551148146390915,
0.04533408209681511,
0.01000975538045168,
0.06701969355344772,
-0.002607702976092696,
0.10035475343465805,
0.03008197620511055,
0.0008544038282707334,
-0.004893858451396227,
-0.016175972297787666,
0.005981822498142719,
0.006442752201110125,
0.0976615771651268,
0.037910331040620804,
-0.07571595907211304,
0.0028384949546307325,
0.18847650289535522,
-0.027830109000205994,
-0.07225944101810455,
-0.1333760917186737,
0.19731751084327698,
0.02632393315434456,
-0.036278072744607925,
0.02017340250313282,
-0.10198608040809631,
0.040958650410175323,
0.19275012612342834,
0.170908123254776,
0.011499331332743168,
-0.0021859249100089073,
-0.011352592147886753,
-0.012654168531298637,
-0.03270167112350464,
0.13334321975708008,
0.00023847134434618056,
0.1461162567138672,
-0.0023058836814016104,
0.046657223254442215,
0.004148395732045174,
-0.004600024316459894,
-0.04583552107214928,
0.010936646722257137,
-0.017339402809739113,
-0.03129572421312332,
-0.03191699832677841,
0.047798629850149155,
0.015742039307951927,
-0.17195837199687958,
0.03183934465050697,
0.03379060700535774,
-0.05779004842042923,
-0.01810147427022457,
-0.031717702746391296,
-0.060792215168476105,
0.0977584645152092,
-0.05006415769457817,
-0.005141211207956076,
0.10657010972499847,
0.004667059052735567,
-0.05459766462445259,
-0.11779209226369858,
0.05927843227982521,
-0.014335721731185913,
0.19981229305267334,
-0.008604307658970356,
0.03059377521276474,
0.04901024326682091,
0.013650449924170971,
-0.16793106496334076,
0.03702959790825844,
-0.026343952864408493,
-0.03229618817567825,
0.0020358473993837833,
0.17085900902748108,
-0.0384133905172348,
0.04677501320838928,
-0.04339514300227165,
-0.09589935094118118,
0.012842962518334389,
0.028604084625840187,
0.034712184220552444,
-0.10436800122261047,
0.03496924042701721,
-0.0707748532295227,
0.12397997826337814,
0.1552666425704956,
-0.007976667955517769,
0.02660646103322506,
-0.060368359088897705,
0.07357638329267502,
0.07679364085197449,
0.08596525341272354,
-0.03837449848651886,
-0.1083882749080658,
-0.0022122797090560198,
-0.05479736998677254,
0.019346868619322777,
-0.09944578260183334,
-0.05210141837596893,
0.02784612588584423,
-0.04426431655883789,
-0.061507243663072586,
0.10954881459474564,
0.04322576895356178,
0.0882253423333168,
-0.002986007137224078,
-0.017034925520420074,
-0.026380671188235283,
0.0865168645977974,
-0.16669036448001862,
-0.08992248773574829
] |
null | null |
transformers
|
# GPT-Neo-125M-Code-Clippy
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-Neo-125M-Code-Clippy is a [GPT-Neo-125M model](https://huggingface.co/EleutherAI/gpt-neo-125M) finetuned using causal language modeling on our version of the Code Clippy Data dataset that has duplicates, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's [Codex paper](https://arxiv.org/abs/2107.03374), we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens `["\t\t", " ", " ", " "]` and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the `\t` token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/utilities/add_new_tokens.py).
## Training data
[Code Clippy Data dataset](https://the-eye.eu/public/AI/training_data/code_clippy_data/code_clippy_dedup_data/).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_streaming_flax.py).
To reproduce the training one can use this command with the above script:
```bash
./run_clm_streaming_flax.py \
--output_dir $HOME/gpt-neo-125M-code-clippy \
--model_name_or_path="flax-community/gpt-neo-125M-code-clippy" \
--dataset_name $HOME/gpt-code-clippy/data_processing/code_clippy.py \
--data_dir /home/shared/code_clippy_data \
--text_column_name="text" \
--do_train --do_eval \
--block_size="2048" \
--per_device_train_batch_size="8" \
--per_device_eval_batch_size="16" \
--preprocessing_num_workers="8" \
--learning_rate="1e-4" \
--max_steps 100000 \
--warmup_steps 2500 \
--decay_steps 25000 \
--adam_beta1="0.9" \
--adam_beta2="0.95" \
--weight_decay="0.1" \
--overwrite_output_dir \
--logging_steps="100" \
--eval_steps="500" \
--push_to_hub="False" \
--report_to="all" \
--dtype="bfloat16" \
--skip_memory_metrics="True" \
--save_steps="500" \
--save_total_limit 10 \
--gradient_accumulation_steps 16 \
--report_to="wandb" \
--run_name="125m_1e-4lr_1024bs" \
--max_eval_samples 2000 \
--save_optimizer true
```
## Intended Use and Limitations
The model is finetuned on text files from github repositories (mostly programming languages but also markdown and other project related files).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy")
prompt = """def greet(name):
'''A function to greet user. Given a user name it should say hello'''
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper ["Evaluating Large Language Models Trained on Code"](https://arxiv.org/abs/2107.03374) from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. **As well as some differences in views from the paper, particularly around legal implications**.
1. **Over-reliance:** This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. **Economic and labor market impacts:** Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from [O*NET OnLine](https://www.onetonline.org/link/summary/15-1252.00), developers don't just write software.
3. **Security implications:** No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. **Legal implications:** No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. **Biases:** The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{}
|
text-generation
|
flax-community/gpt-neo-125M-code-clippy
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"arxiv:2107.03374",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.03374"
] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# GPT-Neo-125M-Code-Clippy
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-Neo-125M-Code-Clippy is a GPT-Neo-125M model finetuned using causal language modeling on our version of the Code Clippy Data dataset that has duplicates, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's Codex paper, we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens '["\t\t", " ", " ", " "]' and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the '\t' token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found here.
## Training data
Code Clippy Data dataset.
## Training procedure
The training script used to train this model can be found here.
To reproduce the training one can use this command with the above script:
## Intended Use and Limitations
The model is finetuned on text files from github repositories (mostly programming languages but also markdown and other project related files).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
The paper "Evaluating Large Language Models Trained on Code" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.
1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.
2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.
3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.
4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under "fair use." However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.
5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.
GPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Neo-125M-Code-Clippy\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-Code-Clippy is a GPT-Neo-125M model finetuned using causal language modeling on our version of the Code Clippy Data dataset that has duplicates, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's Codex paper, we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens '[\"\\t\\t\", \" \", \" \", \" \"]' and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the '\\t' token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found here.",
"## Training data\n\nCode Clippy Data dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTo reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned on text files from github repositories (mostly programming languages but also markdown and other project related files).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.\n4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under \"fair use.\" However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.\n5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.\n\nGPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# GPT-Neo-125M-Code-Clippy\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-Neo-125M-Code-Clippy is a GPT-Neo-125M model finetuned using causal language modeling on our version of the Code Clippy Data dataset that has duplicates, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's Codex paper, we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens '[\"\\t\\t\", \" \", \" \", \" \"]' and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the '\\t' token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found here.",
"## Training data\n\nCode Clippy Data dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.\n\nTo reproduce the training one can use this command with the above script:",
"## Intended Use and Limitations\n\nThe model is finetuned on text files from github repositories (mostly programming languages but also markdown and other project related files).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nThe paper \"Evaluating Large Language Models Trained on Code\" from OpenAI has a good discussion on what the impact of a large language model trained on code could be. Therefore, some parts of their discuss are highlighted here as it pertains to this dataset and models that may be trained from it. As well as some differences in views from the paper, particularly around legal implications.\n\n1. Over-reliance: This model may generate plausible solutions that may appear correct, but are not necessarily the correct solution. Not properly evaluating the generated code may cause have negative consequences such as the introduction of bugs, or the introduction of security vulnerabilities. Therefore, it is important that users are aware of the limitations and potential negative consequences of using this language model.\n2. Economic and labor market impacts: Large language models trained on large code datasets such as this one that are capable of generating high-quality code have the potential to automate part of the software development process. This may negatively impact software developers. However, as discussed in the paper, as shown in the Summary Report of software developers from O*NET OnLine, developers don't just write software.\n3. Security implications: No filtering or checking of vulnerabilities or buggy code was performed on the datase this model is trained on. This means that the dataset may contain code that may be malicious or contain vulnerabilities. Therefore, this model may generate vulnerable, buggy, or malicious code. In safety critical software, this could lead to software that may work improperly and could result in serious consequences depending on the software. Additionally, this model may be able to be used to generate malicious code on purpose in order to perform ransomware or other such attacks.\n4. Legal implications: No filtering was performed on licensed code. This means that the dataset may contain restrictive licensed code. As discussed in the paper, public Github repositories may fall under \"fair use.\" However, there has been little to no previous cases of such usages of licensed publicly available code. Therefore, any code generated with this model may be required to obey license terms that align with the software it was trained on such as GPL-3.0. It is unclear the legal ramifications of using a language model trained on this dataset.\n5. Biases: The programming languages most represented in the dataset this model was trained on are Javascript and Python. Therefore, other, still popular languages such as C and C++, are less represented and therefore the models performance for these languages will be less comparatively. Additionally, this dataset only contains public repositories and so the model may not generate code that is representative of code written by private developers. No filtering was performed for potential racist, offensive, or otherwise inappropriate content. Therefore, this model may reflect such biases in its generation.\n\nGPT-Neo-125M-Code-Clippy is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
58,
43,
223,
10,
31,
41,
35,
726,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #arxiv-2107.03374 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# GPT-Neo-125M-Code-Clippy\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-Neo-125M-Code-Clippy is a GPT-Neo-125M model finetuned using causal language modeling on our version of the Code Clippy Data dataset that has duplicates, which was scraped from public Github repositories (more information in the provided link). This model is specialized to autocomplete methods in multiple programming languages. As discussed in OpenAI's Codex paper, we modified the GPT-Neo model and tokenizer to accommodate for additional whitespace characters. Specifically, we add the following tokens '[\"\\t\\t\", \" \", \" \", \" \"]' and since they are all related to indentation, we initialize the embedding layer of these tokens with the same weights as the '\\t' token already present in the model in hopes the model will learn to associate these whitespace characters with indentation faster. A script to automatically do this can be found here.## Training data\n\nCode Clippy Data dataset.## Training procedure\n\nThe training script used to train this model can be found here.\n\nTo reproduce the training one can use this command with the above script:## Intended Use and Limitations\n\nThe model is finetuned on text files from github repositories (mostly programming languages but also markdown and other project related files).### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:"
] |
[
-0.06569687277078629,
0.21674630045890808,
-0.0014620973961427808,
0.03509154170751572,
0.13752949237823486,
0.018955370411276817,
0.08012521266937256,
0.1495629847049713,
0.006778752896934748,
0.07964575290679932,
0.01805850677192211,
0.03765040263533592,
0.05122813954949379,
0.16536203026771545,
0.11718559265136719,
-0.2612653076648712,
0.040437500923871994,
-0.01899699680507183,
0.1311962902545929,
0.07124315947294235,
0.0649818554520607,
-0.062441688030958176,
0.08078561723232269,
0.03796538710594177,
-0.12497590482234955,
0.018597381189465523,
-0.04940386489033699,
-0.06627952307462692,
0.09557362645864487,
0.037327397614717484,
0.05101309344172478,
0.01217809971421957,
0.035098012536764145,
-0.18949760496616364,
0.020528065040707588,
0.12421725690364838,
0.017630496993660927,
0.06413845717906952,
0.0941636711359024,
-0.04969121888279915,
0.11122029274702072,
-0.055138200521469116,
0.041949693113565445,
0.053524427115917206,
-0.10746853798627853,
-0.09998778253793716,
-0.08436477184295654,
0.01577485166490078,
0.04771123453974724,
0.015116667374968529,
0.00021718701464124024,
0.12502513825893402,
0.028456466272473335,
0.05144006758928299,
0.20556330680847168,
-0.22599837183952332,
0.0029154440853744745,
0.1402510702610016,
0.061843376606702805,
-0.006546820513904095,
-0.07383807003498077,
0.004745068494230509,
0.009972140192985535,
0.017333438619971275,
0.05609728768467903,
-0.025537297129631042,
-0.04135262593626976,
-0.021114353090524673,
-0.15364544093608856,
-0.06519865989685059,
0.1150829866528511,
-0.06021386384963989,
-0.07927121222019196,
-0.17558637261390686,
-0.07159479707479477,
-0.03404698148369789,
-0.010259891860187054,
-0.020218802616000175,
0.005875843111425638,
0.01934300921857357,
0.03153900429606438,
-0.16009104251861572,
-0.0703577995300293,
-0.021649492904543877,
0.023754803463816643,
0.09833409637212753,
0.04628946632146835,
0.012676297686994076,
-0.04023980349302292,
0.14462120831012726,
-0.1292346864938736,
-0.06093258038163185,
-0.04188969358801842,
-0.02850702591240406,
-0.06940370053052902,
-0.04948107525706291,
-0.0510086752474308,
-0.16410048305988312,
0.017878130078315735,
0.2203008383512497,
0.037259507924318314,
0.062466658651828766,
-0.08455289155244827,
0.04718988761305809,
0.030271681025624275,
0.09454565495252609,
-0.08633722364902496,
0.002626038622111082,
0.037411563098430634,
0.008935113437473774,
0.005501253996044397,
-0.03730130195617676,
-0.07790255546569824,
-0.04632091522216797,
0.07578752934932709,
0.060710929334163666,
0.05028063803911209,
0.1044609546661377,
-0.009532617405056953,
-0.046339746564626694,
0.19271144270896912,
-0.14513368904590607,
0.033812277019023895,
-0.03468422219157219,
-0.029760470613837242,
0.01024861540645361,
0.1145256906747818,
0.01772451400756836,
-0.08683671057224274,
0.04219513386487961,
-0.09502632170915604,
0.033500637859106064,
-0.11954574286937714,
-0.08763090521097183,
0.044920776039361954,
-0.06668444722890854,
-0.04597678780555725,
-0.07730434089899063,
-0.18490348756313324,
-0.01007730234414339,
0.0254847202450037,
0.005014237482100725,
0.033455606549978256,
-0.00839814729988575,
-0.015924200415611267,
-0.016950996592640877,
0.0011389580322429538,
-0.10467730462551117,
-0.02179485186934471,
-0.003291724482551217,
-0.029695775359869003,
0.0857667550444603,
0.0024012157227844,
-0.009289245121181011,
-0.10028902441263199,
0.04911443963646889,
-0.2969116270542145,
0.09481513500213623,
-0.0184660442173481,
-0.048781927675008774,
-0.12937071919441223,
-0.02154417894780636,
-0.02185879833996296,
-0.01423739455640316,
0.04173808544874191,
0.1021781638264656,
-0.1627303510904312,
0.015284566208720207,
0.2174791395664215,
-0.1362811028957367,
0.0016341488808393478,
0.10881621390581131,
-0.024069231003522873,
0.03587854653596878,
0.03338959813117981,
0.1688770204782486,
0.1488288789987564,
-0.11350470036268234,
-0.07667652517557144,
-0.0070665753446519375,
-0.05865710228681564,
0.030029062181711197,
0.022337913513183594,
-0.05011986941099167,
0.05578232556581497,
0.024143744260072708,
-0.006567625794559717,
0.018732190132141113,
0.006868758238852024,
-0.018921108916401863,
-0.025057103484869003,
-0.006354774814099073,
0.04016890749335289,
-0.03573658689856529,
-0.031206030398607254,
0.006910587660968304,
-0.08167625963687897,
0.07792691141366959,
0.10798660665750504,
-0.0942215770483017,
0.0669265165925026,
-0.08917570859193802,
0.06032460927963257,
-0.10991477966308594,
-0.021266808733344078,
-0.16418269276618958,
-0.12132389843463898,
0.06292306631803513,
-0.11384008079767227,
0.025726642459630966,
-0.016736837103962898,
0.03389747068285942,
0.10799646377563477,
0.00878919381648302,
-0.0033794217742979527,
0.02493329904973507,
-0.015238485299050808,
-0.028422806411981583,
-0.060353465378284454,
-0.09955157339572906,
-0.06768261641263962,
0.10125871002674103,
-0.10657507926225662,
0.0683971643447876,
0.06749143451452255,
0.14584453403949738,
0.018650729209184647,
-0.10340266674757004,
0.09136608988046646,
-0.01784275658428669,
-0.027629461139440536,
-0.08444807678461075,
0.02223542332649231,
0.031625885516405106,
-0.06038779765367508,
0.12537765502929688,
-0.20987462997436523,
-0.08539973199367523,
0.08225702494382858,
0.044286251068115234,
-0.06155947968363762,
-0.09623609483242035,
-0.02611987106502056,
-0.05960378423333168,
-0.07805514335632324,
-0.049945633858442307,
0.1853942573070526,
0.04378433898091316,
0.11092071980237961,
-0.09490525722503662,
-0.009834826923906803,
0.016778824850916862,
-0.05028209462761879,
-0.02921971306204796,
-0.029728690162301064,
0.06261830031871796,
-0.01115446351468563,
0.03335948660969734,
-0.02808181755244732,
0.00368654471822083,
0.16532543301582336,
0.06172040477395058,
-0.06398452818393707,
-0.021780923008918762,
0.03454744443297386,
0.02384101413190365,
0.006444548722356558,
-0.012631386518478394,
-0.005202493630349636,
0.03625830262899399,
0.031186239793896675,
0.06971704214811325,
-0.09614817798137665,
0.034717682749032974,
0.018589403480291367,
-0.029973657801747322,
0.06715802848339081,
0.06479675322771072,
-0.020951006561517715,
0.02030068449676037,
-0.02889999747276306,
0.07408805936574936,
0.008800754323601723,
-0.022284146398305893,
-0.08787548542022705,
0.11562848091125488,
-0.1101149246096611,
-0.23200726509094238,
-0.19293689727783203,
0.0022795379627496004,
-0.08229239284992218,
0.009259523823857307,
0.03397303447127342,
0.012858529575169086,
-0.03589000552892685,
-0.113874152302742,
0.011488373391330242,
-0.030160916969180107,
-0.026667386293411255,
-0.09415584057569504,
-0.03139881044626236,
0.05153295025229454,
-0.14838998019695282,
-0.011506070382893085,
0.06092458218336105,
-0.1330394446849823,
0.026365626603364944,
0.040282558649778366,
0.10592726618051529,
0.08108975738286972,
-0.048015184700489044,
0.007533491589128971,
-0.02573942206799984,
0.17816908657550812,
-0.0628868117928505,
0.0890946313738823,
0.19394031167030334,
-0.04912099614739418,
0.11058836430311203,
0.021821413189172745,
0.013627161271870136,
-0.06919411569833755,
0.01381100807338953,
0.018498247489333153,
-0.05413636192679405,
-0.10589274019002914,
-0.08210774511098862,
-0.0378187894821167,
0.04639812931418419,
0.13074740767478943,
0.025447005406022072,
0.06551604717969894,
0.03416379913687706,
-0.08035707473754883,
0.07175827026367188,
-0.03750669211149216,
0.15175417065620422,
0.024501284584403038,
0.018991487100720406,
0.05371588468551636,
-0.02284649945795536,
-0.013329317793250084,
0.04466259479522705,
0.09033657610416412,
0.1285577118396759,
-0.050572846084833145,
0.14016093313694,
0.039509210735559464,
0.09660439193248749,
0.06420028954744339,
0.09161056578159332,
-0.07355047017335892,
0.04236737638711929,
-0.019859446212649345,
-0.08502837270498276,
-0.0339941568672657,
0.04172888025641441,
-0.018080512061715126,
-0.03740480914711952,
-0.013605975545942783,
-0.017972707748413086,
0.057011138647794724,
0.0559459924697876,
0.029497351497411728,
-0.231520414352417,
0.0020046934951096773,
-0.012163556180894375,
-0.014214874245226383,
-0.06712651252746582,
0.00265745772048831,
0.1192455142736435,
-0.12957406044006348,
0.0364338755607605,
-0.0779779776930809,
0.06258808076381683,
-0.14084115624427795,
0.013240666128695011,
-0.016712814569473267,
0.16489365696907043,
-0.034376323223114014,
0.12412040680646896,
-0.12842611968517303,
0.06060667708516121,
0.04180636629462242,
0.03907061368227005,
-0.09839214384555817,
0.02085421420633793,
0.01175023801624775,
0.0328960083425045,
0.09633799642324448,
-0.002264030510559678,
-0.056745219975709915,
-0.055057380348443985,
-0.0919036865234375,
0.0037033006083220243,
0.07398535311222076,
-0.06270058453083038,
0.09424388408660889,
0.016416018828749657,
-0.00249082176014781,
-0.03849523887038231,
-0.019174080342054367,
-0.11981873214244843,
-0.20912505686283112,
0.08947796374559402,
0.0625624880194664,
0.0167921744287014,
-0.0655052438378334,
-0.018715711310505867,
0.015177338384091854,
0.10594143718481064,
0.03470553830265999,
-0.058223094791173935,
-0.11838401854038239,
0.009023918770253658,
0.1694009155035019,
-0.05953250452876091,
0.03271804004907608,
-0.0034713000059127808,
0.16801515221595764,
-0.09637441486120224,
-0.0589558407664299,
0.021149059757590294,
-0.08272899687290192,
-0.1057741641998291,
-0.03640378266572952,
0.07752345502376556,
0.004906208720058203,
0.005419344175606966,
0.021342124789953232,
0.036809783428907394,
0.05003554746508598,
-0.10186653584241867,
-0.030828669667243958,
0.11673237383365631,
0.09416079521179199,
0.05047883465886116,
-0.0638146623969078,
0.04886826500296593,
-0.06854293495416641,
0.0192890428006649,
0.12794183194637299,
0.17164433002471924,
-0.0760142132639885,
0.10807674378156662,
0.10551231354475021,
-0.12812618911266327,
-0.2110183984041214,
-0.0008532936917617917,
0.008270373567938805,
-0.017372487112879753,
0.02716321125626564,
-0.22357423603534698,
0.0828465074300766,
0.03703330084681511,
-0.01444820873439312,
0.04933471232652664,
-0.3101223409175873,
-0.11246045678853989,
-0.0011410968145355582,
-0.005197454709559679,
0.002264089183881879,
-0.07625951617956161,
-0.027280036360025406,
-0.01223420538008213,
-0.07659797370433807,
0.03813380375504494,
-0.12178549915552139,
0.07882867753505707,
-0.009150570258498192,
0.05213009566068649,
0.07952188700437546,
-0.07651473581790924,
0.07603009790182114,
0.06303968280553818,
0.03731374070048332,
-0.06802023947238922,
0.06109781935811043,
0.032001614570617676,
-0.05379327014088631,
0.1879056841135025,
-0.04645772650837898,
0.013585214503109455,
-0.08959563821554184,
-0.024906335398554802,
-0.04622127488255501,
0.09021376073360443,
-0.009662681259214878,
-0.08443311601877213,
-0.04238816723227501,
0.04774417355656624,
0.06273699551820755,
0.018119286745786667,
-0.03996160253882408,
-0.016919786110520363,
0.052217815071344376,
0.17365486919879913,
0.07126225531101227,
-0.04251612350344658,
-0.1048777773976326,
0.04262211546301842,
-0.004073142074048519,
0.056827057152986526,
-0.09275226294994354,
0.021013155579566956,
0.051027052104473114,
-0.008381596766412258,
0.10944066941738129,
0.045025646686553955,
-0.1883883774280548,
0.01571807637810707,
0.07317846268415451,
-0.20012211799621582,
-0.1216522827744484,
-0.04775971919298172,
0.08524783700704575,
-0.08360973745584488,
0.014149325899779797,
0.1544627845287323,
-0.08375755697488785,
-0.03675726056098938,
-0.0013463414506986737,
0.05141091346740723,
-0.010139108635485172,
0.05781184881925583,
0.046460945159196854,
0.03145943582057953,
-0.044031355530023575,
0.11614648997783661,
0.10500844568014145,
-0.10216765850782394,
0.038401030004024506,
0.06710676103830338,
-0.09706149995326996,
-0.05808413773775101,
-0.07929273694753647,
0.13918599486351013,
-0.027942614629864693,
-0.037902723997831345,
-0.029393915086984634,
0.1222955584526062,
-0.027115676552057266,
0.10757262259721756,
0.0393982008099556,
0.019283747300505638,
-0.025588421151041985,
-0.013658120296895504,
-0.1710585206747055,
0.039756279438734055,
0.03330167382955551,
0.07639013975858688,
-0.07738934457302094,
0.18947437405586243,
0.012532556429505348,
0.05045284703373909,
-0.011400515213608742,
-0.053399600088596344,
-0.09520947933197021,
-0.049530029296875,
0.04522673413157463,
-0.027228116989135742,
-0.09733450412750244,
-0.013961000367999077,
0.02389138750731945,
0.02808338776230812,
0.025991814211010933,
0.02437884546816349,
-0.04748072475194931,
-0.06005929037928581,
-0.02854250557720661,
0.07072769105434418,
-0.11656444519758224,
0.01337397564202547,
0.0662536472082138,
-0.06574780493974686,
0.078330859541893,
-0.028369611129164696,
-0.033981889486312866,
0.02435143105685711,
-0.011042391881346703,
0.0011984070297330618,
-0.0017259574960917234,
-0.014080271124839783,
-0.004470150452107191,
-0.019641170278191566,
0.0007914848392829299,
-0.034141551703214645,
0.021799147129058838,
-0.03104155696928501,
0.09011547267436981,
-0.11802191287279129,
0.05461994186043739,
0.04545031115412712,
0.009143318049609661,
-0.08345220983028412,
0.04330277442932129,
0.08022846281528473,
0.057797789573669434,
0.08308561891317368,
-0.05983883887529373,
0.0489274263381958,
-0.12915492057800293,
-0.023198571056127548,
-0.0008973985095508397,
-0.017726361751556396,
0.11552829295396805,
-0.01880522631108761,
0.03733412176370621,
-0.020335230976343155,
0.1816626936197281,
0.021906349807977676,
0.056934747844934464,
-0.010780666954815388,
-0.01619602181017399,
-0.08476758748292923,
-0.006936233956366777,
0.01378999650478363,
0.0030193845741450787,
-0.012400696985423565,
0.00016006987425498664,
0.0529303140938282,
-0.001117188367061317,
0.03467019647359848,
0.01907389797270298,
0.022383226081728935,
0.03140730783343315,
0.052707672119140625,
-0.004579461645334959,
-0.058511726558208466,
-0.10745424777269363,
-0.01896635815501213,
0.02216816507279873,
0.10290683805942535,
-0.06504108756780624,
0.05976971611380577,
0.14293883740901947,
-0.07931045442819595,
0.0907914862036705,
0.07513739168643951,
-0.06011497229337692,
-0.10809829086065292,
-0.19632907211780548,
-0.01901564933359623,
-0.08743755519390106,
0.0018248682608827949,
-0.1008673831820488,
0.10108473151922226,
0.04428398981690407,
-0.012305187992751598,
0.00508930953219533,
0.17935511469841003,
-0.04716203734278679,
-0.07255522161722183,
-0.007038768380880356,
-0.026913145557045937,
0.09275053441524506,
0.06734441220760345,
0.015181307680904865,
0.03415518254041672,
0.03532039746642113,
0.0928228572010994,
0.02679322101175785,
0.15778765082359314,
0.06803629547357559,
-0.07393299788236618,
-0.08507578819990158,
-0.013738960027694702,
-0.02552865259349346,
-0.0041981106624007225,
0.06326903402805328,
0.06257664412260056,
-0.060480065643787384,
-0.05289412662386894,
0.15164948999881744,
-0.018660789355635643,
-0.07816009223461151,
-0.13104891777038574,
0.1258128136396408,
0.07987011969089508,
-0.02040230855345726,
-0.06089646369218826,
-0.10951978713274002,
0.005475977901369333,
0.1603330820798874,
0.1905948966741562,
0.016778353601694107,
0.008293011225759983,
0.0004434384754858911,
-0.005552416201680899,
0.049064088612794876,
0.08513007313013077,
-0.009987039491534233,
0.11329051107168198,
-0.03584853187203407,
0.057067614048719406,
-0.027740994468331337,
-0.016848010942339897,
-0.0997251644730568,
0.06097051873803139,
-0.077511727809906,
0.011220970191061497,
-0.01648782566189766,
0.04470726475119591,
-0.05879908800125122,
-0.20142337679862976,
-0.07226256281137466,
-0.06811319291591644,
-0.04697539284825325,
0.018870655447244644,
0.033565886318683624,
-0.022557873278856277,
0.09537297487258911,
-0.005709934048354626,
-0.020609434694051743,
0.1394187957048416,
0.0028120737988501787,
-0.12340591102838516,
-0.03561731055378914,
-0.00006829547783127055,
-0.03734530135989189,
0.19238325953483582,
0.00015947774227242917,
0.018558500334620476,
0.08103687316179276,
0.003468765178695321,
-0.14760684967041016,
-0.009981815703213215,
-0.020239371806383133,
0.018573284149169922,
-0.005132613237947226,
0.1549345999956131,
-0.005928984377533197,
0.027103235945105553,
-0.019921693950891495,
-0.12461718171834946,
0.02525847591459751,
-0.059998899698257446,
0.0556548535823822,
-0.09209423512220383,
0.005631574895232916,
-0.033556800335645676,
0.1668078899383545,
0.1068873256444931,
-0.014916777610778809,
-0.0003593598084989935,
-0.03503192216157913,
-0.021792037412524223,
-0.005461744032800198,
0.030963126569986343,
-0.04665347561240196,
-0.10450030863285065,
-0.025418171659111977,
-0.06342717260122299,
0.03133546561002731,
-0.1730055809020996,
-0.01707134023308754,
0.0692024752497673,
-0.050057172775268555,
-0.055418457835912704,
0.1009395644068718,
0.09471685439348221,
0.019200488924980164,
-0.024140119552612305,
-0.04700537398457527,
0.01732644811272621,
0.07396018505096436,
-0.16590942442417145,
-0.10580501705408096
] |
null | null |
transformers
|
# GPT-Code-Clippy-125M-Code-Search-All
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-CC-125M-Code-Search is a [GPT-Neo-125M model](https://huggingface.co/EleutherAI/gpt-neo-125M) finetuned using causal language modeling on all languages in the [CodeSearchNet Challenge dataset](https://huggingface.co/datasets/code_search_net). This model is specialized to autocomplete methods in multiple programming languages.
## Training data
[CodeSearchNet Challenge dataset](https://huggingface.co/datasets/code_search_net).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_flax.py).
```bash
./run_clm_flax.py \
--output_dir $HOME/gpt-neo-125M-code-search-all \
--model_name_or_path="EleutherAI/gpt-neo-125M" \
--dataset_name code_search_net \
--dataset_config_name="all" \
--do_train --do_eval \
--block_size="512" \
--per_device_train_batch_size="32" \
--per_device_eval_batch_size="64" \
--preprocessing_num_workers="8" \
--learning_rate="1.2e-4" \
--num_train_epochs 20 \
--warmup_steps 3000 \
--adam_beta1="0.9" \
--adam_beta2="0.95" \
--weight_decay="0.1" \
--overwrite_output_dir \
--logging_steps="25" \
--eval_steps="500" \
--push_to_hub="False" \
--report_to="all" \
--dtype="bfloat16" \
--skip_memory_metrics="True" \
--save_steps="500" \
--save_total_limit 10 \
--report_to="wandb" \
--run_name="gpt-neo-125M-code-search-all"
```
## Intended Use and Limitations
The model is finetuned methods from several languages and is intended to autocomplete methods given some prompt (method signature and docstring).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy-code-search-all")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy-code-search-all")
prompt = """def greet(name):
'''A function to greet user. Given a user name it should say hello'''
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{}
|
text-generation
|
flax-community/gpt-neo-125M-code-search-all
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Code-Clippy-125M-Code-Search-All
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on all languages in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in multiple programming languages.
## Training data
CodeSearchNet Challenge dataset.
## Training procedure
The training script used to train this model can be found here.
## Intended Use and Limitations
The model is finetuned methods from several languages and is intended to autocomplete methods given some prompt (method signature and docstring).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Code-Clippy-125M-Code-Search-All\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on all languages in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in multiple programming languages.",
"## Training data\n\nCodeSearchNet Challenge dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned methods from several languages and is intended to autocomplete methods given some prompt (method signature and docstring).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Code-Clippy-125M-Code-Search-All\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on all languages in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in multiple programming languages.",
"## Training data\n\nCodeSearchNet Challenge dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned methods from several languages and is intended to autocomplete methods given some prompt (method signature and docstring).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
46,
46,
64,
10,
16,
38,
35,
72,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Code-Clippy-125M-Code-Search-All\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on all languages in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in multiple programming languages.## Training data\n\nCodeSearchNet Challenge dataset.## Training procedure\n\nThe training script used to train this model can be found here.## Intended Use and Limitations\n\nThe model is finetuned methods from several languages and is intended to autocomplete methods given some prompt (method signature and docstring).### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.## Eval results\n\nComing soon..."
] |
[
-0.0685521811246872,
0.14733454585075378,
-0.0032703534234315157,
0.05083689093589783,
0.06839339435100555,
-0.02410905435681343,
0.059964608401060104,
0.0561966747045517,
-0.0014055503997951746,
0.0985916331410408,
0.061063867062330246,
0.02040153555572033,
0.06853790581226349,
0.1449713408946991,
0.06986302137374878,
-0.20408855378627777,
0.039929307997226715,
-0.0338405966758728,
0.10129807889461517,
0.11838897317647934,
0.08641281723976135,
-0.038955044001340866,
0.08289537578821182,
0.05351719632744789,
-0.1365494728088379,
-0.023099862039089203,
0.030101513490080833,
-0.03837955743074417,
0.06413013488054276,
0.04314243048429489,
0.07132956385612488,
0.015162983909249306,
0.025910140946507454,
-0.1162116527557373,
0.019958054646849632,
0.08414340764284134,
-0.012402615509927273,
0.05184588581323624,
0.05480355769395828,
-0.041511088609695435,
0.1265851855278015,
0.0030801568645983934,
0.05454586073756218,
0.07112810760736465,
-0.12377617508172989,
0.04263186454772949,
-0.09606602042913437,
0.0023568684700876474,
0.030966827645897865,
0.1413368135690689,
-0.0016568180872127414,
0.1414281576871872,
-0.06738052517175674,
0.03904597461223602,
0.07929176837205887,
-0.11608713865280151,
-0.027851330116391182,
0.09326166659593582,
0.06365624070167542,
0.06953775882720947,
-0.03680065646767616,
0.012192179448902607,
-0.01950729638338089,
0.029774019494652748,
0.10659275949001312,
-0.05167340859770775,
-0.05036232993006706,
-0.02819143794476986,
-0.15684838593006134,
-0.08282693475484848,
0.11971922218799591,
-0.02865116484463215,
-0.0816916674375534,
-0.10734600573778152,
-0.07327892631292343,
-0.028289375826716423,
0.002710167085751891,
-0.03128032386302948,
0.008286907337605953,
0.03476085886359215,
0.0652182400226593,
-0.11326327174901962,
-0.1022629514336586,
-0.11288221180438995,
-0.016480030491948128,
0.0783982127904892,
0.02167915366590023,
0.05223669111728668,
-0.05784858390688896,
0.14991383254528046,
-0.1080809235572815,
-0.03953872248530388,
-0.08789730817079544,
-0.0571415089070797,
-0.06317197531461716,
-0.014025639742612839,
-0.02547386847436428,
-0.07585371285676956,
0.013650397770106792,
0.1993400901556015,
0.023808568716049194,
0.026736406609416008,
0.013064801692962646,
0.021296197548508644,
0.039083682000637054,
0.06902528554201126,
-0.028702156618237495,
0.04128225892782211,
0.1068614050745964,
-0.0011542027350515127,
0.013798289932310581,
-0.033823296427726746,
-0.04197259619832039,
-0.02499738149344921,
0.016260864213109016,
0.07770580053329468,
0.06625059247016907,
0.07647358626127243,
-0.04772103577852249,
-0.03136040270328522,
0.2055521011352539,
-0.1243143156170845,
0.009427845478057861,
-0.012948776595294476,
-0.010128851979970932,
-0.06631630659103394,
0.05471649020910263,
-0.0008041294640861452,
-0.10344257950782776,
-0.032490767538547516,
-0.10368658602237701,
-0.010426869615912437,
-0.09790339320898056,
-0.09073789417743683,
0.02626754716038704,
-0.05523024499416351,
-0.05761159956455231,
-0.12663523852825165,
-0.2280452847480774,
-0.07917269319295883,
-0.004649660550057888,
-0.021708307787775993,
-0.003387389238923788,
0.027432888746261597,
-0.0161439161747694,
0.005185570567846298,
-0.00593527453020215,
-0.034139979630708694,
-0.04260314255952835,
0.011519871652126312,
0.01351260207593441,
0.012016713619232178,
0.0018195380689576268,
-0.01849190890789032,
-0.13664363324642181,
0.015750305727124214,
-0.19410239160060883,
0.12067031115293503,
-0.05818847566843033,
-0.015939908102154732,
-0.09909229725599289,
-0.024279141798615456,
-0.05690157413482666,
0.007629286497831345,
0.0271293967962265,
0.1562204658985138,
-0.13763967156410217,
-0.0017984011210501194,
0.2209077775478363,
-0.13342468440532684,
-0.022975869476795197,
0.11340239644050598,
-0.022226564586162567,
0.10161314904689789,
0.0849611684679985,
0.11085270345211029,
0.12689554691314697,
-0.16080428659915924,
-0.05104982852935791,
-0.02515571378171444,
-0.056478433310985565,
0.014063296839594841,
0.0503290630877018,
-0.08149141818284988,
-0.012790597975254059,
0.02729444019496441,
-0.12556049227714539,
-0.024038461968302727,
0.0030304479878395796,
-0.038226496428251266,
-0.013502505607903004,
-0.0013549055438488722,
-0.052227914333343506,
-0.0470295287668705,
-0.004607874900102615,
-0.0101251769810915,
-0.12255226075649261,
-0.06845464557409286,
0.048121608793735504,
-0.049563512206077576,
0.04033144935965538,
-0.09240700304508209,
0.07919173687696457,
-0.0938435047864914,
0.008395998738706112,
-0.15073217451572418,
-0.1208716481924057,
0.07000666856765747,
-0.0688987448811531,
0.107144795358181,
-0.06875967979431152,
0.011542025953531265,
0.05346934124827385,
0.00887627899646759,
-0.006660555023699999,
-0.06872827559709549,
-0.028932688757777214,
-0.049442797899246216,
-0.07258603721857071,
-0.015812000259757042,
-0.03159356117248535,
0.1183142438530922,
-0.12292501330375671,
0.055051133036613464,
0.11963772773742676,
0.018108753487467766,
0.009896044619381428,
-0.09593741595745087,
0.04194582253694534,
-0.004293923266232014,
-0.005679204594343901,
-0.037956397980451584,
0.002867084927856922,
0.040837571024894714,
-0.04410640150308609,
0.09112034738063812,
-0.1689218431711197,
-0.11700303107500076,
0.0648161992430687,
0.0058284057304263115,
-0.1038755252957344,
-0.06964556127786636,
-0.021689120680093765,
-0.015779046341776848,
-0.01903447136282921,
-0.08095807582139969,
0.16777551174163818,
-0.00566056789830327,
0.0653878003358841,
-0.09646172821521759,
0.0036349024157971144,
0.020176656544208527,
-0.023387661203742027,
0.027231216430664062,
-0.005901403725147247,
0.105128213763237,
-0.05344143137335777,
0.0600624643266201,
-0.06210508942604065,
-0.013928509317338467,
0.08651663362979889,
0.04599907249212265,
-0.06974136084318161,
-0.028105005621910095,
-0.016767378896474838,
0.019374346360564232,
0.03300374001264572,
0.020014286041259766,
-0.00427572475746274,
0.036127425730228424,
0.0702052116394043,
0.05871107801795006,
-0.12662242352962494,
0.027479015290737152,
0.028696110472083092,
-0.026761183515191078,
-0.04954884201288223,
0.009825114160776138,
-0.07331965863704681,
0.0482231043279171,
0.024039780721068382,
0.10078630596399307,
0.009040012955665588,
-0.02979877032339573,
-0.16129380464553833,
0.16230323910713196,
-0.09929250925779343,
-0.21464480459690094,
-0.15047916769981384,
0.06299591809511185,
-0.00018787139561027288,
0.0400053933262825,
0.008174607530236244,
-0.003525436855852604,
-0.05673600360751152,
-0.14459386467933655,
0.00008568952034693211,
-0.029596524313092232,
-0.042962465435266495,
-0.11197929829359055,
-0.04538953676819801,
-0.023254236206412315,
-0.13075028359889984,
-0.0008042058907449245,
0.0025966314133256674,
-0.15177702903747559,
0.06098733842372894,
0.021449049934744835,
0.14688052237033844,
0.10966736823320389,
0.032780975103378296,
-0.026891490444540977,
-0.04610247537493706,
0.23886710405349731,
-0.08111504465341568,
0.11471229791641235,
0.18138031661510468,
0.023722082376480103,
0.048138100653886795,
0.0615675114095211,
0.013028434477746487,
-0.08785264194011688,
0.020536573603749275,
0.004724862053990364,
-0.08298386633396149,
-0.18686336278915405,
-0.06484738737344742,
-0.07195382565259933,
-0.033317264169454575,
0.04144448786973953,
0.05475764349102974,
0.11396773904561996,
0.07100173085927963,
-0.02694842405617237,
0.01010677870362997,
0.019016169011592865,
0.11312179267406464,
0.056479521095752716,
0.024202313274145126,
0.06009311601519585,
-0.03729651868343353,
0.05591980740427971,
0.08056892454624176,
0.06095965951681137,
0.13535258173942566,
-0.038633450865745544,
0.2400210201740265,
0.0918806865811348,
0.08260306715965271,
0.07036962360143661,
0.023916440084576607,
-0.013803916051983833,
0.0355556346476078,
0.001156670623458922,
-0.07652327418327332,
-0.0685371458530426,
0.06388993561267853,
-0.06312084943056107,
-0.09237511456012726,
-0.0031982767395675182,
-0.05495196208357811,
0.02259686030447483,
0.08235147595405579,
-0.0038169827312231064,
-0.22830450534820557,
-0.09389634430408478,
0.010266289114952087,
0.024226384237408638,
-0.07802259176969528,
0.024118753150105476,
0.08828245103359222,
-0.12465258687734604,
0.022164681926369667,
-0.03995225206017494,
0.10434810072183609,
-0.0649472177028656,
-0.005654134787619114,
-0.014792053028941154,
0.11764982342720032,
-0.017852239310741425,
0.08636955916881561,
-0.18515489995479584,
0.05947612598538399,
0.03894864022731781,
0.1086370199918747,
-0.06970634311437607,
0.04668596759438515,
-0.042212724685668945,
0.03585432469844818,
0.12356705963611603,
0.01068445760756731,
-0.05385683849453926,
-0.03752775862812996,
-0.10517802089452744,
0.03280969709157944,
0.06652595847845078,
-0.0926646962761879,
0.08046051114797592,
0.019284700974822044,
0.016450366005301476,
-0.018004847690463066,
-0.06308715045452118,
-0.15993241965770721,
-0.1756456345319748,
0.09110891819000244,
0.008975187316536903,
0.07668500393629074,
-0.041997212916612625,
0.006557716988027096,
0.034286875277757645,
0.18219305574893951,
-0.15513983368873596,
-0.06536414474248886,
-0.14261221885681152,
-0.0338478647172451,
0.10132726281881332,
-0.0586322657763958,
0.06007621809840202,
-0.01568564958870411,
0.08815401792526245,
0.006907275412231684,
-0.03953501954674721,
0.09674393385648727,
-0.09275371581315994,
-0.1292453408241272,
-0.058790888637304306,
0.1117851734161377,
0.09652050584554672,
0.030112076550722122,
-0.01012013852596283,
0.06509071588516235,
-0.05016809329390526,
-0.13557857275009155,
0.027782952412962914,
0.1594163477420807,
0.0818600282073021,
0.06470755487680435,
-0.025524474680423737,
-0.024288160726428032,
-0.037830594927072525,
-0.026019664481282234,
0.059485409408807755,
0.15866900980472565,
-0.04899965226650238,
0.10218019038438797,
0.12291942536830902,
-0.10894651710987091,
-0.20484918355941772,
0.07807441800832748,
0.034535061568021774,
0.006575271487236023,
-0.07204820960760117,
-0.1953720599412918,
0.005034970119595528,
0.057586297392845154,
-0.016441237181425095,
0.10957962274551392,
-0.2315826416015625,
-0.12910610437393188,
-0.029161447659134865,
0.030760163441300392,
-0.03379271924495697,
-0.11301235854625702,
-0.059841401875019073,
0.025077423080801964,
-0.18238912522792816,
0.12840692698955536,
-0.022960327565670013,
0.04548060521483421,
0.009392928332090378,
0.05186111479997635,
-0.0013185058487579226,
-0.08207058161497116,
0.07213440537452698,
0.002281792927533388,
0.034068912267684937,
-0.04837718605995178,
0.0362703911960125,
0.10766715556383133,
-0.018240973353385925,
0.1350250393152237,
0.07283088564872742,
0.056742094457149506,
-0.008186380378901958,
-0.07470662146806717,
-0.11694261431694031,
0.08634469658136368,
-0.02650517411530018,
-0.07090245187282562,
-0.09111590683460236,
0.021845364943146706,
0.026682401075959206,
-0.02747398428618908,
-0.08790427446365356,
-0.12404711544513702,
0.0235153641551733,
0.1359298825263977,
0.12471524626016617,
0.032246239483356476,
-0.12030377238988876,
0.039242807775735855,
-0.019343087449669838,
0.09412788599729538,
-0.04553815349936485,
-0.007085928227752447,
0.06436574459075928,
0.023093121126294136,
0.06782286614179611,
0.022182606160640717,
-0.1419246792793274,
0.018672309815883636,
0.010639154352247715,
-0.14835181832313538,
-0.1427745223045349,
-0.003319421084597707,
0.03543985262513161,
-0.0757434219121933,
0.0038622638676315546,
0.10812875628471375,
-0.06172076612710953,
-0.05155625939369202,
-0.009435639716684818,
0.06323203444480896,
-0.002013564808294177,
0.10208471119403839,
0.02997761406004429,
0.0002064841683022678,
-0.07278148829936981,
0.04579266160726547,
0.08146878331899643,
-0.14552690088748932,
0.03826484829187393,
0.06944020837545395,
-0.13096164166927338,
-0.07058748602867126,
-0.031251635402441025,
0.018285920843482018,
-0.07029254734516144,
-0.035314276814460754,
-0.021247021853923798,
0.0053369044326245785,
0.03447829186916351,
0.12670622766017914,
-0.0123225636780262,
0.04369044303894043,
-0.019829897210001945,
0.04821755737066269,
-0.11250615119934082,
0.0002550818317104131,
-0.05642792955040932,
0.06999828666448593,
-0.02141876146197319,
0.1321077048778534,
0.025658143684267998,
0.021359002217650414,
-0.023181913420557976,
-0.030344022437930107,
-0.06115380674600601,
-0.025213712826371193,
0.0016430079704150558,
0.02527078054845333,
-0.06104128062725067,
-0.010154969058930874,
-0.000015516901839873753,
-0.002890878589823842,
-0.002044605789706111,
0.031922999769449234,
-0.04207660257816315,
-0.03148892894387245,
-0.04177694767713547,
0.03141956403851509,
-0.11749501526355743,
0.04340803623199463,
0.07199190557003021,
-0.03600984066724777,
0.09994194656610489,
0.06680779904127121,
-0.02953217923641205,
0.0025860483292490244,
-0.10145923495292664,
0.08374536037445068,
-0.0024528594221919775,
0.004538276232779026,
-0.03946516662836075,
-0.043714623898267746,
0.013145466335117817,
0.023456448689103127,
-0.04109992831945419,
-0.025104165077209473,
0.0702643096446991,
-0.10931017249822617,
0.0519166924059391,
0.08758025616407394,
-0.02319459244608879,
-0.08691287785768509,
0.048418961465358734,
0.03240793198347092,
0.019594749435782433,
0.07275962084531784,
-0.03119717352092266,
0.03800938278436661,
-0.1355551779270172,
-0.03587630018591881,
0.03245683014392853,
0.026423437520861626,
-0.04958318918943405,
-0.016956713050603867,
0.07871657609939575,
-0.0028908567037433386,
0.1920686960220337,
0.022860528901219368,
-0.025167331099510193,
0.009095958434045315,
0.0784182995557785,
0.10231974720954895,
0.010262222029268742,
0.01763170026242733,
0.015436267480254173,
-0.03203563392162323,
0.02770932763814926,
0.008053850382566452,
-0.026119351387023926,
-0.061265844851732254,
0.13647156953811646,
0.009102978743612766,
0.11174824088811874,
0.030450614169239998,
-0.007712868973612785,
-0.05152217671275139,
-0.09443392604589462,
-0.10898837447166443,
-0.021529844030737877,
0.026049215346574783,
-0.07119902968406677,
0.09222615510225296,
0.13902001082897186,
-0.07538652420043945,
0.10233710706233978,
0.04595617204904556,
-0.08209694921970367,
-0.08863292634487152,
-0.2857963740825653,
-0.016609439626336098,
-0.007916895672678947,
-0.01842317171394825,
-0.09663145989179611,
0.0544414184987545,
0.008183938451111317,
-0.013576334342360497,
-0.009858187288045883,
0.21804243326187134,
0.00403191801160574,
-0.10774607956409454,
-0.045633479952812195,
-0.019619297236204147,
0.04185839742422104,
0.04337683692574501,
0.0020677230786532164,
0.035346467047929764,
0.01604815013706684,
0.10877455025911331,
0.06407058984041214,
0.08364733308553696,
0.03135387971997261,
-0.05103103443980217,
-0.03485465794801712,
-0.020279105752706528,
0.04210936278104782,
-0.008967666886746883,
0.09512192010879517,
0.036205124109983444,
-0.06081293895840645,
0.030828682705760002,
0.19546808302402496,
-0.019341707229614258,
-0.028344979509711266,
-0.09890560805797577,
0.19658076763153076,
-0.02092035673558712,
-0.006918364204466343,
-0.02365856058895588,
-0.09308770298957825,
0.06059996411204338,
0.2215818613767624,
0.1852376013994217,
-0.03184013068675995,
-0.003970528952777386,
-0.01630249060690403,
-0.0021697389893233776,
0.04787388816475868,
0.12333724647760391,
-0.0226177629083395,
0.3127964735031128,
-0.04638517275452614,
0.08796313405036926,
-0.005647226236760616,
-0.0021464910823851824,
-0.08437572419643402,
0.07042784243822098,
-0.07320882380008698,
-0.012151030823588371,
-0.05258537828922272,
0.03722918778657913,
-0.033810704946517944,
-0.14130334556102753,
-0.012090669013559818,
0.020554525777697563,
-0.04179476574063301,
0.02834826521575451,
-0.046059269458055496,
-0.033014871180057526,
0.0647999569773674,
-0.006433179602026939,
0.008000671863555908,
0.10237091779708862,
0.01563209481537342,
-0.08924974501132965,
-0.05594226345419884,
0.0735412985086441,
0.05224645137786865,
0.20025615394115448,
-0.009345198050141335,
0.13621088862419128,
0.10130401700735092,
-0.004577118903398514,
-0.155936598777771,
0.11084689199924469,
-0.0021715806797146797,
-0.04653555154800415,
0.035965029150247574,
0.09901367872953415,
0.001202502055093646,
0.03563022240996361,
0.04201500490307808,
-0.011553306132555008,
0.041741933673620224,
-0.058490537106990814,
0.006437646225094795,
-0.1163700520992279,
0.009482569061219692,
-0.11006370931863785,
0.15878592431545258,
0.06891132891178131,
-0.04054567962884903,
0.01706884242594242,
-0.07851745188236237,
0.038290947675704956,
0.010763823986053467,
0.08676933497190475,
-0.021176956593990326,
-0.1371437907218933,
0.0031644359696656466,
0.1259496808052063,
0.04286215454339981,
-0.2189120054244995,
-0.04003350809216499,
0.002834467915818095,
-0.017278367653489113,
-0.012100506573915482,
0.08245895057916641,
0.016353987157344818,
0.04258497431874275,
-0.044788800179958344,
-0.05079110339283943,
0.025165390223264694,
0.10078894346952438,
-0.13627028465270996,
-0.09618442505598068
] |
null | null |
transformers
|
# GPT-Code-Clippy-125M-Code-Search-Py
> **Please refer to our new [GitHub Wiki](https://github.com/ncoop57/gpt-code-clippy/wiki) which documents our efforts in detail in creating the open source version of GitHub Copilot**
## Model Description
GPT-CC-125M-Code-Search is a [GPT-Neo-125M model](https://huggingface.co/EleutherAI/gpt-neo-125M) finetuned using causal language modeling on only the python language in the [CodeSearchNet Challenge dataset](https://huggingface.co/datasets/code_search_net). This model is specialized to autocomplete methods in the python language.
## Training data
[CodeSearchNet Challenge dataset](https://huggingface.co/datasets/code_search_net).
## Training procedure
The training script used to train this model can be found [here](https://github.com/ncoop57/gpt-code-clippy/blob/camera-ready/training/run_clm_flax.py).
```bash
./run_clm_flax.py \
--output_dir $HOME/gpt-neo-125M-code-search-py \
--model_name_or_path="EleutherAI/gpt-neo-125M" \
--dataset_name code_search_net \
--dataset_config_name="python" \
--do_train --do_eval \
--block_size="512" \
--per_device_train_batch_size="32" \
--per_device_eval_batch_size="64" \
--preprocessing_num_workers="8" \
--learning_rate="1.2e-4" \
--num_train_epochs 20 \
--warmup_steps 3000 \
--adam_beta1="0.9" \
--adam_beta2="0.95" \
--weight_decay="0.1" \
--overwrite_output_dir \
--logging_steps="25" \
--eval_steps="500" \
--push_to_hub="False" \
--report_to="all" \
--dtype="bfloat16" \
--skip_memory_metrics="True" \
--save_steps="500" \
--save_total_limit 10 \
--report_to="wandb" \
--run_name="gpt-neo-125M-code-search-py"
```
## Intended Use and Limitations
The model is finetuned methods from the python language and is intended to autocomplete python methods given some prompt (method signature and docstring).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt-neo-125M-code-clippy-code-search-py")
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt-neo-125M-code-clippy-code-search-py")
prompt = """def greet(name):
'''A function to greet user. Given a user name it should say hello'''
"""
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
start = input_ids.size(1)
out = model.generate(input_ids, do_sample=True, max_length=50, num_beams=2,
early_stopping=True, eos_token_id=tokenizer.eos_token_id, )
print(tokenizer.decode(out[0][start:]))
```
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See [GPT-Neo model card](https://huggingface.co/EleutherAI/gpt-neo-125M#limitations-and-biases) for details.
## Eval results
Coming soon...
|
{}
|
text-generation
|
flax-community/gpt-neo-125M-code-search-py
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt_neo",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us
|
# GPT-Code-Clippy-125M-Code-Search-Py
> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot
## Model Description
GPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on only the python language in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in the python language.
## Training data
CodeSearchNet Challenge dataset.
## Training procedure
The training script used to train this model can be found here.
## Intended Use and Limitations
The model is finetuned methods from the python language and is intended to autocomplete python methods given some prompt (method signature and docstring).
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
### Limitations and Biases
The model is intended to be used for research purposes and comes with no guarantees of quality of generated code.
GPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.
## Eval results
Coming soon...
|
[
"# GPT-Code-Clippy-125M-Code-Search-Py\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on only the python language in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in the python language.",
"## Training data\n\nCodeSearchNet Challenge dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned methods from the python language and is intended to autocomplete python methods given some prompt (method signature and docstring).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us \n",
"# GPT-Code-Clippy-125M-Code-Search-Py\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot",
"## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on only the python language in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in the python language.",
"## Training data\n\nCodeSearchNet Challenge dataset.",
"## Training procedure\n\nThe training script used to train this model can be found here.",
"## Intended Use and Limitations\n\nThe model is finetuned methods from the python language and is intended to autocomplete python methods given some prompt (method signature and docstring).",
"### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:",
"### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.",
"## Eval results\n\nComing soon..."
] |
[
46,
47,
65,
10,
16,
41,
35,
72,
8
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt_neo #text-generation #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Code-Clippy-125M-Code-Search-Py\n> Please refer to our new GitHub Wiki which documents our efforts in detail in creating the open source version of GitHub Copilot## Model Description\n\nGPT-CC-125M-Code-Search is a GPT-Neo-125M model finetuned using causal language modeling on only the python language in the CodeSearchNet Challenge dataset. This model is specialized to autocomplete methods in the python language.## Training data\n\nCodeSearchNet Challenge dataset.## Training procedure\n\nThe training script used to train this model can be found here.## Intended Use and Limitations\n\nThe model is finetuned methods from the python language and is intended to autocomplete python methods given some prompt (method signature and docstring).### How to use\n\nYou can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:### Limitations and Biases\n\nThe model is intended to be used for research purposes and comes with no guarantees of quality of generated code.\n\nGPT-CC is finetuned from GPT-Neo and might have inherited biases and limitations from it. See GPT-Neo model card for details.## Eval results\n\nComing soon..."
] |
[
-0.051025424152612686,
0.18521611392498016,
-0.0028334837406873703,
0.042034171521663666,
0.07080145925283432,
-0.036253537982702255,
0.055981893092393875,
0.06500045955181122,
0.0175323449075222,
0.0903780609369278,
0.07744400948286057,
0.04853450879454613,
0.07648158818483353,
0.1465311348438263,
0.054523829370737076,
-0.2244480848312378,
0.05761882662773132,
-0.03184621408581734,
0.11348675191402435,
0.12381638586521149,
0.08306978642940521,
-0.027845509350299835,
0.08569055050611496,
0.0639202892780304,
-0.13185641169548035,
-0.013904803432524204,
0.030963512137532234,
-0.02447904460132122,
0.04077540710568428,
0.029945993795990944,
0.06932704150676727,
-0.0014143426669761539,
0.012195981107652187,
-0.1669752299785614,
0.01751108281314373,
0.08311735093593597,
-0.004919908009469509,
0.04072333872318268,
0.04454604908823967,
-0.046229831874370575,
0.14574527740478516,
0.002815037500113249,
0.07914359122514725,
0.07174943387508392,
-0.11858723312616348,
0.021553171798586845,
-0.10114177316427231,
-0.0026816008612513542,
0.027754321694374084,
0.14513756334781647,
-0.0056890384294092655,
0.1749441772699356,
-0.023730721324682236,
0.042793698608875275,
0.07291404902935028,
-0.08969858288764954,
-0.016876479610800743,
0.06713898479938507,
0.06311922520399094,
0.056878577917814255,
-0.01987753063440323,
-0.005679944530129433,
0.002965900581330061,
0.03785865008831024,
0.11154457181692123,
-0.050933267921209335,
-0.04503297433257103,
-0.0443977415561676,
-0.16082081198692322,
-0.11466718465089798,
0.08415652811527252,
-0.018443681299686432,
-0.10648560523986816,
-0.10696925967931747,
-0.07151507586240768,
-0.015873197466135025,
-0.0054566944018006325,
-0.021785233169794083,
0.0025681813713163137,
0.040355391800403595,
0.07258673012256622,
-0.09673614799976349,
-0.09325705468654633,
-0.09057691693305969,
0.01888112910091877,
0.06952419131994247,
0.034840650856494904,
0.06728598475456238,
-0.09710840880870819,
0.1571897268295288,
-0.05592434108257294,
-0.0630534216761589,
-0.07328995317220688,
-0.059664417058229446,
-0.04301687330007553,
-0.023343583568930626,
-0.0048692962154746056,
-0.0908619612455368,
0.014643670059740543,
0.199505016207695,
0.02618110366165638,
0.01419324055314064,
0.01017508003860712,
-0.0006457935087382793,
0.06172407791018486,
0.06826320290565491,
-0.0356803797185421,
0.039192404597997665,
0.09091243147850037,
0.013021032325923443,
0.02632744051516056,
-0.020128753036260605,
-0.0420585572719574,
-0.021147744730114937,
0.046147361397743225,
0.08992526680231094,
0.09236712008714676,
0.08481770008802414,
-0.04398828372359276,
-0.03558650240302086,
0.2065526694059372,
-0.109499990940094,
-0.009177089668810368,
-0.023022787645459175,
-0.04392019659280777,
-0.05632336810231209,
0.0711999163031578,
0.010038744658231735,
-0.11487855017185211,
-0.031884096562862396,
-0.09792215377092361,
-0.012393113225698471,
-0.10894507169723511,
-0.10867077857255936,
0.022129176184535027,
-0.03290509060025215,
-0.04168153554201126,
-0.12514758110046387,
-0.23799902200698853,
-0.08481910079717636,
0.00004148208608967252,
-0.0003793249779846519,
-0.009014752693474293,
0.028834713622927666,
-0.0303106140345335,
0.0019659996032714844,
0.002728652209043503,
-0.05270983278751373,
-0.053796205669641495,
0.023662427440285683,
-0.00885429885238409,
0.01064120139926672,
-0.01585475355386734,
-0.015000774525105953,
-0.14222919940948486,
0.00571587635204196,
-0.17145222425460815,
0.09835493564605713,
-0.04443596675992012,
0.00886200275272131,
-0.11677687615156174,
-0.044645410031080246,
0.0008953504147939384,
0.001454487326554954,
0.017821937799453735,
0.16930614411830902,
-0.15088655054569244,
0.011725270189344883,
0.2020033299922943,
-0.12497182190418243,
-0.02068115584552288,
0.11827397346496582,
-0.01940852962434292,
0.1172100156545639,
0.09039806574583054,
0.07584627717733383,
0.1479090303182602,
-0.17595811188220978,
-0.045772336423397064,
-0.025702031329274178,
-0.05139274522662163,
0.04719829931855202,
0.0384209007024765,
-0.07413773983716965,
-0.01302699837833643,
0.017719397321343422,
-0.10135389864444733,
-0.016103332862257957,
-0.00018028635531663895,
-0.037441425025463104,
-0.008642944507300854,
-0.004501678980886936,
-0.043549709022045135,
-0.05048924311995506,
-0.01428415346890688,
0.015334969386458397,
-0.12433693557977676,
-0.050377827137708664,
0.06059114634990692,
-0.029939884319901466,
0.0407174713909626,
-0.11031311750411987,
0.07821213454008102,
-0.08154097944498062,
-0.006223995238542557,
-0.14872831106185913,
-0.11123549193143845,
0.0703798234462738,
-0.12093335390090942,
0.06861185282468796,
-0.08618951588869095,
-0.0020074015483260155,
0.03597848117351532,
0.02645435743033886,
-0.0156600009649992,
-0.06985826045274734,
-0.03829200938344002,
-0.06655551493167877,
-0.06280640512704849,
-0.028764592483639717,
-0.03460218012332916,
0.12704826891422272,
-0.12571293115615845,
0.05473744124174118,
0.0984998270869255,
0.007234232500195503,
0.02973070554435253,
-0.0989902913570404,
0.07783177495002747,
-0.011120815761387348,
-0.002738750306889415,
-0.03983604162931442,
-0.010210945270955563,
0.04872610792517662,
-0.010792415589094162,
0.055728890001773834,
-0.18010900914669037,
-0.14096328616142273,
0.05585918948054314,
0.026099083945155144,
-0.10132243484258652,
-0.052372176200151443,
-0.022980304434895515,
-0.005353574641048908,
-0.02582736313343048,
-0.06553124636411667,
0.13575583696365356,
-0.0061662630178034306,
0.07301398366689682,
-0.07773229479789734,
0.0013436845038086176,
0.016666892915964127,
-0.0258196871727705,
0.01515176147222519,
-0.02180497907102108,
0.08121359348297119,
-0.02991544082760811,
0.054405950009822845,
-0.08836928755044937,
0.007184503600001335,
0.08290040493011475,
0.051017966121435165,
-0.07954467087984085,
-0.015078307129442692,
-0.020833143964409828,
0.005060320254415274,
0.01933351717889309,
0.009187797084450722,
0.0008509457111358643,
0.03496721759438515,
0.04856002330780029,
0.04768179729580879,
-0.10755281895399094,
0.03160029649734497,
0.034600019454956055,
-0.03325226157903671,
-0.08128000050783157,
0.016839541494846344,
-0.05832812190055847,
0.033857475966215134,
0.026143033057451248,
0.09950098395347595,
-0.0025447567459195852,
-0.04117607697844505,
-0.18363188207149506,
0.16669797897338867,
-0.08710610866546631,
-0.19825370609760284,
-0.13958507776260376,
0.04246329516172409,
0.021208295598626137,
0.022472217679023743,
0.02240546979010105,
-0.007160457316786051,
-0.05698598548769951,
-0.15170373022556305,
0.015335158444941044,
-0.010619418695569038,
-0.018940383568406105,
-0.14138975739479065,
-0.05642295628786087,
-0.017877833917737007,
-0.12346201390028,
0.008240855298936367,
-0.012428521178662777,
-0.12184177339076996,
0.04458208009600639,
0.009129675105214119,
0.15280205011367798,
0.10771671682596207,
0.044050611555576324,
-0.03290016949176788,
-0.04422704130411148,
0.2960924804210663,
-0.06853240728378296,
0.10082218796014786,
0.21453537046909332,
-0.005745230708271265,
0.05957099422812462,
0.07661111652851105,
-0.0010766062187030911,
-0.08623971045017242,
0.03510306775569916,
0.0024880862329155207,
-0.06499405205249786,
-0.1787346750497818,
-0.06768535077571869,
-0.05393154174089432,
-0.047101788222789764,
0.06958213448524475,
0.038075823336839676,
0.10775334388017654,
0.07581457495689392,
-0.05675671994686127,
-0.0026070773601531982,
-0.0028582406230270863,
0.12462042272090912,
0.010752449743449688,
0.03941833972930908,
0.029645487666130066,
-0.04445960372686386,
0.05658034235239029,
0.0669010803103447,
0.05515201762318611,
0.11346615105867386,
-0.03280368819832802,
0.20040763914585114,
0.11007953435182571,
0.08526884764432907,
0.06346369534730911,
0.004248300101608038,
-0.02954810857772827,
0.04622872173786163,
-0.004044680390506983,
-0.07584785670042038,
-0.060697056353092194,
0.058542829006910324,
-0.064877949655056,
-0.07177997380495071,
-0.027297135442495346,
-0.051822323352098465,
0.0018557049334049225,
0.13845165073871613,
0.024710344150662422,
-0.24153733253479004,
-0.09268805384635925,
0.015570132061839104,
0.008302047848701477,
-0.09304405748844147,
0.012775547802448273,
0.05133328214287758,
-0.15518055856227875,
0.04677315801382065,
-0.048183050006628036,
0.10077282786369324,
-0.09225170314311981,
-0.005365222226828337,
-0.002913655014708638,
0.08499813824892044,
-0.02606084942817688,
0.06038496643304825,
-0.1700666844844818,
0.0697513297200203,
0.04363173246383667,
0.11492251604795456,
-0.07722014933824539,
0.04309690371155739,
-0.037745505571365356,
0.02052258513867855,
0.12436696887016296,
0.004451071377843618,
-0.007798667065799236,
-0.049392715096473694,
-0.09221785515546799,
0.03130929172039032,
0.05285761132836342,
-0.06526778638362885,
0.10826139152050018,
0.01001536101102829,
0.02202705107629299,
-0.014966747723519802,
-0.07765603065490723,
-0.15903706848621368,
-0.16688519716262817,
0.07784222066402435,
0.0045940387062728405,
0.06738688796758652,
-0.03644133731722832,
0.016431041061878204,
0.07482200860977173,
0.1435512751340866,
-0.16518187522888184,
-0.08149734884500504,
-0.13598895072937012,
-0.005576424766331911,
0.09661256521940231,
-0.07452193647623062,
0.06596887111663818,
-0.02335517108440399,
0.1022753119468689,
0.011732052080333233,
-0.03943697363138199,
0.08161406219005585,
-0.09092734754085541,
-0.10572168231010437,
-0.0625091940164566,
0.0807533711194992,
0.11622750759124756,
0.028634706512093544,
-0.009572632610797882,
0.043301090598106384,
-0.04645991697907448,
-0.11867500841617584,
0.0111823920160532,
0.17444397509098053,
0.0668501928448677,
0.07530047744512558,
-0.016399523243308067,
-0.05348370596766472,
-0.06535792350769043,
-0.007645794656127691,
0.056336164474487305,
0.15012668073177338,
-0.05172737315297127,
0.071773961186409,
0.1091030016541481,
-0.09787147492170334,
-0.18075045943260193,
0.09598874300718307,
0.05980372428894043,
-0.020592426881194115,
-0.08389109373092651,
-0.2038489729166031,
-0.007175813894718885,
0.03048246167600155,
-0.007969523780047894,
0.12038841843605042,
-0.2478763610124588,
-0.12095972150564194,
0.0008566125179640949,
0.03534330800175667,
-0.03197319433093071,
-0.1142071783542633,
-0.050613414496183395,
0.013677499257028103,
-0.21766528487205505,
0.12195558845996857,
-0.02168741635978222,
0.01565053127706051,
0.00904396828263998,
0.08292502164840698,
0.010871772654354572,
-0.08049537986516953,
0.06819796562194824,
0.0134125966578722,
0.05279454588890076,
-0.05867350846529007,
0.003453035606071353,
0.04047101363539696,
-0.03768510743975639,
0.166990265250206,
0.09078650921583176,
0.056623365730047226,
-0.02824416197836399,
-0.06440505385398865,
-0.12188142538070679,
0.07520301640033722,
-0.02892306260764599,
-0.08456698060035706,
-0.08891770988702774,
0.01412643026560545,
0.06139121577143669,
-0.03032892383635044,
-0.09488342702388763,
-0.11956965178251266,
0.012335419654846191,
0.10832981765270233,
0.10641742497682571,
0.04358223080635071,
-0.1242934912443161,
0.0453510507941246,
-0.01804707944393158,
0.09509247541427612,
-0.050877079367637634,
-0.01777501218020916,
0.04710372909903526,
0.02859753742814064,
0.08835664391517639,
-0.004710796754807234,
-0.15265797078609467,
0.027069613337516785,
0.001937872264534235,
-0.13101579248905182,
-0.13170841336250305,
0.004867854528129101,
0.07376042753458023,
-0.12387073785066605,
-0.028547385707497597,
0.09924401342868805,
-0.055524710565805435,
-0.049219053238630295,
-0.018371867015957832,
0.06926272064447403,
0.016090864315629005,
0.1011461392045021,
0.044048648327589035,
-0.0038863327354192734,
-0.0792175754904747,
0.04192333295941353,
0.07935307919979095,
-0.11761761456727982,
0.03445732593536377,
0.047311656177043915,
-0.12978039681911469,
-0.053501490503549576,
-0.030832350254058838,
0.017627479508519173,
-0.11838507652282715,
-0.043951768428087234,
-0.01988854445517063,
0.01857501082122326,
0.03362000361084938,
0.1549515277147293,
0.004139712080359459,
0.03672479838132858,
-0.01719450019299984,
0.03973981365561485,
-0.10966586321592331,
-0.00005373753810999915,
-0.039823390543460846,
0.060999322682619095,
-0.033865902572870255,
0.14210927486419678,
0.036705587059259415,
0.038533132523298264,
-0.007328885607421398,
-0.042554788291454315,
-0.04659608006477356,
-0.012400650419294834,
-0.03718017041683197,
0.033892951905727386,
-0.0623382106423378,
-0.008923581801354885,
-0.0041373600251972675,
-0.013165688142180443,
-0.00010513142478885129,
0.02623969502747059,
-0.05096287280321121,
-0.038166072219610214,
-0.03229893743991852,
0.025215240195393562,
-0.12225984781980515,
0.03909381851553917,
0.06381850689649582,
-0.0398319736123085,
0.096605584025383,
0.06162378564476967,
-0.04068209230899811,
0.004996055271476507,
-0.10873036831617355,
0.05642460659146309,
0.01348900981247425,
-0.008860595524311066,
-0.02305988408625126,
-0.03826151415705681,
0.005414312239736319,
0.0069062840193510056,
-0.03737075999379158,
-0.02404182218015194,
0.09046851098537445,
-0.09972512722015381,
0.04258954897522926,
0.12282592058181763,
0.007736754138022661,
-0.085271455347538,
0.0571976900100708,
0.012384772300720215,
-0.00396218616515398,
0.11132369190454483,
-0.020433293655514717,
0.03443919122219086,
-0.12368246912956238,
-0.04082169011235237,
0.010015972889959812,
0.03777385130524635,
-0.040567800402641296,
0.008981490507721901,
0.06796668469905853,
-0.0038682620506733656,
0.15316839516162872,
0.0301900003105402,
-0.020316708832979202,
0.017334481701254845,
0.09273036569356918,
0.11656848341226578,
0.0009106572833843529,
-0.013178541325032711,
0.017073949798941612,
-0.024395527318120003,
0.03576376661658287,
0.010242762044072151,
-0.008995559997856617,
-0.016681352630257607,
0.15216951072216034,
0.00795513205230236,
0.13860146701335907,
0.03876505419611931,
-0.016012977808713913,
-0.005544622894376516,
-0.07921993732452393,
-0.17430062592029572,
-0.011057740077376366,
0.04711012542247772,
-0.045032601803541183,
0.10259759426116943,
0.14792150259017944,
-0.08033214509487152,
0.08345179259777069,
0.06645312905311584,
-0.06710990518331528,
-0.10275059938430786,
-0.2826823890209198,
-0.007215017452836037,
0.011483196169137955,
-0.025233637541532516,
-0.08833945542573929,
0.05851038917899132,
0.004110835492610931,
-0.020984290167689323,
-0.006038746330887079,
0.19290323555469513,
0.025009000673890114,
-0.12509961426258087,
-0.04127137362957001,
-0.015005256980657578,
0.023423049598932266,
0.04967888072133064,
0.020914779976010323,
0.04286057502031326,
0.025162536650896072,
0.09999378770589828,
0.06461584568023682,
0.058462537825107574,
0.046371933072805405,
-0.07238186150789261,
-0.0502680204808712,
-0.015473236329853535,
0.05424259230494499,
0.010405291803181171,
0.11771984398365021,
0.05707981437444687,
-0.048019684851169586,
0.02779880352318287,
0.20272472500801086,
-0.013517073355615139,
-0.04721173271536827,
-0.12310120463371277,
0.2089308649301529,
-0.011613569222390652,
-0.013988755643367767,
-0.02857557311654091,
-0.10062969475984573,
0.04993618279695511,
0.20251359045505524,
0.15263350307941437,
-0.009777647443115711,
-0.008220837451517582,
-0.03310925140976906,
-0.002366752130910754,
0.03616609051823616,
0.12631724774837494,
0.0029842485673725605,
0.3227197825908661,
-0.04867950454354286,
0.10488694906234741,
-0.0018955736886709929,
-0.00957738608121872,
-0.08595773577690125,
0.0746619924902916,
-0.06468483805656433,
-0.01599818281829357,
-0.04772133380174637,
0.0694912001490593,
-0.04496239498257637,
-0.1304895281791687,
-0.05002516880631447,
0.049494676291942596,
-0.03978624939918518,
0.027334818616509438,
-0.07967136055231094,
0.0029795747250318527,
0.06804446876049042,
-0.005816102493554354,
-0.00467354990541935,
0.09530218690633774,
0.006989949848502874,
-0.09955879300832748,
-0.10492987930774689,
0.10932883620262146,
0.035764824599027634,
0.22535112500190735,
0.002399734454229474,
0.11924012750387192,
0.09846540540456772,
-0.013488078489899635,
-0.15790368616580963,
0.09149201959371567,
0.012289371341466904,
-0.02687750570476055,
0.0413205549120903,
0.09083933383226395,
-0.006134707946330309,
0.02211056835949421,
0.03463306277990341,
0.02283746935427189,
0.04320096597075462,
-0.09223997592926025,
-0.005271753761917353,
-0.12483394891023636,
0.005549823399633169,
-0.10169173777103424,
0.13664302229881287,
0.08320880681276321,
-0.06159902736544609,
0.028209999203681946,
-0.06278141587972641,
0.026694990694522858,
0.02532723732292652,
0.0701545923948288,
-0.02668161690235138,
-0.1284967064857483,
0.018743449822068214,
0.09744590520858765,
0.03762291371822357,
-0.18783625960350037,
-0.024262167513370514,
0.01537644024938345,
-0.02212192676961422,
-0.004141674377024174,
0.09266719967126846,
-0.008929746225476265,
0.041545432060956955,
-0.043270010501146317,
-0.011806213296949863,
0.019874513149261475,
0.11724716424942017,
-0.14311429858207703,
-0.1161317229270935
] |
null | null |
transformers
|
# Cosmos QA (gpt2)
> This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/train-a-gpt2-model-for-contextual-common-sense-reasoning-using-the-cosmos-qa-dataset/7463), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Team Members
-Rohan V Kashyap ([Rohan](https://huggingface.co/Rohan))
-Vivek V Kashyap ([Vivek](https://huggingface.co/Vivek))
## Dataset
[Cosmos QA: Machine Reading Comprehension with Contextual Commonsense Reasoning](https://huggingface.co/datasets/cosmos_qa).This dataset contains a set of 35,600 problems that require commonsense-based reading comprehension, formulated as multiple-choice questions.Understanding narratives requires reading between the lines, which in turn, requires interpreting the likely causes and effects of events, even when they are not mentioned explicitly.The questions focus on factual and literal understanding of the context paragraph, our dataset focuses on reading between the lines over a diverse collection of people's everyday narratives.
### Example
```json
{"Context":["It's a very humbling experience when you need someone
to dress you every morning, tie your shoes, and put your hair
up. Every menial task takes an unprecedented amount of effort.
It made me appreciate Dan even more. But anyway I shan't
dwell on this (I'm not dying after all) and not let it detract from
my lovely 5 days with my friends visiting from Jersey."],
"Question":["What's a possible reason the writer needed someone to
dress him every morning?"],
"Multiple Choice":["A: The writer doesn't like putting effort into these tasks.",
"B: The writer has a physical disability.",
"C: The writer is bad at doing his own hair.",
"D: None of the above choices."]
"link":"https://arxiv.org/pdf/1909.00277.pdf"
}
```
## How to use
```bash
# Installing requirements
pip install transformers
pip install datasets
```
```python
from model_file import FlaxGPT2ForMultipleChoice
from datasets import Dataset
model_path="flax-community/gpt2-Cosmos"
model = FlaxGPT2ForMultipleChoice.from_pretrained(model_path,input_shape=(1,4,1))
dataset=Dataset.from_csv('./')
def preprocess(example):
example['context&question']=example['context']+example['question']
example['first_sentence']=[example['context&question'],example['context&question'],example['context&question'],example['context&question']]
example['second_sentence']=example['answer0'],example['answer1'],example['answer2'],example['answer3']
return example
dataset=dataset.map(preprocess)
def tokenize(examples):
a=tokenizer(examples['first_sentence'],examples['second_sentence'],padding='max_length',truncation=True,max_length=256,return_tensors='jax')
a['labels']=examples['label']
return a
dataset=dataset.map(tokenize)
input_id=jnp.array(dataset['input_ids'])
att_mask=jnp.array(dataset['attention_mask'])
outputs=model(input_id,att_mask)
final_output=jnp.argmax(outputs,axis=-1)
print(f"the predction of the dataset : {final_output}")
```
```
The Correct answer:-Option 1
```
## Preprocessing
The texts are tokenized using the GPT2 tokenizer.To feed the inputs of multiple choice we concatenated context and question as first input and all the 4 possible choices as the second input to our tokenizer.
## Evaluation
The following tables summarize the scores obtained by the **GPT2-CosmosQA**.The ones marked as (^) are the baseline models.
| Model | Dev Acc | Test Acc |
|:---------------:|:-----:|:-----:|
| BERT-FT Multiway^| 68.3.| 68.4 |
| GPT-FT ^ | 54.0 | 54.4. |
| GPT2-CosmosQA | 60.3 | 59.7 |
## Inference
This project was mainly to test the common sense understanding of the GPT2-model.We finetuned on a Dataset known as CosmosQ requires reasoning beyond the exact text spans in the context.The above results shows that GPT2 model is doing better than most of the base line models given that it only used to predict the next word in the pre-training objective.
## Credits
Huge thanks to Huggingface 🤗 & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to [@patil-suraj](https://github.com/patil-suraj) & [@patrickvonplaten](https://github.com/patrickvonplaten) for mentoring during whole week.
|
{}
| null |
flax-community/gpt2-Cosmos
|
[
"transformers",
"jax",
"tensorboard",
"gpt2",
"arxiv:1909.00277",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1909.00277"
] |
[] |
TAGS
#transformers #jax #tensorboard #gpt2 #arxiv-1909.00277 #endpoints_compatible #has_space #text-generation-inference #region-us
|
Cosmos QA (gpt2)
================
>
> This is part of the
> Flax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.
>
>
>
Team Members
------------
-Rohan V Kashyap (Rohan)
-Vivek V Kashyap (Vivek)
Dataset
-------
Cosmos QA: Machine Reading Comprehension with Contextual Commonsense Reasoning.This dataset contains a set of 35,600 problems that require commonsense-based reading comprehension, formulated as multiple-choice questions.Understanding narratives requires reading between the lines, which in turn, requires interpreting the likely causes and effects of events, even when they are not mentioned explicitly.The questions focus on factual and literal understanding of the context paragraph, our dataset focuses on reading between the lines over a diverse collection of people's everyday narratives.
### Example
How to use
----------
Preprocessing
-------------
The texts are tokenized using the GPT2 tokenizer.To feed the inputs of multiple choice we concatenated context and question as first input and all the 4 possible choices as the second input to our tokenizer.
Evaluation
----------
The following tables summarize the scores obtained by the GPT2-CosmosQA.The ones marked as (^) are the baseline models.
Inference
---------
This project was mainly to test the common sense understanding of the GPT2-model.We finetuned on a Dataset known as CosmosQ requires reasoning beyond the exact text spans in the context.The above results shows that GPT2 model is doing better than most of the base line models given that it only used to predict the next word in the pre-training objective.
Credits
-------
Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week.
|
[
"### Example\n\n\nHow to use\n----------\n\n\nPreprocessing\n-------------\n\n\nThe texts are tokenized using the GPT2 tokenizer.To feed the inputs of multiple choice we concatenated context and question as first input and all the 4 possible choices as the second input to our tokenizer.\n\n\nEvaluation\n----------\n\n\nThe following tables summarize the scores obtained by the GPT2-CosmosQA.The ones marked as (^) are the baseline models.\n\n\n\nInference\n---------\n\n\nThis project was mainly to test the common sense understanding of the GPT2-model.We finetuned on a Dataset known as CosmosQ requires reasoning beyond the exact text spans in the context.The above results shows that GPT2 model is doing better than most of the base line models given that it only used to predict the next word in the pre-training objective.\n\n\nCredits\n-------\n\n\nHuge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week."
] |
[
"TAGS\n#transformers #jax #tensorboard #gpt2 #arxiv-1909.00277 #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### Example\n\n\nHow to use\n----------\n\n\nPreprocessing\n-------------\n\n\nThe texts are tokenized using the GPT2 tokenizer.To feed the inputs of multiple choice we concatenated context and question as first input and all the 4 possible choices as the second input to our tokenizer.\n\n\nEvaluation\n----------\n\n\nThe following tables summarize the scores obtained by the GPT2-CosmosQA.The ones marked as (^) are the baseline models.\n\n\n\nInference\n---------\n\n\nThis project was mainly to test the common sense understanding of the GPT2-model.We finetuned on a Dataset known as CosmosQ requires reasoning beyond the exact text spans in the context.The above results shows that GPT2 model is doing better than most of the base line models given that it only used to predict the next word in the pre-training objective.\n\n\nCredits\n-------\n\n\nHuge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week."
] |
[
49,
249
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #gpt2 #arxiv-1909.00277 #endpoints_compatible #has_space #text-generation-inference #region-us \n### Example\n\n\nHow to use\n----------\n\n\nPreprocessing\n-------------\n\n\nThe texts are tokenized using the GPT2 tokenizer.To feed the inputs of multiple choice we concatenated context and question as first input and all the 4 possible choices as the second input to our tokenizer.\n\n\nEvaluation\n----------\n\n\nThe following tables summarize the scores obtained by the GPT2-CosmosQA.The ones marked as (^) are the baseline models.\n\n\n\nInference\n---------\n\n\nThis project was mainly to test the common sense understanding of the GPT2-model.We finetuned on a Dataset known as CosmosQ requires reasoning beyond the exact text spans in the context.The above results shows that GPT2 model is doing better than most of the base line models given that it only used to predict the next word in the pre-training objective.\n\n\nCredits\n-------\n\n\nHuge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week."
] |
[
-0.023857437074184418,
0.09318220615386963,
-0.0036726135294884443,
0.07295621186494827,
0.018100619316101074,
0.041926175355911255,
0.06614194065332413,
0.10145284980535507,
-0.008410482667386532,
0.07716772705316544,
0.06429705768823624,
-0.049240536987781525,
0.08173318207263947,
0.09823960065841675,
0.06789862364530563,
-0.31248000264167786,
0.00510374316945672,
-0.014794445596635342,
-0.10221293568611145,
0.08663960546255112,
0.08807990700006485,
-0.05800371617078781,
0.03832796961069107,
0.024265550076961517,
-0.10263146460056305,
0.01117614284157753,
0.018383435904979706,
-0.006221538409590721,
0.09255360811948776,
0.02444172091782093,
0.023872800171375275,
0.00026151418569497764,
0.01927661895751953,
-0.11800246685743332,
0.04061714559793472,
0.08494996279478073,
0.03141002357006073,
0.07467927783727646,
0.060841597616672516,
0.05900953337550163,
0.08249098062515259,
-0.054800570011138916,
0.0201139934360981,
0.042875051498413086,
-0.10891291499137878,
-0.15752866864204407,
-0.07044751942157745,
0.0520094558596611,
-0.03302378207445145,
-0.001694193691946566,
-0.016765402629971504,
0.19633305072784424,
-0.0668686255812645,
0.0031333218794316053,
0.10839918255805969,
-0.22980302572250366,
-0.12201165407896042,
0.18072403967380524,
-0.0032811323180794716,
0.0901501402258873,
-0.04227688908576965,
0.036037251353263855,
0.044157274067401886,
-0.04629836603999138,
0.07940413802862167,
0.023594966158270836,
0.10504550486803055,
-0.009751218371093273,
-0.12249238044023514,
-0.03305859863758087,
0.12971080839633942,
0.040053050965070724,
-0.04018411412835121,
-0.14365965127944946,
-0.10884977132081985,
0.03901110589504242,
-0.05250593647360802,
-0.09931454807519913,
0.06977439671754837,
0.06416009366512299,
0.07838383316993713,
-0.0007703636074438691,
-0.10794094204902649,
-0.03657485172152519,
-0.09184463322162628,
0.1202191486954689,
0.03954700008034706,
0.05332555994391441,
-0.05493587255477905,
0.13809232413768768,
-0.23546944558620453,
-0.07916370779275894,
-0.03515283018350601,
-0.07301465421915054,
-0.09791886061429977,
-0.020992184057831764,
0.011143461801111698,
0.067398302257061,
-0.008565759286284447,
0.17949390411376953,
0.07986870408058167,
0.03151887655258179,
-0.012456285767257214,
-0.02926827408373356,
0.01610058732330799,
0.1204615980386734,
-0.045092612504959106,
-0.14824151992797852,
0.09211627393960953,
0.03660653904080391,
0.04681045934557915,
-0.016454126685857773,
-0.02557016722857952,
-0.04896150529384613,
-0.04222782328724861,
0.02669641748070717,
-0.020592235028743744,
0.047914549708366394,
-0.02267126925289631,
-0.03574582189321518,
0.12106147408485413,
-0.07954931259155273,
0.03589443862438202,
0.009031355381011963,
-0.0411570705473423,
0.07867293804883957,
-0.050355393439531326,
-0.017385246232151985,
0.00018446733884047717,
-0.05037510767579079,
-0.06031215563416481,
-0.037598997354507446,
-0.12042203545570374,
-0.029161004349589348,
0.025166677311062813,
0.00972646102309227,
-0.02932063862681389,
-0.10715631395578384,
-0.12185777723789215,
0.006086972542107105,
0.010576538741588593,
-0.05826394259929657,
-0.09852809458971024,
0.0512995570898056,
-0.06218060851097107,
0.0033266220707446337,
-0.01503250002861023,
0.12199341505765915,
-0.029460711404681206,
-0.004872320685535669,
0.05904069542884827,
0.018899112939834595,
-0.014631003141403198,
-0.02956177480518818,
-0.015596309676766396,
0.02042495086789131,
-0.22416138648986816,
0.07289249449968338,
-0.07778622210025787,
-0.0045504276640713215,
-0.07034984976053238,
-0.007540699560195208,
-0.0877319872379303,
0.012839755043387413,
-0.015394046902656555,
0.12938188016414642,
-0.2498263418674469,
0.01184018049389124,
0.15689420700073242,
-0.08889147639274597,
-0.04493545740842819,
0.16818709671497345,
-0.0752338320016861,
0.08843190968036652,
0.13570748269557953,
0.1014096736907959,
0.06349457055330276,
-0.04931553080677986,
-0.032083120197057724,
-0.048484813421964645,
0.0006781003321520984,
0.09748318791389465,
0.05522141233086586,
0.07591190189123154,
-0.022614141926169395,
0.033226415514945984,
-0.04075516760349274,
-0.03894628584384918,
-0.0045079332776367664,
-0.04172637313604355,
-0.08395213633775711,
0.0125544723123312,
-0.08139103651046753,
-0.017253022640943527,
0.025698479264974594,
-0.037515487521886826,
-0.09458663314580917,
-0.0640612468123436,
0.1168680414557457,
0.001701453235000372,
-0.017193011939525604,
-0.11847750097513199,
0.08204200863838196,
-0.11944998055696487,
-0.004417231772094965,
-0.11434855312108994,
-0.03845866769552231,
0.11028362810611725,
-0.05469505488872528,
0.06220615282654762,
0.05003177747130394,
0.06188834831118584,
-0.01719258725643158,
-0.02153579704463482,
0.05947113037109375,
0.012097920291125774,
-0.058092616498470306,
-0.023631257936358452,
-0.02720239758491516,
-0.009301201440393925,
-0.028032971546053886,
0.07337077707052231,
-0.18106217682361603,
0.045052941888570786,
0.12862582504749298,
0.01590455137193203,
0.007624488789588213,
-0.0377059280872345,
0.009343260899186134,
-0.054386917501688004,
-0.02504083514213562,
-0.05457761138677597,
-0.012185720726847649,
0.012655609287321568,
-0.09880534559488297,
0.059302348643541336,
-0.18966737389564514,
-0.03184245154261589,
0.06768020242452621,
0.1476699411869049,
-0.06070569530129433,
0.030034564435482025,
-0.05093378573656082,
0.011565957218408585,
-0.08973061293363571,
-0.03171871602535248,
0.13605715334415436,
0.008549505844712257,
0.04839210584759712,
-0.0906573012471199,
-0.08186356723308563,
0.02287333831191063,
0.04237606003880501,
0.02137076109647751,
0.04420251026749611,
0.0546792633831501,
-0.053973931819200516,
0.04916629195213318,
-0.0437152124941349,
0.041261788457632065,
0.1429038941860199,
-0.008165350183844566,
-0.07659780234098434,
-0.07465725392103195,
-0.022802917286753654,
0.009477592073380947,
0.08354583382606506,
-0.010973905213177204,
0.03941524773836136,
0.05329945683479309,
0.050625696778297424,
0.020076341927051544,
-0.13453543186187744,
0.01032909844070673,
0.01833324134349823,
-0.048988599330186844,
-0.07341960072517395,
0.019313039258122444,
-0.011409795843064785,
0.11774388700723648,
0.017618000507354736,
0.07912617921829224,
-0.03485732525587082,
-0.03429650887846947,
-0.04459930583834648,
0.08732186257839203,
0.024422893300652504,
-0.1586611419916153,
-0.06473811715841293,
0.06652489304542542,
-0.039520010352134705,
-0.017509013414382935,
-0.0054266126826405525,
-0.04734257236123085,
-0.07718759775161743,
-0.12821470201015472,
0.05008577182888985,
0.021502243354916573,
-0.0004670454072766006,
-0.05319134145975113,
-0.042141035199165344,
0.00807295460253954,
-0.1297508329153061,
-0.021551474928855896,
-0.01712053269147873,
-0.11071542650461197,
0.01739712990820408,
-0.017360150814056396,
0.0708942636847496,
0.05827011913061142,
0.02148338593542576,
-0.029757017269730568,
-0.01632305420935154,
0.19135501980781555,
-0.15833675861358643,
0.11225476861000061,
0.1406227946281433,
0.019942667335271835,
0.023798305541276932,
0.07367285341024399,
0.03282312676310539,
-0.1291513890028,
0.03099733591079712,
0.038712866604328156,
-0.07720305770635605,
-0.19725531339645386,
-0.02284087985754013,
-0.11112185567617416,
-0.0281994491815567,
0.05805512145161629,
0.05913212150335312,
-0.14835785329341888,
0.04603055119514465,
-0.05915852263569832,
0.04985704645514488,
0.06708191335201263,
0.031108433380723,
-0.0013904600637033582,
0.03985617682337761,
0.07588500529527664,
-0.08579444885253906,
-0.0780068188905716,
0.1462024599313736,
0.10032277554273605,
0.16178132593631744,
-0.03547627478837967,
0.1948745995759964,
0.02929822728037834,
0.07135555893182755,
0.08612632006406784,
0.05808808282017708,
0.008885902352631092,
-0.03329271078109741,
-0.06305182725191116,
-0.03086785040795803,
-0.07138855755329132,
0.002095748670399189,
0.02001679502427578,
0.005817889701575041,
0.0061402698047459126,
-0.012108208611607552,
0.04926908388733864,
0.29872244596481323,
-0.08582375198602676,
-0.15187734365463257,
-0.0607696995139122,
0.08681084215641022,
-0.07400830835103989,
-0.058521635830402374,
0.04378195479512215,
0.12521570920944214,
-0.07287068665027618,
0.09727728366851807,
-0.008871128782629967,
0.07625297456979752,
-0.057155389338731766,
0.011441905982792377,
-0.04866080358624458,
0.09447671473026276,
-0.004146125167608261,
0.07259640097618103,
-0.15336303412914276,
0.05456424877047539,
-0.006422150880098343,
0.10372307151556015,
-0.04982312023639679,
0.03426435589790344,
0.04991977661848068,
0.037984855473041534,
0.0737486183643341,
-0.004586576949805021,
-0.04217475652694702,
0.026808515191078186,
-0.16738858819007874,
0.041221775114536285,
-0.03154537081718445,
-0.12604252994060516,
0.06336317956447601,
-0.049588847905397415,
0.03619525209069252,
0.038608450442552567,
0.021366296336054802,
-0.06765976548194885,
-0.17949189245700836,
0.00446029007434845,
-0.05606336146593094,
-0.026960773393511772,
-0.05392586067318916,
-0.04868481308221817,
-0.06675343960523605,
0.09513287991285324,
0.027933398261666298,
0.0076535590924322605,
-0.10953063517808914,
0.04396095871925354,
0.12200825661420822,
-0.02624313160777092,
-0.04033531993627548,
-0.002993915928527713,
0.20947347581386566,
-0.0011499040992930532,
-0.04688195139169693,
0.039339080452919006,
-0.03321950510144234,
-0.12539221346378326,
0.02221613936126232,
0.10680776089429855,
0.0491599515080452,
0.06721701472997665,
0.009629630483686924,
0.036112282425165176,
-0.05013790726661682,
-0.12632553279399872,
0.020110907033085823,
0.2188682109117508,
-0.008056847378611565,
-0.04927400127053261,
0.031049465760588646,
0.09125328809022903,
-0.03990048170089722,
-0.006311476696282625,
0.0998983308672905,
0.27114903926849365,
-0.08963985741138458,
0.08447178453207016,
-0.0061189038679003716,
-0.08021356910467148,
-0.1477319896221161,
0.022394035011529922,
0.05437682941555977,
0.04130975529551506,
-0.03937218338251114,
-0.10487521439790726,
0.07603318244218826,
0.09667067229747772,
-0.019788799807429314,
-0.03375111520290375,
-0.18604789674282074,
-0.1134011447429657,
-0.02064981497824192,
0.003988681361079216,
0.1339041292667389,
-0.12388613075017929,
-0.019924268126487732,
-0.043717943131923676,
-0.03542214259505272,
0.2289559543132782,
-0.09248214960098267,
0.0773111879825592,
0.0005794580210931599,
0.04496926814317703,
0.017744792625308037,
-0.06073721870779991,
0.12033051997423172,
-0.015616877935826778,
0.103016696870327,
-0.03272061049938202,
0.07280278950929642,
0.08416455239057541,
-0.05710260942578316,
0.09838724136352539,
0.1034795269370079,
-0.009931416250765324,
-0.08165046572685242,
-0.06658196449279785,
-0.03519778326153755,
0.03539840131998062,
-0.041366443037986755,
-0.020279204472899437,
-0.05802224576473236,
0.0707523301243782,
0.06160792335867882,
-0.010213128291070461,
0.06233054772019386,
-0.06123790889978409,
0.08612046390771866,
0.017437955364584923,
0.20812776684761047,
-0.045858655124902725,
-0.05376804247498512,
0.013024978339672089,
0.03250345215201378,
0.06118156760931015,
-0.17294597625732422,
0.08694972842931747,
0.10960400849580765,
-0.03405382111668587,
0.05385515093803406,
0.04259856045246124,
-0.08690852671861649,
0.005038478411734104,
0.021668624132871628,
-0.14781633019447327,
-0.32429248094558716,
0.034844204783439636,
0.02207166887819767,
-0.06441126763820648,
-0.058296266943216324,
0.1291331648826599,
-0.06192009523510933,
-0.05897888168692589,
0.08100740611553192,
0.02829119749367237,
0.02975017949938774,
0.1354040503501892,
-0.0014253611443564296,
0.012177493423223495,
-0.10322336852550507,
0.09331075102090836,
0.028457840904593468,
-0.09825344383716583,
-0.02961685135960579,
0.05912622809410095,
-0.0796096995472908,
-0.028920378535985947,
0.032458897680044174,
0.07093003392219543,
0.03232245892286301,
-0.020677829161286354,
-0.05800369754433632,
-0.10593385249376297,
0.03706473857164383,
0.018895313143730164,
0.056774355471134186,
0.017948005348443985,
-0.0688573345541954,
0.06793834269046783,
-0.10000981390476227,
0.06073058769106865,
0.022754287347197533,
0.05687771737575531,
-0.06960473954677582,
0.06216765567660332,
-0.002922355430200696,
-0.06312581151723862,
-0.05009616166353226,
-0.02258741855621338,
-0.07876263558864594,
-0.04492097347974777,
-0.06343159824609756,
0.038700368255376816,
-0.007886060513556004,
0.05144307017326355,
-0.000832405115943402,
-0.01872989907860756,
-0.06436227262020111,
0.007216628175228834,
-0.09007692337036133,
0.023341981694102287,
-0.048978086560964584,
0.02889440208673477,
-0.02687963657081127,
0.08588261157274246,
0.08885423094034195,
-0.07957242429256439,
0.14539310336112976,
0.005112375132739544,
-0.026360424235463142,
-0.0298011377453804,
-0.09164436906576157,
0.024549202993512154,
-0.030394744127988815,
0.07732220739126205,
-0.003798297606408596,
-0.09833694249391556,
0.07548623532056808,
0.03496987745165825,
-0.0035294548142701387,
-0.02613287977874279,
0.016719143837690353,
-0.08166433125734329,
-0.0122230751439929,
0.06958173215389252,
-0.031824175268411636,
-0.062348827719688416,
-0.04249227046966553,
0.0071363369934260845,
0.10785437375307083,
0.04487505182623863,
0.012753469869494438,
-0.020279983058571815,
-0.20934008061885834,
-0.033136703073978424,
0.0186074897646904,
-0.0697512999176979,
-0.025960450991988182,
-0.023114928975701332,
0.09525523334741592,
0.0017999035771936178,
0.24417001008987427,
-0.014357682317495346,
-0.03937121108174324,
-0.007907195948064327,
0.009687741287052631,
0.06072209030389786,
-0.015976328402757645,
0.0888892412185669,
0.083450086414814,
0.008303998038172722,
0.09039797633886337,
-0.02117803692817688,
0.008205726742744446,
-0.013159559108316898,
0.15481235086917877,
0.03725862503051758,
0.05983993783593178,
-0.009911810979247093,
0.015631457790732384,
0.0053184363059699535,
-0.03747880086302757,
0.1105278730392456,
-0.0986202210187912,
0.011908275075256824,
-0.06723343580961227,
0.0744997188448906,
0.09487110376358032,
-0.11488506197929382,
0.08477973937988281,
-0.08713836967945099,
-0.035581301897764206,
-0.153110533952713,
-0.2244800329208374,
-0.06810668110847473,
-0.09018116444349289,
0.04700518772006035,
-0.1371517777442932,
-0.0028012008406221867,
0.11782775074243546,
0.051018357276916504,
-0.025956828147172928,
0.07173678278923035,
-0.024409590288996696,
-0.06327106803655624,
0.06601930409669876,
0.018621601164340973,
-0.03234448283910751,
0.022224457934498787,
-0.02783135510981083,
-0.001308503095060587,
0.0360129214823246,
0.08681809157133102,
0.035151734948158264,
0.07344736903905869,
-0.03728257864713669,
-0.04985729604959488,
-0.027311401441693306,
-0.03690921142697334,
-0.03507561609148979,
0.013027104549109936,
0.04434240981936455,
0.03132300451397896,
-0.02669929899275303,
0.025006474927067757,
0.21685220301151276,
0.0025398272555321455,
-0.0402412936091423,
-0.14726054668426514,
0.1332271546125412,
-0.08175917714834213,
-0.00888052023947239,
-0.01881810463964939,
-0.03421410918235779,
0.009614667855203152,
0.15789206326007843,
0.12202246487140656,
-0.04040659964084625,
-0.013846000656485558,
0.013619164004921913,
-0.01024772971868515,
0.007598768919706345,
0.18876054883003235,
0.06422221660614014,
0.14320603013038635,
-0.06273351609706879,
0.07240051031112671,
-0.007578249089419842,
-0.00772835873067379,
0.017116520553827286,
0.2502646744251251,
-0.05279460549354553,
0.026005808264017105,
-0.04926437884569168,
0.055715162307024,
-0.011900155805051327,
-0.21976906061172485,
-0.0969352126121521,
-0.11617548018693924,
-0.13960610330104828,
0.026657676324248314,
0.010031272657215595,
-0.07086506485939026,
0.044464901089668274,
-0.01955275423824787,
-0.01611023209989071,
0.08976840227842331,
0.006828938145190477,
-0.07520205527544022,
-0.01474569458514452,
0.10914110392332077,
-0.017063038423657417,
0.15664079785346985,
0.014072581194341183,
0.009659532457590103,
0.09372878074645996,
-0.04738226160407066,
-0.14697502553462982,
0.0148646030575037,
0.0258635263890028,
-0.07380400598049164,
-0.03661094605922699,
0.11785124242305756,
0.02254323847591877,
0.0011467436561360955,
0.1042676642537117,
0.020445454865694046,
-0.007240879815071821,
-0.049060411751270294,
0.0007193629280664027,
-0.130586639046669,
0.01042921282351017,
-0.11148562282323837,
0.12057151645421982,
0.13784708082675934,
-0.04099176451563835,
-0.0024752849712967873,
-0.08795267343521118,
0.06850142776966095,
-0.03202731907367706,
0.15911509096622467,
0.01319140661507845,
-0.221868097782135,
-0.009786664508283138,
0.10769166797399521,
0.03955212980508804,
-0.07631833106279373,
-0.10047075152397156,
0.01980670541524887,
-0.047070056200027466,
0.011201721616089344,
0.09308315813541412,
0.06358586996793747,
0.08333991467952728,
-0.04426514357328415,
-0.21226875483989716,
-0.027927875518798828,
0.08303777128458023,
-0.08751289546489716,
-0.022639770060777664
] |
null | null |
transformers
|
## GPT-2 Base Thai
GPT-2 Base Thai is a causal language model based on the [OpenAI GPT-2](https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model. It was trained on the [OSCAR](https://huggingface.co/datasets/oscar) dataset, specifically the `unshuffled_deduplicated_th` subset. The model was trained from scratch and achieved an evaluation loss of 1.708 and an evaluation perplexity of 5.516.
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the [Files and versions](https://hf.co/flax-community/gpt2-base-thai/tree/main) tab, as well as the [Training metrics](https://hf.co/flax-community/gpt2-base-thai/tensorboard) logged via Tensorboard.
## Model
| Model | #params | Arch. | Training/Validation data (text) |
| ---------------- | ------- | ----- | ------------------------------------ |
| `gpt2-base-thai` | 124M | GPT-2 | `unshuffled_deduplicated_th` Dataset |
## Evaluation Results
The model was trained for 3 epochs and the following is the final result once the training ended.
| train loss | valid loss | valid PPL | total time |
| ---------- | ---------- | --------- | ---------- |
| 1.638 | 1.708 | 5.516 | 6:12:34 |
## How to Use
### As Causal Language Model
```python
from transformers import pipeline
pretrained_name = "flax-community/gpt2-base-thai"
nlp = pipeline(
"text-generation",
model=pretrained_name,
tokenizer=pretrained_name
)
nlp("สวัสดีตอนเช้า")
```
### Feature Extraction in PyTorch
```python
from transformers import GPT2Model, GPT2TokenizerFast
pretrained_name = "flax-community/gpt2-base-thai"
model = GPT2Model.from_pretrained(pretrained_name)
tokenizer = GPT2TokenizerFast.from_pretrained(pretrained_name)
prompt = "สวัสดีตอนเช้า"
encoded_input = tokenizer(prompt, return_tensors='pt')
output = model(**encoded_input)
```
## Team Members
- Sakares Saengkaew ([@sakares](https://hf.co/sakares))
- Wilson Wongso ([@w11wo](https://hf.co/w11wo))
|
{"language": "th", "license": "mit", "tags": ["gpt2-base-thai"], "datasets": ["oscar"], "widget": [{"text": "\u0e2a\u0e27\u0e31\u0e2a\u0e14\u0e35\u0e15\u0e2d\u0e19\u0e40\u0e0a\u0e49\u0e32"}]}
|
text-generation
|
flax-community/gpt2-base-thai
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"gpt2-base-thai",
"th",
"dataset:oscar",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"th"
] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #gpt2-base-thai #th #dataset-oscar #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
GPT-2 Base Thai
---------------
GPT-2 Base Thai is a causal language model based on the OpenAI GPT-2 model. It was trained on the OSCAR dataset, specifically the 'unshuffled\_deduplicated\_th' subset. The model was trained from scratch and achieved an evaluation loss of 1.708 and an evaluation perplexity of 5.516.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the Files and versions tab, as well as the Training metrics logged via Tensorboard.
Model
-----
Evaluation Results
------------------
The model was trained for 3 epochs and the following is the final result once the training ended.
How to Use
----------
### As Causal Language Model
### Feature Extraction in PyTorch
Team Members
------------
* Sakares Saengkaew (@sakares)
* Wilson Wongso (@w11wo)
|
[
"### As Causal Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Sakares Saengkaew (@sakares)\n* Wilson Wongso (@w11wo)"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #gpt2-base-thai #th #dataset-oscar #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### As Causal Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Sakares Saengkaew (@sakares)\n* Wilson Wongso (@w11wo)"
] |
[
78,
7,
34
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #gpt2-base-thai #th #dataset-oscar #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### As Causal Language Model### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Sakares Saengkaew (@sakares)\n* Wilson Wongso (@w11wo)"
] |
[
-0.04496162384748459,
0.04038377106189728,
-0.0017499293899163604,
0.10869737714529037,
0.15497013926506042,
-0.026538187637925148,
0.11753970384597778,
0.10445954650640488,
0.012515945360064507,
-0.00336699141189456,
0.15758241713047028,
0.17976857721805573,
0.061786070466041565,
0.029374929144978523,
-0.04725491628050804,
-0.306167870759964,
0.018371963873505592,
0.07027929276227951,
0.01784869283437729,
0.1296420395374298,
0.09311247617006302,
-0.07735016942024231,
0.1305771917104721,
0.06199628859758377,
-0.13665065169334412,
0.03530709818005562,
0.056996189057826996,
-0.13961419463157654,
0.09545817226171494,
0.07054021209478378,
0.03193126246333122,
0.04917002096772194,
-0.013383960351347923,
-0.07573504000902176,
0.03251565620303154,
-0.041929613798856735,
-0.021282844245433807,
0.051874492317438126,
0.05778251215815544,
-0.055018872022628784,
0.09527172893285751,
0.021090565249323845,
-0.011369283311069012,
0.01314475480467081,
-0.11477433890104294,
-0.06565675139427185,
-0.039187200367450714,
0.027055570855736732,
0.017650015652179718,
0.12273836135864258,
-0.04369503632187843,
0.14632892608642578,
-0.12563207745552063,
0.05461902171373367,
0.15943355858325958,
-0.34078919887542725,
-0.03990815579891205,
0.029577720910310745,
0.08166193217039108,
-0.006785615347325802,
-0.05332912132143974,
0.048921599984169006,
0.06517360359430313,
0.006980499252676964,
0.048701055347919464,
-0.08442431688308716,
-0.17363321781158447,
0.021672356873750687,
-0.05089285224676132,
-0.003273076144978404,
0.28414246439933777,
0.011978941038250923,
-0.00879233330488205,
-0.08564230799674988,
-0.055655933916568756,
-0.07234102487564087,
0.027642101049423218,
0.018499640747904778,
-0.017121635377407074,
0.02515016309916973,
0.08570007234811783,
-0.008895078673958778,
-0.1462932825088501,
0.005826759152114391,
-0.14859282970428467,
0.16366048157215118,
0.014864961616694927,
0.061339665204286575,
-0.1657009869813919,
0.08793720602989197,
-0.0002229468955192715,
-0.1369991898536682,
0.013232164084911346,
-0.06390422582626343,
0.0224792268127203,
0.09732799232006073,
0.046620480716228485,
-0.04469774290919304,
-0.0037975364830344915,
0.015008159913122654,
-0.11083275824785233,
0.03206048533320427,
-0.006924004759639502,
0.09302190691232681,
0.011461044661700726,
0.15681032836437225,
-0.13776420056819916,
0.031148578971624374,
0.020275646820664406,
-0.10209047794342041,
0.05828341096639633,
-0.06414075195789337,
-0.11166403442621231,
-0.07661217451095581,
-0.0304278451949358,
-0.019092855975031853,
0.04157710075378418,
0.16591425240039825,
-0.09007322788238525,
-0.05563390627503395,
0.07582242041826248,
-0.06997902691364288,
-0.019161291420459747,
0.0008111220085993409,
-0.03394632041454315,
0.09384773671627045,
0.04002049192786217,
0.04183750972151756,
-0.059997133910655975,
0.11009574681520462,
-0.03660770505666733,
0.017419353127479553,
-0.022452520206570625,
-0.05959869176149368,
0.05057693272829056,
-0.11238458752632141,
0.02934202551841736,
-0.1142687126994133,
-0.09880520403385162,
-0.034000467509031296,
-0.018391229212284088,
-0.028430340811610222,
-0.12734508514404297,
-0.04913128912448883,
-0.05039907246828079,
0.05002763494849205,
-0.08083634078502655,
0.04603145271539688,
-0.07346752285957336,
0.08734787255525589,
-0.04555864632129669,
0.08767583221197128,
-0.1811213195323944,
0.07493805140256882,
-0.13983960449695587,
0.0276650320738554,
-0.11082255095243454,
0.06111862137913704,
-0.005763568915426731,
0.09093964099884033,
0.013524573296308517,
-0.07774439454078674,
-0.09855150431394577,
0.048061128705739975,
-0.04773213714361191,
0.20569901168346405,
-0.08666636049747467,
-0.08810427784919739,
0.1829340159893036,
-0.08013711124658585,
-0.08829745650291443,
0.1163143664598465,
-0.026252778246998787,
0.08983825892210007,
0.05402876064181328,
0.2391899824142456,
-0.03251607343554497,
-0.03404190018773079,
0.020718486979603767,
0.007696409709751606,
-0.09110449999570847,
-0.06831362843513489,
0.11428103595972061,
0.04922407120466232,
-0.05242178961634636,
0.09556789696216583,
-0.006138931028544903,
0.07000694423913956,
-0.03271651268005371,
-0.037730615586042404,
-0.015849316492676735,
-0.02150663733482361,
0.05406517907977104,
0.07037755101919174,
0.12754985690116882,
-0.058238234370946884,
-0.05136526748538017,
0.009521896950900555,
0.015251295641064644,
-0.03605028986930847,
0.035995643585920334,
-0.0856504961848259,
0.14716017246246338,
0.01543273963034153,
0.0023472148459404707,
-0.13069337606430054,
0.046984151005744934,
-0.005315765272825956,
0.14577044546604156,
0.03481852263212204,
0.06736845523118973,
0.030555687844753265,
-0.04025896638631821,
-0.045403216034173965,
-0.02269022725522518,
0.15891194343566895,
-0.022348206490278244,
-0.013487515971064568,
-0.16450805962085724,
0.0744849443435669,
-0.044752251356840134,
0.13728034496307373,
-0.09295538067817688,
0.03280532732605934,
-0.005170464050024748,
0.0699627697467804,
-0.04981248080730438,
0.029322879388928413,
0.05555165186524391,
0.0485379733145237,
-0.058569494634866714,
0.006269447039812803,
0.07280699908733368,
-0.013545294292271137,
-0.12217410653829575,
0.3076336979866028,
-0.14250151813030243,
0.1815154254436493,
0.21273192763328552,
-0.19337092339992523,
0.025639619678258896,
0.029033496975898743,
-0.02018381841480732,
0.004757057875394821,
0.012143280357122421,
0.07096941024065018,
0.12038762867450714,
-0.06718432158231735,
0.15723983943462372,
-0.08610516041517258,
-0.005353336222469807,
0.014413020573556423,
-0.05063771456480026,
-0.025038376450538635,
0.1689155250787735,
0.10067969560623169,
-0.11699817329645157,
0.21141092479228973,
0.10995849967002869,
0.06230679899454117,
0.2529853582382202,
0.002006745897233486,
0.03402617201209068,
0.022044425830245018,
-0.008366618305444717,
-0.061991676688194275,
0.02827220968902111,
-0.2073984444141388,
-0.017942070960998535,
0.06280170381069183,
-0.0381896048784256,
0.041862521320581436,
-0.1456177681684494,
-0.11391014605760574,
0.01209016889333725,
0.01467602327466011,
-0.06609932333230972,
0.11253588646650314,
0.0428752601146698,
0.12480755895376205,
-0.03814613074064255,
0.001531447982415557,
0.036491166800260544,
0.012612593360245228,
-0.11439849436283112,
0.17064198851585388,
-0.05374971777200699,
-0.31768497824668884,
-0.12162744998931885,
-0.09942649304866791,
-0.0744321271777153,
0.019402850419282913,
0.098798468708992,
-0.13224323093891144,
0.0006502075120806694,
-0.03466900438070297,
0.005690587684512138,
-0.08032841235399246,
0.010623392648994923,
-0.006112006958574057,
-0.05072502791881561,
-0.07141843438148499,
-0.04820339381694794,
-0.07302545011043549,
-0.07435422390699387,
-0.01177520863711834,
0.15720520913600922,
-0.08120601624250412,
0.059772662818431854,
0.11759937554597855,
-0.015299859456717968,
0.00677212979644537,
-0.039080243557691574,
0.12897245585918427,
-0.07923585921525955,
-0.01545041799545288,
0.2253575474023819,
-0.04712129011750221,
0.04536193981766701,
0.16023388504981995,
-0.0021228883415460587,
-0.030894611030817032,
-0.0037333827931433916,
-0.04890011250972748,
-0.09645460546016693,
-0.2816532254219055,
-0.08241657167673111,
-0.11852008104324341,
0.1420382857322693,
0.0024778638035058975,
0.04058229550719261,
0.07580959796905518,
0.11499350517988205,
-0.007282917387783527,
-0.015524673275649548,
0.013452469371259212,
0.03809385746717453,
0.2044372707605362,
-0.005768990144133568,
0.15005174279212952,
-0.06181657686829567,
-0.0808277428150177,
0.08070910722017288,
-0.030519820749759674,
0.007412626873701811,
0.030381623655557632,
0.002250185701996088,
0.04433505982160568,
0.19328643381595612,
0.1312636137008667,
0.053800083696842194,
-0.00046868110075592995,
-0.03144487366080284,
-0.04111962020397186,
-0.03579365462064743,
-0.015473505482077599,
0.05414296314120293,
-0.010859424248337746,
-0.07124108821153641,
-0.06237533316016197,
-0.003744661808013916,
0.02868163026869297,
0.1103554293513298,
0.0121153574436903,
-0.17722457647323608,
-0.034963950514793396,
0.06927957385778427,
0.017535356804728508,
-0.07091479003429413,
0.05583075433969498,
0.08328205347061157,
-0.14634248614311218,
0.12085265666246414,
0.008811096660792828,
0.09329725801944733,
-0.02427057735621929,
0.06908920407295227,
-0.054839733988046646,
-0.11157095432281494,
0.009604602120816708,
0.08306501060724258,
-0.23758868873119354,
0.24008354544639587,
-0.00843222625553608,
-0.07883550971746445,
-0.07319492101669312,
-0.01785007119178772,
-0.04062940552830696,
0.1044219508767128,
0.19383737444877625,
0.010108428075909615,
-0.03862825036048889,
-0.034580908715724945,
-0.04058525711297989,
0.03511004149913788,
0.003839826211333275,
0.005157078616321087,
-0.0006114909192547202,
0.022585894912481308,
0.03636295720934868,
0.0010285486932843924,
-0.009141040965914726,
-0.015401382930576801,
-0.11737287044525146,
0.023595673963427544,
0.03577597066760063,
0.06494908779859543,
0.005189509596675634,
-0.08597569912672043,
-0.2183316946029663,
0.14592303335666656,
-0.025437867268919945,
-0.07674792408943176,
-0.10083076357841492,
0.03328413888812065,
0.013750767335295677,
-0.10639407485723495,
0.02753225527703762,
-0.05569428205490112,
-0.0758785679936409,
-0.025557203218340874,
-0.13006865978240967,
0.14371052384376526,
-0.020287062972784042,
-0.07381792366504669,
-0.009167931042611599,
0.12081043422222137,
-0.03540244698524475,
0.04374750703573227,
-0.010738161392509937,
-0.002535321516916156,
-0.08032678067684174,
-0.12598805129528046,
-0.0049089184030890465,
0.053621213883161545,
0.06579133123159409,
-0.018243463709950447,
-0.017141178250312805,
-0.12319816648960114,
-0.06350988149642944,
-0.11564140766859055,
0.2222631424665451,
0.17059125006198883,
-0.11776276677846909,
0.16290178894996643,
0.026600345969200134,
-0.038285478949546814,
-0.2717019021511078,
-0.05834416300058365,
-0.06356488168239594,
0.044255368411540985,
0.06182565912604332,
-0.0957481786608696,
0.05001138895750046,
0.010774745605885983,
-0.0418153777718544,
-0.04451575130224228,
-0.19214044511318207,
-0.1431851089000702,
0.1177571713924408,
0.0051412396132946014,
0.36811092495918274,
-0.09652025997638702,
-0.04431290179491043,
-0.011632655747234821,
-0.23456312716007233,
0.16425324976444244,
-0.019246378913521767,
0.07217560708522797,
-0.06005090847611427,
0.05202839896082878,
0.009346784092485905,
-0.05171913653612137,
0.10654111951589584,
-0.056595832109451294,
0.007256295066326857,
-0.10191532224416733,
-0.09489487111568451,
0.050924867391586304,
0.06992367655038834,
0.06166371330618858,
-0.07688547670841217,
-0.005685730837285519,
-0.19993126392364502,
-0.058451782912015915,
-0.11891154944896698,
0.06342416256666183,
0.014428968541324139,
-0.05838482081890106,
-0.0403129905462265,
0.003486648900434375,
-0.0016788302455097437,
-0.005338530521839857,
0.11358967423439026,
-0.08841852098703384,
0.05800371617078781,
0.06971155852079391,
0.17485736310482025,
-0.04584156349301338,
0.0774969607591629,
-0.0751219093799591,
-0.029021013528108597,
0.07762763649225235,
-0.22901004552841187,
-0.030820725485682487,
0.10466643422842026,
0.006383488420397043,
0.06244872137904167,
0.07635247707366943,
-0.018868248909711838,
0.07726844400167465,
0.1057722419500351,
-0.20766541361808777,
-0.10231760889291763,
-0.04506152495741844,
-0.13251125812530518,
0.149322047829628,
-0.023032300174236298,
0.07369249314069748,
-0.08557183295488358,
-0.053532615303993225,
0.029344964772462845,
-0.012825342826545238,
-0.058084819465875626,
0.04917890578508377,
0.003891601460054517,
-0.016058340668678284,
-0.13832883536815643,
0.07057317346334457,
0.05723992735147476,
-0.11675222218036652,
0.005831859074532986,
0.10996082425117493,
-0.11425090581178665,
-0.07134886085987091,
-0.07710418105125427,
0.052473317831754684,
-0.10763594508171082,
-0.0661812499165535,
-0.05470260977745056,
-0.16105768084526062,
0.04307430610060692,
0.03471654653549194,
0.03437672182917595,
0.024447675794363022,
-0.018696974962949753,
-0.07639332860708237,
0.01894412562251091,
0.006069192197173834,
0.09352365136146545,
0.003662718692794442,
-0.13750973343849182,
-0.004365643486380577,
0.0389455184340477,
0.18837934732437134,
-0.0607331283390522,
-0.02268870174884796,
-0.12635129690170288,
0.04173160344362259,
-0.05848723277449608,
-0.031829752027988434,
-0.08936846256256104,
-0.06298315525054932,
-0.0666586235165596,
-0.10849972069263458,
-0.043884240090847015,
-0.028036370873451233,
-0.10386762768030167,
0.045863520354032516,
-0.02220897749066353,
0.0414339080452919,
-0.08306051790714264,
-0.033421970903873444,
0.1243906021118164,
-0.011865124106407166,
0.12194009870290756,
0.16216334700584412,
-0.08324018865823746,
0.06444766372442245,
-0.1728346347808838,
-0.046934958547353745,
0.09387420862913132,
0.007994945161044598,
0.038227345794439316,
0.06607978790998459,
-0.0019364856416359544,
0.07087864726781845,
0.01409262977540493,
0.02681548334658146,
0.06322760134935379,
-0.1267525851726532,
-0.008952551521360874,
0.02441699057817459,
-0.09779185056686401,
-0.07044384628534317,
0.01793057844042778,
0.05392316356301308,
0.023431016132235527,
0.08532780408859253,
-0.04388267919421196,
0.07121705263853073,
-0.12936018407344818,
0.026530077680945396,
-0.016848376020789146,
-0.14753597974777222,
-0.056385014206171036,
-0.10265843570232391,
0.05963650345802307,
-0.000717245158739388,
0.19047044217586517,
0.08711165189743042,
-0.13855068385601044,
0.00820168573409319,
0.08227048814296722,
0.0380559079349041,
0.027444854378700256,
0.20839208364486694,
0.025100979954004288,
-0.011738244444131851,
-0.0003376013191882521,
0.045591238886117935,
0.009647931903600693,
0.2394813746213913,
0.09137291461229324,
0.11863494664430618,
0.020280467346310616,
0.06770310550928116,
0.014797011390328407,
0.07806664705276489,
-0.12027682363986969,
-0.04123470559716225,
-0.11241272836923599,
0.0537257194519043,
-0.05842847004532814,
0.07132837176322937,
0.1637919545173645,
-0.044843461364507675,
0.06579006463289261,
-0.04156571999192238,
-0.05088870972394943,
-0.15733233094215393,
-0.2221544235944748,
-0.08276956528425217,
-0.11651287972927094,
0.031997814774513245,
-0.07813605666160583,
0.06780974566936493,
0.08793197572231293,
0.09569346159696579,
-0.07657121866941452,
0.13284417986869812,
0.10775536298751831,
-0.10146096348762512,
0.06958161294460297,
0.003969661425799131,
0.02890666201710701,
-0.10343984514474869,
0.0329689159989357,
-0.05751069635152817,
0.04052640125155449,
-0.013127019628882408,
0.02564917877316475,
-0.03981390967965126,
0.04158560559153557,
-0.0735088661313057,
-0.0877002477645874,
-0.055263910442590714,
0.04991035908460617,
0.06956186145544052,
0.13426946103572845,
-0.005432535894215107,
0.04288903623819351,
0.008108804002404213,
0.20826736092567444,
-0.026858970522880554,
-0.05661657452583313,
-0.08301892876625061,
0.1435549110174179,
-0.039575546979904175,
0.009486728347837925,
0.005598181392997503,
-0.013380221091210842,
-0.006417323835194111,
0.3573969304561615,
0.3328442871570587,
-0.042244307696819305,
0.051704179495573044,
-0.03771887347102165,
0.02858852967619896,
0.061659008264541626,
0.07084193080663681,
0.08252812176942825,
0.18686482310295105,
-0.1048259288072586,
0.016819598153233528,
-0.08378814905881882,
-0.013478188775479794,
-0.0360841378569603,
0.1419333815574646,
0.07560887932777405,
-0.08288053423166275,
-0.0354282520711422,
0.06819292902946472,
-0.18602478504180908,
0.05687551572918892,
-0.026532525196671486,
-0.19949021935462952,
-0.06645601242780685,
-0.013129637576639652,
0.06574133038520813,
0.0719410851597786,
0.03533691167831421,
-0.04443651810288429,
-0.00643602479249239,
-0.017444122582674026,
0.05743194743990898,
-0.2002345472574234,
0.050570838153362274,
0.09577792882919312,
-0.0312931165099144,
0.029052838683128357,
-0.02190171740949154,
0.06810776144266129,
0.06002834066748619,
0.0732356533408165,
-0.02846195176243782,
-0.0019226169679313898,
0.00020852001034654677,
-0.005587604828178883,
-0.029779434204101562,
-0.016434321179986,
0.05511116981506348,
-0.07518036663532257,
0.023326408118009567,
-0.030892036855220795,
0.06516412645578384,
-0.06291647255420685,
-0.020267551764845848,
-0.07149612158536911,
0.05861653760075569,
-0.07167109847068787,
0.09196391701698303,
0.1193593218922615,
-0.009690451435744762,
-0.046699024736881256,
-0.09087616205215454,
-0.027535270899534225,
-0.05324605852365494,
-0.1094527319073677,
-0.04977606236934662,
-0.12094256281852722,
-0.030577395111322403,
0.07468285411596298,
0.06340523064136505,
-0.2027042955160141,
0.023024801164865494,
-0.1200936958193779,
-0.022828344255685806,
-0.06342774629592896,
0.02721049264073372,
0.08567241579294205,
0.008267275989055634,
-0.015060296282172203,
-0.05998215079307556,
0.06744623929262161,
0.08132293075323105,
-0.07862848788499832,
-0.08114167302846909
] |
null | null |
transformers
|
# Bengali GPT-2
Bengali GPT-2 demo. Part of the [Huggingface JAX/Flax event](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/). Also features a [finetuned](https://huggingface.co/khalidsaifullaah/bengali-lyricist-gpt2?) model on bengali song lyrics.
# Model Description
OpenAI GPT-2 model was proposed in [Language Models are Unsupervised Multitask Learners](https://paperswithcode.com/paper/language-models-are-unsupervised-multitask) paper .Original GPT2 model was a causal (unidirectional) transformer pretrained using language modeling on a very large corpus of ~40 GB of text data. This model has same configuration but has been pretrained on bengali corpus of mC4(multilingual C4) dataset. The code for training the model has all been open-sourced [here](https://huggingface.co/flax-community/gpt2-bengali/tree/main).
# Training Details
Overall Result:
```Eval loss : 1.45, Eval Perplexity : 3.141```
Data: [mC4-bn](https://huggingface.co/datasets/mc4)
Train Steps: 250k steps
link 🤗 flax-community/gpt2-bengali
Demo : https://huggingface.co/spaces/flax-community/Gpt2-bengali
# Usage
For using the model there are multiple options available. For example using the pipeline directly we can try to generate sentences.
```
from transformers import pipeline
gpt2_bengali = pipeline('text-generation',model="flax-community/gpt2-bengali", tokenizer='flax-community/gpt2-bengali')
```
Similarly for using the finetuned model on bangla songs we can use following.
```
from transformers import pipeline
singer = pipeline('text-generation',model="khalidsaifullaah/bengali-lyricist-gpt2", tokenizer='khalidsaifullaah/bengali-lyricist-gpt2')
```
For using on other tasks the model needs to be fine-tuned on custom datasets. Details can be found in huggingface [documentation](https://huggingface.co/transformers/training.html)
# Contributors
* Khalid Saifullah
* Tasmiah Tahsin Mayeesha
* Ritobrata Ghosh
* Ibrahim Musa
* M Saiful Bari
### BibTeX entry and citation info
@misc {flax_community_2023,
author = { {Flax Community} },
title = { gpt2-bengali (Revision cb8fff6) },
year = 2023,
url = { https://huggingface.co/flax-community/gpt2-bengali },
doi = { 10.57967/hf/0938 },
publisher = { Hugging Face }
}
|
{"language": "bn", "license": "mit", "datasets": ["mc4"]}
|
text-generation
|
flax-community/gpt2-bengali
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"bn",
"dataset:mc4",
"doi:10.57967/hf/0938",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"bn"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #bn #dataset-mc4 #doi-10.57967/hf/0938 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Bengali GPT-2
Bengali GPT-2 demo. Part of the Huggingface JAX/Flax event. Also features a finetuned model on bengali song lyrics.
# Model Description
OpenAI GPT-2 model was proposed in Language Models are Unsupervised Multitask Learners paper .Original GPT2 model was a causal (unidirectional) transformer pretrained using language modeling on a very large corpus of ~40 GB of text data. This model has same configuration but has been pretrained on bengali corpus of mC4(multilingual C4) dataset. The code for training the model has all been open-sourced here.
# Training Details
Overall Result:
Data: mC4-bn
Train Steps: 250k steps
link flax-community/gpt2-bengali
Demo : URL
# Usage
For using the model there are multiple options available. For example using the pipeline directly we can try to generate sentences.
Similarly for using the finetuned model on bangla songs we can use following.
For using on other tasks the model needs to be fine-tuned on custom datasets. Details can be found in huggingface documentation
# Contributors
* Khalid Saifullah
* Tasmiah Tahsin Mayeesha
* Ritobrata Ghosh
* Ibrahim Musa
* M Saiful Bari
### BibTeX entry and citation info
@misc {flax_community_2023,
author = { {Flax Community} },
title = { gpt2-bengali (Revision cb8fff6) },
year = 2023,
url = { URL },
doi = { 10.57967/hf/0938 },
publisher = { Hugging Face }
}
|
[
"# Bengali GPT-2\n\nBengali GPT-2 demo. Part of the Huggingface JAX/Flax event. Also features a finetuned model on bengali song lyrics.",
"# Model Description\n\nOpenAI GPT-2 model was proposed in Language Models are Unsupervised Multitask Learners paper .Original GPT2 model was a causal (unidirectional) transformer pretrained using language modeling on a very large corpus of ~40 GB of text data. This model has same configuration but has been pretrained on bengali corpus of mC4(multilingual C4) dataset. The code for training the model has all been open-sourced here.",
"# Training Details\n\nOverall Result: \n\n\n\nData: mC4-bn\n\nTrain Steps: 250k steps\n\nlink flax-community/gpt2-bengali\n\nDemo : URL",
"# Usage \n\nFor using the model there are multiple options available. For example using the pipeline directly we can try to generate sentences.\n\n\n\nSimilarly for using the finetuned model on bangla songs we can use following.\n\n\n\nFor using on other tasks the model needs to be fine-tuned on custom datasets. Details can be found in huggingface documentation",
"# Contributors\n* Khalid Saifullah\n* Tasmiah Tahsin Mayeesha\n* Ritobrata Ghosh\n* Ibrahim Musa\n* M Saiful Bari",
"### BibTeX entry and citation info\n\n@misc {flax_community_2023,\nauthor = { {Flax Community} },\ntitle = { gpt2-bengali (Revision cb8fff6) },\nyear = 2023,\nurl = { URL },\ndoi = { 10.57967/hf/0938 },\npublisher = { Hugging Face }\n}"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #bn #dataset-mc4 #doi-10.57967/hf/0938 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Bengali GPT-2\n\nBengali GPT-2 demo. Part of the Huggingface JAX/Flax event. Also features a finetuned model on bengali song lyrics.",
"# Model Description\n\nOpenAI GPT-2 model was proposed in Language Models are Unsupervised Multitask Learners paper .Original GPT2 model was a causal (unidirectional) transformer pretrained using language modeling on a very large corpus of ~40 GB of text data. This model has same configuration but has been pretrained on bengali corpus of mC4(multilingual C4) dataset. The code for training the model has all been open-sourced here.",
"# Training Details\n\nOverall Result: \n\n\n\nData: mC4-bn\n\nTrain Steps: 250k steps\n\nlink flax-community/gpt2-bengali\n\nDemo : URL",
"# Usage \n\nFor using the model there are multiple options available. For example using the pipeline directly we can try to generate sentences.\n\n\n\nSimilarly for using the finetuned model on bangla songs we can use following.\n\n\n\nFor using on other tasks the model needs to be fine-tuned on custom datasets. Details can be found in huggingface documentation",
"# Contributors\n* Khalid Saifullah\n* Tasmiah Tahsin Mayeesha\n* Ritobrata Ghosh\n* Ibrahim Musa\n* M Saiful Bari",
"### BibTeX entry and citation info\n\n@misc {flax_community_2023,\nauthor = { {Flax Community} },\ntitle = { gpt2-bengali (Revision cb8fff6) },\nyear = 2023,\nurl = { URL },\ndoi = { 10.57967/hf/0938 },\npublisher = { Hugging Face }\n}"
] |
[
89,
38,
108,
36,
75,
31,
89
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #bn #dataset-mc4 #doi-10.57967/hf/0938 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Bengali GPT-2\n\nBengali GPT-2 demo. Part of the Huggingface JAX/Flax event. Also features a finetuned model on bengali song lyrics.# Model Description\n\nOpenAI GPT-2 model was proposed in Language Models are Unsupervised Multitask Learners paper .Original GPT2 model was a causal (unidirectional) transformer pretrained using language modeling on a very large corpus of ~40 GB of text data. This model has same configuration but has been pretrained on bengali corpus of mC4(multilingual C4) dataset. The code for training the model has all been open-sourced here.# Training Details\n\nOverall Result: \n\n\n\nData: mC4-bn\n\nTrain Steps: 250k steps\n\nlink flax-community/gpt2-bengali\n\nDemo : URL# Usage \n\nFor using the model there are multiple options available. For example using the pipeline directly we can try to generate sentences.\n\n\n\nSimilarly for using the finetuned model on bangla songs we can use following.\n\n\n\nFor using on other tasks the model needs to be fine-tuned on custom datasets. Details can be found in huggingface documentation# Contributors\n* Khalid Saifullah\n* Tasmiah Tahsin Mayeesha\n* Ritobrata Ghosh\n* Ibrahim Musa\n* M Saiful Bari### BibTeX entry and citation info\n\n@misc {flax_community_2023,\nauthor = { {Flax Community} },\ntitle = { gpt2-bengali (Revision cb8fff6) },\nyear = 2023,\nurl = { URL },\ndoi = { 10.57967/hf/0938 },\npublisher = { Hugging Face }\n}"
] |
[
-0.10797099024057388,
0.18628671765327454,
-0.0017445123521611094,
0.06183341145515442,
0.12639981508255005,
0.045987728983163834,
0.1704426258802414,
0.10378281772136688,
-0.012636031955480576,
0.046501901000738144,
0.020428374409675598,
0.007003278937190771,
0.08612826466560364,
0.16388192772865295,
0.08248800784349442,
-0.2152705192565918,
0.005562655162066221,
-0.0898207500576973,
0.014039513655006886,
0.08544716984033585,
0.10629650950431824,
-0.007844998501241207,
0.05888935923576355,
0.0024070374201983213,
-0.12291999906301498,
0.051086653023958206,
-0.07516582310199738,
-0.015661083161830902,
0.03371246159076691,
0.04031994193792343,
0.08868079632520676,
0.040516793727874756,
0.0591038279235363,
-0.1470535397529602,
0.02552153542637825,
0.08225564658641815,
-0.02337244525551796,
0.017414260655641556,
0.09848800301551819,
-0.09596065431833267,
0.13819870352745056,
-0.026628760620951653,
0.03807102143764496,
0.10748134553432465,
-0.10968460887670517,
-0.07437822222709656,
-0.07506784051656723,
0.04678488150238991,
0.06757964938879013,
0.07899108529090881,
-0.04231247305870056,
0.040789201855659485,
0.009598182514309883,
0.04982117563486099,
0.10090962052345276,
-0.16170050203800201,
-0.043323393911123276,
0.14152082800865173,
0.057461217045784,
0.04490731656551361,
-0.0687345340847969,
0.02078285627067089,
0.04722100868821144,
-0.005103603936731815,
0.02670275606215,
-0.062011849135160446,
-0.06406483799219131,
-0.036083534359931946,
-0.06932123005390167,
0.029444335028529167,
0.13020463287830353,
-0.01767984963953495,
-0.11272469162940979,
-0.15814274549484253,
-0.007837225683033466,
0.039772287011146545,
-0.004096847027540207,
0.07155322283506393,
0.03125794231891632,
0.03342936933040619,
0.04883643612265587,
-0.08344187587499619,
-0.09263478964567184,
-0.04877777025103569,
0.08379103988409042,
0.03876909613609314,
0.008446675725281239,
0.02720276266336441,
-0.00175127771217376,
0.07994439452886581,
-0.09564033150672913,
-0.06637907028198242,
-0.064853735268116,
-0.08297025412321091,
-0.14672689139842987,
0.01845017820596695,
-0.04139866679906845,
-0.09994976967573166,
0.025475813075900078,
0.0909445658326149,
0.023735571652650833,
0.061781592667102814,
0.004024411551654339,
0.007677508518099785,
0.06465722620487213,
0.08842359483242035,
-0.09408815205097198,
-0.07340806722640991,
0.03846048563718796,
0.0241757333278656,
0.032173145562410355,
-0.015349953435361385,
-0.04592626169323921,
-0.022959547117352486,
-0.04983900487422943,
0.06175671145319939,
-0.0059866937808692455,
0.07201825082302094,
-0.04550628736615181,
-0.05670717731118202,
0.16771402955055237,
-0.11798640340566635,
0.021764041855931282,
0.036781441420316696,
-0.004047136288136244,
-0.0038174677174538374,
0.10698488354682922,
-0.0029788301326334476,
-0.035485461354255676,
-0.08209426701068878,
-0.06897582858800888,
0.0249945018440485,
-0.07263217866420746,
-0.11889518052339554,
0.00440175412222743,
-0.10927145183086395,
-0.05025239288806915,
-0.09271864593029022,
-0.1940523087978363,
-0.08108627051115036,
0.041630733758211136,
-0.053542040288448334,
-0.01068933680653572,
-0.051395148038864136,
-0.019742917269468307,
-0.03992512449622154,
0.04510583356022835,
-0.05779559165239334,
-0.016723815351724625,
-0.008951284922659397,
-0.010616459883749485,
0.066097192466259,
0.03791334480047226,
0.03789835795760155,
-0.022658398374915123,
0.020571405068039894,
-0.16596105694770813,
0.15018218755722046,
-0.09960991144180298,
0.08987991511821747,
-0.0808035209774971,
-0.025517500936985016,
0.06708773970603943,
0.03157426416873932,
0.06739646941423416,
0.17052607238292694,
-0.2017068862915039,
-0.00986354611814022,
0.2575817406177521,
-0.10537519305944443,
-0.056793417781591415,
0.07835735380649567,
-0.0323980376124382,
0.10807650536298752,
0.057721517980098724,
0.07933276891708374,
0.17285381257534027,
-0.11501846462488174,
0.0447319932281971,
0.007377089932560921,
0.04131043329834938,
0.024896178394556046,
0.05370906740427017,
-0.06711647659540176,
0.007384298834949732,
0.0595579631626606,
-0.00282992422580719,
-0.0022845016792416573,
-0.008435795083642006,
-0.045939840376377106,
0.004153241403400898,
-0.09634407609701157,
-0.031590718775987625,
0.03385922685265541,
-0.03235219419002533,
-0.04662565514445305,
-0.08635514974594116,
0.009653308428823948,
0.1481582373380661,
-0.08913245797157288,
-0.006558256223797798,
-0.05829884484410286,
-0.020536409690976143,
-0.05420547351241112,
-0.013909026980400085,
-0.1058417335152626,
-0.08208712190389633,
0.03659052774310112,
-0.0881950855255127,
0.05179267376661301,
-0.012503758072853088,
0.02728758007287979,
0.07599267363548279,
-0.010327528230845928,
-0.0161899421364069,
0.020863022655248642,
-0.02114933729171753,
-0.0008723131031729281,
-0.11043117195367813,
-0.08353480696678162,
-0.07020438462495804,
0.2605392634868622,
-0.198772594332695,
0.053838182240724564,
-0.058292996138334274,
0.07380224019289017,
0.00555718969553709,
-0.08867751061916351,
0.16072531044483185,
0.002152783563360572,
0.013706360012292862,
-0.09340187907218933,
0.03626757115125656,
0.06912906467914581,
-0.051672156900167465,
0.051229219883680344,
-0.14299240708351135,
-0.12228506058454514,
0.021401768550276756,
0.06493937969207764,
-0.12024758756160736,
-0.009336395189166069,
-0.029649345204234123,
-0.022606318816542625,
-0.06744781136512756,
-0.014201096259057522,
0.07963065803050995,
0.021842708811163902,
0.13137832283973694,
-0.09340331703424454,
-0.028410907834768295,
0.004350003786385059,
-0.0449741967022419,
0.03445066139101982,
0.09496110677719116,
-0.0049170637503266335,
-0.056157637387514114,
0.0640939474105835,
-0.10093455016613007,
0.010221527889370918,
0.20327898859977722,
0.03624444082379341,
-0.06596840918064117,
-0.04808039963245392,
0.033434901386499405,
-0.0037797100376337767,
0.06225045397877693,
-0.06346875429153442,
-0.04254290461540222,
0.029743408784270287,
-0.019068734720349312,
0.02888205274939537,
-0.06477013975381851,
-0.011430551297962666,
0.015550922602415085,
-0.0570431612432003,
-0.05372873321175575,
0.04952336102724075,
-0.030597012490034103,
0.02632225677371025,
-0.03925997018814087,
0.07455677539110184,
-0.029346412047743797,
-0.04823904484510422,
-0.13544097542762756,
0.171664759516716,
-0.1310669481754303,
-0.24613094329833984,
-0.10995429754257202,
0.047525059431791306,
-0.07798492163419724,
-0.020220404490828514,
0.030173704028129578,
-0.07176787406206131,
-0.059582918882369995,
-0.10168762505054474,
-0.016175461933016777,
0.0014423099346458912,
-0.09278321266174316,
-0.0549292154610157,
0.006036627572029829,
0.04062434658408165,
-0.17017407715320587,
0.007217035163193941,
0.048876818269491196,
-0.14216390252113342,
0.029452968388795853,
-0.03889559581875801,
0.02020331658422947,
0.09951429814100266,
0.04298333823680878,
0.01617300696671009,
0.009887094609439373,
0.21117790043354034,
-0.055942196398973465,
0.0938970148563385,
0.09403060376644135,
0.01594685949385166,
0.06264953315258026,
0.116103895008564,
0.023624008521437645,
-0.0732458233833313,
0.022488173097372055,
0.029279207810759544,
-0.0455610454082489,
-0.21209703385829926,
-0.07957414537668228,
-0.08674667030572891,
0.012674513272941113,
0.005672627128660679,
0.038858216255903244,
0.041883375495672226,
0.04340539872646332,
-0.028465664014220238,
0.010786473751068115,
0.05178003013134003,
0.08766277879476547,
-0.06506936252117157,
-0.006220967974513769,
0.05511864274740219,
-0.09854238480329514,
0.061893511563539505,
0.08806504309177399,
0.004728710278868675,
0.20392253994941711,
-0.017405327409505844,
0.16265973448753357,
0.11178048700094223,
0.06321410834789276,
0.021157627925276756,
0.061359528452157974,
-0.04313405603170395,
0.0396527498960495,
0.003961215727031231,
-0.08031749725341797,
-0.0321313813328743,
0.07459306716918945,
-0.033829882740974426,
-0.07933005690574646,
-0.023312732577323914,
-0.08728070557117462,
0.02154754102230072,
0.130473792552948,
0.07425873726606369,
-0.19698970019817352,
-0.033606886863708496,
0.0179147906601429,
-0.0008908594027161598,
-0.06614771485328674,
-0.02388552762567997,
0.07185718417167664,
-0.10822969675064087,
0.09696214646100998,
-0.008608531206846237,
0.09645982086658478,
-0.04463876038789749,
-0.04677335172891617,
0.042723000049591064,
0.08527782559394836,
-0.022126512601971626,
0.06289955228567123,
-0.11767709255218506,
0.14644953608512878,
0.017053252086043358,
0.09091461449861526,
-0.01223208662122488,
0.019997529685497284,
0.04945238679647446,
0.05878221616148949,
0.10757090151309967,
0.04592299833893776,
0.007989049889147282,
-0.04519160836935043,
-0.03417002409696579,
0.03284076228737831,
0.05154005438089371,
-0.045399535447359085,
0.08862286061048508,
-0.015912674367427826,
-0.020630476996302605,
-0.041040897369384766,
0.0701747015118599,
-0.20394930243492126,
-0.16929516196250916,
0.06083244830369949,
-0.03465864062309265,
-0.0015696618938818574,
-0.047105081379413605,
-0.01058002095669508,
0.005585004575550556,
0.21266578137874603,
0.00679119722917676,
-0.152967631816864,
-0.12311963737010956,
-0.0595739372074604,
0.13418704271316528,
-0.04473694786429405,
0.04014576971530914,
-0.022915491834282875,
0.11371012032032013,
-0.014924966730177402,
-0.060862358659505844,
-0.028561105951666832,
-0.09167870879173279,
-0.09530375897884369,
-0.01881924457848072,
0.08571169525384903,
0.058653637766838074,
-0.0044579110108315945,
0.034440331161022186,
0.010630708187818527,
0.006455065682530403,
-0.10865303874015808,
-0.038327254354953766,
0.1970355063676834,
-0.008086413145065308,
0.056161683052778244,
-0.08561708778142929,
-0.08564304560422897,
-0.025275370106101036,
-0.05558084696531296,
0.05532008036971092,
0.26548752188682556,
-0.04420473054051399,
0.14781510829925537,
0.16429740190505981,
-0.12105285376310349,
-0.13355185091495514,
-0.013613762333989143,
0.037680380046367645,
0.044735878705978394,
-0.0233291182667017,
-0.21643555164337158,
0.025062503293156624,
-0.002876052400097251,
-0.005394906736910343,
0.0551825650036335,
-0.23127053678035736,
-0.11646565049886703,
0.06207791715860367,
0.014817418530583382,
0.013459132052958012,
-0.1593300700187683,
-0.05621417984366417,
-0.08190298080444336,
-0.16842055320739746,
0.10408764332532883,
-0.09222985059022903,
0.10894810408353806,
0.010864336974918842,
0.05676329880952835,
0.0414259098470211,
-0.03670663386583328,
0.10346312075853348,
0.03171023726463318,
0.02920583263039589,
-0.05420246347784996,
0.09622350335121155,
0.08251595497131348,
-0.04673176258802414,
0.1332807093858719,
-0.042861711233854294,
0.06378615647554398,
-0.14792954921722412,
-0.05780676007270813,
-0.0738401785492897,
0.05133747681975365,
-0.06498360633850098,
-0.04891195893287659,
-0.010706167668104172,
0.10030945390462875,
0.07959841936826706,
0.0008346429676748812,
-0.15094338357448578,
-0.0695701390504837,
0.0931466668844223,
0.1320982724428177,
0.11465577781200409,
0.046486541628837585,
-0.027094794437289238,
0.0006465957849286497,
-0.027028407901525497,
0.02691466175019741,
-0.15998287498950958,
0.01534009724855423,
0.03730553761124611,
0.06647440046072006,
0.11964605003595352,
-0.01100109051913023,
-0.1260078102350235,
0.01679256744682789,
0.06918013095855713,
-0.07647508382797241,
-0.0749690979719162,
-0.02787303738296032,
-0.03830692917108536,
-0.041664864867925644,
0.010599176399409771,
0.12322096526622772,
-0.040612008422613144,
-0.046500932425260544,
-0.006184907164424658,
0.04751445725560188,
-0.04713679850101471,
0.05983791872859001,
0.05058853328227997,
0.02123236283659935,
-0.06588450819253922,
0.10700362175703049,
0.038098521530628204,
0.008325793780386448,
0.026674441993236542,
0.17140179872512817,
-0.10032790154218674,
-0.06715234369039536,
-0.06064034625887871,
0.12486716359853745,
-0.05373053252696991,
-0.019195716828107834,
0.011309140361845493,
0.013359195552766323,
-0.03931694105267525,
0.06312376260757446,
0.025699058547616005,
0.03222271427512169,
0.03137907758355141,
0.00292310887016356,
-0.10463520884513855,
0.057079918682575226,
0.1076069325208664,
0.0014979704283177853,
-0.042292941361665726,
0.1193469762802124,
0.04644342511892319,
0.023573052138090134,
-0.005720934830605984,
-0.04693017899990082,
-0.0992700457572937,
-0.03591230884194374,
-0.16744448244571686,
0.0006158979958854616,
-0.0962657555937767,
-0.0033476129174232483,
-0.002439723117277026,
-0.015367478132247925,
-0.005525595974177122,
0.022300122305750847,
-0.045984458178281784,
-0.05018997937440872,
-0.0578465610742569,
0.09913507848978043,
-0.11977420747280121,
-0.05001486837863922,
0.04649925231933594,
-0.07459646463394165,
0.07774565368890762,
0.01840978115797043,
-0.07202622294425964,
-0.0027247548568993807,
-0.09747245907783508,
0.042796723544597626,
-0.053306058049201965,
-0.019329490140080452,
0.013417869806289673,
-0.12193069607019424,
0.019925300031900406,
-0.014574448578059673,
0.018390383571386337,
-0.02144557237625122,
0.09810464084148407,
-0.07367508113384247,
0.012894240207970142,
-0.038284409791231155,
-0.012564594857394695,
-0.0827859491109848,
0.05462762713432312,
0.06708823144435883,
0.06994403898715973,
0.04590650647878647,
-0.07264851778745651,
0.08444689959287643,
-0.13697944581508636,
-0.05108209699392319,
0.005931265186518431,
0.014983517117798328,
-0.02571238949894905,
-0.01642696000635624,
0.05791531130671501,
-0.03444088622927666,
0.14896412193775177,
-0.011628407984972,
0.024717798456549644,
-0.01550365798175335,
-0.0255749374628067,
-0.008741538971662521,
0.013605672866106033,
0.08680548518896103,
-0.00878213346004486,
0.008397913537919521,
-0.01584312692284584,
0.019279519096016884,
-0.035871803760528564,
-0.040928035974502563,
0.015625357627868652,
0.08911175280809402,
0.029113851487636566,
0.047823719680309296,
0.03741276264190674,
-0.08111891150474548,
-0.03514337167143822,
0.0036791530437767506,
-0.008321264758706093,
0.07532469183206558,
-0.11207611113786697,
0.11145801842212677,
0.1331479549407959,
-0.11378730833530426,
0.14099527895450592,
0.015550090000033379,
-0.10318896174430847,
-0.08029687404632568,
-0.16789619624614716,
-0.03618854284286499,
-0.05957338586449623,
0.02464691735804081,
-0.09896552562713623,
0.07410194724798203,
0.04308430477976799,
0.02843131683766842,
-0.002782580442726612,
0.10445605963468552,
0.013415898196399212,
-0.08058898895978928,
0.07438897341489792,
-0.024389607831835747,
0.04481085389852524,
0.0278787761926651,
0.03472467139363289,
0.046773944050073624,
0.06258177012205124,
0.03909021243453026,
0.06247113272547722,
0.03179137036204338,
0.03853309527039528,
-0.05275142192840576,
-0.09167233854532242,
-0.025558296591043472,
0.012606210075318813,
0.02628391422331333,
0.12334311008453369,
0.05827879160642624,
-0.044587474316358566,
0.008389582857489586,
0.15276959538459778,
-0.03752705082297325,
-0.10284947603940964,
-0.16923148930072784,
0.1442679911851883,
-0.05347830429673195,
-0.03131253644824028,
0.025579215958714485,
-0.08642005175352097,
-0.014440796338021755,
0.24750439822673798,
0.2126064896583557,
-0.011165264062583447,
0.0024684711825102568,
-0.0013955923495814204,
-0.01344454474747181,
-0.003865774953737855,
0.12974300980567932,
-0.0034566326066851616,
0.21436281502246857,
-0.05094689875841141,
0.08263122290372849,
-0.0073353881016373634,
-0.07753579318523407,
-0.0824587270617485,
0.13496805727481842,
-0.0362476147711277,
-0.0009109158418141305,
-0.0008962173014879227,
0.09483213722705841,
-0.12330707907676697,
-0.1918596476316452,
0.019946999847888947,
-0.022796813398599625,
-0.11691681295633316,
-0.043286822736263275,
-0.06367385387420654,
0.05214817076921463,
0.07931739091873169,
-0.001066910452209413,
0.009120810776948929,
0.24733774363994598,
-0.031154274940490723,
-0.03775633126497269,
-0.08633315563201904,
0.05047912523150444,
-0.13416820764541626,
0.140629842877388,
0.0025412060786038637,
0.04504913091659546,
0.07950147986412048,
0.010499633848667145,
-0.1362055540084839,
0.010077997110784054,
0.020752757787704468,
-0.011477692052721977,
0.009384330362081528,
0.14736828207969666,
-0.05308718979358673,
-0.01692279987037182,
0.012807933613657951,
-0.04332565888762474,
0.04788896068930626,
0.06520865112543106,
0.019161861389875412,
-0.11861825734376907,
0.0701848492026329,
-0.09157728403806686,
0.13904571533203125,
0.13734662532806396,
-0.04589586332440376,
-0.032831091433763504,
-0.05631330981850624,
0.036287035793066025,
0.025347832590341568,
0.009869311936199665,
-0.01823563501238823,
-0.1636437475681305,
0.012016997672617435,
0.04857061430811882,
0.09649090468883514,
-0.22081997990608215,
-0.0042332992888987064,
0.020699331536889076,
-0.013750177808105946,
0.0050916424952447414,
0.14335592091083527,
0.030088739469647408,
0.06886424869298935,
-0.010866081342101097,
-0.09055259078741074,
-0.00011163517774548382,
0.09216814488172531,
-0.11254049092531204,
-0.1027706041932106
] |
null | null |
transformers
|
# GPT2-large-indonesian
|
{}
|
text-generation
|
flax-community/gpt2-large-indonesian
|
[
"transformers",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# GPT2-large-indonesian
|
[
"# GPT2-large-indonesian"
] |
[
"TAGS\n#transformers #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# GPT2-large-indonesian"
] |
[
43,
10
] |
[
"passage: TAGS\n#transformers #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# GPT2-large-indonesian"
] |
[
0.02811204083263874,
-0.05066734552383423,
-0.004684174433350563,
-0.007172132842242718,
0.08957123011350632,
0.05101438984274864,
0.09813382476568222,
0.14996646344661713,
-0.061120860278606415,
-0.030013974756002426,
0.1420995444059372,
0.05514911189675331,
0.07165269553661346,
0.09954197704792023,
-0.007927956990897655,
-0.34259098768234253,
0.10400163382291794,
0.02719084359705448,
0.03543150797486305,
0.1604887694120407,
0.07797504216432571,
0.010110680013895035,
0.13592606782913208,
0.003328171791508794,
-0.14152786135673523,
-0.009918301366269588,
-0.029870707541704178,
-0.16482312977313995,
0.08248496800661087,
0.055038757622241974,
0.09144702553749084,
0.007677176035940647,
-0.08573193103075027,
-0.10363259166479111,
0.005245127249509096,
-0.013066090643405914,
-0.05613615736365318,
0.0111299529671669,
0.03221726045012474,
0.02015790157020092,
0.14340242743492126,
0.04937322810292244,
-0.04884364455938339,
0.011545783840119839,
-0.16971473395824432,
-0.1552797257900238,
-0.07277929037809372,
0.023661168292164803,
0.1080055683851242,
0.052331313490867615,
-0.038738615810871124,
0.061461880803108215,
-0.11533760279417038,
-0.0017723352648317814,
0.10851700603961945,
-0.3040813207626343,
0.0034229306038469076,
0.07457038760185242,
0.05692318081855774,
0.06613777577877045,
-0.0011685793288052082,
0.05393325537443161,
0.02034877985715866,
0.009975157678127289,
0.007921495474874973,
-0.15006360411643982,
-0.13169507682323456,
-0.0253083948045969,
-0.011789761483669281,
-0.02717222087085247,
0.2651369571685791,
-0.03235160931944847,
0.011842623353004456,
-0.07751282304525375,
-0.05618983879685402,
-0.017052048817276955,
-0.028347710147500038,
-0.06771771609783173,
-0.061957553029060364,
0.1119869127869606,
0.11179344356060028,
-0.10238772630691528,
-0.1112029030919075,
-0.01698618195950985,
-0.21476809680461884,
0.1294095665216446,
0.0007527530542574823,
0.04539289325475693,
-0.14947932958602905,
0.05886619910597801,
-0.14291727542877197,
-0.12239277362823486,
-0.040125925093889236,
-0.05531623214483261,
0.015076791867613792,
0.060491811484098434,
0.031885113567113876,
-0.010026617906987667,
0.10290458053350449,
0.1329319030046463,
0.023752113804221153,
0.02335185930132866,
0.01683179847896099,
0.07116387039422989,
-0.019476313143968582,
0.10430994629859924,
-0.09713487327098846,
-0.05006835237145424,
0.04727422818541527,
-0.1856917291879654,
0.04247839376330376,
-0.07866678386926651,
-0.12021182477474213,
-0.05462121590971947,
0.041764695197343826,
0.08668625354766846,
-0.06205619126558304,
0.11411865800619125,
-0.021326007321476936,
-0.007400013972073793,
0.15223777294158936,
-0.06440002471208572,
-0.054415084421634674,
-0.08388112485408783,
0.03599093109369278,
0.15877549350261688,
-0.020060552284121513,
0.05788504332304001,
-0.06577984243631363,
0.07656589150428772,
-0.021402228623628616,
-0.012350914999842644,
-0.01815728098154068,
0.019359391182661057,
0.013039523735642433,
-0.045283835381269455,
0.006059896666556597,
-0.15453094244003296,
-0.19282160699367523,
0.039220284670591354,
0.013314644806087017,
-0.07914160192012787,
0.0028494626749306917,
-0.004154028370976448,
-0.0012406421592459083,
0.06245838850736618,
-0.09281262010335922,
0.04761199280619621,
-0.05214497819542885,
0.0748603567481041,
0.02585858851671219,
0.13745787739753723,
-0.15965138375759125,
0.03453628718852997,
-0.11301077157258987,
0.0817394107580185,
-0.12266305088996887,
0.07746198028326035,
0.011046678759157658,
0.07333019375801086,
-0.07283182442188263,
-0.022403109818696976,
-0.09090176224708557,
0.013831663876771927,
-0.07777482271194458,
0.13041433691978455,
-0.08944423496723175,
-0.087073914706707,
0.2901938259601593,
-0.11012953519821167,
-0.18823261559009552,
0.1062612235546112,
0.06583338230848312,
0.16451005637645721,
0.006315934006124735,
0.1632373034954071,
-0.04749903455376625,
0.09447561204433441,
0.08340388536453247,
0.07002393156290054,
-0.13155530393123627,
-0.01492486521601677,
0.07917963713407516,
0.007199201732873917,
-0.06894554197788239,
0.06511809676885605,
0.07729960232973099,
0.09135358035564423,
0.017908725887537003,
-0.027817362919449806,
-0.0029800920747220516,
-0.00882324855774641,
0.055445440113544464,
-0.05798441171646118,
0.149868905544281,
-0.07894174009561539,
-0.01488600391894579,
-0.019681744277477264,
0.0027642769273370504,
-0.015678107738494873,
0.009739236906170845,
-0.05816413462162018,
0.04700068384408951,
-0.1132846549153328,
0.0757797360420227,
-0.09410171210765839,
-0.08727898448705673,
0.007524343207478523,
0.07707031816244125,
0.025319978594779968,
0.06535717099905014,
0.05799991264939308,
-0.08776714652776718,
-0.050453267991542816,
0.02964709885418415,
0.16898971796035767,
-0.016025345772504807,
0.044451754540205,
0.050722457468509674,
0.11516357213258743,
0.01641404628753662,
0.01864025928080082,
-0.030906451866030693,
-0.011000378057360649,
0.06001666188240051,
0.07599096745252609,
-0.02340751700103283,
0.051460012793540955,
0.036831438541412354,
0.006094899959862232,
-0.048846285790205,
-0.031503066420555115,
0.022573892027139664,
-0.005079397466033697,
-0.11945810168981552,
0.2737031579017639,
-0.13954634964466095,
0.2597409188747406,
0.17789770662784576,
-0.18702347576618195,
-0.028503810986876488,
-0.004976194351911545,
0.004275076556950808,
-0.0327334888279438,
-0.00536592211574316,
0.025312896817922592,
0.07825327664613724,
-0.021956216543912888,
0.16352614760398865,
-0.06167246028780937,
-0.041219450533390045,
0.026994580402970314,
-0.10899320989847183,
0.02730744332075119,
0.014437033794820309,
0.06933577358722687,
-0.1665225774049759,
0.20501287281513214,
0.1704988330602646,
0.028557149693369865,
0.1324114203453064,
0.036973968148231506,
-0.007895736023783684,
0.004198581445962191,
0.03676016628742218,
-0.009949052706360817,
-0.08326759934425354,
-0.2251995950937271,
-0.03716995567083359,
0.10951722413301468,
0.10434464365243912,
0.06107316166162491,
-0.09705917537212372,
-0.11090965569019318,
-0.004731324035674334,
-0.01719200797379017,
0.032086823135614395,
0.11051575094461441,
-0.008955221623182297,
0.11073839664459229,
-0.004531978629529476,
0.08816104382276535,
0.08317741006612778,
0.03138930723071098,
-0.07114919275045395,
0.17887118458747864,
-0.06805647164583206,
-0.45258137583732605,
-0.1294320970773697,
-0.15330956876277924,
-0.04693707078695297,
0.03539033979177475,
0.08797261863946915,
-0.15266267955303192,
-0.04383993148803711,
0.021184004843235016,
0.04438090696930885,
-0.03286688029766083,
0.06586222350597382,
-0.09037745743989944,
0.04291391372680664,
-0.08965540677309036,
-0.04891601949930191,
-0.04355911538004875,
0.06112286075949669,
-0.0778236836194992,
0.18904204666614532,
-0.1428210288286209,
0.06342446804046631,
0.17692022025585175,
0.01816827803850174,
0.032510388642549515,
0.008347326889634132,
0.14618758857250214,
-0.20816263556480408,
0.023369280621409416,
0.20701107382774353,
-0.05585138499736786,
0.029068883508443832,
0.10034483671188354,
-0.017299488186836243,
-0.14259585738182068,
0.043262746185064316,
-0.036730941385030746,
-0.12107329070568085,
-0.21625493466854095,
-0.04683463275432587,
-0.12948782742023468,
0.02659609168767929,
-0.025184497237205505,
0.01773768477141857,
0.18289943039417267,
0.13090255856513977,
0.012757167220115662,
0.1388225108385086,
-0.009365160949528217,
0.0444011315703392,
0.16958999633789062,
-0.004420675802975893,
0.12317249923944473,
-0.11136649549007416,
-0.11349102854728699,
0.11113248765468597,
0.09805002808570862,
0.14057353138923645,
0.09914173185825348,
0.08676576614379883,
0.0546618290245533,
0.09802552312612534,
0.1481153815984726,
0.08911989629268646,
-0.06535768508911133,
-0.042573489248752594,
-0.036419760435819626,
-0.02750317007303238,
-0.0399814210832119,
0.05850493535399437,
-0.06963636726140976,
-0.14306093752384186,
0.05556304380297661,
-0.09151680767536163,
0.11012826859951019,
0.029322953894734383,
0.04884912818670273,
-0.13168807327747345,
-0.019607752561569214,
0.0752972960472107,
-0.03216267377138138,
-0.09007572382688522,
0.03634435310959816,
0.06909561902284622,
-0.17704930901527405,
0.1030656099319458,
0.01991885155439377,
0.09032343327999115,
-0.10656246542930603,
0.02984022907912731,
-0.048089321702718735,
-0.08072621375322342,
0.026410382241010666,
0.15775354206562042,
-0.29438912868499756,
0.2288670390844345,
0.025204867124557495,
-0.09521939605474472,
-0.17760314047336578,
0.011116265319287777,
-0.002285582944750786,
0.05226479470729828,
0.12641438841819763,
0.03194408491253853,
-0.10273265838623047,
-0.06294884532690048,
-0.042842209339141846,
0.06078469380736351,
0.14065243303775787,
0.01804060861468315,
-0.07234535366296768,
-0.0318424329161644,
0.025521328672766685,
-0.03273088112473488,
0.03498188778758049,
0.020388703793287277,
-0.14332304894924164,
0.12035651504993439,
0.04258483275771141,
0.06737904995679855,
0.010484321042895317,
0.012860113754868507,
-0.005233005620539188,
0.3286935091018677,
-0.01627269759774208,
-0.11203907430171967,
-0.10203519463539124,
-0.09631522744894028,
0.11133057624101639,
-0.09372401982545853,
0.016338424757122993,
-0.0809992253780365,
0.0025902381166815758,
-0.024518147110939026,
-0.1008715108036995,
0.1629895567893982,
-0.028493814170360565,
-0.07999370247125626,
-0.032750800251960754,
0.1111699789762497,
-0.09441658854484558,
0.007429718971252441,
0.03342307358980179,
-0.0031370092183351517,
-0.019070982933044434,
-0.17366255819797516,
-0.01671796664595604,
0.006593375466763973,
0.008069094270467758,
0.01450510323047638,
0.025116492062807083,
0.08515553921461105,
0.012175710871815681,
-0.15558253228664398,
0.2617487609386444,
0.13618110120296478,
0.006329054478555918,
0.12930798530578613,
0.08982541412115097,
-0.01692555844783783,
-0.24300414323806763,
-0.13670732080936432,
-0.09906212240457535,
-0.039208050817251205,
-0.12061066180467606,
-0.13069890439510345,
0.08078339695930481,
0.17958328127861023,
-0.013634027913212776,
0.07609373331069946,
-0.19948814809322357,
-0.10201722383499146,
0.051923129707574844,
-0.035075753927230835,
0.3387985825538635,
-0.1617126166820526,
-0.12214874476194382,
-0.08881524205207825,
-0.22322341799736023,
0.10726836323738098,
0.0033559103030711412,
0.0973559096455574,
-0.025532348081469536,
0.10067238658666611,
-0.009375933557748795,
-0.00818980485200882,
0.20037055015563965,
-0.021854162216186523,
-0.022260110825300217,
-0.14673040807247162,
-0.0030959818977862597,
0.09883671998977661,
0.028694286942481995,
0.0813300758600235,
-0.06657539308071136,
-0.009366362355649471,
-0.11703474819660187,
-0.07271721214056015,
-0.06591934710741043,
0.0008430993184447289,
0.06969505548477173,
-0.10456939041614532,
-0.05063632130622864,
0.01603226363658905,
-0.027880936861038208,
0.00607907772064209,
0.16720882058143616,
-0.026502979919314384,
0.02072216384112835,
-0.07527901977300644,
0.10646446794271469,
-0.20094558596611023,
0.11433456093072891,
-0.11152051389217377,
-0.03482307866215706,
0.054552145302295685,
-0.21685001254081726,
0.04340788349509239,
0.05232424661517143,
-0.09269130229949951,
0.11630988866090775,
0.06095186993479729,
-0.023813461884856224,
0.11925604939460754,
0.15892530977725983,
-0.13049767911434174,
-0.003063267096877098,
-0.028797529637813568,
0.007158576510846615,
0.12231450527906418,
-0.012152228504419327,
0.06538104265928268,
-0.0173460952937603,
-0.055170174688100815,
0.007725448813289404,
0.0023925325367599726,
-0.1100689098238945,
0.07520143687725067,
-0.0465938001871109,
0.03489874675869942,
-0.12156925350427628,
0.1273411512374878,
0.10505268722772598,
-0.1051122397184372,
0.03664473444223404,
0.15077342092990875,
-0.1288975477218628,
-0.059225451201200485,
0.021155014634132385,
0.044710274785757065,
-0.09607105702161789,
-0.05772332102060318,
0.008752419613301754,
-0.09288632869720459,
0.003643547184765339,
0.0010231775231659412,
0.036623090505599976,
0.08170533925294876,
0.00932937953621149,
-0.03453895077109337,
0.015178636647760868,
-0.08000741899013519,
0.026722121983766556,
0.0647917091846466,
-0.0460461750626564,
-0.03560318052768707,
0.07028356939554214,
0.14761438965797424,
-0.09272325783967972,
-0.07674585282802582,
-0.13249069452285767,
-0.014208013191819191,
-0.12427332997322083,
-0.03989288955926895,
-0.15958353877067566,
-0.06772012263536453,
-0.03389280289411545,
-0.07350800186395645,
-0.05913163721561432,
-0.09294024854898453,
-0.08091611415147781,
0.01626664213836193,
-0.07615883648395538,
0.020874911919236183,
-0.041828565299510956,
-0.029810242354869843,
0.13565489649772644,
-0.007415184751152992,
0.12462737411260605,
0.10060807317495346,
-0.062395092099905014,
0.12210012227296829,
-0.09478050470352173,
-0.04951652139425278,
0.13545763492584229,
0.019533095881342888,
0.05806230008602142,
0.025759179145097733,
0.02779967710375786,
0.08013854175806046,
0.05462673678994179,
0.09141054004430771,
-0.0026855478063225746,
-0.09986986219882965,
-0.021671514958143234,
-0.118051677942276,
-0.13077741861343384,
-0.05033320561051369,
-0.02383297123014927,
0.0998251810669899,
-0.02311205118894577,
0.059756387025117874,
-0.033260297030210495,
0.09757044911384583,
-0.08436809480190277,
0.011993510648608208,
-0.03522314131259918,
-0.18362939357757568,
0.08774633705615997,
-0.09179412573575974,
-0.005915871821343899,
0.0009724001283757389,
0.3727034330368042,
-0.01440065074712038,
0.0208100825548172,
0.0033566842321306467,
0.07199636846780777,
0.05430831387639046,
0.0030619241297245026,
0.30928266048431396,
0.1775280237197876,
-0.05106590688228607,
-0.11533010751008987,
0.04974765703082085,
0.0037832672242075205,
-0.043088845908641815,
0.08414392173290253,
0.04242204874753952,
-0.01585908979177475,
0.09572742134332657,
-0.0203822311013937,
0.012200219556689262,
-0.10519994795322418,
-0.10799309611320496,
-0.06512961536645889,
0.037769269198179245,
0.02253798395395279,
0.020637189969420433,
0.184016615152359,
-0.08025680482387543,
0.013464853167533875,
-0.005913803353905678,
-0.0684928447008133,
-0.13275490701198578,
-0.26903989911079407,
-0.09346435219049454,
-0.1499210149049759,
0.010710561648011208,
-0.11137942969799042,
0.04079641401767731,
0.10371968150138855,
0.08298210799694061,
-0.06265225261449814,
0.15190072357654572,
-0.03869635611772537,
-0.08258391171693802,
0.029457801952958107,
-0.02798415720462799,
0.04331117868423462,
0.028574684634804726,
-0.03521847352385521,
-0.09831030666828156,
0.07172544300556183,
0.016647450625896454,
0.03473278880119324,
-0.05666454881429672,
-0.016173230484128,
-0.12594807147979736,
-0.09269367903470993,
-0.08721211552619934,
0.04348405450582504,
-0.0679047703742981,
0.02947699837386608,
0.01731206849217415,
-0.003496593562886119,
0.041630856692790985,
0.2051858752965927,
-0.01883663423359394,
-0.09303999692201614,
-0.09272931516170502,
0.07646723091602325,
-0.045046575367450714,
0.07511233538389206,
-0.03820200636982918,
0.0033412601333111525,
-0.07304340600967407,
0.39367949962615967,
0.21989300847053528,
-0.06250608712434769,
0.01680579222738743,
0.02688107080757618,
0.05141010880470276,
0.07798364758491516,
0.14601048827171326,
0.12854549288749695,
0.15069104731082916,
-0.053181715309619904,
-0.05344543978571892,
-0.005109013989567757,
-0.04991833120584488,
-0.06861035525798798,
0.1114191859960556,
0.04247317835688591,
-0.04303540661931038,
-0.03224460408091545,
0.12732350826263428,
-0.2159341722726822,
0.16516830027103424,
-0.0748794823884964,
-0.11466099321842194,
-0.13601039350032806,
0.027463098987936974,
0.024546531960368156,
0.03383683040738106,
0.034713536500930786,
0.0362718366086483,
-0.04597996547818184,
0.039326418191194534,
0.0946497991681099,
-0.2266804575920105,
-0.05187763273715973,
0.06428997218608856,
0.06648904830217361,
0.02947293221950531,
0.020948832854628563,
-0.0009388249018229544,
0.06242028996348381,
0.05936049669981003,
-0.04635310918092728,
0.09789669513702393,
0.013256646692752838,
0.07859932631254196,
-0.011539970524609089,
0.08864881843328476,
0.002397169591858983,
-0.1842276155948639,
0.11642444878816605,
-0.1244763508439064,
0.02982272021472454,
0.06666505336761475,
0.02599898725748062,
-0.03194998949766159,
-0.025819167494773865,
-0.07528071850538254,
0.09022209048271179,
0.13677968084812164,
-0.028872201219201088,
0.013684232719242573,
-0.08928373456001282,
0.06022040918469429,
-0.009123103693127632,
-0.043530479073524475,
-0.1398894488811493,
-0.09980308264493942,
-0.14447645843029022,
0.06970030814409256,
0.03120245411992073,
-0.10910728573799133,
0.04126356914639473,
-0.09777442365884781,
0.05666563659906387,
-0.09970380365848541,
0.09758339077234268,
0.0218003261834383,
0.06588061153888702,
-0.004588695243000984,
-0.1712597906589508,
0.06044748052954674,
0.05040385574102402,
-0.08889402449131012,
-0.06358937174081802
] |
null | null |
transformers
|
# GPT2-medium-indonesian
This is a pretrained model on Indonesian language using a causal language modeling (CLM) objective, which was first
introduced in [this paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
and first released at [this page](https://openai.com/blog/better-language-models/).
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
The demo can be found [here](https://huggingface.co/spaces/flax-community/gpt2-indonesian).
## How to use
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='flax-community/gpt2-medium-indonesian')
>>> set_seed(42)
>>> generator("Sewindu sudah kita tak berjumpa,", max_length=30, num_return_sequences=5)
[{'generated_text': 'Sewindu sudah kita tak berjumpa, dua dekade lalu, saya hanya bertemu sekali. Entah mengapa, saya lebih nyaman berbicara dalam bahasa Indonesia, bahasa Indonesia'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, tapi dalam dua hari ini, kita bisa saja bertemu.”\
“Kau tau, bagaimana dulu kita bertemu?” aku'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, banyak kisah yang tersimpan. Tak mudah tuk kembali ke pelukan, di mana kini kita berada, sebuah tempat yang jauh'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, sejak aku lulus kampus di Bandung, aku sempat mencari kabar tentangmu. Ah, masih ada tempat di hatiku,'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, tapi Tuhan masih saja menyukarkan doa kita masing-masing.\
Tuhan akan memberi lebih dari apa yang kita'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('flax-community/gpt2-medium-indonesian')
model = GPT2Model.from_pretrained('flax-community/gpt2-medium-indonesian')
text = "Ubah dengan teks apa saja."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('flax-community/gpt2-medium-indonesian')
model = TFGPT2Model.from_pretrained('flax-community/gpt2-medium-indonesian')
text = "Ubah dengan teks apa saja."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Limitations and bias
The training data used for this model are Indonesian websites of [OSCAR](https://oscar-corpus.com/),
[mc4](https://huggingface.co/datasets/mc4) and [Wikipedia](https://huggingface.co/datasets/wikipedia). The datasets
contain a lot of unfiltered content from the internet, which is far from neutral. While we have done some filtering on
the dataset (see the **Training data** section), the filtering is by no means a thorough mitigation of biased content
that is eventually used by the training data. These biases might also affect models that are fine-tuned using this model.
As the openAI team themselves point out in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md#out-of-scope-use-cases):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases
> that require the generated text to be true.
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we
> do not recommend that they be deployed into systems that interact with humans > unless the deployers first carry
> out a study of biases relevant to the intended use-case. We found no statistically significant difference in gender,
> race, and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with
> similar levels of caution around use cases that are sensitive to biases around human attributes.
We have done a basic bias analysis that you can find in this [notebook](https://huggingface.co/flax-community/gpt2-small-indonesian/blob/main/bias_analysis/gpt2_medium_indonesian_bias_analysis.ipynb), performed on [Indonesian GPT2 medium](https://huggingface.co/flax-community/gpt2-medium-indonesian), based on the bias analysis for [Polish GPT2](https://huggingface.co/flax-community/papuGaPT2) with modifications.
### Gender bias
We generated 50 texts starting with prompts "She/He works as". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.

The most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).

### Ethnicity bias
We generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:
* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)
* Topic - we will use 5 different topics:
* random act: *entered home*
* said: *said*
* works as: *works as*
* intent: *let [person] ...*
* define: *is*
Sample of generated prompt: "seorang perempuan sunda masuk ke rumah..." (a Sundanese woman enters the house...)
We used a [model](https://huggingface.co/Hate-speech-CNERG/dehatebert-mono-indonesian) trained on Indonesian hate speech corpus ([dataset 1](https://github.com/okkyibrohim/id-multi-label-hate-speech-and-abusive-language-detection), [dataset 2](https://github.com/ialfina/id-hatespeech-detection)) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.

### Religion bias
With the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.

## Training data
The model was trained on a combined dataset of [OSCAR](https://oscar-corpus.com/), [mc4](https://huggingface.co/datasets/mc4)
and Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB
of data in total. The mc4 dataset was cleaned using [this filtering script](https://github.com/Wikidepia/indonesian_datasets/blob/master/dump/mc4/cleanup.py)
and we also only included links that have been cited by the Indonesian Wikipedia.
## Training procedure
The model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was `6d 3h 7m 26s`.
### Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
| dataset | train loss | eval loss | eval perplexity |
| ---------- | ---------- | -------------- | ---------- |
| ID OSCAR+mc4+Wikipedia (29GB) | 2.79 | 2.696 | 14.826 |
### Tracking
The training process was tracked in [TensorBoard](https://huggingface.co/flax-community/gpt2-medium-indonesian/tensorboard) and [Weights and Biases](https://wandb.ai/wandb/hf-flax-gpt2-indonesian?workspace=user-cahya).
## Team members
- Akmal ([@Wikidepia](https://huggingface.co/Wikidepia))
- alvinwatner ([@alvinwatner](https://huggingface.co/alvinwatner))
- Cahya Wirawan ([@cahya](https://huggingface.co/cahya))
- Galuh Sahid ([@Galuh](https://huggingface.co/Galuh))
- Muhammad Agung Hambali ([@AyameRushia](https://huggingface.co/AyameRushia))
- Muhammad Fhadli ([@muhammadfhadli](https://huggingface.co/muhammadfhadli))
- Samsul Rahmadani ([@munggok](https://huggingface.co/munggok))
## Future work
We would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains
if we can get the necessary hardware resources.
|
{"language": "id", "widget": [{"text": "Sewindu sudah kita tak berjumpa, rinduku padamu sudah tak terkira."}]}
|
text-generation
|
flax-community/gpt2-medium-indonesian
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"id",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"id"
] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
GPT2-medium-indonesian
======================
This is a pretrained model on Indonesian language using a causal language modeling (CLM) objective, which was first
introduced in this paper
and first released at this page.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week
organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
The demo can be found here.
How to use
----------
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:
Here is how to use this model to get the features of a given text in PyTorch:
and in TensorFlow:
Limitations and bias
--------------------
The training data used for this model are Indonesian websites of OSCAR,
mc4 and Wikipedia. The datasets
contain a lot of unfiltered content from the internet, which is far from neutral. While we have done some filtering on
the dataset (see the Training data section), the filtering is by no means a thorough mitigation of biased content
that is eventually used by the training data. These biases might also affect models that are fine-tuned using this model.
As the openAI team themselves point out in their model card:
>
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases
> that require the generated text to be true.
>
>
>
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we
> do not recommend that they be deployed into systems that interact with humans > unless the deployers first carry
> out a study of biases relevant to the intended use-case. We found no statistically significant difference in gender,
> race, and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with
> similar levels of caution around use cases that are sensitive to biases around human attributes.
>
>
>
We have done a basic bias analysis that you can find in this notebook, performed on Indonesian GPT2 medium, based on the bias analysis for Polish GPT2 with modifications.
### Gender bias
We generated 50 texts starting with prompts "She/He works as". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.
!gender bias - male
The most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).
!gender bias - female
### Ethnicity bias
We generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:
* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)
* Topic - we will use 5 different topics:
+ random act: *entered home*
+ said: *said*
+ works as: *works as*
+ intent: *let [person] ...*
+ define: *is*
Sample of generated prompt: "seorang perempuan sunda masuk ke rumah..." (a Sundanese woman enters the house...)
We used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.
!bias analysis - ethnicities
### Religion bias
With the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.
!bias analysis - ethnicities
Training data
-------------
The model was trained on a combined dataset of OSCAR, mc4
and Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB
of data in total. The mc4 dataset was cleaned using this filtering script
and we also only included links that have been cited by the Indonesian Wikipedia.
Training procedure
------------------
The model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '6d 3h 7m 26s'.
### Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
### Tracking
The training process was tracked in TensorBoard and Weights and Biases.
Team members
------------
* Akmal (@Wikidepia)
* alvinwatner (@alvinwatner)
* Cahya Wirawan (@cahya)
* Galuh Sahid (@Galuh)
* Muhammad Agung Hambali (@AyameRushia)
* Muhammad Fhadli (@muhammadfhadli)
* Samsul Rahmadani (@munggok)
Future work
-----------
We would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains
if we can get the necessary hardware resources.
|
[
"### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female",
"### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities",
"### Religion bias\n\n\nWith the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities\n\n\nTraining data\n-------------\n\n\nThe model was trained on a combined dataset of OSCAR, mc4\nand Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB\nof data in total. The mc4 dataset was cleaned using this filtering script\nand we also only included links that have been cited by the Indonesian Wikipedia.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '6d 3h 7m 26s'.",
"### Evaluation results\n\n\nThe model achieves the following results without any fine-tuning (zero-shot):",
"### Tracking\n\n\nThe training process was tracked in TensorBoard and Weights and Biases.\n\n\nTeam members\n------------\n\n\n* Akmal (@Wikidepia)\n* alvinwatner (@alvinwatner)\n* Cahya Wirawan (@cahya)\n* Galuh Sahid (@Galuh)\n* Muhammad Agung Hambali (@AyameRushia)\n* Muhammad Fhadli (@muhammadfhadli)\n* Samsul Rahmadani (@munggok)\n\n\nFuture work\n-----------\n\n\nWe would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains\nif we can get the necessary hardware resources."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female",
"### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities",
"### Religion bias\n\n\nWith the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities\n\n\nTraining data\n-------------\n\n\nThe model was trained on a combined dataset of OSCAR, mc4\nand Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB\nof data in total. The mc4 dataset was cleaned using this filtering script\nand we also only included links that have been cited by the Indonesian Wikipedia.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '6d 3h 7m 26s'.",
"### Evaluation results\n\n\nThe model achieves the following results without any fine-tuning (zero-shot):",
"### Tracking\n\n\nThe training process was tracked in TensorBoard and Weights and Biases.\n\n\nTeam members\n------------\n\n\n* Akmal (@Wikidepia)\n* alvinwatner (@alvinwatner)\n* Cahya Wirawan (@cahya)\n* Galuh Sahid (@Galuh)\n* Muhammad Agung Hambali (@AyameRushia)\n* Muhammad Fhadli (@muhammadfhadli)\n* Samsul Rahmadani (@munggok)\n\n\nFuture work\n-----------\n\n\nWe would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains\nif we can get the necessary hardware resources."
] |
[
60,
127,
264,
259,
23,
140
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities"
] |
[
-0.04489096254110336,
0.06275404244661331,
-0.004786321427673101,
-0.02615656889975071,
0.0868164598941803,
-0.0016813717084005475,
0.1643567532300949,
0.06583932042121887,
0.10427580773830414,
0.06171996891498566,
0.04013139754533768,
0.015930458903312683,
0.15617109835147858,
0.0700860470533371,
0.0922953262925148,
-0.20395231246948242,
0.06766536086797714,
-0.04951043426990509,
0.04493051394820213,
0.14349710941314697,
0.11145194619894028,
-0.03259984031319618,
0.06576304137706757,
0.014279777184128761,
-0.00850764662027359,
-0.022755952551960945,
-0.08095331490039825,
-0.004414992406964302,
0.039373233914375305,
0.051418982446193695,
0.07321556657552719,
0.08044420182704926,
-0.06345728039741516,
-0.17520290613174438,
0.02034158818423748,
-0.02517702989280224,
0.01986014097929001,
-0.013233442790806293,
0.08011128008365631,
-0.028625469654798508,
0.16101671755313873,
-0.0643794909119606,
-0.032937999814748764,
0.09092523902654648,
-0.135210320353508,
-0.042141951620578766,
-0.06090759113430977,
0.07688608765602112,
0.10064081847667694,
0.05376177653670311,
-0.08734073489904404,
0.09212926030158997,
-0.037607260048389435,
0.02916876971721649,
0.09932209551334381,
-0.0478072389960289,
0.0004977171774953604,
-0.09855789691209793,
0.043582916259765625,
0.16484731435775757,
-0.07984182238578796,
-0.08112022280693054,
-0.011572451330721378,
-0.0246422179043293,
-0.00013500767818186432,
-0.09217078238725662,
0.04196534678339958,
-0.07180270552635193,
-0.11941486597061157,
-0.05932609736919403,
0.1309184730052948,
0.07864393293857574,
-0.09213370084762573,
-0.11853981763124466,
-0.003014232963323593,
0.037364471703767776,
-0.014451744966208935,
0.009714540094137192,
-0.07956606149673462,
-0.03588274493813515,
0.12624245882034302,
-0.041165657341480255,
-0.09947405755519867,
0.029436230659484863,
0.07592351734638214,
0.0672244280576706,
0.01782429963350296,
0.013845459558069706,
-0.05576953664422035,
0.011337492614984512,
0.06298565119504929,
-0.0988190695643425,
-0.06380608677864075,
-0.05748876556754112,
-0.11530966311693192,
0.04743875190615654,
0.007575251627713442,
-0.08895516395568848,
0.05166751518845558,
0.013885307125747204,
0.042854342609643936,
0.08663172274827957,
-0.058594025671482086,
0.07127916812896729,
0.08075334131717682,
0.10443511605262756,
0.025236446410417557,
-0.056545600295066833,
-0.018890973180532455,
0.027314765378832817,
0.005256336648017168,
0.006242147646844387,
-0.02104334719479084,
0.014256635680794716,
0.0007832144619897008,
0.14945143461227417,
0.003561286721378565,
0.12923476099967957,
-0.0614289715886116,
-0.04260526970028877,
0.1588594615459442,
-0.12048999965190887,
0.00542640034109354,
0.031263601034879684,
-0.03712959215044975,
-0.034359339624643326,
-0.009111103601753712,
-0.008623450063169003,
-0.062452953308820724,
-0.06062769889831543,
-0.024964073672890663,
0.03657094016671181,
-0.11276771873235703,
-0.10053031891584396,
0.04067963361740112,
-0.10241757333278656,
-0.054215360432863235,
-0.12453606724739075,
-0.17200039327144623,
-0.11376003921031952,
-0.04522956162691116,
-0.038015589118003845,
-0.02458176575601101,
-0.11186863481998444,
-0.01580834574997425,
0.015372583642601967,
-0.00468224985525012,
-0.0439908467233181,
0.0049478402361273766,
0.09945935755968094,
-0.048803068697452545,
0.10391699522733688,
0.04517488181591034,
0.04004789888858795,
-0.1142410933971405,
0.0751446783542633,
-0.13437190651893616,
0.0825745016336441,
-0.0826103687286377,
0.06576205044984818,
-0.09879343956708908,
-0.031334567815065384,
-0.054692018777132034,
0.057152774184942245,
-0.0340275801718235,
0.19022591412067413,
-0.16816595196723938,
-0.02552792802453041,
0.20757240056991577,
-0.15230755507946014,
-0.07276777923107147,
0.07549554854631424,
0.01995103806257248,
0.25765350461006165,
0.02813834138214588,
0.17638307809829712,
-0.03238378465175629,
0.06157878041267395,
0.050732776522636414,
-0.09403111040592194,
-0.07563401758670807,
0.08935600519180298,
0.019188884645700455,
-0.10312877595424652,
0.013770036399364471,
-0.01417951937764883,
0.034093908965587616,
0.042330510914325714,
-0.008200707845389843,
-0.07484795153141022,
0.07253947108983994,
-0.020120641216635704,
-0.04531698673963547,
-0.021415675058960915,
-0.042732786387205124,
-0.019425248727202415,
-0.07009349763393402,
-0.06976904720067978,
0.04095856845378876,
-0.05901651084423065,
0.03578784316778183,
-0.13712723553180695,
-0.05958123505115509,
0.03406418859958649,
0.028017351403832436,
-0.14160814881324768,
0.007125725504010916,
-0.004182835109531879,
-0.04410940408706665,
0.06686476618051529,
0.030216477811336517,
0.03675389662384987,
-0.030101478099822998,
0.0231024082750082,
-0.04115072637796402,
0.050048213452100754,
-0.031806718558073044,
-0.035797182470560074,
-0.07640838623046875,
-0.0018914187094196677,
-0.01677260734140873,
0.1582290530204773,
-0.07368727773427963,
0.026021292433142662,
0.012794267386198044,
-0.0066567035391926765,
0.0361417680978775,
-0.052017465233802795,
0.07941952347755432,
0.07543177157640457,
-0.00401352159678936,
-0.02401915192604065,
0.04098919779062271,
-0.006226236000657082,
-0.06152769923210144,
0.16778241097927094,
-0.17676664888858795,
-0.2145194560289383,
0.032340437173843384,
0.004378214478492737,
-0.0990326926112175,
0.10354606062173843,
-0.04144890233874321,
-0.025958359241485596,
0.023400546982884407,
-0.09138214588165283,
-0.07061412930488586,
0.009759180247783661,
0.039154358208179474,
-0.014949943870306015,
-0.02954651042819023,
0.006378857884556055,
-0.021512873470783234,
-0.05502314120531082,
0.08550170063972473,
0.11305650323629379,
-0.16707752645015717,
0.1278413087129593,
-0.043406110256910324,
0.04108860343694687,
0.12656420469284058,
0.005073213018476963,
-0.06491792947053909,
-0.02182544395327568,
0.06959240138530731,
-0.01773097552359104,
0.054456017911434174,
-0.23255108296871185,
-0.04634300246834755,
0.03196325525641441,
-0.04117218032479286,
-0.002866873051971197,
-0.03909071162343025,
-0.008162363432347775,
-0.018935825675725937,
-0.003426983254030347,
-0.046229515224695206,
0.050223544239997864,
0.015099292621016502,
0.09634140878915787,
-0.02860342524945736,
-0.021958565339446068,
-0.0007912915898486972,
-0.04876403883099556,
-0.156667098402977,
0.06637033075094223,
-0.004986380226910114,
-0.2770151197910309,
-0.0716325044631958,
0.040597524493932724,
0.04667360708117485,
-0.005473010707646608,
0.08029156178236008,
-0.056542761623859406,
-0.0737883448600769,
-0.09799866378307343,
0.15215235948562622,
0.028531569987535477,
0.007924959063529968,
-0.03851770982146263,
-0.013891804032027721,
0.02216138318181038,
-0.05919528752565384,
-0.005283677019178867,
0.03568321466445923,
-0.022953689098358154,
0.11588326096534729,
-0.062223292887210846,
0.11882288008928299,
0.083025723695755,
0.08751678466796875,
-0.02362169697880745,
-0.03105797804892063,
0.29500555992126465,
-0.12349864095449448,
0.1082409918308258,
-0.020726259797811508,
-0.14609333872795105,
0.0609184205532074,
0.16121463477611542,
-0.030159812420606613,
-0.0740416869521141,
0.05224674940109253,
0.07931829243898392,
0.002218795707449317,
-0.17360098659992218,
-0.01553620956838131,
-0.07211554795503616,
-0.000056332690292038023,
-0.023693308234214783,
0.06600984930992126,
0.06627184897661209,
0.04727620258927345,
-0.11802396178245544,
-0.006462098099291325,
0.02791876718401909,
0.049698665738105774,
-0.04133177921175957,
-0.006302962079644203,
0.0028611270245164633,
-0.09843285381793976,
-0.08167684078216553,
-0.027223411947488785,
-0.09470132738351822,
0.23771637678146362,
0.08911001682281494,
0.1657843291759491,
0.15932615101337433,
0.10471352189779282,
0.034818898886442184,
0.016049357131123543,
-0.03410033509135246,
0.03474897891283035,
-0.02581661567091942,
-0.0634244978427887,
-0.012777182273566723,
0.07623308151960373,
0.025920137763023376,
-0.10705152899026871,
0.055151280015707016,
-0.12113255262374878,
0.07255881279706955,
0.14734186232089996,
-0.030325084924697876,
-0.06640291213989258,
-0.007854390889406204,
0.0855606347322464,
-0.04916951432824135,
-0.04926488175988197,
-0.07277592271566391,
0.0514618456363678,
-0.10326845198869705,
0.06373179703950882,
0.010437026619911194,
0.06855442374944687,
-0.04453966021537781,
-0.00013600834063254297,
-0.07883434742689133,
-0.04705927520990372,
-0.03213423490524292,
0.11118632555007935,
-0.27964577078819275,
0.21198584139347076,
0.026466721668839455,
0.042919185012578964,
-0.135472372174263,
-0.07549015432596207,
-0.015517636202275753,
-0.08859428018331528,
0.15059541165828705,
0.03211711719632149,
-0.1079828068614006,
-0.11682774126529694,
-0.0059679909609258175,
0.026042291894555092,
0.11734745651483536,
-0.031732089817523956,
0.12861895561218262,
0.04347239062190056,
-0.002474364824593067,
-0.043918266892433167,
0.0954575166106224,
-0.16210225224494934,
-0.10646453499794006,
0.042752455919981,
-0.029505880549550056,
0.10047132521867752,
0.0004170217434875667,
-0.0008477639639750123,
-0.04958498105406761,
0.15469837188720703,
-0.21925541758537292,
-0.13029475510120392,
-0.06383874267339706,
-0.035592060536146164,
0.058841772377491,
-0.09281379729509354,
-0.06518245488405228,
0.04716673865914345,
0.10304185003042221,
-0.04015552997589111,
-0.014222221449017525,
0.05023610219359398,
-0.007702202536165714,
-0.11949686706066132,
-0.054228540509939194,
0.1046336367726326,
0.1774710714817047,
0.07950203120708466,
-0.05513283610343933,
-0.028208840638399124,
0.04824677109718323,
-0.11067336052656174,
0.03528566285967827,
0.10841435939073563,
-0.17052654922008514,
0.020787248387932777,
0.002097430406138301,
-0.056752707809209824,
-0.16519726812839508,
-0.15009792149066925,
0.07863721251487732,
0.23687177896499634,
0.017698902636766434,
0.11080516129732132,
0.0761948823928833,
-0.0592135451734066,
-0.18890392780303955,
-0.05563625320792198,
0.03273911401629448,
0.020794348791241646,
0.027821412310004234,
-0.1748761385679245,
-0.14862266182899475,
0.01989855244755745,
0.022203529253602028,
0.021725749596953392,
-0.25481948256492615,
-0.08537107706069946,
0.10443674772977829,
0.052277181297540665,
0.06785009056329727,
-0.1374172419309616,
-0.07449858635663986,
-0.03973868489265442,
-0.07214745134115219,
0.13526739180088043,
0.08284594863653183,
0.028839262202382088,
-0.0016530337743461132,
0.13759706914424896,
0.03293934464454651,
-0.029587699100375175,
0.13298918306827545,
0.044269587844610214,
0.0358281247317791,
-0.13271528482437134,
-0.021146779879927635,
0.08944305032491684,
0.018375657498836517,
0.12010505795478821,
-0.016962865367531776,
-0.128730908036232,
-0.09143435955047607,
-0.05376395955681801,
-0.14487136900424957,
0.010909629054367542,
-0.06340937316417694,
0.0018563588382676244,
-0.07419843226671219,
0.06594225764274597,
0.06815208494663239,
0.03013487718999386,
0.024315401911735535,
-0.07226715236902237,
0.00011414341861382127,
0.019993344321846962,
0.12323511391878128,
0.0844239816069603,
-0.03284722939133644,
-0.038666386157274246,
0.02445608749985695,
0.0736997127532959,
-0.054498493671417236,
-0.013133129104971886,
0.04491177573800087,
-0.04067424684762955,
0.214471697807312,
-0.03063875436782837,
-0.1821209192276001,
0.07158331573009491,
0.09442627429962158,
-0.025711575523018837,
-0.2418290227651596,
0.010594678111374378,
0.0005465000285767019,
-0.1425148993730545,
-0.1407657265663147,
0.03285687789320946,
-0.013773538172245026,
-0.009136601351201534,
-0.03313335031270981,
0.10464591532945633,
-0.031609032303094864,
0.08259003609418869,
0.04571054130792618,
0.07068806886672974,
-0.018702661618590355,
0.045432768762111664,
0.09635515511035919,
-0.10466364026069641,
0.059241559356451035,
0.17983917891979218,
-0.08383239060640335,
-0.06225695461034775,
-0.029310276731848717,
0.08783617615699768,
-0.0571293942630291,
-0.0753815770149231,
-0.007912895642220974,
-0.1032925546169281,
0.004221760202199221,
0.1029040664434433,
-0.054570142179727554,
0.11362680047750473,
0.09993942081928253,
-0.017270294949412346,
-0.011518009006977081,
0.008511144667863846,
0.01295475009828806,
-0.03357672318816185,
0.05458831414580345,
0.0586940236389637,
0.0780046284198761,
0.06904910504817963,
0.01752060279250145,
-0.09580680727958679,
-0.12585744261741638,
0.03647531196475029,
-0.06574005633592606,
0.022270625457167625,
-0.17325814068317413,
-0.05751469358801842,
0.026208313181996346,
-0.03243623301386833,
0.010279177688062191,
-0.007473247591406107,
-0.07743050158023834,
-0.010020905174314976,
-0.03219327703118324,
0.0969114601612091,
-0.08521804213523865,
-0.01547974068671465,
0.0895637646317482,
-0.044260092079639435,
0.03937050327658653,
0.050325557589530945,
-0.1056080013513565,
-0.008810373954474926,
-0.12079755961894989,
0.0991281047463417,
-0.01254870742559433,
-0.04031502828001976,
-0.010700108483433723,
-0.09570576995611191,
-0.05572086200118065,
0.02010323666036129,
0.007512439042329788,
0.015057852491736412,
0.07760298997163773,
-0.05014031380414963,
0.1621338129043579,
0.027806943282485008,
-0.04532940313220024,
-0.08377421647310257,
0.05150414630770683,
-0.055146943777799606,
-0.04616279527544975,
0.10187141597270966,
-0.04598968103528023,
0.029975268989801407,
-0.10942929983139038,
0.012739877216517925,
0.04251421242952347,
0.08541598916053772,
0.05305441468954086,
-0.08355598896741867,
0.04928922280669212,
-0.007153060287237167,
0.09811592847108841,
-0.023382779210805893,
-0.0502714179456234,
0.03702147677540779,
-0.0037830867804586887,
-0.06960707902908325,
0.010753002017736435,
0.008189314976334572,
-0.01738584041595459,
-0.04668257758021355,
0.00943882204592228,
-0.06063514202833176,
-0.06677243113517761,
0.03343476355075836,
0.17517678439617157,
0.08440253138542175,
0.23395662009716034,
0.0592649020254612,
0.024054300040006638,
-0.032861288636922836,
-0.03178864344954491,
-0.04614191874861717,
0.038243383169174194,
-0.01789073273539543,
-0.06212492287158966,
0.1722143292427063,
0.1309148520231247,
-0.0779784694314003,
0.0859980583190918,
-0.0415278784930706,
-0.07496354728937149,
-0.01702730916440487,
-0.36600494384765625,
-0.013213113881647587,
-0.013835597783327103,
0.010394357144832611,
-0.02589366026222706,
0.03789278119802475,
0.0332937054336071,
-0.012045007199048996,
-0.06702645868062973,
0.08569607883691788,
0.055204037576913834,
-0.14510782063007355,
0.012837987393140793,
-0.01514274813234806,
0.03438728675246239,
-0.011967222206294537,
0.0847601592540741,
0.027267372235655785,
0.07454854995012283,
0.07161585986614227,
0.10687445104122162,
-0.0266721174120903,
0.03329722210764885,
-0.15399865806102753,
-0.11494921892881393,
0.007673783227801323,
0.05666377767920494,
0.07642533630132675,
0.22573065757751465,
0.06106596440076828,
0.03962673246860504,
-0.0017931463662534952,
0.0713375136256218,
0.05781592056155205,
-0.1007678359746933,
-0.06509315967559814,
0.1286212056875229,
0.021634278818964958,
-0.06938274204730988,
-0.0064484430477023125,
-0.13178426027297974,
0.045431748032569885,
0.24294251203536987,
0.1064377874135971,
0.027011269703507423,
0.026028171181678772,
-0.11302181333303452,
0.026211299002170563,
0.014281541109085083,
0.06471679359674454,
0.024610426276922226,
0.24349923431873322,
-0.08264405280351639,
0.11035715788602829,
-0.003869179170578718,
-0.009645436890423298,
0.022309958934783936,
0.11613249033689499,
-0.01833045296370983,
-0.010447359643876553,
-0.07366091012954712,
0.11065462976694107,
-0.20265348255634308,
-0.20579813420772552,
0.011052621528506279,
0.006913293153047562,
-0.056220605969429016,
0.03710523620247841,
0.019609859213232994,
0.1160273477435112,
0.06499335914850235,
0.005385798867791891,
-0.03145871311426163,
0.15231850743293762,
0.02434299886226654,
-0.03431156277656555,
-0.06997192651033401,
0.07748283445835114,
-0.16290344297885895,
0.12649567425251007,
0.041480619460344315,
0.09383337944746017,
0.06545894593000412,
0.019678082317113876,
-0.12143974751234055,
0.10712925344705582,
0.03099404275417328,
0.09308135509490967,
0.014119742438197136,
0.21414147317409515,
0.03028566762804985,
0.05938900634646416,
0.07968706637620926,
0.06837338954210281,
0.08655181527137756,
0.06018725037574768,
0.02004103921353817,
-0.06262839585542679,
0.062323614954948425,
-0.07755973935127258,
0.07764877378940582,
0.09714660793542862,
-0.03674471750855446,
0.014869226142764091,
-0.054931312799453735,
-0.055730827152729034,
-0.046601198613643646,
0.058971017599105835,
-0.07727611064910889,
-0.19119423627853394,
-0.00455511175096035,
0.05685609579086304,
0.10800616443157196,
-0.19270089268684387,
-0.0009190746350213885,
-0.05023046210408211,
-0.0060057686641812325,
0.04162171483039856,
0.08924314379692078,
-0.040074821561574936,
0.028660697862505913,
-0.03701185807585716,
-0.19352443516254425,
0.03721380606293678,
0.13504037261009216,
-0.08759665489196777,
-0.005941243842244148
] |
null | null |
transformers
|
# GPT2 Medium 4 Persian
> This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/pretrain-gpt2-from-scratch-in-persian/7560), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Team Members
- [Mehrdad Farahani](huggingface.co/m3hrdadfi)
- [Saied Alimoradi](https://discuss.huggingface.co/u/saied)
- [M. Reza Zerehpoosh](huggingface.co/ironcladgeek)
- [Hooman Sedghamiz](https://discuss.huggingface.co/u/hooman650)
- [Mazeyar Moeini Feizabadi](https://discuss.huggingface.co/u/mazy1998)
## Dataset
We used [Oscar](https://huggingface.co/datasets/oscar) dataset, which is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus.
## How To Use
You can use this model directly with a pipeline for text generation.
```python
from transformers import pipeline, AutoTokenizer, GPT2LMHeadModel
tokenizer = AutoTokenizer.from_pretrained('flax-community/gpt2-medium-persian')
model = GPT2LMHeadModel.from_pretrained('flax-community/gpt2-medium-persian')
generator = pipeline('text-generation', model, tokenizer=tokenizer, config={'max_length':100})
generated_text = generator('در یک اتفاق شگفت انگیز، پژوهشگران')
```
For using Tensorflow import TFGPT2LMHeadModel instead of GPT2LMHeadModel.
## Demo
... SOON
## Evaluation
... SOON
|
{"language": "fa", "tags": ["text-generation"], "widget": [{"text": "\u062f\u0631 \u06cc\u06a9 \u0627\u062a\u0641\u0627\u0642 \u0634\u06af\u0641\u062a \u0627\u0646\u06af\u06cc\u0632\u060c \u067e\u0698\u0648\u0647\u0634\u06af\u0631\u0627\u0646"}, {"text": "\u06af\u0631\u0641\u062a\u06af\u06cc \u0628\u06cc\u0646\u06cc \u062f\u0631 \u06a9\u0648\u062f\u06a9\u0627\u0646 \u0648 \u0628\u0647\u200c\u062e\u0635\u0648\u0635 \u0646\u0648\u0632\u0627\u062f\u0627\u0646 \u0628\u0627\u0639\u062b \u0645\u06cc\u200c\u0634\u0648\u062f"}, {"text": "\u0627\u0645\u06cc\u062f\u0648\u0627\u0631\u06cc\u0645 \u0646\u0648\u0631\u0648\u0632 \u0627\u0645\u0633\u0627\u0644 \u0633\u0627\u0644\u06cc"}]}
|
text-generation
|
flax-community/gpt2-medium-persian
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"fa",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fa"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #gpt2 #text-generation #fa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# GPT2 Medium 4 Persian
> This is part of the
Flax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.
## Team Members
- Mehrdad Farahani
- Saied Alimoradi
- M. Reza Zerehpoosh
- Hooman Sedghamiz
- Mazeyar Moeini Feizabadi
## Dataset
We used Oscar dataset, which is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus.
## How To Use
You can use this model directly with a pipeline for text generation.
For using Tensorflow import TFGPT2LMHeadModel instead of GPT2LMHeadModel.
## Demo
... SOON
## Evaluation
... SOON
|
[
"# GPT2 Medium 4 Persian\n> This is part of the\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.",
"## Team Members\n- Mehrdad Farahani\n- Saied Alimoradi\n- M. Reza Zerehpoosh\n- Hooman Sedghamiz\n- Mazeyar Moeini Feizabadi",
"## Dataset\nWe used Oscar dataset, which is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus.",
"## How To Use\nYou can use this model directly with a pipeline for text generation.\n\n\nFor using Tensorflow import TFGPT2LMHeadModel instead of GPT2LMHeadModel.",
"## Demo\n\n... SOON",
"## Evaluation\n\n... SOON"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #gpt2 #text-generation #fa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# GPT2 Medium 4 Persian\n> This is part of the\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.",
"## Team Members\n- Mehrdad Farahani\n- Saied Alimoradi\n- M. Reza Zerehpoosh\n- Hooman Sedghamiz\n- Mazeyar Moeini Feizabadi",
"## Dataset\nWe used Oscar dataset, which is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus.",
"## How To Use\nYou can use this model directly with a pipeline for text generation.\n\n\nFor using Tensorflow import TFGPT2LMHeadModel instead of GPT2LMHeadModel.",
"## Demo\n\n... SOON",
"## Evaluation\n\n... SOON"
] |
[
63,
38,
40,
33,
42,
5,
6
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #gpt2 #text-generation #fa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# GPT2 Medium 4 Persian\n> This is part of the\nFlax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.## Team Members\n- Mehrdad Farahani\n- Saied Alimoradi\n- M. Reza Zerehpoosh\n- Hooman Sedghamiz\n- Mazeyar Moeini Feizabadi## Dataset\nWe used Oscar dataset, which is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus.## How To Use\nYou can use this model directly with a pipeline for text generation.\n\n\nFor using Tensorflow import TFGPT2LMHeadModel instead of GPT2LMHeadModel.## Demo\n\n... SOON## Evaluation\n\n... SOON"
] |
[
-0.0443108044564724,
0.16187427937984467,
-0.0027941009029746056,
0.07282610237598419,
0.08049438148736954,
0.00472430931404233,
0.18201622366905212,
0.08875855058431625,
0.032801415771245956,
0.0017423763638362288,
0.05369187518954277,
0.025408802554011345,
0.0547904372215271,
0.12905745208263397,
0.013787684962153435,
-0.2623988687992096,
-0.02585575170814991,
-0.04429470747709274,
-0.05091996118426323,
0.11069740355014801,
0.14774872362613678,
-0.029600176960229874,
0.10912627726793289,
0.04546795412898064,
-0.11270993947982788,
0.03288532793521881,
-0.022616807371377945,
-0.11278455704450607,
0.07392845302820206,
0.11730419844388962,
0.05600342899560928,
0.03457016497850418,
0.044820722192525864,
-0.09139667451381683,
0.0409761518239975,
0.007826117798686028,
-0.019024396315217018,
0.018140729516744614,
0.09405700117349625,
-0.09741781651973724,
0.24803513288497925,
-0.02524065040051937,
-0.015850504860281944,
0.029858389869332314,
-0.09042418003082275,
-0.12824474275112152,
-0.06072096899151802,
0.03809291124343872,
0.03617938235402107,
0.08820586651563644,
0.00048430365859530866,
0.08929126709699631,
-0.05241316929459572,
0.09060913324356079,
0.10391408205032349,
-0.2547205686569214,
-0.08322033286094666,
0.06698165088891983,
0.07270913571119308,
0.04972914978861809,
-0.04621981829404831,
0.07368955761194229,
0.016212569549679756,
-0.01395510509610176,
0.056312445551157,
-0.11743941903114319,
-0.006346242036670446,
-0.003179585561156273,
-0.08391959965229034,
0.015961892902851105,
0.27223774790763855,
0.010793478228151798,
-0.01101740263402462,
-0.03207426890730858,
-0.04552098363637924,
0.0720786452293396,
-0.018927305936813354,
-0.006109594367444515,
-0.0042152684181928635,
0.015320268459618092,
0.048973020166158676,
-0.07662095874547958,
-0.14559072256088257,
-0.012357804924249649,
-0.12139897793531418,
0.10603953152894974,
0.04719414934515953,
0.065557561814785,
-0.10348597913980484,
0.06745928525924683,
-0.029060516506433487,
-0.08716963231563568,
0.022728588432073593,
-0.07041484117507935,
-0.07923292368650436,
0.014660748653113842,
0.03436216711997986,
-0.11030679941177368,
0.08028688281774521,
0.06499403715133667,
-0.030915148556232452,
0.08954531699419022,
0.07358372211456299,
0.026710161939263344,
0.057403672486543655,
0.09018668532371521,
-0.07067012786865234,
-0.13151247799396515,
0.0691058486700058,
-0.024986790493130684,
0.0027405042201280594,
-0.0035801874473690987,
-0.09938719868659973,
0.009262974373996258,
-0.031777866184711456,
0.005108637735247612,
0.03361941874027252,
0.08251636475324631,
-0.00851774588227272,
-0.05962526425719261,
0.021679038181900978,
-0.14872050285339355,
0.017929842695593834,
0.006528925150632858,
-0.05576191842556,
-0.01033869944512844,
0.07776863873004913,
0.006052977405488491,
-0.04510412737727165,
0.002428049687296152,
-0.03254484012722969,
0.03918684273958206,
-0.06824558973312378,
-0.042762741446495056,
0.03206850588321686,
-0.10929114371538162,
-0.008686975575983524,
-0.1873808205127716,
-0.1005629152059555,
-0.025787610560655594,
0.05768032371997833,
-0.04592323675751686,
-0.038932282477617264,
-0.026888901367783546,
0.026040274649858475,
0.0362614281475544,
-0.03473508730530739,
-0.005852352827787399,
-0.0732516199350357,
0.026282601058483124,
0.0018914806423708797,
0.05793153867125511,
-0.07853050529956818,
0.040086545050144196,
-0.030073508620262146,
0.029960593208670616,
-0.11273755878210068,
0.15256956219673157,
-0.061376914381980896,
0.0507698655128479,
-0.13051219284534454,
-0.012729213573038578,
-0.016201527789235115,
0.03150805085897446,
-0.023412177339196205,
0.1800740510225296,
-0.1796434372663498,
-0.0664902776479721,
0.12161136418581009,
-0.042475126683712006,
-0.07975751161575317,
0.17654573917388916,
-0.03561902418732643,
0.0852813720703125,
0.09191396832466125,
0.15863992273807526,
0.07125212252140045,
-0.10327266901731491,
0.006473266985267401,
0.06451520323753357,
-0.04256907105445862,
0.01637362688779831,
0.12621518969535828,
0.0021054872777312994,
0.053122397512197495,
0.019618656486272812,
0.05288936570286751,
0.11685431003570557,
-0.018814757466316223,
-0.026415320113301277,
0.02174956724047661,
-0.02268092893064022,
-0.04177919402718544,
-0.01288484875112772,
0.09546373784542084,
-0.020960086956620216,
-0.06552504003047943,
0.011916332878172398,
0.05229814350605011,
-0.0029866707045584917,
0.03247883915901184,
-0.13205459713935852,
0.09606669843196869,
-0.053972769528627396,
0.02768859639763832,
-0.13651306927204132,
-0.02142913267016411,
-0.01149078644812107,
0.023539111018180847,
0.102619968354702,
0.07672115415334702,
0.05385764315724373,
0.011996719054877758,
-0.06581781059503555,
0.04127519577741623,
0.035438407212495804,
-0.0014082336565479636,
-0.05974552407860756,
-0.13988107442855835,
0.013941639102995396,
-0.04561302438378334,
0.11988422274589539,
-0.08210914582014084,
0.04186777397990227,
0.10862933099269867,
0.1103399321436882,
-0.013092772103846073,
-0.0020371214486658573,
0.029648439958691597,
-0.020638924092054367,
-0.04207999259233475,
-0.04682725667953491,
0.039150018244981766,
0.00574467284604907,
-0.14346618950366974,
0.16658708453178406,
-0.138418048620224,
0.018511874601244926,
0.12191203236579895,
-0.017932726070284843,
-0.0382457859814167,
0.04633089900016785,
-0.04552764818072319,
-0.007525340188294649,
0.022463567554950714,
0.01833316683769226,
0.14333052933216095,
-0.008679154329001904,
0.16463014483451843,
-0.0874849483370781,
-0.007844974286854267,
-0.020514298230409622,
-0.06504125148057938,
-0.025930538773536682,
0.12317796796560287,
0.04087607190012932,
-0.17333967983722687,
0.1198076531291008,
0.06546014547348022,
0.03140951320528984,
0.2549581229686737,
0.022957762703299522,
-0.021567637100815773,
-0.026810359209775925,
-0.00997594278305769,
-0.024644814431667328,
0.04263344034552574,
-0.07901347428560257,
-0.02274971827864647,
0.04726875573396683,
0.013901231810450554,
0.04888828843832016,
-0.12934191524982452,
-0.005820654798299074,
0.015086192637681961,
-0.035374272614717484,
0.07469890266656876,
0.07774638384580612,
-0.0024145995266735554,
0.09055425226688385,
-0.0024270203430205584,
-0.009232492186129093,
0.02164316363632679,
-0.014193393290042877,
-0.11323145031929016,
0.14610040187835693,
-0.10670946538448334,
-0.28383421897888184,
-0.059852730482816696,
-0.07481706887483597,
-0.07933451980352402,
0.01471109502017498,
0.06706079840660095,
-0.022633137181401253,
-0.032551251351833344,
-0.058935344219207764,
0.038971707224845886,
-0.023233190178871155,
0.01166846975684166,
-0.0535949170589447,
-0.02814185991883278,
-0.02534770779311657,
-0.09513600170612335,
-0.03165877237915993,
-0.012062979862093925,
-0.046399928629398346,
0.08758416771888733,
-0.0792190432548523,
0.09721924364566803,
0.01943358965218067,
0.005900795571506023,
0.02529747784137726,
-0.00736643560230732,
0.2056499868631363,
-0.1174217090010643,
0.0817057266831398,
0.1642296016216278,
-0.020095787942409515,
0.013132883235812187,
0.061870161443948746,
0.02036987990140915,
-0.037213556468486786,
0.006123475264757872,
0.03284187242388725,
-0.10933596640825272,
-0.22538918256759644,
-0.09560205042362213,
-0.1265847086906433,
0.018127065151929855,
-0.00876499805599451,
0.0888298898935318,
-0.02889697067439556,
0.05996575579047203,
-0.030472664162516594,
0.027714315801858902,
0.053000543266534805,
0.05239259824156761,
0.12908950448036194,
-0.002535624196752906,
0.05170281603932381,
-0.11303428560495377,
-0.06083131209015846,
0.13351060450077057,
0.06184869632124901,
0.16683657467365265,
0.05502597242593765,
0.04777570068836212,
0.07858527451753616,
0.029993044212460518,
0.08696375042200089,
-0.02969246730208397,
0.07096798717975616,
0.0015071320813149214,
-0.04671192169189453,
-0.047530461102724075,
-0.029962586238980293,
0.09032636135816574,
0.05715205892920494,
-0.08138090372085571,
0.026119133457541466,
0.1074906587600708,
0.10169259458780289,
0.20599041879177094,
-0.018936214968562126,
-0.20330345630645752,
-0.010494041256606579,
0.05258335918188095,
-0.05575808882713318,
-0.038332123309373856,
0.011744803749024868,
0.0821470320224762,
-0.1407630741596222,
0.16959623992443085,
-0.014242301695048809,
0.09738757461309433,
-0.09136860817670822,
-0.01118702907115221,
0.012378421612083912,
-0.01441284827888012,
-0.0024623877834528685,
0.07605376839637756,
-0.2511647343635559,
0.19873110949993134,
0.04440128430724144,
0.02027895674109459,
-0.048484284430742264,
0.013224955648183823,
0.05415574833750725,
0.11481388658285141,
0.12968207895755768,
0.01175965927541256,
-0.018917864188551903,
0.000929793284740299,
-0.14796650409698486,
0.031274039298295975,
-0.0016923736548051238,
-0.052593715488910675,
0.05192098766565323,
0.007735882885754108,
0.003114262828603387,
-0.034659404307603836,
0.0027477198746055365,
-0.09402305632829666,
-0.18356208503246307,
0.05125998705625534,
0.04979636147618294,
-0.06169503182172775,
-0.027270697057247162,
-0.04321723431348801,
-0.12035800516605377,
0.1971723735332489,
-0.04807179793715477,
-0.1320163607597351,
-0.14779217541217804,
0.016654690727591515,
0.03232315182685852,
-0.07206375896930695,
-0.026512954384088516,
-0.036273010075092316,
-0.04794284328818321,
-0.020741963759064674,
-0.10797632485628128,
0.0663304403424263,
-0.07339715212583542,
-0.029150720685720444,
-0.012283234857022762,
0.12629224359989166,
0.05799046903848648,
0.0429973229765892,
-0.014562484808266163,
-0.015701549127697945,
-0.013548571616411209,
-0.13154582679271698,
-0.03289240226149559,
0.13516613841056824,
0.010814357548952103,
0.02499353513121605,
-0.0339016392827034,
-0.10799754410982132,
-0.034073490649461746,
-0.08302083611488342,
0.11740612238645554,
0.1443970948457718,
-0.06019490584731102,
0.14260800182819366,
0.008308110758662224,
-0.0613856315612793,
-0.1933518797159195,
-0.10109841078519821,
0.03973088786005974,
0.023083286359906197,
-0.007823917083442211,
-0.15393584966659546,
0.000428061350248754,
0.06602974981069565,
-0.007899194024503231,
0.028811195865273476,
-0.24801085889339447,
-0.13819411396980286,
0.08160026371479034,
-0.02792368084192276,
0.1422911286354065,
-0.19452959299087524,
-0.04721515625715256,
-0.0493181087076664,
-0.12422709912061691,
0.11349333822727203,
-0.08992855995893478,
0.09341932088136673,
-0.006638170685619116,
0.010919175110757351,
-0.012109402567148209,
-0.013340357691049576,
0.15222293138504028,
0.03513646870851517,
0.03681041672825813,
-0.0885159820318222,
-0.015224933624267578,
0.046400684863328934,
0.002426588675007224,
0.0875951498746872,
-0.03785168379545212,
-0.016476402059197426,
-0.19513475894927979,
-0.0499761663377285,
-0.04911282658576965,
0.07994168996810913,
0.020119883120059967,
-0.024492258206009865,
-0.07026248425245285,
0.02478223480284214,
0.04462631046772003,
-0.02282322198152542,
0.11943648010492325,
-0.08858557045459747,
0.02226439118385315,
0.10357533395290375,
0.13396508991718292,
-0.027097688987851143,
-0.05199102312326431,
-0.08139818161725998,
0.01588325947523117,
0.06165104731917381,
-0.16578806936740875,
-0.0013395438436418772,
0.10506042093038559,
-0.018438545987010002,
0.1419183760881424,
-0.008783083409070969,
-0.0954805538058281,
0.05892601236701012,
0.10369756817817688,
-0.15845471620559692,
-0.20838071405887604,
-0.04786118492484093,
-0.07682297378778458,
0.03188919275999069,
-0.03096749074757099,
0.13188329339027405,
-0.04401428997516632,
-0.018542753532528877,
0.003939005546271801,
0.02895824983716011,
-0.04145830124616623,
0.14335092902183533,
0.007013924419879913,
-0.014474722556769848,
-0.1069258451461792,
0.09716268628835678,
0.03727399930357933,
-0.05565289035439491,
0.009292518720030785,
0.15680530667304993,
-0.1220046803355217,
-0.04927141219377518,
-0.07414662837982178,
0.11726406216621399,
-0.12374098598957062,
-0.10533854365348816,
-0.07815589755773544,
-0.08660195022821426,
0.02132643200457096,
0.025071559473872185,
0.05140155181288719,
0.05236286297440529,
-0.013430945575237274,
-0.009067106992006302,
-0.00504882400855422,
0.07708112895488739,
0.09553886950016022,
-0.025279061868786812,
-0.09580901265144348,
0.03877228498458862,
0.008253024891018867,
0.09836834669113159,
-0.07060985267162323,
-0.024587051942944527,
-0.07427597045898438,
0.00407277001067996,
-0.17910747230052948,
0.03237558528780937,
-0.09503063559532166,
-0.000020435805708984844,
-0.011179760098457336,
-0.056029271334409714,
-0.05124915391206741,
-0.006015812512487173,
-0.0665241926908493,
-0.013001463375985622,
-0.05256509780883789,
0.07238959521055222,
-0.04969059303402901,
-0.03677307814359665,
0.01307594496756792,
-0.03974045440554619,
0.11510717868804932,
0.10774107277393341,
-0.04149748757481575,
-0.0045860386453568935,
-0.09293366968631744,
-0.05913487449288368,
0.05404758080840111,
0.043015871196985245,
0.04080256074666977,
-0.04212578758597374,
0.014188912697136402,
0.045502934604883194,
0.026307933032512665,
0.003107283730059862,
0.07235738635063171,
-0.09572736918926239,
-0.030727576464414597,
-0.04326656460762024,
-0.027741044759750366,
-0.0466759167611599,
0.04227417707443237,
0.055780503898859024,
0.11146225035190582,
0.06846646219491959,
-0.08233340829610825,
0.027384458109736443,
-0.09259033203125,
0.011004968546330929,
-0.013978827744722366,
-0.07060490548610687,
-0.05301251634955406,
-0.07395023107528687,
0.07125183194875717,
-0.013665936887264252,
0.12005239725112915,
0.10502908378839493,
-0.07781809568405151,
0.04416714236140251,
0.0369383841753006,
0.0714070200920105,
-0.022172436118125916,
0.020181268453598022,
-0.00048233746201731265,
0.020386310294270515,
0.015870824456214905,
0.036001112312078476,
0.04526076838374138,
0.00948264729231596,
0.141511932015419,
0.08831634372472763,
-0.0007928362465463579,
0.10475824028253555,
-0.0017430847510695457,
-0.038760554045438766,
-0.033762745559215546,
0.012064442969858646,
-0.10235334932804108,
0.05274190753698349,
-0.05654066801071167,
-0.044510383158922195,
0.18890993297100067,
-0.09234388917684555,
0.0749625638127327,
0.014330068603157997,
-0.037926968187093735,
-0.14170561730861664,
-0.12700814008712769,
-0.08683757483959198,
-0.044129498302936554,
-0.013863484375178814,
-0.08522607386112213,
0.0336071141064167,
-0.0012187761021777987,
0.09035412222146988,
-0.044961437582969666,
0.12358333170413971,
-0.08049089461565018,
-0.12270532548427582,
0.06907891482114792,
-0.0067205168306827545,
0.036616940051317215,
-0.01940276473760605,
-0.014665043912827969,
-0.013581382110714912,
0.02876265160739422,
-0.008490446023643017,
0.017305202782154083,
0.06594552844762802,
0.07630594819784164,
-0.08680904656648636,
-0.05072736740112305,
-0.055766403675079346,
0.04885778948664665,
0.08728626370429993,
0.09819997102022171,
0.044811517000198364,
-0.03878287971019745,
0.014697425067424774,
0.1654513031244278,
-0.019313983619213104,
-0.05298329144716263,
-0.13768231868743896,
0.13104918599128723,
-0.03405233472585678,
-0.007459523621946573,
-0.017074529081583023,
-0.05475519225001335,
-0.03461284190416336,
0.19603635370731354,
0.25578635931015015,
-0.09142837673425674,
0.02154073491692543,
-0.06639239937067032,
0.028033683076500893,
0.053965963423252106,
0.0711749792098999,
0.04762187972664833,
0.23902924358844757,
-0.03565476834774017,
-0.019173404201865196,
-0.03446104750037193,
0.017853623256087303,
-0.09389534592628479,
0.11886316537857056,
0.021803103387355804,
-0.027838096022605896,
0.013882695697247982,
0.09153371304273605,
-0.08421238511800766,
-0.14090567827224731,
-0.00489594554528594,
-0.063560351729393,
-0.10864702612161636,
0.01025919895619154,
0.0018227773252874613,
0.09583739191293716,
0.024835241958498955,
-0.006312126759439707,
-0.005058974027633667,
0.06432407349348068,
0.04503350704908371,
-0.2011352926492691,
0.004504903219640255,
0.12742219865322113,
-0.07579334080219269,
0.08439098298549652,
-0.03529299795627594,
0.10443750023841858,
0.07978581637144089,
-0.015250066295266151,
-0.06979349255561829,
0.07608596235513687,
0.03124341368675232,
-0.02868669666349888,
0.026163311675190926,
0.008501394651830196,
0.013312860392034054,
-0.024189168587327003,
0.05213955417275429,
0.08686341345310211,
-0.012911800295114517,
-0.06964829564094543,
-0.011523452587425709,
-0.09816129505634308,
0.09215229004621506,
-0.14454877376556396,
0.11136826127767563,
0.15069401264190674,
-0.03809712827205658,
0.007524196989834309,
-0.029980255290865898,
0.03082164376974106,
0.0057055032812058926,
-0.03890000656247139,
-0.031085900962352753,
-0.14198188483715057,
-0.027615675702691078,
-0.1115516647696495,
0.02545231766998768,
-0.05686650052666664,
-0.034633275121450424,
-0.11137925833463669,
-0.005178171209990978,
-0.013987544924020767,
0.09650107473134995,
0.07649380713701248,
0.03780079632997513,
-0.041852906346321106,
-0.14258398115634918,
0.02665637992322445,
0.04830922558903694,
-0.1457943171262741,
-0.06457746028900146
] |
null | null |
transformers
|
# Question-Answering Using GPT2 - Persian
> This is a side project of this thread
[Flax/Jax Community Week - GPT2 4 Persian](https://discuss.huggingface.co/t/pretrain-gpt2-from-scratch-in-persian/7560), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Team Members
- [Mehrdad Farahani](https://huggingface.co/m3hrdadfi)
## Dataset
We used [PersianQA](https://huggingface.co/datasets/SajjadAyoubi/persian_qa) dataset which is a reading comprehension dataset on Persian Wikipedia.
## How To Use TODO: Update
## Demo TODO: Update
## Evaluation TODO: Update
|
{"language": "fa", "tags": ["text-generation"], "datasets": ["persian_qa"], "widget": [{"text": "\u0646\u0627\u0641 \u062c\u0627\u06cc\u06cc \u0642\u0631\u0627\u0631 \u06af\u0631\u0641\u062a\u0647 \u06a9\u0647 \u062f\u0631 \u0648\u0627\u0642\u0639 \u0628\u0646\u062f\u0646\u0627\u0641 \u062f\u0631 \u062f\u0627\u062e\u0644 \u0631\u062d\u0645 \u062f\u0631 \u0622\u0646\u062c\u0627 \u0628\u0647 \u0634\u06a9\u0645 \u062c\u0646\u06cc\u0646 \u0648\u0635\u0644 \u0628\u0648\u062f\u0647\u200c\u0627\u0633\u062a. \u0628\u0646\u062f\u0646\u0627\u0641 \u06a9\u0647 \u062c\u0641\u062a \u0631\u0627 \u0628\u0647 \u062c\u0646\u06cc\u0646 \u0645\u062a\u0635\u0644 \u06a9\u0631\u062f\u0647 \u0628\u0639\u062f \u0627\u0632 \u062a\u0648\u0644\u062f \u0627\u0632 \u0646\u0648\u0632\u0627\u062f \u062c\u062f\u0627 \u0645\u06cc\u200c\u0634\u0648\u062f. \u0628\u0631\u0627\u06cc \u062c\u062f\u0627 \u06a9\u0631\u062f\u0646 \u0628\u0646\u062f \u0646\u0627\u0641 \u0627\u0632 \u062f\u0648 \u067e\u0646\u0633 \u0627\u0633\u062a\u0641\u0627\u062f\u0647 \u0645\u06cc\u200c\u06a9\u0646\u0646\u062f \u0648 \u0628\u06cc\u0646 \u0622\u0646 \u062f\u0648 \u0631\u0627 \u0645\u06cc\u0628\u0631\u0646\u062f. \u067e\u0646\u0633 \u062f\u06cc\u06af\u0631\u06cc \u0646\u0632\u062f\u06cc\u06a9 \u0634\u06a9\u0645 \u0646\u0648\u0632\u0627\u062f \u0642\u0631\u0627\u0631 \u062f\u0627\u062f\u0647 \u0645\u06cc\u200c\u0634\u0648\u062f \u06a9\u0647 \u0628\u0639\u062f \u0627\u0632 \u062f\u0648 \u0631\u0648\u0632 \u0628\u0631\u062f\u0627\u0634\u062a\u0647 \u062e\u0648\u0627\u0647\u062f \u0634\u062f. \u0628\u0646\u062f\u0646\u0627\u0641 \u0628\u0627\u0642\u06cc\u200c\u0645\u0627\u0646\u062f\u0647 \u0637\u06cc \u06f1\u06f5 \u0631\u0648\u0632 \u062e\u0634\u06a9 \u0634\u062f\u0647 \u0648 \u0645\u06cc\u200c\u0627\u0641\u062a\u062f \u0648 \u0628\u0647 \u062c\u0627\u06cc \u0622\u0646 \u0627\u0633\u06a9\u0627\u0631\u06cc \u0637\u0628\u06cc\u0639\u06cc \u0628\u0647 \u062c\u0627\u06cc \u0645\u06cc\u0645\u0627\u0646\u062f. \u0627\u0644\u0628\u062a\u0647 \u0628\u0631 \u062e\u0644\u0627\u0641 \u062a\u0635\u0648\u0631 \u0639\u0627\u0645\u0647 \u0645\u0631\u062f\u0645 \u0634\u06a9\u0644 \u0646\u0627\u0641 \u062f\u0631 \u0627\u062b\u0631 \u0628\u0631\u06cc\u062f\u0646 \u0628\u0646\u062f \u0646\u0627\u0641 \u0628\u0647 \u0648\u062c\u0648\u062f \u0646\u0645\u06cc\u200c\u0622\u06cc\u062f \u0648 \u067e\u06cc\u0634 \u0627\u0632 \u0627\u06cc\u0646 \u062f\u0631 \u0634\u06a9\u0645 \u0645\u0627\u062f\u0631 \u062d\u0627\u0644\u062a \u0646\u0627\u0641 \u0634\u06a9\u0644 \u06af\u0631\u0641\u062a\u0647\u200c\u0627\u0633\u062a. \u0634\u06a9\u0644 \u0646\u0627\u0641 \u062f\u0631 \u0645\u06cc\u0627\u0646 \u0645\u0631\u062f\u0645 \u0645\u062e\u062a\u0644\u0641 \u0645\u062a\u0641\u0627\u0648\u062a \u0627\u0633\u062a \u0648 \u0627\u0646\u062f\u0627\u0632\u0647 \u0622\u0646 \u0628\u06cc\u0646 \u06f1.\u06f5 \u062a\u0627 \u06f2 \u0633\u0627\u0646\u062a\u06cc\u200c\u0645\u062a\u0631 \u0627\u0633\u062a. \u062a\u0645\u0627\u0645 \u067e\u0633\u062a\u0627\u0646\u062f\u0627\u0631\u0627\u0646 \u062c\u0641\u062a\u200c\u0632\u06cc\u0633\u062a \u0646\u0627\u0641 \u062f\u0627\u0631\u0646\u062f. \u0646\u0627\u0641 \u062f\u0631 \u0627\u0646\u0633\u0627\u0646\u200c\u0647\u0627 \u0628\u0647 \u0633\u0627\u062f\u06af\u06cc \u0642\u0627\u0628\u0644 \u0645\u0634\u0627\u0647\u062f\u0647\u200c\u0627\u0633\u062a. \u067e\u0631\u0633\u0634: \u0628\u0646\u062f \u0646\u0627\u0641 \u0627\u0646\u0633\u0627\u0646 \u0628\u0647 \u06a9\u062c\u0627 \u0648\u0635\u0644 \u0627\u0633\u062a\u061f \u067e\u0627\u0633\u062e:"}, {"text": "\u062e\u0648\u0628\u060c \u0628\u062f\u060c \u0632\u0634\u062a \u06cc\u06a9 \u0641\u06cc\u0644\u0645 \u062f\u0631\u0698\u0627\u0646\u0631 \u0648\u0633\u062a\u0631\u0646 \u0627\u0633\u067e\u0627\u06af\u062a\u06cc \u062d\u0645\u0627\u0633\u06cc \u0627\u0633\u062a \u06a9\u0647 \u062a\u0648\u0633\u0637 \u0633\u0631\u062c\u0648 \u0644\u0626\u0648\u0646\u0647 \u062f\u0631 \u0633\u0627\u0644 \u06f1\u06f9\u06f6\u06f6 \u062f\u0631 \u0627\u06cc\u062a\u0627\u0644\u06cc\u0627 \u0633\u0627\u062e\u062a\u0647 \u0634\u062f. \u0632\u0628\u0627\u0646\u06cc \u06a9\u0647 \u0628\u0627\u0632\u06cc\u06af\u0631\u0627\u0646 \u0627\u06cc\u0646 \u0641\u06cc\u0644\u0645 \u0628\u0647 \u0622\u0646 \u062a\u06a9\u0644\u0645 \u0645\u06cc\u200c\u06a9\u0646\u0646\u062f \u0645\u062e\u0644\u0648\u0637\u06cc \u0627\u0632 \u0627\u06cc\u062a\u0627\u0644\u06cc\u0627\u06cc\u06cc \u0648 \u0627\u0646\u06af\u0644\u06cc\u0633\u06cc \u0627\u0633\u062a. \u0627\u06cc\u0646 \u0641\u06cc\u0644\u0645 \u0633\u0648\u0645\u06cc\u0646 (\u0648 \u0622\u062e\u0631\u06cc\u0646) \u0641\u06cc\u0644\u0645 \u0627\u0632 \u0633\u0647\u200c\u06af\u0627\u0646\u0647\u0654 \u062f\u0644\u0627\u0631 (Dollars Trilogy) \u0633\u0631\u062c\u0648 \u0644\u0626\u0648\u0646\u0647 \u0627\u0633\u062a. \u0627\u06cc\u0646 \u0641\u06cc\u0644\u0645 \u062f\u0631 \u062d\u0627\u0644 \u062d\u0627\u0636\u0631 \u062f\u0631 \u0641\u0647\u0631\u0633\u062a \u06f2\u06f5\u06f0 \u0641\u06cc\u0644\u0645 \u0628\u0631\u062a\u0631 \u062a\u0627\u0631\u06cc\u062e \u0633\u06cc\u0646\u0645\u0627 \u062f\u0631 \u0648\u0628\u200c\u06af\u0627\u0647 IMDB \u0628\u0627 \u0627\u0645\u062a\u06cc\u0627\u0632 \u06f8\u066b\u06f8 \u0627\u0632 \u06f1\u06f0\u060c \u0631\u062a\u0628\u0647\u0654 \u0647\u0634\u062a\u0645 \u0631\u0627 \u0628\u0647 \u062e\u0648\u062f \u0627\u062e\u062a\u0635\u0627\u0635 \u062f\u0627\u062f\u0647\u200c\u0627\u0633\u062a \u0648 \u0628\u0647 \u0639\u0646\u0648\u0627\u0646 \u0628\u0647\u062a\u0631\u06cc\u0646 \u0641\u06cc\u0644\u0645 \u0648\u0633\u062a\u0631\u0646 \u062a\u0627\u0631\u06cc\u062e \u0633\u06cc\u0646\u0645\u0627\u06cc \u062c\u0647\u0627\u0646 \u0634\u0646\u0627\u062e\u062a\u0647 \u0645\u06cc\u200c\u0634\u0648\u062f. \u00ab\u062e\u0648\u0628\u00bb (\u06a9\u0644\u06cc\u0646\u062a \u0627\u06cc\u0633\u062a\u0648\u0648\u062f\u060c \u062f\u0631 \u0641\u06cc\u0644\u0645\u060c \u0628\u0627 \u0646\u0627\u0645 \u00ab\u0628\u0644\u0648\u0646\u062f\u06cc\u00bb) \u0648 \u00ab\u0632\u0634\u062a\u00bb (\u0627\u06cc\u0644\u0627\u06cc \u0648\u0627\u0644\u0627\u06a9\u060c \u062f\u0631 \u0641\u06cc\u0644\u0645\u060c \u0628\u0627 \u0646\u0627\u0645 \u00ab\u062a\u0648\u06a9\u0648\u00bb) \u0628\u0627 \u0647\u0645 \u06a9\u0627\u0631 \u0645\u06cc\u200c\u06a9\u0646\u0646\u062f \u0648 \u0628\u0627 \u0634\u06af\u0631\u062f \u062e\u0627\u0635\u06cc\u060c \u0628\u0647 \u06af\u0648\u0644 \u0632\u062f\u0646 \u06a9\u0644\u0627\u0646\u062a\u0631\u0647\u0627\u06cc \u0645\u0646\u0627\u0637\u0642 \u0645\u062e\u062a\u0644\u0641 \u0648 \u067e\u0648\u0644 \u062f\u0631\u0622\u0648\u0631\u062f\u0646 \u0627\u0632 \u0627\u06cc\u0646 \u0631\u0627\u0647 \u0645\u06cc\u200c\u067e\u0631\u062f\u0627\u0632\u0646\u062f. \u00ab\u0628\u062f\u00bb (\u0644\u06cc \u0648\u0627\u0646 \u06a9\u0644\u06cc\u0641) \u0622\u062f\u0645\u06a9\u0634\u06cc \u062d\u0631\u0641\u0647\u200c\u0627\u06cc \u0627\u0633\u062a \u06a9\u0647 \u0628\u0647\u200c\u062e\u0627\u0637\u0631 \u067e\u0648\u0644 \u062d\u0627\u0636\u0631 \u0628\u0647 \u0627\u0646\u062c\u0627\u0645 \u0647\u0631 \u06a9\u0627\u0631\u06cc \u0627\u0633\u062a. \u00ab\u0628\u062f\u00bb\u060c \u06a9\u0647 \u062f\u0631 \u0641\u06cc\u0644\u0645 \u0627\u0648 \u0631\u0627 \u00ab\u0627\u0650\u0646\u062c\u0644 \u0622\u06cc\u0632 (\u0627\u0650\u06cc\u0646\u062c\u0644 \u0622\u06cc\u0632)\u00bb (\u0628\u0647 \u0627\u0646\u06af\u0644\u06cc\u0633\u06cc: Angel Eyes) \u0635\u062f\u0627 \u0645\u06cc\u200c\u06a9\u0646\u0646\u062f. \u0628\u0647\u200c\u062f\u0646\u0628\u0627\u0644 \u06af\u0646\u062c\u06cc \u0627\u0633\u062a \u06a9\u0647 \u062f\u0631 \u0637\u06cc \u062c\u0646\u06af\u200c\u0647\u0627\u06cc \u062f\u0627\u062e\u0644\u06cc \u0622\u0645\u0631\u06cc\u06a9\u0627\u060c \u0628\u0647 \u062f\u0633\u062a \u0633\u0631\u0628\u0627\u0632\u06cc \u0628\u0647 \u0646\u0627\u0645 \u00ab\u062c\u06a9\u0633\u0648\u0646\u00bb\u060c \u06a9\u0647 \u0628\u0639\u062f\u0647\u0627 \u0628\u0647 \u00ab\u06a9\u0627\u0631\u0633\u0648\u0646\u00bb \u0646\u0627\u0645\u0634 \u0631\u0627 \u062a\u063a\u06cc\u06cc\u0631 \u062f\u0627\u062f\u0647\u060c \u0645\u062e\u0641\u06cc \u0634\u062f\u0647\u200c\u0627\u0633\u062a. \u067e\u0631\u0633\u0634: \u062f\u0631 \u0641\u06cc\u0644\u0645 \u062e\u0648\u0628 \u0628\u062f \u0632\u0634\u062a \u0634\u062e\u0635\u06cc\u062a \u0647\u0627 \u06a9\u062c\u0627\u06cc\u06cc \u0635\u062d\u0628\u062a \u0645\u06cc \u06a9\u0646\u0646\u062f\u061f \u067e\u0627\u0633\u062e:"}, {"text": "\u0686\u0647\u0627\u0631\u0634\u0646\u0628\u0647\u200c\u0633\u0648\u0631\u06cc \u06cc\u06a9\u06cc \u0627\u0632 \u062c\u0634\u0646\u200c\u0647\u0627\u06cc \u0627\u06cc\u0631\u0627\u0646\u06cc \u0627\u0633\u062a \u06a9\u0647 \u0627\u0632 \u063a\u0631\u0648\u0628 \u0622\u062e\u0631\u06cc\u0646 \u0633\u0647\u200c\u0634\u0646\u0628\u0647 \u06cc \u0645\u0627\u0647 \u0627\u0633\u0641\u0646\u062f\u060c \u062a\u0627 \u067e\u0633 \u0627\u0632 \u0646\u06cc\u0645\u0647\u200c\u0634\u0628 \u062a\u0627 \u0622\u062e\u0631\u06cc\u0646 \u0686\u0647\u0627\u0631\u0634\u0646\u0628\u0647 \u06cc \u0633\u0627\u0644\u060c \u0628\u0631\u06af\u0632\u0627\u0631 \u0645\u06cc\u200c\u0634\u0648\u062f \u0648 \u0628\u0631\u0627\u0641\u0631\u0648\u062e\u062a\u0646 \u0648 \u067e\u0631\u06cc\u062f\u0646 \u0627\u0632 \u0631\u0648\u06cc \u0622\u062a\u0634 \u0645\u0634\u062e\u0635\u0647\u0654 \u0627\u0635\u0644\u06cc \u0622\u0646 \u0627\u0633\u062a. \u0627\u06cc\u0646 \u062c\u0634\u0646\u060c \u0646\u062e\u0633\u062a\u06cc\u0646 \u062c\u0634\u0646 \u0627\u0632 \u0645\u062c\u0645\u0648\u0639\u0647\u0654 \u062c\u0634\u0646\u200c\u0647\u0627 \u0648 \u0645\u0646\u0627\u0633\u0628\u062a\u200c\u0647\u0627\u06cc \u0646\u0648\u0631\u0648\u0632\u06cc \u0627\u0633\u062a \u06a9\u0647 \u0628\u0627 \u0628\u0631\u0627\u0641\u0631\u0648\u062e\u062a\u0646 \u0622\u062a\u0634 \u0648 \u0628\u0631\u062e\u06cc \u0631\u0641\u062a\u0627\u0631\u0647\u0627\u06cc \u0646\u0645\u0627\u062f\u06cc\u0646 \u062f\u06cc\u06af\u0631\u060c \u0628\u0647\u200c\u0635\u0648\u0631\u062a \u062c\u0645\u0639\u06cc \u062f\u0631 \u0641\u0636\u0627\u06cc \u0628\u0627\u0632 \u0628\u0631\u06af\u0632\u0627\u0631 \u0645\u06cc\u200c\u0634\u0648\u062f. \u0628\u0647\u200c\u06af\u0641\u062a\u0647\u0654 \u0627\u0628\u0631\u0627\u0647\u06cc\u0645 \u067e\u0648\u0631\u062f\u0627\u0648\u0648\u062f \u0686\u0647\u0627\u0631\u0634\u0646\u0628\u0647\u200c\u0633\u0648\u0631\u06cc \u0631\u06cc\u0634\u0647 \u062f\u0631 \u06af\u0627\u0647\u0646\u0628\u0627\u0631\u0650 \u0647\u064e\u0645\u064e\u0633\u0652\u067e\u064e\u062a\u0652\u0645\u064e\u062f\u064e\u0645 \u0632\u0631\u062a\u0634\u062a\u06cc\u0627\u0646 \u0648 \u0646\u06cc\u0632 \u062c\u0634\u0646 \u0646\u0632\u0648\u0644 \u0641\u0631\u0648\u0647\u0631\u0647\u0627 \u062f\u0627\u0631\u062f \u06a9\u0647 \u0634\u0634 \u0631\u0648\u0632 \u067e\u06cc\u0634 \u0627\u0632 \u0641\u0631\u0627\u0631\u0633\u06cc\u062f\u0646 \u0646\u0648\u0631\u0648\u0632 \u0628\u0631\u06af\u0632\u0627\u0631 \u0645\u06cc\u200c\u0634\u062f. \u0627\u062d\u062a\u0645\u0627\u0644 \u062f\u06cc\u06af\u0631 \u0627\u06cc\u0646 \u0627\u0633\u062a \u06a9\u0647 \u0686\u0647\u0627\u0631\u0634\u0646\u0628\u0647\u200c\u0633\u0648\u0631\u06cc \u0628\u0627\u0632\u0645\u0627\u0646\u062f\u0647 \u0648 \u0634\u06a9\u0644 \u062a\u062d\u0648\u0644\u200c\u06cc\u0627\u0641\u062a\u0647\u200c\u0627\u06cc \u0627\u0632 \u062c\u0634\u0646 \u0633\u062f\u0647 \u0628\u0627\u0634\u062f\u060c \u06a9\u0647 \u0627\u062d\u062a\u0645\u0627\u0644 \u0628\u0639\u06cc\u062f\u06cc \u0627\u0633\u062a. \u0639\u0644\u0627\u0648\u0647 \u0628\u0631\u0627\u0641\u0631\u0648\u062e\u062a\u0646 \u0622\u062a\u0634\u060c \u0622\u06cc\u06cc\u0646\u200c\u0647\u0627\u06cc \u0645\u062e\u062a\u0644\u0641 \u062f\u06cc\u06af\u0631\u06cc \u0646\u06cc\u0632 \u062f\u0631 \u0628\u062e\u0634\u200c\u0647\u0627\u06cc \u06af\u0648\u0646\u0627\u06af\u0648\u0646 \u0627\u06cc\u0631\u0627\u0646 \u062f\u0631 \u0632\u0645\u0627\u0646 \u0627\u06cc\u0646 \u062c\u0634\u0646 \u0627\u0646\u062c\u0627\u0645 \u0645\u06cc\u200c\u0634\u0648\u0646\u062f. \u0628\u0631\u0627\u06cc \u0646\u0645\u0648\u0646\u0647\u060c \u062f\u0631 \u062a\u0628\u0631\u06cc\u0632\u060c \u0645\u0631\u062f\u0645 \u0628\u0647 \u0686\u0647\u0627\u0631\u0634\u0646\u0628\u0647\u200c\u0628\u0627\u0632\u0627\u0631 \u0645\u06cc\u200c\u0631\u0648\u0646\u062f \u06a9\u0647 \u0628\u0627 \u0686\u0631\u0627\u063a \u0648 \u0634\u0645\u0639\u060c \u0628\u0647\u200c\u0637\u0631\u0632 \u0632\u06cc\u0628\u0627\u06cc\u06cc \u0686\u0631\u0627\u063a\u0627\u0646\u06cc \u0634\u062f\u0647\u200c\u0627\u0633\u062a. \u0647\u0631 \u062e\u0627\u0646\u0648\u0627\u062f\u0647 \u06cc\u06a9 \u0622\u06cc\u0646\u0647\u060c \u062f\u0627\u0646\u0647\u200c\u0647\u0627\u06cc \u0627\u0633\u0641\u0646\u062f\u060c \u0648 \u06cc\u06a9 \u06a9\u0648\u0632\u0647 \u0628\u0631\u0627\u06cc \u0633\u0627\u0644 \u0646\u0648 \u062e\u0631\u06cc\u062f\u0627\u0631\u06cc \u0645\u06cc\u200c\u06a9\u0646\u0646\u062f. \u0647\u0645\u0647\u200c\u0633\u0627\u0644\u0647 \u0634\u0647\u0631\u0648\u0646\u062f\u0627\u0646\u06cc \u0627\u0632 \u0627\u06cc\u0631\u0627\u0646 \u062f\u0631 \u0627\u062b\u0631 \u0627\u0646\u0641\u062c\u0627\u0631\u0647\u0627\u06cc \u0646\u0627\u062e\u0648\u0634\u0627\u06cc\u0646\u062f \u0645\u0631\u0628\u0648\u0637 \u0628\u0647 \u0627\u06cc\u0646 \u062c\u0634\u0646\u060c \u06a9\u0634\u062a\u0647 \u06cc\u0627 \u0645\u0635\u062f\u0648\u0645 \u0645\u06cc\u200c\u0634\u0648\u0646\u062f. \u067e\u0631\u0633\u0634: \u0646\u0627\u0645 \u062c\u0634\u0646 \u0627\u062e\u0631\u06cc\u0646 \u0634\u0646\u0628\u0647 \u06cc \u0633\u0627\u0644 \u0686\u06cc\u0633\u062a\u061f \u067e\u0627\u0633\u062e:"}]}
|
text-generation
|
flax-community/gpt2-persian-question-answering
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"fa",
"dataset:persian_qa",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fa"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #safetensors #gpt2 #text-generation #fa #dataset-persian_qa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Question-Answering Using GPT2 - Persian
> This is a side project of this thread
Flax/Jax Community Week - GPT2 4 Persian, organized by HuggingFace and TPU usage sponsored by Google.
## Team Members
- Mehrdad Farahani
## Dataset
We used PersianQA dataset which is a reading comprehension dataset on Persian Wikipedia.
## How To Use TODO: Update
## Demo TODO: Update
## Evaluation TODO: Update
|
[
"# Question-Answering Using GPT2 - Persian\n> This is a side project of this thread\nFlax/Jax Community Week - GPT2 4 Persian, organized by HuggingFace and TPU usage sponsored by Google.",
"## Team Members\n- Mehrdad Farahani",
"## Dataset\nWe used PersianQA dataset which is a reading comprehension dataset on Persian Wikipedia.",
"## How To Use TODO: Update",
"## Demo TODO: Update",
"## Evaluation TODO: Update"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #gpt2 #text-generation #fa #dataset-persian_qa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Question-Answering Using GPT2 - Persian\n> This is a side project of this thread\nFlax/Jax Community Week - GPT2 4 Persian, organized by HuggingFace and TPU usage sponsored by Google.",
"## Team Members\n- Mehrdad Farahani",
"## Dataset\nWe used PersianQA dataset which is a reading comprehension dataset on Persian Wikipedia.",
"## How To Use TODO: Update",
"## Demo TODO: Update",
"## Evaluation TODO: Update"
] |
[
76,
54,
9,
24,
8,
6,
7
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #gpt2 #text-generation #fa #dataset-persian_qa #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Question-Answering Using GPT2 - Persian\n> This is a side project of this thread\nFlax/Jax Community Week - GPT2 4 Persian, organized by HuggingFace and TPU usage sponsored by Google.## Team Members\n- Mehrdad Farahani## Dataset\nWe used PersianQA dataset which is a reading comprehension dataset on Persian Wikipedia.## How To Use TODO: Update## Demo TODO: Update## Evaluation TODO: Update"
] |
[
-0.10318057239055634,
0.10120773315429688,
-0.0025519030168652534,
0.055484622716903687,
0.15159127116203308,
0.043239038437604904,
0.14844167232513428,
0.15414436161518097,
0.07057933509349823,
0.011252052150666714,
0.08540569245815277,
-0.014386231079697609,
0.11580422520637512,
0.1235073134303093,
0.04383448511362076,
-0.21176867187023163,
0.009368779137730598,
-0.06435372680425644,
-0.07679253816604614,
0.16019855439662933,
0.136188343167305,
-0.03746342658996582,
0.09257083386182785,
-0.0029324067290872335,
-0.1274537742137909,
-0.0006165957311168313,
-0.04483586549758911,
-0.16206017136573792,
0.1445399969816208,
0.059833601117134094,
0.06657066941261292,
0.07880790531635284,
-0.009256605990231037,
-0.10774290561676025,
0.05182894691824913,
0.009618590585887432,
-0.07124225050210953,
0.020131543278694153,
0.12074621021747589,
-0.102021224796772,
0.18976211547851562,
-0.055312689393758774,
-0.05051395297050476,
0.02836005389690399,
-0.09821668267250061,
-0.28733211755752563,
-0.053876347839832306,
0.058074403554201126,
0.05712936446070671,
0.09068099409341812,
-0.02964748442173004,
0.12726990878582,
-0.08152971416711807,
0.12425214052200317,
0.1886470764875412,
-0.29916560649871826,
-0.06293756514787674,
0.002083775820210576,
0.038642674684524536,
0.11277132481336594,
-0.07408053427934647,
0.025536663830280304,
-0.021572981029748917,
-0.01252130139619112,
0.056505873799324036,
-0.07376906275749207,
-0.12020587176084518,
-0.0007850624970160425,
-0.054120082408189774,
-0.005175178870558739,
0.3302755653858185,
-0.018267182633280754,
0.013144376687705517,
-0.03966699540615082,
-0.047718264162540436,
0.04808332026004791,
-0.033604998141527176,
-0.0470203198492527,
-0.02526118792593479,
-0.0032040723599493504,
0.016075314953923225,
-0.07385952025651932,
-0.17876841127872467,
-0.04454817250370979,
-0.1209014505147934,
0.0818430557847023,
0.06915666908025742,
0.061175305396318436,
-0.09418845176696777,
0.04789004102349281,
-0.03385243937373161,
-0.09528627246618271,
-0.005716194864362478,
-0.08162986487150192,
-0.012809187173843384,
0.004940205719321966,
0.02716602012515068,
-0.08470332622528076,
0.14435873925685883,
-0.02424526959657669,
-0.05805119872093201,
0.09714505076408386,
0.04306333139538765,
0.027043575420975685,
0.051159922033548355,
0.07602465897798538,
-0.0680503398180008,
-0.13152071833610535,
0.03703232854604721,
0.02033519186079502,
-0.012530301697552204,
0.0020431396551430225,
-0.03516704589128494,
0.003239308949559927,
0.06961199641227722,
0.0761343240737915,
0.07447260618209839,
0.11040087044239044,
0.028978092595934868,
-0.010438413359224796,
0.14181755483150482,
-0.14265277981758118,
-0.017672518268227577,
0.07030308991670609,
-0.08137138187885284,
-0.04807502403855324,
0.057663459330797195,
-0.012369245290756226,
-0.08537831157445908,
-0.09870080649852753,
-0.040085844695568085,
0.0445551797747612,
0.004361667670309544,
-0.05335542932152748,
0.04337703064084053,
-0.06000487133860588,
-0.002625291468575597,
-0.2190462350845337,
-0.11701298505067825,
0.021821463480591774,
0.025469955056905746,
-0.04221628978848457,
0.029781104996800423,
-0.003795080818235874,
0.03388914838433266,
0.04453350231051445,
-0.056357335299253464,
0.022311046719551086,
-0.08603198081254959,
0.1276397556066513,
-0.0012860104907304049,
0.11144407838582993,
-0.019297165796160698,
0.05219466611742973,
-0.11020054668188095,
0.0001356638385914266,
-0.08902041614055634,
0.10267806798219681,
-0.039148665964603424,
0.031907692551612854,
-0.16143357753753662,
-0.0067687672562897205,
-0.05091473087668419,
-0.03608870133757591,
0.018682587891817093,
0.19047483801841736,
-0.12748199701309204,
-0.07851839065551758,
0.23266282677650452,
-0.07619922608137131,
-0.19071094691753387,
0.18548092246055603,
0.019764453172683716,
0.15701329708099365,
0.08245157450437546,
0.17519913613796234,
0.09935241937637329,
-0.11570443212985992,
-0.11645041406154633,
0.07205700129270554,
-0.07449182868003845,
0.0035954939667135477,
0.09220819175243378,
0.004693571012467146,
0.09650368988513947,
0.044965483248233795,
0.04830263555049896,
0.13438992202281952,
-0.04191536083817482,
-0.06967058032751083,
-0.025320984423160553,
-0.09677153825759888,
0.07152057439088821,
0.02336161397397518,
0.09241015464067459,
-0.07414747774600983,
-0.056442562490701675,
-0.07195877283811569,
0.046800464391708374,
-0.03740924224257469,
0.03335411846637726,
-0.1486259549856186,
0.14967119693756104,
-0.06796879321336746,
0.07108499854803085,
-0.14095599949359894,
-0.08541122078895569,
-0.016123266890645027,
0.0849350169301033,
-0.03295688331127167,
-0.041511036455631256,
0.07544255256652832,
-0.017427248880267143,
-0.06535080075263977,
-0.016866061836481094,
0.06532314419746399,
0.024064963683485985,
-0.07708850502967834,
-0.13038603961467743,
0.06401333212852478,
-0.023400427773594856,
0.12108545005321503,
-0.04238021373748779,
0.02260395884513855,
0.0962001234292984,
0.07453025132417679,
-0.014196841977536678,
0.027098366990685463,
0.0143813481554389,
0.003747206646949053,
-0.00035359442699700594,
-0.05604669451713562,
0.05959496647119522,
0.00009464561298955232,
-0.08828898519277573,
0.16915355622768402,
-0.18074336647987366,
0.08485939353704453,
0.16160623729228973,
-0.07751151919364929,
-0.03709912300109863,
0.032315246760845184,
-0.03024241514503956,
-0.07566990703344345,
0.07613835483789444,
-0.007157761603593826,
0.03459596633911133,
-0.01048982236534357,
0.14271995425224304,
-0.07047897577285767,
0.042516376823186874,
0.036137621849775314,
-0.09671512991189957,
-0.07133657485246658,
0.11795815080404282,
-0.006864734459668398,
-0.24221864342689514,
0.1585301160812378,
0.21457034349441528,
0.09084652364253998,
0.24533475935459137,
0.04847893863916397,
-0.07147975265979767,
0.01081473845988512,
0.06578809022903442,
-0.06527332216501236,
0.10433615744113922,
-0.12007240206003189,
-0.0711783692240715,
0.05506747215986252,
0.03237161412835121,
0.02704627998173237,
-0.08255802094936371,
0.008348265662789345,
-0.02133185602724552,
-0.044488634914159775,
-0.05466547980904579,
0.06179388612508774,
0.009584569372236729,
0.1274130493402481,
0.00022347657068166882,
0.012402372434735298,
0.06605598330497742,
0.004318857565522194,
-0.11421000957489014,
0.1457529366016388,
-0.10434731841087341,
-0.3923843502998352,
0.015998486429452896,
-0.12397128343582153,
-0.019609713926911354,
-0.015506002120673656,
0.09860566258430481,
-0.0921982005238533,
-0.0013881816994398832,
-0.04113999009132385,
0.08951269090175629,
0.0372946597635746,
-0.0346752293407917,
-0.14023229479789734,
-0.007192323449999094,
0.020416613668203354,
-0.11741127073764801,
-0.022896599024534225,
0.025715969502925873,
-0.1271287053823471,
0.14291810989379883,
-0.14379185438156128,
0.06113141402602196,
-0.025717901065945625,
-0.033556949347257614,
0.01067645289003849,
-0.04551907256245613,
0.23024320602416992,
-0.16241057217121124,
0.11750762909650803,
0.21056510508060455,
-0.011251659132540226,
0.026116445660591125,
0.10590513050556183,
0.005532057024538517,
-0.01874438486993313,
0.02578939124941826,
0.014466356486082077,
-0.09390607476234436,
-0.20303595066070557,
-0.06785010546445847,
-0.10174103826284409,
0.07018399238586426,
-0.007882319390773773,
0.08259053528308868,
-0.022182432934641838,
0.05879963934421539,
-0.061191607266664505,
-0.052972547709941864,
-0.015358519740402699,
0.020120039582252502,
0.09887907654047012,
-0.011460083536803722,
0.07938797026872635,
-0.14379699528217316,
0.025977512821555138,
0.09450516104698181,
0.12073323130607605,
0.2148561179637909,
0.015366842038929462,
0.14429306983947754,
0.07590968161821365,
0.2722472846508026,
0.07225504517555237,
-0.009258788079023361,
0.05786292254924774,
-0.02730848453938961,
0.021298138424754143,
-0.03264492005109787,
-0.07842260599136353,
0.03164636716246605,
0.05939284339547157,
-0.05318455025553703,
-0.005672956816852093,
-0.0009086272330023348,
0.1491684466600418,
0.19630195200443268,
-0.031124794855713844,
-0.1819850355386734,
-0.0534919910132885,
0.002255114959552884,
-0.03329630196094513,
-0.08279098570346832,
-0.009091893211007118,
0.021726615726947784,
-0.19276869297027588,
0.022560400888323784,
-0.03426973894238472,
0.08520184457302094,
-0.025952301919460297,
-0.029121749103069305,
-0.020816661417484283,
-0.11638250201940536,
-0.0360201857984066,
0.12507827579975128,
-0.2881307601928711,
0.2163960486650467,
0.07075310498476028,
0.08116883039474487,
-0.08686484396457672,
-0.016731102019548416,
0.06217315047979355,
0.10400272160768509,
0.11800829321146011,
0.012070465832948685,
0.08719798922538757,
-0.08293435722589493,
-0.18373391032218933,
0.08954811096191406,
0.04551440849900246,
0.006642846390604973,
0.011929718777537346,
-0.0029076868668198586,
0.033198483288288116,
-0.03646496683359146,
0.08294425904750824,
-0.06752356141805649,
-0.14241498708724976,
0.06720765680074692,
0.0029480229131877422,
0.03333033621311188,
-0.03926274552941322,
-0.04088054224848747,
-0.261432945728302,
0.13148345053195953,
-0.0209612138569355,
-0.1871674358844757,
-0.10512327402830124,
-0.017099441960453987,
0.03296171873807907,
-0.08608031272888184,
0.005483711138367653,
-0.017329415306448936,
-0.08308155834674835,
0.01700359396636486,
-0.13964197039604187,
0.056945864111185074,
-0.11565068364143372,
-0.10239773243665695,
0.008408550173044205,
0.1726522296667099,
0.0697856992483139,
0.02027168683707714,
-0.015041403472423553,
-0.0066831884905695915,
-0.06132663041353226,
-0.06667669117450714,
0.04490157216787338,
-0.058434974402189255,
0.06648482382297516,
-0.02911793813109398,
0.05648458003997803,
-0.15104497969150543,
-0.07667176425457001,
-0.0957389771938324,
0.1249241977930069,
0.11501007527112961,
-0.029749993234872818,
0.1656300127506256,
0.18557710945606232,
-0.024552324786782265,
-0.23365579545497894,
-0.11072749644517899,
0.006699515972286463,
-0.0016599377850070596,
0.05007477104663849,
-0.167033851146698,
0.006206911522895098,
0.06327837705612183,
-0.03783511742949486,
0.08080273866653442,
-0.2500450313091278,
-0.0793878510594368,
0.14700831472873688,
0.025913972407579422,
0.12508751451969147,
-0.2420734018087387,
0.003569582477211952,
0.024470074102282524,
-0.16672183573246002,
0.09840302914381027,
-0.14787353575229645,
0.09177475422620773,
-0.011490534991025925,
0.07876711338758469,
-0.002829430392012,
-0.023579539731144905,
0.15768149495124817,
-0.11565381288528442,
-0.034774865955114365,
-0.13979220390319824,
-0.01081633660942316,
0.07593932002782822,
0.016187794506549835,
0.05230981111526489,
-0.07781050354242325,
-0.006655190140008926,
-0.10525751858949661,
-0.03491995483636856,
-0.09141643345355988,
0.05185113847255707,
-0.003590521402657032,
-0.04169003665447235,
-0.042237572371959686,
0.02447018399834633,
0.02343909814953804,
-0.00572252506390214,
0.09978887438774109,
-0.07260178029537201,
0.07762981206178665,
0.12863436341285706,
0.1268165409564972,
-0.012927006930112839,
-0.06408962607383728,
-0.0710500031709671,
-0.04508614540100098,
0.0679851844906807,
-0.2386104017496109,
0.02403840608894825,
0.068569116294384,
-0.029801689088344574,
0.08041668683290482,
-0.009029099717736244,
-0.08734630048274994,
0.07018023729324341,
0.10481074452400208,
-0.12834779918193817,
-0.20709151029586792,
-0.0172455795109272,
-0.06724830716848373,
-0.03603322431445122,
0.020559580996632576,
0.12188977003097534,
-0.06817878782749176,
-0.021852901205420494,
-0.01580950617790222,
0.018382815644145012,
0.006655687000602484,
0.14692950248718262,
0.06192947179079056,
0.006488914135843515,
-0.10102935880422592,
0.11317496001720428,
0.07876336574554443,
-0.10254194587469101,
0.05302928388118744,
0.08126448094844818,
-0.15770173072814941,
-0.08809288591146469,
-0.11449577659368515,
0.0016726464964449406,
0.020400136709213257,
-0.17068015038967133,
-0.06245679408311844,
-0.03293292224407196,
-0.03837531805038452,
0.03580717369914055,
0.03901858627796173,
0.049689989537000656,
0.03614175319671631,
-0.009774790145456791,
-0.03995672985911369,
0.13624563813209534,
-0.00012822610733564943,
-0.050275806337594986,
-0.067959725856781,
0.03597937896847725,
0.006017615087330341,
0.18963579833507538,
-0.06822101026773453,
-0.04562459886074066,
-0.05603422969579697,
0.021039506420493126,
-0.1479819118976593,
-0.007821080274879932,
-0.11661174893379211,
-0.024324379861354828,
-0.02856723591685295,
-0.05729728564620018,
-0.04924285039305687,
-0.00390562298707664,
-0.04090873524546623,
0.010696733370423317,
0.012211639434099197,
0.08729707449674606,
-0.12952126562595367,
-0.020989129319787025,
0.029373466968536377,
0.005384912248700857,
0.15772776305675507,
0.16726525127887726,
-0.05590420216321945,
0.07754520326852798,
-0.1102488711476326,
0.03481019288301468,
0.002253495855256915,
0.013298841193318367,
0.05577784776687622,
-0.05550287291407585,
-0.016328798606991768,
0.02874821051955223,
-0.048943545669317245,
0.04559681937098503,
0.09394580125808716,
-0.10263927280902863,
0.08346225321292877,
-0.10581497102975845,
0.01662510819733143,
-0.02995777688920498,
0.06409106403589249,
0.0158542487770319,
0.06944867968559265,
0.05939079821109772,
-0.07567233592271805,
-0.004910785239189863,
-0.09828914701938629,
0.014537160284817219,
-0.02119435742497444,
-0.07980146259069443,
-0.047568317502737045,
-0.022799469530582428,
0.07343390583992004,
-0.046568356454372406,
0.10366865992546082,
0.05443447828292847,
-0.04341541603207588,
0.031144632026553154,
0.023039929568767548,
0.02347840741276741,
-0.030823541805148125,
0.08977263420820236,
-0.026202691718935966,
-0.00517363753169775,
0.0050165485590696335,
0.044801484793424606,
-0.03457094356417656,
-0.0065225414000451565,
0.0638323500752449,
0.1468435525894165,
0.023496266454458237,
0.061843354254961014,
-0.06354262679815292,
-0.1304997205734253,
0.0030081956647336483,
-0.09980210661888123,
-0.10386673361063004,
0.03155793249607086,
-0.04855700954794884,
-0.010235227644443512,
0.2957650423049927,
-0.05025891959667206,
0.019772646948695183,
-0.10169292986392975,
-0.05439947918057442,
-0.0727217048406601,
-0.03939595818519592,
-0.0683303102850914,
-0.056080784648656845,
-0.034577418118715286,
-0.06206906586885452,
0.02042202837765217,
0.10816973447799683,
0.0572589635848999,
-0.08359774947166443,
0.21545302867889404,
0.03732634708285332,
-0.12152132391929626,
0.049151621758937836,
-0.026100317016243935,
0.05004328489303589,
-0.07304586470127106,
-0.02257026918232441,
0.005407062824815512,
0.035096872597932816,
0.019953597337007523,
0.049821823835372925,
-0.0011187432100996375,
0.08702883124351501,
-0.13739581406116486,
-0.08908329904079437,
-0.048822738230228424,
0.08960587531328201,
0.015727723017334938,
0.10406101495027542,
0.04892704635858536,
0.01868443936109543,
-0.014049874618649483,
0.23994849622249603,
0.02664019539952278,
-0.034753408282995224,
-0.08922170847654343,
0.02155706100165844,
-0.055484525859355927,
-0.046917133033275604,
0.0288511011749506,
-0.09681189060211182,
-0.013149620033800602,
0.1906253844499588,
0.28321799635887146,
-0.04502632096409798,
0.04717383161187172,
-0.07629891484975815,
0.02992073819041252,
0.04641597718000412,
0.11065801978111267,
0.06941624730825424,
0.23976954817771912,
-0.10686177760362625,
0.008813983760774136,
-0.08404986560344696,
0.01058853231370449,
-0.11791133135557175,
0.1313706785440445,
0.038321103900671005,
0.03014148771762848,
-0.045778788626194,
0.11203424632549286,
-0.04044745862483978,
-0.145629420876503,
-0.02128956839442253,
-0.07837775349617004,
-0.12427778542041779,
0.009854523465037346,
0.09128788858652115,
0.08422834426164627,
0.019168632104992867,
-0.012005261145532131,
0.06284984946250916,
0.02598605863749981,
0.04251803085207939,
-0.12516474723815918,
-0.011920258402824402,
0.15055906772613525,
-0.05667414516210556,
0.13347624242305756,
-0.02237662859261036,
0.10986573249101639,
0.07585737854242325,
0.010180851444602013,
-0.09063305705785751,
0.10742505639791489,
0.07541470974683762,
-0.008794346824288368,
0.03340516239404678,
0.0201247651129961,
0.0682603046298027,
-0.0599432997405529,
0.10554046183824539,
0.10824093967676163,
0.032660018652677536,
-0.023593999445438385,
0.030116090551018715,
-0.07311387360095978,
0.11431878805160522,
-0.08847478032112122,
0.10001464188098907,
0.1360113024711609,
-0.06220761686563492,
0.0062699755653738976,
-0.05531127378344536,
0.008453369140625,
0.07011272013187408,
-0.06474205106496811,
-0.03296433761715889,
-0.21414491534233093,
0.0034063272178173065,
-0.05611077696084976,
-0.00023751138360239565,
-0.07109789550304413,
-0.025839928537607193,
-0.06446231156587601,
-0.002523115137591958,
-0.04581155255436897,
0.06562045961618423,
0.014094541780650616,
0.013214899227023125,
-0.02426009438931942,
0.021554257720708847,
-0.02631627395749092,
0.08911123871803284,
-0.14131039381027222,
-0.09285938739776611
] |
null | null |
transformers
|
Rap Lyric Generator <br/>
GPT-2 fine tuned using FLAX/JAX on over 10000 Rap Songs from over 50 rappers, the dataset was gathered from genius.com <br/>
Checkout the deployed version on hf-spaces :- [here](https://huggingface.co/spaces/Shankhdhar/Rap-Lyric-generator) <br/>
Colab for making predictions:- [here](https://colab.research.google.com/drive/1aibR06TrFGnt-TPmyIRDD2-8eT7PU5Kl#scrollTo=rgE3QbiTFIMQ)<br/>
The dataset we used: [dataset](https://huggingface.co/datasets/Cropinky/rap_lyrics_english)<br/>
Made by:-<br/>
[Anant Shankhdhar](https://huggingface.co/Shankhdhar)<br/>
[Jeronim Matijević](https://huggingface.co/Cropinky)<br/>
|
{}
|
text-generation
|
flax-community/gpt2-rap-lyric-generator
|
[
"transformers",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #tensorboard #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
Rap Lyric Generator <br/>
GPT-2 fine tuned using FLAX/JAX on over 10000 Rap Songs from over 50 rappers, the dataset was gathered from URL <br/>
Checkout the deployed version on hf-spaces :- here <br/>
Colab for making predictions:- here<br/>
The dataset we used: dataset<br/>
Made by:-<br/>
Anant Shankhdhar<br/>
Jeronim Matijević<br/>
|
[] |
[
"TAGS\n#transformers #jax #tensorboard #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] |
[
54
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.0031625176779925823,
0.05869220942258835,
-0.004166902508586645,
0.014401141554117203,
0.13135184347629547,
0.016794828698039055,
0.132017582654953,
0.16098830103874207,
0.004759633913636208,
0.03815187141299248,
0.17838871479034424,
0.1534416228532791,
-0.0010579598601907492,
0.030281852930784225,
-0.05741669237613678,
-0.2549493908882141,
0.037310585379600525,
0.06135540455579758,
-0.09679479897022247,
0.07902313023805618,
0.07018905133008957,
-0.06841529905796051,
0.09559542685747147,
-0.015742627903819084,
-0.20618298649787903,
0.05468657985329628,
0.05563702434301376,
-0.11188039928674698,
0.09674529731273651,
0.08950863778591156,
0.12913522124290466,
0.034571725875139236,
-0.05845172703266144,
-0.11924757063388824,
0.0262244064360857,
0.056740760803222656,
-0.07697699218988419,
0.09175147861242294,
0.0801219642162323,
-0.07025559991598129,
0.05179170146584511,
0.023737024515867233,
-0.03195057064294815,
0.03841094300150871,
-0.1609591245651245,
-0.046991076320409775,
-0.03213014826178551,
-0.05330843850970268,
0.02977076731622219,
0.06693104654550552,
-0.024903839454054832,
0.0991709977388382,
-0.047018446028232574,
0.07777605205774307,
0.15256211161613464,
-0.35117441415786743,
-0.012925448827445507,
0.16922321915626526,
0.12261335551738739,
0.0675797313451767,
-0.05151627957820892,
0.0907929539680481,
0.04820907115936279,
0.004606018774211407,
0.06797043979167938,
-0.07479503005743027,
-0.10555694997310638,
0.08818446844816208,
-0.10107240825891495,
-0.01346149668097496,
0.2121601551771164,
-0.05444185063242912,
0.08821656554937363,
-0.04878030717372894,
-0.11202218383550644,
-0.09737033396959305,
-0.011130989529192448,
-0.018788794055581093,
-0.04230808466672897,
0.09627297520637512,
-0.003563689300790429,
-0.11375356465578079,
-0.14734767377376556,
-0.017811572179198265,
-0.17612937092781067,
0.12082067131996155,
-0.021527502685785294,
0.039319369941949844,
-0.1893492043018341,
0.06127219274640083,
-0.055795151740312576,
-0.10025763511657715,
0.05495670810341835,
-0.058710742741823196,
-0.019707316532731056,
-0.013968046754598618,
-0.0625215470790863,
-0.21510601043701172,
0.08446350693702698,
0.06822735071182251,
0.022492989897727966,
0.03999888896942139,
-0.03507198020815849,
0.10092296451330185,
0.031593479216098785,
0.12295252829790115,
-0.050153136253356934,
-0.04747549071907997,
0.007962518371641636,
-0.11063642799854279,
0.013673796318471432,
-0.09922582656145096,
-0.1959284543991089,
-0.02434423938393593,
0.048616353422403336,
0.05351192504167557,
0.01793874055147171,
0.09560786932706833,
-0.03116583451628685,
0.00575832836329937,
0.03372026979923248,
-0.06190656125545502,
0.05333634465932846,
-0.02602592296898365,
0.0569155290722847,
0.07265389710664749,
0.0017783980583772063,
0.017194081097841263,
-0.06251420825719833,
0.0023425607942044735,
-0.10507962107658386,
-0.012788515537977219,
-0.06826446950435638,
-0.12039361149072647,
0.030774975195527077,
-0.11227280646562576,
0.013200981542468071,
-0.14013366401195526,
-0.08960223197937012,
0.014487186446785927,
0.0330081433057785,
-0.0703710988163948,
-0.005242314655333757,
-0.01320396177470684,
-0.05767468735575676,
0.10497560352087021,
-0.052819736301898956,
0.0191354900598526,
-0.041809119284152985,
0.04291198030114174,
-0.0352771058678627,
0.09812509268522263,
-0.14010511338710785,
0.06302645802497864,
-0.07075992226600647,
0.02982725389301777,
-0.11052262037992477,
0.06821154057979584,
-0.028133729472756386,
0.10214392840862274,
-0.05352511629462242,
0.025290604680776596,
-0.11275535076856613,
0.0738530308008194,
0.010370254516601562,
0.13052672147750854,
-0.1596028208732605,
-0.08869773149490356,
0.1956503689289093,
-0.041976578533649445,
-0.1197018101811409,
0.10403779149055481,
-0.005632862914353609,
0.05869804322719574,
0.031517453491687775,
0.23985512554645538,
0.044905658811330795,
-0.024450751021504402,
0.04159867390990257,
0.13312554359436035,
-0.08317461609840393,
-0.07401928305625916,
0.006329396739602089,
-0.028235681354999542,
-0.06255857646465302,
0.02598763257265091,
0.11466174572706223,
0.08413590490818024,
-0.019043834879994392,
-0.04023114964365959,
-0.05711092799901962,
-0.0005413108738139272,
0.06680316478013992,
0.013490558601915836,
0.15991637110710144,
-0.05767757445573807,
-0.07346213608980179,
0.05065697804093361,
-0.037388402968645096,
-0.06031196564435959,
0.031406089663505554,
-0.03561641275882721,
0.13567712903022766,
-0.1427263617515564,
0.043496642261743546,
-0.1678263545036316,
-0.13472378253936768,
-0.010542524978518486,
0.085334911942482,
0.012488570995628834,
0.17686858773231506,
0.09109730273485184,
-0.04119649529457092,
-0.0035806065425276756,
0.03599758818745613,
0.1773822158575058,
0.023432353511452675,
-0.11139518767595291,
-0.08203444629907608,
0.06317903846502304,
-0.10297844558954239,
-0.056239958852529526,
-0.12377845495939255,
0.03447052836418152,
0.09578146040439606,
0.10562051087617874,
0.04357872158288956,
0.027155231684446335,
-0.01665458269417286,
-0.002590854885056615,
-0.12185335904359818,
-0.027845187112689018,
0.0628172904253006,
-0.02020670287311077,
-0.07321924716234207,
0.22351406514644623,
-0.21856534481048584,
0.21785424649715424,
0.1982143372297287,
-0.2296905815601349,
-0.007210085168480873,
-0.0254379753023386,
-0.0019284130539745092,
0.020329639315605164,
0.03246847167611122,
-0.052609823644161224,
0.04070087894797325,
-0.046964652836322784,
0.15485496819019318,
-0.039172254502773285,
-0.03118903748691082,
0.011919484473764896,
-0.051663246005773544,
-0.06235459819436073,
0.051140621304512024,
0.09932079911231995,
-0.13816754519939423,
0.2038464993238449,
0.2449735701084137,
-0.020489033311605453,
0.20360638201236725,
0.01225407887250185,
-0.026549626141786575,
0.023039136081933975,
-0.007924968376755714,
-0.013313273899257183,
-0.01407934445887804,
-0.20030896365642548,
-0.05757534131407738,
0.05517531558871269,
0.041279539465904236,
0.09523312747478485,
-0.12840934097766876,
-0.018472110852599144,
0.008757786825299263,
0.008384241722524166,
0.057382214814424515,
0.09054780006408691,
0.01905667781829834,
0.09303455799818039,
-0.0019749479833990335,
-0.010495463386178017,
0.10483487695455551,
0.017302971333265305,
-0.06516782939434052,
0.1714097559452057,
-0.12242652475833893,
-0.28751328587532043,
-0.16071560978889465,
-0.13549506664276123,
-0.035046789795160294,
0.0637371614575386,
0.07971442490816116,
-0.10610465705394745,
-0.03593102842569351,
-0.05118098855018616,
0.10048811882734299,
-0.09647825360298157,
0.06142208352684975,
-0.09462534636259079,
0.03527459874749184,
-0.06366075575351715,
-0.0859437957406044,
-0.03374049440026283,
-0.006878068670630455,
-0.03895719721913338,
0.10239961743354797,
-0.06622999906539917,
0.07138654589653015,
0.22029145061969757,
0.017061308026313782,
0.03516208380460739,
-0.02953425422310829,
0.17540818452835083,
-0.10241235792636871,
0.034652382135391235,
0.12055598944425583,
-0.029995162039995193,
0.05657584220170975,
0.1437140554189682,
0.011703207157552242,
-0.08011350780725479,
0.002258131979033351,
0.010228371247649193,
-0.12261748313903809,
-0.20465561747550964,
-0.06705962866544724,
-0.14819422364234924,
0.05926777422428131,
0.06478891521692276,
0.07651592046022415,
0.14429108798503876,
0.09555310010910034,
0.04511459171772003,
0.08588859438896179,
-0.023288369178771973,
0.05475285276770592,
0.16664555668830872,
-0.023160919547080994,
0.15568293631076813,
-0.07384833693504333,
-0.13204066455364227,
0.07433277368545532,
0.12512889504432678,
0.11714643985033035,
0.08114442229270935,
0.1444961577653885,
0.00862330012023449,
0.09221619367599487,
0.149849072098732,
0.0690806582570076,
0.013356966897845268,
-0.04558763653039932,
-0.03022400662302971,
-0.020454568788409233,
0.02518821507692337,
0.08124489337205887,
0.07813797146081924,
-0.16797268390655518,
0.013659918680787086,
-0.11005258560180664,
0.087477907538414,
0.06450770050287247,
0.1229616180062294,
-0.24456165730953217,
0.02001626417040825,
0.11638131737709045,
-0.02203887142241001,
-0.10896003991365433,
0.05952271819114685,
0.09640580415725708,
-0.047947172075510025,
0.02202937938272953,
-0.07410319149494171,
0.09702272713184357,
-0.03919386491179466,
0.07267198711633682,
-0.04528801515698433,
-0.04470019415020943,
-0.006181628443300724,
0.08863008767366409,
-0.2739979028701782,
0.1885276436805725,
0.01909581571817398,
-0.06632459163665771,
-0.09720592200756073,
0.03434727340936661,
0.01397794671356678,
0.09123179316520691,
0.10096582025289536,
-0.01301519013941288,
-0.16873182356357574,
-0.05972950533032417,
0.005203709937632084,
0.008035720326006413,
0.12240704894065857,
-0.024869289249181747,
-0.0038501559756696224,
-0.03652589023113251,
0.0062357718124985695,
0.036726757884025574,
0.03471348434686661,
0.0021472664084285498,
-0.21767643094062805,
0.0917893499135971,
0.06323390454053879,
-0.01450207456946373,
0.023977644741535187,
-0.01706906408071518,
-0.17302773892879486,
0.26311689615249634,
-0.05806812644004822,
-0.029923830181360245,
-0.12235341221094131,
-0.025161921977996826,
0.06250058114528656,
-0.04501267150044441,
0.03274880349636078,
-0.06593604385852814,
0.06128424033522606,
-0.05926539748907089,
-0.2073453813791275,
0.17930079996585846,
-0.07041686028242111,
-0.005208586808294058,
-0.07680193334817886,
0.10202573239803314,
-0.10098346322774887,
0.013108808547258377,
0.011884143576025963,
0.06299641728401184,
-0.09492211043834686,
-0.09095875918865204,
0.04989884793758392,
-0.03089791350066662,
0.02626289613544941,
-0.019809678196907043,
-0.038030195981264114,
0.0047800554893910885,
0.04602740705013275,
0.01180331315845251,
0.2947079539299011,
0.14899159967899323,
-0.10385657846927643,
0.1302204430103302,
0.06794058531522751,
-0.09177064150571823,
-0.3457490801811218,
-0.004644898697733879,
-0.1200491338968277,
0.006488305050879717,
0.011961899697780609,
-0.15958848595619202,
0.07606540620326996,
0.006910755764693022,
-0.04169095307588577,
0.14616553485393524,
-0.24025394022464752,
-0.10335181653499603,
0.13150610029697418,
0.001330648548901081,
0.29296427965164185,
-0.17788857221603394,
-0.10410158336162567,
-0.04856765270233154,
-0.06579331308603287,
0.1786462515592575,
-0.12020042538642883,
0.11712587624788284,
0.018853558227419853,
0.06722011417150497,
0.04307108744978905,
-0.059005007147789,
0.12487974017858505,
-0.023921772837638855,
0.023312054574489594,
-0.107099749147892,
-0.05185035243630409,
0.11671019345521927,
-0.036557551473379135,
0.006711299996823072,
-0.10011179745197296,
-0.007513966877013445,
-0.0938490703701973,
-0.025198068469762802,
-0.04220536723732948,
0.07599833607673645,
0.04063774645328522,
-0.08116070181131363,
-0.06299111992120743,
-0.054160717874765396,
-0.014408733695745468,
-0.009837173856794834,
0.3166894018650055,
-0.04498438909649849,
0.16214321553707123,
0.18660005927085876,
0.055397432297468185,
-0.1110672727227211,
0.02380608208477497,
0.010695328004658222,
-0.06042690947651863,
0.10401603579521179,
-0.20457804203033447,
0.06308453530073166,
0.0758678987622261,
-0.059471845626831055,
0.08517719060182571,
0.10884637385606766,
-0.030587662011384964,
0.02535592019557953,
0.14490807056427002,
-0.2057361751794815,
-0.08077680319547653,
-0.047071799635887146,
-0.06607432663440704,
0.04807679355144501,
0.0678577572107315,
0.18377119302749634,
0.01907130517065525,
0.0040004258044064045,
0.022286664694547653,
0.004443163983523846,
-0.054858848452568054,
0.0649317279458046,
0.04175250604748726,
0.022412173449993134,
-0.10558541119098663,
0.07294834405183792,
0.05230112746357918,
-0.16026760637760162,
0.04337398707866669,
0.12853488326072693,
-0.07624596357345581,
-0.13455143570899963,
-0.0197551678866148,
0.14347197115421295,
-0.09083374589681625,
-0.004121619276702404,
-0.05097829923033714,
-0.10612890124320984,
0.05510052666068077,
0.1406835913658142,
0.0571073517203331,
0.07875766605138779,
-0.04653202369809151,
-0.02188100293278694,
-0.04641636461019516,
0.0021342469844967127,
-0.027782423421740532,
0.053778182715177536,
-0.10708829760551453,
0.014817873015999794,
-0.04359337314963341,
0.10876908898353577,
-0.10724924504756927,
-0.0324641577899456,
-0.19032879173755646,
-0.02556513622403145,
-0.12911443412303925,
-0.06605348736047745,
-0.08861551433801651,
-0.06356687098741531,
0.016711072996258736,
-0.03865523636341095,
-0.08124808967113495,
-0.06324593722820282,
-0.12080960720777512,
0.0015869998605921865,
-0.052399035543203354,
0.04198548197746277,
-0.06744921952486038,
-0.008765471167862415,
0.051066748797893524,
-0.02484334260225296,
0.10800771415233612,
0.03483060002326965,
-0.043360430747270584,
0.1020427867770195,
-0.11929647624492645,
-0.06457541882991791,
0.10243117064237595,
0.0024227406829595566,
0.0616304874420166,
0.0886712595820427,
-0.0004366367938928306,
0.032794926315546036,
0.06838278472423553,
0.044045768678188324,
-0.026266546919941902,
-0.08285555243492126,
0.03527023643255234,
-0.10038720816373825,
-0.1306551843881607,
-0.04364876076579094,
0.0007356908754445612,
0.04210083931684494,
0.021394241601228714,
0.09498246014118195,
-0.04644489288330078,
0.05867717042565346,
-0.07228627800941467,
0.03353310376405716,
-0.0030074899550527334,
-0.18258413672447205,
0.01892843097448349,
-0.06106831133365631,
0.018622474744915962,
-0.022901616990566254,
0.27868974208831787,
0.08077171444892883,
-0.03991829976439476,
0.036082636564970016,
0.05323311686515808,
0.03771907836198807,
0.04139363393187523,
0.21869975328445435,
0.10575182735919952,
-0.06137113273143768,
-0.13915564119815826,
0.10184648633003235,
0.05983760580420494,
0.08639710396528244,
0.0866178348660469,
0.025467177852988243,
-0.0674012154340744,
0.15690188109874725,
-0.00983092375099659,
-0.043573345988988876,
-0.08477026969194412,
0.02817542478442192,
-0.05449263006448746,
0.10388665646314621,
-0.03379504755139351,
0.020492928102612495,
0.1848352998495102,
-0.025273513048887253,
0.03671317920088768,
-0.04623277857899666,
-0.07577329128980637,
-0.17000575363636017,
-0.20021316409111023,
-0.093358613550663,
-0.13072660565376282,
-0.01830078847706318,
-0.08147350698709488,
0.05790375918149948,
0.08591607213020325,
0.05566619709134102,
-0.026469510048627853,
0.12816886603832245,
0.07044699043035507,
-0.06335187703371048,
0.030390655621886253,
-0.012258718721568584,
0.054969485849142075,
-0.0396856851875782,
-0.003173097735270858,
-0.09078017622232437,
0.029316412284970284,
-0.03200716897845268,
0.039031460881233215,
-0.0032636825926601887,
0.0326945036649704,
-0.13935747742652893,
-0.1134294793009758,
-0.056822769343853,
0.07940617203712463,
-0.05552024394273758,
0.08085732907056808,
0.007689157035201788,
-0.04479381442070007,
0.027642153203487396,
0.22628211975097656,
-0.09653963148593903,
0.03109918162226677,
-0.0629974752664566,
0.12005360424518585,
0.009486178867518902,
0.13222070038318634,
-0.07831086218357086,
-0.03560853749513626,
-0.07498298585414886,
0.31280213594436646,
0.3342958986759186,
-0.08745069801807404,
0.02854655496776104,
0.035657335072755814,
0.012312902137637138,
0.06541790813207626,
0.10730492323637009,
0.03176140412688255,
0.20780228078365326,
-0.07747133076190948,
-0.06625863909721375,
0.013265948742628098,
-0.020925037562847137,
-0.0835263729095459,
0.1299448162317276,
0.06256061792373657,
-0.04443657025694847,
-0.0344059020280838,
0.0647117868065834,
-0.184451162815094,
0.07348601520061493,
-0.03838365525007248,
-0.1891637146472931,
-0.07415621727705002,
0.03823869675397873,
0.15200716257095337,
-0.04859136790037155,
0.08838203549385071,
-0.03153667226433754,
-0.08051463961601257,
-0.016526440158486366,
0.004438107367604971,
-0.20648397505283356,
0.044529374688863754,
0.03207828477025032,
-0.06076419726014137,
0.026025572791695595,
-0.034620605409145355,
-0.02377256378531456,
0.12022893130779266,
0.04476414620876312,
-0.04605161398649216,
0.020791349932551384,
-0.0017713498091325164,
-0.051380231976509094,
0.011766922660171986,
0.04376426339149475,
-0.000779884634539485,
-0.06861943751573563,
0.09055774658918381,
-0.13300293684005737,
0.04876919463276863,
-0.13300399482250214,
-0.04605567082762718,
-0.016440192237496376,
-0.06436626613140106,
-0.06004290655255318,
0.09023147076368332,
0.07305080443620682,
0.013962037861347198,
-0.007419113535434008,
-0.03593330830335617,
-0.021340953186154366,
0.00002946987297036685,
-0.020661231130361557,
-0.116391122341156,
-0.12976700067520142,
-0.07820378243923187,
0.11942551285028458,
-0.007138282526284456,
-0.23435598611831665,
-0.011126156896352768,
-0.0631912350654602,
0.061357639729976654,
-0.14143946766853333,
0.07959787547588348,
0.17707812786102295,
0.01640382409095764,
-0.037869103252887726,
-0.10302548855543137,
0.037602540105581284,
0.060249023139476776,
-0.10938987880945206,
-0.08564726263284683
] |
null | null |
transformers
|
# GPT2-small-indonesian
This is a pretrained model on Indonesian language using a causal language modeling (CLM) objective, which was first
introduced in [this paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
and first released at [this page](https://openai.com/blog/better-language-models/).
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
The demo can be found [here](https://huggingface.co/spaces/flax-community/gpt2-indonesian).
## How to use
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness,
we set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='flax-community/gpt2-small-indonesian')
>>> set_seed(42)
>>> generator("Sewindu sudah kita tak berjumpa,", max_length=30, num_return_sequences=5)
[{'generated_text': 'Sewindu sudah kita tak berjumpa, dua dekade lalu, saya hanya bertemu sekali. Entah mengapa, saya lebih nyaman berbicara dalam bahasa Indonesia, bahasa Indonesia'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, tapi dalam dua hari ini, kita bisa saja bertemu.”\
“Kau tau, bagaimana dulu kita bertemu?” aku'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, banyak kisah yang tersimpan. Tak mudah tuk kembali ke pelukan, di mana kini kita berada, sebuah tempat yang jauh'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, sejak aku lulus kampus di Bandung, aku sempat mencari kabar tentangmu. Ah, masih ada tempat di hatiku,'},
{'generated_text': 'Sewindu sudah kita tak berjumpa, tapi Tuhan masih saja menyukarkan doa kita masing-masing.\
Tuhan akan memberi lebih dari apa yang kita'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('flax-community/gpt2-small-indonesian')
model = GPT2Model.from_pretrained('flax-community/gpt2-small-indonesian')
text = "Ubah dengan teks apa saja."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('flax-community/gpt2-small-indonesian')
model = TFGPT2Model.from_pretrained('flax-community/gpt2-small-indonesian')
text = "Ubah dengan teks apa saja."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Limitations and bias
The training data used for this model are Indonesian websites of [OSCAR](https://oscar-corpus.com/),
[mc4](https://huggingface.co/datasets/mc4) and [Wikipedia](https://huggingface.co/datasets/wikipedia). The datasets
contain a lot of unfiltered content from the internet, which is far from neutral. While we have done some filtering on
the dataset (see the **Training data** section), the filtering is by no means a thorough mitigation of biased content
that is eventually used by the training data. These biases might also affect models that are fine-tuned using this model.
As the openAI team themselves point out in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md#out-of-scope-use-cases):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases
> that require the generated text to be true.
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we
> do not recommend that they be deployed into systems that interact with humans > unless the deployers first carry
> out a study of biases relevant to the intended use-case. We found no statistically significant difference in gender,
> race, and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with
> similar levels of caution around use cases that are sensitive to biases around human attributes.
We have done a basic bias analysis that you can find in this [notebook](https://huggingface.co/flax-community/gpt2-small-indonesian/blob/main/bias_analysis/gpt2_medium_indonesian_bias_analysis.ipynb), performed on [Indonesian GPT2 medium](https://huggingface.co/flax-community/gpt2-medium-indonesian), based on the bias analysis for [Polish GPT2](https://huggingface.co/flax-community/papuGaPT2) with modifications.
### Gender bias
We generated 50 texts starting with prompts "She/He works as". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.

The most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).

### Ethnicity bias
We generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:
* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)
* Topic - we will use 5 different topics:
* random act: *entered home*
* said: *said*
* works as: *works as*
* intent: *let [person] ...*
* define: *is*
Sample of generated prompt: "seorang perempuan sunda masuk ke rumah..." (a Sundanese woman enters the house...)
We used a [model](https://huggingface.co/Hate-speech-CNERG/dehatebert-mono-indonesian) trained on Indonesian hate speech corpus ([dataset 1](https://github.com/okkyibrohim/id-multi-label-hate-speech-and-abusive-language-detection), [dataset 2](https://github.com/ialfina/id-hatespeech-detection)) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.

### Religion bias
With the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.

## Training data
The model was trained on a combined dataset of [OSCAR](https://oscar-corpus.com/), [mc4](https://huggingface.co/datasets/mc4)
and Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB
of data in total. The mc4 dataset was cleaned using [this filtering script](https://github.com/Wikidepia/indonesian_datasets/blob/master/dump/mc4/cleanup.py)
and we also only included links that have been cited by the Indonesian Wikipedia.
## Training procedure
The model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was `4d 14h 50m 47s`.
### Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
| dataset | train loss | eval loss | eval perplexity |
| ---------- | ---------- | -------------- | ---------- |
| ID OSCAR+mc4+wikipedia (29GB) | 3.046 | 2.926 | 18.66 |
### Tracking
The training process was tracked in [TensorBoard](https://huggingface.co/flax-community/gpt2-small-indonesian/tensorboard) and [Weights and Biases](https://wandb.ai/wandb/hf-flax-gpt2-indonesian?workspace=user-cahya).
## Team members
- Akmal ([@Wikidepia](https://huggingface.co/Wikidepia))
- alvinwatner ([@alvinwatner](https://huggingface.co/alvinwatner))
- Cahya Wirawan ([@cahya](https://huggingface.co/cahya))
- Galuh Sahid ([@Galuh](https://huggingface.co/Galuh))
- Muhammad Agung Hambali ([@AyameRushia](https://huggingface.co/AyameRushia))
- Muhammad Fhadli ([@muhammadfhadli](https://huggingface.co/muhammadfhadli))
- Samsul Rahmadani ([@munggok](https://huggingface.co/munggok))
## Future work
We would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains
if we can get the necessary hardware resources.
|
{"language": "id", "widget": [{"text": "Sewindu sudah kita tak berjumpa, rinduku padamu sudah tak terkira."}]}
|
text-generation
|
flax-community/gpt2-small-indonesian
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"id",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"id"
] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
GPT2-small-indonesian
=====================
This is a pretrained model on Indonesian language using a causal language modeling (CLM) objective, which was first
introduced in this paper
and first released at this page.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week
organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
The demo can be found here.
How to use
----------
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness,
we set a seed for reproducibility:
Here is how to use this model to get the features of a given text in PyTorch:
and in TensorFlow:
Limitations and bias
--------------------
The training data used for this model are Indonesian websites of OSCAR,
mc4 and Wikipedia. The datasets
contain a lot of unfiltered content from the internet, which is far from neutral. While we have done some filtering on
the dataset (see the Training data section), the filtering is by no means a thorough mitigation of biased content
that is eventually used by the training data. These biases might also affect models that are fine-tuned using this model.
As the openAI team themselves point out in their model card:
>
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases
> that require the generated text to be true.
>
>
>
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we
> do not recommend that they be deployed into systems that interact with humans > unless the deployers first carry
> out a study of biases relevant to the intended use-case. We found no statistically significant difference in gender,
> race, and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with
> similar levels of caution around use cases that are sensitive to biases around human attributes.
>
>
>
We have done a basic bias analysis that you can find in this notebook, performed on Indonesian GPT2 medium, based on the bias analysis for Polish GPT2 with modifications.
### Gender bias
We generated 50 texts starting with prompts "She/He works as". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.
!gender bias - male
The most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).
!gender bias - female
### Ethnicity bias
We generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:
* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)
* Topic - we will use 5 different topics:
+ random act: *entered home*
+ said: *said*
+ works as: *works as*
+ intent: *let [person] ...*
+ define: *is*
Sample of generated prompt: "seorang perempuan sunda masuk ke rumah..." (a Sundanese woman enters the house...)
We used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.
!bias analysis - ethnicities
### Religion bias
With the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.
The following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.
!bias analysis - ethnicities
Training data
-------------
The model was trained on a combined dataset of OSCAR, mc4
and Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB
of data in total. The mc4 dataset was cleaned using this filtering script
and we also only included links that have been cited by the Indonesian Wikipedia.
Training procedure
------------------
The model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '4d 14h 50m 47s'.
### Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
### Tracking
The training process was tracked in TensorBoard and Weights and Biases.
Team members
------------
* Akmal (@Wikidepia)
* alvinwatner (@alvinwatner)
* Cahya Wirawan (@cahya)
* Galuh Sahid (@Galuh)
* Muhammad Agung Hambali (@AyameRushia)
* Muhammad Fhadli (@muhammadfhadli)
* Samsul Rahmadani (@munggok)
Future work
-----------
We would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains
if we can get the necessary hardware resources.
|
[
"### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female",
"### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities",
"### Religion bias\n\n\nWith the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities\n\n\nTraining data\n-------------\n\n\nThe model was trained on a combined dataset of OSCAR, mc4\nand Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB\nof data in total. The mc4 dataset was cleaned using this filtering script\nand we also only included links that have been cited by the Indonesian Wikipedia.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '4d 14h 50m 47s'.",
"### Evaluation results\n\n\nThe model achieves the following results without any fine-tuning (zero-shot):",
"### Tracking\n\n\nThe training process was tracked in TensorBoard and Weights and Biases.\n\n\nTeam members\n------------\n\n\n* Akmal (@Wikidepia)\n* alvinwatner (@alvinwatner)\n* Cahya Wirawan (@cahya)\n* Galuh Sahid (@Galuh)\n* Muhammad Agung Hambali (@AyameRushia)\n* Muhammad Fhadli (@muhammadfhadli)\n* Samsul Rahmadani (@munggok)\n\n\nFuture work\n-----------\n\n\nWe would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains\nif we can get the necessary hardware resources."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female",
"### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities",
"### Religion bias\n\n\nWith the same methodology above, we generated 1,400 texts to assess bias across religion and gender vectors. We will assess 6 religions: Islam, Protestan (Protestant), Katolik (Catholic), Buddha (Buddhism), Hindu (Hinduism), and Khonghucu (Confucianism) with Neutral (no religion) as a baseline.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some religions score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities\n\n\nTraining data\n-------------\n\n\nThe model was trained on a combined dataset of OSCAR, mc4\nand Wikipedia for the Indonesian language. We have filtered and reduced the mc4 dataset so that we end up with 29 GB\nof data in total. The mc4 dataset was cleaned using this filtering script\nand we also only included links that have been cited by the Indonesian Wikipedia.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a TPUv3-8 VM provided by the Google Cloud team. The training duration was '4d 14h 50m 47s'.",
"### Evaluation results\n\n\nThe model achieves the following results without any fine-tuning (zero-shot):",
"### Tracking\n\n\nThe training process was tracked in TensorBoard and Weights and Biases.\n\n\nTeam members\n------------\n\n\n* Akmal (@Wikidepia)\n* alvinwatner (@alvinwatner)\n* Cahya Wirawan (@cahya)\n* Galuh Sahid (@Galuh)\n* Muhammad Agung Hambali (@AyameRushia)\n* Muhammad Fhadli (@muhammadfhadli)\n* Samsul Rahmadani (@munggok)\n\n\nFuture work\n-----------\n\n\nWe would like to pre-train further the models with larger and cleaner datasets and fine-tune it to specific domains\nif we can get the necessary hardware resources."
] |
[
60,
127,
264,
259,
23,
140
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #id #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### Gender bias\n\n\nWe generated 50 texts starting with prompts \"She/He works as\". After doing some preprocessing (lowercase and stopwords removal) we obtain texts that are used to generate word clouds of female/male professions. The most salient terms for male professions are: driver, sopir (driver), ojek, tukang, online.\n\n\n!gender bias - male\n\n\nThe most salient terms for female professions are: pegawai (employee), konsultan (consultant), asisten (assistant).\n\n\n!gender bias - female### Ethnicity bias\n\n\nWe generated 1,200 texts to assess bias across ethnicity and gender vectors. We will create prompts with the following scheme:\n\n\n* Person - we will assess 5 ethnicities: Sunda, Batak, Minahasa, Dayak, Asmat, Neutral (no ethnicity)\n* Topic - we will use 5 different topics:\n\t+ random act: *entered home*\n\t+ said: *said*\n\t+ works as: *works as*\n\t+ intent: *let [person] ...*\n\t+ define: *is*\n\n\nSample of generated prompt: \"seorang perempuan sunda masuk ke rumah...\" (a Sundanese woman enters the house...)\n\n\nWe used a model trained on Indonesian hate speech corpus (dataset 1, dataset 2) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the ethnicity and gender from the generated text before running the hate speech detector.\n\n\nThe following chart demonstrates the intensity of hate speech associated with the generated texts with outlier scores removed. Some ethnicities score higher than the neutral baseline.\n\n\n!bias analysis - ethnicities"
] |
[
-0.04489096254110336,
0.06275404244661331,
-0.004786321427673101,
-0.02615656889975071,
0.0868164598941803,
-0.0016813717084005475,
0.1643567532300949,
0.06583932042121887,
0.10427580773830414,
0.06171996891498566,
0.04013139754533768,
0.015930458903312683,
0.15617109835147858,
0.0700860470533371,
0.0922953262925148,
-0.20395231246948242,
0.06766536086797714,
-0.04951043426990509,
0.04493051394820213,
0.14349710941314697,
0.11145194619894028,
-0.03259984031319618,
0.06576304137706757,
0.014279777184128761,
-0.00850764662027359,
-0.022755952551960945,
-0.08095331490039825,
-0.004414992406964302,
0.039373233914375305,
0.051418982446193695,
0.07321556657552719,
0.08044420182704926,
-0.06345728039741516,
-0.17520290613174438,
0.02034158818423748,
-0.02517702989280224,
0.01986014097929001,
-0.013233442790806293,
0.08011128008365631,
-0.028625469654798508,
0.16101671755313873,
-0.0643794909119606,
-0.032937999814748764,
0.09092523902654648,
-0.135210320353508,
-0.042141951620578766,
-0.06090759113430977,
0.07688608765602112,
0.10064081847667694,
0.05376177653670311,
-0.08734073489904404,
0.09212926030158997,
-0.037607260048389435,
0.02916876971721649,
0.09932209551334381,
-0.0478072389960289,
0.0004977171774953604,
-0.09855789691209793,
0.043582916259765625,
0.16484731435775757,
-0.07984182238578796,
-0.08112022280693054,
-0.011572451330721378,
-0.0246422179043293,
-0.00013500767818186432,
-0.09217078238725662,
0.04196534678339958,
-0.07180270552635193,
-0.11941486597061157,
-0.05932609736919403,
0.1309184730052948,
0.07864393293857574,
-0.09213370084762573,
-0.11853981763124466,
-0.003014232963323593,
0.037364471703767776,
-0.014451744966208935,
0.009714540094137192,
-0.07956606149673462,
-0.03588274493813515,
0.12624245882034302,
-0.041165657341480255,
-0.09947405755519867,
0.029436230659484863,
0.07592351734638214,
0.0672244280576706,
0.01782429963350296,
0.013845459558069706,
-0.05576953664422035,
0.011337492614984512,
0.06298565119504929,
-0.0988190695643425,
-0.06380608677864075,
-0.05748876556754112,
-0.11530966311693192,
0.04743875190615654,
0.007575251627713442,
-0.08895516395568848,
0.05166751518845558,
0.013885307125747204,
0.042854342609643936,
0.08663172274827957,
-0.058594025671482086,
0.07127916812896729,
0.08075334131717682,
0.10443511605262756,
0.025236446410417557,
-0.056545600295066833,
-0.018890973180532455,
0.027314765378832817,
0.005256336648017168,
0.006242147646844387,
-0.02104334719479084,
0.014256635680794716,
0.0007832144619897008,
0.14945143461227417,
0.003561286721378565,
0.12923476099967957,
-0.0614289715886116,
-0.04260526970028877,
0.1588594615459442,
-0.12048999965190887,
0.00542640034109354,
0.031263601034879684,
-0.03712959215044975,
-0.034359339624643326,
-0.009111103601753712,
-0.008623450063169003,
-0.062452953308820724,
-0.06062769889831543,
-0.024964073672890663,
0.03657094016671181,
-0.11276771873235703,
-0.10053031891584396,
0.04067963361740112,
-0.10241757333278656,
-0.054215360432863235,
-0.12453606724739075,
-0.17200039327144623,
-0.11376003921031952,
-0.04522956162691116,
-0.038015589118003845,
-0.02458176575601101,
-0.11186863481998444,
-0.01580834574997425,
0.015372583642601967,
-0.00468224985525012,
-0.0439908467233181,
0.0049478402361273766,
0.09945935755968094,
-0.048803068697452545,
0.10391699522733688,
0.04517488181591034,
0.04004789888858795,
-0.1142410933971405,
0.0751446783542633,
-0.13437190651893616,
0.0825745016336441,
-0.0826103687286377,
0.06576205044984818,
-0.09879343956708908,
-0.031334567815065384,
-0.054692018777132034,
0.057152774184942245,
-0.0340275801718235,
0.19022591412067413,
-0.16816595196723938,
-0.02552792802453041,
0.20757240056991577,
-0.15230755507946014,
-0.07276777923107147,
0.07549554854631424,
0.01995103806257248,
0.25765350461006165,
0.02813834138214588,
0.17638307809829712,
-0.03238378465175629,
0.06157878041267395,
0.050732776522636414,
-0.09403111040592194,
-0.07563401758670807,
0.08935600519180298,
0.019188884645700455,
-0.10312877595424652,
0.013770036399364471,
-0.01417951937764883,
0.034093908965587616,
0.042330510914325714,
-0.008200707845389843,
-0.07484795153141022,
0.07253947108983994,
-0.020120641216635704,
-0.04531698673963547,
-0.021415675058960915,
-0.042732786387205124,
-0.019425248727202415,
-0.07009349763393402,
-0.06976904720067978,
0.04095856845378876,
-0.05901651084423065,
0.03578784316778183,
-0.13712723553180695,
-0.05958123505115509,
0.03406418859958649,
0.028017351403832436,
-0.14160814881324768,
0.007125725504010916,
-0.004182835109531879,
-0.04410940408706665,
0.06686476618051529,
0.030216477811336517,
0.03675389662384987,
-0.030101478099822998,
0.0231024082750082,
-0.04115072637796402,
0.050048213452100754,
-0.031806718558073044,
-0.035797182470560074,
-0.07640838623046875,
-0.0018914187094196677,
-0.01677260734140873,
0.1582290530204773,
-0.07368727773427963,
0.026021292433142662,
0.012794267386198044,
-0.0066567035391926765,
0.0361417680978775,
-0.052017465233802795,
0.07941952347755432,
0.07543177157640457,
-0.00401352159678936,
-0.02401915192604065,
0.04098919779062271,
-0.006226236000657082,
-0.06152769923210144,
0.16778241097927094,
-0.17676664888858795,
-0.2145194560289383,
0.032340437173843384,
0.004378214478492737,
-0.0990326926112175,
0.10354606062173843,
-0.04144890233874321,
-0.025958359241485596,
0.023400546982884407,
-0.09138214588165283,
-0.07061412930488586,
0.009759180247783661,
0.039154358208179474,
-0.014949943870306015,
-0.02954651042819023,
0.006378857884556055,
-0.021512873470783234,
-0.05502314120531082,
0.08550170063972473,
0.11305650323629379,
-0.16707752645015717,
0.1278413087129593,
-0.043406110256910324,
0.04108860343694687,
0.12656420469284058,
0.005073213018476963,
-0.06491792947053909,
-0.02182544395327568,
0.06959240138530731,
-0.01773097552359104,
0.054456017911434174,
-0.23255108296871185,
-0.04634300246834755,
0.03196325525641441,
-0.04117218032479286,
-0.002866873051971197,
-0.03909071162343025,
-0.008162363432347775,
-0.018935825675725937,
-0.003426983254030347,
-0.046229515224695206,
0.050223544239997864,
0.015099292621016502,
0.09634140878915787,
-0.02860342524945736,
-0.021958565339446068,
-0.0007912915898486972,
-0.04876403883099556,
-0.156667098402977,
0.06637033075094223,
-0.004986380226910114,
-0.2770151197910309,
-0.0716325044631958,
0.040597524493932724,
0.04667360708117485,
-0.005473010707646608,
0.08029156178236008,
-0.056542761623859406,
-0.0737883448600769,
-0.09799866378307343,
0.15215235948562622,
0.028531569987535477,
0.007924959063529968,
-0.03851770982146263,
-0.013891804032027721,
0.02216138318181038,
-0.05919528752565384,
-0.005283677019178867,
0.03568321466445923,
-0.022953689098358154,
0.11588326096534729,
-0.062223292887210846,
0.11882288008928299,
0.083025723695755,
0.08751678466796875,
-0.02362169697880745,
-0.03105797804892063,
0.29500555992126465,
-0.12349864095449448,
0.1082409918308258,
-0.020726259797811508,
-0.14609333872795105,
0.0609184205532074,
0.16121463477611542,
-0.030159812420606613,
-0.0740416869521141,
0.05224674940109253,
0.07931829243898392,
0.002218795707449317,
-0.17360098659992218,
-0.01553620956838131,
-0.07211554795503616,
-0.000056332690292038023,
-0.023693308234214783,
0.06600984930992126,
0.06627184897661209,
0.04727620258927345,
-0.11802396178245544,
-0.006462098099291325,
0.02791876718401909,
0.049698665738105774,
-0.04133177921175957,
-0.006302962079644203,
0.0028611270245164633,
-0.09843285381793976,
-0.08167684078216553,
-0.027223411947488785,
-0.09470132738351822,
0.23771637678146362,
0.08911001682281494,
0.1657843291759491,
0.15932615101337433,
0.10471352189779282,
0.034818898886442184,
0.016049357131123543,
-0.03410033509135246,
0.03474897891283035,
-0.02581661567091942,
-0.0634244978427887,
-0.012777182273566723,
0.07623308151960373,
0.025920137763023376,
-0.10705152899026871,
0.055151280015707016,
-0.12113255262374878,
0.07255881279706955,
0.14734186232089996,
-0.030325084924697876,
-0.06640291213989258,
-0.007854390889406204,
0.0855606347322464,
-0.04916951432824135,
-0.04926488175988197,
-0.07277592271566391,
0.0514618456363678,
-0.10326845198869705,
0.06373179703950882,
0.010437026619911194,
0.06855442374944687,
-0.04453966021537781,
-0.00013600834063254297,
-0.07883434742689133,
-0.04705927520990372,
-0.03213423490524292,
0.11118632555007935,
-0.27964577078819275,
0.21198584139347076,
0.026466721668839455,
0.042919185012578964,
-0.135472372174263,
-0.07549015432596207,
-0.015517636202275753,
-0.08859428018331528,
0.15059541165828705,
0.03211711719632149,
-0.1079828068614006,
-0.11682774126529694,
-0.0059679909609258175,
0.026042291894555092,
0.11734745651483536,
-0.031732089817523956,
0.12861895561218262,
0.04347239062190056,
-0.002474364824593067,
-0.043918266892433167,
0.0954575166106224,
-0.16210225224494934,
-0.10646453499794006,
0.042752455919981,
-0.029505880549550056,
0.10047132521867752,
0.0004170217434875667,
-0.0008477639639750123,
-0.04958498105406761,
0.15469837188720703,
-0.21925541758537292,
-0.13029475510120392,
-0.06383874267339706,
-0.035592060536146164,
0.058841772377491,
-0.09281379729509354,
-0.06518245488405228,
0.04716673865914345,
0.10304185003042221,
-0.04015552997589111,
-0.014222221449017525,
0.05023610219359398,
-0.007702202536165714,
-0.11949686706066132,
-0.054228540509939194,
0.1046336367726326,
0.1774710714817047,
0.07950203120708466,
-0.05513283610343933,
-0.028208840638399124,
0.04824677109718323,
-0.11067336052656174,
0.03528566285967827,
0.10841435939073563,
-0.17052654922008514,
0.020787248387932777,
0.002097430406138301,
-0.056752707809209824,
-0.16519726812839508,
-0.15009792149066925,
0.07863721251487732,
0.23687177896499634,
0.017698902636766434,
0.11080516129732132,
0.0761948823928833,
-0.0592135451734066,
-0.18890392780303955,
-0.05563625320792198,
0.03273911401629448,
0.020794348791241646,
0.027821412310004234,
-0.1748761385679245,
-0.14862266182899475,
0.01989855244755745,
0.022203529253602028,
0.021725749596953392,
-0.25481948256492615,
-0.08537107706069946,
0.10443674772977829,
0.052277181297540665,
0.06785009056329727,
-0.1374172419309616,
-0.07449858635663986,
-0.03973868489265442,
-0.07214745134115219,
0.13526739180088043,
0.08284594863653183,
0.028839262202382088,
-0.0016530337743461132,
0.13759706914424896,
0.03293934464454651,
-0.029587699100375175,
0.13298918306827545,
0.044269587844610214,
0.0358281247317791,
-0.13271528482437134,
-0.021146779879927635,
0.08944305032491684,
0.018375657498836517,
0.12010505795478821,
-0.016962865367531776,
-0.128730908036232,
-0.09143435955047607,
-0.05376395955681801,
-0.14487136900424957,
0.010909629054367542,
-0.06340937316417694,
0.0018563588382676244,
-0.07419843226671219,
0.06594225764274597,
0.06815208494663239,
0.03013487718999386,
0.024315401911735535,
-0.07226715236902237,
0.00011414341861382127,
0.019993344321846962,
0.12323511391878128,
0.0844239816069603,
-0.03284722939133644,
-0.038666386157274246,
0.02445608749985695,
0.0736997127532959,
-0.054498493671417236,
-0.013133129104971886,
0.04491177573800087,
-0.04067424684762955,
0.214471697807312,
-0.03063875436782837,
-0.1821209192276001,
0.07158331573009491,
0.09442627429962158,
-0.025711575523018837,
-0.2418290227651596,
0.010594678111374378,
0.0005465000285767019,
-0.1425148993730545,
-0.1407657265663147,
0.03285687789320946,
-0.013773538172245026,
-0.009136601351201534,
-0.03313335031270981,
0.10464591532945633,
-0.031609032303094864,
0.08259003609418869,
0.04571054130792618,
0.07068806886672974,
-0.018702661618590355,
0.045432768762111664,
0.09635515511035919,
-0.10466364026069641,
0.059241559356451035,
0.17983917891979218,
-0.08383239060640335,
-0.06225695461034775,
-0.029310276731848717,
0.08783617615699768,
-0.0571293942630291,
-0.0753815770149231,
-0.007912895642220974,
-0.1032925546169281,
0.004221760202199221,
0.1029040664434433,
-0.054570142179727554,
0.11362680047750473,
0.09993942081928253,
-0.017270294949412346,
-0.011518009006977081,
0.008511144667863846,
0.01295475009828806,
-0.03357672318816185,
0.05458831414580345,
0.0586940236389637,
0.0780046284198761,
0.06904910504817963,
0.01752060279250145,
-0.09580680727958679,
-0.12585744261741638,
0.03647531196475029,
-0.06574005633592606,
0.022270625457167625,
-0.17325814068317413,
-0.05751469358801842,
0.026208313181996346,
-0.03243623301386833,
0.010279177688062191,
-0.007473247591406107,
-0.07743050158023834,
-0.010020905174314976,
-0.03219327703118324,
0.0969114601612091,
-0.08521804213523865,
-0.01547974068671465,
0.0895637646317482,
-0.044260092079639435,
0.03937050327658653,
0.050325557589530945,
-0.1056080013513565,
-0.008810373954474926,
-0.12079755961894989,
0.0991281047463417,
-0.01254870742559433,
-0.04031502828001976,
-0.010700108483433723,
-0.09570576995611191,
-0.05572086200118065,
0.02010323666036129,
0.007512439042329788,
0.015057852491736412,
0.07760298997163773,
-0.05014031380414963,
0.1621338129043579,
0.027806943282485008,
-0.04532940313220024,
-0.08377421647310257,
0.05150414630770683,
-0.055146943777799606,
-0.04616279527544975,
0.10187141597270966,
-0.04598968103528023,
0.029975268989801407,
-0.10942929983139038,
0.012739877216517925,
0.04251421242952347,
0.08541598916053772,
0.05305441468954086,
-0.08355598896741867,
0.04928922280669212,
-0.007153060287237167,
0.09811592847108841,
-0.023382779210805893,
-0.0502714179456234,
0.03702147677540779,
-0.0037830867804586887,
-0.06960707902908325,
0.010753002017736435,
0.008189314976334572,
-0.01738584041595459,
-0.04668257758021355,
0.00943882204592228,
-0.06063514202833176,
-0.06677243113517761,
0.03343476355075836,
0.17517678439617157,
0.08440253138542175,
0.23395662009716034,
0.0592649020254612,
0.024054300040006638,
-0.032861288636922836,
-0.03178864344954491,
-0.04614191874861717,
0.038243383169174194,
-0.01789073273539543,
-0.06212492287158966,
0.1722143292427063,
0.1309148520231247,
-0.0779784694314003,
0.0859980583190918,
-0.0415278784930706,
-0.07496354728937149,
-0.01702730916440487,
-0.36600494384765625,
-0.013213113881647587,
-0.013835597783327103,
0.010394357144832611,
-0.02589366026222706,
0.03789278119802475,
0.0332937054336071,
-0.012045007199048996,
-0.06702645868062973,
0.08569607883691788,
0.055204037576913834,
-0.14510782063007355,
0.012837987393140793,
-0.01514274813234806,
0.03438728675246239,
-0.011967222206294537,
0.0847601592540741,
0.027267372235655785,
0.07454854995012283,
0.07161585986614227,
0.10687445104122162,
-0.0266721174120903,
0.03329722210764885,
-0.15399865806102753,
-0.11494921892881393,
0.007673783227801323,
0.05666377767920494,
0.07642533630132675,
0.22573065757751465,
0.06106596440076828,
0.03962673246860504,
-0.0017931463662534952,
0.0713375136256218,
0.05781592056155205,
-0.1007678359746933,
-0.06509315967559814,
0.1286212056875229,
0.021634278818964958,
-0.06938274204730988,
-0.0064484430477023125,
-0.13178426027297974,
0.045431748032569885,
0.24294251203536987,
0.1064377874135971,
0.027011269703507423,
0.026028171181678772,
-0.11302181333303452,
0.026211299002170563,
0.014281541109085083,
0.06471679359674454,
0.024610426276922226,
0.24349923431873322,
-0.08264405280351639,
0.11035715788602829,
-0.003869179170578718,
-0.009645436890423298,
0.022309958934783936,
0.11613249033689499,
-0.01833045296370983,
-0.010447359643876553,
-0.07366091012954712,
0.11065462976694107,
-0.20265348255634308,
-0.20579813420772552,
0.011052621528506279,
0.006913293153047562,
-0.056220605969429016,
0.03710523620247841,
0.019609859213232994,
0.1160273477435112,
0.06499335914850235,
0.005385798867791891,
-0.03145871311426163,
0.15231850743293762,
0.02434299886226654,
-0.03431156277656555,
-0.06997192651033401,
0.07748283445835114,
-0.16290344297885895,
0.12649567425251007,
0.041480619460344315,
0.09383337944746017,
0.06545894593000412,
0.019678082317113876,
-0.12143974751234055,
0.10712925344705582,
0.03099404275417328,
0.09308135509490967,
0.014119742438197136,
0.21414147317409515,
0.03028566762804985,
0.05938900634646416,
0.07968706637620926,
0.06837338954210281,
0.08655181527137756,
0.06018725037574768,
0.02004103921353817,
-0.06262839585542679,
0.062323614954948425,
-0.07755973935127258,
0.07764877378940582,
0.09714660793542862,
-0.03674471750855446,
0.014869226142764091,
-0.054931312799453735,
-0.055730827152729034,
-0.046601198613643646,
0.058971017599105835,
-0.07727611064910889,
-0.19119423627853394,
-0.00455511175096035,
0.05685609579086304,
0.10800616443157196,
-0.19270089268684387,
-0.0009190746350213885,
-0.05023046210408211,
-0.0060057686641812325,
0.04162171483039856,
0.08924314379692078,
-0.040074821561574936,
0.028660697862505913,
-0.03701185807585716,
-0.19352443516254425,
0.03721380606293678,
0.13504037261009216,
-0.08759665489196777,
-0.005941243842244148
] |
null | null |
transformers
|
## GPT2 in Swahili
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-swahili")
model = AutoModelWithLMHead.from_pretrained("flax-community/gpt2-swahili")
print(round((model.num_parameters())/(1000*1000)),"Million Parameters")
124 Million Parameters
```
#### **Training Data**:
This model was trained on [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi)
#### **More Details**:
For more details and Demo please check [HF Swahili Space](https://huggingface.co/spaces/flax-community/Swahili)
|
{"language": "sw", "datasets": ["flax-community/swahili-safi"], "widget": [{"text": "Ninitaka kukula"}]}
|
text-generation
|
flax-community/gpt2-swahili
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"gpt2",
"text-generation",
"sw",
"dataset:flax-community/swahili-safi",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sw"
] |
TAGS
#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
## GPT2 in Swahili
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
#### Training Data:
This model was trained on Swahili Safi
#### More Details:
For more details and Demo please check HF Swahili Space
|
[
"## GPT2 in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use",
"#### Training Data:\nThis model was trained on Swahili Safi",
"#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"## GPT2 in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.",
"## How to use",
"#### Training Data:\nThis model was trained on Swahili Safi",
"#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
78,
63,
4,
16,
18
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #gpt2 #text-generation #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n## GPT2 in Swahili\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.## How to use#### Training Data:\nThis model was trained on Swahili Safi#### More Details:\nFor more details and Demo please check HF Swahili Space"
] |
[
-0.08302655071020126,
0.1107543557882309,
0.000716165523044765,
0.08710616827011108,
0.13713078200817108,
0.013195930980145931,
0.12414836138486862,
0.07898033410310745,
-0.014174739830195904,
-0.006233479827642441,
0.10954726487398148,
-0.021879892796278,
0.1152745932340622,
0.20515692234039307,
0.0235297828912735,
-0.26317155361175537,
-0.025005970150232315,
0.008171823807060719,
-0.11670682579278946,
0.15658697485923767,
0.11945163458585739,
-0.08522158861160278,
0.07375597208738327,
-0.01903976872563362,
-0.173836812376976,
0.0002109933120664209,
-0.0615353025496006,
-0.09006018191576004,
0.12255492061376572,
0.049152955412864685,
0.06871920824050903,
0.06455074995756149,
0.07855437695980072,
-0.07805971801280975,
0.05293625220656395,
0.038375794887542725,
-0.03766927495598793,
0.05413879081606865,
0.024168748408555984,
0.014971795491874218,
0.1636299043893814,
-0.042531970888376236,
-0.023895541206002235,
0.06865662336349487,
-0.1042773425579071,
-0.17600318789482117,
-0.09865642338991165,
0.06853031367063522,
0.07829823344945908,
0.07626509666442871,
-0.007967722602188587,
0.12664204835891724,
-0.10585923492908478,
0.08309626579284668,
0.13533039391040802,
-0.18510028719902039,
-0.09386089444160461,
0.1550380438566208,
0.09910687059164047,
0.05051438882946968,
-0.07451729476451874,
0.0733465626835823,
0.025571348145604134,
0.052709873765707016,
0.11435022205114365,
-0.052669841796159744,
-0.11838348954916,
0.003346709068864584,
-0.09765233844518661,
-0.037440668791532516,
0.2801125943660736,
0.014488020911812782,
-0.031585484743118286,
-0.09624075144529343,
-0.014821762219071388,
0.08336705714464188,
-0.049345046281814575,
-0.03747881203889847,
0.03529994562268257,
-0.0017291848780587316,
-0.024344557896256447,
-0.10736151039600372,
-0.09699743986129761,
-0.08919107168912888,
0.0031769555062055588,
0.07206187397241592,
-0.008946447633206844,
0.04858391731977463,
-0.14359652996063232,
0.09824472665786743,
0.0532066784799099,
-0.07994578033685684,
-0.010686900466680527,
-0.007742487359791994,
-0.006421269848942757,
0.014613869599997997,
0.02618958055973053,
-0.14660632610321045,
0.06896822154521942,
0.01532691065222025,
-0.027883615344762802,
0.024782557040452957,
0.09335579723119736,
0.05055426433682442,
0.007274547126144171,
0.12517455220222473,
-0.05980924516916275,
-0.14116893708705902,
0.07729817181825638,
-0.03856722265481949,
-0.03946860879659653,
-0.06932191550731659,
-0.09302908182144165,
-0.06992544233798981,
-0.01606358028948307,
0.030757462605834007,
0.04664459452033043,
0.0657203271985054,
0.023187795653939247,
-0.04592589661478996,
0.004588950425386429,
-0.09861820936203003,
0.004928378388285637,
-0.013053790666162968,
-0.038912855088710785,
0.012739308178424835,
0.09539961069822311,
0.007584167644381523,
-0.07610902190208435,
-0.06272630393505096,
-0.01989058218896389,
0.008181228302419186,
-0.046007901430130005,
-0.01909223385155201,
0.04092223197221756,
-0.09276695549488068,
-0.025826070457696915,
-0.15637965500354767,
-0.25898343324661255,
-0.052861928939819336,
0.029906371608376503,
0.018113600090146065,
-0.07689283043146133,
-0.0291499812155962,
0.004698757082223892,
-0.00039440038381144404,
-0.02580351196229458,
0.17110906541347504,
-0.04064187407493591,
0.061730097979307175,
-0.05691472440958023,
0.0658123716711998,
-0.03812604770064354,
0.05238360911607742,
-0.010987655259668827,
0.05644022300839424,
-0.1352405697107315,
0.11693840473890305,
-0.06634785234928131,
0.10557291656732559,
-0.11241338402032852,
-0.0056950668804347515,
0.02352653443813324,
-0.0037911045365035534,
0.04488731548190117,
0.21301883459091187,
-0.24316221475601196,
-0.0531732514500618,
0.19060033559799194,
-0.07092994451522827,
-0.13353601098060608,
0.162581667304039,
-0.031124994158744812,
0.162028506398201,
0.05846921727061272,
0.18778075277805328,
0.06738077104091644,
-0.08905655145645142,
-0.02027938887476921,
0.06594445556402206,
-0.07368737459182739,
-0.10217023640871048,
0.06897486001253128,
0.02935008704662323,
-0.035893913358449936,
0.04619884118437767,
-0.06766872853040695,
0.13028918206691742,
-0.04534969478845596,
-0.06331322342157364,
-0.04513459652662277,
-0.11496318876743317,
0.030435215681791306,
0.043958403170108795,
0.06711124628782272,
-0.015263492241501808,
-0.036528974771499634,
-0.025793181732296944,
0.07693342119455338,
-0.06522858887910843,
0.037460483610630035,
-0.06363968551158905,
0.10541653633117676,
-0.03136241436004639,
0.03754337877035141,
-0.0767919272184372,
-0.04838363453745842,
-0.03618278354406357,
0.0859789326786995,
-0.01132851094007492,
0.008926686830818653,
0.08873680233955383,
-0.019923267886042595,
-0.04606008529663086,
0.007355072535574436,
-0.008311648853123188,
0.001247707405127585,
-0.07787494361400604,
-0.09064293652772903,
-0.050816889852285385,
-0.029708782210946083,
0.12208713591098785,
-0.16066575050354004,
0.06719589233398438,
-0.001484078704379499,
0.11478506028652191,
-0.042166732251644135,
-0.008463756181299686,
0.06649322807788849,
-0.010819703340530396,
-0.0026686235796660185,
-0.08341696858406067,
0.05563667044043541,
0.02654447965323925,
-0.14868144690990448,
0.13659048080444336,
-0.026645256206393242,
0.10269316285848618,
0.10598643124103546,
-0.036249708384275436,
-0.01822410523891449,
0.010657512582838535,
-0.04730444774031639,
-0.030048217624425888,
-0.05609063431620598,
0.04887969419360161,
0.15453460812568665,
0.01077826414257288,
0.14811640977859497,
-0.06735487282276154,
-0.047308992594480515,
0.019876398146152496,
-0.055310823023319244,
0.004094199277460575,
0.10742556303739548,
-0.009563883766531944,
-0.049129363149404526,
0.09011613577604294,
0.12692800164222717,
0.030987130478024483,
0.2596653997898102,
-0.03077288344502449,
-0.027013732120394707,
-0.04670224338769913,
-0.03653226047754288,
-0.03375373035669327,
0.11881381273269653,
-0.19702517986297607,
-0.06166564300656319,
0.03576372563838959,
0.00848817266523838,
0.032468557357788086,
-0.10671178251504898,
-0.046537190675735474,
-0.0059149316512048244,
-0.06983684748411179,
-0.02119223028421402,
0.06738830357789993,
-0.07298839837312698,
0.06982270628213882,
-0.012420368380844593,
-0.013661923818290234,
0.0480024553835392,
0.00167517748195678,
-0.09647633880376816,
0.170758917927742,
-0.025058837607502937,
-0.2682604193687439,
-0.023929951712489128,
-0.027725234627723694,
0.012550062499940395,
-0.03887888044118881,
0.02638821303844452,
-0.06678158044815063,
-0.011505723930895329,
-0.03683137893676758,
-0.007519426755607128,
-0.029895855113863945,
-0.0030796087812632322,
-0.0036209782119840384,
-0.0168225709348917,
0.03152695298194885,
-0.11243828386068344,
0.003306414233520627,
-0.0014734722208231688,
-0.057145051658153534,
0.07243912667036057,
-0.056345440447330475,
0.10851004719734192,
0.10384435206651688,
-0.0483129546046257,
0.09099392592906952,
-0.005792444106191397,
0.22152100503444672,
-0.1467667818069458,
0.10464072972536087,
0.19481082260608673,
0.018022999167442322,
0.023992689326405525,
0.013308466412127018,
0.017478927969932556,
-0.0433676578104496,
-0.00430078012868762,
-0.030421506613492966,
-0.1260538250207901,
-0.22311219573020935,
-0.0679147019982338,
-0.07431779056787491,
0.0453789122402668,
-0.037738967686891556,
0.07732238620519638,
-0.013027445413172245,
0.07004473358392715,
0.025208406150341034,
0.047592874616384506,
0.00020568947365973145,
0.015652062371373177,
-0.07683718949556351,
-0.035542476922273636,
0.04883579537272453,
-0.09125415235757828,
-0.010984188877046108,
0.06800426542758942,
0.0457751527428627,
0.08061442524194717,
-0.013388809747993946,
0.07179393619298935,
0.061147771775722504,
0.20385673642158508,
0.057465896010398865,
0.04207732900977135,
0.03655590862035751,
-0.03036397695541382,
-0.020999383181333542,
-0.018805472180247307,
-0.04719223082065582,
0.018038837239146233,
0.060719024389982224,
-0.01971171237528324,
-0.04014018923044205,
-0.03377443924546242,
0.03684878349304199,
0.20667584240436554,
-0.027396412566304207,
-0.16954123973846436,
-0.08161696791648865,
0.04274673014879227,
-0.03576679155230522,
-0.09079636633396149,
-0.002159127965569496,
0.15826435387134552,
-0.18613919615745544,
0.04912569746375084,
-0.03641481325030327,
0.12082844227552414,
-0.0074908616952598095,
-0.004952785558998585,
-0.04365077242255211,
0.006901385262608528,
-0.03133157268166542,
0.10025005042552948,
-0.3292543292045593,
0.1573563665151596,
0.006813983432948589,
0.08913111686706543,
-0.07977230846881866,
-0.006865848787128925,
0.07417082786560059,
0.1738513559103012,
0.17751610279083252,
0.06382186710834503,
0.06910302489995956,
-0.019944442436099052,
-0.0973973274230957,
0.03093496523797512,
-0.046030815690755844,
-0.02712133154273033,
0.016036968678236008,
0.04019450023770332,
0.005573550704866648,
-0.008299247361719608,
0.07176971435546875,
-0.13996756076812744,
-0.09783586114645004,
0.03958161920309067,
0.05616793408989906,
0.03237701952457428,
-0.06148301064968109,
-0.08211448043584824,
-0.18752287328243256,
0.09951339662075043,
0.039855338633060455,
-0.0691266655921936,
-0.17484822869300842,
-0.015144996345043182,
0.011943032033741474,
-0.06240537390112877,
0.004554159473627806,
0.035788167268037796,
0.04674803093075752,
-0.059457551687955856,
-0.06547475606203079,
0.038442470133304596,
-0.09892725944519043,
-0.1251821666955948,
-0.004673232324421406,
0.06912773847579956,
0.0900888666510582,
0.03195057064294815,
0.050768136978149414,
-0.021911675110459328,
-0.07396269589662552,
-0.11531353741884232,
0.06902893632650375,
0.07919517159461975,
0.03656027466058731,
-0.005195281933993101,
0.10585341602563858,
-0.014574590139091015,
0.017095686867833138,
-0.06712193042039871,
0.17504845559597015,
0.1133655458688736,
-0.08963451534509659,
0.15125320851802826,
0.10841316729784012,
-0.060833919793367386,
-0.2978960871696472,
-0.045283593237400055,
0.008704782463610172,
0.07238136976957321,
0.012104201130568981,
-0.11561236530542374,
0.04229388013482094,
-0.02300412580370903,
-0.0315229631960392,
-0.03437641263008118,
-0.2692621648311615,
-0.10875029116868973,
0.10797517001628876,
0.06813521683216095,
0.19125905632972717,
-0.10897652804851532,
-0.02453949861228466,
-0.03290821239352226,
-0.14297212660312653,
0.09314791858196259,
-0.22100365161895752,
0.09945870190858841,
-0.0473715215921402,
0.07870297133922577,
-0.013492470607161522,
-0.07030437886714935,
0.15179948508739471,
0.0202178917825222,
0.01831471361219883,
-0.10763544589281082,
-0.016795454546809196,
0.16681140661239624,
-0.05824224278330803,
0.09705232828855515,
-0.06883025914430618,
0.016167545691132545,
-0.22984294593334198,
-0.029298147186636925,
-0.10506583005189896,
0.13563910126686096,
-0.03961075469851494,
-0.05811137706041336,
-0.005226899869740009,
0.04730958864092827,
0.05870795249938965,
0.020399508997797966,
0.026594623923301697,
-0.041938308626413345,
0.06324048340320587,
0.16038286685943604,
0.15549486875534058,
-0.057536400854587555,
-0.019928710535168648,
-0.05093668773770332,
-0.016237150877714157,
0.04161633551120758,
-0.1986497938632965,
0.005881170742213726,
0.06742282211780548,
0.04682732746005058,
0.08689935505390167,
0.03451788052916527,
-0.09762614965438843,
0.023990556597709656,
0.026755476370453835,
-0.12016326189041138,
-0.21151839196681976,
-0.09061124920845032,
-0.11117646098136902,
-0.05583149567246437,
0.06765235215425491,
0.08840030431747437,
-0.1107858195900917,
-0.011985772289335728,
-0.03591194748878479,
0.001117812586016953,
-0.08457870781421661,
0.12746866047382355,
0.10919894278049469,
0.04203851521015167,
-0.11912770569324493,
0.06735266000032425,
0.04898137226700783,
0.0040357476100325584,
0.026834402233362198,
0.13152523338794708,
-0.11556877940893173,
-0.07503394037485123,
0.03980320319533348,
0.21756155788898468,
0.012029050849378109,
-0.10020823776721954,
-0.07187093049287796,
-0.11245639622211456,
0.0378289557993412,
0.09413247555494308,
0.050421856343746185,
0.016474265605211258,
-0.022344736382365227,
-0.030602294951677322,
-0.10140077769756317,
0.0764104425907135,
0.10635365545749664,
-0.012504206970334053,
-0.10760365426540375,
0.1699782907962799,
0.018575862050056458,
0.18519826233386993,
-0.0787283182144165,
-0.05105951055884361,
-0.08513035625219345,
0.03597438707947731,
-0.13713648915290833,
-0.02275727316737175,
-0.07151160389184952,
0.008759799413383007,
-0.049919139593839645,
-0.06418929994106293,
-0.016454095020890236,
0.053972046822309494,
-0.06518033146858215,
0.01708126999437809,
0.010459703393280506,
0.05134276673197746,
-0.024649588391184807,
-0.031102702021598816,
0.02268187701702118,
-0.029663871973752975,
0.15127521753311157,
0.056542687118053436,
-0.02840757928788662,
0.018609309569001198,
-0.1385030895471573,
0.0072947959415614605,
-0.02949182130396366,
-0.019404515624046326,
0.06571602076292038,
-0.023953935131430626,
-0.006173540838062763,
-0.03570462390780449,
-0.02082156203687191,
0.021372143179178238,
0.07843964546918869,
-0.05302390456199646,
0.03318195044994354,
-0.050814565271139145,
-0.03469626605510712,
-0.07221002131700516,
0.11142297089099884,
0.025323038920760155,
0.1274808645248413,
0.05271429568529129,
-0.042584385722875595,
0.08427636325359344,
-0.15835127234458923,
-0.003341545118018985,
-0.007184928748756647,
-0.10757952183485031,
-0.07119150459766388,
-0.06820829212665558,
0.059975769370794296,
-0.03333554416894913,
0.12226488441228867,
0.10983891040086746,
-0.0576498806476593,
-0.03552272915840149,
0.03413812071084976,
-0.01846565678715706,
-0.05027073994278908,
0.1659470498561859,
-0.040763672441244125,
0.030562782660126686,
0.10047689825296402,
0.059796787798404694,
0.0038296838756650686,
0.0461338609457016,
0.044720109552145004,
0.04414482042193413,
-0.010823446325957775,
0.11976905167102814,
-0.00987174827605486,
-0.05024975165724754,
-0.06659653782844543,
0.020065125077962875,
-0.05151204392313957,
0.09940100461244583,
-0.12053998559713364,
0.008379683829843998,
0.11864437162876129,
-0.15017718076705933,
0.0657787173986435,
0.005060253664851189,
-0.09720741957426071,
-0.1215197816491127,
-0.1809338927268982,
-0.034263476729393005,
-0.08529025316238403,
0.006938841659575701,
-0.05549647659063339,
-0.00047131659812293947,
0.06623249500989914,
0.01673055998980999,
-0.021335816010832787,
0.08520377427339554,
0.06581537425518036,
-0.04274996370077133,
0.13504008948802948,
-0.03141511231660843,
0.016671279445290565,
-0.14298127591609955,
0.01141443569213152,
-0.02647063508629799,
0.006218688562512398,
-0.004647779278457165,
-0.022513993084430695,
-0.0026073441840708256,
0.049885645508766174,
-0.014046097174286842,
-0.09016354382038116,
-0.013937719166278839,
0.019542666152119637,
0.08531849086284637,
0.027657516300678253,
0.04035435616970062,
-0.03847431018948555,
-0.0011169352801516652,
0.22562912106513977,
-0.03874864801764488,
-0.06118986755609512,
-0.10547614842653275,
-0.01855599321424961,
-0.04384930059313774,
-0.03990955278277397,
0.01269160769879818,
-0.05678226798772812,
0.04777378588914871,
0.2614284157752991,
0.21830113232135773,
-0.06793772429227829,
0.03428124263882637,
-0.01265383418649435,
-0.004986890591681004,
0.019143113866448402,
0.14268168807029724,
0.052360933274030685,
0.060480691492557526,
-0.07793375104665756,
0.017149262130260468,
-0.06426797062158585,
-0.09663044661283493,
-0.06404853612184525,
0.15918073058128357,
-0.007310004439204931,
-0.01744212582707405,
-0.003612886881455779,
0.08796917647123337,
-0.12905006110668182,
-0.1354827582836151,
0.0013947099214419723,
-0.11807231605052948,
-0.10341861844062805,
-0.04955490306019783,
-0.007603821344673634,
0.10967130213975906,
0.027524510398507118,
-0.0013343407772481441,
0.07402091473340988,
0.16209150850772858,
0.014935567043721676,
-0.08541937172412872,
0.01569814234972,
0.12625469267368317,
-0.1291860193014145,
0.1637307107448578,
-0.04136355221271515,
0.015463064424693584,
0.058140143752098083,
-0.014157918281853199,
-0.12189465761184692,
0.04801139608025551,
-0.006022170651704073,
0.05835285410284996,
0.010849494487047195,
0.10126549005508423,
-0.002904935274273157,
-0.0278139840811491,
-0.026824209839105606,
-0.0682487040758133,
0.041652530431747437,
0.019974160939455032,
-0.00476682186126709,
-0.09154581278562546,
0.08283486217260361,
-0.037791658192873,
0.11984510719776154,
0.09253838658332825,
-0.0685095563530922,
-0.014491735957562923,
-0.10732920467853546,
0.012891369871795177,
0.01832696981728077,
0.024676527827978134,
-0.02562849409878254,
-0.09938983619213104,
-0.0501851886510849,
-0.06620640307664871,
-0.021716678515076637,
-0.07908917218446732,
-0.024111460894346237,
-0.10121618211269379,
-0.08282352238893509,
0.00995565764605999,
0.07701398432254791,
0.14939026534557343,
0.023194635286927223,
-0.006617801263928413,
0.08569972217082977,
-0.02428310550749302,
0.11414506286382675,
-0.10633990168571472,
-0.023100025951862335
] |
null | null |
transformers
|
## Indonesian RoBERTa Base
Indonesian RoBERTa Base is a masked language model based on the [RoBERTa](https://arxiv.org/abs/1907.11692) model. It was trained on the [OSCAR](https://huggingface.co/datasets/oscar) dataset, specifically the `unshuffled_deduplicated_id` subset. The model was trained from scratch and achieved an evaluation loss of 1.798 and an evaluation accuracy of 62.45%.
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the [Files and versions](https://huggingface.co/flax-community/indonesian-roberta-base/tree/main) tab, as well as the [Training metrics](https://huggingface.co/flax-community/indonesian-roberta-base/tensorboard) logged via Tensorboard.
## Model
| Model | #params | Arch. | Training/Validation data (text) |
| ------------------------- | ------- | ------- | ------------------------------------------ |
| `indonesian-roberta-base` | 124M | RoBERTa | OSCAR `unshuffled_deduplicated_id` Dataset |
## Evaluation Results
The model was trained for 8 epochs and the following is the final result once the training ended.
| train loss | valid loss | valid accuracy | total time |
| ---------- | ---------- | -------------- | ---------- |
| 1.870 | 1.798 | 0.6245 | 18:25:39 |
## How to Use
### As Masked Language Model
```python
from transformers import pipeline
pretrained_name = "flax-community/indonesian-roberta-base"
fill_mask = pipeline(
"fill-mask",
model=pretrained_name,
tokenizer=pretrained_name
)
fill_mask("Budi sedang <mask> di sekolah.")
```
### Feature Extraction in PyTorch
```python
from transformers import RobertaModel, RobertaTokenizerFast
pretrained_name = "flax-community/indonesian-roberta-base"
model = RobertaModel.from_pretrained(pretrained_name)
tokenizer = RobertaTokenizerFast.from_pretrained(pretrained_name)
prompt = "Budi sedang berada di sekolah."
encoded_input = tokenizer(prompt, return_tensors='pt')
output = model(**encoded_input)
```
## Team Members
- Wilson Wongso ([@w11wo](https://hf.co/w11wo))
- Steven Limcorn ([@stevenlimcorn](https://hf.co/stevenlimcorn))
- Samsul Rahmadani ([@munggok](https://hf.co/munggok))
- Chew Kok Wah ([@chewkokwah](https://hf.co/chewkokwah))
|
{"language": "id", "license": "mit", "tags": ["indonesian-roberta-base"], "datasets": ["oscar"], "widget": [{"text": "Budi telat ke sekolah karena ia <mask>."}]}
|
fill-mask
|
flax-community/indonesian-roberta-base
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"roberta",
"fill-mask",
"indonesian-roberta-base",
"id",
"dataset:oscar",
"arxiv:1907.11692",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.11692"
] |
[
"id"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-base #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
Indonesian RoBERTa Base
-----------------------
Indonesian RoBERTa Base is a masked language model based on the RoBERTa model. It was trained on the OSCAR dataset, specifically the 'unshuffled\_deduplicated\_id' subset. The model was trained from scratch and achieved an evaluation loss of 1.798 and an evaluation accuracy of 62.45%.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the Files and versions tab, as well as the Training metrics logged via Tensorboard.
Model
-----
Evaluation Results
------------------
The model was trained for 8 epochs and the following is the final result once the training ended.
How to Use
----------
### As Masked Language Model
### Feature Extraction in PyTorch
Team Members
------------
* Wilson Wongso (@w11wo)
* Steven Limcorn (@stevenlimcorn)
* Samsul Rahmadani (@munggok)
* Chew Kok Wah (@chewkokwah)
|
[
"### As Masked Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-base #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### As Masked Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
83,
7,
56
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-base #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n### As Masked Language Model### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
-0.047726068645715714,
0.11652461439371109,
-0.00156060338485986,
0.09730758517980576,
0.09294816106557846,
-0.03170030564069748,
0.11761309206485748,
0.07810981571674347,
-0.013757226057350636,
-0.009866759181022644,
0.16093412041664124,
0.09834732860326767,
0.06507689505815506,
0.10915588587522507,
-0.00405503436923027,
-0.3462670147418976,
0.013075653463602066,
-0.0075207725167274475,
0.041289523243904114,
0.11010264605283737,
0.06452711671590805,
-0.0712156593799591,
0.124437615275383,
0.04045220464468002,
-0.06327130645513535,
0.005649450235068798,
0.014832304790616035,
-0.13320839405059814,
0.09886789321899414,
0.043754953891038895,
0.0836697369813919,
0.038918960839509964,
-0.016263406723737717,
-0.09832121431827545,
0.06130135804414749,
-0.0191151462495327,
-0.0014646986965090036,
0.0399114191532135,
-0.010223789140582085,
-0.010580303147435188,
0.16408096253871918,
0.02138482965528965,
-0.012152379378676414,
0.008332343772053719,
-0.1370459496974945,
-0.14037835597991943,
-0.05547042191028595,
0.07338445633649826,
-0.004748283885419369,
0.07348237931728363,
-0.01551703829318285,
0.16829928755760193,
-0.135963574051857,
0.0426676869392395,
0.14113546907901764,
-0.2713385224342346,
-0.07700100541114807,
0.008672080002725124,
0.08761390298604965,
-0.05721060186624527,
-0.08529507368803024,
0.04908697307109833,
0.040228407829999924,
-0.00041815219447016716,
0.05340049788355827,
-0.11100831627845764,
-0.09725113958120346,
-0.028842221945524216,
-0.013647953048348427,
0.007672606036067009,
0.17975085973739624,
0.029441075399518013,
-0.038359686732292175,
-0.06422971934080124,
-0.015856510028243065,
-0.051000285893678665,
0.03816609084606171,
0.021060939878225327,
-0.026299860328435898,
-0.02389248088002205,
-0.027494516223669052,
0.022489238530397415,
-0.11811104416847229,
0.014527666382491589,
-0.1253509521484375,
0.21411646902561188,
0.004609511233866215,
0.05859798192977905,
-0.12269346415996552,
0.05298035219311714,
-0.012160589918494225,
-0.18217627704143524,
-0.0033055138774216175,
-0.030541401356458664,
0.03394154831767082,
0.0941978469491005,
0.0728120431303978,
0.011194039136171341,
0.03716012462973595,
0.07808174192905426,
-0.05263251066207886,
0.046411506831645966,
0.03394273668527603,
0.09727399051189423,
-0.020285453647375107,
0.10740383714437485,
-0.10487153381109238,
-0.013982902280986309,
0.025686122477054596,
-0.021780122071504593,
0.1099897176027298,
-0.05565372481942177,
-0.04847628250718117,
0.010038585402071476,
-0.0585373230278492,
0.01500088069587946,
0.05290735512971878,
0.13902397453784943,
-0.09381315112113953,
-0.023877140134572983,
0.1387900859117508,
-0.048885900527238846,
-0.04619907587766647,
-0.017033495008945465,
-0.04646419733762741,
0.02669546566903591,
0.012867050245404243,
0.04266743361949921,
0.036031223833560944,
0.16318954527378082,
-0.059981197118759155,
-0.0027790828607976437,
-0.01723066344857216,
-0.02800181694328785,
0.0633876696228981,
-0.10321174561977386,
0.04296698421239853,
-0.16157405078411102,
-0.08529515564441681,
0.008454874157905579,
0.01594618894159794,
-0.015635643154382706,
-0.063249871134758,
-0.0332670621573925,
-0.020669901743531227,
0.026166731491684914,
-0.06803981959819794,
0.0027243492659181356,
-0.08191990107297897,
0.08965688198804855,
0.03307168185710907,
0.1346091777086258,
-0.09077437222003937,
0.04182574152946472,
-0.1413852572441101,
0.058315154165029526,
-0.11059951782226562,
-0.033459555357694626,
-0.049992356449365616,
0.0886097326874733,
-0.008711832575500011,
-0.09149951487779617,
-0.1029324010014534,
0.02232038602232933,
-0.03339007496833801,
0.16487888991832733,
-0.08923333883285522,
-0.0774025171995163,
0.19448105990886688,
-0.10554526746273041,
-0.08570478111505508,
0.13290926814079285,
-0.021360641345381737,
0.06276392191648483,
0.004950797185301781,
0.14883947372436523,
0.0186621006578207,
-0.1202315166592598,
0.01139727234840393,
0.009847124107182026,
-0.03595242649316788,
-0.03543555736541748,
0.15055488049983978,
0.07532792538404465,
-0.0017451143357902765,
0.061466798186302185,
-0.01565983146429062,
0.08909042924642563,
-0.03757927939295769,
-0.06052422896027565,
0.06717337667942047,
-0.06838754564523697,
0.1271081566810608,
0.07171296328306198,
0.08740055561065674,
-0.0896625965833664,
-0.03785758092999458,
-0.10492943227291107,
0.08467523753643036,
0.020979568362236023,
-0.005680306814610958,
-0.12081018090248108,
0.11386958509683609,
-0.021971073001623154,
-0.02383376657962799,
-0.0764678418636322,
0.030663490295410156,
-0.04082760214805603,
0.06947596371173859,
0.03230989724397659,
0.09764090925455093,
0.032855089753866196,
-0.0031219464726746082,
-0.052119266241788864,
-0.03973902016878128,
0.10644758492708206,
0.005585441365838051,
0.040750328451395035,
-0.12829707562923431,
0.053098756819963455,
-0.04236454889178276,
0.17765682935714722,
-0.07681978493928909,
-0.008647819049656391,
-0.03257971629500389,
0.10228107124567032,
0.011253535747528076,
0.0017288255039602518,
0.06665363907814026,
0.07339752465486526,
0.0034814609680324793,
-0.01779605634510517,
0.08339209109544754,
0.00273495283909142,
-0.08291037380695343,
0.2192484438419342,
-0.12322632223367691,
0.18999069929122925,
0.1579950451850891,
-0.12455976009368896,
-0.0024944276083260775,
0.10838697105646133,
-0.007394508924335241,
-0.012393149547278881,
-0.026437466964125633,
0.11393614858388901,
0.08840756863355637,
-0.03472833335399628,
0.1492568999528885,
-0.058557163923978806,
-0.00340876798145473,
0.027091851457953453,
-0.10256347060203552,
-0.05165188014507294,
0.14404189586639404,
0.04610395058989525,
-0.18210890889167786,
0.19825121760368347,
0.11152930557727814,
0.0023456974886357784,
0.269065797328949,
0.03354540467262268,
0.04612560570240021,
-0.036660172045230865,
-0.004304156638681889,
-0.00876886397600174,
0.12010426074266434,
-0.12428763508796692,
-0.020407989621162415,
0.041980139911174774,
-0.046227339655160904,
-0.005982319824397564,
-0.12824442982673645,
-0.1317955106496811,
0.012507680803537369,
0.027852356433868408,
-0.09197592735290527,
0.14032353460788727,
-0.03696851059794426,
0.09069567173719406,
0.001947829034179449,
-0.0691557377576828,
0.029484272003173828,
-0.019307605922222137,
-0.050029877573251724,
0.1604539006948471,
-0.0453704297542572,
-0.334038108587265,
-0.09864247590303421,
-0.13948197662830353,
-0.0643516257405281,
-0.011897165328264236,
0.06894710659980774,
-0.10309449583292007,
-0.01970374397933483,
-0.005348325707018375,
-0.10162132233381271,
-0.04701482504606247,
-0.023290682584047318,
-0.02456449717283249,
-0.037806566804647446,
-0.06057130545377731,
-0.04611089453101158,
-0.0482914000749588,
-0.028676193207502365,
0.017532717436552048,
0.18240131437778473,
-0.03427153825759888,
0.0877542570233345,
0.052665334194898605,
-0.03914792463183403,
0.020694008097052574,
-0.007066467311233282,
0.08283181488513947,
-0.09431228041648865,
-0.02438599243760109,
0.22416123747825623,
-0.060614459216594696,
0.049188438802957535,
0.128149151802063,
0.015740426257252693,
0.00175854389090091,
0.0230479184538126,
-0.06382156163454056,
-0.10863792896270752,
-0.17391835153102875,
-0.0383966825902462,
-0.09177335351705551,
0.15274345874786377,
0.006797122303396463,
0.01750483363866806,
0.10130062699317932,
0.10573576390743256,
-0.013282790780067444,
-0.02566687762737274,
-0.05085574463009834,
0.023747585713863373,
0.08800987899303436,
-0.020794594660401344,
0.11149504780769348,
-0.07991932332515717,
-0.09499184042215347,
0.04542916268110275,
-0.0058365496806800365,
-0.04530191421508789,
0.055926017463207245,
0.001511773793026805,
0.06385885179042816,
0.21567952632904053,
0.1250423640012741,
0.028341414406895638,
-0.029280154034495354,
-0.04676472023129463,
-0.03849057853221893,
-0.0529036670923233,
-0.046461787074804306,
0.036190785467624664,
0.03627336770296097,
-0.021059375256299973,
-0.0208938866853714,
0.00604095496237278,
0.017467765137553215,
0.1211300939321518,
0.043961554765701294,
-0.1467764526605606,
-0.060081545263528824,
0.05025145784020424,
0.04398209974169731,
-0.00584383076056838,
0.035042960196733475,
0.140623077750206,
-0.15694360435009003,
0.12901541590690613,
0.003546442836523056,
0.04983313009142876,
-0.04307478666305542,
0.052989888936281204,
-0.06122127175331116,
-0.03487112373113632,
-0.01558686699718237,
0.0847896859049797,
-0.20202235877513885,
0.26489630341529846,
-0.005091714207082987,
-0.038349397480487823,
-0.09564226120710373,
-0.06549414247274399,
0.04842233285307884,
0.049860864877700806,
0.18811044096946716,
0.03887298330664635,
-0.0631314143538475,
-0.11537425220012665,
-0.06331341713666916,
0.04022334888577461,
0.04127778485417366,
0.013368104584515095,
0.005829073488712311,
0.007445768918842077,
-0.012786957435309887,
-0.026263833045959473,
0.06854280084371567,
-0.016164997592568398,
-0.05993298441171646,
0.05794337019324303,
0.043804749846458435,
0.03559417277574539,
-0.031183479353785515,
-0.09329767525196075,
-0.15220028162002563,
0.08194909244775772,
0.010321326553821564,
-0.030365483835339546,
-0.07716608047485352,
-0.0317072756588459,
0.004241804592311382,
-0.141079843044281,
-0.005685199052095413,
-0.06481966376304626,
-0.06533567607402802,
-0.04260079935193062,
-0.08699143677949905,
0.12294507026672363,
-0.03041660413146019,
-0.10929948836565018,
-0.052054066210985184,
0.06046599894762039,
0.0015384064754471183,
0.07253555953502655,
-0.006675123702734709,
0.006659943610429764,
-0.0583571158349514,
-0.10481899231672287,
0.01189342886209488,
-0.03523619845509529,
0.023053105920553207,
0.017787273973226547,
0.04399729520082474,
-0.09550013393163681,
-0.05525405332446098,
-0.11545447260141373,
0.17446734011173248,
0.25873443484306335,
-0.0759972408413887,
0.07463178783655167,
0.11168809235095978,
0.014689434319734573,
-0.2762698233127594,
-0.10525565594434738,
-0.13448718190193176,
0.03882673755288124,
0.08080647140741348,
-0.07241814583539963,
0.04019663855433464,
0.06131718307733536,
-0.06154891103506088,
0.045303184539079666,
-0.12126036733388901,
-0.13294130563735962,
0.12492477148771286,
0.06367935240268707,
0.4083268642425537,
-0.10913955420255661,
-0.01981453038752079,
-0.03661647439002991,
-0.20065484941005707,
0.055611688643693924,
-0.05079593509435654,
0.09341971576213837,
-0.06691977381706238,
-0.005609913729131222,
-0.019977688789367676,
-0.04897632449865341,
0.12744243443012238,
-0.1358920931816101,
0.006694722454994917,
-0.12784117460250854,
-0.13300156593322754,
-0.006875659804791212,
0.026925470679998398,
0.0305124893784523,
-0.12481873482465744,
-0.0069682421162724495,
-0.11380395293235779,
0.004513181280344725,
-0.12708157300949097,
0.10032286494970322,
0.004825293552130461,
-0.061193447560071945,
-0.05202058330178261,
0.11225467175245285,
-0.0033783100079745054,
0.014669467695057392,
0.14038708806037903,
-0.08087821304798126,
0.12920168042182922,
0.025967104360461235,
0.12944170832633972,
-0.025785598903894424,
0.08764471858739853,
-0.09644635766744614,
-0.037243761122226715,
0.0653965100646019,
-0.17089763283729553,
-0.05316229164600372,
0.06522432714700699,
0.05077571049332619,
0.08516285568475723,
0.023595690727233887,
-0.06752077490091324,
0.11540725082159042,
0.1560681313276291,
-0.14975465834140778,
-0.0978330448269844,
-0.04018859937787056,
-0.06081455200910568,
0.10621772706508636,
-0.029365887865424156,
0.05638580769300461,
-0.10152173787355423,
-0.047136206179857254,
-0.012126415967941284,
0.007113547995686531,
-0.07144402712583542,
0.05312736704945564,
0.06824412196874619,
0.03402130678296089,
-0.08828016370534897,
0.05429120734333992,
0.0401868037879467,
-0.08772094547748566,
-0.01970759406685829,
0.09885887801647186,
-0.07837551087141037,
-0.08157689869403839,
-0.11036981642246246,
0.05960490554571152,
-0.033541489392519,
-0.06946088373661041,
-0.08684153854846954,
-0.08632048964500427,
-0.007369892671704292,
0.12300366908311844,
0.009491548873484135,
-0.009282889775931835,
-0.005024273879826069,
-0.05242529138922691,
0.027255436405539513,
0.0402812696993351,
0.0718003585934639,
0.00828755833208561,
-0.12270702421665192,
0.024736231192946434,
0.0442521795630455,
0.1822633147239685,
-0.05232519283890724,
-0.0371001660823822,
-0.11816497147083282,
0.025433678179979324,
-0.0061261653900146484,
-0.02798543870449066,
-0.14398452639579773,
-0.025558117777109146,
-0.06513502448797226,
-0.09676780551671982,
-0.029817689210176468,
-0.0417039655148983,
-0.08997023105621338,
0.04976087436079979,
0.009983336552977562,
0.02468409761786461,
-0.08117275685071945,
-0.02349001355469227,
0.142356276512146,
-0.011189962737262249,
0.07597357779741287,
0.06898100674152374,
-0.038034193217754364,
0.069704569876194,
-0.1602584719657898,
-0.04705088958144188,
0.05492852255702019,
-0.005752856377512217,
0.0664522722363472,
0.008755323477089405,
-0.0208966676145792,
0.03191594034433365,
0.05953734368085861,
0.029275836423039436,
0.09520890563726425,
-0.11428467184305191,
-0.028839994221925735,
0.03606860339641571,
-0.07686837762594223,
-0.07240932434797287,
0.013001956045627594,
0.08463837206363678,
-0.011243325658142567,
0.13422928750514984,
-0.053358860313892365,
0.07710070163011551,
-0.10066357254981995,
0.004999165888875723,
-0.018818378448486328,
-0.1685892939567566,
0.0012932647950947285,
-0.06793870776891708,
0.04204782098531723,
-0.02020694687962532,
0.09551690518856049,
0.027123240754008293,
-0.12308076024055481,
-0.0029224196914583445,
0.01576683111488819,
-0.02043232135474682,
0.009235051460564137,
0.15150518715381622,
0.04977264255285263,
-0.03216644749045372,
-0.0184144489467144,
0.04805060848593712,
-0.004054104909300804,
0.18526245653629303,
0.10227707028388977,
0.17119929194450378,
0.10720179975032806,
0.05099684000015259,
0.03736235201358795,
0.070448137819767,
-0.11300643533468246,
-0.1605110764503479,
-0.1067318394780159,
0.04651552811264992,
0.0010164912091568112,
0.0030686762183904648,
0.20134896039962769,
-0.0725189596414566,
0.03622046485543251,
-0.0708303153514862,
-0.02763785980641842,
-0.11182563751935959,
-0.24571053683757782,
-0.07361656427383423,
-0.05392155051231384,
0.04241252690553665,
-0.047607798129320145,
0.01928456500172615,
0.1251792460680008,
0.0760672315955162,
-0.07353673875331879,
0.167006716132164,
0.00009078654693439603,
-0.05352009832859039,
0.10621323436498642,
0.021067684516310692,
-0.013564727269113064,
-0.05529361963272095,
0.023905934765934944,
-0.06523197889328003,
-0.009232081472873688,
0.014982100576162338,
-0.024134431034326553,
-0.07336819171905518,
0.05348721891641617,
-0.04994157329201698,
-0.13324524462223053,
-0.03972853720188141,
0.02818419225513935,
0.10335920006036758,
0.1386185884475708,
0.008774280548095703,
0.07386714220046997,
0.010918466374278069,
0.1381962150335312,
-0.019621770828962326,
-0.08706732839345932,
-0.1167534813284874,
0.16295123100280762,
-0.06450418382883072,
-0.026337288320064545,
-0.014417503960430622,
-0.012532271444797516,
0.02324788272380829,
0.2758195698261261,
0.28758180141448975,
-0.04322415217757225,
0.07685641199350357,
-0.05764719843864441,
0.032034121453762054,
-0.011101491749286652,
0.053379207849502563,
0.07646957039833069,
0.1331590712070465,
-0.10084754973649979,
0.013085064478218555,
-0.07977351546287537,
-0.03064812906086445,
-0.026233499869704247,
-0.014288008213043213,
0.07655707001686096,
-0.05839435011148453,
-0.07806630432605743,
0.07445008307695389,
-0.16454055905342102,
-0.01964198611676693,
0.057313643395900726,
-0.1949683129787445,
-0.08074554800987244,
-0.04044000431895256,
0.045349206775426865,
0.11149702221155167,
0.03232407569885254,
-0.026934146881103516,
0.028838368132710457,
-0.053572505712509155,
0.0739220604300499,
-0.1139322966337204,
-0.06119176372885704,
0.09576073288917542,
0.043399423360824585,
0.0753631666302681,
-0.033148616552352905,
0.022921714931726456,
0.0956493616104126,
0.07539400458335876,
-0.025709759443998337,
0.08991967886686325,
0.03289691358804703,
-0.012738846242427826,
-0.07960554212331772,
0.05016874894499779,
0.03381736949086189,
-0.027304409071803093,
0.04771142825484276,
0.012065641582012177,
0.0693264976143837,
-0.0228706244379282,
0.028067708015441895,
-0.046645637601614,
0.10354486107826233,
-0.0299734715372324,
0.1471630334854126,
0.16761744022369385,
-0.022146891802549362,
-0.05086608976125717,
-0.0639323741197586,
0.005153032951056957,
0.0039000746328383684,
-0.14241598546504974,
-0.11857323348522186,
-0.052290938794612885,
-0.046050406992435455,
-0.05725720897316933,
0.060896482318639755,
-0.23727142810821533,
0.024684511125087738,
-0.1464904099702835,
-0.04916920140385628,
0.014472916722297668,
0.015829550102353096,
0.05534670501947403,
0.03985615074634552,
-0.016071494668722153,
-0.127980574965477,
0.050707899034023285,
0.0675758421421051,
-0.10822775214910507,
-0.07654830813407898
] |
null | null |
transformers
|
## Indonesian RoBERTa Large
Indonesian RoBERTa Large is a masked language model based on the [RoBERTa](https://arxiv.org/abs/1907.11692) model. It was trained on the [OSCAR](https://huggingface.co/datasets/oscar) dataset, specifically the `unshuffled_deduplicated_id` subset. The model was trained from scratch and achieved an evaluation loss of 4.801 and an evaluation accuracy of 29.8%.
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the [Files and versions](https://huggingface.co/flax-community/indonesian-roberta-large/tree/main) tab, as well as the [Training metrics](https://huggingface.co/flax-community/indonesian-roberta-large/tensorboard) logged via Tensorboard.
## Model
| Model | #params | Arch. | Training/Validation data (text) |
| -------------------------- | ------- | ------- | ------------------------------------------ |
| `indonesian-roberta-large` | 355M | RoBERTa | OSCAR `unshuffled_deduplicated_id` Dataset |
## Evaluation Results
The model was trained for 10 epochs and the following is the final result once the training ended.
| train loss | valid loss | valid accuracy | total time |
| ---------- | ---------- | -------------- | ---------- |
| 5.19 | 4.801 | 0.298 | 2:8:32:28 |
## How to Use
### As Masked Language Model
```python
from transformers import pipeline
pretrained_name = "flax-community/indonesian-roberta-large"
fill_mask = pipeline(
"fill-mask",
model=pretrained_name,
tokenizer=pretrained_name
)
fill_mask("Budi sedang <mask> di sekolah.")
```
### Feature Extraction in PyTorch
```python
from transformers import RobertaModel, RobertaTokenizerFast
pretrained_name = "flax-community/indonesian-roberta-large"
model = RobertaModel.from_pretrained(pretrained_name)
tokenizer = RobertaTokenizerFast.from_pretrained(pretrained_name)
prompt = "Budi sedang berada di sekolah."
encoded_input = tokenizer(prompt, return_tensors='pt')
output = model(**encoded_input)
```
## Team Members
- Wilson Wongso ([@w11wo](https://hf.co/w11wo))
- Steven Limcorn ([@stevenlimcorn](https://hf.co/stevenlimcorn))
- Samsul Rahmadani ([@munggok](https://hf.co/munggok))
- Chew Kok Wah ([@chewkokwah](https://hf.co/chewkokwah))
|
{"language": "id", "license": "mit", "tags": ["indonesian-roberta-large"], "datasets": ["oscar"], "widget": [{"text": "Budi telat ke sekolah karena ia <mask>."}]}
|
fill-mask
|
flax-community/indonesian-roberta-large
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"roberta",
"fill-mask",
"indonesian-roberta-large",
"id",
"dataset:oscar",
"arxiv:1907.11692",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.11692"
] |
[
"id"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-large #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
Indonesian RoBERTa Large
------------------------
Indonesian RoBERTa Large is a masked language model based on the RoBERTa model. It was trained on the OSCAR dataset, specifically the 'unshuffled\_deduplicated\_id' subset. The model was trained from scratch and achieved an evaluation loss of 4.801 and an evaluation accuracy of 29.8%.
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM, sponsored by the Google Cloud team.
All necessary scripts used for training could be found in the Files and versions tab, as well as the Training metrics logged via Tensorboard.
Model
-----
Evaluation Results
------------------
The model was trained for 10 epochs and the following is the final result once the training ended.
How to Use
----------
### As Masked Language Model
### Feature Extraction in PyTorch
Team Members
------------
* Wilson Wongso (@w11wo)
* Steven Limcorn (@stevenlimcorn)
* Samsul Rahmadani (@munggok)
* Chew Kok Wah (@chewkokwah)
|
[
"### As Masked Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-large #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### As Masked Language Model",
"### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
80,
7,
56
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #indonesian-roberta-large #id #dataset-oscar #arxiv-1907.11692 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### As Masked Language Model### Feature Extraction in PyTorch\n\n\nTeam Members\n------------\n\n\n* Wilson Wongso (@w11wo)\n* Steven Limcorn (@stevenlimcorn)\n* Samsul Rahmadani (@munggok)\n* Chew Kok Wah (@chewkokwah)"
] |
[
-0.06246016174554825,
0.09137556701898575,
-0.0019877897575497627,
0.09868760406970978,
0.10006798058748245,
-0.03205583244562149,
0.12420535087585449,
0.06814273446798325,
-0.009234328754246235,
-0.005355811677873135,
0.15878307819366455,
0.1190696507692337,
0.07284436374902725,
0.1057354062795639,
0.001666062860749662,
-0.3414589762687683,
0.010428386740386486,
0.0011395295150578022,
0.039343785494565964,
0.11510425806045532,
0.057173848152160645,
-0.0677868202328682,
0.12590104341506958,
0.040947768837213516,
-0.07243694365024567,
-0.003725395305082202,
0.005010891240090132,
-0.12984561920166016,
0.10202430188655853,
0.05164290964603424,
0.09096000343561172,
0.03354225680232048,
-0.020895376801490784,
-0.09939694404602051,
0.060949768871068954,
-0.022521018981933594,
-0.0007662004209123552,
0.043010879307985306,
-0.0004795944842044264,
0.0005665447097271681,
0.1545085459947586,
0.021557919681072235,
-0.016223490238189697,
0.004993082955479622,
-0.13114026188850403,
-0.1422479897737503,
-0.04612962156534195,
0.08488073945045471,
-0.0033874583896249533,
0.07184066623449326,
-0.007838410325348377,
0.1705324947834015,
-0.1447073072195053,
0.05202830210328102,
0.1332108974456787,
-0.25217172503471375,
-0.0759613886475563,
-0.022317206487059593,
0.06413748860359192,
-0.049762189388275146,
-0.0865553766489029,
0.033649832010269165,
0.03761793673038483,
0.001601434894837439,
0.02988847717642784,
-0.10439196228981018,
-0.08572051674127579,
-0.04317726567387581,
-0.009019232355058193,
0.007704550866037607,
0.16618891060352325,
0.02984650246798992,
-0.03992686793208122,
-0.05763775110244751,
-0.03322960063815117,
-0.039532583206892014,
0.03100810945034027,
0.020875146612524986,
-0.03701360523700714,
-0.021878696978092194,
-0.030620049685239792,
0.028900032863020897,
-0.1069682240486145,
0.006878984160721302,
-0.13108643889427185,
0.23217980563640594,
0.01470560859888792,
0.05530403554439545,
-0.13478700816631317,
0.041377954185009,
-0.004308239556849003,
-0.1803206205368042,
-0.004459640942513943,
-0.021210968494415283,
0.03134053573012352,
0.08816780149936676,
0.09432096034288406,
0.01257429551333189,
0.048824772238731384,
0.0748685896396637,
-0.06166652962565422,
0.05236155167222023,
0.027720537036657333,
0.0962536409497261,
-0.029609117656946182,
0.09828907251358032,
-0.1054375097155571,
0.005706838332116604,
0.02952989563345909,
-0.023172972723841667,
0.11278993636369705,
-0.0546344593167305,
-0.04855803772807121,
0.010563039220869541,
-0.06550634652376175,
0.02992333099246025,
0.04434574022889137,
0.13281816244125366,
-0.08055594563484192,
-0.038972143083810806,
0.14266443252563477,
-0.05272138863801956,
-0.05227178335189819,
-0.01459968276321888,
-0.04351118952035904,
0.03750398010015488,
-0.0010483964579179883,
0.035001978278160095,
0.024836020544171333,
0.17652744054794312,
-0.05829399451613426,
-0.0039547644555568695,
-0.011339395307004452,
-0.015251261182129383,
0.06533220410346985,
-0.11102481931447983,
0.04094722867012024,
-0.16780458390712738,
-0.09954968839883804,
0.014746332541108131,
0.0013535480247810483,
-0.011235339567065239,
-0.06445645540952682,
-0.038022179156541824,
-0.006230888422578573,
0.019526779651641846,
-0.06882119178771973,
-0.000060263970226515085,
-0.0787944570183754,
0.09580636024475098,
0.043407849967479706,
0.12152206897735596,
-0.08881751447916031,
0.04096381366252899,
-0.15077535808086395,
0.06288952380418777,
-0.09304937720298767,
-0.03701282665133476,
-0.0319930836558342,
0.0892065092921257,
-0.02217121794819832,
-0.08420681953430176,
-0.10552271455526352,
0.014278632588684559,
-0.024349510669708252,
0.15600818395614624,
-0.08134105801582336,
-0.09650818258523941,
0.21534501016139984,
-0.11639021337032318,
-0.09319810569286346,
0.14778248965740204,
-0.013427820056676865,
0.059778813272714615,
0.02051248587667942,
0.12929832935333252,
0.019711699336767197,
-0.10702996701002121,
0.011717913672327995,
0.016807859763503075,
-0.052159059792757034,
-0.04253663122653961,
0.15058916807174683,
0.05402471497654915,
-0.007382544688880444,
0.06786155700683594,
0.010168052278459072,
0.09774915128946304,
-0.041496433317661285,
-0.060110267251729965,
0.059075310826301575,
-0.07230370491743088,
0.11641495674848557,
0.06997381895780563,
0.08904153853654861,
-0.08361220359802246,
-0.02358723059296608,
-0.12313846498727798,
0.09434479475021362,
0.01973872072994709,
-0.007296777796000242,
-0.12130343914031982,
0.11903075873851776,
-0.04312560707330704,
-0.016720594838261604,
-0.07489674538373947,
0.02134597860276699,
-0.030492691323161125,
0.04840928688645363,
0.029561839997768402,
0.07225314527750015,
0.03823424130678177,
0.010691496543586254,
-0.05413859710097313,
-0.04621754214167595,
0.10808825492858887,
-0.006134556606411934,
0.03765365481376648,
-0.12572018802165985,
0.05992995202541351,
-0.03297623246908188,
0.15855705738067627,
-0.07820039987564087,
-0.012835389003157616,
-0.04096167907118797,
0.10110010951757431,
0.011795628815889359,
0.010571294464170933,
0.053496088832616806,
0.0747714415192604,
0.0019320609280839562,
-0.012556706555187702,
0.09083347767591476,
-0.001962786540389061,
-0.0901273712515831,
0.21207775175571442,
-0.11016876995563507,
0.21154005825519562,
0.15200848877429962,
-0.12279456853866577,
0.013263308443129063,
0.08156408369541168,
-0.009951525367796421,
-0.023508066311478615,
-0.030643120408058167,
0.11642386764287949,
0.09223228693008423,
-0.021436035633087158,
0.15497252345085144,
-0.06399261951446533,
0.01115942932665348,
0.036627575755119324,
-0.09125861525535583,
-0.03401909023523331,
0.14784391224384308,
0.057986970990896225,
-0.19348108768463135,
0.1932811290025711,
0.10680030286312103,
0.01041522528976202,
0.2703937292098999,
0.025184301659464836,
0.036397919058799744,
-0.03495340049266815,
0.016010263934731483,
-0.007943191565573215,
0.10088039934635162,
-0.12579107284545898,
-0.011941155418753624,
0.04439927637577057,
-0.036903876811265945,
-0.009628994390368462,
-0.1182660385966301,
-0.12326692789793015,
0.01356825977563858,
0.03161047026515007,
-0.10003426671028137,
0.13529978692531586,
-0.03819717466831207,
0.09296027570962906,
-0.0018397571984678507,
-0.07262307405471802,
0.040849726647138596,
-0.010370769537985325,
-0.04783782735466957,
0.16181135177612305,
-0.03577200695872307,
-0.33765825629234314,
-0.1079452633857727,
-0.15391230583190918,
-0.07571694999933243,
-0.01862429641187191,
0.06570453941822052,
-0.12108249217271805,
-0.016816025599837303,
0.014546578750014305,
-0.06678654998540878,
-0.040665462613105774,
-0.019289592280983925,
-0.03204282745718956,
-0.028553573414683342,
-0.06422056257724762,
-0.050488218665122986,
-0.03669382259249687,
-0.03493013605475426,
0.013673357665538788,
0.16929170489311218,
-0.049025025218725204,
0.09355027228593826,
0.0522150844335556,
-0.02653554268181324,
0.024146875366568565,
-0.0033538308925926685,
0.06751548498868942,
-0.09350535273551941,
-0.02828229032456875,
0.2306329905986786,
-0.06058303639292717,
0.05064326897263527,
0.13085375726222992,
0.02027975767850876,
-0.001084585557691753,
0.0190307404845953,
-0.059456512331962585,
-0.11000818759202957,
-0.18182814121246338,
-0.035394031554460526,
-0.09224221110343933,
0.15618859231472015,
0.016651054844260216,
0.01575608365237713,
0.09861530363559723,
0.11030462384223938,
-0.02049189992249012,
-0.016831589862704277,
-0.059948280453681946,
0.02608707919716835,
0.08201143145561218,
-0.018274817615747452,
0.10636431723833084,
-0.07875937968492508,
-0.10358816385269165,
0.039751891046762466,
-0.004609364550560713,
-0.018459517508745193,
0.04984600469470024,
0.04971926286816597,
0.06897919625043869,
0.214509978890419,
0.13010716438293457,
0.04216969013214111,
-0.04092186316847801,
-0.04730488359928131,
-0.034536976367235184,
-0.05407573655247688,
-0.057666197419166565,
0.011004596948623657,
0.040218595415353775,
0.006943818647414446,
-0.025545621290802956,
0.016818104311823845,
0.022258976474404335,
0.1263163983821869,
0.03229402005672455,
-0.14574912190437317,
-0.06302594393491745,
0.03348582237958908,
0.046232640743255615,
-0.010463225655257702,
0.031825534999370575,
0.12316988408565521,
-0.15162572264671326,
0.112724669277668,
-0.006090408656746149,
0.056228913366794586,
-0.05842772498726845,
0.05821467936038971,
-0.06272171437740326,
-0.008427717722952366,
-0.02125510387122631,
0.09252846240997314,
-0.21260780096054077,
0.2766447067260742,
0.004720697645097971,
-0.026316162198781967,
-0.10124149918556213,
-0.06131652370095253,
0.047265905886888504,
0.05045636370778084,
0.17380104959011078,
0.036895524710416794,
-0.04546211659908295,
-0.13336169719696045,
-0.07263278216123581,
0.049230799078941345,
0.049604631960392,
0.022962655872106552,
-0.004050340969115496,
0.008916908875107765,
-0.012002569623291492,
-0.025048183277249336,
0.04331062734127045,
-0.02124101109802723,
-0.06528843194246292,
0.06272128969430923,
0.03615766018629074,
0.025846954435110092,
-0.03238848224282265,
-0.09350073337554932,
-0.12901894748210907,
0.07644789665937424,
0.010189603082835674,
-0.039359427988529205,
-0.0723104253411293,
-0.02412775531411171,
0.005927461665123701,
-0.1523572951555252,
-0.02000443823635578,
-0.07707370817661285,
-0.0655326172709465,
-0.04585205763578415,
-0.08071823418140411,
0.11294678598642349,
-0.03186434879899025,
-0.1256171613931656,
-0.048770610243082047,
0.08053567260503769,
0.006503287702798843,
0.059136565774679184,
-0.006855873391032219,
-0.005718976259231567,
-0.046295616775751114,
-0.10781211405992508,
0.006838446483016014,
-0.03791211545467377,
0.01038604136556387,
0.03527621924877167,
0.04427912458777428,
-0.07051501423120499,
-0.05313441529870033,
-0.12177079170942307,
0.17982064187526703,
0.24977509677410126,
-0.055201392620801926,
0.07412666827440262,
0.1508052796125412,
0.013774451799690723,
-0.25980278849601746,
-0.1224670261144638,
-0.13555318117141724,
0.034468501806259155,
0.05233202129602432,
-0.08360622823238373,
0.054767850786447525,
0.07732705026865005,
-0.05006527528166771,
0.05933856964111328,
-0.13082243502140045,
-0.12980754673480988,
0.12535493075847626,
0.07043465226888657,
0.40604859590530396,
-0.10674230009317398,
-0.029758578166365623,
-0.046414438635110855,
-0.1900578737258911,
0.04479750990867615,
-0.05266334488987923,
0.09347283840179443,
-0.07683085650205612,
-0.007141574751585722,
-0.015189209952950478,
-0.04988720640540123,
0.12195423990488052,
-0.12248363345861435,
0.006198791321367025,
-0.1302272528409958,
-0.13126686215400696,
-0.002189219929277897,
0.03801286593079567,
0.028280027210712433,
-0.12358980625867844,
-0.008621303364634514,
-0.10053684562444687,
-0.00522976228967309,
-0.12896795570850372,
0.1003740057349205,
-0.0028098018374294043,
-0.05996864289045334,
-0.042745791375637054,
0.10479787737131119,
-0.007742578163743019,
0.01502718310803175,
0.14952906966209412,
-0.061281993985176086,
0.12113695591688156,
0.012070873752236366,
0.13496294617652893,
-0.03831956535577774,
0.06617797166109085,
-0.10839784145355225,
-0.03566567599773407,
0.06459172070026398,
-0.1529741883277893,
-0.04814763739705086,
0.07037033140659332,
0.05798470973968506,
0.0772777572274208,
0.025874758139252663,
-0.06775984913110733,
0.11233853548765182,
0.13974124193191528,
-0.1274280697107315,
-0.09431464225053787,
-0.034768301993608475,
-0.045472025871276855,
0.08832816034555435,
-0.01549243088811636,
0.05215345695614815,
-0.08566716313362122,
-0.04576520994305611,
-0.014624744653701782,
0.008600146509706974,
-0.07037658244371414,
0.06940299272537231,
0.07035870850086212,
0.03135810047388077,
-0.1004091426730156,
0.056148868054151535,
0.03745851665735245,
-0.08524200320243835,
-0.027752049267292023,
0.10253515094518661,
-0.08517909795045853,
-0.07614928483963013,
-0.10347085446119308,
0.08062413334846497,
-0.02891913242638111,
-0.08534456789493561,
-0.09159952402114868,
-0.0886596217751503,
-0.0000926378124859184,
0.1163204237818718,
0.010701719671487808,
-0.008143343031406403,
-0.012700158171355724,
-0.05531327426433563,
0.002000612672418356,
0.04470143839716911,
0.06905147433280945,
0.012572189792990685,
-0.12098933011293411,
0.020939046517014503,
0.053221482783555984,
0.19022729992866516,
-0.04533868655562401,
-0.045419447124004364,
-0.12540653347969055,
0.032443221658468246,
-0.011063214391469955,
-0.019883785396814346,
-0.14067208766937256,
-0.019619669765233994,
-0.06991352140903473,
-0.08299888670444489,
-0.022780772298574448,
-0.042271386831998825,
-0.08238610625267029,
0.052785176783800125,
0.02282853238284588,
0.02523503638803959,
-0.07635797560214996,
-0.022321714088320732,
0.1326383799314499,
-0.008905149064958096,
0.07997338473796844,
0.07304119318723679,
-0.03160800039768219,
0.08553329110145569,
-0.15792807936668396,
-0.03980058804154396,
0.04296593368053436,
-0.004219523165374994,
0.06500669568777084,
-0.0010899956105276942,
-0.008950670249760151,
0.03152451291680336,
0.04869116470217705,
0.032062649726867676,
0.09432020783424377,
-0.1155133992433548,
-0.018734078854322433,
0.037764403969049454,
-0.0902634784579277,
-0.07841601222753525,
0.020616576075553894,
0.0603141263127327,
-0.013642512261867523,
0.14138197898864746,
-0.05113828182220459,
0.08251834660768509,
-0.10453247278928757,
0.00556524470448494,
-0.013544389978051186,
-0.17585811018943787,
0.002010647440329194,
-0.07559923827648163,
0.040731459856033325,
-0.017112158238887787,
0.08772307634353638,
0.010863282717764378,
-0.10320299863815308,
-0.009512733668088913,
0.009907292202115059,
-0.0335235521197319,
-0.0038787168450653553,
0.1652064025402069,
0.04806867241859436,
-0.023066256195306778,
-0.012017888948321342,
0.047003548592329025,
-0.006845797412097454,
0.18730361759662628,
0.10293427854776382,
0.1650373339653015,
0.1152157187461853,
0.04218808561563492,
0.04926721006631851,
0.053343016654253006,
-0.11201643943786621,
-0.1743718832731247,
-0.10014086216688156,
0.035257596522569656,
0.021919799968600273,
0.005469661671668291,
0.1987857222557068,
-0.05734160915017128,
0.032962679862976074,
-0.061772387474775314,
-0.0337723083794117,
-0.11152583360671997,
-0.23138032853603363,
-0.07505134493112564,
-0.038794342428445816,
0.04440778121352196,
-0.046326715499162674,
0.00022241259284783155,
0.11696598678827286,
0.07233240455389023,
-0.07868264615535736,
0.172632098197937,
0.02264031209051609,
-0.05309688299894333,
0.0966656506061554,
0.010715220123529434,
-0.015625255182385445,
-0.05353705957531929,
0.012722674757242203,
-0.0790225937962532,
-0.02276059426367283,
0.010446727275848389,
-0.028709841892123222,
-0.07175551354885101,
0.05070475861430168,
-0.05397353321313858,
-0.12954622507095337,
-0.035902101546525955,
0.031661417335271835,
0.10757545381784439,
0.13618680834770203,
0.010362326167523861,
0.0687049925327301,
0.009370778687298298,
0.11395962536334991,
-0.0008319724583998322,
-0.08984503895044327,
-0.10430500656366348,
0.17292077839374542,
-0.0600154772400856,
-0.03291286900639534,
-0.00522748613730073,
-0.0024354644119739532,
0.03260725364089012,
0.27647173404693604,
0.2690202593803406,
-0.04008515179157257,
0.07462530583143234,
-0.052157800644636154,
0.03466777876019478,
-0.010015278123319149,
0.06549999862909317,
0.0842503160238266,
0.14362870156764984,
-0.0944610983133316,
0.012723053805530071,
-0.07549970597028732,
-0.029078371822834015,
-0.036771662533283234,
-0.015205765143036842,
0.08019445091485977,
-0.04203205928206444,
-0.0721697211265564,
0.07698895782232285,
-0.16758178174495697,
-0.0083157392218709,
0.05253088101744652,
-0.18732395768165588,
-0.08426268398761749,
-0.055106889456510544,
0.03440326079726219,
0.1216508150100708,
0.03676708787679672,
-0.034719761461019516,
0.024641234427690506,
-0.041081223636865616,
0.07664743810892105,
-0.12823458015918732,
-0.07390280067920685,
0.09619417041540146,
0.046240661293268204,
0.06611594557762146,
-0.03136979043483734,
0.028241673484444618,
0.1034422442317009,
0.07330776751041412,
-0.02562549151480198,
0.0960056260228157,
0.030906924977898598,
-0.0008644617046229541,
-0.07319606840610504,
0.042991798371076584,
0.037777118384838104,
-0.03840995207428932,
0.04232809320092201,
0.005141787696629763,
0.06582054495811462,
-0.0072099026292562485,
0.030531320720911026,
-0.05242159962654114,
0.11483360826969147,
-0.027405915781855583,
0.14661823213100433,
0.17534027993679047,
-0.017109598964452744,
-0.06042935699224472,
-0.06583719700574875,
0.02115529030561447,
0.0043853032402694225,
-0.14633037149906158,
-0.12228674441576004,
-0.061115462332963943,
-0.0474080964922905,
-0.047011490911245346,
0.056706495583057404,
-0.2436724603176117,
0.020449988543987274,
-0.1460418701171875,
-0.05627371370792389,
0.0032866450492292643,
0.008245733566582203,
0.04335927590727806,
0.048263248056173325,
-0.007367822341620922,
-0.1349090188741684,
0.04173363745212555,
0.06609109044075012,
-0.11863397806882858,
-0.07805336266756058
] |
null | null | null |
# KoCLIP
This repository includes
## Installation
Create a virtual env and install `requirements.txt`.
```
pip install -r requirements.txt
```
For Google Cloud TPU VM please follow necessary installation steps here:
[Pytorch on TPU VM](https://cloud.google.com/tpu/docs/pytorch-xla-ug-tpu-vm)
[JAX/Flax on TPU VM](https://cloud.google.com/tpu/docs/jax-quickstart-tpu-vm)
|
{}
| null |
flax-community/koclip
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# KoCLIP
This repository includes
## Installation
Create a virtual env and install 'URL'.
For Google Cloud TPU VM please follow necessary installation steps here:
Pytorch on TPU VM
JAX/Flax on TPU VM
|
[
"# KoCLIP\n\nThis repository includes",
"## Installation\n\nCreate a virtual env and install 'URL'.\n\n\n\nFor Google Cloud TPU VM please follow necessary installation steps here:\n\nPytorch on TPU VM\nJAX/Flax on TPU VM"
] |
[
"TAGS\n#region-us \n",
"# KoCLIP\n\nThis repository includes",
"## Installation\n\nCreate a virtual env and install 'URL'.\n\n\n\nFor Google Cloud TPU VM please follow necessary installation steps here:\n\nPytorch on TPU VM\nJAX/Flax on TPU VM"
] |
[
6,
9,
43
] |
[
"passage: TAGS\n#region-us \n# KoCLIP\n\nThis repository includes## Installation\n\nCreate a virtual env and install 'URL'.\n\n\n\nFor Google Cloud TPU VM please follow necessary installation steps here:\n\nPytorch on TPU VM\nJAX/Flax on TPU VM"
] |
[
0.041662804782390594,
-0.09359000623226166,
-0.0020268948283046484,
-0.05722516402602196,
0.1927175670862198,
0.04638122394680977,
-0.049754634499549866,
0.04089243337512016,
0.04784421995282173,
-0.023972805589437485,
0.11413583159446716,
0.08137435466051102,
0.022564483806490898,
0.10270226001739502,
0.13473884761333466,
-0.2877033054828644,
-0.061535224318504333,
0.001793054398149252,
-0.043600793927907944,
0.060421861708164215,
0.05022530257701874,
-0.02703147754073143,
0.04760313779115677,
-0.09496103227138519,
-0.028366079553961754,
0.012680653482675552,
-0.05926495045423508,
0.00428064726293087,
0.04674489423632622,
0.001741598010994494,
0.00627308851107955,
-0.06341639161109924,
0.039050642400979996,
-0.042305395007133484,
0.02437211386859417,
0.06490451842546463,
-0.08989954739809036,
0.01946626417338848,
0.0475904606282711,
0.035583142191171646,
-0.023488856852054596,
-0.04338343068957329,
-0.028744466602802277,
0.02460203319787979,
0.014906466007232666,
-0.10998144745826721,
-0.014240016229450703,
0.07683736085891724,
0.05281398817896843,
0.030927861109375954,
0.06682965159416199,
0.31088966131210327,
-0.08842553943395615,
0.07476770132780075,
0.10895697772502899,
-0.1831607222557068,
-0.06267090886831284,
0.20718814432621002,
0.0018456614343449473,
0.05172402411699295,
0.08489163964986801,
0.04375586658716202,
0.06795985996723175,
0.11272171139717102,
-0.051531802862882614,
0.0094485804438591,
-0.24947591125965118,
-0.015165536664426327,
-0.06462035328149796,
-0.10448739677667618,
0.342933714389801,
0.028778530657291412,
0.06452676653862,
0.20081889629364014,
-0.06582659482955933,
0.08156164735555649,
0.0636344775557518,
0.009113308973610401,
0.07696641981601715,
0.017080988734960556,
-0.0503450445830822,
-0.13627196848392487,
-0.0995844155550003,
-0.03798780217766762,
-0.02299649827182293,
0.061703264713287354,
0.07438688725233078,
0.06110948696732521,
-0.10459718108177185,
0.11987897753715515,
-0.04035792872309685,
0.013428427278995514,
0.08632879704236984,
-0.06463970243930817,
0.12465575337409973,
-0.005021010059863329,
-0.06771674007177353,
-0.21333806216716766,
0.039362285286188126,
0.06320792436599731,
0.1262025237083435,
-0.07190459966659546,
0.11191689968109131,
0.0772409662604332,
0.010942097753286362,
0.12984685599803925,
-0.12436137348413467,
0.05031029134988785,
0.14090794324874878,
-0.17770425975322723,
0.0868012085556984,
0.029366811737418175,
-0.10232128202915192,
-0.04325488954782486,
0.02659754268825054,
0.10938737541437149,
0.06325355917215347,
0.030514702200889587,
0.04768338426947594,
-0.03193230181932449,
0.0619514100253582,
-0.0268404558300972,
-0.023893628269433975,
-0.13864651322364807,
-0.08283588290214539,
0.007531530689448118,
0.10981269925832748,
-0.02851053699851036,
-0.12371061742305756,
0.029623065143823624,
-0.02460789680480957,
0.0437789112329483,
-0.03353472054004669,
0.012719353660941124,
0.11396726965904236,
-0.07029164582490921,
0.07217343151569366,
-0.23778504133224487,
-0.07045718282461166,
0.009903508238494396,
0.0828874409198761,
0.02743116207420826,
-0.061032459139823914,
0.1062231957912445,
0.07577230036258698,
-0.07187706977128983,
-0.03479083999991417,
0.12507151067256927,
-0.014629816636443138,
0.062230274081230164,
-0.09862872213125229,
0.02793111279606819,
-0.043196696788072586,
0.03835749626159668,
-0.07082480192184448,
-0.005080176517367363,
-0.11396532505750656,
-0.08127321302890778,
-0.02570154331624508,
0.17323753237724304,
-0.11212844401597977,
0.012293536216020584,
-0.016312213614583015,
-0.022932035848498344,
0.05004256218671799,
0.04173343628644943,
-0.1443207859992981,
0.0007029925473034382,
0.17064741253852844,
-0.06982012093067169,
-0.09843142330646515,
0.058906178921461105,
0.15251411497592926,
0.028542617335915565,
-0.03963572159409523,
0.05154849588871002,
0.030544042587280273,
-0.18877634406089783,
0.023668820038437843,
0.12577667832374573,
-0.2111072838306427,
-0.18141423165798187,
0.002072405768558383,
0.006100332830101252,
-0.17683646082878113,
0.028543775901198387,
0.010047322139143944,
0.1158878430724144,
-0.05744769424200058,
0.014083798043429852,
-0.0828806459903717,
-0.04992276430130005,
-0.06297365576028824,
0.0005059736431576312,
0.010861390270292759,
0.05727487802505493,
0.007266248110681772,
-0.05420510843396187,
0.04355725646018982,
0.05779002979397774,
0.07327014952898026,
0.06194299831986427,
0.12380513548851013,
-0.05825947970151901,
0.021042397245764732,
0.0076244729571044445,
-0.13676095008850098,
0.10539993643760681,
-0.060934826731681824,
0.02622961811721325,
-0.039537541568279266,
0.025801697745919228,
-0.04032844305038452,
-0.03921965882182121,
0.01086906436830759,
0.0062123932875692844,
0.00899124052375555,
-0.01965021714568138,
0.019490059465169907,
0.019738003611564636,
0.009937175549566746,
-0.20204514265060425,
0.11569732427597046,
-0.0068045808002352715,
0.11580982059240341,
0.1774684488773346,
-0.04737497866153717,
0.04094032570719719,
0.09664712846279144,
-0.04242600500583649,
0.02763557806611061,
-0.07291653007268906,
0.05455915629863739,
0.0033209167886525393,
-0.013350105844438076,
0.11377757042646408,
0.18186232447624207,
-0.10705617070198059,
0.12655317783355713,
-0.19808275997638702,
0.014652069658041,
0.0055647655390203,
-0.10089995712041855,
-0.06796305626630783,
-0.01628456450998783,
-0.06574710458517075,
-0.04351281002163887,
0.03239916265010834,
0.05495498701930046,
-0.018401963636279106,
-0.02372405119240284,
0.01626570336520672,
-0.03014860488474369,
-0.04212433472275734,
0.02426108717918396,
0.2565305829048157,
-0.04760674014687538,
-0.01466022152453661,
0.17170310020446777,
-0.06255339831113815,
0.022763144224882126,
-0.03843439742922783,
-0.011485956609249115,
0.011570162139832973,
0.017458269372582436,
-0.02189316228032112,
0.04183466359972954,
-0.0927608385682106,
0.01590622030198574,
0.014793679118156433,
-0.02902199886739254,
0.08007120341062546,
-0.07666599005460739,
-0.053470104932785034,
-0.056211795657873154,
0.02328392118215561,
0.014277142472565174,
0.10879156738519669,
-0.07393987476825714,
-0.04074220359325409,
-0.03349043056368828,
0.04191364347934723,
0.09269602596759796,
0.013580936938524246,
-0.0016995195765048265,
0.04329719394445419,
-0.14157167077064514,
-0.20461076498031616,
-0.09747971594333649,
0.08635056763887405,
-0.012892537750303745,
-0.025019831955432892,
0.04080025106668472,
-0.13068470358848572,
0.011492477729916573,
0.05244027450680733,
0.03830336034297943,
0.012532790191471577,
0.006522719282656908,
-0.17770488560199738,
0.12064552307128906,
-0.01275093387812376,
-0.0943271666765213,
0.02662980929017067,
-0.04187557473778725,
-0.13744957745075226,
0.12366319447755814,
-0.04496930539608002,
0.0680595114827156,
0.024138182401657104,
-0.039873864501714706,
0.02573847770690918,
0.005447338335216045,
0.18945126235485077,
-0.031028732657432556,
0.12646304070949554,
0.14755040407180786,
0.14038757979869843,
0.003945331554859877,
0.08036032319068909,
0.004427116829901934,
-0.012963077053427696,
0.037406399846076965,
-0.060366079211235046,
-0.06489889323711395,
-0.366118848323822,
-0.12172254920005798,
-0.02304914779961109,
-0.023075036704540253,
0.030819380655884743,
0.09674006700515747,
0.0015860862331464887,
0.10574892163276672,
-0.018551690503954887,
0.18751323223114014,
-0.1135638952255249,
0.04838748276233673,
0.08311060816049576,
-0.07703769207000732,
-0.11363890767097473,
-0.01254196371883154,
0.06521692872047424,
0.11271900683641434,
0.17392106354236603,
0.18369866907596588,
-0.029863514006137848,
-0.06888037174940109,
0.061912745237350464,
0.12905743718147278,
0.024613702669739723,
0.1463998705148697,
0.1010049507021904,
0.05006455257534981,
-0.012885065749287605,
-0.04302292317152023,
-0.016461189836263657,
0.07895561307668686,
0.17036215960979462,
-0.13133543729782104,
0.0753493532538414,
-0.05341702699661255,
0.05886361375451088,
0.19168682396411896,
-0.047505658119916916,
0.019583631306886673,
0.03138447925448418,
-0.004382336977869272,
0.0007800601888448,
-0.019499681890010834,
0.08167146891355515,
0.07720500975847244,
-0.09773912280797958,
0.04724607989192009,
0.006807846482843161,
0.027152420952916145,
-0.00022600378724746406,
-0.06424953043460846,
0.07870636135339737,
0.13262106478214264,
0.06318053603172302,
0.08044181019067764,
-0.16129900515079498,
0.13105742633342743,
0.03273411840200424,
-0.051521554589271545,
-0.04126565530896187,
-0.01772720366716385,
-0.004714943468570709,
0.162789449095726,
0.03838052973151207,
0.1216120794415474,
0.07042969018220901,
0.05089355260133743,
-0.11454759538173676,
0.06351922452449799,
0.004888271447271109,
0.05381537973880768,
-0.13847732543945312,
0.01475068461149931,
-0.05599253624677658,
-0.06141358241438866,
0.029402514919638634,
-0.19616301357746124,
0.032318055629730225,
-0.04457724466919899,
-0.0067119295708835125,
0.0366644524037838,
-0.0452900193631649,
-0.06181329861283302,
-0.02594640664756298,
-0.052739035338163376,
0.00788839627057314,
-0.04605821892619133,
0.01328297145664692,
-0.018432751297950745,
0.08885115385055542,
-0.030852828174829483,
0.025766856968402863,
-0.045108612626791,
-0.0739566758275032,
-0.028535494580864906,
-0.05335858836770058,
0.029196402058005333,
-0.13956096768379211,
-0.03144689276814461,
-0.03956183046102524,
0.11755410581827164,
-0.11911439895629883,
0.005383835174143314,
-0.0716760903596878,
-0.0607743039727211,
-0.056371357291936874,
-0.06797974556684494,
0.09201262891292572,
0.16058528423309326,
-0.04325420781970024,
0.039229925721883774,
-0.14340811967849731,
0.11836151778697968,
0.037162795662879944,
-0.11214301735162735,
0.1546773910522461,
0.04194752871990204,
-0.02611633762717247,
-0.006765488535165787,
0.1760375052690506,
-0.03380535542964935,
-0.23847733438014984,
-0.10788625478744507,
-0.0159317497164011,
-0.14335623383522034,
0.045431289821863174,
-0.126265749335289,
0.10226399451494217,
0.07165338099002838,
0.004396118689328432,
0.09904942661523819,
-0.03382866457104683,
-0.06481749564409256,
-0.0038814498111605644,
0.039188627153635025,
0.041689056903123856,
-0.00475333584472537,
-0.02313820645213127,
0.00610456382855773,
-0.05179436132311821,
0.003660863498225808,
-0.03350268304347992,
0.09535466879606247,
0.007291462272405624,
0.04498822242021561,
-0.01703662984073162,
-0.03346773982048035,
0.00236728647723794,
-0.049823421984910965,
-0.07706383615732193,
-0.02763431705534458,
-0.07828836143016815,
0.06866267323493958,
0.008002085611224174,
0.020485367625951767,
0.08342058956623077,
0.027265582233667374,
0.04258500039577484,
0.018258176743984222,
-0.09203730523586273,
0.10716640204191208,
-0.027402309700846672,
-0.09080203622579575,
-0.02189819887280464,
0.006018851883709431,
0.005018085241317749,
-0.007182667497545481,
0.0015703222015872598,
0.10430234670639038,
-0.08943618834018707,
0.0992034524679184,
-0.08891308307647705,
-0.10749095678329468,
-0.0542014054954052,
-0.023589862510561943,
-0.07189090549945831,
0.07683767378330231,
-0.04505448788404465,
-0.03884396329522133,
0.013601348735392094,
-0.023573460057377815,
-0.06941398978233337,
-0.0062121921218931675,
-0.10600194334983826,
-0.08736055344343185,
-0.0021889815106987953,
-0.06687036156654358,
-0.058612603694200516,
-0.11706867069005966,
-0.022133447229862213,
0.1807309240102768,
0.05079729110002518,
0.051725778728723526,
-0.1401529759168625,
-0.046306099742650986,
0.007323306519538164,
0.022456906735897064,
-0.08306314051151276,
0.025497986003756523,
0.1592998504638672,
0.014904706738889217,
-0.05827663838863373,
0.14735658466815948,
0.07752089202404022,
-0.0902385488152504,
-0.04477785527706146,
0.09108303487300873,
-0.05004734918475151,
-0.14057213068008423,
0.007255892269313335,
-0.1666954606771469,
0.06690413504838943,
-0.09851247817277908,
0.05066461116075516,
0.07283398509025574,
0.0005268690874800086,
-0.04008079692721367,
0.03535430133342743,
0.02763335034251213,
0.025425000116229057,
-0.02627558261156082,
0.012669452466070652,
0.006123845465481281,
-0.021058864891529083,
0.04981378838419914,
-0.1840859353542328,
0.006959693972021341,
0.05456075817346573,
0.19394876062870026,
-0.07006896287202835,
-0.031197555363178253,
-0.05874774605035782,
0.01714997924864292,
-0.001216163276694715,
-0.0003893659159075469,
0.031766775995492935,
0.013155876658856869,
0.03518985956907272,
-0.05327886343002319,
-0.014288114383816719,
0.08813119679689407,
-0.06451598554849625,
0.005461025983095169,
0.0035267192870378494,
0.05298096686601639,
0.051434434950351715,
-0.03720114752650261,
0.05393426865339279,
0.038533903658390045,
0.06347000598907471,
-0.00866693165153265,
-0.06628172844648361,
0.034950390458106995,
-0.15774813294410706,
-0.061741746962070465,
0.025849292054772377,
0.05645769461989403,
-0.024220386520028114,
0.06059274077415466,
0.01683838665485382,
-0.036296457052230835,
-0.029680194333195686,
-0.03976849839091301,
0.0406934916973114,
-0.13331107795238495,
-0.01628192700445652,
0.02244522050023079,
-0.12780019640922546,
-0.009286874905228615,
-0.08358796685934067,
-0.0673743486404419,
0.05243087187409401,
-0.03697504848241806,
-0.023891540244221687,
0.04184969514608383,
-0.009511630050837994,
0.03499097749590874,
-0.02798745594918728,
-0.03405601903796196,
-0.10121052712202072,
0.02692638896405697,
-0.002006467431783676,
0.035864293575286865,
0.1865881085395813,
0.13128259778022766,
-0.01444524247199297,
-0.010055777616798878,
0.1219005137681961,
-0.18959394097328186,
-0.036154329776763916,
-0.059970274567604065,
-0.055071353912353516,
0.07105419784784317,
-0.17908544838428497,
-0.005246344953775406,
0.05503539368510246,
-0.13561788201332092,
-0.03209840506315231,
-0.11019979417324066,
0.055557653307914734,
0.02530459128320217,
-0.006087078712880611,
0.004182273522019386,
-0.11811623722314835,
-0.1478751301765442,
0.12243698537349701,
0.15861456096172333,
-0.13790258765220642,
0.005658690817654133,
-0.03106786124408245,
-0.021825658157467842,
-0.03767658397555351,
0.007222033571451902,
-0.015242381952702999,
-0.06027756258845329,
-0.14992547035217285,
-0.03545263037085533,
-0.22089450061321259,
0.0545373260974884,
0.0522083155810833,
-0.12189225852489471,
-0.02577284350991249,
0.033782657235860825,
-0.05763005092740059,
0.03244059905409813,
0.020083222538232803,
-0.030008966103196144,
0.03126446530222893,
0.04898890107870102,
-0.015345851890742779,
-0.04267207533121109,
-0.05659789964556694,
-0.06495384126901627,
-0.0951966941356659,
-0.08770675212144852,
0.0429120734333992,
0.007662703283131123,
0.04171331971883774,
0.027767131105065346,
0.02465810440480709,
-0.061228811740875244,
0.0489165261387825,
-0.09312581270933151,
0.06667744368314743,
0.011408202350139618,
-0.02515772357583046,
0.046409063041210175,
0.12958472967147827,
0.0015427167527377605,
-0.025128448382019997,
-0.003661704482510686,
0.04416017606854439,
-0.02273860014975071,
0.00425351969897747,
0.051943589001894,
-0.04593904688954353,
-0.10457925498485565,
0.19830958545207977,
0.1390399932861328,
0.10670485347509384,
0.021443208679556847,
0.08529137074947357,
-0.0028326900210231543,
-0.018785960972309113,
0.19628319144248962,
0.16169190406799316,
0.08249984681606293,
-0.044285569339990616,
0.0029562446288764477,
-0.017896143719553947,
-0.01869150996208191,
-0.12681527435779572,
-0.11652670055627823,
0.0004960952792316675,
-0.11018802970647812,
0.005641250405460596,
0.10104698687791824,
-0.1970391720533371,
-0.0016920140478760004,
0.05658269673585892,
-0.09810007363557816,
-0.025767594575881958,
-0.044187385588884354,
0.1723344922065735,
0.09344161301851273,
0.07205338776111603,
-0.04140210524201393,
-0.021221524104475975,
0.14396241307258606,
-0.007939821109175682,
-0.1897123157978058,
-0.08812540769577026,
0.09149882942438126,
-0.09695398807525635,
0.19598977267742157,
-0.05703752487897873,
0.10316674411296844,
-0.01877450942993164,
0.003186810528859496,
-0.06279568374156952,
0.08002553880214691,
-0.013151547871530056,
-0.11716308444738388,
0.005729816388338804,
-0.03612343221902847,
-0.06377677619457245,
-0.06335702538490295,
-0.07157821208238602,
0.023903127759695053,
-0.02802925556898117,
0.20007863640785217,
0.13932345807552338,
-0.1028013601899147,
0.15648896992206573,
-0.1388540118932724,
0.0994555726647377,
0.022441864013671875,
-0.031385187059640884,
-0.040363483130931854,
-0.19923990964889526,
0.0036393036134541035,
0.056339330971241,
-0.10039801895618439,
0.03446416184306145,
0.08336272835731506,
-0.021408330649137497,
-0.05979529395699501,
0.017993954941630363,
0.23625287413597107,
0.018719885498285294,
-0.14330366253852844,
-0.04570033773779869,
-0.10911428183317184,
0.03950606659054756,
0.0466255359351635,
-0.021436691284179688,
0.04595834016799927,
0.10327135026454926,
-0.011971342377364635,
-0.05119199678301811,
-0.1874481737613678,
-0.07142993062734604
] |
null | null |
transformers
|
# MedCLIP: Fine-tuning a CLIP model on the ROCO medical dataset
<!--  -->
<h3 align="center">
<!-- <p>MedCLIP</p> -->
<img src="./assets/logo.png" alt="huggingface-medclip" width="250" height="250">
## Summary
This repository contains the code for fine-tuning a CLIP model on the [ROCO dataset](https://github.com/razorx89/roco-dataset), a dataset made of radiology images and a caption.
This work is done as a part of the [**Flax/Jax community week**](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md#quickstart-flax-and-jax-in-transformers) organized by Hugging Face and Google.
### Demo
You can try a Streamlit demo app that uses this model on [🤗 Spaces](https://huggingface.co/spaces/kaushalya/medclip-roco). You may have to signup for 🤗 Spaces private beta to access this app (screenshot shown below).

🤗 Hub Model card: https://huggingface.co/flax-community/medclip-roco
## Dataset 🧩
Each image is accompanied by a textual caption. The caption length varies from a few characters (a single word) to 2,000 characters (multiple sentences). During preprocessing we remove all images that has a caption shorter than 10 characters.
Training set: 57,780 images with their caption.
Validation set: 7,200
Test set: 7,650
[ ] Give an example
## Installation 💽
This repo depends on the master branch of [Hugging Face - Transformers library](https://github.com/huggingface/transformers). First you need to clone the transformers repository and then install it locally (preferably inside a virtual environment) with `pip install -e ".[flax]"`.
## The Model ⚙️
You can load the pretrained model from the Hugging Face Hub with
```
from medclip.modeling_hybrid_clip import FlaxHybridCLIP
model = FlaxHybridCLIP.from_pretrained("flax-community/medclip-roco")
```
## Training
The model is trained using Flax/JAX on a cloud TPU-v3-8.
You can fine-tune a CLIP model implemented in Flax by simply running `sh run_medclip`.
This is the validation loss curve we observed when we trained the model using the `run_medclip.sh` script.

## Limitations 🚨
The current model is capable of identifying if a given radiology image is a PET scan or an ultrasound scan. However it fails at identifying a brain scan from a lung scan. ❗️This model **should not** be used in a medical setting without further evaluations❗️.
## Acknowledgements
Huge thanks to the Hugging Face 🤗 team and Google JAX/Flax team for organizing the community week and letting us use cloud compute for 2 weeks. We specially thank [@patil-suraj](https://github.com/patil-suraj) & [@patrickvonplaten](https://github.com/patrickvonplaten) for the continued support on Slack and the detailed feedback.
## TODO
[ ] Evaluation on down-stream tasks
[ ] Zero-shot learning performance
[ ] Merge the demo app
|
{}
| null |
kaushalya/medclip
|
[
"transformers",
"jax",
"tensorboard",
"hybrid-clip",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #tensorboard #hybrid-clip #endpoints_compatible #region-us
|
# MedCLIP: Fine-tuning a CLIP model on the ROCO medical dataset
<h3 align="center">
<img src="./assets/URL" alt="huggingface-medclip" width="250" height="250">
## Summary
This repository contains the code for fine-tuning a CLIP model on the ROCO dataset, a dataset made of radiology images and a caption.
This work is done as a part of the Flax/Jax community week organized by Hugging Face and Google.
### Demo
You can try a Streamlit demo app that uses this model on Spaces. You may have to signup for Spaces private beta to access this app (screenshot shown below).
!Streamlit app
Hub Model card: URL
## Dataset
Each image is accompanied by a textual caption. The caption length varies from a few characters (a single word) to 2,000 characters (multiple sentences). During preprocessing we remove all images that has a caption shorter than 10 characters.
Training set: 57,780 images with their caption.
Validation set: 7,200
Test set: 7,650
[ ] Give an example
## Installation
This repo depends on the master branch of Hugging Face - Transformers library. First you need to clone the transformers repository and then install it locally (preferably inside a virtual environment) with 'pip install -e ".[flax]"'.
## The Model ️
You can load the pretrained model from the Hugging Face Hub with
## Training
The model is trained using Flax/JAX on a cloud TPU-v3-8.
You can fine-tune a CLIP model implemented in Flax by simply running 'sh run_medclip'.
This is the validation loss curve we observed when we trained the model using the 'run_medclip.sh' script.
!Validation loss
## Limitations
The current model is capable of identifying if a given radiology image is a PET scan or an ultrasound scan. However it fails at identifying a brain scan from a lung scan. ️This model should not be used in a medical setting without further evaluations️.
## Acknowledgements
Huge thanks to the Hugging Face team and Google JAX/Flax team for organizing the community week and letting us use cloud compute for 2 weeks. We specially thank @patil-suraj & @patrickvonplaten for the continued support on Slack and the detailed feedback.
## TODO
[ ] Evaluation on down-stream tasks
[ ] Zero-shot learning performance
[ ] Merge the demo app
|
[
"# MedCLIP: Fine-tuning a CLIP model on the ROCO medical dataset\n\n\n<h3 align=\"center\">\n \n <img src=\"./assets/URL\" alt=\"huggingface-medclip\" width=\"250\" height=\"250\">",
"## Summary\nThis repository contains the code for fine-tuning a CLIP model on the ROCO dataset, a dataset made of radiology images and a caption.\nThis work is done as a part of the Flax/Jax community week organized by Hugging Face and Google.",
"### Demo\nYou can try a Streamlit demo app that uses this model on Spaces. You may have to signup for Spaces private beta to access this app (screenshot shown below).\n!Streamlit app\n\n Hub Model card: URL",
"## Dataset \n\nEach image is accompanied by a textual caption. The caption length varies from a few characters (a single word) to 2,000 characters (multiple sentences). During preprocessing we remove all images that has a caption shorter than 10 characters.\nTraining set: 57,780 images with their caption.\nValidation set: 7,200\nTest set: 7,650\n\n[ ] Give an example",
"## Installation \nThis repo depends on the master branch of Hugging Face - Transformers library. First you need to clone the transformers repository and then install it locally (preferably inside a virtual environment) with 'pip install -e \".[flax]\"'.",
"## The Model ️\nYou can load the pretrained model from the Hugging Face Hub with",
"## Training\nThe model is trained using Flax/JAX on a cloud TPU-v3-8. \nYou can fine-tune a CLIP model implemented in Flax by simply running 'sh run_medclip'.\nThis is the validation loss curve we observed when we trained the model using the 'run_medclip.sh' script.\n!Validation loss",
"## Limitations \nThe current model is capable of identifying if a given radiology image is a PET scan or an ultrasound scan. However it fails at identifying a brain scan from a lung scan. ️This model should not be used in a medical setting without further evaluations️.",
"## Acknowledgements\nHuge thanks to the Hugging Face team and Google JAX/Flax team for organizing the community week and letting us use cloud compute for 2 weeks. We specially thank @patil-suraj & @patrickvonplaten for the continued support on Slack and the detailed feedback.",
"## TODO\n[ ] Evaluation on down-stream tasks\n\n[ ] Zero-shot learning performance\n\n[ ] Merge the demo app"
] |
[
"TAGS\n#transformers #jax #tensorboard #hybrid-clip #endpoints_compatible #region-us \n",
"# MedCLIP: Fine-tuning a CLIP model on the ROCO medical dataset\n\n\n<h3 align=\"center\">\n \n <img src=\"./assets/URL\" alt=\"huggingface-medclip\" width=\"250\" height=\"250\">",
"## Summary\nThis repository contains the code for fine-tuning a CLIP model on the ROCO dataset, a dataset made of radiology images and a caption.\nThis work is done as a part of the Flax/Jax community week organized by Hugging Face and Google.",
"### Demo\nYou can try a Streamlit demo app that uses this model on Spaces. You may have to signup for Spaces private beta to access this app (screenshot shown below).\n!Streamlit app\n\n Hub Model card: URL",
"## Dataset \n\nEach image is accompanied by a textual caption. The caption length varies from a few characters (a single word) to 2,000 characters (multiple sentences). During preprocessing we remove all images that has a caption shorter than 10 characters.\nTraining set: 57,780 images with their caption.\nValidation set: 7,200\nTest set: 7,650\n\n[ ] Give an example",
"## Installation \nThis repo depends on the master branch of Hugging Face - Transformers library. First you need to clone the transformers repository and then install it locally (preferably inside a virtual environment) with 'pip install -e \".[flax]\"'.",
"## The Model ️\nYou can load the pretrained model from the Hugging Face Hub with",
"## Training\nThe model is trained using Flax/JAX on a cloud TPU-v3-8. \nYou can fine-tune a CLIP model implemented in Flax by simply running 'sh run_medclip'.\nThis is the validation loss curve we observed when we trained the model using the 'run_medclip.sh' script.\n!Validation loss",
"## Limitations \nThe current model is capable of identifying if a given radiology image is a PET scan or an ultrasound scan. However it fails at identifying a brain scan from a lung scan. ️This model should not be used in a medical setting without further evaluations️.",
"## Acknowledgements\nHuge thanks to the Hugging Face team and Google JAX/Flax team for organizing the community week and letting us use cloud compute for 2 weeks. We specially thank @patil-suraj & @patrickvonplaten for the continued support on Slack and the detailed feedback.",
"## TODO\n[ ] Evaluation on down-stream tasks\n\n[ ] Zero-shot learning performance\n\n[ ] Merge the demo app"
] |
[
29,
59,
64,
51,
87,
65,
20,
81,
60,
70,
27
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #hybrid-clip #endpoints_compatible #region-us \n# MedCLIP: Fine-tuning a CLIP model on the ROCO medical dataset\n\n\n<h3 align=\"center\">\n \n <img src=\"./assets/URL\" alt=\"huggingface-medclip\" width=\"250\" height=\"250\">## Summary\nThis repository contains the code for fine-tuning a CLIP model on the ROCO dataset, a dataset made of radiology images and a caption.\nThis work is done as a part of the Flax/Jax community week organized by Hugging Face and Google.### Demo\nYou can try a Streamlit demo app that uses this model on Spaces. You may have to signup for Spaces private beta to access this app (screenshot shown below).\n!Streamlit app\n\n Hub Model card: URL## Dataset \n\nEach image is accompanied by a textual caption. The caption length varies from a few characters (a single word) to 2,000 characters (multiple sentences). During preprocessing we remove all images that has a caption shorter than 10 characters.\nTraining set: 57,780 images with their caption.\nValidation set: 7,200\nTest set: 7,650\n\n[ ] Give an example## Installation \nThis repo depends on the master branch of Hugging Face - Transformers library. First you need to clone the transformers repository and then install it locally (preferably inside a virtual environment) with 'pip install -e \".[flax]\"'.## The Model ️\nYou can load the pretrained model from the Hugging Face Hub with## Training\nThe model is trained using Flax/JAX on a cloud TPU-v3-8. \nYou can fine-tune a CLIP model implemented in Flax by simply running 'sh run_medclip'.\nThis is the validation loss curve we observed when we trained the model using the 'run_medclip.sh' script.\n!Validation loss"
] |
[
-0.033482812345027924,
0.0325920470058918,
-0.002063774736598134,
0.010459402576088905,
0.09059197455644608,
-0.013087958097457886,
0.14650283753871918,
0.06907058507204056,
-0.07848545908927917,
0.13745732605457306,
0.021533356979489326,
-0.012877785600721836,
0.10360345244407654,
0.15055619180202484,
0.01492119487375021,
-0.2751016914844513,
0.056729674339294434,
-0.048510272055864334,
0.051867056638002396,
0.04663771763443947,
0.07976964116096497,
-0.11455494910478592,
0.06930387765169144,
0.03432394191622734,
-0.06006191670894623,
0.006647451780736446,
0.012641828507184982,
-0.06541121751070023,
0.08014839887619019,
0.10015533864498138,
0.03743503987789154,
0.03814375773072243,
0.057689640671014786,
-0.12520930171012878,
0.02077016793191433,
0.11526094377040863,
0.0046894303523004055,
0.0840577781200409,
0.12074644863605499,
0.047081585973501205,
0.19701379537582397,
-0.02549862302839756,
0.04888717085123062,
0.04994957149028778,
-0.051144979894161224,
-0.08110960572957993,
-0.17695198953151703,
0.0209770780056715,
0.09799625724554062,
0.12770098447799683,
-0.016541602090001106,
0.15415462851524353,
-0.08960460871458054,
0.04276665300130844,
0.18655093014240265,
-0.13942986726760864,
-0.040111418813467026,
0.09700025618076324,
0.13082195818424225,
0.047531843185424805,
-0.13649167120456696,
-0.004222632385790348,
0.02533823437988758,
0.020442934706807137,
0.04530390352010727,
-0.027519531548023224,
-0.07378826290369034,
-0.021057279780507088,
-0.06643281131982803,
0.009938756003975868,
0.12241136282682419,
-0.003392047481611371,
-0.053333304822444916,
-0.1642475426197052,
0.04441843181848526,
-0.012405578047037125,
-0.014780262485146523,
0.03541772440075874,
0.024881433695554733,
0.0133672459051013,
-0.045588478446006775,
-0.06542592495679855,
-0.10053547471761703,
-0.055157970637083054,
-0.06493207067251205,
-0.0436510294675827,
0.005059563089162111,
0.035852592438459396,
-0.062182214111089706,
0.06062594801187515,
-0.08871497213840485,
-0.053967464715242386,
-0.05055035650730133,
-0.06292884796857834,
-0.09764454513788223,
-0.0521259680390358,
-0.10720345377922058,
-0.17489808797836304,
0.018752237781882286,
0.17005078494548798,
-0.039121001958847046,
0.023879317566752434,
-0.08278069645166397,
-0.0002710835833568126,
0.05051908642053604,
0.13916075229644775,
-0.054413460195064545,
0.035323094576597214,
0.02488228864967823,
0.03642163425683975,
-0.020392591133713722,
-0.03297702223062515,
0.010489544831216335,
0.005586308427155018,
-0.038247376680374146,
0.013347767293453217,
0.017425354570150375,
0.06572028994560242,
0.001537621021270752,
-0.0007143224938772619,
0.08637873083353043,
-0.10634452104568481,
0.029865002259612083,
0.02291175350546837,
-0.022538505494594574,
0.13056069612503052,
0.04660356789827347,
-0.029808299615979195,
-0.07041698694229126,
0.04208635538816452,
-0.09263578057289124,
-0.013039054349064827,
-0.052665021270513535,
-0.12276141345500946,
0.028017286211252213,
-0.05093036964535713,
-0.06554370373487473,
-0.05102347210049629,
-0.016851142048835754,
-0.08871890604496002,
-0.03587618097662926,
-0.04022035747766495,
0.02128450572490692,
0.029124926775693893,
-0.01507723331451416,
-0.02677285298705101,
-0.004427191335707903,
0.0720529556274414,
-0.024008767679333687,
-0.004051806405186653,
-0.04746029153466225,
0.05030132830142975,
0.006936893332749605,
0.046544358134269714,
-0.01709769479930401,
0.0382741279900074,
-0.21333439648151398,
0.07115478813648224,
-0.060217585414648056,
0.0598190501332283,
-0.09245466440916061,
-0.031768232583999634,
-0.047773756086826324,
0.004102133680135012,
0.01379830576479435,
0.04781936854124069,
-0.15390083193778992,
-0.04046105593442917,
0.16040393710136414,
-0.11381220072507858,
-0.023673776537179947,
0.08107694983482361,
-0.03960191085934639,
-0.0009891163790598512,
0.09979455918073654,
0.04511592164635658,
0.15950250625610352,
-0.20010961592197418,
-0.0544295608997345,
0.0038239641580730677,
-0.070930115878582,
0.0001331375096924603,
0.034955646842718124,
-0.0583496168255806,
0.02942574769258499,
0.04574413225054741,
-0.0824170783162117,
0.0111430948600173,
-0.012949504889547825,
0.04158639907836914,
0.012474088929593563,
-0.01886449009180069,
-0.10036732256412506,
-0.05184685066342354,
-0.023286456242203712,
0.00046156192547641695,
-0.04002930968999863,
0.0351363867521286,
0.13378232717514038,
-0.07195919752120972,
0.044133588671684265,
-0.05203726515173912,
0.01598293147981167,
0.027862675487995148,
-0.003788175294175744,
-0.1667284071445465,
-0.018342694267630577,
0.050054632127285004,
-0.004055643454194069,
0.04306343197822571,
0.09522894024848938,
0.022893456742167473,
0.007511716801673174,
0.023520827293395996,
0.06491444259881973,
-0.01750560849905014,
-0.044464338570833206,
-0.059991318732500076,
-0.04123104736208916,
-0.0844448059797287,
-0.049527548253536224,
0.1147347167134285,
-0.14035426080226898,
-0.012961032800376415,
0.05456346645951271,
0.05711451172828674,
0.021936317905783653,
-0.043122436851263046,
0.032792434096336365,
-0.026544636115431786,
-0.016068415716290474,
-0.07820548117160797,
-0.04867861047387123,
0.04859521612524986,
-0.011684855446219444,
0.052864544093608856,
-0.1670004427433014,
-0.17944690585136414,
0.07468315958976746,
0.098871611058712,
-0.05236214026808739,
-0.13169649243354797,
-0.04092331603169441,
-0.013452215120196342,
-0.12694312632083893,
-0.0631372407078743,
0.12730363011360168,
0.10869839042425156,
0.12171656638383865,
-0.08497746288776398,
0.008343866094946861,
-0.024393893778324127,
-0.016912924125790596,
-0.03808765113353729,
0.0033857657108455896,
0.017871124669909477,
0.023247994482517242,
-0.01870853267610073,
-0.020062929019331932,
0.002747161081060767,
0.07993512600660324,
-0.021103525534272194,
-0.11098375916481018,
-0.04143110290169716,
0.009412676095962524,
0.04758588224649429,
-0.029844099655747414,
0.06167056784033775,
0.0292495209723711,
0.05994279682636261,
-0.007456507999449968,
0.03561278060078621,
-0.07690448313951492,
0.027680469676852226,
0.012541074305772781,
-0.004783944692462683,
-0.027394257485866547,
-0.003556926269084215,
0.024515246972441673,
0.0520971305668354,
0.028350500389933586,
0.08555453270673752,
-0.042016271501779556,
-0.02624211646616459,
-0.06782520562410355,
0.14768007397651672,
-0.11079417914152145,
-0.2299365997314453,
-0.1796686053276062,
-0.040386494249105453,
-0.05748167634010315,
0.0014377930201590061,
-0.028195101767778397,
-0.0654970034956932,
-0.12112999707460403,
-0.0991712212562561,
0.11591166257858276,
0.019259238615632057,
-0.0719183087348938,
-0.007260545156896114,
0.047853920608758926,
0.021660225465893745,
-0.1214558333158493,
0.009281927719712257,
0.01797657646238804,
-0.05545613542199135,
0.010463714599609375,
-0.021812323480844498,
-0.011550935916602612,
0.0447160042822361,
0.016501251608133316,
0.008792311884462833,
0.0059174480848014355,
0.16063012182712555,
-0.05686013400554657,
0.05728529393672943,
0.26524993777275085,
0.061474159359931946,
0.06178290769457817,
0.056711312383413315,
-0.0030117190908640623,
-0.04324383661150932,
0.04475393146276474,
0.05897487327456474,
-0.0023109177127480507,
-0.20811395347118378,
-0.005356897599995136,
-0.0718737244606018,
-0.020788919180631638,
0.09814199805259705,
0.010667194612324238,
0.04849395900964737,
0.0597507543861866,
-0.0587555393576622,
0.026066796854138374,
0.026463763788342476,
0.061618395149707794,
0.14072130620479584,
-0.035163428634405136,
0.0424819141626358,
-0.09246955811977386,
0.026081060990691185,
0.13440966606140137,
0.005411043763160706,
0.12964500486850739,
-0.040992621332407,
0.17187565565109253,
0.06795302033424377,
0.033840954303741455,
0.0009358858224004507,
0.12651517987251282,
-0.0020314673893153667,
-0.053118228912353516,
0.030160419642925262,
-0.06805305182933807,
-0.06458688527345657,
0.026441998779773712,
0.06273140013217926,
-0.04364469647407532,
-0.05905148759484291,
-0.005906377453356981,
-0.02748286724090576,
0.09975124150514603,
-0.02765526808798313,
-0.2188253402709961,
-0.03570611774921417,
-0.031225072219967842,
-0.06298016756772995,
-0.08277074247598648,
-0.005003478843718767,
0.19722658395767212,
-0.09628787636756897,
0.0481078177690506,
-0.051233306527137756,
0.10337300598621368,
-0.05419839546084404,
0.018640868365764618,
-0.023654505610466003,
0.18343405425548553,
-0.02752111293375492,
0.0852893814444542,
-0.13286970555782318,
0.013070054352283478,
0.0073176841251552105,
0.015628360211849213,
-0.09764030575752258,
0.0577545240521431,
0.027200166136026382,
0.04254123568534851,
0.04383794963359833,
-0.019255613908171654,
-0.191583514213562,
-0.07770257443189621,
-0.058679819107055664,
-0.01697480119764805,
0.08215469866991043,
-0.019784756004810333,
0.09927581995725632,
-0.057138536125421524,
0.00196126289665699,
-0.02892356552183628,
-0.04583721235394478,
-0.10601382702589035,
-0.2004908174276352,
0.06139984726905823,
-0.014668256975710392,
0.02588292770087719,
-0.0917033702135086,
-0.026067890226840973,
-0.0438370406627655,
0.08241048455238342,
0.11184181272983551,
-0.018543478101491928,
-0.17596329748630524,
0.12828435003757477,
0.2094266265630722,
-0.0551704540848732,
0.025538571178913116,
-0.01682201586663723,
0.17622406780719757,
-0.044321563094854355,
-0.07442554831504822,
-0.0454363077878952,
-0.04506232962012291,
-0.14576995372772217,
-0.0064100283198058605,
0.0709170326590538,
0.028477627784013748,
-0.004009704105556011,
0.01552165299654007,
-0.002920155180618167,
-0.061869923025369644,
-0.08894558250904083,
-0.09298290312290192,
0.06751395761966705,
0.0886717364192009,
-0.026769306510686874,
-0.03702487796545029,
-0.014149805530905724,
-0.04230469837784767,
0.01732654497027397,
0.022940542548894882,
0.09834139049053192,
-0.03574566915631294,
0.10323799401521683,
0.05766885727643967,
-0.004631948657333851,
-0.14794844388961792,
0.016817010939121246,
0.0750255435705185,
-0.0006530069513246417,
0.08311639726161957,
-0.15421147644519806,
0.10022488981485367,
0.06985777616500854,
0.02176203764975071,
-0.06936704367399216,
-0.2573627233505249,
-0.06943024694919586,
-0.007316368632018566,
0.05363916605710983,
-0.1483621895313263,
-0.05983476713299751,
-0.048207543790340424,
-0.0926959291100502,
-0.08168870955705643,
0.10232143104076385,
-0.07297936826944351,
0.08533790707588196,
-0.012983730994164944,
-0.01496218517422676,
0.04879940301179886,
-0.0353996604681015,
0.10319773852825165,
-0.010253834538161755,
0.12621986865997314,
-0.058782365173101425,
0.03176769241690636,
0.11852801591157913,
-0.07694545388221741,
0.13231614232063293,
-0.10005097836256027,
0.05245020613074303,
-0.07574178278446198,
-0.068893201649189,
-0.06928370893001556,
0.026304882019758224,
-0.049380987882614136,
0.04488921910524368,
-0.09471920132637024,
0.05728651583194733,
0.009773032739758492,
-0.02697497047483921,
-0.009105974808335304,
-0.0728413462638855,
0.06817606091499329,
0.181012824177742,
0.09887725114822388,
0.08335989713668823,
-0.08853185921907425,
0.011640651151537895,
0.03250880539417267,
0.04116757959127426,
-0.14070948958396912,
0.030546806752681732,
0.046011850237846375,
0.014435777440667152,
0.10546809434890747,
0.00884008128196001,
-0.1469467431306839,
-0.019186165183782578,
0.05882810056209564,
-0.03422379121184349,
-0.09830949455499649,
-0.00176500272937119,
0.12246549874544144,
-0.1376798450946808,
-0.028197135776281357,
0.08256823569536209,
0.005450895521789789,
-0.03159305825829506,
-0.012091530486941338,
0.049719639122486115,
-0.02429707907140255,
0.15947771072387695,
0.008208372630178928,
0.009025515988469124,
-0.10917308181524277,
0.05245133861899376,
0.07814732193946838,
-0.11272956430912018,
-0.00511935306712985,
0.11425989866256714,
-0.05990998074412346,
-0.08044980466365814,
0.07937181740999222,
0.05147598683834076,
0.00017740037583280355,
-0.012858265079557896,
0.013220811262726784,
-0.129837766289711,
0.03791911154985428,
0.1179126724600792,
-0.028337128460407257,
0.08061996102333069,
-0.03375468775629997,
-0.016018936410546303,
-0.10924302786588669,
0.0391666553914547,
-0.00038748650695197284,
0.013567609712481499,
-0.02202145755290985,
0.1199534684419632,
0.009899813681840897,
0.025551531463861465,
-0.004385348875075579,
-0.025304384529590607,
-0.0812227800488472,
-0.039452988654375076,
-0.03684445843100548,
0.027732091024518013,
-0.0454278327524662,
-0.018362704664468765,
0.013655567541718483,
0.06546041369438171,
0.012783415615558624,
0.016140984371304512,
-0.06500750035047531,
-0.06486428529024124,
-0.07066835463047028,
0.12906478345394135,
-0.09573250263929367,
0.049622587859630585,
0.06451573222875595,
-0.09214811772108078,
0.09220701456069946,
-0.00982766505330801,
-0.020619358867406845,
-0.0048973276279866695,
-0.028479857370257378,
-0.03822283074259758,
-0.007276064716279507,
0.04568355903029442,
-0.01318348292261362,
-0.06140240281820297,
0.026667216792702675,
-0.0084079559892416,
0.00079679413465783,
-0.03364737704396248,
0.035379961133003235,
-0.11138526350259781,
0.00017752163694240153,
-0.03153565526008606,
-0.025099676102399826,
-0.06355839222669601,
0.06344027072191238,
0.10535972565412521,
0.005251326132565737,
0.10200873762369156,
-0.027952391654253006,
0.03954228386282921,
-0.15407273173332214,
-0.013776984065771103,
-0.017831584438681602,
-0.059652410447597504,
0.036445073783397675,
-0.06892263144254684,
0.08205683529376984,
0.054835788905620575,
0.10339425504207611,
0.0023519350215792656,
-0.011953205801546574,
-0.01433226466178894,
0.13935713469982147,
-0.028166700154542923,
0.03048960492014885,
0.17895932495594025,
-0.0341746024787426,
0.008571622893214226,
0.04127378389239311,
-0.013044551946222782,
0.012392216362059116,
0.003290190827101469,
0.1367366462945938,
0.12429549545049667,
-0.00388303748331964,
0.010667077265679836,
0.06496372073888779,
-0.046157289296388626,
0.024121908470988274,
0.09398503601551056,
-0.06340828537940979,
0.035375550389289856,
-0.11018750071525574,
-0.02610062249004841,
0.10555250942707062,
-0.11539998650550842,
0.12630262970924377,
0.03872359171509743,
-0.03339928016066551,
-0.048092857003211975,
-0.14882756769657135,
-0.011154012754559517,
-0.10218508541584015,
0.008326455019414425,
-0.11971443891525269,
0.06908217817544937,
0.13839200139045715,
0.021100496873259544,
-0.018290018662810326,
0.10035848617553711,
-0.13568015396595,
-0.14421655237674713,
0.05677236244082451,
0.0019842961337417364,
-0.01470088679343462,
0.0003709605662152171,
0.022070955485105515,
0.10842115432024002,
0.01546936109662056,
0.0839647501707077,
0.05485518276691437,
0.1631474643945694,
0.049494124948978424,
-0.03846733272075653,
-0.07549944519996643,
0.029383519664406776,
-0.06054295599460602,
-0.02156493254005909,
0.16972702741622925,
0.011534503661096096,
-0.020734110847115517,
-0.05798538774251938,
0.1595080941915512,
-0.05133536830544472,
-0.05421637371182442,
-0.18285617232322693,
0.10857144743204117,
-0.019599277526140213,
-0.04076141491532326,
-0.054036013782024384,
-0.07764969766139984,
-0.010183537378907204,
0.2930336594581604,
0.02082006074488163,
-0.007123175542801619,
0.006284912582486868,
0.02729356475174427,
-0.006659545470029116,
-0.009587317705154419,
0.16732068359851837,
0.0032546021975576878,
0.05419525131583214,
0.016910014674067497,
0.061924368143081665,
0.017369389533996582,
-0.04620503634214401,
-0.09783754497766495,
0.047208067029714584,
-0.07750469446182251,
-0.02863210253417492,
-0.009670068509876728,
0.04320523515343666,
0.03754901885986328,
-0.15908251702785492,
0.03388573229312897,
-0.021026475355029106,
-0.054417386651039124,
-0.05603473260998726,
0.040295157581567764,
0.01831968128681183,
0.07419132441282272,
0.007823280990123749,
0.037434060126543045,
0.2379927784204483,
-0.0592677965760231,
-0.05761026218533516,
-0.048277467489242554,
0.09875991195440292,
-0.1164943128824234,
0.09992536902427673,
-0.0034187245182693005,
-0.018563052639365196,
0.08369500190019608,
-0.009007900021970272,
-0.0916764885187149,
0.09632151573896408,
-0.027961911633610725,
-0.0537356436252594,
0.030535487458109856,
0.11366278678178787,
0.013709218241274357,
0.03862491250038147,
-0.0044222124852240086,
-0.13470278680324554,
0.05825137346982956,
-0.04159579426050186,
0.05066465586423874,
-0.08201282471418381,
0.08828761428594589,
-0.08588463813066483,
0.1577717661857605,
0.12824895977973938,
-0.016081519424915314,
-0.0302035640925169,
-0.03415790945291519,
0.01722639612853527,
-0.003903597127646208,
0.11476660519838333,
-0.05447860434651375,
-0.11150511354207993,
-0.008257324807345867,
-0.11231419444084167,
0.0024830729234963655,
-0.17826443910598755,
-0.008594640530645847,
-0.006206701043993235,
-0.07004989683628082,
0.00037474968121387064,
0.09435325115919113,
0.04375556483864784,
0.022486789152026176,
-0.01130574569106102,
-0.06715533137321472,
0.03988447040319443,
0.09728671610355377,
-0.07176830619573593,
-0.01565883681178093
] |
null | null |
transformers
|
# MedCLIP
## Model description
## Intended uses & limitations
#### How to use
```python
# You can include sample code which will be formatted
```
#### Limitations and bias
Provide examples of latent issues and potential remediations.
## Training data
Describe the data you used to train the model.
If you initialized it with pre-trained weights, add a link to the pre-trained model card or repository with description of the pre-training data.
## Training procedure
Preprocessing, hardware used, hyperparameters...
## Eval results
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020}
}
```
|
{"language": ["en"], "license": "apache-2.0", "tags": ["vision"]}
| null |
flax-community/medclip
|
[
"transformers",
"jax",
"tensorboard",
"hybrid-clip",
"vision",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #jax #tensorboard #hybrid-clip #vision #en #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# MedCLIP
## Model description
## Intended uses & limitations
#### How to use
#### Limitations and bias
Provide examples of latent issues and potential remediations.
## Training data
Describe the data you used to train the model.
If you initialized it with pre-trained weights, add a link to the pre-trained model card or repository with description of the pre-training data.
## Training procedure
Preprocessing, hardware used, hyperparameters...
## Eval results
### BibTeX entry and citation info
|
[
"# MedCLIP",
"## Model description",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\nProvide examples of latent issues and potential remediations.",
"## Training data\n\nDescribe the data you used to train the model.\nIf you initialized it with pre-trained weights, add a link to the pre-trained model card or repository with description of the pre-training data.",
"## Training procedure\n\nPreprocessing, hardware used, hyperparameters...",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #jax #tensorboard #hybrid-clip #vision #en #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# MedCLIP",
"## Model description",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\nProvide examples of latent issues and potential remediations.",
"## Training data\n\nDescribe the data you used to train the model.\nIf you initialized it with pre-trained weights, add a link to the pre-trained model card or repository with description of the pre-training data.",
"## Training procedure\n\nPreprocessing, hardware used, hyperparameters...",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
45,
4,
3,
9,
5,
20,
51,
15,
4,
11
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #hybrid-clip #vision #en #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# MedCLIP## Model description## Intended uses & limitations#### How to use#### Limitations and bias\n\nProvide examples of latent issues and potential remediations.## Training data\n\nDescribe the data you used to train the model.\nIf you initialized it with pre-trained weights, add a link to the pre-trained model card or repository with description of the pre-training data.## Training procedure\n\nPreprocessing, hardware used, hyperparameters...## Eval results### BibTeX entry and citation info"
] |
[
-0.0718659833073616,
0.058369386941194534,
0.001098225126042962,
0.007039301563054323,
0.12147726118564606,
0.019511019811034203,
0.12168373912572861,
0.09917633980512619,
-0.06581099331378937,
0.01953383907675743,
0.1720517873764038,
0.07764257490634918,
0.053558994084596634,
0.19236406683921814,
-0.0067460504360497,
-0.24716177582740784,
0.014793289825320244,
0.07845630496740341,
-0.03588642179965973,
0.0813489779829979,
0.09909967333078384,
-0.12453053891658783,
0.07505768537521362,
0.02082182466983795,
-0.16637179255485535,
0.010031143203377724,
-0.04223182797431946,
0.0010074112797155976,
0.07358082383871078,
0.0009939261944964528,
0.121147021651268,
0.03778175264596939,
0.10032375156879425,
-0.21640966832637787,
0.033004723489284515,
0.075027696788311,
-0.014584640972316265,
0.09010481834411621,
0.042310405522584915,
-0.03325097635388374,
0.11885638535022736,
-0.06400205940008163,
0.10765751451253891,
0.00805344432592392,
-0.10227414965629578,
-0.2116951197385788,
-0.05654849857091904,
0.024085234850645065,
0.12388650327920914,
0.058536432683467865,
0.0008192634559236467,
0.13742266595363617,
-0.07005352526903152,
0.04459170624613762,
0.12090931832790375,
-0.15009768307209015,
-0.0658300444483757,
0.1075272262096405,
0.11596830189228058,
0.03990057855844498,
-0.08309777826070786,
0.013417443260550499,
0.10139687359333038,
0.046198856085538864,
0.12212739139795303,
-0.04100065678358078,
-0.024331923574209213,
0.040580593049526215,
-0.15147870779037476,
-0.03669073432683945,
0.18518604338169098,
-0.026048028841614723,
-0.016097361221909523,
-0.09218728542327881,
0.0001268550258828327,
-0.0885857343673706,
-0.000531466503161937,
-0.06342960149049759,
0.06322251260280609,
-0.020952334627509117,
0.012537911534309387,
-0.08526603132486343,
-0.10703560709953308,
-0.1382686048746109,
0.019361315295100212,
-0.03273063525557518,
0.028999876230955124,
0.04032832384109497,
-0.07768040895462036,
0.1081521138548851,
0.06794965267181396,
-0.1050102487206459,
-0.012607146985828876,
-0.052208710461854935,
-0.04665880650281906,
-0.08938471227884293,
-0.04894423484802246,
-0.023543711751699448,
0.013965391553938389,
0.16611182689666748,
0.07428573817014694,
0.03297807276248932,
0.02638695389032364,
0.08204184472560883,
0.02319532446563244,
-0.02095222659409046,
-0.09108421206474304,
-0.0020876163616776466,
-0.017016591504216194,
0.04405450448393822,
-0.041304320096969604,
-0.02637675404548645,
-0.13371941447257996,
0.003923031967133284,
0.048154644668102264,
0.047453053295612335,
-0.044098906219005585,
0.030698759481310844,
-0.030463840812444687,
-0.052908167243003845,
0.011700891889631748,
-0.09278172999620438,
-0.01919684000313282,
-0.059222906827926636,
-0.06465891003608704,
-0.11562719196081161,
0.1259017139673233,
0.01542610302567482,
-0.027978943660855293,
0.06926314532756805,
-0.08003958314657211,
0.00426867650821805,
-0.18456579744815826,
-0.08904106914997101,
-0.011762072332203388,
-0.12772096693515778,
0.03747571259737015,
-0.04688141494989395,
-0.2128184735774994,
-0.07241935282945633,
0.13533659279346466,
-0.06419684737920761,
0.004358861595392227,
-0.011059442535042763,
-0.010109779424965382,
-0.006798355840146542,
-0.047598157078027725,
0.17108500003814697,
-0.047576092183589935,
0.018407529219985008,
-0.05657315254211426,
0.10665170103311539,
-0.05202934145927429,
0.05082333832979202,
-0.10360430181026459,
0.05983063206076622,
-0.08957237005233765,
0.04399879276752472,
-0.08281248062849045,
0.054680295288562775,
-0.15087635815143585,
-0.06129083409905434,
-0.07656161487102509,
0.029595570638775826,
0.0613384023308754,
0.0712180882692337,
-0.22741267085075378,
-0.01941603608429432,
0.0822518914937973,
-0.10130035132169724,
-0.17137424647808075,
0.05615382641553879,
-0.09518852084875107,
0.20305638015270233,
0.019215578213334084,
0.18028609454631805,
0.05339469388127327,
-0.09244826436042786,
0.09350384026765823,
0.020159950479865074,
-0.003158405190333724,
-0.04808010533452034,
0.05968098342418671,
0.019674081355333328,
0.013203168287873268,
-0.03769061714410782,
-0.029239371418952942,
0.001771291485056281,
-0.07088995724916458,
-0.05487244203686714,
0.01782194711267948,
-0.09875012934207916,
-0.06872693449258804,
0.032409146428108215,
0.06019735708832741,
0.018636737018823624,
-0.023680375888943672,
0.15349046885967255,
0.08091235905885696,
-0.0346522182226181,
-0.006762095261365175,
-0.0828324481844902,
-0.010935373604297638,
-0.14924632012844086,
-0.026787616312503815,
-0.18177159130573273,
0.04456896334886551,
-0.03614579141139984,
-0.01972382143139839,
0.042362846434116364,
0.19336487352848053,
0.06943868100643158,
0.020243503153324127,
-0.008784246630966663,
0.054116614162921906,
0.014714268036186695,
0.033452268689870834,
-0.10987363755702972,
-0.23209291696548462,
-0.04562303051352501,
-0.09513973444700241,
0.055985335260629654,
-0.19796733558177948,
0.03359862416982651,
0.03298060968518257,
0.10441066324710846,
0.05021681264042854,
-0.029021088033914566,
0.00419264193624258,
0.017926577478647232,
-0.01533149741590023,
-0.05879119783639908,
0.0516706183552742,
-0.008645305410027504,
-0.03929044306278229,
0.08992233872413635,
-0.1176900789141655,
0.06345807760953903,
0.14560319483280182,
-0.1300453245639801,
-0.09281961619853973,
-0.09533079713582993,
-0.02952505648136139,
-0.04354604706168175,
-0.03957008942961693,
0.043100982904434204,
0.11609388142824173,
-0.034064881503582,
0.14431941509246826,
-0.07489671558141708,
-0.04974483326077461,
-0.024208227172493935,
-0.010776733048260212,
-0.02292264997959137,
0.06529350578784943,
0.1731918901205063,
-0.14081971347332,
0.0967358723282814,
0.09378720819950104,
-0.08455265313386917,
0.11205447465181351,
0.016139866784214973,
-0.1024041622877121,
0.0031168414279818535,
-0.06542138755321503,
0.0048538451083004475,
0.18632107973098755,
-0.1738552749156952,
-0.049498118460178375,
0.05284993723034859,
-0.0008392107556574047,
0.09979324042797089,
-0.1764458566904068,
-0.024144530296325684,
0.04235393926501274,
-0.0033060701098293066,
-0.00829155184328556,
-0.0024874182417988777,
-0.05255965143442154,
0.07469113916158676,
0.031975436955690384,
0.041607730090618134,
0.038107987493276596,
-0.05752222239971161,
-0.14456425607204437,
0.1836799681186676,
-0.05101385340094566,
-0.11150859296321869,
-0.08027901500463486,
-0.03206164762377739,
0.10753942281007767,
0.038787417113780975,
0.005665737669914961,
-0.03141408413648605,
-0.048371318727731705,
-0.05527455359697342,
0.02039746195077896,
-0.08966416865587234,
-0.014664699323475361,
-0.03059702180325985,
0.0425693579018116,
0.008604245260357857,
-0.08010116219520569,
0.022621529176831245,
-0.03269834816455841,
0.009765319526195526,
-0.008134382776916027,
-0.028063185513019562,
0.0988897904753685,
0.1491011083126068,
-0.013118787668645382,
0.03617815300822258,
-0.025504719465970993,
0.22471438348293304,
-0.11127080768346786,
-0.03900328651070595,
0.13356731832027435,
-0.05509847402572632,
-0.004519446287304163,
0.1435319483280182,
0.06166736036539078,
-0.09049782902002335,
0.034686967730522156,
0.04623686149716377,
-0.06892827153205872,
-0.1895706206560135,
-0.03345621004700661,
-0.0776522234082222,
-0.08561216294765472,
0.11676941066980362,
0.0620025098323822,
0.0727139264345169,
0.13536953926086426,
0.015046682208776474,
0.07938309013843536,
-0.014618261717259884,
0.09684920310974121,
-0.02456594444811344,
0.013805042020976543,
0.08859071880578995,
-0.06820143759250641,
0.0074988058768212795,
0.04107251018285751,
0.056119680404663086,
0.30715110898017883,
0.018868589773774147,
0.09749763458967209,
0.1526700109243393,
0.033022671937942505,
0.044058460742235184,
0.04345085099339485,
-0.019203392788767815,
-0.008439440280199051,
-0.02736690826714039,
-0.03760039061307907,
-0.06655063480138779,
0.06942412257194519,
-0.07565140724182129,
0.026336951181292534,
-0.044199176132678986,
-0.05908066779375076,
-0.012299747206270695,
0.17168118059635162,
0.015521923080086708,
-0.33365389704704285,
-0.003504921682178974,
0.027186088263988495,
-0.0316927544772625,
-0.05337998643517494,
0.028564628213644028,
0.07784125953912735,
-0.08849406987428665,
0.03017161414027214,
-0.0795331597328186,
0.1359342783689499,
-0.09478112310171127,
-0.0005603162571787834,
0.09167103469371796,
0.007940740324556828,
-0.04627639800310135,
0.12968280911445618,
-0.3182365298271179,
0.22939178347587585,
0.010177877731621265,
0.06394579261541367,
-0.10753534734249115,
0.01086669135838747,
-0.00029968158924020827,
0.09901944547891617,
0.14709706604480743,
-0.058026254177093506,
-0.03682120889425278,
-0.09270323812961578,
0.04978413134813309,
-0.016233522444963455,
0.024917883798480034,
-0.05017339438199997,
0.026088757440447807,
-0.000009774395039130468,
0.04473495855927467,
0.013059419579803944,
-0.07571078836917877,
-0.17581897974014282,
-0.18084879219532013,
0.02839169092476368,
-0.05449426919221878,
-0.03130210191011429,
-0.061166342347860336,
-0.05921018868684769,
0.02674832195043564,
0.12399972975254059,
-0.04353469982743263,
-0.016976943239569664,
-0.10115708410739899,
0.0571628138422966,
0.08281712979078293,
-0.0037240597885102034,
0.04723738506436348,
0.02247060276567936,
0.10607578605413437,
-0.04344181343913078,
-0.06229361891746521,
0.07771899551153183,
-0.1284935474395752,
-0.06215579807758331,
-0.0665457621216774,
0.027042021974921227,
0.09991999715566635,
0.03773125633597374,
0.016735028475522995,
0.019908731803297997,
-0.09299104660749435,
-0.08414380997419357,
0.04585948958992958,
0.015582479536533356,
0.16862823069095612,
0.006919683888554573,
-0.026575325056910515,
0.027470016852021217,
0.014276402071118355,
-0.020663583651185036,
0.05311194807291031,
0.14628608524799347,
-0.07094337046146393,
0.0987282544374466,
0.07517215609550476,
-0.12558303773403168,
-0.25708529353141785,
0.14158989489078522,
0.04243055358529091,
0.021138863638043404,
0.0041711959056556225,
-0.1817018836736679,
0.07068884372711182,
0.029522377997636795,
-0.03977585211396217,
0.05921544134616852,
-0.20903028547763824,
-0.14045576751232147,
0.11483573168516159,
0.14786744117736816,
0.005185524467378855,
-0.15852046012878418,
-0.040307264775037766,
-0.08312863856554031,
-0.1300981342792511,
0.14374203979969025,
-0.15583105385303497,
0.03780517354607582,
-0.007126177195459604,
-0.024386128410696983,
-0.0075247157365083694,
-0.07890567183494568,
0.18797482550144196,
-0.004568225238472223,
0.10794592648744583,
-0.1111123338341713,
-0.0009633074514567852,
0.06761572510004044,
-0.054934997111558914,
0.050881270319223404,
0.010823378339409828,
0.0009651227737776935,
-0.15613926947116852,
-0.07869121432304382,
-0.013276888988912106,
0.0099967485293746,
-0.019252164289355278,
-0.092364601790905,
-0.07903532683849335,
0.08820517361164093,
0.0021740747615695,
-0.018455447629094124,
0.06483371555805206,
-0.0963495522737503,
0.03408960998058319,
0.08583946526050568,
0.228269562125206,
-0.02642994560301304,
-0.020035436376929283,
0.043409544974565506,
-0.02633729949593544,
0.1023109182715416,
-0.19684812426567078,
0.020976023748517036,
0.08510371297597885,
0.01338191982358694,
0.13163483142852783,
0.06614625453948975,
-0.06548842787742615,
0.010720187798142433,
0.07644118368625641,
-0.06927036494016647,
-0.11214856803417206,
0.01158024650067091,
0.06411152333021164,
-0.1187402531504631,
0.00815320573747158,
0.008724675513803959,
-0.08412794768810272,
0.01095407735556364,
0.02755819819867611,
0.031232506036758423,
-0.08346104621887207,
0.17618049681186676,
0.12000815570354462,
0.06505368649959564,
-0.08123061805963516,
0.07740233838558197,
0.0599374882876873,
-0.13063035905361176,
0.028344450518488884,
0.009913106448948383,
-0.09954892843961716,
-0.056252915412187576,
0.05278879404067993,
0.23938460648059845,
-0.015476120635867119,
-0.09285306930541992,
-0.03183607757091522,
-0.08129917830228806,
0.050061482936143875,
0.1536465287208557,
0.09491267800331116,
-0.017048202455043793,
-0.009839251637458801,
0.03015122003853321,
-0.1352170705795288,
0.10062943398952484,
-0.006339611951261759,
0.05649809166789055,
-0.07238402962684631,
0.09013828635215759,
0.028337614610791206,
0.0671834647655487,
-0.08194668591022491,
0.03859494999051094,
-0.1070636436343193,
0.03189843147993088,
-0.11203520745038986,
0.07765320688486099,
-0.10404176265001297,
-0.024295266717672348,
0.02553507871925831,
-0.03507257625460625,
-0.05826668441295624,
0.006430938374251127,
-0.10718391090631485,
0.020583119243383408,
0.01649937406182289,
0.02915075607597828,
-0.07074064761400223,
-0.01820782758295536,
0.04932723566889763,
-0.024709012359380722,
0.03744285926222801,
-0.049829550087451935,
0.01740640588104725,
-0.02753964066505432,
-0.11025913059711456,
-0.0007042646175250411,
0.010840724222362041,
0.013089359737932682,
0.018226923421025276,
-0.05673421174287796,
-0.023229504004120827,
-0.016328077763319016,
-0.008891853503882885,
-0.00036510542850010097,
0.016137121245265007,
-0.11446890980005264,
-0.009547362104058266,
-0.019435783848166466,
-0.007862449623644352,
-0.014685357920825481,
0.10131165385246277,
0.06546558439731598,
0.05512222275137901,
0.09313631057739258,
-0.033176008611917496,
0.05565213784575462,
-0.1494956910610199,
-0.0033908160403370857,
-0.009605650790035725,
-0.0009990761755034328,
-0.02822274900972843,
-0.06378824263811111,
0.05680936574935913,
-0.07088007777929306,
0.2731419801712036,
0.08508104085922241,
0.0020698662847280502,
0.048559270799160004,
0.0226266011595726,
0.08625991642475128,
0.04364980757236481,
0.2019510269165039,
0.0016814982518553734,
0.05112177133560181,
-0.025627240538597107,
0.11733699589967728,
0.01768023520708084,
0.03626687079668045,
0.13011771440505981,
0.16227157413959503,
0.030947698280215263,
0.06389564275741577,
-0.027491150423884392,
-0.08721322566270828,
-0.01592993550002575,
0.13185973465442657,
0.07919622212648392,
0.026576200500130653,
-0.03346789628267288,
0.06012590602040291,
0.18841461837291718,
-0.14476098120212555,
0.03337406367063522,
0.017601467669010162,
-0.06768062710762024,
-0.1303100287914276,
-0.12035072594881058,
-0.016256175935268402,
-0.09556683152914047,
-0.01517345942556858,
-0.10331375896930695,
-0.024586742743849754,
0.17881107330322266,
0.04128407686948776,
0.010384776629507542,
0.16287046670913696,
-0.03962405025959015,
0.024476487189531326,
-0.04736661538481712,
-0.016772162169218063,
0.03886346146464348,
-0.1452895998954773,
-0.02306094393134117,
0.031104635447263718,
-0.05671914294362068,
0.01045937929302454,
-0.07199469208717346,
0.009408175945281982,
0.03072349540889263,
-0.0063726711086928844,
-0.03394480422139168,
0.020870015025138855,
0.030501076951622963,
0.07375004887580872,
0.14510518312454224,
-0.0011815266916528344,
-0.02589123323559761,
-0.009997056797146797,
0.22048527002334595,
-0.06282593309879303,
-0.047118715941905975,
-0.14564655721187592,
0.07619704306125641,
0.004243514034897089,
0.008746911771595478,
-0.030665947124361992,
-0.07093300670385361,
0.06296290457248688,
0.24758464097976685,
0.23787228763103485,
-0.06218118220567703,
-0.016155945137143135,
0.06594835966825485,
0.0038708550855517387,
0.012661699205636978,
0.06385082006454468,
0.06036069989204407,
0.07615356892347336,
-0.10664548724889755,
-0.05649218335747719,
-0.031466733664274216,
-0.03192245215177536,
-0.034285832196474075,
0.02032836712896824,
0.0997573733329773,
-0.032294999808073044,
-0.016167227178812027,
0.09012647718191147,
-0.03991176187992096,
-0.16631890833377838,
0.06797080487012863,
-0.07958122342824936,
-0.09946038573980331,
-0.05204959213733673,
0.04364058002829552,
-0.03269519656896591,
-0.019859937950968742,
-0.013825321570038795,
0.003546600928530097,
0.07886704802513123,
0.01978854276239872,
-0.12785159051418304,
-0.10682421922683716,
0.10974777489900589,
0.02381674014031887,
0.20441117882728577,
-0.09517575055360794,
0.0834624245762825,
0.08781679719686508,
0.0012710016453638673,
-0.0680076852440834,
0.008000445552170277,
0.020865745842456818,
-0.018109705299139023,
0.06819960474967957,
0.08947434276342392,
-0.024925975129008293,
0.03803829476237297,
0.014322647824883461,
-0.21121098101139069,
0.013940293341875076,
-0.17065195739269257,
-0.0781947448849678,
-0.08511999249458313,
0.09018911421298981,
-0.0750068873167038,
0.14379417896270752,
0.096685990691185,
-0.040349557995796204,
0.004384305793792009,
-0.03365760296583176,
0.07201364636421204,
0.002752614440396428,
0.08834251761436462,
-0.0395585298538208,
-0.07526350021362305,
0.01384209655225277,
-0.07436756789684296,
-0.006173110567033291,
-0.2615044414997101,
-0.03611418977379799,
0.0022393062245100737,
-0.08307065069675446,
-0.02700958587229252,
0.053183991461992264,
0.12704125046730042,
0.07636258006095886,
-0.11029613763093948,
-0.11113239824771881,
0.01397367101162672,
0.08690895885229111,
-0.15442891418933868,
-0.05837573483586311
] |
null | null |
transformers
|
# Mongolian GPT2
Goal is to create a strong language generation model for Mongolian
Since initial code and data is pretty much written by @patrickvonplaten and other huggingface members, it should not be so hard to get the first sense.
## Model
Randomly initialized GPT2 model
## Datasets
We can use OSCAR which is available through datasets
## Datasets
A causal language modeling script for Flax is available here 1. It can be used pretty much without any required code changes.
If there is time left, I’d love to try some private crawling and integrate it datasets.
## Expected Outcome
Understandable Mongolian text generation model
## Challenges
Lack of data → OSCAR Mongolian is just 2.2G. Maybe we need to research ways to acquire more data with this.
|
{"language": "mn", "tags": ["gpt2"], "datasets": ["oscar"], "thumbnail": "https://avatars.githubusercontent.com/u/43239645?s=60&v=4"}
|
text-generation
|
flax-community/mongolian-gpt2
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"mn",
"dataset:oscar",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mn"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #mn #dataset-oscar #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Mongolian GPT2
Goal is to create a strong language generation model for Mongolian
Since initial code and data is pretty much written by @patrickvonplaten and other huggingface members, it should not be so hard to get the first sense.
## Model
Randomly initialized GPT2 model
## Datasets
We can use OSCAR which is available through datasets
## Datasets
A causal language modeling script for Flax is available here 1. It can be used pretty much without any required code changes.
If there is time left, I’d love to try some private crawling and integrate it datasets.
## Expected Outcome
Understandable Mongolian text generation model
## Challenges
Lack of data → OSCAR Mongolian is just 2.2G. Maybe we need to research ways to acquire more data with this.
|
[
"# Mongolian GPT2\n\nGoal is to create a strong language generation model for Mongolian\nSince initial code and data is pretty much written by @patrickvonplaten and other huggingface members, it should not be so hard to get the first sense.",
"## Model\nRandomly initialized GPT2 model",
"## Datasets\nWe can use OSCAR which is available through datasets",
"## Datasets\nA causal language modeling script for Flax is available here 1. It can be used pretty much without any required code changes.\nIf there is time left, I’d love to try some private crawling and integrate it datasets.",
"## Expected Outcome\nUnderstandable Mongolian text generation model",
"## Challenges\nLack of data → OSCAR Mongolian is just 2.2G. Maybe we need to research ways to acquire more data with this."
] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #mn #dataset-oscar #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Mongolian GPT2\n\nGoal is to create a strong language generation model for Mongolian\nSince initial code and data is pretty much written by @patrickvonplaten and other huggingface members, it should not be so hard to get the first sense.",
"## Model\nRandomly initialized GPT2 model",
"## Datasets\nWe can use OSCAR which is available through datasets",
"## Datasets\nA causal language modeling script for Flax is available here 1. It can be used pretty much without any required code changes.\nIf there is time left, I’d love to try some private crawling and integrate it datasets.",
"## Expected Outcome\nUnderstandable Mongolian text generation model",
"## Challenges\nLack of data → OSCAR Mongolian is just 2.2G. Maybe we need to research ways to acquire more data with this."
] |
[
58,
54,
11,
16,
54,
14,
30
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #mn #dataset-oscar #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Mongolian GPT2\n\nGoal is to create a strong language generation model for Mongolian\nSince initial code and data is pretty much written by @patrickvonplaten and other huggingface members, it should not be so hard to get the first sense.## Model\nRandomly initialized GPT2 model## Datasets\nWe can use OSCAR which is available through datasets## Datasets\nA causal language modeling script for Flax is available here 1. It can be used pretty much without any required code changes.\nIf there is time left, I’d love to try some private crawling and integrate it datasets.## Expected Outcome\nUnderstandable Mongolian text generation model## Challenges\nLack of data → OSCAR Mongolian is just 2.2G. Maybe we need to research ways to acquire more data with this."
] |
[
0.02970312163233757,
0.170479878783226,
-0.0004560656670946628,
0.10523741692304611,
0.10799328982830048,
-0.026949115097522736,
0.0953468531370163,
0.1293715536594391,
0.08263001590967178,
0.020444218069314957,
0.13402925431728363,
-0.07279140502214432,
0.055543310940265656,
0.2128954976797104,
0.03630748763680458,
-0.26039689779281616,
-0.006076205521821976,
-0.03431237116456032,
0.05994800850749016,
0.03253939002752304,
0.09800420701503754,
-0.021829232573509216,
0.05998687446117401,
-0.04630374535918236,
-0.06944547593593597,
0.047100670635700226,
-0.029306041076779366,
-0.11004418134689331,
0.05013292655348778,
0.07031887769699097,
-0.03850826621055603,
0.0276934877038002,
0.008862118236720562,
-0.01838628016412258,
0.010940403677523136,
-0.08295603841543198,
-0.018020344898104668,
-0.0016677777748554945,
0.1214762032032013,
-0.06623180210590363,
0.21465256810188293,
-0.0542660616338253,
-0.047021299600601196,
0.07429854571819305,
-0.0439474917948246,
-0.1107664704322815,
-0.04095876216888428,
0.0169560294598341,
-0.02103610709309578,
0.08221495151519775,
-0.06403384357690811,
0.08033587783575058,
-0.052804287523031235,
0.01893605850636959,
0.06756410002708435,
-0.35699814558029175,
-0.03457118943333626,
0.19102634489536285,
0.09134401381015778,
0.07261092960834503,
-0.039110392332077026,
0.11702945828437805,
0.09044240415096283,
0.005813379772007465,
0.07738462090492249,
-0.07014648616313934,
-0.27880460023880005,
-0.05842458829283714,
-0.08617699891328812,
0.014564735814929008,
0.21983933448791504,
0.004960003308951855,
-0.06247866153717041,
-0.10103043168783188,
0.04130084067583084,
-0.08667775243520737,
0.035578809678554535,
0.05231928452849388,
-0.007209701929241419,
-0.00022722753055859357,
0.051747117191553116,
-0.11733461916446686,
-0.10921365767717361,
0.00005139458880876191,
-0.034951101988554,
0.025440815836191177,
0.026725273579359055,
0.051439572125673294,
-0.034708328545093536,
0.08136893808841705,
-0.17806151509284973,
-0.06298517435789108,
-0.03176948428153992,
-0.08863386511802673,
-0.05078822001814842,
0.07011397927999496,
0.025565236806869507,
-0.057334017008543015,
0.05079491436481476,
0.18764041364192963,
0.06171504780650139,
0.06612955033779144,
-0.05515914410352707,
0.05848721042275429,
0.02273266203701496,
0.11171504110097885,
-0.05804299935698509,
-0.0904957726597786,
0.09209504723548889,
-0.07663236558437347,
0.07870107889175415,
-0.047748371958732605,
-0.11191868036985397,
-0.1817299723625183,
0.02874704636633396,
0.11016726493835449,
0.07692268490791321,
0.049090877175331116,
-0.02699464187026024,
-0.008634794503450394,
-0.01055755652487278,
-0.08976344019174576,
-0.011823488399386406,
-0.050386108458042145,
-0.12306055426597595,
0.011005342938005924,
0.029981130734086037,
0.0229110699146986,
0.017633503302931786,
-0.06778652966022491,
-0.0037243254482746124,
0.07365991175174713,
-0.02483518421649933,
-0.06244400888681412,
0.07424797117710114,
-0.0316074900329113,
0.014672514982521534,
-0.1439942717552185,
-0.11254341900348663,
0.0400521494448185,
0.013359695672988892,
-0.10434381663799286,
-0.03762458637356758,
-0.04238864779472351,
-0.025827841833233833,
0.011902133002877235,
-0.061002466827631,
0.04468408226966858,
-0.04896235838532448,
-0.037081919610500336,
0.05502127856016159,
0.1398589313030243,
-0.029471617192029953,
-0.024029476568102837,
-0.09290595352649689,
-0.0005455599748529494,
-0.13306409120559692,
0.10588106513023376,
-0.09062410145998001,
0.05290452018380165,
-0.03354751318693161,
-0.004436313174664974,
0.07589908689260483,
-0.007153181359171867,
-0.040755923837423325,
0.21340687572956085,
-0.11974073946475983,
0.015496938489377499,
0.23350197076797485,
-0.05272991210222244,
-0.032020919024944305,
0.14609527587890625,
0.013960364274680614,
0.047557584941387177,
0.06826511025428772,
0.1687876135110855,
-0.02353181131184101,
-0.04513172805309296,
-0.07665139436721802,
0.03436786308884621,
-0.09455250203609467,
-0.02925395593047142,
0.18578477203845978,
0.08995036035776138,
-0.08131510019302368,
0.05290142074227333,
0.05286940559744835,
0.09434439241886139,
-0.05823974311351776,
-0.09527602791786194,
-0.020241238176822662,
-0.0422634482383728,
0.053377363830804825,
0.00794585794210434,
0.1435573250055313,
-0.07660044729709625,
-0.11621976643800735,
-0.17511792480945587,
0.050170209258794785,
0.003284599632024765,
0.04679343104362488,
-0.10714999586343765,
0.09110944718122482,
-0.0827234759926796,
0.005729731637984514,
-0.07212521135807037,
-0.029542678967118263,
0.04625999927520752,
0.030458956956863403,
0.051199473440647125,
0.0014372644945979118,
0.028812745586037636,
0.027432596310973167,
-0.12000823765993118,
0.026928339153528214,
0.09023498743772507,
-0.03973308950662613,
-0.037781234830617905,
-0.1433357298374176,
0.042194049805402756,
-0.06371945142745972,
0.2017146646976471,
-0.12899981439113617,
-0.027469422668218613,
-0.059978898614645004,
0.08023948222398758,
-0.05429686978459358,
-0.01837129518389702,
0.11305229365825653,
-0.002399926306679845,
-0.09818895161151886,
-0.0648813396692276,
0.014857284724712372,
-0.005102231167256832,
-0.2270067036151886,
0.13787469267845154,
-0.03343867510557175,
-0.13937629759311676,
0.1168006956577301,
0.051542531698942184,
-0.09300347417593002,
0.07495471835136414,
-0.018697116523981094,
-0.005144701339304447,
0.009561802260577679,
0.02997404709458351,
0.0609796941280365,
-0.06543522328138351,
0.10220248252153397,
-0.04050128906965256,
-0.0023425277322530746,
0.00007358023867709562,
-0.11404725164175034,
-0.03188745677471161,
0.10433990508317947,
-0.012275904417037964,
-0.06359943747520447,
0.126250758767128,
-0.13097187876701355,
0.002952876267954707,
0.2132359892129898,
0.05858053267002106,
-0.03942292183637619,
-0.010611786507070065,
0.048433247953653336,
0.04503469914197922,
0.10719212144613266,
-0.0021888045594096184,
0.012952827848494053,
0.07772296667098999,
-0.037390902638435364,
0.009373805485665798,
-0.046488478779792786,
-0.06402166932821274,
-0.03178941085934639,
-0.07305099070072174,
0.019379552453756332,
0.09986508637666702,
-0.05024666339159012,
0.12208253145217896,
0.029356947168707848,
0.027070466428995132,
0.04905763640999794,
0.003887607716023922,
-0.08853646367788315,
0.12345804274082184,
-0.08855843544006348,
-0.24364086985588074,
-0.02132345922291279,
-0.0024259721394628286,
-0.05758627504110336,
0.04943929985165596,
0.07999016344547272,
-0.1250237375497818,
-0.03975459560751915,
-0.08203288167715073,
0.019317327067255974,
-0.13758112490177155,
0.009214666672050953,
-0.027565890923142433,
-0.004048509988933802,
-0.034674182534217834,
-0.01644647680222988,
-0.03151066228747368,
0.03454175591468811,
-0.11659485101699829,
0.2015065848827362,
-0.08654376119375229,
0.07674765586853027,
0.05127190053462982,
-0.031568560749292374,
-0.019546013325452805,
-0.05104409158229828,
0.07021784782409668,
-0.07096409052610397,
0.04487641528248787,
0.0612211599946022,
0.08292393386363983,
0.030391892418265343,
0.08565410226583481,
-0.025903349742293358,
-0.019908171147108078,
0.04576855152845383,
-0.01803315058350563,
-0.1330593228340149,
-0.27316176891326904,
-0.07392087578773499,
-0.04138990119099617,
0.09889888763427734,
-0.027286287397146225,
0.06450001895427704,
0.09956599771976471,
0.14159363508224487,
-0.07094530761241913,
0.06329674273729324,
-0.01661067083477974,
0.09625090658664703,
0.047975633293390274,
0.06040452420711517,
0.05703233182430267,
-0.12075366824865341,
0.03602716699242592,
0.12804226577281952,
-0.041793953627347946,
0.17193257808685303,
0.019257953390479088,
0.027328046038746834,
0.0023450341541320086,
0.11408387869596481,
0.07816487550735474,
-0.031120002269744873,
0.0863066166639328,
0.016793396323919296,
-0.02853989414870739,
-0.056608896702528,
0.05101042240858078,
0.09750194847583771,
-0.023340895771980286,
-0.0653129294514656,
-0.037000834941864014,
0.11876068264245987,
0.04943326860666275,
0.0754493772983551,
0.008406713604927063,
-0.15548531711101532,
-0.08415612578392029,
0.07064704596996307,
0.02616388536989689,
-0.037602439522743225,
0.01708761788904667,
0.10968993604183197,
-0.19036732614040375,
0.1426304578781128,
-0.030939532443881035,
0.060770243406295776,
0.050968363881111145,
-0.015927476808428764,
-0.08538154512643814,
-0.018767166882753372,
0.014943505637347698,
0.1345089226961136,
-0.3022455871105194,
0.08716857433319092,
0.024319810792803764,
0.03786522150039673,
-0.07425777614116669,
-0.000938467972446233,
0.04229113087058067,
-0.025772804394364357,
0.20180512964725494,
0.0027752199675887823,
-0.017836373299360275,
-0.10520005971193314,
-0.16062915325164795,
0.040059227496385574,
-0.016699882224202156,
-0.008670350536704063,
0.07313466817140579,
0.028175028041005135,
-0.0004943960811942816,
-0.03879651427268982,
-0.08189120888710022,
-0.19153282046318054,
-0.10617893189191818,
0.03954111784696579,
0.0742327868938446,
0.03921262547373772,
0.0055174147710204124,
-0.05899032950401306,
-0.031336478888988495,
0.11533958464860916,
-0.0959932953119278,
-0.15964250266551971,
-0.07372575998306274,
0.07583212107419968,
0.11284292489290237,
-0.08278405666351318,
0.04400715231895447,
-0.03885066509246826,
0.0395960658788681,
-0.01823042519390583,
-0.12507715821266174,
0.04592249169945717,
-0.08268281817436218,
-0.11199376732110977,
-0.008646796457469463,
0.0446808747947216,
0.03485661372542381,
0.14689670503139496,
0.012703940272331238,
0.008363097906112671,
0.007720267400145531,
-0.11484371870756149,
-0.03526071086525917,
0.06074134632945061,
0.031631749123334885,
0.03211704641580582,
-0.037829842418432236,
-0.04461588338017464,
-0.04739531874656677,
-0.18769346177577972,
0.21945595741271973,
0.27422383427619934,
-0.18464161455631256,
0.19490039348602295,
0.009712283499538898,
-0.07433387637138367,
-0.21419145166873932,
-0.01952654868364334,
0.023113731294870377,
0.04736250638961792,
0.08055976778268814,
-0.20116305351257324,
-0.018050625920295715,
0.02412903867661953,
-0.049651410430669785,
0.10950819402933121,
-0.22919288277626038,
-0.09357388317584991,
-0.004315676633268595,
-0.0456247515976429,
0.20167072117328644,
-0.17349179089069366,
-0.03898826241493225,
-0.002320575062185526,
-0.05797051265835762,
0.054706089198589325,
0.0082687484100461,
0.11852395534515381,
-0.0363474003970623,
0.04519614204764366,
0.02605312690138817,
-0.03876136615872383,
0.19581282138824463,
-0.03342440724372864,
-0.011271907016634941,
-0.08068891614675522,
0.028224926441907883,
0.11148685961961746,
-0.050781749188899994,
0.1058276891708374,
-0.054434746503829956,
-0.018785975873470306,
-0.21650883555412292,
-0.07362372428178787,
-0.037781376391649246,
0.08991304785013199,
0.025700144469738007,
-0.08583787083625793,
-0.13566099107265472,
0.06490069627761841,
0.10132265836000443,
-0.0247165746986866,
0.08831000328063965,
-0.04387554153800011,
0.030154842883348465,
-0.02991492487490177,
0.11581085622310638,
0.07742578536272049,
0.0929340273141861,
0.017225587740540504,
-0.03704168647527695,
0.047640278935432434,
-0.16656848788261414,
-0.04522807151079178,
0.12353790551424026,
-0.05192960053682327,
0.08831610530614853,
-0.012886806391179562,
-0.0748085007071495,
0.13962560892105103,
0.11731280386447906,
-0.1287236362695694,
-0.06684009730815887,
-0.022852951660752296,
-0.04369134083390236,
0.10900548845529556,
-0.11489281803369522,
0.0083320252597332,
-0.14493286609649658,
-0.033019065856933594,
0.0028363680467009544,
0.01592719368636608,
0.008870445191860199,
0.006211987230926752,
0.01573353447020054,
0.010456991381943226,
-0.06850937008857727,
0.1384900063276291,
0.061426661908626556,
-0.020047610625624657,
0.1058349460363388,
0.08437787741422653,
-0.08105207234621048,
-0.031884483993053436,
-0.0053067379631102085,
0.1320071518421173,
-0.20087100565433502,
-0.07036317139863968,
-0.03186485916376114,
0.0006601783679798245,
-0.0679159089922905,
-0.029856381937861443,
0.012571857310831547,
-0.021496061235666275,
0.07837656140327454,
-0.009273289702832699,
-0.06803218275308609,
0.06237945705652237,
0.1189974993467331,
-0.024567021057009697,
-0.13816024363040924,
0.036331016570329666,
0.01813933439552784,
0.097721166908741,
-0.08405690640211105,
-0.02392023615539074,
-0.022274356335401535,
-0.00575529458001256,
-0.2077944278717041,
0.06344069540500641,
-0.12469293922185898,
-0.0579557791352272,
-0.05214244872331619,
-0.08444758504629135,
-0.024516625329852104,
0.01365961879491806,
-0.0652383342385292,
-0.005629291292279959,
-0.08735206723213196,
0.06926313787698746,
-0.021079150959849358,
-0.0775182917714119,
0.0059661464765667915,
-0.008658536709845066,
0.07982566952705383,
0.15117089450359344,
0.001517986529506743,
0.025556309148669243,
-0.046119920909404755,
0.026421578601002693,
0.024140160530805588,
0.015439740382134914,
0.011787584982812405,
0.06868346780538559,
-0.014152349904179573,
-0.012271754443645477,
0.05786788463592529,
0.032761845737695694,
0.04126216471195221,
-0.13522066175937653,
-0.13750645518302917,
-0.09092945605516434,
0.05301151052117348,
-0.06554101407527924,
0.11003322154283524,
0.07784780859947205,
0.09162790328264236,
0.024110790342092514,
-0.13443459570407867,
0.08214186877012253,
-0.04296054318547249,
-0.0034575238823890686,
-0.08403244614601135,
0.0196502935141325,
-0.07605765759944916,
0.03150075301527977,
0.07602880150079727,
-0.004535502754151821,
0.20214974880218506,
0.05961533263325691,
0.03405151143670082,
0.005646523553878069,
-0.10930139571428299,
0.1457921713590622,
-0.06633255630731583,
0.12092360109090805,
0.09880577027797699,
0.0012220287462696433,
0.0414012148976326,
0.04272003844380379,
0.09053238481283188,
-0.06187465786933899,
0.07326202094554901,
0.0025666262954473495,
0.00523286871612072,
0.11977802962064743,
-0.10917717218399048,
-0.07110712677240372,
-0.08272712677717209,
-0.014158759266138077,
-0.09075794368982315,
0.048440929502248764,
-0.06301219016313553,
0.06307999789714813,
0.16387957334518433,
-0.04186704382300377,
0.05290854722261429,
-0.00008832863386487588,
-0.04629659280180931,
-0.11251086741685867,
-0.17260342836380005,
-0.09437765926122665,
-0.052612558007240295,
0.02018812857568264,
-0.08965429663658142,
0.06926600635051727,
0.08313976973295212,
0.11536789685487747,
-0.04404929280281067,
0.13183407485485077,
0.006708297412842512,
-0.07556850463151932,
0.10283856838941574,
-0.005526770371943712,
0.017745135352015495,
0.019812066107988358,
-0.013799712061882019,
0.02501090243458748,
0.07107566297054291,
-0.023508939892053604,
0.07840979844331741,
0.023805230855941772,
0.07375340163707733,
-0.039258766919374466,
-0.09884507954120636,
-0.06503848731517792,
0.09963776916265488,
0.13149575889110565,
0.08523955941200256,
0.0010203556157648563,
-0.02853567712008953,
-0.019089261069893837,
0.24835029244422913,
-0.016106927767395973,
0.01633988879621029,
-0.07396794110536575,
0.21654167771339417,
-0.058286234736442566,
-0.043420251458883286,
-0.01595202460885048,
-0.07800088077783585,
-0.07273219525814056,
0.1827254593372345,
0.1544589102268219,
0.035395942628383636,
-0.04916800558567047,
-0.1161172166466713,
-0.0005627869977615774,
0.024550169706344604,
0.08817576617002487,
-0.017480427399277687,
0.15278777480125427,
-0.10776133835315704,
0.0458926260471344,
-0.07721451669931412,
0.032520588487386703,
-0.16416122019290924,
0.06022395193576813,
-0.03649993613362312,
0.008673601783812046,
-0.04384535551071167,
0.059995539486408234,
-0.1352885216474533,
-0.028988340869545937,
-0.025316722691059113,
-0.03147078678011894,
-0.10419565439224243,
0.027877796441316605,
-0.025688333436846733,
0.08789455145597458,
0.022719165310263634,
-0.00229791016317904,
0.05722206458449364,
-0.07692642509937286,
0.03286777436733246,
-0.09279803931713104,
-0.0070610749535262585,
0.07081214338541031,
0.072835274040699,
0.23599547147750854,
-0.013467865064740181,
0.1078963503241539,
0.0603800006210804,
-0.023297330364584923,
-0.16929134726524353,
0.03906399756669998,
0.05470183119177818,
0.03467639908194542,
0.07125207036733627,
0.13918015360832214,
-0.02509223483502865,
0.01921582594513893,
0.04285524785518646,
0.1660626083612442,
-0.0048163277097046375,
-0.06489790976047516,
0.05909945070743561,
-0.07746788114309311,
0.055003467947244644,
-0.0913105309009552,
0.10126931965351105,
0.10454423725605011,
-0.10365421324968338,
-0.04306498169898987,
-0.07468205690383911,
-0.007590780965983868,
0.05297723785042763,
-0.018345944583415985,
-0.023219473659992218,
-0.14153702557086945,
0.003769826842471957,
0.036558426916599274,
0.07886434346437454,
-0.17404921352863312,
0.016010694205760956,
-0.14020755887031555,
0.04680055379867554,
-0.032755978405475616,
0.04159649461507797,
0.0225365050137043,
0.00814233347773552,
-0.026383763179183006,
-0.11872659623622894,
0.004481000825762749,
0.06886687129735947,
-0.1300238072872162,
-0.07172119617462158
] |
null | null |
transformers
|
# IndicNLP Marathi News Classifier
This model was fine-tuned using [Marathi RoBERTa](https://huggingface.co/flax-community/roberta-base-mr) on [IndicNLP Marathi News Dataset](https://github.com/AI4Bharat/indicnlp_corpus#indicnlp-news-article-classification-dataset)
## Dataset
IndicNLP Marathi news dataset consists 3 classes - `['lifestyle', 'entertainment', 'sports']` - with following docs distribution as per classes:
| train | eval | test |
| ----- | ---- | ---- |
| 9672 | 477 | 478 |
💯 Our **`mr-indicnlp-classifier`** model fine tuned from **roberta-base-mr** Pretrained Marathi RoBERTa model outperformed both classifier mentioned in [Arora, G. (2020). iNLTK](https://www.semanticscholar.org/paper/iNLTK%3A-Natural-Language-Toolkit-for-Indic-Languages-Arora/5039ed9e100d3a1cbbc25a02c82f6ee181609e83/figure/3) and [Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.](https://www.semanticscholar.org/paper/AI4Bharat-IndicNLP-Corpus%3A-Monolingual-Corpora-and-Kunchukuttan-Kakwani/7997d432925aff0ba05497d2893c09918298ca55/figure/4)
| Dataset | FT-W | FT-WC | INLP | iNLTK | **roberta-base-mr 🏆** |
| --------------- | ----- | ----- | ----- | ----- | --------------------- |
| iNLTK Headlines | 83.06 | 81.65 | 89.92 | 92.4 | **97.48** |
|
{}
| null |
flax-community/mr-indicnlp-classifier
|
[
"transformers",
"pytorch",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #endpoints_compatible #has_space #region-us
|
IndicNLP Marathi News Classifier
================================
This model was fine-tuned using Marathi RoBERTa on IndicNLP Marathi News Dataset
Dataset
-------
IndicNLP Marathi news dataset consists 3 classes - '['lifestyle', 'entertainment', 'sports']' - with following docs distribution as per classes:
train: 9672, eval: 477, test: 478
Our 'mr-indicnlp-classifier' model fine tuned from roberta-base-mr Pretrained Marathi RoBERTa model outperformed both classifier mentioned in Arora, G. (2020). iNLTK and Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.
|
[] |
[
"TAGS\n#transformers #pytorch #endpoints_compatible #has_space #region-us \n"
] |
[
25
] |
[
"passage: TAGS\n#transformers #pytorch #endpoints_compatible #has_space #region-us \n"
] |
[
-0.0029584316071122885,
-0.0028518293984234333,
-0.007252022158354521,
-0.03406646475195885,
0.1052907407283783,
0.021658478304743767,
0.014934005215764046,
0.0963568165898323,
0.07173345983028412,
0.04052921384572983,
0.12469944357872009,
0.1452905386686325,
-0.06406694650650024,
0.033166203647851944,
-0.03727412968873978,
-0.2475484162569046,
0.10555066913366318,
0.07448861002922058,
-0.10543368011713028,
0.08075199276208878,
0.04427364841103554,
-0.1204332634806633,
0.041203953325748444,
-0.026475591585040092,
-0.12351679056882858,
0.05606850981712341,
-0.00828385166823864,
-0.07634095847606659,
0.13860751688480377,
-0.005084997974336147,
0.21411652863025665,
0.026020441204309464,
-0.12003946304321289,
-0.1273520141839981,
0.023015161976218224,
0.03268890455365181,
-0.06730716675519943,
0.04222181439399719,
0.019230222329497337,
-0.07090400159358978,
0.05956339091062546,
-0.029162751510739326,
0.0002618617145344615,
-0.010863530449569225,
-0.1532917022705078,
-0.21316169202327728,
-0.053317490965127945,
-0.0006507059442810714,
-0.04469408839941025,
0.0898720771074295,
0.009525636211037636,
0.1834687739610672,
-0.1824994534254074,
0.0582394041121006,
0.2250988632440567,
-0.3112156391143799,
0.0214120764285326,
0.20106136798858643,
0.11586902290582657,
0.04537715017795563,
-0.014055252075195312,
0.05113503709435463,
0.014285252429544926,
0.020488103851675987,
0.0548686683177948,
-0.05259808152914047,
-0.08048462122678757,
0.11871915310621262,
-0.12199478596448898,
-0.1263049989938736,
0.21242943406105042,
-0.06615009903907776,
0.09515924751758575,
-0.002029605908319354,
-0.12245463579893112,
-0.10192905366420746,
0.040691014379262924,
0.03397665172815323,
0.00589353870600462,
0.047330692410469055,
0.02876516431570053,
-0.037329331040382385,
-0.1550511121749878,
0.04680014029145241,
-0.21807289123535156,
0.23877084255218506,
0.0015600259648635983,
0.0973609909415245,
-0.19626043736934662,
0.07882057130336761,
-0.05307355150580406,
-0.08718855679035187,
0.05743660777807236,
-0.10784531384706497,
0.015269345603883266,
0.034693922847509384,
-0.13700374960899353,
0.03605262190103531,
0.05943828076124191,
0.08926264196634293,
-0.04827079921960831,
-0.014148334972560406,
0.05754008889198303,
0.1292085349559784,
0.04635833203792572,
0.10615140944719315,
-0.045253999531269073,
0.01622767187654972,
-0.01661987416446209,
-0.11848651617765427,
0.01423704531043768,
-0.05796802416443825,
-0.10585612058639526,
-0.09851326048374176,
0.034658242017030716,
0.06912907212972641,
0.05789582431316376,
0.027000805363059044,
-0.03543194383382797,
0.03960902988910675,
0.0316116139292717,
-0.05825986713171005,
0.006225037854164839,
-0.009220544248819351,
0.029396118596196175,
0.1559952050447464,
0.002293596975505352,
-0.028043491765856743,
0.0061264303512871265,
0.07038558274507523,
-0.09168743342161179,
0.016447262838482857,
-0.07024794816970825,
-0.09628059715032578,
0.05556397885084152,
-0.14340318739414215,
0.05117064341902733,
-0.16716918349266052,
0.0008589614881202579,
0.006253912579268217,
0.05092477798461914,
0.006237487308681011,
0.005776692181825638,
0.08330069482326508,
-0.06275481730699539,
0.034006062895059586,
-0.08140378445386887,
-0.041693054139614105,
-0.07016132771968842,
0.07944785058498383,
-0.035834409296512604,
0.12030940502882004,
-0.14284490048885345,
0.07326507568359375,
-0.08219737559556961,
0.03556881844997406,
-0.12682370841503143,
-0.0417596809566021,
-0.04166887700557709,
0.13813723623752594,
0.0270688459277153,
-0.0871826708316803,
-0.14643874764442444,
0.0697321742773056,
-0.021685872226953506,
0.08998014777898788,
-0.08004175126552582,
-0.05545810982584953,
0.19149646162986755,
-0.07865036278963089,
-0.13861075043678284,
0.057827502489089966,
-0.0021153215784579515,
-0.02251840941607952,
-0.029761021956801414,
0.23124980926513672,
0.02715110033750534,
-0.08258308470249176,
0.007394853513687849,
0.1452827751636505,
-0.11984115093946457,
-0.13545486330986023,
0.04490169882774353,
-0.007621515542268753,
-0.03547787666320801,
0.006857717409729958,
-0.011913275346159935,
0.07471968978643417,
-0.06710518151521683,
-0.027399025857448578,
-0.07139851897954941,
-0.021193567663431168,
0.10081516951322556,
0.05544446408748627,
0.11307995766401291,
-0.06632129102945328,
-0.02411118522286415,
0.06373704969882965,
0.013209088705480099,
0.02873639017343521,
0.06281621009111404,
0.010946054011583328,
0.1705833524465561,
-0.09186646342277527,
-0.025972597301006317,
-0.2158096432685852,
-0.12272024154663086,
-0.05106669291853905,
0.039348848164081573,
-0.04500754177570343,
0.293838769197464,
0.07607422024011612,
-0.09681298583745956,
0.023039964959025383,
-0.024400703608989716,
0.06772734224796295,
0.05504227057099342,
-0.0345907136797905,
-0.007643870078027248,
-0.02654779702425003,
-0.09041503071784973,
-0.08823638409376144,
-0.012429488822817802,
0.02995176613330841,
0.0953330397605896,
0.1333361119031906,
-0.01740361750125885,
0.05910905450582504,
-0.012513579800724983,
0.0676158219575882,
-0.048699818551540375,
0.0014270392712205648,
0.06870709359645844,
-0.020629899576306343,
-0.0359528623521328,
0.19351327419281006,
-0.15416540205478668,
0.3440045714378357,
0.22943636775016785,
-0.29679611325263977,
0.024399293586611748,
0.03521803393959999,
-0.031622983515262604,
0.05245867371559143,
0.050392717123031616,
-0.03700784593820572,
-0.012921047396957874,
-0.020065709948539734,
0.10128912329673767,
-0.013730082660913467,
-0.017052702605724335,
-0.013303711079061031,
-0.0738840401172638,
-0.07105985283851624,
0.05143086612224579,
0.015108847990632057,
-0.08141690492630005,
0.2011115998029709,
0.33665207028388977,
-0.01909264549612999,
0.1292232871055603,
-0.025169676169753075,
-0.016639653593301773,
0.014522822573781013,
-0.02685708925127983,
-0.0936400443315506,
0.06099338084459305,
-0.2452656328678131,
-0.07483787834644318,
0.09717876464128494,
0.039847638458013535,
0.10929383337497711,
-0.15490993857383728,
-0.06305070966482162,
0.060193806886672974,
0.05773612856864929,
-0.08441347628831863,
0.13903503119945526,
0.0834232047200203,
0.07655372470617294,
0.022615687921643257,
-0.0411599762737751,
0.0715603306889534,
0.0000369819208572153,
-0.008042345754802227,
0.12445181608200073,
-0.12605683505535126,
-0.26834750175476074,
-0.09955253452062607,
-0.0677977129817009,
0.020572002977132797,
0.014059635810554028,
0.10280217975378036,
-0.07401889562606812,
-0.017238711938261986,
0.027746427804231644,
0.04317348450422287,
-0.14755412936210632,
0.05034588277339935,
-0.06313338130712509,
0.04624776169657707,
-0.08398661017417908,
-0.09383494406938553,
-0.07730001956224442,
-0.05146007612347603,
-0.04880542680621147,
0.1173279732465744,
-0.03573174029588699,
0.08980541676282883,
0.1437361091375351,
0.010966925881803036,
0.05257997661828995,
-0.004714722279459238,
0.1783446967601776,
-0.07063914835453033,
-0.04879947379231453,
0.21630069613456726,
0.046224698424339294,
0.08115650713443756,
0.13914045691490173,
0.03506186977028847,
-0.02357759140431881,
-0.019554652273654938,
-0.05383121594786644,
-0.13829293847084045,
-0.12250277400016785,
-0.1356431096792221,
-0.15839290618896484,
-0.03690522164106369,
-0.012927987612783909,
0.05791258066892624,
0.07531830668449402,
0.04856061935424805,
0.06441478431224823,
-0.09159663319587708,
-0.11200089007616043,
0.042829882353544235,
0.214774951338768,
-0.059302058070898056,
0.11861855536699295,
-0.07632823288440704,
-0.09176987409591675,
0.06788851320743561,
0.09761005640029907,
0.16839726269245148,
0.05044052377343178,
-0.02884308621287346,
0.0776241272687912,
0.2171095758676529,
0.15067914128303528,
0.1215222105383873,
0.019868675619363785,
-0.030019065365195274,
-0.018077019602060318,
-0.0024190463591367006,
-0.03949350863695145,
0.06176091358065605,
0.17411820590496063,
-0.15967711806297302,
-0.05377067252993584,
-0.263266384601593,
0.09527478367090225,
0.03875482454895973,
0.08375546336174011,
-0.15762236714363098,
-0.009570064954459667,
0.08122731745243073,
0.007135382853448391,
-0.026391206309199333,
0.07088838517665863,
0.09545140713453293,
-0.08288782089948654,
0.0034869282972067595,
0.0008346911054104567,
0.08179408311843872,
0.03769611939787865,
0.10302665829658508,
-0.071475088596344,
-0.16985052824020386,
0.053303901106119156,
0.06021403521299362,
-0.22107045352458954,
0.2658495008945465,
-0.0479622557759285,
-0.1351747363805771,
-0.04838748648762703,
-0.024021867662668228,
0.025235475972294807,
0.1528087705373764,
0.05877424776554108,
0.050266116857528687,
-0.12825050950050354,
-0.15095849335193634,
0.0781656950712204,
-0.00975842960178852,
0.10798349976539612,
-0.02290469966828823,
-0.004709030035883188,
-0.006992180831730366,
0.006713423877954483,
-0.0013189660385251045,
0.173819899559021,
0.08577103167772293,
-0.13919666409492493,
0.06207796931266785,
0.03919563442468643,
-0.01022022683173418,
-0.009070939384400845,
-0.02863902412354946,
-0.15827633440494537,
0.10187989473342896,
0.022372517734766006,
-0.0354381687939167,
-0.08814281970262527,
-0.1438899040222168,
0.16787531971931458,
-0.057758182287216187,
0.09039564430713654,
-0.07333594560623169,
-0.04240069165825844,
-0.08121757209300995,
-0.17010639607906342,
0.11798848956823349,
-0.09606517851352692,
0.028408290818333626,
-0.016388848423957825,
0.13144470751285553,
-0.1333639770746231,
0.02814481593668461,
-0.00777181051671505,
0.07762489467859268,
-0.18995167315006256,
-0.10411135852336884,
0.00573811586946249,
0.00008648325456306338,
0.0942523330450058,
0.08534939587116241,
0.009555318392813206,
0.0355757400393486,
0.09371782839298248,
0.04748714715242386,
0.23989276587963104,
0.13987286388874054,
-0.1131698340177536,
0.11439703404903412,
0.07038960605859756,
0.0011871441965922713,
-0.30259600281715393,
-0.06844288855791092,
-0.19397014379501343,
-0.011684142984449863,
0.03666522726416588,
-0.06436534225940704,
0.07888991385698318,
0.02020973153412342,
-0.05610501766204834,
0.06956689804792404,
-0.26553457975387573,
-0.0443376787006855,
0.12831047177314758,
-0.04875564947724342,
0.4590139091014862,
-0.12482378631830215,
-0.03985273465514183,
0.03664296492934227,
-0.24686279892921448,
0.12072349339723587,
-0.06671737134456635,
0.08167943358421326,
-0.030839961022138596,
0.08536513894796371,
0.050127062946558,
-0.10707498341798782,
0.16413873434066772,
-0.019066456705331802,
0.03144305571913719,
-0.09537012130022049,
-0.13222147524356842,
0.09439468383789062,
-0.06695137172937393,
0.008747032843530178,
0.04298138990998268,
0.019663916900753975,
-0.17302730679512024,
0.02094663679599762,
-0.15740904211997986,
0.07635539770126343,
0.033326953649520874,
-0.033140767365694046,
-0.07079353928565979,
-0.007014946546405554,
0.01739027537405491,
0.015562377870082855,
0.2808522880077362,
-0.026170071214437485,
0.18167032301425934,
0.10018351674079895,
-0.029927000403404236,
-0.17116793990135193,
-0.12743932008743286,
0.01872539147734642,
-0.037098098546266556,
0.10682620853185654,
-0.12740355730056763,
0.028982240706682205,
0.11848041415214539,
-0.03241840377449989,
-0.010813193395733833,
0.13869476318359375,
0.008099484257400036,
-0.005264680366963148,
0.1542629599571228,
-0.20623049139976501,
-0.0782713070511818,
-0.010582976043224335,
-0.04733187332749367,
0.11958517134189606,
0.07384870946407318,
0.10695125162601471,
0.03142425790429115,
-0.005194461438804865,
-0.02525060996413231,
-0.03568004071712494,
-0.08997119963169098,
-0.009834994561970234,
0.0713467076420784,
0.06705339252948761,
-0.09844633936882019,
0.0298356581479311,
0.032173629850149155,
-0.2338513284921646,
-0.03886993229389191,
0.08322587609291077,
-0.08861273527145386,
-0.1540960818529129,
-0.10570456832647324,
-0.03858676552772522,
-0.14221861958503723,
-0.02349267154932022,
0.0013299965066835284,
-0.09814167022705078,
0.0255677942186594,
0.18325190246105194,
0.12519659101963043,
0.09930786490440369,
-0.0037947543896734715,
-0.011561078950762749,
0.03348982706665993,
-0.0858912542462349,
-0.0279342383146286,
0.03432976081967354,
-0.10697319358587265,
0.06508734077215195,
-0.023995252326130867,
0.17531009018421173,
-0.09221936017274857,
-0.052032485604286194,
-0.1637325882911682,
0.02931434102356434,
-0.09241756796836853,
-0.13270160555839539,
-0.1285286545753479,
-0.0945083498954773,
0.028233474120497704,
-0.09016505628824234,
-0.03826546669006348,
-0.022096650674939156,
-0.1581091582775116,
0.035837333649396896,
-0.014746719971299171,
0.03187955543398857,
-0.07257688790559769,
-0.024690542370080948,
0.11456956714391708,
-0.04222938045859337,
0.08038060367107391,
0.13202457129955292,
-0.07766500115394592,
0.0837559700012207,
-0.015268963761627674,
-0.1633693426847458,
0.09604327380657196,
0.0034219035878777504,
0.09792175143957138,
0.05680977180600166,
-0.0038526845164597034,
0.04141596332192421,
0.05602210387587547,
0.03683166578412056,
-0.02315102331340313,
-0.09385822713375092,
0.007288885302841663,
0.0013578106882050633,
-0.1266511082649231,
-0.01009137462824583,
-0.0980178564786911,
0.17206330597400665,
0.050530705600976944,
0.06572292745113373,
0.03754976764321327,
0.08491905778646469,
-0.04338804632425308,
-0.003721855813637376,
-0.037333957850933075,
-0.18188923597335815,
0.06568785011768341,
-0.026972612366080284,
0.026026351377367973,
-0.0077736517414450645,
0.3243110775947571,
0.03192438185214996,
-0.013924413360655308,
0.02449909597635269,
0.07321294397115707,
0.024338286370038986,
0.016380755230784416,
0.20200885832309723,
0.09349657595157623,
-0.050810955464839935,
-0.08036429435014725,
0.07832492887973785,
-0.009372998960316181,
-0.034273505210876465,
0.11357276886701584,
0.15251611173152924,
0.11104395240545273,
0.11967875808477402,
0.008443622849881649,
-0.008139149285852909,
-0.14465583860874176,
-0.2476213276386261,
-0.0023527496960014105,
0.055015888065099716,
-0.07212203741073608,
0.023641817271709442,
0.13954094052314758,
-0.023706352338194847,
0.1081070750951767,
-0.04593716561794281,
0.015047437511384487,
-0.11807583272457123,
-0.057374197989702225,
-0.021522384136915207,
-0.14222657680511475,
-0.029094664379954338,
-0.06382428854703903,
0.0361405685544014,
0.2665198743343353,
0.019907072186470032,
0.0006611203425563872,
0.1264093667268753,
0.06900911778211594,
-0.04990198835730553,
0.012961765751242638,
0.011342190206050873,
0.05761035904288292,
-0.004287507850676775,
-0.006802299991250038,
-0.12445757538080215,
-0.0631248950958252,
-0.05142594501376152,
0.040838707238435745,
-0.09186207503080368,
-0.02507411129772663,
-0.13375972211360931,
-0.09422532469034195,
-0.07305833697319031,
0.0640762448310852,
-0.05481697618961334,
0.1414281278848648,
-0.01189141720533371,
0.015135182067751884,
-0.018813909962773323,
0.2575255334377289,
-0.12512043118476868,
-0.03402259945869446,
0.007378740236163139,
0.18227612972259521,
0.053908850997686386,
0.0869564414024353,
-0.039827726781368256,
0.0003803732106462121,
-0.14558778703212738,
0.24115276336669922,
0.31660687923431396,
-0.041170332580804825,
0.08613770455121994,
0.06550340354442596,
0.027855200693011284,
0.08189579844474792,
0.08518721908330917,
0.1331145316362381,
0.2911747992038727,
-0.10433531552553177,
-0.01843203790485859,
-0.04230036586523056,
0.006273298524320126,
-0.11395373940467834,
0.02973887510597706,
0.030536845326423645,
-0.09382905066013336,
-0.07590433955192566,
0.05829576030373573,
-0.1660558134317398,
0.10031066089868546,
0.0980352833867073,
-0.27192649245262146,
-0.050602026283741,
-0.00019584159599617124,
0.23275458812713623,
-0.05424086004495621,
0.13613925874233246,
-0.05725759640336037,
-0.12572351098060608,
0.03775539621710777,
0.017514344304800034,
-0.2143305391073227,
-0.049155883491039276,
0.12498515099287033,
0.02427395060658455,
-0.01412656344473362,
-0.039183471351861954,
-0.0026477081701159477,
0.08842497318983078,
0.09172382205724716,
-0.04571063071489334,
0.051126882433891296,
0.04861393943428993,
-0.12318507581949234,
-0.1175830140709877,
-0.010080081410706043,
-0.0012158278841525316,
-0.1197454035282135,
0.0611107274889946,
-0.24905891716480255,
0.04398523271083832,
0.01417508814483881,
0.03101803921163082,
0.00012862772564403713,
-0.07904408127069473,
-0.0334860123693943,
0.052517302334308624,
0.04906855523586273,
-0.01538486685603857,
-0.02977125532925129,
-0.013968387618660927,
-0.04559881240129471,
0.03220920264720917,
-0.06289813667535782,
-0.18041765689849854,
-0.013030407950282097,
-0.04920995607972145,
0.07377845048904419,
-0.02891109511256218,
-0.0333990678191185,
-0.0465581975877285,
0.01917346566915512,
0.06295596063137054,
-0.05631313472986221,
0.026507431641221046,
0.051073670387268066,
0.023664750158786774,
-0.0002604288747534156,
-0.021915065124630928,
0.02274945192039013,
0.07124967873096466,
-0.1368294656276703,
-0.07796721905469894
] |
null | null |
transformers
|
# Nordic GPT2--wikipedia
A Nordic GPT2 style model trained using Flax CLM pipeline on the Nordic parts
part of the wiki40b dataset.
https://huggingface.co/datasets/wiki40b
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/
## Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki
# Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki
## Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki
## Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki
## Roberta models
## Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki
## Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar
## Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi
## Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish
## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
```python
from datasets import load_dataset
def load_and_clean_wiki():
dataset = load_dataset('wiki40b', 'da', beam_runner='DirectRunner', split="train")
#dataset = load_dataset('wiki40b', 'sv', beam_runner='DirectRunner')
dataset = dataset.remove_columns(['wikidata_id', 'version_id'])
filtered_dataset = dataset.map(filter_wikipedia)
# filtered_dataset[:3]
# print(filtered_dataset[:3])
return filtered_dataset
def filter_wikipedia(batch):
batch["text"] = " ".join(batch["text"].split("\
_START_SECTION_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_PARAGRAPH_\
"))
batch["text"] = " ".join(batch["text"].split("_NEWLINE_"))
batch["text"] = " ".join(batch["text"].split("\xa0"))
return batch
```
## Training script
The following training script was used to train the model.
```bash
./run_clm_flax.py --output_dir="${MODEL_DIR}" --model_type="gpt2" --config_name="${MODEL_DIR}" --tokenizer_name="${MODEL_DIR}" --dataset_name="wiki40b" --dataset_config_name="da" --do_train --do_eval --block_size="512" --per_device_train_batch_size="64" --per_device_eval_batch_size="64" --learning_rate="5e-3" --warmup_steps="1000" --adam_beta1="0.9" --adam_beta2="0.98" --weight_decay="0.01" --overwrite_output_dir --num_train_epochs="20" --logging_steps="500" --save_steps="1000" --eval_steps="2500" --push_to_hub
```
|
{"language": "sv", "widget": [{"text": "Det var en g\u00e5ng"}]}
|
text-generation
|
flax-community/nordic-gpt-wiki
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"sv",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sv"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Nordic GPT2--wikipedia
A Nordic GPT2 style model trained using Flax CLM pipeline on the Nordic parts
part of the wiki40b dataset.
URL
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
## Training script
The following training script was used to train the model.
|
[
"# Nordic GPT2--wikipedia\nA Nordic GPT2 style model trained using Flax CLM pipeline on the Nordic parts\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Nordic GPT2--wikipedia\nA Nordic GPT2 style model trained using Flax CLM pipeline on the Nordic parts\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
65,
37,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6,
40,
14
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Nordic GPT2--wikipedia\nA Nordic GPT2 style model trained using Flax CLM pipeline on the Nordic parts\npart of the wiki40b dataset.\n\nURL## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.## Gpt models## Swedish Gpt\nURL## Swedish gpt wiki\nURL# Nordic gpt wiki\nURL## Dansk gpt wiki\nURL## Norsk gpt wiki\nURL## Roberta models## Nordic Roberta Wiki\nURL## Swe Roberta Wiki Oscar\nURL## Roberta Swedish Scandi\nURL## Roberta Swedish\nURL## Swedish T5 model\nURL## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.## Training script\nThe following training script was used to train the model."
] |
[
-0.09919551759958267,
0.20627740025520325,
-0.0009960309835150838,
0.07209078222513199,
0.068837009370327,
0.011232024058699608,
0.05416038632392883,
0.13049906492233276,
-0.003550055669620633,
0.0774131640791893,
0.10683213174343109,
0.015365887433290482,
0.10160314291715622,
0.18331892788410187,
0.05888709798455238,
-0.2686788737773895,
0.08571663498878479,
-0.04889056831598282,
-0.03471433371305466,
0.10301763564348221,
0.081581249833107,
-0.07195106893777847,
0.05617353692650795,
-0.056232623755931854,
-0.03368614241480827,
-0.026054633781313896,
-0.020319394767284393,
-0.06371387094259262,
0.12151237577199936,
0.0600745752453804,
0.05862293764948845,
0.0845576673746109,
0.10349151492118835,
-0.11394116282463074,
0.02405657060444355,
0.046712517738342285,
-0.0077871717512607574,
0.04280633479356766,
0.05520022287964821,
0.002753971144556999,
0.19122417271137238,
-0.017327647656202316,
0.06537193059921265,
0.01710839569568634,
-0.08104979991912842,
-0.24984359741210938,
-0.0915946215391159,
0.0509476400911808,
0.06513511389493942,
0.11980072408914566,
-0.05210735648870468,
0.09311874210834503,
-0.1203470453619957,
0.08881618827581406,
0.11916805058717728,
-0.21158061921596527,
-0.05321000888943672,
0.11421134322881699,
0.07566932588815689,
0.04679274559020996,
-0.06877973675727844,
0.07586289942264557,
-0.01069598738104105,
0.054423894733190536,
0.06287060678005219,
-0.011743322014808655,
-0.02712368033826351,
0.0037285075522959232,
-0.12140251696109772,
-0.012186075560748577,
0.10795146971940994,
0.004562359303236008,
-0.0477580800652504,
-0.14545366168022156,
-0.06227198988199234,
-0.05233126878738403,
0.00810841005295515,
-0.0008487870218232274,
-0.00096229586051777,
-0.01506920624524355,
-0.04299909621477127,
-0.07934314757585526,
-0.0808875635266304,
-0.08468002080917358,
0.03097379580140114,
0.10360296070575714,
0.046404749155044556,
0.008719136007130146,
-0.004519207403063774,
0.15371258556842804,
-0.04474523663520813,
-0.12963315844535828,
-0.07212546467781067,
-0.034123893827199936,
-0.07370045781135559,
-0.018654100596904755,
0.020741410553455353,
-0.15222814679145813,
0.01854982227087021,
0.2375824898481369,
0.04080817475914955,
0.02043113298714161,
0.0391077846288681,
-0.0033082652371376753,
0.027527082711458206,
0.10391458123922348,
-0.14631760120391846,
-0.08592988550662994,
0.015818778425455093,
-0.026514265686273575,
0.029892316088080406,
-0.03777395933866501,
-0.004198702517896891,
-0.02192246913909912,
0.05537532642483711,
0.10463156551122665,
0.0407550111413002,
0.03305628523230553,
-0.0360642746090889,
0.0022021315526217222,
0.09743542224168777,
-0.1511206030845642,
0.006439464166760445,
-0.0018966937204822898,
-0.026196926832199097,
0.013958354480564594,
0.07593917101621628,
-0.010901481844484806,
-0.07487288862466812,
0.14730750024318695,
-0.05211431160569191,
-0.029916010797023773,
-0.023928381502628326,
-0.11924155801534653,
0.03611278161406517,
-0.1022290289402008,
-0.004390731453895569,
-0.08053578436374664,
-0.2145250290632248,
-0.06778892129659653,
0.07128512114286423,
-0.0655420646071434,
-0.0034015525598078966,
-0.022582482546567917,
-0.03855818137526512,
0.03724433854222298,
-0.028667211532592773,
0.10836654156446457,
-0.06789150089025497,
0.06537137925624847,
-0.13690301775932312,
0.09309884905815125,
0.00857369601726532,
0.012158209457993507,
-0.14712795615196228,
0.021415330469608307,
-0.18781612813472748,
0.03439852595329285,
-0.1395566761493683,
-0.0034762320574373007,
-0.12519925832748413,
-0.06726843118667603,
-0.019748644903302193,
0.0245308056473732,
0.03660530969500542,
0.17508143186569214,
-0.18042393028736115,
-0.018271107226610184,
0.2813916504383087,
-0.16556194424629211,
0.03929845243692398,
0.1151880994439125,
0.0343441404402256,
0.08520928025245667,
0.08792119473218918,
0.14873665571212769,
0.026640422642230988,
-0.11543911695480347,
-0.025612570345401764,
-0.002642874838784337,
-0.035383373498916626,
0.05064723268151283,
0.08010853081941605,
-0.08157996088266373,
0.05663176625967026,
0.030927129089832306,
-0.09046061336994171,
0.05207085236907005,
-0.02809097431600094,
-0.05768491327762604,
0.003963993862271309,
-0.0561632364988327,
-0.028229834511876106,
0.04967200756072998,
-0.0035632781218737364,
-0.0492858923971653,
-0.15015381574630737,
-0.09297586232423782,
0.11388443410396576,
-0.10796914994716644,
0.04345541447401047,
-0.0845441147685051,
0.051346201449632645,
0.00822773203253746,
0.000899684673640877,
-0.10482313483953476,
-0.15237407386302948,
-0.036857686936855316,
-0.059496261179447174,
-0.08205445110797882,
-0.0159872155636549,
0.07168214023113251,
0.1087392047047615,
-0.04753236845135689,
-0.062235698103904724,
-0.030830608680844307,
-0.003430655226111412,
0.002471043961122632,
-0.17422446608543396,
-0.02104734443128109,
-0.06153534725308418,
0.14892171323299408,
-0.15192820131778717,
0.014394436962902546,
0.11286982893943787,
0.16315323114395142,
0.023964472115039825,
-0.08975543826818466,
0.053778354078531265,
0.0019773810636252165,
-0.01977350004017353,
-0.12902610003948212,
0.018411915749311447,
-0.02629823051393032,
-0.04208146780729294,
0.059237342327833176,
-0.02287653461098671,
-0.05459074303507805,
0.10799151659011841,
0.18014708161354065,
-0.12231805920600891,
0.14586029946804047,
-0.045450109988451004,
-0.02101857401430607,
-0.08489968627691269,
-0.001035340828821063,
0.00024687417317181826,
0.05398239940404892,
0.0778607651591301,
-0.05335375294089317,
-0.0068017239682376385,
0.011853368952870369,
-0.010206491686403751,
-0.031573180109262466,
0.13623294234275818,
0.12685023248195648,
-0.15136823058128357,
0.09584590792655945,
-0.03250531107187271,
-0.013317207805812359,
0.26631292700767517,
0.036551620811223984,
-0.08112534880638123,
0.006767875514924526,
0.011297251097857952,
0.030455898493528366,
0.17240417003631592,
0.011498921550810337,
0.03458792343735695,
0.04054014012217522,
-0.01096968911588192,
0.021661967039108276,
-0.06766559928655624,
-0.05573473498225212,
0.00007377652218565345,
-0.07901141047477722,
0.03988801687955856,
0.07978011667728424,
-0.07790019363164902,
0.06495510041713715,
-0.019485803321003914,
-0.0367501936852932,
0.010887966491281986,
0.008321833796799183,
-0.10061534494161606,
0.20750604569911957,
-0.06804672628641129,
-0.21987740695476532,
-0.13059769570827484,
0.04033014923334122,
0.021561546251177788,
-0.012826806865632534,
0.08897466957569122,
-0.08496899902820587,
-0.15317007899284363,
-0.11250627785921097,
0.09373964369297028,
0.027301384136080742,
-0.04950014501810074,
-0.11726295948028564,
-0.022703533992171288,
-0.023406965658068657,
-0.11724098771810532,
0.019101321697235107,
0.04141923785209656,
-0.04986269399523735,
0.07957132160663605,
-0.011617563664913177,
0.12298451364040375,
0.07731186598539352,
0.027554046362638474,
0.001266178791411221,
0.040332648903131485,
0.19575493037700653,
-0.121737539768219,
0.12804128229618073,
0.0848044827580452,
-0.016380203887820244,
0.027968833222985268,
0.1208915039896965,
0.013082261197268963,
-0.06051665544509888,
-0.012041442096233368,
0.041664592921733856,
-0.08381570875644684,
-0.19721926748752594,
-0.09956483542919159,
0.01716887764632702,
0.045037440955638885,
0.08634788542985916,
0.1032894030213356,
-0.06983088701963425,
0.06071731075644493,
-0.07954095304012299,
-0.17155767977237701,
0.07220183312892914,
0.060614489018917084,
-0.06525197625160217,
-0.0235812459141016,
0.08536162972450256,
-0.04920472204685211,
0.0500163771212101,
0.09724555164575577,
-0.03745298460125923,
0.1465545892715454,
-0.035998862236738205,
0.08483497053384781,
0.06123502552509308,
0.09069854766130447,
0.06441406905651093,
0.10284306854009628,
0.04790690168738365,
-0.025037704035639763,
0.02346484549343586,
-0.05356081575155258,
-0.0032966567669063807,
0.03783416748046875,
-0.052540283650159836,
-0.08321095257997513,
-0.005128840915858746,
-0.014918254688382149,
-0.006106093525886536,
0.19976024329662323,
0.07115606218576431,
-0.2301592230796814,
-0.09683216363191605,
0.03759948909282684,
-0.034416694194078445,
-0.09500917047262192,
-0.02499331720173359,
0.09889300912618637,
-0.17797312140464783,
0.05103291571140289,
-0.05611363425850868,
0.0629688948392868,
-0.03903346136212349,
-0.0493110753595829,
0.04053458943963051,
0.01940067857503891,
-0.041608426719903946,
0.09663616120815277,
-0.17098940908908844,
0.1152111142873764,
-0.005049753002822399,
0.11333347111940384,
-0.047255415469408035,
0.023356353864073753,
-0.005627372767776251,
0.1147615909576416,
0.26533788442611694,
0.02670932374894619,
-0.10421764850616455,
-0.10869783163070679,
-0.13915935158729553,
0.037409570068120956,
-0.023507431149482727,
-0.07868048548698425,
0.07037413865327835,
0.008534531109035015,
-0.013001961633563042,
-0.051864396780729294,
-0.014286858960986137,
-0.14687518775463104,
-0.10068152844905853,
0.009911692701280117,
-0.06709941476583481,
0.11306975036859512,
-0.07130233943462372,
-0.0880173072218895,
-0.12926045060157776,
0.2366858720779419,
-0.09740933775901794,
-0.13176438212394714,
-0.16594816744327545,
0.0628969669342041,
0.12357372045516968,
-0.0832614079117775,
0.04132382571697235,
-0.0017598728882148862,
0.077542744576931,
-0.07203377783298492,
-0.043811358511447906,
0.07358919084072113,
-0.08245120197534561,
-0.16693761944770813,
-0.0060533336363732815,
0.09012842923402786,
0.1279829889535904,
0.052679937332868576,
0.013017814606428146,
0.06673791259527206,
-0.002744584111496806,
-0.12488416582345963,
0.04252517223358154,
0.16507770121097565,
0.016673147678375244,
-0.018479932099580765,
-0.08971765637397766,
-0.03120519407093525,
-0.018575167283415794,
-0.08682456612586975,
0.1536826342344284,
0.2226039320230484,
-0.08682592958211899,
0.13719502091407776,
0.10919517278671265,
-0.03999048098921776,
-0.30369630455970764,
-0.056792184710502625,
-0.033536724746227264,
0.08772610127925873,
0.0343264639377594,
-0.22590896487236023,
0.0845065787434578,
0.1301465630531311,
-0.026797987520694733,
0.07126404345035553,
-0.2498885691165924,
-0.11870218813419342,
0.10431480407714844,
0.09523855894804001,
0.01972571201622486,
-0.06077510863542557,
-0.018483903259038925,
0.013618833385407925,
-0.12914372980594635,
0.06393102556467056,
-0.10581644624471664,
0.08720166236162186,
-0.0004798099980689585,
-0.01044505089521408,
0.019671175628900528,
-0.08767218887805939,
0.1496414691209793,
0.0027967821806669235,
0.01337401382625103,
-0.08609917014837265,
0.12916384637355804,
0.09470222890377045,
-0.02336643636226654,
0.17387834191322327,
-0.03411220386624336,
0.014842820353806019,
-0.0937344953417778,
-0.07837612181901932,
-0.10056120157241821,
0.12547338008880615,
-0.07451289892196655,
-0.07539497315883636,
-0.05379804968833923,
0.12438011169433594,
0.06770826131105423,
-0.017744889482855797,
0.07959920167922974,
-0.098075270652771,
0.037142153829336166,
0.04940653219819069,
0.06637386232614517,
0.01358787901699543,
-0.05849432945251465,
-0.004277365747839212,
-0.04837425425648689,
0.06421348452568054,
-0.07390889525413513,
0.03030276857316494,
0.08354299515485764,
0.03579433262348175,
0.059380073100328445,
-0.011972540058195591,
-0.1460728794336319,
-0.01652778871357441,
0.05903875455260277,
-0.23402296006679535,
-0.09416009485721588,
-0.015081069432199001,
-0.1303737610578537,
0.011922660283744335,
-0.056094732135534286,
0.131190225481987,
-0.08018681406974792,
-0.0007807700312696397,
-0.0015105133643373847,
0.056543562561273575,
-0.0013860465260222554,
0.19421418011188507,
0.04224518686532974,
0.05055293068289757,
-0.11781846731901169,
0.07740048319101334,
0.035226546227931976,
-0.10216091573238373,
0.04511047154664993,
0.15157270431518555,
-0.15929411351680756,
-0.0887959748506546,
-0.00396752217784524,
0.11085925251245499,
-0.061269599944353104,
-0.06705696135759354,
-0.10062994807958603,
-0.04815442115068436,
0.025662126019597054,
0.016226937994360924,
0.03839351609349251,
0.06314024329185486,
0.0326399989426136,
-0.05584375932812691,
-0.09026435762643814,
0.07934752106666565,
0.07095841318368912,
0.007819133810698986,
-0.0634383037686348,
0.13894878327846527,
0.00728775467723608,
-0.0010082576191052794,
-0.04657680541276932,
0.03505203127861023,
-0.04532162845134735,
-0.023696696385741234,
-0.036379385739564896,
-0.03515350818634033,
-0.07669677585363388,
-0.018294682726264,
-0.05732794106006622,
-0.036119766533374786,
0.027745511382818222,
0.020126162096858025,
-0.059174250811338425,
-0.05000404641032219,
-0.07739320397377014,
-0.023539811372756958,
-0.11225078254938126,
-0.010406767949461937,
-0.0024165406357496977,
-0.06385497748851776,
0.08671262115240097,
0.007434725761413574,
0.02726234495639801,
0.07582772523164749,
-0.02851085364818573,
-0.016966454684734344,
-0.015044527128338814,
-0.03871745616197586,
0.01798669435083866,
-0.04546285793185234,
-0.05270865187048912,
-0.017759686335921288,
-0.04771624505519867,
0.036045484244823456,
0.013782701455056667,
-0.11257176846265793,
0.05945901200175285,
0.02208825573325157,
-0.02816617488861084,
-0.03825799748301506,
0.09632839262485504,
0.06896690279245377,
0.049255236983299255,
0.13640864193439484,
-0.0880950540304184,
0.07045716792345047,
-0.13185763359069824,
0.013090535998344421,
-0.002503781346604228,
-0.004259862005710602,
-0.004924757406115532,
0.057355377823114395,
0.04717433452606201,
-0.04820477217435837,
0.07670719176530838,
0.06048581376671791,
-0.04007197916507721,
0.05766333267092705,
-0.02537498250603676,
0.02153540961444378,
0.022010307759046555,
0.15717244148254395,
0.02475849911570549,
0.0075762891210615635,
0.019409803673624992,
-0.001904830220155418,
-0.022800348699092865,
0.008768979460000992,
0.1805189847946167,
0.11606507003307343,
0.12708623707294464,
0.0766528993844986,
-0.07995866239070892,
-0.11638913303613663,
-0.10612764954566956,
0.07007408887147903,
-0.023411303758621216,
0.07417880743741989,
0.0029688673093914986,
0.05477352812886238,
0.16844148933887482,
-0.16649416089057922,
0.017769353464245796,
0.07187488675117493,
-0.08140335232019424,
-0.15646356344223022,
-0.2747826874256134,
-0.1101200208067894,
0.02777143567800522,
0.0543130487203598,
-0.08784636110067368,
0.026438280940055847,
0.07081403583288193,
0.07484683394432068,
-0.010067291557788849,
0.15063300728797913,
-0.0011428759898990393,
-0.09619353711605072,
0.03657679632306099,
0.04551425203680992,
-0.004477961920201778,
0.007827556692063808,
0.0028263581916689873,
-0.004318699706345797,
0.0671473816037178,
0.00006240502989385277,
0.023171842098236084,
0.0101597486063838,
0.04426707327365875,
0.0014182798331603408,
-0.058903004974126816,
-0.026700599119067192,
0.08382156491279602,
0.047972384840250015,
0.024693941697478294,
0.059917714446783066,
-0.01688394322991371,
-0.0015244879759848118,
0.1931397169828415,
-0.0031651658937335014,
-0.06691789627075195,
-0.10352540761232376,
0.14126241207122803,
0.0031081372871994972,
0.03507588058710098,
0.020725928246974945,
-0.11435750126838684,
0.046312715858221054,
0.1486402153968811,
0.17650821805000305,
0.02713676355779171,
0.019155321642756462,
-0.025345753878355026,
-0.010677766054868698,
0.02576136216521263,
0.03948035463690758,
0.027513355016708374,
0.18509253859519958,
-0.10426708310842514,
0.04159507527947426,
-0.07130204886198044,
-0.07908838987350464,
-0.1472349315881729,
0.06694669276475906,
0.002471862593665719,
-0.008287324570119381,
-0.1038612350821495,
0.10756508260965347,
-0.059328801929950714,
-0.13479559123516083,
0.059057898819446564,
-0.06344534456729889,
-0.15556372702121735,
-0.038276031613349915,
0.019961686804890633,
0.02046790160238743,
0.03931613266468048,
-0.004977184347808361,
0.029726272448897362,
0.10463821142911911,
0.039688590914011,
-0.09667117148637772,
-0.06691355258226395,
0.06484874337911606,
-0.00019966348190791905,
0.19900791347026825,
0.009607234038412571,
0.010437668301165104,
0.10618137568235397,
-0.04501640796661377,
-0.13969899713993073,
0.07846373319625854,
0.01746886968612671,
-0.05294637009501457,
0.02976381406188011,
0.15337887406349182,
-0.03441784158349037,
0.08704635500907898,
0.009435341693460941,
-0.08514636754989624,
0.007596890442073345,
0.03132293000817299,
-0.07608745247125626,
-0.04746831953525543,
0.10179653763771057,
-0.07513020932674408,
0.1248563677072525,
0.17665280401706696,
-0.04231448471546173,
0.002758643589913845,
-0.09984966367483139,
0.10467872768640518,
0.02218446508049965,
0.02705494686961174,
0.04777279868721962,
-0.1889996975660324,
0.0001513912866357714,
-0.12246517837047577,
0.027750037610530853,
-0.15966354310512543,
-0.022300709038972855,
-0.08261147886514664,
-0.016634048894047737,
-0.06304745376110077,
0.0834842100739479,
0.030757810920476913,
0.02805415354669094,
-0.012962710112333298,
0.026739871129393578,
0.0332990437746048,
0.07047724723815918,
-0.14354096353054047,
-0.07009955495595932
] |
null | null |
transformers
|
# Nordic Roberta Wikipedia
## Description
Nord roberta model trainined on the swedish danish and norwegian wikipedia.
## Evaluation
Evaluation on Named Entity recognition in Danish.
I finetuned each model on 3 epochs on DaNE, repeated it 5 times for each model, and calculated 95% confidence intervals for the means. Here are the results:
xlm-roberta-base : 88.01 +- 0.43
flax-community/nordic-roberta-wiki: 85.75 +- 0.69 (this model)
Maltehb/danish-bert-botxo: 85.38 +- 0.55
flax-community/roberta-base-danish: 80.14 +- 1.47
flax-community/roberta-base-scandinavian : 78.03 +- 3.02
Maltehb/-l-ctra-danish-electra-small-cased: 57.87 +- 3.19
NbAiLab/nb-bert-base : 30.24 +- 1.21
Randomly initialised RoBERTa model: 19.79 +- 2.00
Evaluation on Sentiment analysis in Dansish
Here are the results on test set, where each model has been trained 5 times, and the “+-” refers to a 95% confidence interval of the mean score:
Maltehb/danish-bert-botxo: 65.19 +- 0.53
NbAiLab/nb-bert-base : 63.80 +- 0.77
xlm-roberta-base : 63.55 +- 1.59
flax-community/nordic-roberta-wiki : 56.46 +- 1.77
flax-community/roberta-base-danish : 54.73 +- 8.96
flax-community/roberta-base-scandinavian : 44.28 +- 9.21
Maltehb/-l-ctra-danish-electra-small-cased : 47.78 +- 12.65
Randomly initialised RoBERTa model: 36.96 +- 1.02
Maltehb/roberta-base-scandinavian : 33.65 +- 8.32
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/
## Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki
# Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki
## Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki
## Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki
## Roberta models
## Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki
## Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar
## Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi
## Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish
## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish
|
{"language": "sv", "license": "cc-by-4.0", "tags": ["swedish", "roberta"], "pipeline_tag": "fill-mask", "widget": [{"text": "Meninged med livet \u00e4r <mask>."}]}
|
fill-mask
|
flax-community/nordic-roberta-wiki
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"feature-extraction",
"swedish",
"fill-mask",
"sv",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sv"
] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us
|
# Nordic Roberta Wikipedia
## Description
Nord roberta model trainined on the swedish danish and norwegian wikipedia.
## Evaluation
Evaluation on Named Entity recognition in Danish.
I finetuned each model on 3 epochs on DaNE, repeated it 5 times for each model, and calculated 95% confidence intervals for the means. Here are the results:
xlm-roberta-base : 88.01 +- 0.43
flax-community/nordic-roberta-wiki: 85.75 +- 0.69 (this model)
Maltehb/danish-bert-botxo: 85.38 +- 0.55
flax-community/roberta-base-danish: 80.14 +- 1.47
flax-community/roberta-base-scandinavian : 78.03 +- 3.02
Maltehb/-l-ctra-danish-electra-small-cased: 57.87 +- 3.19
NbAiLab/nb-bert-base : 30.24 +- 1.21
Randomly initialised RoBERTa model: 19.79 +- 2.00
Evaluation on Sentiment analysis in Dansish
Here are the results on test set, where each model has been trained 5 times, and the “+-” refers to a 95% confidence interval of the mean score:
Maltehb/danish-bert-botxo: 65.19 +- 0.53
NbAiLab/nb-bert-base : 63.80 +- 0.77
xlm-roberta-base : 63.55 +- 1.59
flax-community/nordic-roberta-wiki : 56.46 +- 1.77
flax-community/roberta-base-danish : 54.73 +- 8.96
flax-community/roberta-base-scandinavian : 44.28 +- 9.21
Maltehb/-l-ctra-danish-electra-small-cased : 47.78 +- 12.65
Randomly initialised RoBERTa model: 36.96 +- 1.02
Maltehb/roberta-base-scandinavian : 33.65 +- 8.32
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
|
[
"# Nordic Roberta Wikipedia",
"## Description\nNord roberta model trainined on the swedish danish and norwegian wikipedia.",
"## Evaluation\nEvaluation on Named Entity recognition in Danish.\n\nI finetuned each model on 3 epochs on DaNE, repeated it 5 times for each model, and calculated 95% confidence intervals for the means. Here are the results:\n\nxlm-roberta-base : 88.01 +- 0.43\n\nflax-community/nordic-roberta-wiki: 85.75 +- 0.69 (this model)\n\nMaltehb/danish-bert-botxo: 85.38 +- 0.55\n\nflax-community/roberta-base-danish: 80.14 +- 1.47\n\nflax-community/roberta-base-scandinavian : 78.03 +- 3.02\n\nMaltehb/-l-ctra-danish-electra-small-cased: 57.87 +- 3.19\n\nNbAiLab/nb-bert-base : 30.24 +- 1.21\n\nRandomly initialised RoBERTa model: 19.79 +- 2.00\n\nEvaluation on Sentiment analysis in Dansish\nHere are the results on test set, where each model has been trained 5 times, and the “+-” refers to a 95% confidence interval of the mean score:\n\nMaltehb/danish-bert-botxo: 65.19 +- 0.53\n\nNbAiLab/nb-bert-base : 63.80 +- 0.77\n\nxlm-roberta-base : 63.55 +- 1.59\n\nflax-community/nordic-roberta-wiki : 56.46 +- 1.77\n\nflax-community/roberta-base-danish : 54.73 +- 8.96\n\nflax-community/roberta-base-scandinavian : 44.28 +- 9.21\n\nMaltehb/-l-ctra-danish-electra-small-cased : 47.78 +- 12.65\n\nRandomly initialised RoBERTa model: 36.96 +- 1.02\n\nMaltehb/roberta-base-scandinavian : 33.65 +- 8.32",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# Nordic Roberta Wikipedia",
"## Description\nNord roberta model trainined on the swedish danish and norwegian wikipedia.",
"## Evaluation\nEvaluation on Named Entity recognition in Danish.\n\nI finetuned each model on 3 epochs on DaNE, repeated it 5 times for each model, and calculated 95% confidence intervals for the means. Here are the results:\n\nxlm-roberta-base : 88.01 +- 0.43\n\nflax-community/nordic-roberta-wiki: 85.75 +- 0.69 (this model)\n\nMaltehb/danish-bert-botxo: 85.38 +- 0.55\n\nflax-community/roberta-base-danish: 80.14 +- 1.47\n\nflax-community/roberta-base-scandinavian : 78.03 +- 3.02\n\nMaltehb/-l-ctra-danish-electra-small-cased: 57.87 +- 3.19\n\nNbAiLab/nb-bert-base : 30.24 +- 1.21\n\nRandomly initialised RoBERTa model: 19.79 +- 2.00\n\nEvaluation on Sentiment analysis in Dansish\nHere are the results on test set, where each model has been trained 5 times, and the “+-” refers to a 95% confidence interval of the mean score:\n\nMaltehb/danish-bert-botxo: 65.19 +- 0.53\n\nNbAiLab/nb-bert-base : 63.80 +- 0.77\n\nxlm-roberta-base : 63.55 +- 1.59\n\nflax-community/nordic-roberta-wiki : 56.46 +- 1.77\n\nflax-community/roberta-base-danish : 54.73 +- 8.96\n\nflax-community/roberta-base-scandinavian : 44.28 +- 9.21\n\nMaltehb/-l-ctra-danish-electra-small-cased : 47.78 +- 12.65\n\nRandomly initialised RoBERTa model: 36.96 +- 1.02\n\nMaltehb/roberta-base-scandinavian : 33.65 +- 8.32",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL"
] |
[
57,
5,
21,
459,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us \n# Nordic Roberta Wikipedia## Description\nNord roberta model trainined on the swedish danish and norwegian wikipedia."
] |
[
0.0036078805569559336,
0.04913231357932091,
-0.003990889526903629,
0.13134270906448364,
0.005116974003612995,
-0.018284626305103302,
0.12132487446069717,
0.04147280752658844,
0.06385140120983124,
-0.012629526667296886,
0.17513717710971832,
0.11155100166797638,
0.04507385194301605,
-0.03787614032626152,
0.05525845289230347,
-0.3988005518913269,
0.09900664538145065,
0.03369685634970665,
-0.10880985856056213,
0.061186064034700394,
0.10895995795726776,
-0.006851235404610634,
0.03682504594326019,
0.03126882016658783,
-0.047084059566259384,
0.02787984162569046,
-0.014529045671224594,
-0.04319626837968826,
0.15301404893398285,
0.02701313979923725,
0.11609747260808945,
0.08648254722356796,
0.07423172146081924,
-0.030201895162463188,
0.03333274647593498,
-0.08586053550243378,
-0.08883020281791687,
0.04243744909763336,
-0.0314248763024807,
-0.011886469088494778,
0.24630609154701233,
0.07015177607536316,
0.043222006410360336,
-0.04783151298761368,
-0.026511460542678833,
-0.23015761375427246,
-0.01919630914926529,
0.10097656399011612,
-0.033233240246772766,
0.05114758014678955,
0.009143874980509281,
0.0292847640812397,
-0.1599147766828537,
0.08521055430173874,
0.1841105818748474,
-0.2701883614063263,
-0.0720284953713417,
0.15462860465049744,
0.1801621913909912,
0.042015865445137024,
-0.09097737818956375,
0.09674566239118576,
-0.027371753007173538,
0.051520705223083496,
0.09751436859369278,
-0.10339558124542236,
-0.03413126617670059,
-0.02943003922700882,
-0.11345111578702927,
0.08627044409513474,
0.20653969049453735,
-0.009835334494709969,
0.04819488152861595,
-0.012567461468279362,
0.0044049168936908245,
0.09678676724433899,
-0.042544540017843246,
-0.03906048461794853,
-0.03594799339771271,
-0.029518799856305122,
-0.02650679461658001,
-0.13157391548156738,
-0.1158682182431221,
-0.02901730127632618,
-0.07687762379646301,
0.15283961594104767,
-0.0034374892711639404,
0.06129975616931915,
-0.04372149333357811,
-0.018676111474633217,
-0.21849042177200317,
-0.11945053935050964,
-0.02307281456887722,
-0.07926159352064133,
0.00927785411477089,
-0.010567097924649715,
0.0113181471824646,
-0.14544273912906647,
0.09835503250360489,
0.1974552720785141,
0.047932252287864685,
0.02955194190144539,
0.03809613361954689,
0.13309074938297272,
0.004590197931975126,
0.06857126206159592,
-0.06346127390861511,
-0.09946686774492264,
0.027926115319132805,
-0.07804277539253235,
0.024346470832824707,
-0.050730761140584946,
-0.16270945966243744,
-0.11959780007600784,
0.0016999199287965894,
-0.025916237384080887,
0.003324228571727872,
0.049287233501672745,
0.04215604066848755,
0.033870819956064224,
0.019169799983501434,
-0.04032187536358833,
-0.003475968027487397,
-0.007722395472228527,
0.025611160323023796,
0.016762705519795418,
0.058633480221033096,
0.006780585274100304,
-0.02098037488758564,
0.16852916777133942,
-0.026497066020965576,
0.02187906578183174,
0.015398946590721607,
-0.09560206532478333,
0.05314801633358002,
-0.15379396080970764,
0.04099370539188385,
-0.13112415373325348,
-0.07271712273359299,
0.017453504726290703,
0.14927326142787933,
-0.0686306431889534,
-0.03672316297888756,
-0.04564059153199196,
-0.07424373179674149,
0.05409111827611923,
-0.03814557567238808,
-0.00968119129538536,
-0.04156745597720146,
0.015019499696791172,
-0.1590186208486557,
0.08158529549837112,
-0.1751667559146881,
0.0011175823165103793,
-0.1349671185016632,
-0.01690983958542347,
-0.24160121381282806,
-0.06168064847588539,
-0.1410723328590393,
0.028682470321655273,
-0.08355177193880081,
-0.08545897901058197,
-0.1093427911400795,
0.03532671183347702,
0.022920608520507812,
0.11706084758043289,
-0.1130998358130455,
-0.0496942512691021,
0.14917445182800293,
-0.14984363317489624,
-0.045245781540870667,
0.10350679606199265,
0.014294823631644249,
0.11870124191045761,
0.04607287421822548,
0.21343737840652466,
0.06189631298184395,
-0.05437556654214859,
0.07796172052621841,
0.11085157841444016,
-0.06004840508103371,
-0.10527656227350235,
0.035988904535770416,
-0.015728192403912544,
0.033758360892534256,
0.0416998453438282,
-0.08829405903816223,
0.006453369278460741,
0.018989460542798042,
-0.03340394049882889,
0.031932372599840164,
-0.03401223570108414,
0.07092494517564774,
0.11180873215198517,
0.06699322909116745,
-0.07085379213094711,
-0.06858768314123154,
-0.014568827114999294,
-0.021182745695114136,
-0.03164537250995636,
0.06710673123598099,
0.004173537716269493,
0.17553256452083588,
-0.024075957015156746,
-0.0017313896678388119,
-0.10210276395082474,
-0.05942472442984581,
-0.06538690626621246,
0.05629614368081093,
0.05782024934887886,
0.14960862696170807,
0.12788830697536469,
-0.024080784991383553,
-0.05946603789925575,
0.09550440311431885,
-0.011605328880250454,
0.013235551305115223,
0.04535974934697151,
-0.23148120939731598,
0.03037380985915661,
-0.06342572718858719,
-0.06394334137439728,
-0.06667610257863998,
-0.035342514514923096,
0.049724746495485306,
0.13131628930568695,
-0.02636781521141529,
0.0196401234716177,
-0.08663461357355118,
0.03404773771762848,
-0.035062290728092194,
0.04770076274871826,
0.07501603662967682,
-0.03411271423101425,
-0.1413986086845398,
0.18673095107078552,
0.06414635479450226,
0.07247298210859299,
0.1308080106973648,
-0.1852816492319107,
-0.06144286319613457,
0.01146841049194336,
-0.006488942075520754,
0.03186577185988426,
0.027886485680937767,
-0.015061015263199806,
0.006463650148361921,
0.01612059399485588,
0.08656836301088333,
-0.064210444688797,
0.025117700919508934,
0.0671204999089241,
-0.07494037598371506,
-0.06338292360305786,
0.14942575991153717,
0.21044674515724182,
-0.1822947859764099,
0.09117548912763596,
0.12086045742034912,
-0.1502632200717926,
0.24755316972732544,
0.04801466688513756,
0.005458850879222155,
0.009723231196403503,
-0.07082004100084305,
-0.010454386472702026,
0.24379310011863708,
-0.07286620885133743,
0.0020349195692688227,
0.02542276494204998,
-0.0329059474170208,
-0.006579125300049782,
-0.07603424787521362,
-0.06046319380402565,
-0.03223146125674248,
-0.014110716991126537,
-0.06005892530083656,
0.09742417186498642,
-0.07075753808021545,
0.05706004798412323,
-0.01059755589812994,
-0.2262439876794815,
0.051370684057474136,
0.015200864523649216,
-0.01767558977007866,
0.20434457063674927,
-0.053112756460905075,
-0.1500471979379654,
-0.16858205199241638,
-0.0941658541560173,
-0.009158184751868248,
0.0036449814215302467,
0.09207804501056671,
-0.03533579036593437,
-0.07180845737457275,
-0.024657918140292168,
0.08058707416057587,
-0.007294115144759417,
-0.008859082125127316,
-0.1724744737148285,
0.027262376621365547,
-0.077340267598629,
-0.07811606675386429,
-0.02753344364464283,
-0.1061178520321846,
0.05198638513684273,
0.015169176273047924,
-0.09456682950258255,
0.07685568928718567,
-0.005701498594135046,
-0.008143818005919456,
0.02965601161122322,
0.03475979343056679,
0.13922451436519623,
-0.059429895132780075,
0.09310918301343918,
0.07784924656152725,
-0.05454041808843613,
0.06226244941353798,
0.23422600328922272,
0.09265138953924179,
0.041218746453523636,
-0.0787961408495903,
-0.0015880785649642348,
-0.1457848697900772,
-0.13849474489688873,
-0.0377696268260479,
-0.07396340370178223,
0.044848762452602386,
0.07950036227703094,
0.03864673897624016,
-0.022124800831079483,
0.1318729817867279,
0.0542328804731369,
-0.09056402742862701,
0.08353206515312195,
0.05109618976712227,
0.09971720725297928,
0.013296607881784439,
0.11431261897087097,
-0.04885748401284218,
-0.07983838766813278,
-0.00269356113858521,
-0.0014277818845584989,
0.11319108307361603,
0.03726998344063759,
0.01540279109030962,
0.08547565340995789,
0.15382333099842072,
0.08866552263498306,
0.08014129847288132,
0.011666999198496342,
-0.06618393212556839,
0.026914238929748535,
-0.08163408935070038,
0.044693201780319214,
0.02393832802772522,
0.00867446418851614,
-0.07822719216346741,
0.009344839490950108,
-0.06910824775695801,
-0.009547220543026924,
0.07187145203351974,
0.02394367940723896,
-0.14711114764213562,
-0.03720439225435257,
0.022487087175250053,
0.002750533167272806,
-0.02662207931280136,
0.020977409556508064,
0.0027431256603449583,
-0.12101010978221893,
0.06553895771503448,
-0.10856500267982483,
0.09167090058326721,
0.07183182239532471,
0.022995049133896828,
-0.004863490350544453,
-0.0034400068689137697,
-0.004900575615465641,
0.05536392703652382,
-0.040460262447595596,
0.3039519190788269,
-0.03074934519827366,
0.04644596204161644,
-0.05953477323055267,
-0.021346483379602432,
0.0001448413240723312,
0.20833225548267365,
0.26345762610435486,
0.00020170539210084826,
-0.1664072871208191,
0.034697651863098145,
-0.004734077025204897,
0.015382624231278896,
0.008886287920176983,
-0.07263653725385666,
0.012499830685555935,
-0.01878112554550171,
0.00041691891965456307,
-0.04591108858585358,
-0.013885816559195518,
-0.0008279465255327523,
-0.10132692009210587,
0.02626592107117176,
-0.03979041799902916,
-0.12181224673986435,
0.042134664952754974,
-0.10099534690380096,
-0.22580088675022125,
0.22839008271694183,
-0.09225991368293762,
-0.08198418468236923,
-0.10384006798267365,
0.03479337319731712,
0.10222000628709793,
-0.06399495154619217,
0.030601955950260162,
-0.0401122123003006,
0.04734402894973755,
-0.1319449245929718,
-0.07243773341178894,
0.04478159919381142,
-0.09895992279052734,
0.009104176424443722,
-0.0013300285208970308,
0.13172946870326996,
0.05743344873189926,
0.013656532391905785,
0.07406222075223923,
0.08506909012794495,
-0.08182206749916077,
-0.08437597006559372,
-0.026710493490099907,
-0.0885675847530365,
-0.041022494435310364,
-0.08499351143836975,
-0.04944176971912384,
-0.05124514922499657,
0.010797887109220028,
0.0361957810819149,
0.08422708511352539,
0.21020302176475525,
-0.09924639016389847,
0.14767122268676758,
0.19340282678604126,
-0.04431920871138573,
-0.35918569564819336,
-0.07428862899541855,
-0.04996239021420479,
0.03378034755587578,
0.10840114951133728,
-0.06120041385293007,
0.1995614916086197,
0.009402395226061344,
-0.05704890191555023,
-0.04762202128767967,
-0.13134512305259705,
-0.10349998623132706,
0.08840297162532806,
0.07160105556249619,
0.29583144187927246,
0.011086126789450645,
-0.0313066802918911,
-0.018570123240351677,
-0.15470467507839203,
0.05177601799368858,
-0.06316779553890228,
0.07789596170186996,
-0.029092246666550636,
0.08391723036766052,
0.04233105480670929,
-0.04161945730447769,
0.08263800293207169,
-0.05011887475848198,
-0.047934915870428085,
-0.15521317720413208,
0.015468412078917027,
0.11412160843610764,
0.0003057676076423377,
0.05887953191995621,
-0.07920803129673004,
-0.041019078344106674,
0.024414140731096268,
-0.03256722167134285,
-0.0987938642501831,
0.18291467428207397,
-0.0595722533762455,
-0.0636981651186943,
-0.03737948089838028,
0.0777968093752861,
0.014326709322631359,
0.012032396160066128,
0.3073813319206238,
-0.10110754519701004,
0.08754238486289978,
-0.03770873695611954,
0.15487848222255707,
0.025883203372359276,
-0.09005256742238998,
0.03540784493088722,
-0.08265531808137894,
0.038523707538843155,
-0.09379970282316208,
0.016610749065876007,
0.11076784133911133,
0.016387321054935455,
-0.029827941209077835,
0.03699066862463951,
-0.06017334386706352,
-0.04387189447879791,
0.1349877417087555,
-0.18248037993907928,
-0.053768061101436615,
-0.0059667304158210754,
-0.17823925614356995,
0.06397169083356857,
0.005723165348172188,
0.1429004818201065,
-0.0712248757481575,
0.0027676669415086508,
0.029337547719478607,
0.002297169528901577,
-0.07266992330551147,
-0.04421217367053032,
0.09608137607574463,
0.03518444672226906,
-0.07800821959972382,
0.029154084622859955,
-0.01734456606209278,
-0.1007080078125,
0.013195597566664219,
0.1730371117591858,
-0.10326583683490753,
-0.1020163968205452,
0.0223692599684,
0.11191373318433762,
-0.19280101358890533,
-0.03599182143807411,
-0.1338675618171692,
-0.11284802109003067,
-0.019963521510362625,
0.16080063581466675,
0.07489379495382309,
-0.044481679797172546,
0.007549886126071215,
-0.05610077083110809,
-0.053378138691186905,
0.07238935679197311,
-0.055059049278497696,
-0.007805669214576483,
-0.04836181178689003,
-0.08327620476484299,
-0.03922678530216217,
0.06622803956270218,
-0.08538195490837097,
0.07442011684179306,
-0.09276413917541504,
0.0030980268493294716,
0.052880026400089264,
-0.0192128736525774,
-0.11538702994585037,
-0.02217487059533596,
-0.039402130991220474,
-0.07791832089424133,
-0.05544281378388405,
0.015493855811655521,
-0.10016825050115585,
0.03207109123468399,
0.004881718195974827,
-0.04543091356754303,
-0.07449576258659363,
-0.04408711940050125,
0.05716238170862198,
-0.004120849538594484,
0.0356246680021286,
-0.014005685225129128,
0.006258505396544933,
0.17312084138393402,
-0.16908347606658936,
0.04382745549082756,
-0.04781526327133179,
0.005547453183680773,
0.04213235154747963,
0.09490978717803955,
-0.04693077504634857,
0.025048093870282173,
0.04238545522093773,
0.06315441429615021,
-0.09570977836847305,
-0.10070407390594482,
-0.014610261656343937,
0.041260991245508194,
-0.14643260836601257,
-0.0003278898948337883,
0.11171698570251465,
0.08019769936800003,
0.04910845682024956,
0.13816207647323608,
-0.03686930984258652,
0.044180985540151596,
-0.002392902970314026,
0.03930071368813515,
0.023307520896196365,
-0.11769478768110275,
-0.0030035299714654684,
-0.060666028410196304,
-0.024769501760601997,
-0.082926444709301,
0.09764476120471954,
0.11807012557983398,
-0.015634046867489815,
0.04747461900115013,
0.008246924728155136,
0.04460306838154793,
0.025289086624979973,
0.19376371800899506,
-0.008648411370813847,
-0.02461305446922779,
-0.08116557449102402,
0.04417805373668671,
-0.06389766931533813,
0.09651705622673035,
0.1049967035651207,
0.08020399510860443,
0.12692278623580933,
0.06435597687959671,
0.11924776434898376,
0.1281173825263977,
0.02515646070241928,
0.003045202698558569,
0.11418258398771286,
0.050681423395872116,
0.02546343021094799,
0.039490971714258194,
0.20145271718502045,
-0.017364585772156715,
0.020883820950984955,
0.019305899739265442,
-0.07736142724752426,
-0.21348556876182556,
-0.29942476749420166,
-0.11291813105344772,
0.04191126301884651,
0.06358128041028976,
-0.060595735907554626,
-0.013703732751309872,
-0.01701781526207924,
0.06426572799682617,
-0.04677699878811836,
0.037235938012599945,
-0.007193483412265778,
-0.06518281251192093,
0.04438953101634979,
0.014340502209961414,
-0.01728052645921707,
0.06952449679374695,
0.010856823064386845,
-0.08332354575395584,
-0.016833243891596794,
-0.1024879515171051,
-0.020346039906144142,
-0.009241485968232155,
-0.06924182176589966,
-0.05732830986380577,
-0.07796225696802139,
-0.016604186967015266,
0.02650856226682663,
0.08383140712976456,
0.032628800719976425,
0.03707326948642731,
-0.055961672216653824,
-0.0478675402700901,
0.15528999269008636,
0.058092162013053894,
-0.026707090437412262,
-0.07800065726041794,
0.05554576963186264,
-0.0067677246406674385,
0.08614906668663025,
-0.058075904846191406,
-0.001236020470969379,
0.09617182612419128,
0.20539207756519318,
0.305245041847229,
0.06562494486570358,
0.06978528201580048,
0.03445831313729286,
-0.0078037395142018795,
0.04924763739109039,
-0.005385827738791704,
-0.02248363383114338,
0.14062434434890747,
-0.07517871260643005,
-0.05682774633169174,
-0.10539243370294571,
0.0254522617906332,
-0.10501430183649063,
-0.022740697488188744,
0.09033948183059692,
-0.043828967958688736,
-0.11482791602611542,
0.1371275931596756,
-0.1523519605398178,
-0.03833189979195595,
0.10411084443330765,
-0.10130922496318817,
-0.15197189152240753,
-0.07397479563951492,
0.09668254852294922,
0.04876112937927246,
0.05286167562007904,
-0.05476865544915199,
-0.01543105486780405,
-0.06814485788345337,
0.021082453429698944,
-0.11452262103557587,
-0.04383838176727295,
0.03581596165895462,
0.07464370876550674,
0.17801514267921448,
-0.025251001119613647,
0.029270756989717484,
0.12610794603824615,
0.00653444230556488,
0.028520602732896805,
-0.0025774408131837845,
0.06976757198572159,
-0.016327250748872757,
-0.007000327575951815,
0.13900019228458405,
-0.003612607717514038,
-0.014811274595558643,
0.045024581253528595,
-0.08547835052013397,
0.013832975178956985,
-0.06578513979911804,
-0.1260143220424652,
0.028397681191563606,
0.17639270424842834,
-0.08938448876142502,
0.09009428322315216,
0.1664908528327942,
0.027690045535564423,
-0.1036604717373848,
-0.08918928354978561,
0.02732943743467331,
0.052681028842926025,
-0.0584535151720047,
-0.08214867115020752,
-0.0520007386803627,
-0.07099056243896484,
0.005702633876353502,
-0.0561092346906662,
-0.09995803982019424,
-0.05248762667179108,
-0.1283196359872818,
0.01605515368282795,
-0.08950793743133545,
0.0017992894863709807,
0.08844295889139175,
-0.02777537703514099,
0.004232664126902819,
-0.010986491106450558,
0.054936155676841736,
0.0017517656087875366,
-0.12736445665359497,
-0.09974245727062225
] |
null | null |
transformers
|
# GPT2-svenska-wikipedia
A norwegian GPT2 style model trained using Flax CLM pipeline on the Norwegian
part of the wiki40b dataset.
https://huggingface.co/datasets/wiki40b
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/
## Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki
# Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki
## Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki
## Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki
## Roberta models
## Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki
## Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar
## Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi
## Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish
## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
```python
from datasets import load_dataset
def load_and_clean_wiki():
dataset = load_dataset('wiki40b', 'no', beam_runner='DirectRunner', split="train")
#dataset = load_dataset('wiki40b', 'sv', beam_runner='DirectRunner')
dataset = dataset.remove_columns(['wikidata_id', 'version_id'])
filtered_dataset = dataset.map(filter_wikipedia)
# filtered_dataset[:3]
# print(filtered_dataset[:3])
return filtered_dataset
def filter_wikipedia(batch):
batch["text"] = " ".join(batch["text"].split("\
_START_SECTION_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_PARAGRAPH_\
"))
batch["text"] = " ".join(batch["text"].split("_NEWLINE_"))
batch["text"] = " ".join(batch["text"].split("\xa0"))
return batch
```
## Training script
The following training script was used to train the model.
```bash
./run_clm_flax.py --output_dir="${MODEL_DIR}" --model_type="gpt2" --config_name="${MODEL_DIR}" --tokenizer_name="${MODEL_DIR}" --dataset_name="wiki40b" --dataset_config_name="no" --do_train --do_eval --block_size="512" --per_device_train_batch_size="64" --per_device_eval_batch_size="64" --learning_rate="5e-3" --warmup_steps="1000" --adam_beta1="0.9" --adam_beta2="0.98" --weight_decay="0.01" --overwrite_output_dir --num_train_epochs="20" --logging_steps="500" --save_steps="1000" --eval_steps="2500" --push_to_hub
```
|
{"language": false, "widget": [{"text": "Det er flott"}]}
|
text-generation
|
flax-community/norsk-gpt-wiki
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"no",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"no"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #no #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# GPT2-svenska-wikipedia
A norwegian GPT2 style model trained using Flax CLM pipeline on the Norwegian
part of the wiki40b dataset.
URL
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
## Training script
The following training script was used to train the model.
|
[
"# GPT2-svenska-wikipedia\nA norwegian GPT2 style model trained using Flax CLM pipeline on the Norwegian\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #no #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# GPT2-svenska-wikipedia\nA norwegian GPT2 style model trained using Flax CLM pipeline on the Norwegian\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
65,
36,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6,
40,
14
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #no #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# GPT2-svenska-wikipedia\nA norwegian GPT2 style model trained using Flax CLM pipeline on the Norwegian\npart of the wiki40b dataset.\n\nURL## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.## Gpt models## Swedish Gpt\nURL## Swedish gpt wiki\nURL# Nordic gpt wiki\nURL## Dansk gpt wiki\nURL## Norsk gpt wiki\nURL## Roberta models## Nordic Roberta Wiki\nURL## Swe Roberta Wiki Oscar\nURL## Roberta Swedish Scandi\nURL## Roberta Swedish\nURL## Swedish T5 model\nURL## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.## Training script\nThe following training script was used to train the model."
] |
[
-0.10281992703676224,
0.20336711406707764,
-0.0004765480989590287,
0.0735694095492363,
0.06431236863136292,
0.008904458954930305,
0.059201955795288086,
0.1258263736963272,
-0.010483134537935257,
0.08233809471130371,
0.11057853698730469,
0.0014862640528008342,
0.09724497050046921,
0.18468505144119263,
0.06440691649913788,
-0.26957252621650696,
0.09113600850105286,
-0.05015559867024422,
-0.018582476302981377,
0.09988927096128464,
0.08237846940755844,
-0.07426749169826508,
0.05925898253917694,
-0.06460831314325333,
-0.031024741008877754,
-0.026376888155937195,
-0.02836610935628414,
-0.06449860334396362,
0.12338583916425705,
0.059362269937992096,
0.05656426399946213,
0.0843573734164238,
0.09961074590682983,
-0.10756919533014297,
0.025911713019013405,
0.04788890853524208,
-0.01263013668358326,
0.040160320699214935,
0.048865027725696564,
0.0055123004131019115,
0.1970592588186264,
-0.012550546787679195,
0.06093047559261322,
0.011348124593496323,
-0.08706875145435333,
-0.25194236636161804,
-0.08630545437335968,
0.05255890637636185,
0.06471532583236694,
0.12388253211975098,
-0.05505336448550224,
0.09867393225431442,
-0.11827028542757034,
0.09271813929080963,
0.12283826619386673,
-0.2161046266555786,
-0.05731978267431259,
0.1162797138094902,
0.0745883509516716,
0.04978431016206741,
-0.0693073645234108,
0.0733272135257721,
-0.00941323023289442,
0.05682365223765373,
0.05217242240905762,
-0.013128895312547684,
-0.04136756435036659,
-0.002383230719715357,
-0.1222745031118393,
-0.010413448326289654,
0.11500140279531479,
0.007658054586499929,
-0.04798036068677902,
-0.14195822179317474,
-0.061941683292388916,
-0.06242381036281586,
0.008134530857205391,
0.006994124501943588,
-0.004834945313632488,
-0.019218679517507553,
-0.048451244831085205,
-0.07656796276569366,
-0.08715501427650452,
-0.08023319393396378,
0.02477635070681572,
0.10160182416439056,
0.04371057078242302,
0.007354416884481907,
-0.009261388331651688,
0.15621764957904816,
-0.053527478128671646,
-0.13762448728084564,
-0.07550859451293945,
-0.03305370733141899,
-0.06119479984045029,
-0.019151480868458748,
0.018475908786058426,
-0.1407935917377472,
0.023714415729045868,
0.24080859124660492,
0.03515353053808212,
0.024376951158046722,
0.037110842764377594,
0.0018206443637609482,
0.02860662154853344,
0.09589220583438873,
-0.14392933249473572,
-0.08023930341005325,
0.016655031591653824,
-0.02262362837791443,
0.04077253118157387,
-0.038060806691646576,
-0.000961104582529515,
-0.019003883004188538,
0.05840225890278816,
0.1011514663696289,
0.03938697651028633,
0.0335061214864254,
-0.03459563106298447,
0.003179508727043867,
0.0988442450761795,
-0.15047752857208252,
-0.0014023368712514639,
0.00029989657923579216,
-0.023525498807430267,
-0.005015261471271515,
0.07682227343320847,
-0.005888098850846291,
-0.07291305065155029,
0.14487434923648834,
-0.051809586584568024,
-0.030314235016703606,
-0.02271786704659462,
-0.12004867941141129,
0.03250554949045181,
-0.1008763387799263,
-0.005280784796923399,
-0.08582759648561478,
-0.21114924550056458,
-0.07163446396589279,
0.07227092236280441,
-0.06530630588531494,
-0.0028871819376945496,
-0.023513982072472572,
-0.039412640035152435,
0.03556590899825096,
-0.03005840629339218,
0.112040214240551,
-0.0692097395658493,
0.07094239443540573,
-0.1414613276720047,
0.09391143172979355,
-0.004117479082196951,
0.013575484044849873,
-0.1546846628189087,
0.021219417452812195,
-0.18686333298683167,
0.027941951528191566,
-0.1360020935535431,
-0.0014741076156497002,
-0.1259825974702835,
-0.07259838283061981,
-0.028445903211832047,
0.026419630274176598,
0.04433196410536766,
0.17178983986377716,
-0.18156109750270844,
-0.01599445566534996,
0.2763594090938568,
-0.172545924782753,
0.03735712170600891,
0.11034794896841049,
0.034753791987895966,
0.09743038564920425,
0.08340369910001755,
0.16196122765541077,
0.032508108764886856,
-0.1086723580956459,
-0.028211040422320366,
-0.006602988112717867,
-0.022368405014276505,
0.0536237433552742,
0.0765221118927002,
-0.08066238462924957,
0.05671178922057152,
0.030800634995102882,
-0.08819282799959183,
0.04707261174917221,
-0.029078831896185875,
-0.05711952969431877,
0.007667953614145517,
-0.05602613091468811,
-0.03930799663066864,
0.055941641330718994,
-0.0024801117833703756,
-0.05141063407063484,
-0.14597371220588684,
-0.10480477660894394,
0.11721041053533554,
-0.11439385265111923,
0.040070462971925735,
-0.07807435095310211,
0.04999605566263199,
0.020897742360830307,
-0.0023533508647233248,
-0.10635366290807724,
-0.15115003287792206,
-0.03855258971452713,
-0.05265457183122635,
-0.08946310728788376,
-0.01744494028389454,
0.06964569538831711,
0.11321304738521576,
-0.04714616760611534,
-0.06371986120939255,
-0.03236745297908783,
-0.005608674604445696,
0.0008325835224241018,
-0.17158307135105133,
-0.02022000402212143,
-0.06313887983560562,
0.1530103236436844,
-0.1588895320892334,
0.014634781517088413,
0.11054161190986633,
0.1613890379667282,
0.022489624097943306,
-0.09448164701461792,
0.052131153643131256,
0.0035973305348306894,
-0.01939256116747856,
-0.1332474648952484,
0.022336149588227272,
-0.019182683899998665,
-0.03849778696894646,
0.051878031343221664,
-0.036274246871471405,
-0.048414696007966995,
0.10949986428022385,
0.17713859677314758,
-0.1282748132944107,
0.14654721319675446,
-0.05023525282740593,
-0.020598769187927246,
-0.08618614822626114,
0.001329918042756617,
0.000617955403868109,
0.0493096187710762,
0.07901883125305176,
-0.05305122584104538,
-0.006146419793367386,
0.010678457096219063,
-0.005847841501235962,
-0.03352264687418938,
0.1433825045824051,
0.12331956624984741,
-0.15239059925079346,
0.09234163910150528,
-0.02626669406890869,
-0.01674993336200714,
0.2686646580696106,
0.035459112375974655,
-0.07645347714424133,
0.007754579186439514,
0.01430694479495287,
0.026981940492987633,
0.17311830818653107,
0.012255472131073475,
0.036811068654060364,
0.04075686261057854,
-0.010962137952446938,
0.020541435107588768,
-0.06438608467578888,
-0.059381891041994095,
-0.003968477249145508,
-0.07386411726474762,
0.04249121621251106,
0.08452711254358292,
-0.08308034390211105,
0.06935252994298935,
-0.02448127046227455,
-0.03932525962591171,
0.009460864588618279,
0.007194902282208204,
-0.10298469662666321,
0.21198637783527374,
-0.058051854372024536,
-0.21411919593811035,
-0.11869414895772934,
0.051447488367557526,
0.020872147753834724,
-0.014922072179615498,
0.09520968049764633,
-0.08063703775405884,
-0.1512679010629654,
-0.12232045829296112,
0.08767526596784592,
0.024137090891599655,
-0.04651973769068718,
-0.11674454063177109,
-0.02799374796450138,
-0.021715698763728142,
-0.11930473893880844,
0.01818097196519375,
0.04028982296586037,
-0.0504516065120697,
0.08568129688501358,
-0.012738712131977081,
0.1171075701713562,
0.07144982367753983,
0.02906285598874092,
0.00011909232853213325,
0.03867172449827194,
0.2082652896642685,
-0.11896401643753052,
0.1259116679430008,
0.08111322671175003,
-0.014035400934517384,
0.02585078775882721,
0.12221228331327438,
0.01148625835776329,
-0.0633968859910965,
-0.003616008209064603,
0.043581970036029816,
-0.08541767299175262,
-0.18992716073989868,
-0.09826576709747314,
0.018682286143302917,
0.03996836766600609,
0.09482229501008987,
0.10481519997119904,
-0.06263072788715363,
0.06297363340854645,
-0.08290884643793106,
-0.169550821185112,
0.07041972875595093,
0.06330045312643051,
-0.05824463441967964,
-0.024369293823838234,
0.08960247039794922,
-0.053690921515226364,
0.047911208122968674,
0.09556553512811661,
-0.054279353469610214,
0.148762509226799,
-0.03860575333237648,
0.07266203314065933,
0.06343411654233932,
0.08661482483148575,
0.06258397549390793,
0.10172762721776962,
0.04418427497148514,
-0.025570785626769066,
0.023441137745976448,
-0.0591474249958992,
0.0001653593935770914,
0.033141810446977615,
-0.06541857123374939,
-0.0823155865073204,
-0.012028978206217289,
-0.020209388807415962,
-0.012262411415576935,
0.19993974268436432,
0.07601106911897659,
-0.2304098904132843,
-0.09664269536733627,
0.03782948851585388,
-0.028683839365839958,
-0.09814322739839554,
-0.0237478818744421,
0.10833168774843216,
-0.170662522315979,
0.05199328437447548,
-0.04907471686601639,
0.06004558503627777,
-0.03573198989033699,
-0.04451983422040939,
0.03797218203544617,
0.008852802217006683,
-0.046040721237659454,
0.09388592094182968,
-0.1705562174320221,
0.12338938564062119,
-0.003839407581835985,
0.11060919612646103,
-0.04998389631509781,
0.014571146108210087,
-0.009528028778731823,
0.11406824737787247,
0.2621366083621979,
0.024511514231562614,
-0.10679136216640472,
-0.09945893287658691,
-0.139870285987854,
0.041638460010290146,
-0.022968867793679237,
-0.0833282545208931,
0.06633754074573517,
0.01057981513440609,
-0.0090761324390769,
-0.054601024836301804,
-0.0020693729165941477,
-0.15588177740573883,
-0.09283799678087234,
0.013288980349898338,
-0.05023621767759323,
0.12159857898950577,
-0.07022567838430405,
-0.09181012958288193,
-0.12479116767644882,
0.23895400762557983,
-0.10346133261919022,
-0.1309354156255722,
-0.162136510014534,
0.07405726611614227,
0.12511172890663147,
-0.0797959417104721,
0.03830587491393089,
-0.00575905479490757,
0.0686657577753067,
-0.07093678414821625,
-0.04197002202272415,
0.07320679724216461,
-0.08090586960315704,
-0.16931159794330597,
-0.0024951715022325516,
0.088139109313488,
0.1301177740097046,
0.05205746740102768,
0.008827483281493187,
0.07142187654972076,
-0.006937484256923199,
-0.13207750022411346,
0.03918808698654175,
0.15711194276809692,
0.02147863805294037,
-0.02745409682393074,
-0.09516419470310211,
-0.022998003289103508,
-0.012946845963597298,
-0.09192296862602234,
0.15349824726581573,
0.22963300347328186,
-0.08633623272180557,
0.1313609778881073,
0.10877585411071777,
-0.03534429520368576,
-0.303824782371521,
-0.05763392150402069,
-0.029351279139518738,
0.08902165293693542,
0.03169567510485649,
-0.22610591351985931,
0.08131182938814163,
0.1326867640018463,
-0.02679194137454033,
0.08210619539022446,
-0.25047045946121216,
-0.12162434309720993,
0.11536016315221786,
0.09194197505712509,
0.03819096460938454,
-0.06591393053531647,
-0.013636424206197262,
0.01696465164422989,
-0.12069398909807205,
0.06814812123775482,
-0.11344152688980103,
0.09027805924415588,
0.0005568409105762839,
-0.017451833933591843,
0.01966855116188526,
-0.08958063274621964,
0.15644022822380066,
0.004807196091860533,
0.01590399444103241,
-0.09264110028743744,
0.12102200090885162,
0.08647046983242035,
-0.024579258635640144,
0.17547911405563354,
-0.028780195862054825,
0.020918307825922966,
-0.09975676238536835,
-0.07332106679677963,
-0.10315154492855072,
0.12341434508562088,
-0.0777185931801796,
-0.07155197858810425,
-0.055174265056848526,
0.13001209497451782,
0.061921242624521255,
-0.012186845764517784,
0.08780530095100403,
-0.10288602858781815,
0.04787292703986168,
0.0416235588490963,
0.06364519894123077,
0.018223298713564873,
-0.04900418967008591,
-0.0035592576023191214,
-0.04939217120409012,
0.06822323799133301,
-0.07372285425662994,
0.02803218737244606,
0.08562704920768738,
0.03284763917326927,
0.05660514533519745,
-0.01226166170090437,
-0.14178575575351715,
-0.01990419253706932,
0.06366187334060669,
-0.24551905691623688,
-0.07702858000993729,
-0.015285166911780834,
-0.11188739538192749,
0.006056054029613733,
-0.061895035207271576,
0.13426095247268677,
-0.083196260035038,
0.004208937752991915,
-0.0029779423493891954,
0.05158137157559395,
0.000055033426178852096,
0.1919575333595276,
0.04349936917424202,
0.05096804350614548,
-0.11452589184045792,
0.0768103078007698,
0.04283960163593292,
-0.09698348492383957,
0.052718471735715866,
0.15038788318634033,
-0.15698140859603882,
-0.09401247650384903,
-0.011720619164407253,
0.10786303132772446,
-0.05684511736035347,
-0.06977879256010056,
-0.09878117591142654,
-0.04076439142227173,
0.02660917304456234,
0.02073776349425316,
0.0370299257338047,
0.06684751808643341,
0.03791672736406326,
-0.05959155037999153,
-0.08577046543359756,
0.07088624686002731,
0.06360559165477753,
0.0063262940384447575,
-0.05327228084206581,
0.1408539116382599,
0.00988277979195118,
0.0004051024152431637,
-0.044656503945589066,
0.02930592931807041,
-0.04971659928560257,
-0.024947838857769966,
-0.020391516387462616,
-0.035659294575452805,
-0.06803636252880096,
-0.01711537502706051,
-0.05718814209103584,
-0.040235619992017746,
0.021655892953276634,
0.013727097772061825,
-0.059349123388528824,
-0.04917680844664574,
-0.07925650477409363,
-0.023436371237039566,
-0.10845878720283508,
-0.011945001780986786,
-0.005895888898521662,
-0.05963568016886711,
0.08740507066249847,
0.003100723261013627,
0.02938271313905716,
0.06658533960580826,
-0.03151655197143555,
-0.012914265505969524,
-0.019951296970248222,
-0.042714666575193405,
0.00807505939155817,
-0.03996888920664787,
-0.05643327906727791,
-0.019131675362586975,
-0.04667477682232857,
0.039848312735557556,
0.01535140722990036,
-0.11393677443265915,
0.05856650695204735,
0.021049045026302338,
-0.026186540722846985,
-0.03879902511835098,
0.09306902438402176,
0.05621016025543213,
0.048513226211071014,
0.13224315643310547,
-0.09340378642082214,
0.06585022807121277,
-0.13562628626823425,
0.014655484817922115,
-0.0015414294321089983,
-0.004458740819245577,
-0.0018722738604992628,
0.05249488353729248,
0.05192919448018074,
-0.04642118513584137,
0.06793377548456192,
0.06402227282524109,
-0.03659484162926674,
0.058329809457063675,
-0.026252834126353264,
0.01694030500948429,
0.020541667938232422,
0.1645127534866333,
0.028507409617304802,
0.010301021859049797,
0.016601810231804848,
-0.0015945853665471077,
-0.02352905087172985,
0.014694718644022942,
0.18699423968791962,
0.11711034923791885,
0.12223902344703674,
0.07594248652458191,
-0.0910627543926239,
-0.12581627070903778,
-0.0880914255976677,
0.07618901133537292,
-0.015064901672303677,
0.07739908248186111,
-0.0015800101682543755,
0.0405985563993454,
0.1680520921945572,
-0.16279108822345734,
0.012074309401214123,
0.0693337544798851,
-0.07669537514448166,
-0.15650735795497894,
-0.2693985104560852,
-0.10918524116277695,
0.03185708448290825,
0.052885159850120544,
-0.09048759937286377,
0.027575984597206116,
0.06948550045490265,
0.07565326243638992,
-0.010016373358666897,
0.15269944071769714,
-0.006570755038410425,
-0.09601497650146484,
0.040718577802181244,
0.042836159467697144,
0.0026511976029723883,
-0.0021846075542271137,
0.0015263217501342297,
-0.0014228771906346083,
0.07285935431718826,
-0.0011140013812109828,
0.025611288845539093,
0.016194188967347145,
0.04272293299436569,
0.005246972199529409,
-0.060217078775167465,
-0.029131624847650528,
0.08532074093818665,
0.05430372431874275,
0.02256435714662075,
0.060613002628088,
-0.015362867154181004,
0.0013134570326656103,
0.19270285964012146,
0.001011652173474431,
-0.07322735339403152,
-0.09711778163909912,
0.13445894420146942,
0.002146591665223241,
0.029479533433914185,
0.013424163684248924,
-0.11832043528556824,
0.04331827163696289,
0.14774537086486816,
0.1750851571559906,
0.025856317952275276,
0.022468309849500656,
-0.02147386036813259,
-0.008647624403238297,
0.03076019138097763,
0.03798474743962288,
0.03444977477192879,
0.19030719995498657,
-0.11211030930280685,
0.03615197539329529,
-0.07359553128480911,
-0.0731625109910965,
-0.15151891112327576,
0.06433655321598053,
0.004846246913075447,
-0.009441290982067585,
-0.10724394768476486,
0.10492748767137527,
-0.05502774566411972,
-0.12743067741394043,
0.05932031571865082,
-0.07387862354516983,
-0.15702013671398163,
-0.03574643284082413,
0.012476044707000256,
0.023472653701901436,
0.03303152695298195,
-0.01336897723376751,
0.033429235219955444,
0.09446493536233902,
0.03384525701403618,
-0.09028647094964981,
-0.07371505349874496,
0.06581611931324005,
0.005720086861401796,
0.19768381118774414,
0.0047903829254209995,
0.018590357154607773,
0.10581796616315842,
-0.0489908829331398,
-0.13179868459701538,
0.08474382013082504,
0.018053412437438965,
-0.05162503942847252,
0.02910553850233555,
0.1458456814289093,
-0.0370984748005867,
0.08479633182287216,
0.013667567633092403,
-0.08212846517562866,
0.0063241757452487946,
0.043932076543569565,
-0.07167978584766388,
-0.047893017530441284,
0.10669688135385513,
-0.07197214663028717,
0.12606798112392426,
0.1713150143623352,
-0.045115552842617035,
-0.0019462606869637966,
-0.10007607936859131,
0.1047794297337532,
0.01896272599697113,
0.016829345375299454,
0.04721718281507492,
-0.18889209628105164,
0.007568457163870335,
-0.12914812564849854,
0.02596871741116047,
-0.1700696349143982,
-0.017914511263370514,
-0.07541900873184204,
-0.015997296199202538,
-0.06620900332927704,
0.08141874521970749,
0.022809358313679695,
0.023003598675131798,
-0.016314787790179253,
0.0297127403318882,
0.034009989351034164,
0.06813706457614899,
-0.1410118192434311,
-0.07589202374219894
] |
null | null |
transformers
|
# papuGaPT2 - Polish GPT2 language model
[GPT2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) was released in 2019 and surprised many with its text generation capability. However, up until very recently, we have not had a strong text generation model in Polish language, which limited the research opportunities for Polish NLP practitioners. With the release of this model, we hope to enable such research.
Our model follows the standard GPT2 architecture and training approach. We are using a causal language modeling (CLM) objective, which means that the model is trained to predict the next word (token) in a sequence of words (tokens).
## Datasets
We used the Polish subset of the [multilingual Oscar corpus](https://www.aclweb.org/anthology/2020.acl-main.156) to train the model in a self-supervised fashion.
```
from datasets import load_dataset
dataset = load_dataset('oscar', 'unshuffled_deduplicated_pl')
```
## Intended uses & limitations
The raw model can be used for text generation or fine-tuned for a downstream task. The model has been trained on data scraped from the web, and can generate text containing intense violence, sexual situations, coarse language and drug use. It also reflects the biases from the dataset (see below for more details). These limitations are likely to transfer to the fine-tuned models as well. At this stage, we do not recommend using the model beyond research.
## Bias Analysis
There are many sources of bias embedded in the model and we caution to be mindful of this while exploring the capabilities of this model. We have started a very basic analysis of bias that you can see in [this notebook](https://huggingface.co/flax-community/papuGaPT2/blob/main/papuGaPT2_bias_analysis.ipynb).
### Gender Bias
As an example, we generated 50 texts starting with prompts "She/He works as". The image below presents the resulting word clouds of female/male professions. The most salient terms for male professions are: teacher, sales representative, programmer. The most salient terms for female professions are: model, caregiver, receptionist, waitress.

### Ethnicity/Nationality/Gender Bias
We generated 1000 texts to assess bias across ethnicity, nationality and gender vectors. We created prompts with the following scheme:
* Person - in Polish this is a single word that differentiates both nationality/ethnicity and gender. We assessed the following 5 nationalities/ethnicities: German, Romani, Jewish, Ukrainian, Neutral. The neutral group used generic pronounts ("He/She").
* Topic - we used 5 different topics:
* random act: *entered home*
* said: *said*
* works as: *works as*
* intent: Polish *niech* which combined with *he* would roughly translate to *let him ...*
* define: *is*
Each combination of 5 nationalities x 2 genders x 5 topics had 20 generated texts.
We used a model trained on [Polish Hate Speech corpus](https://huggingface.co/datasets/hate_speech_pl) to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the nationality/ethnicity and gender from the generated text before running the hate speech detector.
The following tables and charts demonstrate the intensity of hate speech associated with the generated texts. There is a very clear effect where each of the ethnicities/nationalities score higher than the neutral baseline.

Looking at the gender dimension we see higher hate score associated with males vs. females.

We don't recommend using the GPT2 model beyond research unless a clear mitigation for the biases is provided.
## Training procedure
### Training scripts
We used the [causal language modeling script for Flax](https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py). We would like to thank the authors of that script as it allowed us to complete this training in a very short time!
### Preprocessing and Training Details
The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 512 consecutive tokens.
We have trained the model on a single TPUv3 VM, and due to unforeseen events the training run was split in 3 parts, each time resetting from the final checkpoint with a new optimizer state:
1. LR 1e-3, bs 64, linear schedule with warmup for 1000 steps, 10 epochs, stopped after 70,000 steps at eval loss 3.206 and perplexity 24.68
2. LR 3e-4, bs 64, linear schedule with warmup for 5000 steps, 7 epochs, stopped after 77,000 steps at eval loss 3.116 and perplexity 22.55
3. LR 2e-4, bs 64, linear schedule with warmup for 5000 steps, 3 epochs, stopped after 91,000 steps at eval loss 3.082 and perplexity 21.79
## Evaluation results
We trained the model on 95% of the dataset and evaluated both loss and perplexity on 5% of the dataset. The final checkpoint evaluation resulted in:
* Evaluation loss: 3.082
* Perplexity: 21.79
## How to use
You can use the model either directly for text generation (see example below), by extracting features, or for further fine-tuning. We have prepared a notebook with text generation examples [here](https://huggingface.co/flax-community/papuGaPT2/blob/main/papuGaPT2_text_generation.ipynb) including different decoding methods, bad words suppression, few- and zero-shot learning demonstrations.
### Text generation
Let's first start with the text-generation pipeline. When prompting for the best Polish poet, it comes up with a pretty reasonable text, highlighting one of the most famous Polish poets, Adam Mickiewicz.
```python
from transformers import pipeline, set_seed
generator = pipeline('text-generation', model='flax-community/papuGaPT2')
set_seed(42)
generator('Największym polskim poetą był')
>>> [{'generated_text': 'Największym polskim poetą był Adam Mickiewicz - uważany za jednego z dwóch geniuszów języka polskiego. "Pan Tadeusz" był jednym z najpopularniejszych dzieł w historii Polski. W 1801 został wystawiony publicznie w Teatrze Wilama Horzycy. Pod jego'}]
```
The pipeline uses `model.generate()` method in the background. In [our notebook](https://huggingface.co/flax-community/papuGaPT2/blob/main/papuGaPT2_text_generation.ipynb) we demonstrate different decoding methods we can use with this method, including greedy search, beam search, sampling, temperature scaling, top-k and top-p sampling. As an example, the below snippet uses sampling among the 50 most probable tokens at each stage (top-k) and among the tokens that jointly represent 95% of the probability distribution (top-p). It also returns 3 output sequences.
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
model = AutoModelWithLMHead.from_pretrained('flax-community/papuGaPT2')
tokenizer = AutoTokenizer.from_pretrained('flax-community/papuGaPT2')
set_seed(42) # reproducibility
input_ids = tokenizer.encode('Największym polskim poetą był', return_tensors='pt')
sample_outputs = model.generate(
input_ids,
do_sample=True,
max_length=50,
top_k=50,
top_p=0.95,
num_return_sequences=3
)
print("Output:\
" + 100 * '-')
for i, sample_output in enumerate(sample_outputs):
print("{}: {}".format(i, tokenizer.decode(sample_output, skip_special_tokens=True)))
>>> Output:
>>> ----------------------------------------------------------------------------------------------------
>>> 0: Największym polskim poetą był Roman Ingarden. Na jego wiersze i piosenki oddziaływały jego zamiłowanie do przyrody i przyrody. Dlatego też jako poeta w czasie pracy nad utworami i wierszami z tych wierszy, a następnie z poezji własnej - pisał
>>> 1: Największym polskim poetą był Julian Przyboś, którego poematem „Wierszyki dla dzieci”.
>>> W okresie międzywojennym, pod hasłem „Papież i nie tylko” Polska, jak większość krajów europejskich, była państwem faszystowskim.
>>> Prócz
>>> 2: Największym polskim poetą był Bolesław Leśmian, który był jego tłumaczem, a jego poezja tłumaczyła na kilkanaście języków.
>>> W 1895 roku nakładem krakowskiego wydania "Scientio" ukazała się w języku polskim powieść W krainie kangurów
```
### Avoiding Bad Words
You may want to prevent certain words from occurring in the generated text. To avoid displaying really bad words in the notebook, let's pretend that we don't like certain types of music to be advertised by our model. The prompt says: *my favorite type of music is*.
```python
input_ids = tokenizer.encode('Mój ulubiony gatunek muzyki to', return_tensors='pt')
bad_words = [' disco', ' rock', ' pop', ' soul', ' reggae', ' hip-hop']
bad_word_ids = []
for bad_word in bad_words:
ids = tokenizer(bad_word).input_ids
bad_word_ids.append(ids)
sample_outputs = model.generate(
input_ids,
do_sample=True,
max_length=20,
top_k=50,
top_p=0.95,
num_return_sequences=5,
bad_words_ids=bad_word_ids
)
print("Output:\
" + 100 * '-')
for i, sample_output in enumerate(sample_outputs):
print("{}: {}".format(i, tokenizer.decode(sample_output, skip_special_tokens=True)))
>>> Output:
>>> ----------------------------------------------------------------------------------------------------
>>> 0: Mój ulubiony gatunek muzyki to muzyka klasyczna. Nie wiem, czy to kwestia sposobu, w jaki gramy,
>>> 1: Mój ulubiony gatunek muzyki to reggea. Zachwycają mnie piosenki i piosenki muzyczne o ducho
>>> 2: Mój ulubiony gatunek muzyki to rockabilly, ale nie lubię też punka. Moim ulubionym gatunkiem
>>> 3: Mój ulubiony gatunek muzyki to rap, ale to raczej się nie zdarza w miejscach, gdzie nie chodzi
>>> 4: Mój ulubiony gatunek muzyki to metal aranżeje nie mam pojęcia co mam robić. Co roku,
```
Ok, it seems this worked: we can see *classical music, rap, metal* among the outputs. Interestingly, *reggae* found a way through via a misspelling *reggea*. Take it as a caution to be careful with curating your bad word lists!
### Few Shot Learning
Let's see now if our model is able to pick up training signal directly from a prompt, without any finetuning. This approach was made really popular with GPT3, and while our model is definitely less powerful, maybe it can still show some skills! If you'd like to explore this topic in more depth, check out [the following article](https://huggingface.co/blog/few-shot-learning-gpt-neo-and-inference-api) which we used as reference.
```python
prompt = """Tekst: "Nienawidzę smerfów!"
Sentyment: Negatywny
###
Tekst: "Jaki piękny dzień 👍"
Sentyment: Pozytywny
###
Tekst: "Jutro idę do kina"
Sentyment: Neutralny
###
Tekst: "Ten przepis jest świetny!"
Sentyment:"""
res = generator(prompt, max_length=85, temperature=0.5, end_sequence='###', return_full_text=False, num_return_sequences=5,)
for x in res:
print(res[i]['generated_text'].split(' ')[1])
>>> Pozytywny
>>> Pozytywny
>>> Pozytywny
>>> Pozytywny
>>> Pozytywny
```
It looks like our model is able to pick up some signal from the prompt. Be careful though, this capability is definitely not mature and may result in spurious or biased responses.
### Zero-Shot Inference
Large language models are known to store a lot of knowledge in its parameters. In the example below, we can see that our model has learned the date of an important event in Polish history, the battle of Grunwald.
```python
prompt = "Bitwa pod Grunwaldem miała miejsce w roku"
input_ids = tokenizer.encode(prompt, return_tensors='pt')
# activate beam search and early_stopping
beam_outputs = model.generate(
input_ids,
max_length=20,
num_beams=5,
early_stopping=True,
num_return_sequences=3
)
print("Output:\
" + 100 * '-')
for i, sample_output in enumerate(beam_outputs):
print("{}: {}".format(i, tokenizer.decode(sample_output, skip_special_tokens=True)))
>>> Output:
>>> ----------------------------------------------------------------------------------------------------
>>> 0: Bitwa pod Grunwaldem miała miejsce w roku 1410, kiedy to wojska polsko-litewskie pod
>>> 1: Bitwa pod Grunwaldem miała miejsce w roku 1410, kiedy to wojska polsko-litewskie pokona
>>> 2: Bitwa pod Grunwaldem miała miejsce w roku 1410, kiedy to wojska polsko-litewskie,
```
## BibTeX entry and citation info
```bibtex
@misc{papuGaPT2,
title={papuGaPT2 - Polish GPT2 language model},
url={https://huggingface.co/flax-community/papuGaPT2},
author={Wojczulis, Michał and Kłeczek, Dariusz},
year={2021}
}
```
|
{"language": "pl", "tags": ["text-generation"], "widget": [{"text": "Najsmaczniejszy polski owoc to"}]}
|
text-generation
|
flax-community/papuGaPT2
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"text-generation",
"pl",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pl"
] |
TAGS
#transformers #pytorch #jax #tensorboard #text-generation #pl #endpoints_compatible #has_space #region-us
|
# papuGaPT2 - Polish GPT2 language model
GPT2 was released in 2019 and surprised many with its text generation capability. However, up until very recently, we have not had a strong text generation model in Polish language, which limited the research opportunities for Polish NLP practitioners. With the release of this model, we hope to enable such research.
Our model follows the standard GPT2 architecture and training approach. We are using a causal language modeling (CLM) objective, which means that the model is trained to predict the next word (token) in a sequence of words (tokens).
## Datasets
We used the Polish subset of the multilingual Oscar corpus to train the model in a self-supervised fashion.
## Intended uses & limitations
The raw model can be used for text generation or fine-tuned for a downstream task. The model has been trained on data scraped from the web, and can generate text containing intense violence, sexual situations, coarse language and drug use. It also reflects the biases from the dataset (see below for more details). These limitations are likely to transfer to the fine-tuned models as well. At this stage, we do not recommend using the model beyond research.
## Bias Analysis
There are many sources of bias embedded in the model and we caution to be mindful of this while exploring the capabilities of this model. We have started a very basic analysis of bias that you can see in this notebook.
### Gender Bias
As an example, we generated 50 texts starting with prompts "She/He works as". The image below presents the resulting word clouds of female/male professions. The most salient terms for male professions are: teacher, sales representative, programmer. The most salient terms for female professions are: model, caregiver, receptionist, waitress.
!gender bias
### Ethnicity/Nationality/Gender Bias
We generated 1000 texts to assess bias across ethnicity, nationality and gender vectors. We created prompts with the following scheme:
* Person - in Polish this is a single word that differentiates both nationality/ethnicity and gender. We assessed the following 5 nationalities/ethnicities: German, Romani, Jewish, Ukrainian, Neutral. The neutral group used generic pronounts ("He/She").
* Topic - we used 5 different topics:
* random act: *entered home*
* said: *said*
* works as: *works as*
* intent: Polish *niech* which combined with *he* would roughly translate to *let him ...*
* define: *is*
Each combination of 5 nationalities x 2 genders x 5 topics had 20 generated texts.
We used a model trained on Polish Hate Speech corpus to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the nationality/ethnicity and gender from the generated text before running the hate speech detector.
The following tables and charts demonstrate the intensity of hate speech associated with the generated texts. There is a very clear effect where each of the ethnicities/nationalities score higher than the neutral baseline.
!hate score by ethnicity
Looking at the gender dimension we see higher hate score associated with males vs. females.
!hate score by gender
We don't recommend using the GPT2 model beyond research unless a clear mitigation for the biases is provided.
## Training procedure
### Training scripts
We used the causal language modeling script for Flax. We would like to thank the authors of that script as it allowed us to complete this training in a very short time!
### Preprocessing and Training Details
The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 512 consecutive tokens.
We have trained the model on a single TPUv3 VM, and due to unforeseen events the training run was split in 3 parts, each time resetting from the final checkpoint with a new optimizer state:
1. LR 1e-3, bs 64, linear schedule with warmup for 1000 steps, 10 epochs, stopped after 70,000 steps at eval loss 3.206 and perplexity 24.68
2. LR 3e-4, bs 64, linear schedule with warmup for 5000 steps, 7 epochs, stopped after 77,000 steps at eval loss 3.116 and perplexity 22.55
3. LR 2e-4, bs 64, linear schedule with warmup for 5000 steps, 3 epochs, stopped after 91,000 steps at eval loss 3.082 and perplexity 21.79
## Evaluation results
We trained the model on 95% of the dataset and evaluated both loss and perplexity on 5% of the dataset. The final checkpoint evaluation resulted in:
* Evaluation loss: 3.082
* Perplexity: 21.79
## How to use
You can use the model either directly for text generation (see example below), by extracting features, or for further fine-tuning. We have prepared a notebook with text generation examples here including different decoding methods, bad words suppression, few- and zero-shot learning demonstrations.
### Text generation
Let's first start with the text-generation pipeline. When prompting for the best Polish poet, it comes up with a pretty reasonable text, highlighting one of the most famous Polish poets, Adam Mickiewicz.
The pipeline uses 'model.generate()' method in the background. In our notebook we demonstrate different decoding methods we can use with this method, including greedy search, beam search, sampling, temperature scaling, top-k and top-p sampling. As an example, the below snippet uses sampling among the 50 most probable tokens at each stage (top-k) and among the tokens that jointly represent 95% of the probability distribution (top-p). It also returns 3 output sequences.
### Avoiding Bad Words
You may want to prevent certain words from occurring in the generated text. To avoid displaying really bad words in the notebook, let's pretend that we don't like certain types of music to be advertised by our model. The prompt says: *my favorite type of music is*.
Ok, it seems this worked: we can see *classical music, rap, metal* among the outputs. Interestingly, *reggae* found a way through via a misspelling *reggea*. Take it as a caution to be careful with curating your bad word lists!
### Few Shot Learning
Let's see now if our model is able to pick up training signal directly from a prompt, without any finetuning. This approach was made really popular with GPT3, and while our model is definitely less powerful, maybe it can still show some skills! If you'd like to explore this topic in more depth, check out the following article which we used as reference.
It looks like our model is able to pick up some signal from the prompt. Be careful though, this capability is definitely not mature and may result in spurious or biased responses.
### Zero-Shot Inference
Large language models are known to store a lot of knowledge in its parameters. In the example below, we can see that our model has learned the date of an important event in Polish history, the battle of Grunwald.
## BibTeX entry and citation info
|
[
"# papuGaPT2 - Polish GPT2 language model\nGPT2 was released in 2019 and surprised many with its text generation capability. However, up until very recently, we have not had a strong text generation model in Polish language, which limited the research opportunities for Polish NLP practitioners. With the release of this model, we hope to enable such research. \n\nOur model follows the standard GPT2 architecture and training approach. We are using a causal language modeling (CLM) objective, which means that the model is trained to predict the next word (token) in a sequence of words (tokens).",
"## Datasets\nWe used the Polish subset of the multilingual Oscar corpus to train the model in a self-supervised fashion.",
"## Intended uses & limitations\nThe raw model can be used for text generation or fine-tuned for a downstream task. The model has been trained on data scraped from the web, and can generate text containing intense violence, sexual situations, coarse language and drug use. It also reflects the biases from the dataset (see below for more details). These limitations are likely to transfer to the fine-tuned models as well. At this stage, we do not recommend using the model beyond research.",
"## Bias Analysis\nThere are many sources of bias embedded in the model and we caution to be mindful of this while exploring the capabilities of this model. We have started a very basic analysis of bias that you can see in this notebook.",
"### Gender Bias\nAs an example, we generated 50 texts starting with prompts \"She/He works as\". The image below presents the resulting word clouds of female/male professions. The most salient terms for male professions are: teacher, sales representative, programmer. The most salient terms for female professions are: model, caregiver, receptionist, waitress.\n\n!gender bias",
"### Ethnicity/Nationality/Gender Bias\nWe generated 1000 texts to assess bias across ethnicity, nationality and gender vectors. We created prompts with the following scheme: \n\n* Person - in Polish this is a single word that differentiates both nationality/ethnicity and gender. We assessed the following 5 nationalities/ethnicities: German, Romani, Jewish, Ukrainian, Neutral. The neutral group used generic pronounts (\"He/She\"). \n* Topic - we used 5 different topics: \n * random act: *entered home*\n * said: *said*\n * works as: *works as*\n * intent: Polish *niech* which combined with *he* would roughly translate to *let him ...*\n * define: *is*\n\nEach combination of 5 nationalities x 2 genders x 5 topics had 20 generated texts. \n\nWe used a model trained on Polish Hate Speech corpus to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the nationality/ethnicity and gender from the generated text before running the hate speech detector.\n \nThe following tables and charts demonstrate the intensity of hate speech associated with the generated texts. There is a very clear effect where each of the ethnicities/nationalities score higher than the neutral baseline. \n\n!hate score by ethnicity\n\nLooking at the gender dimension we see higher hate score associated with males vs. females. \n\n!hate score by gender\n\nWe don't recommend using the GPT2 model beyond research unless a clear mitigation for the biases is provided.",
"## Training procedure",
"### Training scripts\nWe used the causal language modeling script for Flax. We would like to thank the authors of that script as it allowed us to complete this training in a very short time!",
"### Preprocessing and Training Details\nThe texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 512 consecutive tokens.\n\nWe have trained the model on a single TPUv3 VM, and due to unforeseen events the training run was split in 3 parts, each time resetting from the final checkpoint with a new optimizer state: \n1. LR 1e-3, bs 64, linear schedule with warmup for 1000 steps, 10 epochs, stopped after 70,000 steps at eval loss 3.206 and perplexity 24.68\n2. LR 3e-4, bs 64, linear schedule with warmup for 5000 steps, 7 epochs, stopped after 77,000 steps at eval loss 3.116 and perplexity 22.55\n3. LR 2e-4, bs 64, linear schedule with warmup for 5000 steps, 3 epochs, stopped after 91,000 steps at eval loss 3.082 and perplexity 21.79",
"## Evaluation results\nWe trained the model on 95% of the dataset and evaluated both loss and perplexity on 5% of the dataset. The final checkpoint evaluation resulted in: \n* Evaluation loss: 3.082\n* Perplexity: 21.79",
"## How to use\nYou can use the model either directly for text generation (see example below), by extracting features, or for further fine-tuning. We have prepared a notebook with text generation examples here including different decoding methods, bad words suppression, few- and zero-shot learning demonstrations.",
"### Text generation\nLet's first start with the text-generation pipeline. When prompting for the best Polish poet, it comes up with a pretty reasonable text, highlighting one of the most famous Polish poets, Adam Mickiewicz.\n \n\n\nThe pipeline uses 'model.generate()' method in the background. In our notebook we demonstrate different decoding methods we can use with this method, including greedy search, beam search, sampling, temperature scaling, top-k and top-p sampling. As an example, the below snippet uses sampling among the 50 most probable tokens at each stage (top-k) and among the tokens that jointly represent 95% of the probability distribution (top-p). It also returns 3 output sequences.",
"### Avoiding Bad Words\nYou may want to prevent certain words from occurring in the generated text. To avoid displaying really bad words in the notebook, let's pretend that we don't like certain types of music to be advertised by our model. The prompt says: *my favorite type of music is*. \n\n\nOk, it seems this worked: we can see *classical music, rap, metal* among the outputs. Interestingly, *reggae* found a way through via a misspelling *reggea*. Take it as a caution to be careful with curating your bad word lists!",
"### Few Shot Learning\n\nLet's see now if our model is able to pick up training signal directly from a prompt, without any finetuning. This approach was made really popular with GPT3, and while our model is definitely less powerful, maybe it can still show some skills! If you'd like to explore this topic in more depth, check out the following article which we used as reference.\n\n\nIt looks like our model is able to pick up some signal from the prompt. Be careful though, this capability is definitely not mature and may result in spurious or biased responses.",
"### Zero-Shot Inference\n\nLarge language models are known to store a lot of knowledge in its parameters. In the example below, we can see that our model has learned the date of an important event in Polish history, the battle of Grunwald.",
"## BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #text-generation #pl #endpoints_compatible #has_space #region-us \n",
"# papuGaPT2 - Polish GPT2 language model\nGPT2 was released in 2019 and surprised many with its text generation capability. However, up until very recently, we have not had a strong text generation model in Polish language, which limited the research opportunities for Polish NLP practitioners. With the release of this model, we hope to enable such research. \n\nOur model follows the standard GPT2 architecture and training approach. We are using a causal language modeling (CLM) objective, which means that the model is trained to predict the next word (token) in a sequence of words (tokens).",
"## Datasets\nWe used the Polish subset of the multilingual Oscar corpus to train the model in a self-supervised fashion.",
"## Intended uses & limitations\nThe raw model can be used for text generation or fine-tuned for a downstream task. The model has been trained on data scraped from the web, and can generate text containing intense violence, sexual situations, coarse language and drug use. It also reflects the biases from the dataset (see below for more details). These limitations are likely to transfer to the fine-tuned models as well. At this stage, we do not recommend using the model beyond research.",
"## Bias Analysis\nThere are many sources of bias embedded in the model and we caution to be mindful of this while exploring the capabilities of this model. We have started a very basic analysis of bias that you can see in this notebook.",
"### Gender Bias\nAs an example, we generated 50 texts starting with prompts \"She/He works as\". The image below presents the resulting word clouds of female/male professions. The most salient terms for male professions are: teacher, sales representative, programmer. The most salient terms for female professions are: model, caregiver, receptionist, waitress.\n\n!gender bias",
"### Ethnicity/Nationality/Gender Bias\nWe generated 1000 texts to assess bias across ethnicity, nationality and gender vectors. We created prompts with the following scheme: \n\n* Person - in Polish this is a single word that differentiates both nationality/ethnicity and gender. We assessed the following 5 nationalities/ethnicities: German, Romani, Jewish, Ukrainian, Neutral. The neutral group used generic pronounts (\"He/She\"). \n* Topic - we used 5 different topics: \n * random act: *entered home*\n * said: *said*\n * works as: *works as*\n * intent: Polish *niech* which combined with *he* would roughly translate to *let him ...*\n * define: *is*\n\nEach combination of 5 nationalities x 2 genders x 5 topics had 20 generated texts. \n\nWe used a model trained on Polish Hate Speech corpus to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the nationality/ethnicity and gender from the generated text before running the hate speech detector.\n \nThe following tables and charts demonstrate the intensity of hate speech associated with the generated texts. There is a very clear effect where each of the ethnicities/nationalities score higher than the neutral baseline. \n\n!hate score by ethnicity\n\nLooking at the gender dimension we see higher hate score associated with males vs. females. \n\n!hate score by gender\n\nWe don't recommend using the GPT2 model beyond research unless a clear mitigation for the biases is provided.",
"## Training procedure",
"### Training scripts\nWe used the causal language modeling script for Flax. We would like to thank the authors of that script as it allowed us to complete this training in a very short time!",
"### Preprocessing and Training Details\nThe texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 512 consecutive tokens.\n\nWe have trained the model on a single TPUv3 VM, and due to unforeseen events the training run was split in 3 parts, each time resetting from the final checkpoint with a new optimizer state: \n1. LR 1e-3, bs 64, linear schedule with warmup for 1000 steps, 10 epochs, stopped after 70,000 steps at eval loss 3.206 and perplexity 24.68\n2. LR 3e-4, bs 64, linear schedule with warmup for 5000 steps, 7 epochs, stopped after 77,000 steps at eval loss 3.116 and perplexity 22.55\n3. LR 2e-4, bs 64, linear schedule with warmup for 5000 steps, 3 epochs, stopped after 91,000 steps at eval loss 3.082 and perplexity 21.79",
"## Evaluation results\nWe trained the model on 95% of the dataset and evaluated both loss and perplexity on 5% of the dataset. The final checkpoint evaluation resulted in: \n* Evaluation loss: 3.082\n* Perplexity: 21.79",
"## How to use\nYou can use the model either directly for text generation (see example below), by extracting features, or for further fine-tuning. We have prepared a notebook with text generation examples here including different decoding methods, bad words suppression, few- and zero-shot learning demonstrations.",
"### Text generation\nLet's first start with the text-generation pipeline. When prompting for the best Polish poet, it comes up with a pretty reasonable text, highlighting one of the most famous Polish poets, Adam Mickiewicz.\n \n\n\nThe pipeline uses 'model.generate()' method in the background. In our notebook we demonstrate different decoding methods we can use with this method, including greedy search, beam search, sampling, temperature scaling, top-k and top-p sampling. As an example, the below snippet uses sampling among the 50 most probable tokens at each stage (top-k) and among the tokens that jointly represent 95% of the probability distribution (top-p). It also returns 3 output sequences.",
"### Avoiding Bad Words\nYou may want to prevent certain words from occurring in the generated text. To avoid displaying really bad words in the notebook, let's pretend that we don't like certain types of music to be advertised by our model. The prompt says: *my favorite type of music is*. \n\n\nOk, it seems this worked: we can see *classical music, rap, metal* among the outputs. Interestingly, *reggae* found a way through via a misspelling *reggea*. Take it as a caution to be careful with curating your bad word lists!",
"### Few Shot Learning\n\nLet's see now if our model is able to pick up training signal directly from a prompt, without any finetuning. This approach was made really popular with GPT3, and while our model is definitely less powerful, maybe it can still show some skills! If you'd like to explore this topic in more depth, check out the following article which we used as reference.\n\n\nIt looks like our model is able to pick up some signal from the prompt. Be careful though, this capability is definitely not mature and may result in spurious or biased responses.",
"### Zero-Shot Inference\n\nLarge language models are known to store a lot of knowledge in its parameters. In the example below, we can see that our model has learned the date of an important event in Polish history, the battle of Grunwald.",
"## BibTeX entry and citation info"
] |
[
39,
139,
31,
112,
56,
92,
367,
3,
43,
238,
54,
65,
177,
136,
126,
56,
10
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #text-generation #pl #endpoints_compatible #has_space #region-us \n# papuGaPT2 - Polish GPT2 language model\nGPT2 was released in 2019 and surprised many with its text generation capability. However, up until very recently, we have not had a strong text generation model in Polish language, which limited the research opportunities for Polish NLP practitioners. With the release of this model, we hope to enable such research. \n\nOur model follows the standard GPT2 architecture and training approach. We are using a causal language modeling (CLM) objective, which means that the model is trained to predict the next word (token) in a sequence of words (tokens).## Datasets\nWe used the Polish subset of the multilingual Oscar corpus to train the model in a self-supervised fashion.## Intended uses & limitations\nThe raw model can be used for text generation or fine-tuned for a downstream task. The model has been trained on data scraped from the web, and can generate text containing intense violence, sexual situations, coarse language and drug use. It also reflects the biases from the dataset (see below for more details). These limitations are likely to transfer to the fine-tuned models as well. At this stage, we do not recommend using the model beyond research.## Bias Analysis\nThere are many sources of bias embedded in the model and we caution to be mindful of this while exploring the capabilities of this model. We have started a very basic analysis of bias that you can see in this notebook.### Gender Bias\nAs an example, we generated 50 texts starting with prompts \"She/He works as\". The image below presents the resulting word clouds of female/male professions. The most salient terms for male professions are: teacher, sales representative, programmer. The most salient terms for female professions are: model, caregiver, receptionist, waitress.\n\n!gender bias",
"passage: ### Ethnicity/Nationality/Gender Bias\nWe generated 1000 texts to assess bias across ethnicity, nationality and gender vectors. We created prompts with the following scheme: \n\n* Person - in Polish this is a single word that differentiates both nationality/ethnicity and gender. We assessed the following 5 nationalities/ethnicities: German, Romani, Jewish, Ukrainian, Neutral. The neutral group used generic pronounts (\"He/She\"). \n* Topic - we used 5 different topics: \n * random act: *entered home*\n * said: *said*\n * works as: *works as*\n * intent: Polish *niech* which combined with *he* would roughly translate to *let him ...*\n * define: *is*\n\nEach combination of 5 nationalities x 2 genders x 5 topics had 20 generated texts. \n\nWe used a model trained on Polish Hate Speech corpus to obtain the probability that each generated text contains hate speech. To avoid leakage, we removed the first word identifying the nationality/ethnicity and gender from the generated text before running the hate speech detector.\n \nThe following tables and charts demonstrate the intensity of hate speech associated with the generated texts. There is a very clear effect where each of the ethnicities/nationalities score higher than the neutral baseline. \n\n!hate score by ethnicity\n\nLooking at the gender dimension we see higher hate score associated with males vs. females. \n\n!hate score by gender\n\nWe don't recommend using the GPT2 model beyond research unless a clear mitigation for the biases is provided.## Training procedure### Training scripts\nWe used the causal language modeling script for Flax. We would like to thank the authors of that script as it allowed us to complete this training in a very short time!### Preprocessing and Training Details\nThe texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 512 consecutive tokens.\n\nWe have trained the model on a single TPUv3 VM, and due to unforeseen events the training run was split in 3 parts, each time resetting from the final checkpoint with a new optimizer state: \n1. LR 1e-3, bs 64, linear schedule with warmup for 1000 steps, 10 epochs, stopped after 70,000 steps at eval loss 3.206 and perplexity 24.68\n2. LR 3e-4, bs 64, linear schedule with warmup for 5000 steps, 7 epochs, stopped after 77,000 steps at eval loss 3.116 and perplexity 22.55\n3. LR 2e-4, bs 64, linear schedule with warmup for 5000 steps, 3 epochs, stopped after 91,000 steps at eval loss 3.082 and perplexity 21.79## Evaluation results\nWe trained the model on 95% of the dataset and evaluated both loss and perplexity on 5% of the dataset. The final checkpoint evaluation resulted in: \n* Evaluation loss: 3.082\n* Perplexity: 21.79## How to use\nYou can use the model either directly for text generation (see example below), by extracting features, or for further fine-tuning. We have prepared a notebook with text generation examples here including different decoding methods, bad words suppression, few- and zero-shot learning demonstrations."
] |
[
-0.06467782706022263,
0.08640764653682709,
-0.003224638756364584,
0.03620687499642372,
0.08690553903579712,
-0.02260834537446499,
0.11503354460000992,
0.05760616809129715,
0.05676651746034622,
0.06720895320177078,
0.03725918009877205,
-0.050322890281677246,
0.051774539053440094,
0.08045931160449982,
0.09634993225336075,
-0.24278989434242249,
0.07755766808986664,
-0.08175405859947205,
0.030871009454131126,
0.07890892028808594,
0.110750712454319,
-0.06384529173374176,
0.06542650610208511,
0.0036318954080343246,
-0.03265512362122536,
0.025963349267840385,
-0.0169655941426754,
-0.007091192528605461,
0.055439293384552,
0.10697238147258759,
0.0771230161190033,
-0.012653255835175514,
0.0008085556328296661,
-0.18032997846603394,
0.026252485811710358,
0.07077430933713913,
0.004291082266718149,
0.023051075637340546,
0.07097309082746506,
-0.05478134751319885,
0.1894257664680481,
-0.12183935940265656,
0.03615463152527809,
0.04186996445059776,
-0.11206436157226562,
-0.059275440871715546,
-0.07091690599918365,
0.10189439356327057,
0.06737156212329865,
0.05290020629763603,
-0.06491319090127945,
0.14372104406356812,
0.010541253723204136,
0.060848768800497055,
0.0918978750705719,
-0.10573947429656982,
-0.005660495720803738,
-0.0750851109623909,
-0.020987514406442642,
0.08821988105773926,
-0.08781895041465759,
0.007123834919184446,
0.014574890024960041,
-0.006253986153751612,
0.024204496294260025,
-0.0068460251204669476,
0.12133853137493134,
-0.05860849469900131,
-0.15503999590873718,
-0.07496264576911926,
0.08780881017446518,
0.036161117255687714,
-0.09681195765733719,
-0.13490645587444305,
-0.008563964627683163,
-0.027873653918504715,
0.004192403517663479,
0.006196800619363785,
-0.05056891590356827,
-0.01230287179350853,
0.07547269761562347,
-0.07994093000888824,
-0.10278218984603882,
0.0338003896176815,
0.013840623199939728,
0.217742919921875,
0.03244009613990784,
0.04843998700380325,
-0.002022257074713707,
0.033356621861457825,
0.001987162046134472,
-0.047734539955854416,
-0.02675130032002926,
-0.07220075279474258,
-0.11115429550409317,
-0.025136686861515045,
-0.021695517003536224,
-0.06645707786083221,
-0.04310429096221924,
0.04417305439710617,
-0.05887243524193764,
0.03799496218562126,
0.027765415608882904,
0.04841930419206619,
0.11441145837306976,
0.1038072258234024,
-0.01688632182776928,
-0.03549686446785927,
0.00620588194578886,
-0.04433176666498184,
0.04068470746278763,
-0.01208420004695654,
-0.04527553915977478,
-0.022548381239175797,
0.007163231261074543,
0.09516559541225433,
-0.0021183062344789505,
0.07423806190490723,
-0.0015192379942163825,
-0.04760197550058365,
0.09292665123939514,
-0.09493417292833328,
0.01315787248313427,
0.021239109337329865,
-0.009734734892845154,
0.0936797559261322,
-0.014401966705918312,
-0.024900753051042557,
-0.1129566878080368,
0.01909608393907547,
-0.04184260219335556,
0.020120467990636826,
-0.1073138564825058,
-0.1437135636806488,
0.033551428467035294,
-0.015562239103019238,
-0.05072851851582527,
-0.11571745574474335,
-0.0885818600654602,
-0.06461957097053528,
-0.03750603273510933,
-0.026159055531024933,
0.029612932354211807,
-0.04259161651134491,
0.005226589739322662,
-0.012566311284899712,
0.02703714743256569,
-0.09506811946630478,
-0.017156697809696198,
0.031418707221746445,
-0.13840648531913757,
0.055928267538547516,
-0.025529449805617332,
-0.006384661886841059,
-0.1178453341126442,
0.01740337163209915,
-0.19532814621925354,
0.10215894877910614,
-0.055586785078048706,
0.021903231739997864,
-0.0762932151556015,
-0.0618634968996048,
-0.05521849915385246,
0.05013825744390488,
-0.010652473196387291,
0.15817710757255554,
-0.20248661935329437,
-0.03293227404356003,
0.12294870615005493,
-0.20072124898433685,
0.036592766642570496,
0.11946164071559906,
-0.029720347374677658,
0.09685059636831284,
0.12174861878156662,
0.19327789545059204,
0.03379320725798607,
-0.032641418278217316,
-0.04712251201272011,
-0.046594951301813126,
-0.09888037294149399,
0.14349418878555298,
0.025625862181186676,
-0.09479744732379913,
0.015153945423662663,
-0.018048912286758423,
0.017163729295134544,
0.005695957690477371,
-0.0060683442279696465,
-0.0415000319480896,
0.056414179503917694,
-0.0035574822686612606,
0.017250675708055496,
-0.03502599149942398,
-0.059095390141010284,
-0.04572204500436783,
-0.10183129459619522,
0.033299386501312256,
0.11575896292924881,
-0.06531286239624023,
0.05564362183213234,
-0.09712444245815277,
0.0349394753575325,
-0.008189572021365166,
-0.026590056717395782,
-0.14564457535743713,
-0.07123175263404846,
0.010097574442625046,
-0.11782976984977722,
0.09231816977262497,
0.009322479367256165,
0.045874327421188354,
0.06330221891403198,
-0.005660181865096092,
0.027191869914531708,
-0.01797681488096714,
0.015805168077349663,
-0.06757411360740662,
-0.15043479204177856,
-0.0320778489112854,
-0.027227917686104774,
0.11539299786090851,
-0.13639792799949646,
-0.012370999902486801,
0.09399321675300598,
0.06511019170284271,
0.03804236650466919,
-0.07091586291790009,
0.03419782221317291,
0.05779802054166794,
-0.008553695864975452,
-0.05324786901473999,
0.03021625615656376,
-0.0012493599206209183,
-0.031210198998451233,
0.1263776421546936,
-0.21067972481250763,
-0.2153601050376892,
0.07567624747753143,
0.008715088479220867,
-0.11006119847297668,
-0.03925652056932449,
-0.03422151878476143,
0.0178757943212986,
0.004546019248664379,
-0.0703674927353859,
0.08629657328128815,
0.017565518617630005,
0.027166303247213364,
-0.04449613764882088,
-0.026395846158266068,
0.003991859965026379,
-0.01938612200319767,
-0.06205996870994568,
0.06648994982242584,
0.0766419768333435,
-0.24881014227867126,
0.09167356789112091,
-0.00656428188085556,
0.04796469584107399,
0.21509242057800293,
0.01733742468059063,
-0.11224748194217682,
-0.008628716692328453,
0.026263665407896042,
0.0007188129238784313,
0.12904256582260132,
-0.0651722401380539,
-0.0006022821180522442,
0.007403011433780193,
0.005306001752614975,
0.04266546666622162,
-0.046524547040462494,
0.038817714899778366,
0.03665264695882797,
-0.015308923088014126,
-0.07658259570598602,
0.0039047766476869583,
-0.02803051471710205,
0.09484390914440155,
0.012297829613089561,
0.051697924733161926,
-0.04208964854478836,
-0.0414019338786602,
-0.1451128125190735,
0.11128320544958115,
-0.0952034592628479,
-0.1835053563117981,
-0.1060439944267273,
0.04707298427820206,
-0.0008454248309135437,
0.01941167563199997,
0.013233027420938015,
-0.10527364909648895,
-0.07590646296739578,
-0.11326615512371063,
0.11465220153331757,
-0.022592712193727493,
-0.02195485308766365,
-0.09451594948768616,
0.03400449454784393,
-0.027469776570796967,
-0.08655638247728348,
-0.0026047024875879288,
-0.02217191830277443,
0.002500893548130989,
0.01665390469133854,
-0.04366660490632057,
0.0803355872631073,
0.1074814647436142,
0.018410272896289825,
-0.019119055941700935,
-0.056647349148988724,
0.23779866099357605,
-0.12965911626815796,
0.05245170742273331,
0.020410632714629173,
-0.13121530413627625,
0.0336163230240345,
0.09072952717542648,
-0.024602733552455902,
-0.04024253785610199,
0.03705346956849098,
0.06744419783353806,
-0.04246077686548233,
-0.1424543857574463,
-0.11940805613994598,
-0.037467729300260544,
-0.03612847626209259,
0.03323791176080704,
0.0593053363263607,
0.005065265577286482,
0.016436325386166573,
-0.12660422921180725,
-0.05654262751340866,
0.02114817686378956,
0.08734078705310822,
0.027785753831267357,
-0.006200311705470085,
0.01847923919558525,
-0.09681388735771179,
-0.032662779092788696,
0.09440641850233078,
-0.0656861886382103,
0.270859956741333,
-0.0010326486080884933,
0.15642142295837402,
0.0927841067314148,
0.051846690475940704,
0.07442233711481094,
0.07320790737867355,
0.02291995659470558,
0.03374660760164261,
-0.02290467731654644,
-0.08837786316871643,
0.016763435676693916,
0.05012669786810875,
0.025184351950883865,
-0.05968043580651283,
-0.028714004904031754,
-0.07364428043365479,
0.11121398210525513,
0.1711178719997406,
0.021478062495589256,
-0.09902514517307281,
-0.06847679615020752,
0.04897354170680046,
-0.07146909832954407,
-0.04590007662773132,
-0.05478730797767639,
0.11591820418834686,
-0.13231518864631653,
0.05537370219826698,
0.028850797563791275,
0.053542666137218475,
-0.1150720864534378,
0.0007390305399894714,
-0.019856711849570274,
0.021746698766946793,
-0.06748295575380325,
0.09338081628084183,
-0.14140945672988892,
0.17716751992702484,
0.023961231112480164,
0.09545271098613739,
-0.0910135880112648,
-0.0664641484618187,
0.011626440100371838,
0.000588555820286274,
0.126511350274086,
0.06476808339357376,
-0.11601551622152328,
-0.11488348990678787,
-0.06633858382701874,
0.015503247268497944,
0.10909497737884521,
-0.015585883520543575,
0.11398464441299438,
-0.0028722542338073254,
0.021695442497730255,
-0.034597091376781464,
-0.018803291022777557,
-0.11697809398174286,
-0.15540042519569397,
0.022780217230319977,
-0.08206719160079956,
-0.009494004771113396,
-0.008465025573968887,
-0.03797977417707443,
0.04295515641570091,
0.13122284412384033,
-0.1329772174358368,
-0.10986240208148956,
-0.10551878809928894,
-0.014026260934770107,
0.11299183964729309,
-0.07081113755702972,
0.016295168548822403,
0.06041806936264038,
0.1827467679977417,
-0.03870493918657303,
-0.06438712030649185,
0.02043711394071579,
-0.03322813659906387,
-0.11365331709384918,
-0.005091514438390732,
0.11349506676197052,
0.12245560437440872,
0.06901246309280396,
0.016464481130242348,
0.03054371103644371,
0.0312851220369339,
-0.13281111419200897,
-0.002920755185186863,
0.10531289130449295,
-0.09259186685085297,
0.08037041127681732,
-0.03539357706904411,
-0.047933004796504974,
-0.14540904760360718,
-0.07593406736850739,
0.12791495025157928,
0.20994538068771362,
-0.0505613312125206,
0.10652536153793335,
0.10902626812458038,
-0.11655552685260773,
-0.1995846927165985,
0.03142976388335228,
0.024403288960456848,
0.02089960128068924,
0.050885774195194244,
-0.1548336297273636,
-0.08071430772542953,
0.028650369495153427,
0.03762232884764671,
0.006709364242851734,
-0.16977408528327942,
-0.11703559756278992,
0.06277942657470703,
0.045176390558481216,
0.08655800670385361,
-0.0488923117518425,
-0.01707758568227291,
-0.05564945936203003,
0.050241388380527496,
0.14724524319171906,
-0.09311281144618988,
0.038739338517189026,
0.05459165200591087,
0.06376030296087265,
0.059720296412706375,
-0.002085837535560131,
0.14012296497821808,
0.023072995245456696,
0.05591874569654465,
-0.09417536854743958,
-0.019615711644291878,
0.03657073527574539,
-0.0042296466417610645,
0.08061795681715012,
0.03779028356075287,
-0.048238564282655716,
-0.07945270836353302,
-0.08308545500040054,
-0.07184493541717529,
0.03220514953136444,
-0.06732150912284851,
-0.03554925695061684,
-0.07097528874874115,
0.0648728609085083,
0.05381401255726814,
-0.019173841923475266,
-0.07916715741157532,
-0.09289178252220154,
-0.036647941917181015,
-0.006794480606913567,
0.17880463600158691,
0.11070381850004196,
-0.09799742698669434,
0.06718750298023224,
-0.005644986405968666,
0.0742654800415039,
-0.025003094226121902,
0.0024721063673496246,
0.04879917949438095,
-0.021294113248586655,
0.10651125758886337,
-0.006246468052268028,
-0.1815732717514038,
0.02529841661453247,
0.05982772260904312,
-0.10133304446935654,
-0.1654314547777176,
-0.0015963491750881076,
-0.029624156653881073,
-0.05267389118671417,
-0.0897233635187149,
0.10985574126243591,
-0.07034947723150253,
0.0010419092141091824,
-0.0451841801404953,
0.04274056479334831,
-0.0219233650714159,
0.06919310241937637,
0.056242410093545914,
0.01355674210935831,
-0.03585382550954819,
0.10020776838064194,
0.0372336320579052,
-0.11687850207090378,
0.07897569239139557,
0.15461938083171844,
-0.10999558120965958,
-0.06589742004871368,
-0.05191822350025177,
0.0588751956820488,
-0.08186274021863937,
-0.07794145494699478,
0.02580418437719345,
-0.06375409662723541,
-0.02089279517531395,
0.12691017985343933,
-0.008564290590584278,
0.08045831322669983,
-0.013004999607801437,
-0.012774146161973476,
-0.08487493544816971,
0.022315945476293564,
0.03492898494005203,
-0.04382132738828659,
0.0049714213237166405,
0.1569397896528244,
0.026398606598377228,
-0.019201233983039856,
-0.023040741682052612,
-0.06342142075300217,
-0.07617595791816711,
0.003783863503485918,
-0.034325018525123596,
-0.02859414555132389,
-0.07924747467041016,
-0.037429362535476685,
0.013648543506860733,
-0.009776574559509754,
0.0033958700951188803,
0.017128294333815575,
-0.06178286671638489,
0.0049412548542022705,
-0.021606262773275375,
0.015292774885892868,
-0.07539975643157959,
0.028420310467481613,
0.06785038113594055,
-0.05323674529790878,
0.09497717022895813,
0.05224204808473587,
-0.04056866466999054,
0.04466389864683151,
-0.07480251044034958,
0.05283603072166443,
-0.03368089720606804,
-0.013167639262974262,
-0.01041058637201786,
-0.0904611200094223,
-0.012444780208170414,
-0.011158713139593601,
0.016996288672089577,
0.010087523609399796,
0.07664434611797333,
-0.029277822002768517,
0.09267871081829071,
0.07629966735839844,
-0.020068179816007614,
-0.05857537314295769,
0.06625892221927643,
0.00934780016541481,
0.019033219665288925,
0.10819385945796967,
-0.03914822265505791,
0.02514307014644146,
-0.08134712278842926,
0.01583998091518879,
0.016597725450992584,
0.040602996945381165,
-0.02692398801445961,
-0.020922701805830002,
0.05442441999912262,
-0.005011121276766062,
0.13583742082118988,
0.005198168568313122,
0.0034691113978624344,
0.06260032206773758,
-0.02796235680580139,
-0.062348902225494385,
0.020963210612535477,
0.0533580519258976,
-0.025158174335956573,
-0.022781474515795708,
-0.0654156431555748,
-0.03254171460866928,
-0.046916477382183075,
0.004234734922647476,
0.1308930218219757,
0.08114767074584961,
0.1719740331172943,
0.04589729756116867,
-0.018893588334321976,
-0.0391857847571373,
-0.053855106234550476,
-0.04784073308110237,
0.03940434008836746,
-0.007703509647399187,
-0.08807540684938431,
0.0903855487704277,
0.1834893822669983,
-0.08831800520420074,
0.09990385174751282,
0.010551314800977707,
-0.08965320140123367,
-0.09671249985694885,
-0.24539992213249207,
0.000828171381726861,
-0.007702271454036236,
-0.004153014160692692,
-0.08495056629180908,
0.053696051239967346,
0.033124081790447235,
0.04344033822417259,
-0.04257507249712944,
0.0615091398358345,
-0.06393547356128693,
-0.13561338186264038,
0.0411071851849556,
0.0034295611549168825,
0.06644612550735474,
0.03087855502963066,
0.0647168755531311,
0.011522685177624226,
-0.020922932773828506,
0.07130482792854309,
0.1035807654261589,
0.022316576912999153,
0.039814431220293045,
-0.07883307337760925,
-0.0670754462480545,
-0.0001001749187707901,
0.009210392832756042,
0.07262489944696426,
0.25726941227912903,
0.06355946511030197,
-0.03684605285525322,
-0.023835439234972,
0.1491934061050415,
0.03751331567764282,
-0.07256186753511429,
-0.11025242507457733,
0.15786075592041016,
0.03270943462848663,
0.013916447758674622,
-0.04249412566423416,
-0.09006941318511963,
0.07143579423427582,
0.14592275023460388,
0.14961451292037964,
0.014839817769825459,
0.033628419041633606,
-0.09153510630130768,
0.02387966401875019,
0.022351674735546112,
0.10252510011196136,
-0.04637164622545242,
0.24017241597175598,
-0.07573816180229187,
0.13308380544185638,
-0.04028801620006561,
0.009485158137977123,
-0.04092811048030853,
0.12157417088747025,
-0.02743607386946678,
-0.003162827342748642,
-0.08378338813781738,
0.11730611324310303,
-0.09362918138504028,
-0.21065649390220642,
-0.025660991668701172,
-0.03549325466156006,
-0.057874247431755066,
0.051996052265167236,
-0.061963215470314026,
0.06866995245218277,
0.09308969974517822,
0.0035895383916795254,
-0.03771575912833214,
0.11899404227733612,
0.02413025312125683,
-0.06975437700748444,
-0.09149016439914703,
0.10450032353401184,
-0.010330618359148502,
0.14876914024353027,
0.021133705973625183,
0.10878150910139084,
0.06547030061483383,
0.02489801123738289,
-0.13571666181087494,
0.03766534850001335,
-0.023449910804629326,
-0.0063164494931697845,
0.0041994135826826096,
0.15627209842205048,
-0.015034480951726437,
0.06552574783563614,
0.05171627178788185,
0.02614704892039299,
0.0874791294336319,
-0.022624459117650986,
-0.008192191831767559,
-0.005729121156036854,
0.09281748533248901,
-0.10758471488952637,
0.1309746354818344,
0.11974877119064331,
-0.026148993521928787,
0.005025915335863829,
-0.05888358876109123,
-0.05174407735466957,
-0.011685307137668133,
0.046363893896341324,
-0.008154377341270447,
-0.12168741226196289,
-0.004203449003398418,
-0.03071356751024723,
0.0457649827003479,
-0.17462512850761414,
-0.018067307770252228,
-0.002744798781350255,
-0.0025829127989709377,
0.012688105925917625,
0.09291323274374008,
0.021030081436038017,
0.009552167728543282,
-0.03737629950046539,
-0.0148234311491251,
0.03619823232293129,
0.08586250245571136,
-0.058063358068466187,
-0.05038321018218994
] |
null | null |
transformers
|
# Pino (Dutch BigBird) base model
Created by [Dat Nguyen](https://www.linkedin.com/in/dat-nguyen-49a641138/) & [Yeb Havinga](https://www.linkedin.com/in/yeb-havinga-86530825/) during the [Hugging Face community week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
(Not finished yet)
BigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.
It is a pretrained model on Dutch language using a masked language modeling (MLM) objective. It was introduced in this [paper](https://arxiv.org/abs/2007.14062) and first released in this [repository](https://github.com/google-research/bigbird).
## Model description
BigBird relies on **block sparse attention** instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.
## How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BigBirdModel
# by default its in `block_sparse` mode with num_random_blocks=3, block_size=64
model = BigBirdModel.from_pretrained("flax-community/pino-bigbird-roberta-base")
# you can change `attention_type` to full attention like this:
model = BigBirdModel.from_pretrained("flax-community/pino-bigbird-roberta-base", attention_type="original_full")
# you can change `block_size` & `num_random_blocks` like this:
model = BigBirdModel.from_pretrained("flax-community/pino-bigbird-roberta-base", block_size=16, num_random_blocks=2)
```
## Training Data
This model is pre-trained on four publicly available datasets: **mC4**, and scraped **Dutch news** from NRC en Nu.nl. It uses the the fast universal Byte-level BPE (BBPE) in contrast to the sentence piece tokenizer and vocabulary as RoBERTa (which is in turn borrowed from GPT2).
## Training Procedure
The data is cleaned as follows:
Remove texts containing HTML codes / javascript codes / loremipsum / policies
Remove lines without end mark.
Remove too short texts, words
Remove too long texts, words
Remove bad words
## BibTeX entry and citation info
```tex
@misc{zaheer2021big,
title={Big Bird: Transformers for Longer Sequences},
author={Manzil Zaheer and Guru Guruganesh and Avinava Dubey and Joshua Ainslie and Chris Alberti and Santiago Ontanon and Philip Pham and Anirudh Ravula and Qifan Wang and Li Yang and Amr Ahmed},
year={2021},
eprint={2007.14062},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
|
{"language": "nl", "datasets": ["mC4", "Dutch_news"]}
|
fill-mask
|
flax-community/pino-bigbird-roberta-base
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"big_bird",
"fill-mask",
"nl",
"dataset:mC4",
"dataset:Dutch_news",
"arxiv:2007.14062",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2007.14062"
] |
[
"nl"
] |
TAGS
#transformers #pytorch #jax #tensorboard #big_bird #fill-mask #nl #dataset-mC4 #dataset-Dutch_news #arxiv-2007.14062 #autotrain_compatible #endpoints_compatible #region-us
|
# Pino (Dutch BigBird) base model
Created by Dat Nguyen & Yeb Havinga during the Hugging Face community week
(Not finished yet)
BigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.
It is a pretrained model on Dutch language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.
## Model description
BigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.
## How to use
Here is how to use this model to get the features of a given text in PyTorch:
## Training Data
This model is pre-trained on four publicly available datasets: mC4, and scraped Dutch news from NRC en URL. It uses the the fast universal Byte-level BPE (BBPE) in contrast to the sentence piece tokenizer and vocabulary as RoBERTa (which is in turn borrowed from GPT2).
## Training Procedure
The data is cleaned as follows:
Remove texts containing HTML codes / javascript codes / loremipsum / policies
Remove lines without end mark.
Remove too short texts, words
Remove too long texts, words
Remove bad words
## BibTeX entry and citation info
|
[
"# Pino (Dutch BigBird) base model\n\nCreated by Dat Nguyen & Yeb Havinga during the Hugging Face community week\n\n(Not finished yet)\n\n\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on Dutch language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.",
"## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.",
"## How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"## Training Data\n\nThis model is pre-trained on four publicly available datasets: mC4, and scraped Dutch news from NRC en URL. It uses the the fast universal Byte-level BPE (BBPE) in contrast to the sentence piece tokenizer and vocabulary as RoBERTa (which is in turn borrowed from GPT2).",
"## Training Procedure\nThe data is cleaned as follows:\nRemove texts containing HTML codes / javascript codes / loremipsum / policies\nRemove lines without end mark. \nRemove too short texts, words\nRemove too long texts, words\nRemove bad words",
"## BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #big_bird #fill-mask #nl #dataset-mC4 #dataset-Dutch_news #arxiv-2007.14062 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Pino (Dutch BigBird) base model\n\nCreated by Dat Nguyen & Yeb Havinga during the Hugging Face community week\n\n(Not finished yet)\n\n\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on Dutch language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.",
"## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.",
"## How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:",
"## Training Data\n\nThis model is pre-trained on four publicly available datasets: mC4, and scraped Dutch news from NRC en URL. It uses the the fast universal Byte-level BPE (BBPE) in contrast to the sentence piece tokenizer and vocabulary as RoBERTa (which is in turn borrowed from GPT2).",
"## Training Procedure\nThe data is cleaned as follows:\nRemove texts containing HTML codes / javascript codes / loremipsum / policies\nRemove lines without end mark. \nRemove too short texts, words\nRemove too long texts, words\nRemove bad words",
"## BibTeX entry and citation info"
] |
[
71,
137,
88,
24,
80,
54,
10
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #big_bird #fill-mask #nl #dataset-mC4 #dataset-Dutch_news #arxiv-2007.14062 #autotrain_compatible #endpoints_compatible #region-us \n# Pino (Dutch BigBird) base model\n\nCreated by Dat Nguyen & Yeb Havinga during the Hugging Face community week\n\n(Not finished yet)\n\n\n\nBigBird, is a sparse-attention based transformer which extends Transformer based models, such as BERT to much longer sequences. Moreover, BigBird comes along with a theoretical understanding of the capabilities of a complete transformer that the sparse model can handle.\n\nIt is a pretrained model on Dutch language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.## Model description\n\nBigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 at a much lower compute cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts.## How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:## Training Data\n\nThis model is pre-trained on four publicly available datasets: mC4, and scraped Dutch news from NRC en URL. It uses the the fast universal Byte-level BPE (BBPE) in contrast to the sentence piece tokenizer and vocabulary as RoBERTa (which is in turn borrowed from GPT2).## Training Procedure\nThe data is cleaned as follows:\nRemove texts containing HTML codes / javascript codes / loremipsum / policies\nRemove lines without end mark. \nRemove too short texts, words\nRemove too long texts, words\nRemove bad words## BibTeX entry and citation info"
] |
[
0.0030227210372686386,
0.008194190450012684,
-0.005814537405967712,
0.09120411425828934,
0.02546035498380661,
-0.08102437853813171,
-0.0241452194750309,
0.04789631813764572,
-0.13206659257411957,
0.10698407143354416,
-0.022382616996765137,
-0.01267910934984684,
0.14299830794334412,
0.01799965091049671,
0.05906365066766739,
-0.24569781124591827,
0.022308260202407837,
-0.06575692445039749,
0.054057858884334564,
0.043252766132354736,
0.10886934399604797,
-0.0833316370844841,
0.08565182238817215,
0.05800130218267441,
0.060575466603040695,
0.0283492598682642,
-0.05595749244093895,
-0.013890895061194897,
0.09464514255523682,
0.04074282944202423,
-0.006667074281722307,
0.011885504238307476,
-0.015360157005488873,
-0.08729936182498932,
0.029267223551869392,
0.046966176480054855,
0.03150271996855736,
0.03381921723484993,
0.09105516225099564,
0.09849756211042404,
0.0671752318739891,
-0.16040045022964478,
0.027328237891197205,
0.0362553745508194,
-0.07448969036340714,
-0.10803747922182083,
-0.15718956291675568,
0.017888780683279037,
0.03202144801616669,
0.07128454744815826,
0.007306818850338459,
0.04037465527653694,
-0.041323039680719376,
0.04848531261086464,
0.2120201587677002,
-0.24872517585754395,
-0.032025616616010666,
-0.04075754061341286,
0.08943891525268555,
0.05923838168382645,
-0.11813491582870483,
-0.010815104469656944,
-0.029435038566589355,
0.04327182099223137,
0.08383451402187347,
-0.024199629202485085,
0.03921638801693916,
-0.017173880711197853,
-0.06814342737197876,
-0.0008411378366872668,
0.008930962532758713,
0.019472021609544754,
-0.06810925900936127,
-0.15213774144649506,
-0.028115123510360718,
0.016950925812125206,
-0.0340968556702137,
-0.0465506985783577,
0.07549537718296051,
0.03790062293410301,
0.12640056014060974,
-0.15789596736431122,
-0.08408202230930328,
0.046757686883211136,
-0.12543542683124542,
-0.07575423270463943,
0.0029690407682210207,
0.042418405413627625,
-0.04364323243498802,
0.025031454861164093,
-0.023016538470983505,
-0.10399898886680603,
-0.08776324987411499,
-0.013767336495220661,
-0.02398706041276455,
-0.013195876032114029,
0.051073990762233734,
-0.10247038304805756,
-0.025324085727334023,
0.07167178392410278,
-0.02671230398118496,
0.022735802456736565,
-0.1339731514453888,
0.013939638622105122,
0.008329402655363083,
0.09311538934707642,
-0.11418556421995163,
-0.18002471327781677,
0.11367103457450867,
-0.02611580118536949,
0.1025591567158699,
0.01565147191286087,
-0.0020873078610748053,
0.011413783766329288,
0.024057647213339806,
0.0489322766661644,
0.08880868554115295,
-0.006906847469508648,
-0.060510534793138504,
0.015387458726763725,
0.15862901508808136,
-0.10829764604568481,
0.05102868005633354,
0.03256161883473396,
-0.03584888204932213,
0.07327400892972946,
-0.034119345247745514,
-0.039766088128089905,
-0.05240194872021675,
0.020199455320835114,
-0.040660906583070755,
-0.028281578794121742,
-0.019473642110824585,
-0.1313096284866333,
0.02218378335237503,
-0.0434856042265892,
-0.07397811114788055,
-0.047731269150972366,
-0.20462869107723236,
-0.05332094803452492,
-0.05797847360372543,
-0.08917556703090668,
-0.005487731192260981,
-0.014622565358877182,
-0.10096780955791473,
-0.02483152411878109,
0.02152862399816513,
0.09383305162191391,
-0.043457649648189545,
0.009068736806511879,
-0.08762148767709732,
0.10500109940767288,
-0.023647647351026535,
-0.006373623851686716,
-0.1036306619644165,
-0.006963090505450964,
-0.16252779960632324,
0.11268064379692078,
-0.05740420147776604,
-0.034372735768556595,
-0.10107621550559998,
-0.07149723172187805,
-0.049051813781261444,
-0.01127498596906662,
0.040188342332839966,
0.11273275315761566,
-0.1857321411371231,
0.007931779138743877,
0.12350348383188248,
-0.08142252266407013,
0.034802649170160294,
0.14905650913715363,
0.002553933300077915,
0.07104949653148651,
0.13047775626182556,
0.03171880915760994,
-0.02024991251528263,
-0.016093220561742783,
-0.10385788232088089,
0.04710451886057854,
-0.10138004273176193,
0.12090380489826202,
0.03726021945476532,
-0.01884365826845169,
0.03524214029312134,
0.0374915637075901,
0.019117845222353935,
-0.02399272844195366,
0.04899045452475548,
-0.027658406645059586,
0.035446032881736755,
0.006072711665183306,
-0.031149880960583687,
-0.07046134024858475,
-0.005471407901495695,
-0.05783144757151604,
-0.055732566863298416,
0.01786094903945923,
0.03428512439131737,
0.03179354593157768,
0.04521312192082405,
-0.03418070077896118,
0.04821605607867241,
-0.06918739527463913,
-0.008337823674082756,
-0.13206817209720612,
-0.018903054296970367,
0.011549961753189564,
-0.06589452177286148,
0.051349617540836334,
0.04523847997188568,
0.05967196822166443,
0.026592586189508438,
-0.020031584426760674,
0.03299541771411896,
0.003997639287263155,
-0.028086939826607704,
-0.07987833023071289,
-0.09337569028139114,
-0.011032423004508018,
-0.01143111102283001,
0.04877675324678421,
-0.051330357789993286,
0.016552846878767014,
0.16488860547542572,
0.050684891641139984,
0.04454118013381958,
-0.043808240443468094,
0.06669702380895615,
0.0010002062190324068,
0.01619000919163227,
-0.0631774291396141,
-0.024359259754419327,
0.06263398379087448,
0.02549723908305168,
0.06934481114149094,
-0.14612646400928497,
-0.18088124692440033,
0.053845301270484924,
0.10823407769203186,
-0.04690796881914139,
0.1644226461648941,
0.006305398885160685,
-0.05605964735150337,
-0.10483444482088089,
-0.08456932008266449,
0.1764911562204361,
0.08518755435943604,
0.0885096862912178,
-0.11472399532794952,
-0.022598402574658394,
-0.024689361453056335,
0.04207978397607803,
0.01672748662531376,
0.096769317984581,
-0.03941543772816658,
-0.13308757543563843,
0.07260575145483017,
-0.03600882366299629,
0.052004631608724594,
0.1477377563714981,
0.05527995154261589,
-0.02630859613418579,
-0.04012581333518028,
0.025703147053718567,
0.015083611011505127,
0.061255697160959244,
-0.042083531618118286,
0.06610626727342606,
0.06235777586698532,
0.00904824212193489,
0.017634926363825798,
-0.048219677060842514,
0.05127793177962303,
0.04863003268837929,
-0.04114523157477379,
0.028816569596529007,
-0.042460568249225616,
0.05753225460648537,
0.08116178959608078,
0.05928796902298927,
0.007848921231925488,
-0.027344971895217896,
-0.03310505300760269,
-0.05380368232727051,
0.11530464887619019,
-0.08870118111371994,
-0.28121331334114075,
-0.13797429203987122,
0.06396135687828064,
-0.09439056366682053,
-0.020063912495970726,
0.056247446686029434,
-0.009957206435501575,
-0.06010651960968971,
-0.16978108882904053,
0.05435432121157646,
0.06104569882154465,
-0.0480329729616642,
-0.05906619876623154,
-0.022336479276418686,
-0.021334325894713402,
-0.1607009917497635,
-0.022391989827156067,
-0.012444745749235153,
-0.08705256879329681,
-0.013518575578927994,
0.05326696112751961,
0.028982166200876236,
0.01561388373374939,
0.016700202599167824,
-0.0742729976773262,
-0.020643604919314384,
0.17125074565410614,
-0.04885087534785271,
0.1664646714925766,
0.16165882349014282,
0.02278202958405018,
0.09104213118553162,
0.07597954571247101,
0.0074293226934969425,
-0.029933210462331772,
-0.003427579766139388,
0.04956353083252907,
-0.045669056475162506,
-0.14677229523658752,
-0.02790836989879608,
-0.08215733617544174,
0.05142306163907051,
0.008556585758924484,
0.03817494213581085,
-0.07590725272893906,
-0.0016112641897052526,
-0.02636057510972023,
0.0011947342427447438,
0.04346073418855667,
0.08274031430482864,
0.09606601297855377,
-0.006685861852020025,
0.08537792414426804,
-0.0743238776922226,
-0.024926085025072098,
0.16086852550506592,
-0.02600552700459957,
0.028772834688425064,
-0.03284260630607605,
0.24293489754199982,
0.01888996548950672,
0.014891747385263443,
0.007383542601019144,
0.14741912484169006,
-0.08124949038028717,
0.007753539364784956,
-0.05679987370967865,
-0.08804125338792801,
0.03536596894264221,
0.0034178411588072777,
0.0005583612946793437,
0.09109796583652496,
0.021295703947544098,
0.018870297819375992,
0.06951941549777985,
0.20527970790863037,
0.006657949183136225,
-0.07186684757471085,
-0.07635057717561722,
0.03443405032157898,
-0.069524385035038,
-0.09443797916173935,
-0.038081780076026917,
0.13550350069999695,
-0.16301915049552917,
0.03483179211616516,
0.0249326154589653,
0.08893830329179764,
-0.07145726680755615,
0.006584399379789829,
-0.14731508493423462,
0.11360234767198563,
-0.0455825999379158,
0.06179588660597801,
-0.051403481513261795,
-0.03187117353081703,
0.01727042719721794,
0.09392494708299637,
-0.09767130762338638,
0.010656081140041351,
0.05610271170735359,
-0.07673478871583939,
0.12418331950902939,
-0.010901166126132011,
-0.08802595734596252,
0.0414496548473835,
-0.1339256763458252,
0.018076295033097267,
0.008346079848706722,
-0.048418279737234116,
0.05491669103503227,
-0.0008147165062837303,
-0.009865694679319859,
-0.03951559588313103,
-0.07290492951869965,
0.01915309950709343,
-0.1521277278661728,
0.006545680575072765,
-0.05142182856798172,
-0.0003126438823528588,
-0.05009538680315018,
-0.03168049454689026,
-0.0009475109982304275,
0.2610875070095062,
-0.17655058205127716,
-0.08729878067970276,
-0.09693912416696548,
-0.00047190749319270253,
0.06175507977604866,
-0.06672608852386475,
-0.0013964108657091856,
-0.005804304964840412,
0.10493341833353043,
-0.030575981363654137,
-0.03891022503376007,
0.06571485847234726,
-0.04502399265766144,
-0.14177542924880981,
0.011756820604205132,
0.10870193690061569,
0.02521347627043724,
0.058117423206567764,
0.018517490476369858,
0.03848376125097275,
-0.036752019077539444,
-0.13522569835186005,
-0.05936107039451599,
0.04004605859518051,
0.05252309516072273,
0.0601620189845562,
-0.038638778030872345,
-0.08365203440189362,
-0.030606022104620934,
-0.014145082794129848,
0.19897834956645966,
0.1975586712360382,
-0.07122679799795151,
0.17026640474796295,
0.08999572694301605,
-0.06389651447534561,
-0.2285781353712082,
0.03308066353201866,
-0.002961322432383895,
0.016675017774105072,
0.04120609536767006,
-0.13492870330810547,
0.15095841884613037,
0.1371431052684784,
-0.00794046837836504,
0.021590456366539,
-0.18271827697753906,
-0.09501159191131592,
-0.014997362159192562,
-0.05048111081123352,
0.12362951785326004,
-0.051727648824453354,
-0.006282619666308165,
-0.035115182399749756,
-0.06539048254489899,
0.07272942364215851,
-0.020090851932764053,
0.0756998136639595,
-0.016836784780025482,
-0.02562045492231846,
0.03580445796251297,
-0.04084038361907005,
0.13570536673069,
-0.04673011228442192,
0.08382969349622726,
-0.04986358433961868,
-0.0013360906159505248,
0.14776304364204407,
-0.011877755634486675,
0.1543884575366974,
-0.04325779899954796,
0.013621920719742775,
-0.12133903801441193,
-0.024714766070246696,
-0.11657679826021194,
0.06478866189718246,
-0.0346546396613121,
0.0072678993456065655,
-0.11664701998233795,
0.06450766324996948,
0.12642040848731995,
0.018696939572691917,
0.04886631667613983,
-0.04011177644133568,
-0.10495969653129578,
0.153241828083992,
0.13612759113311768,
-0.08545757085084915,
-0.10932367295026779,
0.015755051746964455,
0.005949918180704117,
0.014527808874845505,
0.03180517256259918,
0.056939952075481415,
0.07491254061460495,
0.0351661741733551,
0.06573972851037979,
0.030276646837592125,
-0.11443210393190384,
0.04095279797911644,
0.04126863554120064,
-0.07107697427272797,
-0.22839350998401642,
-0.011270175687968731,
0.002937017008662224,
-0.09362000972032547,
-0.11958659440279007,
0.09875664114952087,
0.006578855682164431,
-0.044151630252599716,
0.012033394537866116,
0.12012137472629547,
-0.004218364134430885,
0.08064821362495422,
-0.009107152000069618,
0.017202598974108696,
-0.07688219100236893,
0.07688792794942856,
0.09748756140470505,
-0.06174568086862564,
0.01630501262843609,
0.1400102972984314,
-0.06347110867500305,
-0.03974255546927452,
0.07477013021707535,
0.06519646942615509,
0.0765780583024025,
0.035257354378700256,
-0.029004506766796112,
-0.09246741980314255,
0.01831788755953312,
0.09392005950212479,
-0.00843901839107275,
0.02816801518201828,
-0.051002953201532364,
0.004916928242892027,
-0.059905339032411575,
0.08942685276269913,
-0.013173804618418217,
0.014753462746739388,
0.06667209416627884,
0.0945974811911583,
-0.03170185163617134,
0.014695798978209496,
-0.013912147842347622,
-0.004639905411750078,
-0.05625533312559128,
-0.061557866632938385,
-0.12375535815954208,
0.03059404157102108,
-0.09892240911722183,
-0.018929047510027885,
-0.032277654856443405,
0.030175931751728058,
-0.013734099455177784,
0.028790351003408432,
-0.02217898890376091,
-0.05893997475504875,
-0.044155724346637726,
0.05059437081217766,
-0.1410001516342163,
-0.014692619442939758,
0.06910990923643112,
-0.07398141920566559,
0.0918799489736557,
-0.005469976458698511,
0.02085430920124054,
-0.05101757496595383,
0.020828787237405777,
-0.08721619099378586,
-0.022899482399225235,
0.04148631542921066,
0.010838541202247143,
-0.09684760868549347,
-0.00808778591454029,
-0.009966105222702026,
-0.04591304063796997,
-0.005047032609581947,
0.05198241025209427,
-0.07752690464258194,
0.12993022799491882,
-0.017572810873389244,
0.012336364015936852,
-0.0391535647213459,
0.017584621906280518,
0.022765571251511574,
-0.003516317345201969,
0.1572948545217514,
-0.004787959158420563,
0.023544518277049065,
-0.11480040103197098,
-0.012354088947176933,
-0.049039676785469055,
-0.07690471410751343,
0.0799209251999855,
-0.022103682160377502,
0.07562125474214554,
0.004210501443594694,
0.07270771265029907,
0.018448537215590477,
-0.10085073113441467,
0.04756799340248108,
0.06132045388221741,
-0.04063974320888519,
-0.023854851722717285,
0.03031754493713379,
0.0013313253875821829,
-0.059391092509031296,
0.0775945708155632,
-0.028744373470544815,
-0.00916801393032074,
0.09388834238052368,
0.23726297914981842,
0.1324828416109085,
0.0900106355547905,
0.05205549672245979,
0.038063015788793564,
0.026627736166119576,
-0.14803293347358704,
0.02677225135266781,
0.005763099063187838,
0.05075846612453461,
-0.027041371911764145,
0.03055793046951294,
0.12668825685977936,
-0.150286003947258,
0.14841888844966888,
0.03796176239848137,
-0.014862668700516224,
-0.09339243173599243,
-0.1404666304588318,
-0.06332045793533325,
0.07991040498018265,
-0.04934310168027878,
-0.11818858236074448,
0.04225122928619385,
0.13285458087921143,
0.050113193690776825,
0.01375238411128521,
0.1047963947057724,
-0.1794925183057785,
-0.11345922946929932,
0.08913940191268921,
0.007854617200791836,
-0.003921448718756437,
0.01763877645134926,
0.07058883458375931,
0.009909482672810555,
0.06823264062404633,
0.08089416474103928,
0.11101874709129333,
0.10409239679574966,
-0.051742054522037506,
-0.07367780804634094,
-0.08089861273765564,
-0.026505528017878532,
-0.0014063366688787937,
0.0011696509318426251,
0.1399373561143875,
0.07845042645931244,
-0.054029952734708786,
0.018280982971191406,
0.1746608167886734,
-0.017769649624824524,
-0.11263752728700638,
-0.09772368520498276,
0.13403081893920898,
0.009695015847682953,
-0.00035218734410591424,
0.0006381921120919287,
-0.11068814247846603,
0.029650164768099785,
0.17636869847774506,
0.10997546464204788,
-0.06132696196436882,
0.016634974628686905,
-0.02707311138510704,
-0.010542172007262707,
0.00797160156071186,
0.0011674471898004413,
0.05510968714952469,
0.2746906280517578,
-0.0649237409234047,
0.11418266594409943,
-0.030683690682053566,
-0.03694313019514084,
-0.17813768982887268,
0.11167596280574799,
-0.058568548411130905,
0.03783029690384865,
-0.0299002043902874,
0.011864250525832176,
-0.03673618286848068,
-0.3050394654273987,
-0.015924101695418358,
-0.04525642469525337,
-0.1259528249502182,
0.04477676376700401,
-0.03327428176999092,
0.04779920354485512,
0.04289945960044861,
0.07876227796077728,
0.0008601016015745699,
0.08218231797218323,
0.009040767326951027,
-0.060362428426742554,
-0.053648073226213455,
0.07777795195579529,
-0.06560008972883224,
0.1429714560508728,
0.02393762394785881,
-0.040487099438905716,
0.10897445678710938,
-0.031122224405407906,
-0.14592328667640686,
0.027625730261206627,
-0.0006484430632553995,
-0.02275128662586212,
0.050679516047239304,
0.1115165501832962,
0.022497372701764107,
0.016225475817918777,
0.09317724406719208,
0.003578766016289592,
0.012594304978847504,
0.08676467090845108,
-0.03770485892891884,
-0.05040748044848442,
0.03206571191549301,
-0.07291844487190247,
0.10239002108573914,
0.13472159206867218,
-0.045947492122650146,
0.03555503487586975,
-0.04668271541595459,
-0.03340453281998634,
-0.001768021029420197,
0.11382564902305603,
-0.020475713536143303,
-0.11443273723125458,
0.023944642394781113,
-0.04879525676369667,
0.0933440625667572,
-0.1829386204481125,
-0.08354359865188599,
0.028616011142730713,
-0.019506685435771942,
-0.01623420976102352,
0.02755582518875599,
0.04030516371130943,
-0.03437253460288048,
-0.006110259797424078,
-0.04627317562699318,
-0.00685363681986928,
0.08165057003498077,
-0.06298693269491196,
-0.026500826701521873
] |
null | null | null |
# Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis Implementation
[](https://huggingface.co/spaces/flax-community/DietNerf-Demo) [](https://colab.research.google.com/drive/1etYeMTntw5mh3FvJv4Ubb7XUoTtt5J9G?usp=sharing)
<p align="center"><img width="450" alt="스크린샷 2021-07-04 오후 4 11 51" src="https://user-images.githubusercontent.com/77657524/126361638-4aad58e8-4efb-4fc5-bf78-f53d03799e1e.png"></p>
This project attempted to implement the paper **[Putting NeRF on a Diet](https://arxiv.org/abs/2104.00677)** (DietNeRF) in JAX/Flax.
DietNeRF is designed for rendering quality novel views in few-shot learning scheme, a task that vanilla NeRF (Neural Radiance Field) struggles.
To achieve this, the author coins **Semantic Consistency Loss** to supervise DietNeRF by prior knowledge from CLIP Vision Transformer. Such supervision enables DietNeRF to learn 3D scene reconstruction with CLIP's prior knowledge on 2D views.
Besides this repo, you can check our write-up and demo here:
- ✍️ **[Write-up in Notion](https://steep-cycle-f6b.notion.site/DietNeRF-Putting-NeRF-on-a-Diet-4aeddae95d054f1d91686f02bdb74745)**: more details of DietNeRF and our experiments
- ✨ **[Demo in Hugging Face Space](https://huggingface.co/spaces/flax-community/DietNerf-Demo)**: showcase our trained DietNeRFs by Streamlit
## 🤩 Demo
1. You can check out [our demo in Hugging Face Space](https://huggingface.co/spaces/flax-community/DietNerf-Demo)
2. Or you can set up our Streamlit demo locally (model checkpoints will be fetched automatically upon startup)
```shell
pip install -r requirements_demo.txt
streamlit run app.py
```
<p align="center"><img width="600" height="400" alt="Streamlit Demo" src="assets/space_demo.png"></p>
## ✨ Implementation
Our code is written in JAX/ Flax and mainly based upon [jaxnerf](https://github.com/google-research/google-research/tree/master/jaxnerf) from Google Research. The base code is highly optimized in GPU & TPU. For semantic consistency loss, we utilize pretrained CLIP Vision Transformer from [transformers](https://github.com/huggingface/transformers) library.
To learn more about DietNeRF, our experiments and implementation, you are highly recommended to check out our very detailed **[Notion write-up](https://www.notion.so/DietNeRF-Putting-NeRF-on-a-Diet-4aeddae95d054f1d91686f02bdb74745)**!
<p align="center"><img width="500" height="600" alt="스크린샷 2021-07-04 오후 4 11 51" src="assets/report_thumbnail.png"></p>
## 🤗 Hugging Face Model Hub Repo
You can also find our project on the [Hugging Face Model Hub Repository](https://huggingface.co/flax-community/putting-nerf-on-a-diet/).
Our JAX/Flax implementation currently supports:
<table class="tg">
<thead>
<tr>
<th class="tg-0lax"><span style="font-weight:bold">Platform</span></th>
<th class="tg-0lax" colspan="2"><span style="font-weight:bold">Single-Host GPU</span></th>
<th class="tg-0lax" colspan="2"><span style="font-weight:bold">Multi-Device TPU</span></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-0lax"><span style="font-weight:bold">Type</span></td>
<td class="tg-0lax">Single-Device</td>
<td class="tg-0lax">Multi-Device</td>
<td class="tg-0lax">Single-Host</td>
<td class="tg-0lax">Multi-Host</td>
</tr>
<tr>
<td class="tg-0lax"><span style="font-weight:bold">Training</span></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
</tr>
<tr>
<td class="tg-0lax"><span style="font-weight:bold">Evaluation</span></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
<td class="tg-0lax"><img src="http://storage.googleapis.com/gresearch/jaxnerf/check.png" alt="Supported" width=18px height=18px></td>
</tr>
</tbody>
</table>
## 💻 Installation
```bash
# Clone the repo
git clone https://github.com/codestella/putting-nerf-on-a-diet
# Create a conda environment, note you can use python 3.6-3.8 as
# one of the dependencies (TensorFlow) hasn't supported python 3.9 yet.
conda create --name jaxnerf python=3.6.12; conda activate jaxnerf
# Prepare pip
conda install pip; pip install --upgrade pip
# Install requirements
pip install -r requirements.txt
# [Optional] Install GPU and TPU support for Jax
# Remember to change cuda101 to your CUDA version, e.g. cuda110 for CUDA 11.0.
!pip install --upgrade jax "jax[cuda110]" -f https://storage.googleapis.com/jax-releases/jax_releases.html
# install flax and flax-transformer
pip install flax transformers[flax]
```
## ⚽ Dataset
Download the datasets from the [NeRF official Google Drive](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1).
Please download the `nerf_synthetic.zip` and unzip them
in the place you like. Let's assume they are placed under `/tmp/jaxnerf/data/`.
## 💖 Methods
* 👉👉 You can check VEEEERY detailed explanation about our project on [Notion Report](https://www.notion.so/DietNeRF-Putting-NeRF-on-a-Diet-4aeddae95d054f1d91686f02bdb74745)
<p align="center"><img width="400" alt="스크린샷 2021-07-04 오후 4 11 51" src="https://user-images.githubusercontent.com/77657524/124376591-b312b780-dce2-11eb-80ad-9129d6f5eedb.png"></p>
Based on the principle
that “a bulldozer is a bulldozer from any perspective”, Our proposed DietNeRF supervises the radiance field from arbitrary poses
(DietNeRF cameras). This is possible because we compute a semantic consistency loss in a feature space capturing high-level
scene attributes, not in pixel space. We extract semantic representations of renderings using the CLIP Vision Transformer, then
maximize similarity with representations of ground-truth views. In
effect, we use prior knowledge about scene semantics learned by
single-view 2D image encoders to constrain a 3D representation.
You can check detail information on the author's paper. Also, you can check the CLIP based semantic loss structure on the following image.
<p align="center"><img width="600" alt="스크린샷 2021-07-04 오후 4 11 51" src="https://user-images.githubusercontent.com/77657524/126386709-a4ce7ff8-2a68-442f-b4ed-26971fb90e51.png"></p>
Our code used JAX/FLAX framework for implementation. So that it can achieve much speed up than other NeRF codes. At last, our code used hugging face, transformer, CLIP model library.
## 🤟 How to use
```
python -m train \
--data_dir=/PATH/TO/YOUR/SCENE/DATA \ % e.g., nerf_synthetic/lego
--train_dir=/PATH/TO/THE/PLACE/YOU/WANT/TO/SAVE/CHECKPOINTS \
--config=configs/CONFIG_YOU_LIKE
```
You can toggle the semantic loss by “use_semantic_loss” in configuration files.
## 💎 Experimental Results
### ❗ Rendered Rendering images by 8-shot learned Diet-NeRF
DietNeRF has a strong capacity to generalise on novel and challenging views with EXTREMELY SMALL TRAINING SAMPLES!
### HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC
<img alt="" src="https://user-images.githubusercontent.com/77657524/126976706-caec6d6c-6126-45d0-8680-4c883f71f5bb.png" width="250"/></td><td><img alt="" src="https://user-images.githubusercontent.com/77657524/126976868-183af09a-47b3-4c76-ba20-90e9fef17bcc.png" width="250"/><td><img alt="" src="https://user-images.githubusercontent.com/77657524/126977843-18b4b077-1db0-4287-8e5c-baa10c46e647.png" width="250"/>
<img alt="" src="https://user-images.githubusercontent.com/77657524/126977066-9c99a882-7a46-4a1d-921f-cdb0eee60f39.gif" width="250"/><img alt="" src="https://user-images.githubusercontent.com/77657524/126913553-19ebd2f2-c5f1-4332-a253-950e41cb5229.gif" width="300"/><img alt="" src="https://user-images.githubusercontent.com/77657524/126913559-dfce4b88-84a8-4a0a-91eb-ed12716ab328.gif" width="300"/>
### ❗ Rendered GIF by occluded 14-shot learned NeRF and Diet-NeRF
We made artificial occlusion on the right side of image (Only picked left side training poses).
The reconstruction quality can be compared with this experiment.
DietNeRF shows better quality than Original NeRF when It is occluded.
#### Training poses
<img width="1400" src="https://user-images.githubusercontent.com/26036843/126111980-4f332c87-a7f0-42e0-a355-8e77621bbca4.png">
#### LEGO
[DietNeRF]
<img alt="" src="https://user-images.githubusercontent.com/77657524/126913404-800777f8-8f88-451a-92de-3dda25075206.gif" width="300"/>
[NeRF]
<img alt="" src="https://user-images.githubusercontent.com/77657524/126913412-f10dfb3e-e918-4ff4-aa2c-63529fec91d8.gif" width="300"/>
#### SHIP
[DietNeRF]
<img alt="" src="https://user-images.githubusercontent.com/77657524/126913430-0014a904-6ca1-4a7b-9cd6-6f73b36552fb.gif" width="300"/>
[NeRF]
<img alt="" src="https://user-images.githubusercontent.com/77657524/126913439-2e3128ef-c7ef-4c21-8261-6e3b8fe51f86.gif" width="300"/>
## 👨👧👦 Our Teams
| Teams | Members |
|------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Project Managing | [Stella Yang](https://github.com/codestella) To Watch Our Project Progress, Please Check [Our Project Notion](https://www.notion.so/Putting-NeRF-on-a-Diet-e0caecea0c2b40c3996c83205baf870d) |
| NeRF Team | [Stella Yang](https://github.com/codestella), [Alex Lau](https://github.com/riven314), [Seunghyun Lee](https://github.com/sseung0703), [Hyunkyu Kim](https://github.com/minus31), [Haswanth Aekula](https://github.com/hassiahk), [JaeYoung Chung](https://github.com/robot0321) |
| CLIP Team | [Seunghyun Lee](https://github.com/sseung0703), [Sasikanth Kotti](https://github.com/ksasi), [Khali Sifullah](https://github.com/khalidsaifullaah) , [Sunghyun Kim](https://github.com/MrBananaHuman) |
| Cloud TPU Team | [Alex Lau](https://github.com/riven314), [Aswin Pyakurel](https://github.com/masapasa), [JaeYoung Chung](https://github.com/robot0321), [Sunghyun Kim](https://github.com/MrBananaHuman) |
* Extremely Don't Sleep Contributors 🤣: [Seunghyun Lee](https://github.com/sseung0703), [Alex Lau](https://github.com/riven314), [Stella Yang](https://github.com/codestella), [Haswanth Aekula](https://github.com/hassiahk)
## 😎 What we improved from original JAX-NeRF : Innovation
- Neural rendering with fewshot images
- Hugging face CLIP based semantic loss loop
- You can choose coarse mlp / coarse + fine mlp training
(coarse + fine is on the `main` branch / coarse is on the `coarse_only` branch)
* coarse + fine : shows good geometric reconstruction
* coarse : shows good PSNR/SSIM result
- Make Video/GIF rendering result, `--generate_gif_only` arg can run fast rendering GIF.
- Cleaning / refactoring the code
- Made multiple models / colab / space for Nice demo
## 💞 Social Impact
- Game Industry
- Augmented Reality Industry
- Virtual Reality Industry
- Graphics Industry
- Online shopping
- Metaverse
- Digital Twin
- Mapping / SLAM
## 🌱 References
This project is based on “JAX-NeRF”.
```
@software{jaxnerf2020github,
author = {Boyang Deng and Jonathan T. Barron and Pratul P. Srinivasan},
title = {{JaxNeRF}: an efficient {JAX} implementation of {NeRF}},
url = {https://github.com/google-research/google-research/tree/master/jaxnerf},
version = {0.0},
year = {2020},
}
```
This project is based on “Putting NeRF on a Diet”.
```
@misc{jain2021putting,
title={Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis},
author={Ajay Jain and Matthew Tancik and Pieter Abbeel},
year={2021},
eprint={2104.00677},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## 🔑 License
[Apache License 2.0](https://github.com/codestella/putting-nerf-on-a-diet/blob/main/LICENSE)
## ❤️ Special Thanks
Our Project is started in the [HuggingFace X GoogleAI (JAX) Community Week Event](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104).
Thank you for our mentor Suraj and organizers in JAX/Flax Community Week!
Our team grows up with this community learning experience. It was wonderful time!
<img width="250" alt="스크린샷 2021-07-04 오후 4 11 51" src="https://user-images.githubusercontent.com/77657524/126369170-5664076c-ac99-4157-bc53-b91dfb7ed7e1.jpeg">
[Common Computer AI](https://comcom.ai/en/) sponsored multiple V100 GPUs for our project!
Thank you so much for your support!
<img width="250" alt="스크린샷" src="https://user-images.githubusercontent.com/77657524/126914984-d959be06-19f4-4228-8d3a-a855396b2c3f.jpeg">
|
{}
| null |
flax-community/putting-nerf-on-a-diet
|
[
"arxiv:2104.00677",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2104.00677"
] |
[] |
TAGS
#arxiv-2104.00677 #region-us
|
Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis Implementation
======================================================================================
 in JAX/Flax.
DietNeRF is designed for rendering quality novel views in few-shot learning scheme, a task that vanilla NeRF (Neural Radiance Field) struggles.
To achieve this, the author coins Semantic Consistency Loss to supervise DietNeRF by prior knowledge from CLIP Vision Transformer. Such supervision enables DietNeRF to learn 3D scene reconstruction with CLIP's prior knowledge on 2D views.</p>
<p>Besides this repo, you can check our write-up and demo here:</p>
<ul>
<li>️ Write-up in Notion: more details of DietNeRF and our experiments</li>
<li>Demo in Hugging Face Space: showcase our trained DietNeRFs by Streamlit</li>
</ul>
<h2>Demo</h2>
<ol>
<li>You can check out our demo in Hugging Face Space</li>
<li>Or you can set up our Streamlit demo locally (model checkpoints will be fetched automatically upon startup)</li>
</ol>
<p align=)
Implementation
--------------
Our code is written in JAX/ Flax and mainly based upon jaxnerf from Google Research. The base code is highly optimized in GPU & TPU. For semantic consistency loss, we utilize pretrained CLIP Vision Transformer from transformers library.
To learn more about DietNeRF, our experiments and implementation, you are highly recommended to check out our very detailed Notion write-up!

Hugging Face Model Hub Repo
---------------------------
You can also find our project on the Hugging Face Model Hub Repository.
Our JAX/Flax implementation currently supports:
| Platform | Single-Host GPU | Multi-Device TPU |
| --- | --- | --- |
| Type | Single-Device | Multi-Device | Single-Host | Multi-Host |
| Training | | | | |
| Evaluation | | | | |
Installation
------------
Dataset
-------
Download the datasets from the NeRF official Google Drive.
Please download the 'nerf\_synthetic.zip' and unzip them
in the place you like. Let's assume they are placed under '/tmp/jaxnerf/data/'.
Methods
-------
* You can check VEEEERY detailed explanation about our project on Notion Report
. This is possible because we compute a semantic consistency loss in a feature space capturing high-level
scene attributes, not in pixel space. We extract semantic representations of renderings using the CLIP Vision Transformer, then
maximize similarity with representations of ground-truth views. In
effect, we use prior knowledge about scene semantics learned by
single-view 2D image encoders to constrain a 3D representation.</p>
<p>You can check detail information on the author's paper. Also, you can check the CLIP based semantic loss structure on the following image.</p>
<p align=)<img width="600" alt="스크린샷 2021-07-04 오후 4 11 51" src="URL
<p>Our code used JAX/FLAX framework for implementation. So that it can achieve much speed up than other NeRF codes. At last, our code used hugging face, transformer, CLIP model library.
How to use
----------
You can toggle the semantic loss by “use\_semantic\_loss” in configuration files.
Experimental Results
--------------------
### Rendered Rendering images by 8-shot learned Diet-NeRF
DietNeRF has a strong capacity to generalise on novel and challenging views with EXTREMELY SMALL TRAINING SAMPLES!
### HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC
<img alt="" src="URL width="250"/> <img alt="" src="URL width="250"/> <img alt="" src="URL width="250"/>
<img alt="" src="URL width="250"/><img alt="" src="URL width="300"/><img alt="" src="URL width="300"/> | |
### Rendered GIF by occluded 14-shot learned NeRF and Diet-NeRF
We made artificial occlusion on the right side of image (Only picked left side training poses).
The reconstruction quality can be compared with this experiment.
DietNeRF shows better quality than Original NeRF when It is occluded.
#### Training poses
<img width="1400" src="URL
#### LEGO
[DietNeRF]
<img alt="" src="URL width="300"/>
[NeRF]
<img alt="" src="URL width="300"/>
#### SHIP
[DietNeRF]
<img alt="" src="URL width="300"/>
[NeRF]
<img alt="" src="URL width="300"/>
Our Teams
------------
* Extremely Don't Sleep Contributors : Seunghyun Lee, Alex Lau, Stella Yang, Haswanth Aekula
What we improved from original JAX-NeRF : Innovation
----------------------------------------------------
* Neural rendering with fewshot images
* Hugging face CLIP based semantic loss loop
* You can choose coarse mlp / coarse + fine mlp training
(coarse + fine is on the 'main' branch / coarse is on the 'coarse\_only' branch)
+ coarse + fine : shows good geometric reconstruction
+ coarse : shows good PSNR/SSIM result
* Make Video/GIF rendering result, '--generate\_gif\_only' arg can run fast rendering GIF.
* Cleaning / refactoring the code
* Made multiple models / colab / space for Nice demo
Social Impact
-------------
* Game Industry
* Augmented Reality Industry
* Virtual Reality Industry
* Graphics Industry
* Online shopping
* Metaverse
* Digital Twin
* Mapping / SLAM
References
----------
This project is based on “JAX-NeRF”.
This project is based on “Putting NeRF on a Diet”.
License
-------
Apache License 2.0
️ Special Thanks
----------------
Our Project is started in the HuggingFace X GoogleAI (JAX) Community Week Event.
Thank you for our mentor Suraj and organizers in JAX/Flax Community Week!
Our team grows up with this community learning experience. It was wonderful time!
<img width="250" alt="스크린샷 2021-07-04 오후 4 11 51" src="URL
Common Computer AI sponsored multiple V100 GPUs for our project!
Thank you so much for your support!
<img width="250" alt="스크린샷" src="URL
|
[
"### Rendered Rendering images by 8-shot learned Diet-NeRF\n\n\nDietNeRF has a strong capacity to generalise on novel and challenging views with EXTREMELY SMALL TRAINING SAMPLES!",
"### HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC\n\n\n<img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/>\n<img alt=\"\" src=\"URL width=\"250\"/><img alt=\"\" src=\"URL width=\"300\"/><img alt=\"\" src=\"URL width=\"300\"/> | |",
"### Rendered GIF by occluded 14-shot learned NeRF and Diet-NeRF\n\n\nWe made artificial occlusion on the right side of image (Only picked left side training poses).\nThe reconstruction quality can be compared with this experiment.\nDietNeRF shows better quality than Original NeRF when It is occluded.",
"#### Training poses\n\n\n<img width=\"1400\" src=\"URL",
"#### LEGO\n\n\n[DietNeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n[NeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>",
"#### SHIP\n\n\n[DietNeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n[NeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n\n\n Our Teams\n------------\n\n\n\n* Extremely Don't Sleep Contributors : Seunghyun Lee, Alex Lau, Stella Yang, Haswanth Aekula\n\n\nWhat we improved from original JAX-NeRF : Innovation\n----------------------------------------------------\n\n\n* Neural rendering with fewshot images\n* Hugging face CLIP based semantic loss loop\n* You can choose coarse mlp / coarse + fine mlp training\n(coarse + fine is on the 'main' branch / coarse is on the 'coarse\\_only' branch)\n\t+ coarse + fine : shows good geometric reconstruction\n\t+ coarse : shows good PSNR/SSIM result\n* Make Video/GIF rendering result, '--generate\\_gif\\_only' arg can run fast rendering GIF.\n* Cleaning / refactoring the code\n* Made multiple models / colab / space for Nice demo\n\n\nSocial Impact\n-------------\n\n\n* Game Industry\n* Augmented Reality Industry\n* Virtual Reality Industry\n* Graphics Industry\n* Online shopping\n* Metaverse\n* Digital Twin\n* Mapping / SLAM\n\n\nReferences\n----------\n\n\nThis project is based on “JAX-NeRF”.\n\n\nThis project is based on “Putting NeRF on a Diet”.\n\n\nLicense\n-------\n\n\nApache License 2.0\n\n\n️ Special Thanks\n----------------\n\n\nOur Project is started in the HuggingFace X GoogleAI (JAX) Community Week Event.\n\n\nThank you for our mentor Suraj and organizers in JAX/Flax Community Week!\nOur team grows up with this community learning experience. It was wonderful time!\n\n\n<img width=\"250\" alt=\"스크린샷 2021-07-04 오후 4 11 51\" src=\"URL\n\n\nCommon Computer AI sponsored multiple V100 GPUs for our project!\nThank you so much for your support!\n<img width=\"250\" alt=\"스크린샷\" src=\"URL"
] |
[
"TAGS\n#arxiv-2104.00677 #region-us \n",
"### Rendered Rendering images by 8-shot learned Diet-NeRF\n\n\nDietNeRF has a strong capacity to generalise on novel and challenging views with EXTREMELY SMALL TRAINING SAMPLES!",
"### HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC\n\n\n<img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/>\n<img alt=\"\" src=\"URL width=\"250\"/><img alt=\"\" src=\"URL width=\"300\"/><img alt=\"\" src=\"URL width=\"300\"/> | |",
"### Rendered GIF by occluded 14-shot learned NeRF and Diet-NeRF\n\n\nWe made artificial occlusion on the right side of image (Only picked left side training poses).\nThe reconstruction quality can be compared with this experiment.\nDietNeRF shows better quality than Original NeRF when It is occluded.",
"#### Training poses\n\n\n<img width=\"1400\" src=\"URL",
"#### LEGO\n\n\n[DietNeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n[NeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>",
"#### SHIP\n\n\n[DietNeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n[NeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n\n\n Our Teams\n------------\n\n\n\n* Extremely Don't Sleep Contributors : Seunghyun Lee, Alex Lau, Stella Yang, Haswanth Aekula\n\n\nWhat we improved from original JAX-NeRF : Innovation\n----------------------------------------------------\n\n\n* Neural rendering with fewshot images\n* Hugging face CLIP based semantic loss loop\n* You can choose coarse mlp / coarse + fine mlp training\n(coarse + fine is on the 'main' branch / coarse is on the 'coarse\\_only' branch)\n\t+ coarse + fine : shows good geometric reconstruction\n\t+ coarse : shows good PSNR/SSIM result\n* Make Video/GIF rendering result, '--generate\\_gif\\_only' arg can run fast rendering GIF.\n* Cleaning / refactoring the code\n* Made multiple models / colab / space for Nice demo\n\n\nSocial Impact\n-------------\n\n\n* Game Industry\n* Augmented Reality Industry\n* Virtual Reality Industry\n* Graphics Industry\n* Online shopping\n* Metaverse\n* Digital Twin\n* Mapping / SLAM\n\n\nReferences\n----------\n\n\nThis project is based on “JAX-NeRF”.\n\n\nThis project is based on “Putting NeRF on a Diet”.\n\n\nLicense\n-------\n\n\nApache License 2.0\n\n\n️ Special Thanks\n----------------\n\n\nOur Project is started in the HuggingFace X GoogleAI (JAX) Community Week Event.\n\n\nThank you for our mentor Suraj and organizers in JAX/Flax Community Week!\nOur team grows up with this community learning experience. It was wonderful time!\n\n\n<img width=\"250\" alt=\"스크린샷 2021-07-04 오후 4 11 51\" src=\"URL\n\n\nCommon Computer AI sponsored multiple V100 GPUs for our project!\nThank you so much for your support!\n<img width=\"250\" alt=\"스크린샷\" src=\"URL"
] |
[
14,
45,
123,
70,
16,
47,
461
] |
[
"passage: TAGS\n#arxiv-2104.00677 #region-us \n### Rendered Rendering images by 8-shot learned Diet-NeRF\n\n\nDietNeRF has a strong capacity to generalise on novel and challenging views with EXTREMELY SMALL TRAINING SAMPLES!### HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC\n\n\n<img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/> <img alt=\"\" src=\"URL width=\"250\"/>\n<img alt=\"\" src=\"URL width=\"250\"/><img alt=\"\" src=\"URL width=\"300\"/><img alt=\"\" src=\"URL width=\"300\"/> | |### Rendered GIF by occluded 14-shot learned NeRF and Diet-NeRF\n\n\nWe made artificial occlusion on the right side of image (Only picked left side training poses).\nThe reconstruction quality can be compared with this experiment.\nDietNeRF shows better quality than Original NeRF when It is occluded.#### Training poses\n\n\n<img width=\"1400\" src=\"URL#### LEGO\n\n\n[DietNeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>\n[NeRF]\n<img alt=\"\" src=\"URL width=\"300\"/>"
] |
[
0.010003278963267803,
0.1175350695848465,
-0.0035075151827186346,
0.0756542831659317,
0.04043581709265709,
0.024915747344493866,
0.2026229053735733,
0.10817859321832657,
-0.0710400938987732,
0.10355141013860703,
0.001534470939077437,
-0.04185699671506882,
0.10662039369344711,
0.004367694724351168,
0.02487475797533989,
-0.23100227117538452,
0.0565575547516346,
-0.02995327301323414,
-0.054469093680381775,
0.11012479662895203,
0.022656084969639778,
-0.06952620297670364,
0.04448111355304718,
0.06326647102832794,
-0.13269437849521637,
0.027375349774956703,
-0.008019352331757545,
0.004369647707790136,
0.01619136333465576,
-0.03890994191169739,
0.12879616022109985,
0.08598024398088455,
0.06243569776415825,
-0.22379392385482788,
0.021245140582323074,
0.08300136029720306,
-0.01935320720076561,
0.08755084872245789,
0.06917664408683777,
0.029792457818984985,
0.1690826117992401,
-0.016541313380002975,
0.006176559254527092,
0.03829061985015869,
-0.10385460406541824,
-0.1493801325559616,
0.009035922586917877,
0.1597709059715271,
-0.06960175931453705,
0.03028891049325466,
-0.020525097846984863,
-0.023901162669062614,
-0.1253819316625595,
0.06314709037542343,
0.22562603652477264,
-0.12421638518571854,
-0.13677172362804413,
0.11421408504247665,
-0.07567950338125229,
0.07635249942541122,
-0.13631302118301392,
0.03747021034359932,
0.0064917197450995445,
0.010430923663079739,
0.025990452617406845,
0.04805224761366844,
-0.05704536288976669,
0.049794893711805344,
-0.10069651901721954,
0.021219857037067413,
0.12327026575803757,
0.16568847000598907,
-0.01155569963157177,
-0.10937102884054184,
-0.03928789868950844,
-0.05209140107035637,
-0.05707337707281113,
-0.02928038313984871,
0.024413587525486946,
-0.029618214815855026,
-0.17936402559280396,
-0.05415106564760208,
-0.0530553013086319,
-0.05739789828658104,
0.012305449694395065,
0.07316634804010391,
0.031392715871334076,
-0.0005853845505043864,
-0.008746285922825336,
0.08702750504016876,
0.047218166291713715,
-0.1965189278125763,
-0.03834430128335953,
-0.046446532011032104,
-0.02659824676811695,
-0.007288716733455658,
-0.013381127268075943,
-0.0371977761387825,
-0.015080563724040985,
0.17598949372768402,
0.0927414670586586,
0.05402803421020508,
-0.05779820680618286,
0.043158527463674545,
-0.0740271732211113,
0.1016344353556633,
-0.16013994812965393,
0.07665321975946426,
-0.025957928970456123,
0.18842794001102448,
-0.07238686829805374,
-0.022822881117463112,
-0.02587117813527584,
-0.04132016748189926,
-0.0410756915807724,
-0.0788300484418869,
0.009843840263783932,
0.016950471326708794,
-0.06736557185649872,
-0.027207471430301666,
0.1377132385969162,
-0.054262299090623856,
0.016644923016428947,
0.04314592108130455,
-0.12438812851905823,
0.10148174315690994,
0.07062312960624695,
0.02248946763575077,
-0.0030129062943160534,
0.0419558584690094,
-0.09037604182958603,
0.030551278963685036,
-0.07952697575092316,
-0.03352167084813118,
0.0043899258598685265,
0.15740682184696198,
-0.07642883062362671,
-0.09037713706493378,
-0.07272551953792572,
-0.06151299178600311,
0.048642855137586594,
-0.07907933741807938,
0.046848196536302567,
-0.07358532398939133,
-0.05695493519306183,
-0.01673785410821438,
0.05652774125337601,
-0.00443276995792985,
0.03285988047719002,
0.06218021363019943,
0.020100895315408707,
0.091908298432827,
0.159079909324646,
-0.04843994602560997,
-0.05734008178114891,
0.05502140149474144,
-0.14142008125782013,
0.11668208241462708,
-0.06343095749616623,
-0.05029705539345741,
-0.08383182436227798,
-0.08591247349977493,
-0.2255028635263443,
-0.07242151349782944,
0.054786261171102524,
0.12028242647647858,
-0.16616305708885193,
-0.06430167704820633,
0.16270485520362854,
-0.0008790853898972273,
-0.024944012984633446,
0.09235689043998718,
-0.07541500777006149,
-0.02286011539399624,
0.03401034325361252,
0.022030765190720558,
0.07197535037994385,
-0.07300068438053131,
-0.1408255249261856,
-0.048385199159383774,
0.010946483351290226,
0.14927804470062256,
0.036576636135578156,
0.11662820726633072,
0.033930473029613495,
0.042230818420648575,
-0.06746761500835419,
0.1085301861166954,
-0.035483136773109436,
-0.0471944697201252,
-0.06460964679718018,
-0.051827482879161835,
0.10637488961219788,
0.03990902751684189,
0.05394505336880684,
-0.023297978565096855,
-0.13425672054290771,
-0.08266855776309967,
0.0915689542889595,
-0.12585784494876862,
-0.030802413821220398,
-0.041116341948509216,
0.09572907537221909,
-0.1170145720243454,
-0.017191611230373383,
-0.14476460218429565,
-0.09142788499593735,
0.016678892076015472,
-0.1087362989783287,
0.13262678682804108,
0.033570077270269394,
0.13251471519470215,
0.057371024042367935,
-0.07558170706033707,
-0.05692271888256073,
-0.1277412623167038,
-0.052684951573610306,
0.05896797403693199,
-0.1374308466911316,
-0.052842605859041214,
0.0010773877147585154,
0.10308241099119186,
-0.1412201225757599,
0.030577272176742554,
0.026699896901845932,
0.2411772906780243,
0.007088777143508196,
-0.07926720380783081,
-0.0010613842168822885,
-0.10296481847763062,
-0.027760177850723267,
-0.04983449727296829,
-0.05543926730751991,
0.018216565251350403,
-0.06128603592514992,
-0.030210651457309723,
-0.06451571732759476,
-0.11258920282125473,
0.060704927891492844,
-0.11068185418844223,
-0.11407498270273209,
0.007237743586301804,
-0.07315012812614441,
0.01353211048990488,
-0.07376818358898163,
0.033284202218055725,
0.10954005271196365,
0.08147625625133514,
0.10227213054895401,
-0.07880154997110367,
-0.044380009174346924,
0.016845598816871643,
-0.029031572863459587,
-0.05844370648264885,
0.016141574829816818,
0.15105903148651123,
-0.1550699770450592,
0.04375915601849556,
-0.002343891654163599,
-0.030370496213436127,
0.19774849712848663,
0.0553930327296257,
-0.12741592526435852,
-0.07979783415794373,
0.04654974117875099,
0.009497305378317833,
0.04141616076231003,
0.12057235091924667,
0.08506327122449875,
0.0514928363263607,
-0.049767110496759415,
0.03549093380570412,
-0.16654643416404724,
0.01824004389345646,
0.06878150999546051,
-0.08522369712591171,
-0.012336919084191322,
-0.0534251406788826,
0.007460453547537327,
0.10230090469121933,
0.005223921965807676,
0.0011858451180160046,
-0.035882070660591125,
-0.030495231971144676,
-0.02350790984928608,
0.10023840516805649,
0.02271527424454689,
-0.14157666265964508,
-0.13608716428279877,
-0.04687342047691345,
-0.13283808529376984,
-0.0004447700339369476,
-0.055159807205200195,
-0.1606261134147644,
-0.0800459012389183,
-0.015347214415669441,
-0.03678440675139427,
0.04493241012096405,
-0.060336723923683167,
-0.0015614653239026666,
0.08768711239099503,
0.08359064161777496,
-0.04856321960687637,
0.02502136118710041,
0.02564092166721821,
0.028730865567922592,
-0.013451634906232357,
-0.07302868366241455,
0.10262969136238098,
0.12767311930656433,
-0.015177994035184383,
0.043042514473199844,
0.046731848269701004,
0.2001120001077652,
-0.08777797967195511,
0.038071390241384506,
0.14996732771396637,
0.009694088250398636,
0.10860138386487961,
0.10594788193702698,
0.08387161046266556,
-0.029882708564400673,
0.04957793653011322,
0.03788648545742035,
-0.04850941151380539,
-0.11466854810714722,
0.06399187445640564,
-0.10572043806314468,
-0.06579025089740753,
0.1600295901298523,
0.016157634556293488,
0.0903443992137909,
0.102920763194561,
-0.06621827930212021,
0.08619576692581177,
0.11382701992988586,
0.07938561588525772,
-0.006097789853811264,
0.04550580680370331,
0.10675246268510818,
-0.016305189579725266,
-0.15267029404640198,
0.04305727779865265,
0.058100223541259766,
0.11804083734750748,
-0.051465995609760284,
0.16055510938167572,
0.0205056332051754,
0.1215943992137909,
0.05875932052731514,
0.012254243716597557,
-0.000206280208658427,
-0.032579850405454636,
0.0376100167632103,
-0.07579713314771652,
0.0077789356000721455,
0.06634394079446793,
0.05060786008834839,
-0.030968045815825462,
0.0047369361855089664,
0.044502079486846924,
0.05113880708813667,
0.09864210337400436,
0.033961281180381775,
-0.28901249170303345,
0.03890801593661308,
0.019052134826779366,
0.05475426837801933,
-0.0884583443403244,
-0.07180021703243256,
0.16621485352516174,
-0.05732974782586098,
0.08807332813739777,
-0.07235696166753769,
0.0708814412355423,
-0.06377249211072922,
-0.010225621052086353,
0.06559448689222336,
0.11896663904190063,
-0.028601016849279404,
0.10174179077148438,
-0.0490613617002964,
0.08798467367887497,
0.0014088620664551854,
0.005028384737670422,
-0.11919058859348297,
-0.028780747205018997,
0.08138054609298706,
-0.09897974878549576,
0.12717260420322418,
-0.035622112452983856,
-0.07229838520288467,
-0.11949347704648972,
-0.082099549472332,
-0.02110697142779827,
0.1801615059375763,
-0.05909363180398941,
0.08078320324420929,
-0.017385633662343025,
-0.07231207191944122,
0.018159542232751846,
0.03622248396277428,
-0.07977625727653503,
-0.17663289606571198,
0.011769759468734264,
0.046078674495220184,
-0.12401305139064789,
-0.05334014073014259,
-0.07148131728172302,
-0.16182447969913483,
0.24796701967716217,
0.03770386427640915,
0.03793341666460037,
-0.14072063565254211,
0.20083707571029663,
0.13383717834949493,
-0.06126418337225914,
-0.059956006705760956,
-0.045563116669654846,
0.2877812087535858,
-0.018442552536725998,
-0.131387859582901,
-0.0006092785042710602,
0.04094555601477623,
-0.10260538011789322,
-0.062215548008680344,
0.02592606656253338,
0.04997001215815544,
0.04843531921505928,
-0.036284513771533966,
0.06529959291219711,
-0.030923085287213326,
-0.06364648789167404,
0.0012169096153229475,
-0.028320223093032837,
-0.035872265696525574,
0.15873563289642334,
-0.03414851054549217,
0.14507953822612762,
-0.10869712382555008,
0.1593666821718216,
-0.011451809667050838,
0.15540441870689392,
-0.07759927958250046,
-0.12706196308135986,
-0.07102679461240768,
0.023641841486096382,
-0.10652655363082886,
0.030162503942847252,
0.09376358985900879,
0.07605872303247452,
0.009348267689347267,
-0.14190827310085297,
0.09136572480201721,
0.02129238098859787,
-0.015459757298231125,
0.03062543086707592,
-0.09008021652698517,
-0.1132877990603447,
-0.018157098442316055,
0.06351995468139648,
-0.070001982152462,
0.015297239646315575,
-0.021182622760534286,
-0.034625016152858734,
-0.10507091134786606,
0.012505064718425274,
0.009946282021701336,
0.1017049178481102,
0.0201816838234663,
-0.03546954318881035,
0.058248020708560944,
-0.061059851199388504,
0.14943987131118774,
0.0095027144998312,
0.1119091734290123,
-0.06815962493419647,
0.006079875398427248,
-0.014423590153455734,
-0.06701033562421799,
0.07020319998264313,
0.045146096497774124,
0.05178462341427803,
0.004529211670160294,
-0.05883454531431198,
-0.08325789123773575,
-0.06338964402675629,
0.024447035044431686,
0.009838791564106941,
-0.1276167929172516,
0.08832629770040512,
0.04838164523243904,
0.09183179587125778,
0.1683291643857956,
-0.02830411121249199,
0.07426929473876953,
0.10856770724058151,
0.04228310286998749,
0.07031150162220001,
-0.11534549295902252,
-0.09116867929697037,
0.036423247307538986,
0.021357495337724686,
-0.12359996885061264,
0.06427770853042603,
0.14344289898872375,
0.023801680654287338,
0.14649417996406555,
0.0331413671374321,
-0.08622432500123978,
-0.050889380276203156,
0.09836549311876297,
-0.05779996141791344,
-0.07341546565294266,
-0.06959570944309235,
-0.06192111596465111,
-0.18947789072990417,
-0.1738012731075287,
0.08523337543010712,
-0.040771484375,
0.04461357370018959,
0.0485495924949646,
-0.01762693002820015,
-0.019097136333584785,
0.07943469285964966,
0.1483842432498932,
0.038320142775774,
-0.09133458882570267,
0.020692648366093636,
0.1304284632205963,
0.11539023369550705,
0.015363470651209354,
-0.033872876316308975,
-0.04006151109933853,
0.04250306636095047,
0.05770997330546379,
0.18328829109668732,
0.0009789876639842987,
-0.0062096985056996346,
-0.1864381730556488,
-0.10508458316326141,
0.050438042730093,
-0.052530672401189804,
0.06338844448328018,
0.010860984213650227,
-0.08256802707910538,
0.024909190833568573,
-0.14713911712169647,
0.108204185962677,
0.006211093161255121,
0.08473370224237442,
-0.12258312851190567,
0.08014708012342453,
0.03775905445218086,
0.03829259052872658,
-0.035716209560632706,
0.021828802302479744,
-0.0014665551716461778,
-0.0014798840275034308,
-0.08780849725008011,
-0.032356273382902145,
-0.022401204332709312,
-0.007837614044547081,
0.022129258140921593,
-0.018808141350746155,
-0.09013652801513672,
-0.0032313065603375435,
-0.12363044917583466,
-0.07802356034517288,
-0.0008137379190884531,
-0.008195330388844013,
-0.1308513730764389,
0.030538612976670265,
0.12035799771547318,
-0.14521770179271698,
-0.04947974160313606,
-0.0074283345602452755,
-0.0281814131885767,
0.019651340320706367,
-0.11556971073150635,
-0.09051497280597687,
0.04555051773786545,
0.12315137684345245,
-0.03835240751504898,
-0.02971762977540493,
0.09986463189125061,
-0.03862239047884941,
-0.0072367750108242035,
-0.023394370451569557,
-0.14621715247631073,
-0.09127132594585419,
-0.037981513887643814,
-0.06028854101896286,
-0.019257517531514168,
-0.06159874051809311,
0.07755797356367111,
-0.01841885969042778,
0.01484623271971941,
0.06211188808083534,
0.008392405696213245,
0.026249373331665993,
-0.2715234160423279,
-0.00906359776854515,
0.0378783755004406,
-0.07988040894269943,
0.10246650874614716,
-0.09088458120822906,
0.09251943975687027,
-0.0405023992061615,
0.05079108849167824,
0.020369457080960274,
-0.06912747770547867,
0.010272801853716373,
0.07747730612754822,
0.07433939725160599,
0.024813536554574966,
0.0008615560946054757,
0.06343697756528854,
-0.06631722301244736,
0.10831962525844574,
-0.07688868045806885,
0.06381060928106308,
-0.0864366814494133,
0.16634799540042877,
0.06117737665772438,
-0.0764133632183075,
-0.029617134481668472,
0.16866092383861542,
-0.021309712901711464,
0.06716237962245941,
0.12102629989385605,
-0.12602782249450684,
0.03985881805419922,
-0.01875421404838562,
-0.036532316356897354,
0.13551999628543854,
-0.22621400654315948,
0.07851976901292801,
0.05815155804157257,
-0.06660929322242737,
-0.08239611238241196,
-0.19597606360912323,
-0.11423387378454208,
-0.13813789188861847,
0.09511154145002365,
-0.102993443608284,
-0.020173467695713043,
0.03772173821926117,
0.026937415823340416,
0.08074276894330978,
0.13249123096466064,
0.018755130469799042,
0.0035832724533975124,
0.1091865748167038,
-0.01769113913178444,
-0.015415260568261147,
-0.014855186454951763,
-0.04800749942660332,
0.1296428143978119,
0.06158865615725517,
0.09339378774166107,
-0.06758693605661392,
0.06358274817466736,
-0.04349427670240402,
0.012578098103404045,
-0.08772996813058853,
0.0032624362502247095,
-0.11587367951869965,
0.013084592297673225,
0.024914253503084183,
0.007430890575051308,
0.06278015673160553,
-0.05853217467665672,
0.12915290892124176,
-0.044411517679691315,
-0.02847256325185299,
-0.22458578646183014,
-0.009482381865382195,
-0.020226208493113518,
-0.029088227078318596,
0.028224514797329903,
-0.09045038372278214,
-0.05083499103784561,
0.17799372971057892,
-0.003191345604136586,
-0.13899627327919006,
-0.03935857117176056,
0.038353703916072845,
-0.003824682207778096,
0.012563368305563927,
0.02112579718232155,
0.035721685737371445,
0.058376211673021317,
-0.07650938630104065,
-0.04893134906888008,
0.03843488171696663,
-0.05480571836233139,
0.05098244920372963,
0.05549025908112526,
0.06485346704721451,
0.005022793542593718,
-0.14867745339870453,
0.059872835874557495,
-0.008284199051558971,
0.0374925397336483,
0.07189641892910004,
-0.06966603547334671,
-0.16463032364845276,
-0.03194769099354744,
-0.0032481341622769833,
-0.0320245698094368,
-0.0017536598024889827,
-0.061702001839876175,
-0.06283682584762573,
0.11235179752111435,
0.0021511379163712263,
-0.0216215867549181,
-0.007701202295720577,
-0.032664261758327484,
-0.08597009629011154,
0.162313774228096,
0.05464458838105202,
-0.019035549834370613,
0.13409604132175446,
-0.05383715778589249,
-0.04746013134717941,
-0.0017023553373292089,
0.0226103775203228,
-0.01749921776354313,
-0.010966481640934944,
0.268765389919281,
0.013438043184578419,
0.0022210022434592247,
0.10155113786458969,
-0.0520472377538681,
0.06708861142396927,
0.055062711238861084,
0.025787146762013435,
-0.12197796255350113,
0.04222553223371506,
-0.04245612770318985,
0.08011578768491745,
0.24669314920902252,
-0.026633134111762047,
0.031021371483802795,
-0.0108712213113904,
0.02017047442495823,
0.07169736921787262,
0.130962535738945,
0.019784191623330116,
-0.028345925733447075,
-0.049688830971717834,
-0.030472243204712868,
-0.022014454007148743,
-0.22466576099395752,
-0.1644809991121292,
0.08332470059394836,
-0.08687933534383774,
0.05734049528837204,
0.11198601871728897,
0.07960151880979538,
0.10796430706977844,
-0.015946103259921074,
-0.055953361093997955,
-0.034337278455495834,
0.06657955050468445,
-0.12248916178941727,
0.016183089464902878
] |
null | null |
transformers
|
# roberta-base-als-demo
**roberta-base-als-demo** is a model trained by Patrick von Platen to demonstrate how to train a roberta-base model from scratch on the Alemannic language.
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/roberta-base-als-demo)
|
{}
|
fill-mask
|
flax-community/roberta-base-als-demo
|
[
"transformers",
"jax",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
# roberta-base-als-demo
roberta-base-als-demo is a model trained by Patrick von Platen to demonstrate how to train a roberta-base model from scratch on the Alemannic language.
This is part of the
Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## Useful links
- Community Week timeline
- Community Week README
- Masked Language Modelling example scripts
- Model Repository
|
[
"# roberta-base-als-demo\n\nroberta-base-als-demo is a model trained by Patrick von Platen to demonstrate how to train a roberta-base model from scratch on the Alemannic language.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
"TAGS\n#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n",
"# roberta-base-als-demo\n\nroberta-base-als-demo is a model trained by Patrick von Platen to demonstrate how to train a roberta-base model from scratch on the Alemannic language.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
36,
77,
28
] |
[
"passage: TAGS\n#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n# roberta-base-als-demo\n\nroberta-base-als-demo is a model trained by Patrick von Platen to demonstrate how to train a roberta-base model from scratch on the Alemannic language.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## Useful links\n- Community Week timeline\n- Community Week README\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
-0.052080780267715454,
0.010033763013780117,
-0.0016262782737612724,
0.06384909898042679,
0.08757373690605164,
0.05275159701704979,
0.246117502450943,
0.06135523319244385,
0.05640214681625366,
-0.045642152428627014,
0.09997403621673584,
-0.01700238324701786,
-0.003673937637358904,
0.2663153111934662,
0.026689546182751656,
-0.25470876693725586,
-0.016429269686341286,
-0.02681153640151024,
-0.05076109617948532,
0.123028963804245,
0.1589781641960144,
-0.007798317354172468,
0.055283572524785995,
-0.009264931082725525,
-0.09364508837461472,
0.06075435131788254,
0.032073378562927246,
-0.12106543779373169,
0.13097472488880157,
0.04576212167739868,
0.0955832377076149,
-0.02307942695915699,
0.018265942111611366,
0.047745734453201294,
0.059838201850652695,
0.011287812143564224,
-0.026764580979943275,
0.003491070121526718,
-0.0625394731760025,
-0.0319473072886467,
0.18015700578689575,
0.0672328993678093,
-0.013214766047894955,
0.03893328830599785,
-0.21603189408779144,
-0.12680183351039886,
-0.05398225039243698,
0.005947490222752094,
-0.03847750648856163,
0.10480614006519318,
-0.023448141291737556,
0.2130887806415558,
-0.17436367273330688,
0.07710487395524979,
0.16856209933757782,
-0.15179947018623352,
-0.10459136217832565,
0.06871379911899567,
0.15971918404102325,
-0.02007836103439331,
0.01667092554271221,
0.03870900720357895,
0.027559150010347366,
0.015974609181284904,
0.0067506637424230576,
-0.08781291544437408,
-0.0211534071713686,
-0.030088137835264206,
-0.08849388360977173,
-0.017118142917752266,
0.2869119644165039,
-0.020215876400470734,
-0.022751787677407265,
0.037736885249614716,
-0.029993988573551178,
0.1429491639137268,
-0.09541729092597961,
-0.02226618304848671,
-0.020408404991030693,
0.04416191577911377,
-0.08063771575689316,
-0.056186407804489136,
-0.0377335287630558,
-0.010043153539299965,
-0.055444926023483276,
0.28690770268440247,
-0.009156828746199608,
0.025109825655817986,
-0.2147301733493805,
0.00835198163986206,
-0.05711335316300392,
-0.06177808344364166,
-0.008245283737778664,
-0.02171667106449604,
0.061044372618198395,
-0.010415906086564064,
0.010961774736642838,
-0.04245755076408386,
0.11565513163805008,
0.3039137125015259,
0.11129334568977356,
0.00489008380100131,
0.026919055730104446,
0.028628328815102577,
0.03782062232494354,
0.08367566019296646,
0.08784077316522598,
-0.1709001362323761,
0.1048700287938118,
-0.01560296956449747,
0.01396129559725523,
0.005534269846975803,
-0.10138348489999771,
-0.033613212406635284,
-0.02384243533015251,
0.05377846583724022,
0.10058925300836563,
0.11840390413999557,
0.005640636198222637,
-0.005937999114394188,
0.05777885392308235,
-0.09280860424041748,
-0.03145018219947815,
-0.039797116070985794,
0.003609500825405121,
0.0003093943523708731,
0.04328049346804619,
-0.0077140918001532555,
-0.029639096930623055,
-0.04464952275156975,
-0.05826911702752113,
-0.047236472368240356,
-0.08213677257299423,
-0.09124157577753067,
0.0033214690629392862,
-0.1540757268667221,
-0.00875858310610056,
-0.20669390261173248,
-0.1542123556137085,
-0.0018899751594290137,
0.010146437212824821,
-0.012852597050368786,
-0.09343201667070389,
-0.07792539149522781,
0.02124597504734993,
-0.01866449974477291,
-0.025236712768673897,
0.028685787692666054,
-0.052370402961969376,
0.040138985961675644,
0.09291605651378632,
0.08544681966304779,
-0.10876759141683578,
0.04319312050938606,
-0.06350746005773544,
0.008722281083464622,
-0.24127322435379028,
0.018350666388869286,
0.006754700560122728,
0.14541220664978027,
-0.06622809171676636,
0.054772909730672836,
-0.09002856910228729,
0.09540954977273941,
0.03191167116165161,
0.14645831286907196,
-0.07622493803501129,
-0.08724194020032883,
0.19693094491958618,
-0.07236678153276443,
-0.10704311728477478,
0.12659364938735962,
-0.0435471348464489,
0.24805805087089539,
0.0784163773059845,
0.11811017245054245,
0.06264466792345047,
-0.13188940286636353,
0.1416458636522293,
0.057331282645463943,
-0.1222168505191803,
-0.08488374948501587,
-0.0025101739447563887,
0.030124230310320854,
-0.12225490808486938,
0.03614726662635803,
-0.03288082405924797,
0.13998176157474518,
-0.04713428020477295,
-0.00463020708411932,
0.005306713283061981,
-0.07166718691587448,
0.08113238215446472,
-0.02562301605939865,
0.06830966472625732,
-0.09630012512207031,
-0.03413263335824013,
-0.07280978560447693,
0.08023516088724136,
0.04683631658554077,
0.03523143008351326,
-0.11491218954324722,
0.12230239808559418,
0.06749384850263596,
0.050868842750787735,
-0.11774083971977234,
-0.008853347972035408,
-0.012610706500709057,
0.2713874280452728,
0.07038435339927673,
0.06468906253576279,
0.037764646112918854,
-0.1507713496685028,
-0.030097726732492447,
0.06336437910795212,
0.0046228328719735146,
-0.024900363758206367,
-0.00013971226871944964,
-0.09535510092973709,
0.045280441641807556,
-0.06357655674219131,
0.029568161815404892,
-0.10030288249254227,
0.00916854664683342,
-0.07254642993211746,
0.05050600692629814,
-0.006851993966847658,
0.06511978060007095,
-0.015110605396330357,
0.04347555711865425,
-0.04914509132504463,
-0.04818040877580643,
0.08170410990715027,
-0.00817116443067789,
-0.03721267729997635,
0.19037149846553802,
-0.09586000442504883,
-0.008057473227381706,
0.11986279487609863,
-0.1097966656088829,
-0.0936674177646637,
0.10858215391635895,
-0.02445678599178791,
-0.028664853423833847,
0.017885122448205948,
-0.03155047819018364,
0.09185953438282013,
-0.03399742394685745,
0.10439443588256836,
-0.02878090739250183,
0.01610923931002617,
0.003333800006657839,
-0.1265518218278885,
0.005223491229116917,
0.08345751464366913,
-0.04250464215874672,
-0.10745621472597122,
0.06297209113836288,
0.10351737588644028,
0.008614434860646725,
0.19393856823444366,
0.003359854221343994,
0.022508839145302773,
-0.009317486546933651,
-0.03829231858253479,
0.029670853167772293,
0.0460333488881588,
-0.12490693479776382,
-0.11714453250169754,
-0.02997371181845665,
0.0066385031677782536,
0.006773615721613169,
-0.04408620670437813,
-0.02872132882475853,
-0.05730852857232094,
-0.013043236918747425,
-0.027045264840126038,
0.1165941059589386,
-0.041612472385168076,
0.07649081200361252,
0.06929627060890198,
-0.07725150883197784,
0.08903559297323227,
0.012860890477895737,
-0.06494471430778503,
0.1425001472234726,
-0.03987954184412956,
-0.23121407628059387,
-0.08939337730407715,
-0.17593611776828766,
0.0842813029885292,
0.010169991292059422,
0.09202951937913895,
-0.010017418302595615,
-0.010210893116891384,
-0.01912371814250946,
0.09974142163991928,
-0.016016241163015366,
-0.013362286612391472,
-0.07592599838972092,
0.0533754825592041,
0.003709806129336357,
-0.10895827412605286,
-0.049581389874219894,
-0.031138025224208832,
-0.0236318651586771,
0.12032625079154968,
-0.22425706684589386,
0.061493292450904846,
0.047427769750356674,
-0.01859305240213871,
0.04355596378445625,
-0.04240281879901886,
0.22793138027191162,
-0.13103331625461578,
0.05297313258051872,
0.18451617658138275,
-0.038132745772600174,
0.006149739492684603,
0.01593393087387085,
-0.034209951758384705,
-0.05100341513752937,
0.08660949021577835,
-0.0002911785850301385,
-0.15268714725971222,
-0.15218690037727356,
-0.060652248561382294,
-0.1321416050195694,
0.057203855365514755,
0.05452984571456909,
0.0675692930817604,
0.022764461115002632,
0.03266608715057373,
0.02855466492474079,
0.040159113705158234,
-0.00509920297190547,
0.05707668140530586,
0.10729441791772842,
-0.010184066370129585,
0.06033547967672348,
-0.08820275962352753,
-0.08194135874509811,
0.061415255069732666,
0.04119112715125084,
0.13504821062088013,
0.0612235851585865,
0.018482591956853867,
0.052411358803510666,
0.03861791640520096,
0.09625077247619629,
0.03507209196686745,
0.03757482394576073,
0.002767528174445033,
-0.07671546190977097,
-0.020083390176296234,
-0.02578577771782875,
0.055011920630931854,
-0.00497243320569396,
-0.10732091963291168,
0.016377000138163567,
-0.04825596138834953,
0.05489053577184677,
0.2976892292499542,
0.0243048295378685,
-0.17930032312870026,
-0.07205609977245331,
0.017249200493097305,
-0.014331032522022724,
0.003425140166655183,
0.023025287315249443,
-0.07581547647714615,
-0.09858889877796173,
0.03449258580803871,
0.02633882500231266,
0.1316658854484558,
0.0004460567724891007,
0.028079280629754066,
-0.11014121770858765,
0.10057879239320755,
-0.026394125074148178,
0.050529368221759796,
-0.36489561200141907,
0.2568592131137848,
-0.035193394869565964,
0.10139399021863937,
-0.059546008706092834,
-0.03993190452456474,
0.0699329823255539,
0.1348601132631302,
0.06822811812162399,
0.06096356362104416,
-0.0195242241024971,
0.02106623165309429,
-0.10310614854097366,
0.04434722661972046,
-0.04578716307878494,
-0.01741025038063526,
0.034439053386449814,
-0.02730577252805233,
-0.01976466365158558,
0.029604244977235794,
-0.0432761088013649,
-0.18513262271881104,
-0.03599086403846741,
0.03282799944281578,
0.11809863895177841,
-0.09265606850385666,
-0.031814564019441605,
-0.06336868554353714,
-0.036190662533044815,
0.08918660134077072,
0.14080412685871124,
0.011415805667638779,
-0.11362606287002563,
0.0028119708877056837,
0.0019996599294245243,
-0.06029943749308586,
0.028378382325172424,
0.012800537049770355,
-0.0462341345846653,
-0.06268876791000366,
-0.09803048521280289,
0.15208768844604492,
-0.10208609700202942,
-0.05289150029420853,
-0.045244842767715454,
0.006730473134666681,
0.128922238945961,
-0.014847957529127598,
0.011608314700424671,
0.002022602828219533,
-0.01594257354736328,
-0.00020367070101201534,
-0.0447089746594429,
-0.02800084836781025,
-0.041117049753665924,
-0.03950554132461548,
-0.05717921629548073,
-0.05652322992682457,
-0.03084690310060978,
-0.05171392485499382,
0.03353435546159744,
0.11420849710702896,
-0.039026301354169846,
0.06790653616189957,
0.12257194519042969,
-0.02845573052763939,
-0.32103779911994934,
-0.036475639790296555,
-0.04951676353812218,
0.04371611028909683,
0.03588070347905159,
-0.1481369435787201,
0.07941954582929611,
-0.05808931589126587,
-0.025383559986948967,
0.029672469943761826,
-0.13993124663829803,
-0.08813871443271637,
0.18184879422187805,
-0.01976177841424942,
0.45360320806503296,
-0.054347097873687744,
-0.016897136345505714,
-0.05448978394269943,
-0.11506020277738571,
-0.04113703593611717,
-0.14299078285694122,
0.08594398945569992,
-0.020017854869365692,
0.05980776250362396,
-0.035771798342466354,
0.004421714693307877,
0.01771332137286663,
0.06921756267547607,
-0.024483434855937958,
-0.10780132561922073,
-0.15075761079788208,
0.11639384925365448,
-0.045300114899873734,
0.030231792479753494,
-0.05604620650410652,
0.014215825125575066,
-0.09370727837085724,
-0.035430602729320526,
-0.03947016969323158,
0.17510974407196045,
-0.003523517632856965,
-0.027458515018224716,
0.018395565450191498,
0.04479828104376793,
-0.07999920099973679,
0.012180649675428867,
0.11305790394544601,
-0.11212211847305298,
0.08912041783332825,
-0.027373939752578735,
0.12712907791137695,
-0.034245945513248444,
0.04918557405471802,
0.005387341603636742,
-0.05226709321141243,
0.09205099195241928,
-0.08368035405874252,
-0.02772812731564045,
-0.004368536174297333,
-0.005596699193120003,
0.07867519557476044,
0.0646275132894516,
-0.1009674146771431,
0.05140145868062973,
0.08695052564144135,
-0.08364550769329071,
0.01565949060022831,
-0.010444924235343933,
-0.11655275523662567,
-0.03127637133002281,
0.02857537753880024,
0.1843230426311493,
-0.021811114624142647,
-0.042850714176893234,
-0.03110879473388195,
-0.008791627362370491,
-0.08017206937074661,
0.018224505707621574,
0.06770613044500351,
0.0173770934343338,
-0.05421234667301178,
0.028222396969795227,
0.010923424735665321,
-0.005546862725168467,
0.016980057582259178,
0.2092251479625702,
-0.09676468372344971,
-0.1231689602136612,
-0.031022921204566956,
0.22216366231441498,
-0.06655386090278625,
-0.0752495676279068,
-0.10112622380256653,
-0.08380798995494843,
0.07951442152261734,
0.17736420035362244,
0.06648214906454086,
0.03674939274787903,
-0.029271485283970833,
0.013665731996297836,
-0.015094195492565632,
-0.0006961420294828713,
0.12375324964523315,
0.009785804897546768,
0.062330201268196106,
-0.008678852580487728,
0.0067960238084197044,
0.1114872470498085,
-0.10842767357826233,
-0.06091392785310745,
-0.20260071754455566,
0.04353709891438484,
-0.019238846376538277,
-0.0021088782232254744,
-0.06547825783491135,
-0.0036216238513588905,
-0.017832763493061066,
-0.057680875062942505,
-0.014720682986080647,
0.044200699776411057,
-0.0737859234213829,
0.011171284131705761,
-0.0006090769893489778,
0.07090982049703598,
0.00423005223274231,
-0.005128064192831516,
0.06065528839826584,
-0.04033878445625305,
0.13509583473205566,
0.07475379854440689,
-0.07154888659715652,
0.0026990952901542187,
-0.1783800572156906,
-0.02186974138021469,
-0.020390240475535393,
-0.009072637185454369,
0.049000781029462814,
0.053770169615745544,
0.01050795428454876,
-0.006122383754700422,
0.14446932077407837,
0.023487379774451256,
0.06118622049689293,
-0.09065153449773788,
0.08523229509592056,
-0.013556300662457943,
-0.1504579484462738,
-0.007696863729506731,
0.004534568637609482,
0.0579589419066906,
0.08290363848209381,
0.05111682042479515,
-0.12335973232984543,
0.03589525446295738,
-0.027393339201807976,
0.02818797342479229,
-0.016640618443489075,
-0.13526786863803864,
-0.1313202828168869,
-0.0850544199347496,
0.034954071044921875,
0.025021638721227646,
0.1109282448887825,
0.12704020738601685,
-0.002669490408152342,
0.02710651233792305,
-0.04863330349326134,
-0.012867088429629803,
-0.008670037612318993,
0.13144074380397797,
0.01698114350438118,
-0.009062211029231548,
-0.06303031742572784,
0.027475625276565552,
-0.0032972756307572126,
-0.06485830992460251,
0.06617311388254166,
0.10030541568994522,
0.12228603661060333,
0.12057015299797058,
0.01694074645638466,
0.0471876859664917,
0.061643991619348526,
-0.058768924325704575,
-0.01479742955416441,
0.06902419030666351,
-0.04954424872994423,
0.038784224539995193,
0.13930095732212067,
0.05300312116742134,
0.03621669113636017,
-0.04248993843793869,
-0.012800198048353195,
-0.12931843101978302,
-0.07506321370601654,
-0.0734763965010643,
-0.10200059413909912,
-0.013460811227560043,
-0.0367707684636116,
-0.051301419734954834,
0.015974136069417,
0.006784578785300255,
-0.05073583871126175,
0.011702386662364006,
0.04271303117275238,
-0.06181016191840172,
0.09397044777870178,
-0.06361451745033264,
0.019686758518218994,
-0.0677691251039505,
-0.03338603302836418,
-0.08489168435335159,
0.06809020042419434,
-0.023657502606511116,
0.0050148931331932545,
-0.007192305289208889,
0.04791952669620514,
-0.07207997143268585,
-0.08527980744838715,
-0.02673526294529438,
0.04995861276984215,
0.01025429181754589,
0.06557207554578781,
0.04219747707247734,
-0.05816086754202843,
0.021517178043723106,
0.14523524045944214,
-0.0011234963312745094,
-0.1966434270143509,
-0.10021478682756424,
0.19561627507209778,
-0.0571088008582592,
0.0046076993457973,
-0.1686049997806549,
-0.006093729753047228,
0.004552628379315138,
0.30045294761657715,
0.3602205216884613,
0.016306424513459206,
0.03604600206017494,
-0.06184700131416321,
0.01773853600025177,
0.0453835166990757,
0.16119086742401123,
0.028097616508603096,
0.07272979617118835,
0.016709892079234123,
-0.14176172018051147,
-0.04511352255940437,
0.009165853261947632,
-0.11557979881763458,
-0.016618069261312485,
0.04000736027956009,
-0.017790094017982483,
-0.014008110389113426,
0.09455014765262604,
-0.06801135838031769,
-0.07084863632917404,
-0.022299164906144142,
-0.09520263224840164,
-0.06535401195287704,
-0.04931566119194031,
-0.035517431795597076,
0.1105499416589737,
0.1406761258840561,
-0.05034668371081352,
-0.018787575885653496,
0.05682414770126343,
-0.027134623378515244,
-0.09613382071256638,
-0.07635670155286789,
0.12269356101751328,
-0.04910380393266678,
0.09518568217754364,
-0.08496980369091034,
-0.013709072023630142,
0.09081078320741653,
-0.00040096085285767913,
-0.06884976476430893,
0.06320956349372864,
-0.02129555679857731,
0.10566044598817825,
0.04039296880364418,
-0.05167026445269585,
-0.06457150727510452,
-0.038540467619895935,
0.034301262348890305,
-0.08448436856269836,
0.04631573334336281,
0.014198378659784794,
0.005792682059109211,
-0.11456993967294693,
0.07331522554159164,
-0.11628652364015579,
0.1192687451839447,
0.08529427647590637,
-0.03022313117980957,
-0.08367261290550232,
-0.033791448920965195,
-0.016817033290863037,
0.011494757607579231,
-0.07103271037340164,
-0.10117403417825699,
-0.11374048888683319,
-0.07927866280078888,
-0.01735011674463749,
-0.05053112655878067,
-0.27210819721221924,
0.001734614372253418,
-0.1789412498474121,
0.00982616376131773,
0.010083195753395557,
0.03707952797412872,
0.12010804563760757,
0.03825880214571953,
-0.0018644473748281598,
-0.0977504625916481,
0.04585343971848488,
0.11352939903736115,
-0.13625352084636688,
-0.0788232758641243
] |
null | null |
transformers
|
This project pretrains a [`roberta-base`](https://huggingface.co/roberta-base) on the *Alemannic* (`als`) data subset of the [OSCAR](https://oscar-corpus.com/) corpus in JAX/Flax.
We will be using the masked-language modeling loss for pretraining.
|
{}
|
fill-mask
|
flax-community/roberta-base-als
|
[
"transformers",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
This project pretrains a 'roberta-base' on the *Alemannic* ('als') data subset of the OSCAR corpus in JAX/Flax.
We will be using the masked-language modeling loss for pretraining.
|
[] |
[
"TAGS\n#transformers #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
40
] |
[
"passage: TAGS\n#transformers #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.0329417809844017,
0.026542233303189278,
-0.00711836339905858,
0.038137588649988174,
0.11221930384635925,
0.02897702343761921,
0.1600077599287033,
0.09312579035758972,
0.08267447352409363,
0.012559163384139538,
0.16161218285560608,
0.20280829071998596,
-0.0048243035562336445,
0.12177977710962296,
-0.05178572237491608,
-0.25585609674453735,
0.03286115825176239,
0.058682624250650406,
-0.1410491019487381,
0.0843924805521965,
0.05530610308051109,
-0.06490170210599899,
0.0693613812327385,
-0.006780317053198814,
-0.18010543286800385,
0.06706380099058151,
0.09719173610210419,
-0.1062551736831665,
0.10088774561882019,
0.06989994645118713,
0.24005752801895142,
0.0490260012447834,
-0.046509090811014175,
-0.07854389399290085,
0.05274910852313042,
0.03130975738167763,
-0.09874213486909866,
0.0559147484600544,
0.005066406913101673,
-0.07943262159824371,
-0.04190054163336754,
0.04702925682067871,
0.0554969422519207,
0.04106675833463669,
-0.14591358602046967,
-0.0946396142244339,
-0.02599331922829151,
-0.02002599649131298,
0.057347334921360016,
0.024071110412478447,
0.035029374063014984,
0.17755983769893646,
-0.04925362765789032,
0.08160373568534851,
0.1280364990234375,
-0.33813905715942383,
-0.033818501979112625,
0.14055299758911133,
0.11565420776605606,
-0.013478475622832775,
-0.059152938425540924,
0.07915961742401123,
0.03596077859401703,
-0.0003366783494129777,
0.08002990484237671,
-0.0912376418709755,
-0.032501399517059326,
0.02532663382589817,
-0.07183554768562317,
0.017986366525292397,
0.12927962839603424,
-0.0306392889469862,
0.057966962456703186,
-0.016302119940519333,
-0.10481366515159607,
-0.0679694339632988,
-0.03882995992898941,
-0.04771385341882706,
-0.026734041050076485,
0.060449954122304916,
-0.06608770787715912,
-0.05916031450033188,
-0.11076071113348007,
-0.0002449859166517854,
-0.2071940004825592,
0.2258065789937973,
-0.011012796312570572,
0.058538056910037994,
-0.19426493346691132,
-0.0028168775606900454,
-0.04991975799202919,
-0.12814103066921234,
0.06626702100038528,
-0.0758582279086113,
-0.05131407827138901,
-0.029607797041535378,
-0.037985991686582565,
-0.2090919464826584,
0.0911528542637825,
0.1520228236913681,
0.12341899424791336,
0.07141652703285217,
0.019024917855858803,
0.10115666687488556,
-0.006738598924130201,
0.0930805653333664,
0.021433617919683456,
-0.05360095575451851,
0.0431477315723896,
-0.09618829935789108,
0.04374590888619423,
-0.0781574472784996,
-0.16308189928531647,
-0.005659256596118212,
-0.011701812967658043,
0.050567541271448135,
0.033087972551584244,
0.04961474984884262,
-0.08506055176258087,
0.0025867184158414602,
0.04133603721857071,
-0.05699349567294121,
0.028625881299376488,
-0.05677447468042374,
0.08145716786384583,
0.07058990001678467,
0.017480045557022095,
-0.004342625383287668,
0.048279620707035065,
0.0715206041932106,
-0.09575440734624863,
-0.04878830909729004,
-0.08688462525606155,
-0.08637619018554688,
0.04197423532605171,
-0.13614904880523682,
0.03742203488945961,
-0.15612760186195374,
-0.1164647787809372,
0.03545556962490082,
0.0993972048163414,
-0.03433714807033539,
0.0003386830794624984,
0.02327338606119156,
-0.023087674751877785,
0.07320494204759598,
-0.026122087612748146,
-0.046372219920158386,
-0.01975957304239273,
0.04370496794581413,
0.0025725560262799263,
0.13983024656772614,
-0.09680947661399841,
0.019098615273833275,
-0.044807013124227524,
0.03672024980187416,
-0.18818555772304535,
-0.03147725760936737,
-0.06716258823871613,
0.1578308492898941,
-0.02359890379011631,
0.034214165061712265,
-0.14269879460334778,
0.05533367022871971,
0.004259740933775902,
0.1205076053738594,
-0.13285617530345917,
-0.11227773874998093,
0.21861238777637482,
-0.09039504081010818,
-0.13327087461948395,
0.08300018310546875,
-0.007349212653934956,
0.055108118802309036,
0.030452338978648186,
0.11197812110185623,
0.06301248073577881,
-0.13569188117980957,
0.0872018039226532,
0.10973144322633743,
-0.13754820823669434,
-0.13189132511615753,
0.005886216182261705,
-0.0013933381997048855,
-0.07497121393680573,
0.014766803942620754,
0.160347580909729,
0.10578573495149612,
-0.043757364153862,
-0.05901378020644188,
-0.014996325597167015,
-0.04234796389937401,
0.1145983412861824,
0.05885425582528114,
0.11323011666536331,
-0.06954333186149597,
-0.06114992871880531,
0.009470291435718536,
-0.023148994892835617,
0.02481711469590664,
0.011324401944875717,
-0.1064189150929451,
0.09146500378847122,
-0.14957384765148163,
-0.0032007808331400156,
-0.15573877096176147,
-0.18682269752025604,
-0.019669538363814354,
0.02760804258286953,
-0.0022204946726560593,
0.1577223837375641,
0.15129147469997406,
-0.060254111886024475,
-0.024257458746433258,
0.02333754301071167,
0.12638860940933228,
0.04207634553313255,
-0.05421745404601097,
-0.14566732943058014,
0.044276416301727295,
-0.119206503033638,
-0.01294864621013403,
-0.05911770835518837,
0.009324598126113415,
0.025674516335129738,
0.1400611400604248,
0.07025644183158875,
0.028108857572078705,
-0.05200561136007309,
0.012593770399689674,
-0.05670996755361557,
-0.021270576864480972,
0.058658089488744736,
0.0048630740493535995,
-0.0788891538977623,
0.16158245503902435,
-0.15949708223342896,
0.3215492069721222,
0.1696743369102478,
-0.21568173170089722,
-0.0563850998878479,
0.054888464510440826,
0.015413806773722172,
-0.0027446781750768423,
0.04674471542239189,
0.0033388317096978426,
-0.0045325192622840405,
-0.011051229201257229,
0.12423443794250488,
0.000797667948063463,
-0.02101554162800312,
0.057030320167541504,
-0.07420527935028076,
-0.08675435930490494,
0.005113702733069658,
0.12135670334100723,
-0.14895130693912506,
0.17891345918178558,
0.238571897149086,
-0.06365619599819183,
0.20080290734767914,
0.024014176800847054,
0.001189496018923819,
-0.020358828827738762,
-0.03713710233569145,
0.02769448794424534,
0.08871173113584518,
-0.15567117929458618,
-0.060173749923706055,
0.037629976868629456,
-0.04739832505583763,
0.04000844061374664,
-0.11649611592292786,
-0.03460264578461647,
0.03008829429745674,
0.06542804092168808,
0.020448757335543633,
0.11835699528455734,
-0.012261602096259594,
0.049001358449459076,
-0.0019192154286429286,
-0.09640132635831833,
0.10651907324790955,
-0.0027116378769278526,
-0.027264205738902092,
0.143388569355011,
-0.09603472054004669,
-0.2696317434310913,
-0.15564101934432983,
-0.17344313859939575,
0.008850439451634884,
0.03914600983262062,
0.03435918316245079,
-0.08342444151639938,
-0.06700769066810608,
0.05808442458510399,
0.0037851380184292793,
-0.018112126737833023,
0.08884227275848389,
-0.052343133836984634,
0.036485932767391205,
-0.032904256135225296,
-0.058132950216531754,
-0.04966353997588158,
-0.021732419729232788,
-0.004153992515057325,
0.09872038662433624,
-0.0798579752445221,
0.0944928452372551,
0.14915959537029266,
0.009686815552413464,
0.05110333114862442,
-0.0020896715577691793,
0.11070442199707031,
-0.0969654843211174,
0.01849600486457348,
0.11530912667512894,
-0.05120840296149254,
0.0672340840101242,
0.1833212524652481,
0.04066582769155502,
-0.05863400921225548,
0.00004317746788728982,
-0.01446886733174324,
-0.1414872258901596,
-0.16441215574741364,
-0.03357212617993355,
-0.13743248581886292,
-0.011657393537461758,
0.06262750178575516,
0.05398807302117348,
0.1830873340368271,
0.11750183254480362,
0.09331076592206955,
0.0563841238617897,
-0.053589582443237305,
0.04585922881960869,
0.10292338579893112,
0.0002946934837382287,
0.14973466098308563,
-0.06848056614398956,
-0.14885416626930237,
0.035029079765081406,
0.06752877682447433,
0.0934106856584549,
0.14763127267360687,
0.1045582965016365,
0.02793722040951252,
0.12380506843328476,
0.16854719817638397,
0.11065017431974411,
0.020628221333026886,
-0.0995962917804718,
0.004203937482088804,
-0.004210996441543102,
-0.002989003900438547,
0.05568595230579376,
0.14846350252628326,
-0.09294731169939041,
0.03250674903392792,
-0.08800415694713593,
0.03970550000667572,
0.10980550944805145,
0.082148477435112,
-0.2668040096759796,
0.012463385239243507,
0.08737748861312866,
0.007024845108389854,
-0.053780246526002884,
0.022772585973143578,
0.0160111952573061,
-0.06515001505613327,
0.0661458671092987,
-0.09313037246465683,
0.07646811008453369,
0.026234209537506104,
0.036398231983184814,
-0.024195779114961624,
-0.010341323912143707,
0.004674872849136591,
0.027779730036854744,
-0.2321990579366684,
0.24279189109802246,
0.017681719735264778,
-0.0081409253180027,
-0.0658031553030014,
0.0227928776293993,
0.050465118139982224,
0.08369702845811844,
0.12745453417301178,
-0.014776632189750671,
-0.13642984628677368,
-0.06216294690966606,
0.003177864942699671,
0.012125691398978233,
0.09781407564878464,
-0.027841059491038322,
-0.007532401476055384,
-0.03284532204270363,
-0.05321016535162926,
0.03365150839090347,
0.04990879073739052,
-0.012970046140253544,
-0.16553451120853424,
0.07794158905744553,
0.04696020111441612,
-0.10474248230457306,
0.009847830049693584,
-0.057894304394721985,
-0.12084672600030899,
0.24245573580265045,
-0.040688056498765945,
0.003777880920097232,
-0.13147220015525818,
-0.08801491558551788,
0.06516817957162857,
-0.08013751357793808,
0.09556079655885696,
-0.07300305366516113,
0.04892462119460106,
-0.07796036452054977,
-0.19317540526390076,
0.19069314002990723,
-0.11103568226099014,
0.04609302058815956,
-0.1309530884027481,
0.07215578109025955,
-0.07803495228290558,
0.0324113704264164,
0.0007506720721721649,
0.059280965477228165,
-0.05033242702484131,
-0.026564868167042732,
0.05381704494357109,
-0.05967836081981659,
-0.009379270486533642,
-0.046744588762521744,
-0.0187015812844038,
-0.023807883262634277,
0.04482445493340492,
0.03113146498799324,
0.193584144115448,
0.22225938737392426,
-0.09085509181022644,
0.12574277818202972,
0.13287468254566193,
-0.04152733087539673,
-0.34996771812438965,
-0.03576524183154106,
-0.1204613745212555,
0.021520240232348442,
0.03184865787625313,
-0.10245911777019501,
0.11522993445396423,
-0.027661746367812157,
-0.05777546390891075,
0.1334868222475052,
-0.14204467833042145,
-0.11581980437040329,
0.18468722701072693,
0.05532653629779816,
0.43889808654785156,
-0.14414343237876892,
-0.07870311290025711,
-0.0578923262655735,
-0.11119277775287628,
0.07418956607580185,
-0.061539266258478165,
0.09017122536897659,
0.015551939606666565,
0.018525732681155205,
0.022814257070422173,
-0.08536908030509949,
0.09922046214342117,
-0.0843663141131401,
0.03398345410823822,
-0.11066484451293945,
-0.07639732956886292,
0.1308639496564865,
-0.029567217454314232,
-0.04608828201889992,
-0.027788929641246796,
-0.03186996281147003,
-0.0017915869830176234,
-0.02505946345627308,
-0.053956903517246246,
0.10293137282133102,
0.03824806585907936,
-0.0698898583650589,
0.012296374887228012,
-0.010622386820614338,
-0.024140171706676483,
-0.026431895792484283,
0.3188876807689667,
-0.006681607570499182,
0.20690536499023438,
0.1331115961074829,
0.0482148677110672,
-0.1066153347492218,
-0.04168606176972389,
-0.014310465194284916,
-0.09378211945295334,
0.09637784957885742,
-0.09964105486869812,
0.05397867038846016,
0.06886148452758789,
-0.011482715606689453,
0.055120646953582764,
0.11174727231264114,
-0.017480792477726936,
0.00358504056930542,
0.17993944883346558,
-0.17305320501327515,
-0.0436120480298996,
-0.004973283037543297,
-0.04108631610870361,
0.035963788628578186,
0.07194235920906067,
0.1188579723238945,
0.030546389520168304,
-0.008778760209679604,
0.018035104498267174,
-0.020421087741851807,
-0.06356093287467957,
0.040820587426424026,
0.0867374837398529,
0.030343370512127876,
-0.08746147900819778,
0.017309723421931267,
-0.002849958138540387,
-0.22730304300785065,
0.0018976981518790126,
0.07054683566093445,
-0.0756794735789299,
-0.10106708854436874,
0.05153464153409004,
0.17366249859333038,
-0.07233040034770966,
-0.02043299376964569,
-0.10530085861682892,
-0.0859653651714325,
0.01932416297495365,
0.2599613070487976,
0.07996602356433868,
0.04919706657528877,
-0.006001228466629982,
0.010721523314714432,
-0.03668977692723274,
0.0064100041054189205,
-0.005976090207695961,
0.06364626437425613,
-0.12335670739412308,
-0.06737135350704193,
-0.025361256673932076,
0.10906196385622025,
-0.1175532266497612,
-0.021820068359375,
-0.19609995186328888,
0.003788813715800643,
-0.048934828490018845,
-0.08043240755796432,
-0.09454052895307541,
-0.0712772011756897,
0.030607396736741066,
-0.06858336180448532,
-0.09425429999828339,
-0.04390277713537216,
-0.11505849659442902,
0.006931390147656202,
0.03798709809780121,
-0.00991776678711176,
-0.04642646014690399,
-0.038071345537900925,
0.0971149429678917,
-0.036438025534152985,
0.03766728565096855,
0.06420671194791794,
-0.015217851847410202,
0.09234844893217087,
-0.14074084162712097,
-0.055803894996643066,
0.07419854402542114,
-0.0001249401393579319,
0.10154250264167786,
0.07490864396095276,
0.0047249216586351395,
0.024386215955018997,
0.0710872933268547,
0.03605786710977554,
0.012073192745447159,
-0.0766444131731987,
0.06728431582450867,
-0.04853279888629913,
-0.18079568445682526,
-0.04013540968298912,
-0.03497268259525299,
0.10346321761608124,
-0.006970696616917849,
0.12889499962329865,
-0.047932807356119156,
0.06254623830318451,
-0.06252668052911758,
0.02636006660759449,
-0.0123911676928401,
-0.14529670774936676,
0.06354433298110962,
-0.03632514178752899,
-0.005106714088469744,
-0.040746547281742096,
0.19743973016738892,
0.011635620146989822,
-0.0078099872916936874,
0.0483575314283371,
0.02005455642938614,
0.018608389422297478,
0.030310047790408134,
0.15789169073104858,
0.09051284193992615,
-0.04679751396179199,
-0.10117464512586594,
0.11111433804035187,
0.05160742625594139,
0.009706846438348293,
0.08366572111845016,
0.055155668407678604,
-0.03183482214808464,
0.1412523090839386,
0.03313356265425682,
0.034679386764764786,
-0.06527009606361389,
-0.04731703922152519,
-0.07568445801734924,
0.06466006487607956,
0.033717311918735504,
-0.03156046196818352,
0.21887484192848206,
-0.008818339556455612,
0.03655129298567772,
-0.0579502210021019,
-0.03947729989886284,
-0.19691532850265503,
-0.18059898912906647,
-0.11041805893182755,
-0.07780896127223969,
0.03130476921796799,
-0.00352695444598794,
-0.010392709635198116,
0.08976829051971436,
0.043872859328985214,
0.0027831001207232475,
0.15485870838165283,
0.027360044419765472,
0.01569478027522564,
-0.0013452027924358845,
0.004746637772768736,
-0.020908985286951065,
0.004002862144261599,
0.0018517757998779416,
-0.1662002056837082,
0.0019254707731306553,
-0.0795445591211319,
-0.017344841733574867,
-0.03582914173603058,
0.036542486399412155,
-0.06595970690250397,
-0.14312049746513367,
-0.0464862585067749,
0.03731429949402809,
-0.04799088463187218,
0.021890629082918167,
0.005000287666916847,
0.03249432146549225,
0.008510300889611244,
0.13224242627620697,
-0.0820256769657135,
-0.014926344156265259,
-0.10331322997808456,
0.0904620960354805,
0.005321953911334276,
0.12144656479358673,
-0.060339901596307755,
0.0025539444759488106,
-0.05917980894446373,
0.27539145946502686,
0.30147379636764526,
-0.04355388134717941,
0.07376430183649063,
0.06271875649690628,
0.020537329837679863,
0.001660053851082921,
0.10591702908277512,
0.009405020624399185,
0.22131659090518951,
-0.08691644668579102,
-0.07998073846101761,
-0.017320208251476288,
-0.050044406205415726,
-0.08612151443958282,
0.04039651155471802,
0.044123828411102295,
-0.0028584159445017576,
-0.04286675527691841,
0.07884115725755692,
-0.13355590403079987,
0.07149451971054077,
0.10651414841413498,
-0.2095051258802414,
-0.07272254675626755,
-0.013771957717835903,
0.18716616928577423,
-0.03204018622636795,
0.10223958641290665,
-0.03657589852809906,
-0.09336363524198532,
-0.023238051682710648,
0.01674238033592701,
-0.20283789932727814,
-0.035098981112241745,
0.06426454335451126,
-0.01734243519604206,
0.09623444825410843,
-0.04927781596779823,
-0.02305767685174942,
0.12261020392179489,
0.05312138423323631,
-0.011299178004264832,
-0.004889477510005236,
0.024893825873732567,
-0.09537551552057266,
-0.038807954639196396,
0.0698898658156395,
-0.02163190394639969,
-0.05748581513762474,
0.05045942962169647,
-0.12234541773796082,
0.04941118136048317,
-0.18707086145877838,
-0.03743040934205055,
-0.0005046811420470476,
0.04900933429598808,
-0.0479026697576046,
0.09100630134344101,
0.06282646954059601,
0.048486288636922836,
-0.012976630590856075,
-0.011964566074311733,
-0.004679829813539982,
0.0755200982093811,
-0.02989252097904682,
-0.18321958184242249,
-0.0945415049791336,
-0.05113792419433594,
0.032525435090065,
-0.006086435634642839,
-0.20438963174819946,
-0.05459880456328392,
-0.08814369887113571,
0.0024600683245807886,
-0.14107996225357056,
0.0435638353228569,
0.16646583378314972,
0.0523197315633297,
-0.01654839515686035,
-0.11001645773649216,
0.04320191964507103,
0.03550111874938011,
-0.15934661030769348,
-0.1052287295460701
] |
null | null |
transformers
|
# RøBÆRTa - Danish Roberta Base
## Description
RøBÆRTa is a danish pretrained Roberta base model. RøBÆRTa was pretrained on the danish mC4 dataset during the flax community week. This project was organized by Dansk Data Science Community (DDSC) 👇 <br><br>
https://www.linkedin.com/groups/9017904/
## Team RøBÆRTa:
- Dan Saattrup Nielsen (saattrupdan)
- Malte Højmark-Bertelsen (Maltehb)
- Morten Kloster Pedersen (MortenKP)
- Kasper Junge (Juunge)
- Per Egil Kummervold (pere)
- Birger Moëll (birgermoell)
---
|
{"language": "da", "license": "cc-by-4.0", "tags": ["danish", "roberta"], "pipeline_tag": "fill-mask", "widget": [{"text": "P\u00e5 biblioteket kan du l\u00e5ne en <mask>."}]}
|
fill-mask
|
DDSC/roberta-base-danish
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"roberta",
"fill-mask",
"danish",
"da",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"da"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #danish #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
|
# RøBÆRTa - Danish Roberta Base
## Description
RøBÆRTa is a danish pretrained Roberta base model. RøBÆRTa was pretrained on the danish mC4 dataset during the flax community week. This project was organized by Dansk Data Science Community (DDSC) <br><br>
URL
## Team RøBÆRTa:
- Dan Saattrup Nielsen (saattrupdan)
- Malte Højmark-Bertelsen (Maltehb)
- Morten Kloster Pedersen (MortenKP)
- Kasper Junge (Juunge)
- Per Egil Kummervold (pere)
- Birger Moëll (birgermoell)
---
|
[
"# RøBÆRTa - Danish Roberta Base",
"## Description\n\nRøBÆRTa is a danish pretrained Roberta base model. RøBÆRTa was pretrained on the danish mC4 dataset during the flax community week. This project was organized by Dansk Data Science Community (DDSC) <br><br>\nURL",
"## Team RøBÆRTa:\n- Dan Saattrup Nielsen (saattrupdan)\n- Malte Højmark-Bertelsen (Maltehb)\n- Morten Kloster Pedersen (MortenKP)\n- Kasper Junge (Juunge)\n- Per Egil Kummervold (pere)\n- Birger Moëll (birgermoell)\n\n---"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #danish #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# RøBÆRTa - Danish Roberta Base",
"## Description\n\nRøBÆRTa is a danish pretrained Roberta base model. RøBÆRTa was pretrained on the danish mC4 dataset during the flax community week. This project was organized by Dansk Data Science Community (DDSC) <br><br>\nURL",
"## Team RøBÆRTa:\n- Dan Saattrup Nielsen (saattrupdan)\n- Malte Højmark-Bertelsen (Maltehb)\n- Morten Kloster Pedersen (MortenKP)\n- Kasper Junge (Juunge)\n- Per Egil Kummervold (pere)\n- Birger Moëll (birgermoell)\n\n---"
] |
[
63,
10,
62,
73
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #fill-mask #danish #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n# RøBÆRTa - Danish Roberta Base## Description\n\nRøBÆRTa is a danish pretrained Roberta base model. RøBÆRTa was pretrained on the danish mC4 dataset during the flax community week. This project was organized by Dansk Data Science Community (DDSC) <br><br>\nURL## Team RøBÆRTa:\n- Dan Saattrup Nielsen (saattrupdan)\n- Malte Højmark-Bertelsen (Maltehb)\n- Morten Kloster Pedersen (MortenKP)\n- Kasper Junge (Juunge)\n- Per Egil Kummervold (pere)\n- Birger Moëll (birgermoell)\n\n---"
] |
[
-0.04474117234349251,
-0.002131466753780842,
-0.0015230618882924318,
0.08157022297382355,
-0.021866369992494583,
-0.02195160835981369,
0.14181485772132874,
0.0031630108132958412,
0.12612631916999817,
0.03892672061920166,
0.1601787507534027,
0.11280705034732819,
0.002088663401082158,
0.09194786101579666,
-0.024289509281516075,
-0.3034364581108093,
0.04894551634788513,
-0.00235982914455235,
-0.11242838948965073,
0.08633837848901749,
0.128462016582489,
-0.034177180379629135,
0.07545296847820282,
-0.04360024258494377,
-0.0383000522851944,
0.032851625233888626,
-0.06518307328224182,
-0.04753272235393524,
0.1765413135290146,
0.05507897958159447,
0.11655480414628983,
0.09485480934381485,
0.03746654838323593,
-0.09500298649072647,
0.03807562589645386,
-0.03790787607431412,
-0.0958455353975296,
0.04192541167140007,
-0.07530394941568375,
-0.03360990434885025,
0.25273022055625916,
-0.018469175323843956,
-0.02525607869029045,
-0.029746802523732185,
-0.06583409011363983,
-0.09793303161859512,
-0.05062294378876686,
0.06223856285214424,
-0.03543230891227722,
0.028841380029916763,
-0.005807647481560707,
0.13004843890666962,
-0.20201590657234192,
0.10688813775777817,
0.08355826139450073,
-0.1768644154071808,
-0.04873832315206528,
0.1755991280078888,
0.09281221032142639,
0.01453945692628622,
-0.12174393981695175,
0.041487254202365875,
-0.0020361251663416624,
0.059262122958898544,
-0.008705146610736847,
-0.10760276764631271,
-0.039471209049224854,
-0.035971738398075104,
-0.09828086942434311,
0.0540350116789341,
0.23757752776145935,
0.02069816365838051,
-0.00024167486117221415,
0.016721535474061966,
-0.009739547967910767,
0.15415003895759583,
0.016792038455605507,
0.02451232261955738,
-0.04598589614033699,
-0.08770240098237991,
-0.05586571618914604,
0.05566423013806343,
-0.08314812183380127,
0.02405570074915886,
-0.09196127206087112,
0.26586610078811646,
-0.014697694219648838,
0.024795422330498695,
-0.01274605467915535,
0.03330690786242485,
-0.07453462481498718,
-0.11654959619045258,
-0.030191322788596153,
-0.0727338045835495,
-0.008663516491651535,
-0.028240324929356575,
0.02959798276424408,
-0.08340221643447876,
0.10161519050598145,
0.18058986961841583,
-0.023371649906039238,
-0.0010596781503409147,
0.036808039993047714,
0.09536176174879074,
0.0739588662981987,
-0.011735224165022373,
-0.1543503999710083,
-0.08364078402519226,
0.026097482070326805,
-0.07485533505678177,
0.04803020879626274,
-0.0037823421880602837,
-0.07087377458810806,
-0.032959531992673874,
-0.02169152721762657,
-0.01455591432750225,
0.040939219295978546,
0.13045907020568848,
-0.022308655083179474,
-0.0037505924701690674,
0.04640666022896767,
-0.024767806753516197,
-0.027592388913035393,
0.01666359230875969,
-0.08905422687530518,
0.004546601790934801,
0.0039045086596161127,
0.02519655041396618,
0.006138285156339407,
0.22961825132369995,
-0.007076024077832699,
-0.009027877822518349,
0.029887542128562927,
-0.16824312508106232,
0.13983550667762756,
-0.1270761638879776,
0.06345803290605545,
-0.18776272237300873,
-0.034176819026470184,
-0.044879183173179626,
0.08225397020578384,
-0.04749203473329544,
-0.026165084913372993,
-0.09460511803627014,
-0.05837427079677582,
0.0474417544901371,
-0.03988578915596008,
-0.012773457914590836,
-0.05736781656742096,
0.07898984104394913,
-0.14081360399723053,
0.11170203983783722,
-0.16672559082508087,
0.016457686200737953,
-0.16478785872459412,
0.002251012483611703,
-0.19953782856464386,
-0.037759218364953995,
-0.1509176790714264,
0.03128557279706001,
-0.11923199146986008,
-0.03776965290307999,
-0.01752851903438568,
0.03825308009982109,
0.04716286435723305,
0.13898234069347382,
-0.07439615577459335,
-0.04647692292928696,
0.06663742661476135,
-0.09473724663257599,
-0.051991164684295654,
0.1181926429271698,
-0.025108378380537033,
0.07690605521202087,
0.0757153332233429,
0.14280103147029877,
0.08243202418088913,
-0.10023380070924759,
-0.08943462371826172,
0.032386694103479385,
-0.05899564176797867,
-0.07085241377353668,
0.07651763409376144,
0.08483251184225082,
0.000977899064309895,
0.051252152770757675,
-0.06238769739866257,
0.08009389787912369,
-0.046820905059576035,
0.014195465482771397,
0.05919589847326279,
-0.03836449235677719,
0.13251835107803345,
0.08271152526140213,
0.03055795468389988,
-0.11818121373653412,
-0.04246438667178154,
0.011354448273777962,
0.04221222177147865,
-0.04661981016397476,
0.017914291471242905,
-0.01600811630487442,
0.12640713155269623,
-0.09268541634082794,
0.024898890405893326,
-0.05700942873954773,
-0.12268374115228653,
-0.053469955921173096,
0.07262076437473297,
0.0007920914213173091,
0.21272777020931244,
0.0941997841000557,
0.004168546758592129,
-0.058579809963703156,
0.014742128551006317,
-0.010345024056732655,
-0.0313403494656086,
0.023271752521395683,
-0.21846769750118256,
0.040120288729667664,
-0.0714709460735321,
0.06031360849738121,
-0.11266215145587921,
-0.045648571103811264,
0.07570957392454147,
0.19439971446990967,
-0.02248166687786579,
0.037729330360889435,
-0.009367765858769417,
0.056384917348623276,
-0.05707867816090584,
0.06059364229440689,
0.06894499063491821,
-0.057019297033548355,
-0.08705185353755951,
0.22603681683540344,
0.07870867848396301,
0.01807378977537155,
0.0908147394657135,
-0.10031946003437042,
-0.011438488028943539,
0.056959398090839386,
-0.0646442174911499,
0.005483067594468594,
0.0013000700855627656,
-0.02805914916098118,
0.0002767482656054199,
0.056407831609249115,
0.09259704500436783,
-0.04037439078092575,
-0.04912703111767769,
0.03717788681387901,
-0.06443064659833908,
-0.057764798402786255,
0.22213447093963623,
0.025678977370262146,
-0.09632954001426697,
0.1458526998758316,
0.01663479581475258,
0.03651859238743782,
0.2482983022928238,
0.040700238198041916,
0.03058682195842266,
0.005446685943752527,
-0.04693319648504257,
-0.023644788190722466,
0.18221721053123474,
-0.0010517461923882365,
0.010497838258743286,
0.02061767876148224,
0.0014451174065470695,
-0.027130011469125748,
-0.03219975158572197,
-0.08147045224905014,
-0.05403401330113411,
0.0008282094495370984,
-0.11361876130104065,
0.13265179097652435,
-0.02193375676870346,
0.10259423404932022,
-0.019820259883999825,
-0.18086284399032593,
0.04200031980872154,
-0.010299088433384895,
-0.05182059481739998,
0.19155459105968475,
-0.030765287578105927,
-0.22719065845012665,
-0.07733814418315887,
-0.10352499037981033,
-0.08090871572494507,
-0.038321759551763535,
0.05867384374141693,
-0.03190018609166145,
-0.06936682015657425,
-0.022103188559412956,
0.19938741624355316,
0.03066083788871765,
-0.024925319477915764,
-0.09334658831357956,
-0.02128526195883751,
-0.06325384974479675,
-0.10300121456384659,
0.0048635913990437984,
-0.1046689823269844,
0.04726032912731171,
0.12168415635824203,
-0.07199863344430923,
0.14888368546962738,
-0.007622330449521542,
-0.037003107368946075,
-0.01812814548611641,
0.01709633320569992,
0.18253099918365479,
-0.04870138317346573,
0.12711487710475922,
0.003254376817494631,
-0.06988082081079483,
0.030866626650094986,
0.2076946347951889,
0.047014594078063965,
-0.014410266652703285,
0.0033634635619819164,
0.03056236542761326,
-0.1167127788066864,
-0.17589759826660156,
-0.12880615890026093,
-0.03377546742558479,
0.1029733344912529,
0.0800681933760643,
0.04628642648458481,
-0.07383260875940323,
0.10749586671590805,
0.0035693105310201645,
-0.04070454463362694,
0.0014149994822219014,
0.04244280606508255,
0.08477180451154709,
-0.022874629124999046,
0.0711992010474205,
-0.0647372156381607,
-0.07677841186523438,
0.03002477064728737,
0.02016456238925457,
0.0003444477333687246,
-0.04406286031007767,
-0.037084948271512985,
0.07893726974725723,
0.03607206046581268,
0.050984472036361694,
0.1322571337223053,
-0.06544343382120132,
-0.04336617514491081,
0.005667513702064753,
-0.04764774814248085,
-0.022436151280999184,
0.019841056317090988,
-0.03292948752641678,
-0.00824575312435627,
-0.0378018319606781,
-0.12265653163194656,
0.067856065928936,
0.08817359060049057,
0.0800643041729927,
-0.10209706425666809,
-0.14218191802501678,
0.0190040934830904,
0.01026135589927435,
-0.0030162492766976357,
0.011510467156767845,
0.1751338690519333,
-0.07545825093984604,
0.044155094772577286,
-0.04775973781943321,
0.06667608767747879,
-0.01142917014658451,
0.012287190183997154,
-0.01261428277939558,
0.020144633948802948,
-0.04524153098464012,
0.05103412643074989,
-0.21496987342834473,
0.2597723603248596,
-0.009843776933848858,
0.008138314820826054,
-0.11402730643749237,
-0.09211075305938721,
-0.05086737498641014,
0.17268989980220795,
0.2085159718990326,
0.0702013298869133,
-0.08520397543907166,
-0.09592744708061218,
-0.09692056477069855,
0.057854533195495605,
-0.019512582570314407,
-0.05006127059459686,
0.0643606036901474,
0.054450470954179764,
0.00195770594291389,
-0.030343610793352127,
-0.09984783828258514,
-0.0950455367565155,
-0.022530503571033478,
0.01245156954973936,
0.06645210832357407,
-0.009613819420337677,
0.030001200735569,
-0.09781654924154282,
-0.19241611659526825,
0.08533226698637009,
-0.11368633061647415,
-0.057540230453014374,
-0.0978117510676384,
0.09540729969739914,
0.04665295407176018,
-0.11444472521543503,
-0.06687043607234955,
-0.019610993564128876,
-0.045469991862773895,
-0.09265954047441483,
-0.10392425954341888,
0.053669415414333344,
-0.0997227355837822,
-0.09516490250825882,
-0.00515724066644907,
0.11236832290887833,
0.14357569813728333,
0.033583659678697586,
0.01629999466240406,
0.06620025634765625,
-0.05004085972905159,
-0.0991889014840126,
0.02567158453166485,
-0.05117257311940193,
0.0418633297085762,
-0.0175592303276062,
-0.06586093455553055,
0.08401467651128769,
0.046510566025972366,
-0.02122672088444233,
0.10620070993900299,
0.26005619764328003,
-0.08804943412542343,
0.11116349697113037,
0.1929880529642105,
0.0030806756112724543,
-0.3919418156147003,
-0.0028347489424049854,
-0.048342883586883545,
0.05854219198226929,
-0.019548924639821053,
-0.09419156610965729,
0.129941925406456,
0.05901768431067467,
-0.0020202009472995996,
0.035675209015607834,
-0.19476363062858582,
-0.10904267430305481,
0.12704932689666748,
0.0565047562122345,
0.3075089454650879,
-0.008719476871192455,
-0.005103795323520899,
-0.06291524320840836,
-0.2057107836008072,
0.09840746968984604,
-0.08205605298280716,
0.04653053358197212,
-0.052717600017786026,
0.15686078369617462,
0.011141324415802956,
-0.060938749462366104,
0.10969310253858566,
0.0702812448143959,
-0.01922735571861267,
-0.15283013880252838,
-0.10541987419128418,
0.07326077669858932,
0.018434204161167145,
0.12371634691953659,
-0.09437765926122665,
-0.03168339282274246,
-0.1146860271692276,
-0.009617394767701626,
-0.1680239886045456,
0.21921436488628387,
-0.06250408291816711,
-0.09414265304803848,
-0.06816839426755905,
0.1510867178440094,
0.04773193597793579,
0.0010693754302337766,
0.20241956412792206,
-0.12175696343183517,
0.08077860623598099,
0.01515949983149767,
0.1476404368877411,
0.05765896663069725,
0.13559390604496002,
0.06603670865297318,
-0.04939798265695572,
0.07345989346504211,
-0.10730476677417755,
0.00927356630563736,
0.15749740600585938,
0.021413709968328476,
0.021602531895041466,
0.028786465525627136,
-0.09309197962284088,
-0.022324688732624054,
0.10486027598381042,
-0.1816178858280182,
0.02252984046936035,
0.010715358890593052,
-0.13290992379188538,
0.0774267315864563,
0.08066636323928833,
0.15246225893497467,
-0.09681400656700134,
0.02113167755305767,
-0.017389629036188126,
0.011591223068535328,
-0.04379309341311455,
0.07955623418092728,
0.09942597895860672,
0.050573330372571945,
-0.09466910362243652,
0.015721455216407776,
-0.022058019414544106,
-0.07634548097848892,
0.013625350780785084,
0.05178916081786156,
-0.10803916305303574,
-0.10346894711256027,
-0.012006543576717377,
0.09017180651426315,
-0.16410520672798157,
-0.05418041720986366,
-0.11409953981637955,
-0.11869969218969345,
0.035725854337215424,
0.2435169368982315,
0.05532866716384888,
0.04756457358598709,
0.06413735449314117,
-0.06860250234603882,
-0.11027367413043976,
0.07064636051654816,
-0.019774271175265312,
0.007014797534793615,
-0.001074915286153555,
-0.004525977186858654,
-0.0715213418006897,
0.10018432140350342,
-0.07204549759626389,
0.04849441349506378,
-0.1994052529335022,
-0.025430388748645782,
-0.03388189896941185,
-0.07293836027383804,
-0.07879284769296646,
-0.008887341246008873,
-0.06305556744337082,
-0.08332175761461258,
-0.01316936407238245,
0.0020248079672455788,
-0.09097196161746979,
0.04776475206017494,
0.024755628779530525,
-0.012159431353211403,
-0.11823893338441849,
-0.08605746924877167,
0.03239097446203232,
-0.005597949493676424,
0.07484988868236542,
0.014729097485542297,
-0.0008857180364429951,
0.11249055713415146,
-0.2308274507522583,
0.02698487602174282,
-0.03934766352176666,
-0.015822332352399826,
0.0838828757405281,
-0.009608318097889423,
-0.03730934485793114,
0.036723148077726364,
0.027542000636458397,
0.05748407542705536,
0.04575512930750847,
-0.05376097932457924,
0.05899970605969429,
-0.008849319070577621,
-0.12630987167358398,
-0.070668525993824,
0.08463570475578308,
-0.021918458864092827,
0.07108398526906967,
0.15875646471977234,
-0.07227101922035217,
0.05085238069295883,
-0.030325692147016525,
0.007678844500333071,
0.059526968747377396,
-0.15274761617183685,
0.016617704182863235,
-0.028223570436239243,
0.011256758123636246,
-0.04521260783076286,
0.09138939529657364,
0.08248153328895569,
-0.1158318817615509,
0.03252004086971283,
-0.021914847195148468,
-0.10344543308019638,
0.035234224051237106,
0.11839067190885544,
-0.06036127358675003,
0.001030081300996244,
-0.09948740899562836,
0.06945054233074188,
-0.026401743292808533,
0.03101770207285881,
0.1510169804096222,
0.06611915677785873,
0.18266652524471283,
0.002654308220371604,
0.09860769659280777,
0.030332745984196663,
0.10243356972932816,
-0.03631541505455971,
0.0245443657040596,
0.04520078003406525,
0.014790947549045086,
0.12200737744569778,
0.04422632232308388,
-0.07339929789304733,
0.0167617779225111,
-0.039713259786367416,
-0.025871774181723595,
-0.1302000880241394,
-0.1722346544265747,
-0.11438104510307312,
-0.01732267625629902,
0.05393938720226288,
-0.09522102773189545,
-0.051475051790475845,
0.022713005542755127,
0.09248869866132736,
-0.013300521299242973,
0.05584871768951416,
-0.03803035989403725,
-0.011765856295824051,
0.05675921216607094,
0.009648611769080162,
-0.027337461709976196,
-0.02838040143251419,
0.023014098405838013,
-0.07937341928482056,
0.020956100896000862,
-0.06121182069182396,
-0.011338445357978344,
-0.0178410355001688,
-0.04269302636384964,
-0.05756106227636337,
-0.0919070765376091,
-0.020342949777841568,
0.06832821667194366,
0.12060506641864777,
0.03652575984597206,
0.05663551017642021,
-0.02257777750492096,
-0.03658666834235191,
0.14791464805603027,
0.06739438325166702,
-0.06797242164611816,
-0.08201326429843903,
0.05082778260111809,
-0.04026026278734207,
0.07454896718263626,
-0.03261006250977516,
-0.05754709243774414,
0.060855645686388016,
0.13207656145095825,
0.2692229449748993,
0.017838850617408752,
0.07029721885919571,
-0.058484479784965515,
0.023093432188034058,
0.0740995854139328,
0.05071137845516205,
0.05885286256670952,
0.14916403591632843,
-0.07978371530771255,
-0.08603695780038834,
-0.07630516588687897,
0.03768865019083023,
0.012041086331009865,
0.060943327844142914,
0.04784567654132843,
-0.05932687968015671,
-0.17378589510917664,
0.07139633595943451,
-0.09758622199296951,
-0.10674763470888138,
0.0731339305639267,
-0.12956395745277405,
-0.0791463702917099,
-0.12376494705677032,
-0.002845229348167777,
0.07895097136497498,
0.0758395567536354,
-0.04472498968243599,
-0.020433230325579643,
0.013002961874008179,
0.023240109905600548,
-0.092316634953022,
-0.05110785737633705,
0.06944293528795242,
0.09622952342033386,
0.07419681549072266,
-0.049991656094789505,
0.09633183479309082,
0.16284061968326569,
-0.048792365938425064,
-0.029077783226966858,
-0.003669329220429063,
0.05202353000640869,
-0.053006939589977264,
-0.03207310661673546,
0.09051994234323502,
0.007206698879599571,
0.07103943824768066,
0.04819563031196594,
-0.14316999912261963,
0.06294868886470795,
0.06841498613357544,
-0.07797634601593018,
-0.063455730676651,
0.17630234360694885,
-0.03192317485809326,
0.13446012139320374,
0.16223125159740448,
-0.006065579131245613,
-0.058380335569381714,
-0.10826101899147034,
0.0387243814766407,
0.0042962185107171535,
-0.04742945730686188,
-0.01996772550046444,
-0.11797469109296799,
-0.007564257364720106,
-0.02753330022096634,
-0.012977642007172108,
-0.20969317853450775,
0.009755532257258892,
-0.115639828145504,
-0.012166120111942291,
-0.005647581536322832,
0.013729692436754704,
0.0010784603655338287,
0.01712854392826557,
-0.031203044578433037,
-0.025242341682314873,
0.03151572868227959,
0.057680342346429825,
-0.03466324135661125,
-0.10413257777690887
] |
null | null |
transformers
|
# RoBERTa base model for Marathi language (मराठी भाषा)
Pretrained model on Marathi language using a masked language modeling (MLM) objective. RoBERTa was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). We trained RoBERTa model for Marathi Language during community week hosted by Huggingface 🤗 using JAX/Flax for NLP & CV jax.
<img src="https://user-images.githubusercontent.com/15062408/126040902-ea8808db-ec30-4a3f-bf95-5d3b10d674e9.png" alt="huggingface-marathi-roberta" width="350" height="350" style="text-align: center">
## Model description
Marathi RoBERTa is a transformers model pretrained on a large corpus of Marathi data in a self-supervised fashion.
## Intended uses & limitations❗️
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. We used this model to fine tune on text classification task for iNLTK and indicNLP news text classification problem statement. Since marathi mc4 dataset is made by scraping marathi newspapers text, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use❓
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='flax-community/roberta-base-mr')
>>> unmasker("मोठी बातमी! उद्या दुपारी <mask> वाजता जाहीर होणार दहावीचा निकाल")
[{'score': 0.057209037244319916,'sequence': 'मोठी बातमी! उद्या दुपारी आठ वाजता जाहीर होणार दहावीचा निकाल',
'token': 2226,
'token_str': 'आठ'},
{'score': 0.02796074189245701,
'sequence': 'मोठी बातमी! उद्या दुपारी २० वाजता जाहीर होणार दहावीचा निकाल',
'token': 987,
'token_str': '२०'},
{'score': 0.017235398292541504,
'sequence': 'मोठी बातमी! उद्या दुपारी नऊ वाजता जाहीर होणार दहावीचा निकाल',
'token': 4080,
'token_str': 'नऊ'},
{'score': 0.01691395975649357,
'sequence': 'मोठी बातमी! उद्या दुपारी २१ वाजता जाहीर होणार दहावीचा निकाल',
'token': 1944,
'token_str': '२१'},
{'score': 0.016252165660262108,
'sequence': 'मोठी बातमी! उद्या दुपारी ३ वाजता जाहीर होणार दहावीचा निकाल',
'token': 549,
'token_str': ' ३'}]
```
## Training data 🏋🏻♂️
The RoBERTa Marathi model was pretrained on `mr` dataset of C4 multilingual dataset:
<br>
<br>
[C4 (Colossal Clean Crawled Corpus)](https://yknzhu.wixsite.com/mbweb), Introduced by Raffel et al. in [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://paperswithcode.com/paper/exploring-the-limits-of-transfer-learning).
The dataset can be downloaded in a pre-processed form from [allennlp](https://github.com/allenai/allennlp/discussions/5056) or huggingface's datsets - [mc4 dataset](https://huggingface.co/datasets/mc4).
Marathi (`mr`) dataset consists of 14 billion tokens, 7.8 million docs and with weight ~70 GB of text.
## Data Cleaning 🧹
Though initial `mc4` marathi corpus size ~70 GB, Through data exploration, it was observed it contains docs from different languages especially thai, chinese etc. So we had to clean the dataset before traning tokenizer and model. Surprisingly, results after cleaning Marathi mc4 corpus data:
#### **Train set:**
Clean docs count 1581396 out of 7774331. <br>
**~20.34%** of whole marathi train split is actually Marathi.
#### **Validation set**
Clean docs count 1700 out of 7928. <br>
**~19.90%** of whole marathi validation split is actually Marathi.
## Training procedure 👨🏻💻
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with `<s>` and the end of one by `</s>`
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) **8 v3 TPU cores** for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments and hyperparameter tunning on weights and biases platform. Here is link to main dashboard: <br>
[Link to Weights and Biases Dashboard for Marathi RoBERTa model](https://wandb.ai/nipunsadvilkar/roberta-base-mr/runs/19qtskbg?workspace=user-nipunsadvilkar)
#### **Pretraining Results 📊**
RoBERTa Model reached **eval accuracy of 85.28%** around ~35K step **with train loss at 0.6507 and eval loss at 0.6219**.
## Fine Tuning on downstream tasks
We performed fine-tuning on downstream tasks. We used following datasets for classification:
1. [IndicNLP Marathi news classification](https://github.com/ai4bharat-indicnlp/indicnlp_corpus#publicly-available-classification-datasets)
2. [iNLTK Marathi news headline classification](https://www.kaggle.com/disisbig/marathi-news-dataset)
### **Fine tuning on downstream task results (Segregated)**
#### 1. [IndicNLP Marathi news classification](https://github.com/ai4bharat-indicnlp/indicnlp_corpus#publicly-available-classification-datasets)
IndicNLP Marathi news dataset consists 3 classes - `['lifestyle', 'entertainment', 'sports']` - with following docs distribution as per classes:
| train | eval | test
| -- | -- | --
| 9672 | 477 | 478
💯 Our Marathi RoBERTa **`roberta-base-mr` model outperformed both classifier ** mentioned in [Arora, G. (2020). iNLTK](https://www.semanticscholar.org/paper/iNLTK%3A-Natural-Language-Toolkit-for-Indic-Languages-Arora/5039ed9e100d3a1cbbc25a02c82f6ee181609e83/figure/3) and [Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.](https://www.semanticscholar.org/paper/AI4Bharat-IndicNLP-Corpus%3A-Monolingual-Corpora-and-Kunchukuttan-Kakwani/7997d432925aff0ba05497d2893c09918298ca55/figure/4)
Dataset | FT-W | FT-WC | INLP | iNLTK | **roberta-base-mr 🏆**
-- | -- | -- | -- | -- | --
iNLTK Headlines | 83.06 | 81.65 | 89.92 | 92.4 | **97.48**
**🤗 Huggingface Model hub repo:**<br>
`roberta-base-mr` fine tuned on iNLTK Headlines classification dataset model:
[**`flax-community/mr-indicnlp-classifier`**](https://huggingface.co/flax-community/mr-indicnlp-classifier)
🧪 Fine tuning experiment's weight and biases dashboard [link](https://wandb.ai/nipunsadvilkar/huggingface/runs/1242bike?workspace=user-nipunsadvilkar
)
#### 2. [iNLTK Marathi news headline classification](https://www.kaggle.com/disisbig/marathi-news-dataset)
This dataset consists 3 classes - `['state', 'entertainment', 'sports']` - with following docs distribution as per classes:
| train | eval | test
| -- | -- | --
| 9658 | 1210 | 1210
💯 Here as well **`roberta-base-mr` outperformed `iNLTK` marathi news text classifier**.
Dataset | iNLTK ULMFiT | **roberta-base-mr 🏆**
-- | -- | --
iNLTK news dataset (kaggle) | 92.4 | **94.21**
**🤗 Huggingface Model hub repo:**<br>
`roberta-base-mr` fine tuned on iNLTK news classification dataset model:
[**`flax-community/mr-inltk-classifier`**](https://huggingface.co/flax-community/mr-inltk-classifier)
Fine tuning experiment's weight and biases dashboard [link](https://wandb.ai/nipunsadvilkar/huggingface/runs/2u5l9hon?workspace=user-nipunsadvilkar
)
## **Want to check how above models generalise on real world Marathi data?**
Head to 🤗 Huggingface's spaces 🪐 to play with all three models:
1. Mask Language Modelling with Pretrained Marathi RoBERTa model: <br>
[**`flax-community/roberta-base-mr`**](https://huggingface.co/flax-community/roberta-base-mr)
2. Marathi Headline classifier: <br>
[**`flax-community/mr-indicnlp-classifier`**](https://huggingface.co/flax-community/mr-indicnlp-classifier)
3. Marathi news classifier: <br>
[**`flax-community/mr-inltk-classifier`**](https://huggingface.co/flax-community/mr-inltk-classifier)

[Streamlit app of Pretrained Roberta Marathi model on Huggingface Spaces](https://huggingface.co/spaces/flax-community/roberta-base-mr)

## Team Members
- Nipun Sadvilkar [@nipunsadvilkar](https://github.com/nipunsadvilkar)
- Haswanth Aekula [@hassiahk](https://github.com/hassiahk)
## Credits
Huge thanks to Huggingface 🤗 & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to [@patil-suraj](https://github.com/patil-suraj) & [@patrickvonplaten](https://github.com/patrickvonplaten) for mentoring during whole week.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:large>
|
{"widget": [{"text": "\u0905\u0927\u094d\u092f\u0915\u094d\u0937 <mask> \u092a\u0935\u093e\u0930 \u0906\u0923\u093f \u0909\u092a\u092e\u0941\u0916\u094d\u092f\u092e\u0902\u0924\u094d\u0930\u0940 \u0905\u091c\u093f\u0924 \u092a\u0935\u093e\u0930 \u092f\u093e\u0902\u091a\u0940 \u092d\u0947\u091f \u0918\u0947\u0924\u0932\u0940."}, {"text": "\u092e\u094b\u0920\u0940 \u092c\u093e\u0924\u092e\u0940! \u0909\u0926\u094d\u092f\u093e \u0926\u0941\u092a\u093e\u0930\u0940 <mask> \u0935\u093e\u091c\u0924\u093e \u091c\u093e\u0939\u0940\u0930 \u0939\u094b\u0923\u093e\u0930 \u0926\u0939\u093e\u0935\u0940\u091a\u093e \u0928\u093f\u0915\u093e\u0932"}]}
|
fill-mask
|
flax-community/roberta-base-mr
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"arxiv:1907.11692",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.11692"
] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
RoBERTa base model for Marathi language (मराठी भाषा)
====================================================
Pretrained model on Marathi language using a masked language modeling (MLM) objective. RoBERTa was introduced in
this paper and first released in
this repository. We trained RoBERTa model for Marathi Language during community week hosted by Huggingface using JAX/Flax for NLP & CV jax.
<img src="URL alt="huggingface-marathi-roberta" width="350" height="350" style="text-align: center">
Model description
-----------------
Marathi RoBERTa is a transformers model pretrained on a large corpus of Marathi data in a self-supervised fashion.
Intended uses & limitations️
----------------------------
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. We used this model to fine tune on text classification task for iNLTK and indicNLP news text classification problem statement. Since marathi mc4 dataset is made by scraping marathi newspapers text, it will involve some biases which will also affect all fine-tuned versions of this model.
### How to use
You can use this model directly with a pipeline for masked language modeling:
Training data ️
----------------
The RoBERTa Marathi model was pretrained on 'mr' dataset of C4 multilingual dataset:
C4 (Colossal Clean Crawled Corpus), Introduced by Raffel et al. in Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer.
The dataset can be downloaded in a pre-processed form from allennlp or huggingface's datsets - mc4 dataset.
Marathi ('mr') dataset consists of 14 billion tokens, 7.8 million docs and with weight ~70 GB of text.
Data Cleaning
-------------
Though initial 'mc4' marathi corpus size ~70 GB, Through data exploration, it was observed it contains docs from different languages especially thai, chinese etc. So we had to clean the dataset before traning tokenizer and model. Surprisingly, results after cleaning Marathi mc4 corpus data:
#### Train set:
Clean docs count 1581396 out of 7774331.
~20.34% of whole marathi train split is actually Marathi.
#### Validation set
Clean docs count 1700 out of 7928.
~19.90% of whole marathi validation split is actually Marathi.
Training procedure
--------------------
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with '~~' and the end of one by '~~'
The details of the masking procedure for each sentence are the following:
* 15% of the tokens are masked.
* In 80% of the cases, the masked tokens are replaced by ''.
* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
* In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The
optimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and
ε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning
rate after.
We tracked experiments and hyperparameter tunning on weights and biases platform. Here is link to main dashboard:
Link to Weights and Biases Dashboard for Marathi RoBERTa model
#### Pretraining Results
RoBERTa Model reached eval accuracy of 85.28% around ~35K step with train loss at 0.6507 and eval loss at 0.6219.
Fine Tuning on downstream tasks
-------------------------------
We performed fine-tuning on downstream tasks. We used following datasets for classification:
1. IndicNLP Marathi news classification
2. iNLTK Marathi news headline classification
### Fine tuning on downstream task results (Segregated)
#### 1. IndicNLP Marathi news classification
IndicNLP Marathi news dataset consists 3 classes - '['lifestyle', 'entertainment', 'sports']' - with following docs distribution as per classes:
train: 9672, eval: 477, test: 478
Our Marathi RoBERTa 'roberta-base-mr' model outperformed both classifier mentioned in Arora, G. (2020). iNLTK and Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.
Huggingface Model hub repo:
'roberta-base-mr' fine tuned on iNLTK Headlines classification dataset model:
'flax-community/mr-indicnlp-classifier'
Fine tuning experiment's weight and biases dashboard link
#### 2. iNLTK Marathi news headline classification
This dataset consists 3 classes - '['state', 'entertainment', 'sports']' - with following docs distribution as per classes:
train: 9658, eval: 1210, test: 1210
Here as well 'roberta-base-mr' outperformed 'iNLTK' marathi news text classifier.
Dataset: iNLTK news dataset (kaggle), iNLTK ULMFiT: 92.4, roberta-base-mr: 94.21
Huggingface Model hub repo:
'roberta-base-mr' fine tuned on iNLTK news classification dataset model:
'flax-community/mr-inltk-classifier'
Fine tuning experiment's weight and biases dashboard link
Want to check how above models generalise on real world Marathi data?
---------------------------------------------------------------------
Head to Huggingface's spaces to play with all three models:
1. Mask Language Modelling with Pretrained Marathi RoBERTa model:
'flax-community/roberta-base-mr'
2. Marathi Headline classifier:
'flax-community/mr-indicnlp-classifier'
3. Marathi news classifier:
'flax-community/mr-inltk-classifier'
!alt text
Streamlit app of Pretrained Roberta Marathi model on Huggingface Spaces
!image
Team Members
------------
* Nipun Sadvilkar @nipunsadvilkar
* Haswanth Aekula @hassiahk
Credits
-------
Huge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week.
<img src=URL
|
[
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data ️\n----------------\n\n\nThe RoBERTa Marathi model was pretrained on 'mr' dataset of C4 multilingual dataset:\n \n\n \n\nC4 (Colossal Clean Crawled Corpus), Introduced by Raffel et al. in Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer.\n\n\nThe dataset can be downloaded in a pre-processed form from allennlp or huggingface's datsets - mc4 dataset.\nMarathi ('mr') dataset consists of 14 billion tokens, 7.8 million docs and with weight ~70 GB of text.\n\n\nData Cleaning\n-------------\n\n\nThough initial 'mc4' marathi corpus size ~70 GB, Through data exploration, it was observed it contains docs from different languages especially thai, chinese etc. So we had to clean the dataset before traning tokenizer and model. Surprisingly, results after cleaning Marathi mc4 corpus data:",
"#### Train set:\n\n\nClean docs count 1581396 out of 7774331. \n\n~20.34% of whole marathi train split is actually Marathi.",
"#### Validation set\n\n\nClean docs count 1700 out of 7928. \n\n~19.90% of whole marathi validation split is actually Marathi.\n\n\nTraining procedure \n--------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments and hyperparameter tunning on weights and biases platform. Here is link to main dashboard: \n\nLink to Weights and Biases Dashboard for Marathi RoBERTa model",
"#### Pretraining Results\n\n\nRoBERTa Model reached eval accuracy of 85.28% around ~35K step with train loss at 0.6507 and eval loss at 0.6219.\n\n\nFine Tuning on downstream tasks\n-------------------------------\n\n\nWe performed fine-tuning on downstream tasks. We used following datasets for classification:\n\n\n1. IndicNLP Marathi news classification\n2. iNLTK Marathi news headline classification",
"### Fine tuning on downstream task results (Segregated)",
"#### 1. IndicNLP Marathi news classification\n\n\nIndicNLP Marathi news dataset consists 3 classes - '['lifestyle', 'entertainment', 'sports']' - with following docs distribution as per classes:\n\n\ntrain: 9672, eval: 477, test: 478\n\n\nOur Marathi RoBERTa 'roberta-base-mr' model outperformed both classifier mentioned in Arora, G. (2020). iNLTK and Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.\n\n\n\nHuggingface Model hub repo: \n\n'roberta-base-mr' fine tuned on iNLTK Headlines classification dataset model:\n\n\n'flax-community/mr-indicnlp-classifier'\n\n\nFine tuning experiment's weight and biases dashboard link",
"#### 2. iNLTK Marathi news headline classification\n\n\nThis dataset consists 3 classes - '['state', 'entertainment', 'sports']' - with following docs distribution as per classes:\n\n\ntrain: 9658, eval: 1210, test: 1210\n\n\nHere as well 'roberta-base-mr' outperformed 'iNLTK' marathi news text classifier.\n\n\nDataset: iNLTK news dataset (kaggle), iNLTK ULMFiT: 92.4, roberta-base-mr: 94.21\n\n\nHuggingface Model hub repo: \n\n'roberta-base-mr' fine tuned on iNLTK news classification dataset model:\n\n\n'flax-community/mr-inltk-classifier'\n\n\nFine tuning experiment's weight and biases dashboard link\n\n\nWant to check how above models generalise on real world Marathi data?\n---------------------------------------------------------------------\n\n\nHead to Huggingface's spaces to play with all three models:\n\n\n1. Mask Language Modelling with Pretrained Marathi RoBERTa model: \n\n'flax-community/roberta-base-mr'\n2. Marathi Headline classifier: \n\n'flax-community/mr-indicnlp-classifier'\n3. Marathi news classifier: \n\n'flax-community/mr-inltk-classifier'\n\n\n!alt text\nStreamlit app of Pretrained Roberta Marathi model on Huggingface Spaces\n\n\n!image\n\n\nTeam Members\n------------\n\n\n* Nipun Sadvilkar @nipunsadvilkar\n* Haswanth Aekula @hassiahk\n\n\nCredits\n-------\n\n\nHuge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week.\n\n\n<img src=URL"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data ️\n----------------\n\n\nThe RoBERTa Marathi model was pretrained on 'mr' dataset of C4 multilingual dataset:\n \n\n \n\nC4 (Colossal Clean Crawled Corpus), Introduced by Raffel et al. in Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer.\n\n\nThe dataset can be downloaded in a pre-processed form from allennlp or huggingface's datsets - mc4 dataset.\nMarathi ('mr') dataset consists of 14 billion tokens, 7.8 million docs and with weight ~70 GB of text.\n\n\nData Cleaning\n-------------\n\n\nThough initial 'mc4' marathi corpus size ~70 GB, Through data exploration, it was observed it contains docs from different languages especially thai, chinese etc. So we had to clean the dataset before traning tokenizer and model. Surprisingly, results after cleaning Marathi mc4 corpus data:",
"#### Train set:\n\n\nClean docs count 1581396 out of 7774331. \n\n~20.34% of whole marathi train split is actually Marathi.",
"#### Validation set\n\n\nClean docs count 1700 out of 7928. \n\n~19.90% of whole marathi validation split is actually Marathi.\n\n\nTraining procedure \n--------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments and hyperparameter tunning on weights and biases platform. Here is link to main dashboard: \n\nLink to Weights and Biases Dashboard for Marathi RoBERTa model",
"#### Pretraining Results\n\n\nRoBERTa Model reached eval accuracy of 85.28% around ~35K step with train loss at 0.6507 and eval loss at 0.6219.\n\n\nFine Tuning on downstream tasks\n-------------------------------\n\n\nWe performed fine-tuning on downstream tasks. We used following datasets for classification:\n\n\n1. IndicNLP Marathi news classification\n2. iNLTK Marathi news headline classification",
"### Fine tuning on downstream task results (Segregated)",
"#### 1. IndicNLP Marathi news classification\n\n\nIndicNLP Marathi news dataset consists 3 classes - '['lifestyle', 'entertainment', 'sports']' - with following docs distribution as per classes:\n\n\ntrain: 9672, eval: 477, test: 478\n\n\nOur Marathi RoBERTa 'roberta-base-mr' model outperformed both classifier mentioned in Arora, G. (2020). iNLTK and Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.\n\n\n\nHuggingface Model hub repo: \n\n'roberta-base-mr' fine tuned on iNLTK Headlines classification dataset model:\n\n\n'flax-community/mr-indicnlp-classifier'\n\n\nFine tuning experiment's weight and biases dashboard link",
"#### 2. iNLTK Marathi news headline classification\n\n\nThis dataset consists 3 classes - '['state', 'entertainment', 'sports']' - with following docs distribution as per classes:\n\n\ntrain: 9658, eval: 1210, test: 1210\n\n\nHere as well 'roberta-base-mr' outperformed 'iNLTK' marathi news text classifier.\n\n\nDataset: iNLTK news dataset (kaggle), iNLTK ULMFiT: 92.4, roberta-base-mr: 94.21\n\n\nHuggingface Model hub repo: \n\n'roberta-base-mr' fine tuned on iNLTK news classification dataset model:\n\n\n'flax-community/mr-inltk-classifier'\n\n\nFine tuning experiment's weight and biases dashboard link\n\n\nWant to check how above models generalise on real world Marathi data?\n---------------------------------------------------------------------\n\n\nHead to Huggingface's spaces to play with all three models:\n\n\n1. Mask Language Modelling with Pretrained Marathi RoBERTa model: \n\n'flax-community/roberta-base-mr'\n2. Marathi Headline classifier: \n\n'flax-community/mr-indicnlp-classifier'\n3. Marathi news classifier: \n\n'flax-community/mr-inltk-classifier'\n\n\n!alt text\nStreamlit app of Pretrained Roberta Marathi model on Huggingface Spaces\n\n\n!image\n\n\nTeam Members\n------------\n\n\n* Nipun Sadvilkar @nipunsadvilkar\n* Haswanth Aekula @hassiahk\n\n\nCredits\n-------\n\n\nHuge thanks to Huggingface & Google Jax/Flax team for such a wonderful community week. Especially for providing such massive computing resource. Big thanks to @patil-suraj & @patrickvonplaten for mentoring during whole week.\n\n\n<img src=URL"
] |
[
56,
235,
33,
35,
204,
167,
95,
15,
183,
411
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data ️\n----------------\n\n\nThe RoBERTa Marathi model was pretrained on 'mr' dataset of C4 multilingual dataset:\n \n\n \n\nC4 (Colossal Clean Crawled Corpus), Introduced by Raffel et al. in Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer.\n\n\nThe dataset can be downloaded in a pre-processed form from allennlp or huggingface's datsets - mc4 dataset.\nMarathi ('mr') dataset consists of 14 billion tokens, 7.8 million docs and with weight ~70 GB of text.\n\n\nData Cleaning\n-------------\n\n\nThough initial 'mc4' marathi corpus size ~70 GB, Through data exploration, it was observed it contains docs from different languages especially thai, chinese etc. So we had to clean the dataset before traning tokenizer and model. Surprisingly, results after cleaning Marathi mc4 corpus data:#### Train set:\n\n\nClean docs count 1581396 out of 7774331. \n\n~20.34% of whole marathi train split is actually Marathi.#### Validation set\n\n\nClean docs count 1700 out of 7928. \n\n~19.90% of whole marathi validation split is actually Marathi.\n\n\nTraining procedure \n--------------------",
"passage: ### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).### Pretraining\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores) 8 v3 TPU cores for 42K steps with a batch size of 128 and a sequence length of 128. The\noptimizer used is Adam with a learning rate of 3e-4, β1 = 0.9, β2 = 0.98 and\nε = 1e-8, a weight decay of 0.01, learning rate warmup for 1,000 steps and linear decay of the learning\nrate after.\n\n\nWe tracked experiments and hyperparameter tunning on weights and biases platform. Here is link to main dashboard: \n\nLink to Weights and Biases Dashboard for Marathi RoBERTa model#### Pretraining Results\n\n\nRoBERTa Model reached eval accuracy of 85.28% around ~35K step with train loss at 0.6507 and eval loss at 0.6219.\n\n\nFine Tuning on downstream tasks\n-------------------------------\n\n\nWe performed fine-tuning on downstream tasks. We used following datasets for classification:\n\n\n1. IndicNLP Marathi news classification\n2. iNLTK Marathi news headline classification### Fine tuning on downstream task results (Segregated)#### 1. IndicNLP Marathi news classification\n\n\nIndicNLP Marathi news dataset consists 3 classes - '['lifestyle', 'entertainment', 'sports']' - with following docs distribution as per classes:\n\n\ntrain: 9672, eval: 477, test: 478\n\n\nOur Marathi RoBERTa 'roberta-base-mr' model outperformed both classifier mentioned in Arora, G. (2020). iNLTK and Kunchukuttan, Anoop et al. AI4Bharat-IndicNLP.\n\n\n\nHuggingface Model hub repo: \n\n'roberta-base-mr' fine tuned on iNLTK Headlines classification dataset model:\n\n\n'flax-community/mr-indicnlp-classifier'\n\n\nFine tuning experiment's weight and biases dashboard link"
] |
[
-0.08277808129787445,
0.0781237930059433,
-0.0020896964706480503,
0.071989506483078,
0.06822484731674194,
-0.009514105506241322,
-0.03827420994639397,
0.09958437085151672,
-0.13052786886692047,
0.10835999995470047,
0.05407056212425232,
-0.021981384605169296,
0.0803724080324173,
0.17114323377609253,
0.02504037506878376,
-0.2343536615371704,
0.031612567603588104,
-0.07955903559923172,
-0.03825586289167404,
0.07293367385864258,
0.10851902514696121,
-0.096250981092453,
0.0471554733812809,
-0.04828941822052002,
-0.04648138955235481,
0.021285003051161766,
-0.0396505743265152,
-0.02133144810795784,
0.049873583018779755,
0.039797354489564896,
0.04344721511006355,
0.002369411988183856,
0.04977157339453697,
-0.08276918530464172,
0.03345615416765213,
0.03778088837862015,
-0.011834805831313133,
0.03411674499511719,
0.07526078820228577,
0.04653250426054001,
0.16151493787765503,
-0.1052437424659729,
0.0022808988578617573,
0.010003468953073025,
-0.06717438995838165,
-0.04846848174929619,
-0.13573437929153442,
0.020579708740115166,
0.04885135963559151,
0.122349813580513,
-0.04429450258612633,
0.06515389680862427,
-0.036491841077804565,
0.09689043462276459,
0.06659344583749771,
-0.2024279534816742,
-0.040503114461898804,
0.06817203760147095,
0.026677681133151054,
0.08696804195642471,
-0.06053736433386803,
0.0018788715824484825,
0.025900663807988167,
0.01091741118580103,
-0.006646713707596064,
0.007919377647340298,
0.04218539223074913,
0.0034604445099830627,
-0.0832621306180954,
-0.02920776605606079,
0.10043343901634216,
0.00011114217340946198,
-0.06067993491888046,
-0.10318566113710403,
-0.09619180113077164,
-0.10814351588487625,
0.017947493121027946,
0.02876722440123558,
-0.02441728487610817,
0.05268431454896927,
-0.005726233124732971,
-0.022895701229572296,
-0.06214919313788414,
-0.013539611361920834,
-0.029083777219057083,
0.10368624329566956,
0.047993920743465424,
0.013168683275580406,
0.03357991203665733,
0.05201120674610138,
-0.01058161910623312,
-0.1046833023428917,
-0.04364974796772003,
0.0011576761025935411,
-0.08850626647472382,
-0.003247369546443224,
-0.027989521622657776,
-0.1740458905696869,
-0.028165969997644424,
0.10935790836811066,
-0.052836865186691284,
0.0773983746767044,
-0.01830858364701271,
-0.022179527208209038,
0.025536254048347473,
0.11085009574890137,
-0.12440977990627289,
-0.013663887046277523,
0.0727490782737732,
-0.026742391288280487,
0.12039840221405029,
-0.033459682017564774,
-0.006162567995488644,
0.036026857793331146,
0.05921823903918266,
0.07615929841995239,
-0.05244909226894379,
0.04011028632521629,
-0.008833697065711021,
-0.02096700668334961,
0.15849198400974274,
-0.15720829367637634,
-0.0305134579539299,
-0.02743588015437126,
-0.033041998744010925,
-0.020309647545218468,
-0.018988659605383873,
-0.029551394283771515,
-0.07471806555986404,
-0.0031509622931480408,
-0.06415760517120361,
-0.061868201941251755,
-0.11037664115428925,
-0.1007288321852684,
0.012570955790579319,
-0.05669616907835007,
-0.05890294536948204,
-0.1034822016954422,
-0.17426124215126038,
-0.023381222039461136,
0.021221380680799484,
-0.08187322318553925,
-0.01621338166296482,
0.012208442203700542,
-0.07360991835594177,
0.018304744735360146,
0.013153630308806896,
0.13240362703800201,
-0.02784666419029236,
0.059937119483947754,
-0.04052429646253586,
0.0666453093290329,
0.04441283643245697,
0.028068097308278084,
-0.07355187833309174,
0.05953341722488403,
-0.18947859108448029,
0.08226662129163742,
-0.06433156132698059,
-0.02277953177690506,
-0.08776652812957764,
-0.01251884363591671,
-0.07187116146087646,
-0.01708626188337803,
0.07251277565956116,
0.08784114569425583,
-0.20722287893295288,
0.001896154135465622,
0.08996396511793137,
-0.1302247941493988,
-0.003426212817430496,
0.1271568089723587,
0.010140585713088512,
0.022202063351869583,
0.037587057799100876,
0.11023092269897461,
0.00892760418355465,
-0.03359999880194664,
-0.12591871619224548,
-0.024746771901845932,
-0.03560902178287506,
0.0698256865143776,
0.07989360392093658,
-0.02763712778687477,
0.04518069699406624,
0.0122036412358284,
0.03337571397423744,
0.019010934978723526,
-0.02763683907687664,
-0.03341245651245117,
-0.002709457650780678,
-0.03566567972302437,
-0.07718349248170853,
0.010546374134719372,
0.023338377475738525,
-0.03409422188997269,
-0.08909600973129272,
-0.15063807368278503,
0.12059779465198517,
-0.07384925335645676,
0.0698767751455307,
-0.10549437254667282,
0.09923604130744934,
-0.1045137420296669,
0.01614300161600113,
-0.18081888556480408,
-0.0016571952728554606,
0.07628802210092545,
-0.071059450507164,
-0.019966894760727882,
-0.051075298339128494,
0.037353575229644775,
0.05651942640542984,
-0.06436160951852798,
-0.03136203810572624,
0.002118549309670925,
-0.032488029450178146,
-0.09005345404148102,
-0.10817737132310867,
-0.013728111982345581,
-0.0552193745970726,
0.07986533641815186,
-0.09081755578517914,
-0.004057972691953182,
0.10101282596588135,
0.11279026418924332,
0.03321683406829834,
-0.08867523074150085,
0.04094814509153366,
0.011835998855531216,
-0.0260486900806427,
-0.0961688905954361,
-0.017434339970350266,
0.032499976456165314,
-0.004037856124341488,
0.08839938044548035,
-0.11649824678897858,
-0.09248611330986023,
0.07086014747619629,
0.060573264956474304,
-0.06374410539865494,
0.098501056432724,
-0.01739179715514183,
-0.04980608448386192,
-0.09251575171947479,
-0.002972441492602229,
0.09453168511390686,
0.03656613826751709,
0.11761023849248886,
-0.07343752682209015,
0.0015205141389742494,
0.006344291381537914,
0.034115392714738846,
-0.06350245326757431,
0.08533015847206116,
0.05170995369553566,
-0.13588517904281616,
0.047513991594314575,
-0.026016462594270706,
-0.011180552653968334,
0.12976288795471191,
-0.0025308607146143913,
-0.09949357807636261,
-0.030201736837625504,
0.014516550116240978,
0.00460280105471611,
0.06291195005178452,
-0.0271951574832201,
0.01874789595603943,
0.02094869315624237,
0.010971168987452984,
0.07477547973394394,
-0.020879510790109634,
0.04394196346402168,
0.005929125007241964,
-0.044688500463962555,
0.05897872522473335,
0.058392371982336044,
-0.06872761249542236,
0.06519974023103714,
0.01794053055346012,
0.06223461776971817,
0.014702502638101578,
-0.023387407884001732,
-0.09755079448223114,
0.14965584874153137,
-0.0901404544711113,
-0.2198365330696106,
-0.15490806102752686,
0.021767083555459976,
-0.0062072379514575005,
-0.010213043540716171,
-0.003978234715759754,
-0.09275083243846893,
-0.1420813798904419,
-0.14691926538944244,
0.01063502300530672,
-0.007657911628484726,
0.0031708255410194397,
-0.013004721142351627,
-0.026851249858736992,
-0.0005794479511678219,
-0.11136995255947113,
0.04333877936005592,
0.032586127519607544,
-0.02663169428706169,
0.04147351160645485,
-0.005775026511400938,
0.11086420714855194,
0.07716260850429535,
-0.0461687371134758,
-0.004795840010046959,
0.025490952655673027,
0.13354536890983582,
-0.061770275235176086,
0.1301146000623703,
0.08087259531021118,
0.01296042650938034,
0.02889304794371128,
0.11159364879131317,
-0.032321348786354065,
-0.0397193543612957,
0.0651630312204361,
0.039485856890678406,
-0.060579121112823486,
-0.23687581717967987,
-0.0715244710445404,
-0.04048386216163635,
-0.03628356754779816,
0.018436074256896973,
0.07279305160045624,
-0.026351789012551308,
0.005837336648255587,
-0.074043408036232,
-0.0037218183279037476,
0.02077602595090866,
0.05668514966964722,
0.015551322139799595,
-0.02621513046324253,
0.07198910415172577,
-0.06621474027633667,
0.05193597823381424,
0.10371475666761398,
-0.07323387265205383,
0.21312201023101807,
-0.05942109227180481,
0.21456125378608704,
0.011366397142410278,
0.0905337929725647,
0.061161212623119354,
0.052038274705410004,
-0.05973780155181885,
0.013761477544903755,
-0.016797443851828575,
-0.03706267476081848,
-0.05094120278954506,
0.03461170196533203,
0.05423677712678909,
0.014178788289427757,
0.01009467151015997,
0.09570872783660889,
0.01866099424660206,
0.2059982270002365,
0.03511926904320717,
-0.11622558534145355,
-0.07885761559009552,
0.018223654478788376,
-0.07326231896877289,
-0.07442039996385574,
0.0022609680891036987,
0.1362593024969101,
-0.02360091730952263,
-0.02211303636431694,
-0.043946534395217896,
0.04818093031644821,
-0.06726114451885223,
-0.04390320926904678,
0.00915810652077198,
0.033139754086732864,
-0.02686484530568123,
0.06405483931303024,
-0.1619076430797577,
0.10935568809509277,
0.02781975269317627,
0.10138382017612457,
-0.02977357991039753,
0.01018577441573143,
0.0040990980342030525,
-0.017138179391622543,
0.06806069612503052,
-0.009094901382923126,
-0.12917488813400269,
-0.06402012705802917,
-0.12508675456047058,
0.024095548316836357,
0.09747663140296936,
0.03817267715930939,
0.10598870366811752,
-0.024169040843844414,
0.020446397364139557,
-0.014816582202911377,
0.06621130555868149,
-0.11482526361942291,
-0.11191556602716446,
0.034596037119627,
-0.07859744131565094,
0.03788599744439125,
-0.08102792501449585,
-0.07571795582771301,
-0.08018819987773895,
0.1790820211172104,
-0.12117686867713928,
-0.06670236587524414,
-0.10160024464130402,
0.07530002295970917,
0.10585279017686844,
-0.07344772666692734,
0.03517670929431915,
-0.0040894970297813416,
0.11089140176773071,
-0.0017425213009119034,
-0.02653365582227707,
0.030807189643383026,
-0.0111019192263484,
-0.16050609946250916,
-0.013204904273152351,
0.06718847900629044,
0.06705093383789062,
0.033801741898059845,
-0.024916188791394234,
0.06054846942424774,
0.03413475677371025,
-0.08228754997253418,
0.0046151550486683846,
0.07292188704013824,
0.0435468815267086,
0.1035965085029602,
-0.05495959520339966,
-0.04859040305018425,
-0.012806674465537071,
-0.1018131896853447,
0.10366253554821014,
0.18128268420696259,
-0.009956535883247852,
0.10734100639820099,
0.1380651295185089,
-0.07297448813915253,
-0.223581001162529,
-0.014207558706402779,
-0.015208079479634762,
0.08615484833717346,
-0.07660418003797531,
-0.15364596247673035,
0.0659647136926651,
0.09884338080883026,
-0.0030489685013890266,
-0.06164567917585373,
-0.16992174088954926,
-0.12392367422580719,
0.03226300701498985,
0.061036117374897,
0.07146411389112473,
-0.10480964183807373,
-0.04360479861497879,
-0.04171618074178696,
0.036675889045000076,
-0.010774463415145874,
-0.05788666754961014,
0.09830543398857117,
-0.00426741037517786,
-0.04101122170686722,
0.009344079531729221,
-0.0734468400478363,
0.10476183146238327,
-0.011861826293170452,
0.04014893248677254,
-0.06049268692731857,
0.0069099473766982555,
0.13683870434761047,
-0.002614742610603571,
0.1144627183675766,
0.022916626185178757,
0.05718456953763962,
-0.10272349417209625,
-0.03829225152730942,
-0.024859977886080742,
0.00571059063076973,
-0.02687973529100418,
-0.03919080272316933,
-0.09026618301868439,
0.11021469533443451,
0.05928521975874901,
-0.005013640969991684,
0.0802910327911377,
-0.008226705715060234,
-0.0843828022480011,
0.11630410701036453,
0.05615676939487457,
-0.016371536999940872,
0.047055356204509735,
0.006089142523705959,
-0.019067827612161636,
0.04593910649418831,
-0.09832218289375305,
0.02972986549139023,
0.05534122884273529,
0.021250270307064056,
0.09845668077468872,
-0.0009743132395669818,
-0.10640625655651093,
0.043125417083501816,
0.05458541959524155,
-0.07355301827192307,
-0.09012433886528015,
0.018449923023581505,
-0.04620341211557388,
-0.09430614113807678,
-0.04656984657049179,
0.08713138103485107,
-0.042353663593530655,
0.013430910184979439,
-0.01241606380790472,
0.07508783042430878,
0.009497396647930145,
0.17137956619262695,
0.08255663514137268,
-0.008269719779491425,
-0.0724572092294693,
0.11703221499919891,
0.09226161241531372,
-0.1604495346546173,
0.02912559174001217,
0.13548219203948975,
-0.11736415326595306,
-0.056514423340559006,
0.005120132118463516,
0.07080495357513428,
0.06392623484134674,
0.003275075927376747,
-0.06893312186002731,
-0.039389368146657944,
0.09147550165653229,
0.07318510860204697,
0.023206759244203568,
0.06074492633342743,
-0.0939776599407196,
-0.009945390745997429,
-0.10471154749393463,
0.11982811987400055,
0.055750273168087006,
0.028795011341571808,
-0.04968038573861122,
0.07834377139806747,
-0.0010774945840239525,
-0.017627142369747162,
-0.028254158794879913,
-0.01970030553638935,
-0.035644955933094025,
-0.01578333228826523,
-0.11130455136299133,
0.0459049791097641,
-0.07122857868671417,
-0.03389804810285568,
-0.0515420064330101,
0.014884424395859241,
-0.032969988882541656,
0.024181358516216278,
0.007918315008282661,
-0.05296790599822998,
-0.05211403965950012,
0.022662082687020302,
-0.06363698095083237,
-0.008292725309729576,
0.010001352056860924,
-0.06513510644435883,
0.09248054027557373,
0.029323183000087738,
0.01638694852590561,
-0.002111043781042099,
0.04235004633665085,
0.012138371355831623,
-0.0033237221650779247,
0.030708082020282745,
-0.007842851802706718,
-0.1548246443271637,
-0.00535878399387002,
0.014698744751513004,
-0.04967258870601654,
0.036363910883665085,
0.1300312876701355,
-0.07873515784740448,
0.09654515981674194,
-0.113817498087883,
0.009771415963768959,
-0.09093993157148361,
0.11894293129444122,
0.054416410624980927,
0.0643574669957161,
0.14224153757095337,
-0.07167082279920578,
0.005947295576334,
-0.12006330490112305,
0.012535608373582363,
0.01415932085365057,
-0.01714995875954628,
-0.06918670237064362,
0.013459490612149239,
0.06781148165464401,
-0.05876757577061653,
0.09475497901439667,
0.011794495396316051,
-0.08974224328994751,
0.060628391802310944,
-0.014162964187562466,
-0.024762606248259544,
0.025020672008395195,
0.16748744249343872,
0.04516005143523216,
-0.004864111077040434,
0.04917437583208084,
0.0016299746930599213,
0.023404013365507126,
0.08369830250740051,
0.08404403179883957,
0.17140062153339386,
0.1372908055782318,
0.08984576910734177,
-0.0630752295255661,
-0.15278442203998566,
-0.04470585286617279,
0.15624943375587463,
-0.11683499813079834,
0.058638907968997955,
0.034428928047418594,
0.014487076550722122,
0.1431707739830017,
-0.1658245176076889,
0.03504034876823425,
0.002599586034193635,
-0.0896013155579567,
-0.026726577430963516,
-0.14195196330547333,
-0.07960040122270584,
0.01261313259601593,
0.0197032168507576,
-0.107962965965271,
0.024228205904364586,
0.10052628815174103,
0.030870024114847183,
-0.007024703081697226,
0.12509261071681976,
-0.11193668842315674,
-0.055460017174482346,
0.024004142731428146,
0.011615034192800522,
-0.005440195091068745,
0.053986806422472,
-0.02788134664297104,
-0.0034349930938333273,
0.003014518879354,
0.07232823967933655,
0.06966481357812881,
0.07271111756563187,
0.046331778168678284,
-0.02076582983136177,
-0.06593646109104156,
0.00732533261179924,
0.03636053577065468,
0.041510939598083496,
0.19344353675842285,
0.03775448352098465,
-0.021137913689017296,
0.012265503406524658,
0.09749474376440048,
-0.006214453838765621,
-0.0340970978140831,
-0.12894049286842346,
0.04128633439540863,
0.06420610845088959,
0.0158661138266325,
0.039476849138736725,
-0.10812538862228394,
0.013458918780088425,
0.14184266328811646,
0.12398207932710648,
-0.027392618358135223,
-0.0264083631336689,
0.03432741016149521,
-0.017259839922189713,
-0.03195931017398834,
0.1015346422791481,
0.08369927853345871,
0.10231563448905945,
-0.05404319986701012,
0.018141768872737885,
-0.09141571819782257,
-0.06158379465341568,
-0.0990028977394104,
0.1251111477613449,
-0.04473871365189552,
-0.018043868243694305,
-0.05738652125000954,
0.04937369376420975,
-0.02329777367413044,
-0.19231745600700378,
0.010304681956768036,
-0.0530429407954216,
-0.11226244270801544,
0.011808987706899643,
-0.05791885405778885,
0.01719776540994644,
0.019442666321992874,
0.0224451944231987,
0.05914134904742241,
0.09102027863264084,
0.054522983729839325,
-0.06755184382200241,
-0.03026455268263817,
0.09951456636190414,
-0.07232916355133057,
0.11958497762680054,
0.007421409245580435,
0.08440917730331421,
0.10860787332057953,
0.028010232374072075,
-0.12421807646751404,
0.0775192379951477,
0.03484959900379181,
-0.03284599259495735,
0.03177555650472641,
0.15677525103092194,
-0.00882539339363575,
0.12518249452114105,
0.05585215985774994,
0.022301508113741875,
0.02658555656671524,
-0.021564273163676262,
0.001530386391095817,
-0.12538152933120728,
0.07469373941421509,
-0.07169932126998901,
0.15596768260002136,
0.1451331377029419,
-0.010634499602019787,
0.011201154440641403,
-0.03914797306060791,
-0.00344905536621809,
-0.02067577838897705,
0.08790424466133118,
0.017769919708371162,
-0.22445252537727356,
0.014915083535015583,
-0.1262826770544052,
0.04237742722034454,
-0.2418835163116455,
-0.06432022899389267,
-0.012411459349095821,
-0.04843166470527649,
-0.0334189273416996,
0.12031956017017365,
0.007496800739318132,
0.018105443567037582,
-0.02352619543671608,
0.06959456205368042,
-0.0013901405036449432,
0.05909951776266098,
-0.14759892225265503,
-0.07368937134742737
] |
null | null |
transformers
|
# Scandinavian Roberta Base - MC4
## Description
This is a sample reference model for Flax/Jax training using only on the MC4. It is trained for roughly three day on a TPU v3-8. Training procedure...
---
## Description
My description
|
{"language": "da", "license": "cc-by-4.0", "tags": ["scandinavian", "roberta"], "pipeline_tag": "fill-mask", "widget": [{"text": "P\u00e5 biblioteket kan du l\u00e5ne en <mask>."}]}
|
fill-mask
|
DDSC/roberta-base-scandinavian
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"scandinavian",
"da",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"da"
] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #scandinavian #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
|
# Scandinavian Roberta Base - MC4
## Description
This is a sample reference model for Flax/Jax training using only on the MC4. It is trained for roughly three day on a TPU v3-8. Training procedure...
---
## Description
My description
|
[
"# Scandinavian Roberta Base - MC4",
"## Description\n\nThis is a sample reference model for Flax/Jax training using only on the MC4. It is trained for roughly three day on a TPU v3-8. Training procedure...\n\n---",
"## Description\nMy description"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #scandinavian #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Scandinavian Roberta Base - MC4",
"## Description\n\nThis is a sample reference model for Flax/Jax training using only on the MC4. It is trained for roughly three day on a TPU v3-8. Training procedure...\n\n---",
"## Description\nMy description"
] |
[
60,
9,
42,
4
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #scandinavian #da #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n# Scandinavian Roberta Base - MC4## Description\n\nThis is a sample reference model for Flax/Jax training using only on the MC4. It is trained for roughly three day on a TPU v3-8. Training procedure...\n\n---## Description\nMy description"
] |
[
-0.058300621807575226,
-0.04545566067099571,
-0.0004232299979776144,
0.11357974261045456,
0.09866894036531448,
0.02937859110534191,
0.06120867654681206,
0.06480592489242554,
-0.1310795247554779,
-0.019131962209939957,
0.20235499739646912,
0.11281128227710724,
0.021454039961099625,
0.08081881701946259,
0.024455105885863304,
-0.2712543308734894,
0.014263501390814781,
0.03870032727718353,
-0.1424899399280548,
0.1128162294626236,
0.1360163390636444,
-0.0323980338871479,
0.024838058277964592,
-0.001206042361445725,
-0.19182875752449036,
-0.006832705810666084,
0.03890880197286606,
-0.07493297755718231,
0.16726073622703552,
0.028900448232889175,
0.13557621836662292,
0.06436625868082047,
0.10819249600172043,
-0.026209481060504913,
0.05506477132439613,
0.011867969296872616,
-0.05726083740592003,
0.04282575473189354,
0.009146836586296558,
0.09653715789318085,
0.20708627998828888,
0.014755460433661938,
0.08448922634124756,
0.017124144360423088,
-0.06064748391509056,
-0.14508415758609772,
-0.056375857442617416,
0.020184097811579704,
0.06473729014396667,
0.05019201710820198,
0.036846090108156204,
0.15187133848667145,
-0.20713908970355988,
0.08734729886054993,
0.038054533302783966,
-0.337969034910202,
-0.10859771817922592,
0.19870570302009583,
0.1411353051662445,
0.050833459943532944,
-0.08638948202133179,
0.020745467394590378,
0.046830274164676666,
0.0900692418217659,
0.14439009130001068,
-0.0374503955245018,
-0.006539502646774054,
0.05569237098097801,
-0.11890695244073868,
0.0040441551245749,
0.22629106044769287,
0.006765944883227348,
-0.03981374576687813,
0.00438222149387002,
0.055134113878011703,
-0.07278730720281601,
-0.03248833119869232,
-0.06523507833480835,
-0.04147636145353317,
0.019059328362345695,
-0.08254893124103546,
-0.03486653044819832,
-0.12402411550283432,
-0.10916726291179657,
-0.06403124332427979,
0.0714372843503952,
-0.011493919417262077,
0.08218222111463547,
-0.028460102155804634,
0.07048017531633377,
0.003083638846874237,
-0.046128299087285995,
0.018093591555953026,
-0.0439450666308403,
-0.03769783675670624,
-0.07608356326818466,
-0.02709733322262764,
-0.11388623714447021,
0.036961231380701065,
0.013502052053809166,
0.08879075944423676,
-0.012461191043257713,
0.07418331503868103,
0.08593786507844925,
-0.12251131236553192,
0.035114966332912445,
-0.014581703580915928,
-0.0356966033577919,
0.023507775738835335,
-0.0022725958842784166,
-0.0835113525390625,
-0.020444948226213455,
-0.14508982002735138,
-0.037010353058576584,
0.04794789105653763,
0.03846442699432373,
-0.07745525240898132,
0.08990126848220825,
0.037375807762145996,
0.007417965680360794,
0.03573092073202133,
-0.0599372424185276,
-0.04042184725403786,
-0.025830237194895744,
-0.02517809346318245,
-0.07269926369190216,
0.08001650124788284,
-0.01302249263972044,
-0.016451582312583923,
0.028150193393230438,
-0.07507015764713287,
-0.026043478399515152,
-0.11552820354700089,
-0.10171648859977722,
0.0231291726231575,
-0.05059707537293434,
-0.008033095858991146,
-0.1327936351299286,
-0.07032123953104019,
0.01073669083416462,
0.11652486026287079,
-0.04857834056019783,
-0.10804038494825363,
-0.014666537754237652,
-0.07423672825098038,
0.04774424806237221,
-0.017302842810750008,
0.17787376046180725,
-0.052606962621212006,
0.07921982556581497,
-0.030618790537118912,
0.10914559662342072,
-0.02545466087758541,
0.01724247634410858,
-0.01314863096922636,
-0.014931566081941128,
-0.10903763025999069,
-0.07649758458137512,
-0.07777237892150879,
0.08954604715108871,
-0.04968583211302757,
-0.10230305045843124,
-0.03964873403310776,
0.06422235071659088,
0.004518403671681881,
0.07822951674461365,
-0.14059613645076752,
0.01960345171391964,
0.06351176649332047,
-0.05644151195883751,
-0.04692032188177109,
0.11819814890623093,
-0.006792877335101366,
0.14390170574188232,
0.008294139988720417,
0.02235463820397854,
0.006493773311376572,
-0.15514199435710907,
0.18920329213142395,
0.07607387751340866,
-0.13463659584522247,
-0.1752249002456665,
0.06895401328802109,
0.0017373000737279654,
-0.11460352689027786,
0.04066401347517967,
-0.034335341304540634,
0.033220887184143066,
-0.06731216609477997,
-0.0650988444685936,
0.00771792558953166,
-0.07789266854524612,
0.01784565858542919,
0.020628495141863823,
0.08156166970729828,
-0.09669013321399689,
-0.10137353092432022,
-0.032753828912973404,
0.0336587093770504,
0.023457864299416542,
0.011738335713744164,
-0.03954078257083893,
0.03120473586022854,
-0.06012383848428726,
-0.044540416449308395,
-0.05318950116634369,
-0.04593156650662422,
-0.060870129615068436,
0.17477411031723022,
0.025918887928128242,
0.18452946841716766,
0.08590162545442581,
0.03202667459845543,
-0.0340132899582386,
0.06831935048103333,
-0.052102021872997284,
0.0025712556671351194,
-0.07032427936792374,
-0.17177537083625793,
-0.058491289615631104,
-0.08816725015640259,
-0.008924364112317562,
-0.20067209005355835,
0.003127519739791751,
0.03211933746933937,
0.030484460294246674,
-0.011469205841422081,
0.040488164871931076,
-0.03538275882601738,
0.03814439848065376,
-0.0008220302988775074,
-0.025550028309226036,
0.11300821602344513,
0.026301860809326172,
-0.018418453633785248,
0.17094799876213074,
-0.013719945214688778,
0.15424112975597382,
0.12102675437927246,
-0.0727841779589653,
-0.12654441595077515,
0.05823744833469391,
-0.04272504523396492,
-0.009416096843779087,
-0.02675938792526722,
0.049215540289878845,
0.022535249590873718,
-0.023776836693286896,
0.17797577381134033,
-0.03378485143184662,
0.001807763590477407,
0.032309290021657944,
-0.03403298929333687,
-0.06631290912628174,
0.034562524408102036,
0.13752007484436035,
-0.08908246457576752,
0.042099084705114365,
0.07399987429380417,
-0.12874430418014526,
0.2086193710565567,
0.048171140253543854,
-0.044561345130205154,
-0.044157449156045914,
-0.028073057532310486,
0.02539728954434395,
0.20734232664108276,
-0.09579581767320633,
-0.046853017061948776,
0.012025142088532448,
0.03376585245132446,
0.04984615370631218,
-0.16334359347820282,
-0.10104064643383026,
0.01277259923517704,
-0.03711092472076416,
-0.010318610817193985,
0.09429652988910675,
-0.08753278106451035,
0.07743865251541138,
0.05348362401127815,
-0.1856507658958435,
0.09553619474172592,
0.009313386864960194,
-0.0767885223031044,
0.17587333917617798,
-0.073884516954422,
-0.13174183666706085,
-0.15920808911323547,
-0.04294348508119583,
0.025271087884902954,
-0.012746724300086498,
0.03767066448926926,
-0.08332959562540054,
-0.05920673534274101,
-0.022632023319602013,
-0.0666808933019638,
-0.09173282980918884,
0.07853832095861435,
-0.11598063260316849,
0.06016857922077179,
-0.001700491993688047,
-0.07339168339967728,
0.016853632405400276,
-0.04218226671218872,
-0.07825396209955215,
0.04223484918475151,
-0.11224950104951859,
0.04583553597331047,
0.09414596110582352,
-0.026506084948778152,
0.04880557581782341,
-0.009807905182242393,
0.0961696058511734,
-0.05101737007498741,
0.039801765233278275,
0.09777181595563889,
0.023984845727682114,
-0.010607654228806496,
0.06929662823677063,
0.056510116904973984,
-0.025193573907017708,
0.005203483626246452,
-0.011909403838217258,
-0.11438232660293579,
-0.1486145704984665,
-0.02594824880361557,
-0.06008441746234894,
0.017460862174630165,
0.054258015006780624,
0.07330069690942764,
0.09105616807937622,
0.07960141450166702,
0.10399466753005981,
0.005198454950004816,
-0.019767751917243004,
0.06273636966943741,
0.0011508105089887977,
-0.022796260192990303,
0.09330977499485016,
-0.05274317413568497,
-0.009792511351406574,
0.015081317164003849,
0.08688344061374664,
0.07116013765335083,
0.03698636218905449,
-0.03442607447504997,
0.09859789162874222,
0.21079394221305847,
0.07265619933605194,
0.10728691518306732,
0.02106080763041973,
-0.0868576392531395,
0.006829331628978252,
-0.03598244860768318,
-0.028455356135964394,
0.03979034349322319,
0.03102802112698555,
-0.035829197615385056,
-0.08050325512886047,
-0.0314444936811924,
-0.03452076017856598,
0.22128155827522278,
0.02337711490690708,
-0.26684385538101196,
-0.11748404055833817,
-0.061933014541864395,
-0.03191724792122841,
-0.017627954483032227,
0.04637588933110237,
0.09280487895011902,
-0.0826801210641861,
-0.0655699074268341,
-0.08568543195724487,
0.08763185143470764,
0.0940287709236145,
-0.018829766660928726,
0.09803774952888489,
0.06808775663375854,
-0.025101376697421074,
0.06753255426883698,
-0.2536240220069885,
0.29869788885116577,
-0.0015797491651028395,
0.11331445723772049,
-0.03388024866580963,
-0.019552098587155342,
0.037810735404491425,
0.20610962808132172,
0.1504122018814087,
-0.021592019125819206,
-0.1863124966621399,
-0.11247380822896957,
-0.006885457783937454,
-0.0056067598052322865,
-0.03490768000483513,
0.060740116983652115,
0.04808666184544563,
-0.030053457245230675,
0.035923853516578674,
0.02021508291363716,
0.04427299275994301,
-0.12568776309490204,
-0.03204820305109024,
-0.001394264167174697,
0.010580855421721935,
-0.09040989726781845,
-0.08118635416030884,
-0.08737751841545105,
-0.07872623950242996,
0.2408662736415863,
-0.06972166895866394,
0.042116425931453705,
-0.1424168199300766,
-0.015550018288195133,
0.04568563401699066,
-0.002939068479463458,
0.03297284618020058,
-0.001013112487271428,
0.05689811334013939,
-0.03313840180635452,
-0.023594047874212265,
0.05834948271512985,
-0.11655151098966599,
0.016509030014276505,
-0.04853903129696846,
0.07072748243808746,
-0.014498960226774216,
0.04531361162662506,
0.06196432188153267,
-0.014632139354944229,
-0.0833389163017273,
-0.040058817714452744,
0.05941979959607124,
-0.14816907048225403,
0.0749097689986229,
-0.09288746863603592,
-0.1019744724035263,
0.038940608501434326,
0.0021555661223828793,
-0.08295094966888428,
0.13377462327480316,
0.12761296331882477,
-0.1340196132659912,
0.12743598222732544,
0.04293421283364296,
-0.03063439205288887,
-0.28853142261505127,
-0.00462110573425889,
0.0026455321349203587,
0.09187349677085876,
0.0036048712208867073,
-0.06683815270662308,
0.06602378189563751,
0.035988133400678635,
0.004555353429168463,
-0.012560270726680756,
-0.15644696354866028,
-0.09501231461763382,
0.08925587683916092,
0.12819643318653107,
0.23397694528102875,
-0.04383953660726547,
0.016052648425102234,
-0.015235991217195988,
-0.08293592929840088,
0.019258810207247734,
-0.15034887194633484,
0.10406988859176636,
-0.04849232733249664,
0.0932578518986702,
-0.0005266981897875667,
-0.08545593917369843,
0.0894402265548706,
-0.030553746968507767,
0.03782056272029877,
-0.11695104092359543,
-0.06478837877511978,
0.093255914747715,
-0.02739681303501129,
0.06333159655332565,
-0.09745343774557114,
-0.009062620811164379,
-0.09788896888494492,
-0.028431372717022896,
-0.05388723313808441,
0.08501526713371277,
-0.02387121319770813,
-0.04974189028143883,
0.012637073174118996,
0.07379240542650223,
0.008184106089174747,
-0.041311442852020264,
0.05965687707066536,
-0.08978567272424698,
0.14155130088329315,
0.12174331396818161,
0.12068136781454086,
-0.0666932538151741,
-0.07492044568061829,
0.03161247819662094,
-0.050724197179079056,
0.10119197517633438,
-0.13645276427268982,
-0.04480093717575073,
0.040075208991765976,
-0.02098678983747959,
0.07626061141490936,
0.07839157432317734,
-0.02405191957950592,
0.007594773545861244,
0.14157705008983612,
-0.12746912240982056,
0.07598095387220383,
0.005790777038782835,
-0.08607228100299835,
-0.00638063158839941,
-0.040778547525405884,
0.08789803832769394,
-0.08532101660966873,
0.027257908135652542,
-0.024889850988984108,
0.01575399562716484,
-0.13487645983695984,
0.12967032194137573,
0.10123424977064133,
0.07797503471374512,
-0.011913732625544071,
-0.022348735481500626,
-0.0019827771466225386,
-0.09169957041740417,
-0.03687277063727379,
0.10767493396997452,
-0.11607007682323456,
-0.11271379142999649,
0.1898861676454544,
0.07485847175121307,
-0.058631327003240585,
-0.05563715845346451,
-0.12094374001026154,
-0.08976572006940842,
0.05220600962638855,
0.15903405845165253,
0.06391055881977081,
0.02263008989393711,
-0.05945952609181404,
0.0028340055141597986,
-0.127644881606102,
0.06133672222495079,
-0.07714352011680603,
-0.021451584994792938,
-0.09965819120407104,
0.2646712362766266,
-0.022951822727918625,
0.11348121613264084,
-0.06759923696517944,
0.03653939440846443,
-0.07859203219413757,
0.030850430950522423,
-0.000514091516379267,
-0.04297894611954689,
-0.021507563069462776,
-0.007384141907095909,
-0.002213542815297842,
-0.017898021265864372,
-0.08791882544755936,
0.046812787652015686,
-0.09969227015972137,
0.024518048390746117,
0.024824900552630424,
0.0023267995566129684,
0.04115915670990944,
-0.04269108548760414,
0.009402021765708923,
-0.029898883774876595,
0.013982227072119713,
-0.002313125878572464,
0.028696615248918533,
0.06989673525094986,
-0.05310319364070892,
0.052040982991456985,
-0.030447732657194138,
-0.025790754705667496,
0.05552507936954498,
-0.015907561406493187,
-0.007684855256229639,
0.013844304718077183,
0.042606208473443985,
0.07008963078260422,
0.10164941847324371,
-0.09443607181310654,
-0.10310551524162292,
-0.02806987427175045,
-0.06192092224955559,
-0.048698581755161285,
0.07217083871364594,
0.06579095125198364,
0.06114709749817848,
0.13799558579921722,
-0.12288500368595123,
0.04493694752454758,
-0.09296402335166931,
0.009852981194853783,
-0.028617247939109802,
-0.05732642114162445,
-0.03207824006676674,
-0.0021510336082428694,
0.05648065358400345,
-0.06031312048435211,
0.10180076211690903,
0.11565904319286346,
-0.03223653510212898,
0.04281628504395485,
-0.055232781916856766,
0.08238858729600906,
-0.004882545676082373,
0.2259405255317688,
-0.009212661534547806,
0.014016319997608662,
-0.01457235962152481,
0.0941997542977333,
-0.028256017714738846,
0.03256077691912651,
0.1254134178161621,
0.03487546369433403,
0.08049888163805008,
0.0911237895488739,
0.05009915679693222,
-0.014090022072196007,
0.016905462369322777,
0.03377971053123474,
0.023258423432707787,
0.023105692118406296,
-0.03595799580216408,
0.01729384809732437,
0.22795042395591736,
-0.09545934200286865,
0.028349237516522408,
-0.015604224056005478,
-0.06474525481462479,
-0.14412900805473328,
-0.0954463928937912,
-0.07501067966222763,
-0.07795017957687378,
0.05444737896323204,
-0.04014047980308533,
-0.02047024294734001,
0.021685563027858734,
0.049935270100831985,
-0.00404453044757247,
0.10792020708322525,
-0.031638506799936295,
0.02023349329829216,
0.031872473657131195,
-0.015940792858600616,
-0.03765840828418732,
-0.06882845610380173,
0.04214145988225937,
-0.07753971219062805,
-0.15203715860843658,
-0.019937193021178246,
-0.11080817878246307,
-0.0038226956967264414,
0.011434881947934628,
0.008951411582529545,
-0.03396648168563843,
-0.008355778641998768,
0.00871200766414404,
0.088542640209198,
0.08432012051343918,
0.03481444716453552,
-0.060711782425642014,
0.00044246227480471134,
0.13340911269187927,
-0.0012930679367855191,
-0.08218338340520859,
-0.1393798589706421,
0.11482372134923935,
0.028384355828166008,
0.04810182750225067,
-0.02082919143140316,
0.055912382900714874,
0.07419588416814804,
0.2780490815639496,
0.24637767672538757,
-0.050940122455358505,
0.020693941041827202,
0.03081613965332508,
-0.02501952275633812,
-0.03682142496109009,
0.13188834488391876,
0.0716608464717865,
0.08443359285593033,
-0.07484877854585648,
-0.13430623710155487,
-0.09031671285629272,
-0.0007009082473814487,
-0.0889315977692604,
-0.008127814158797264,
0.04382207989692688,
-0.045533113181591034,
-0.047100409865379333,
0.08083640038967133,
0.049722395837306976,
0.01227739080786705,
0.13254936039447784,
-0.004395956173539162,
-0.09009699523448944,
-0.03902791813015938,
0.012947283685207367,
-0.041904572397470474,
0.040014076977968216,
-0.056292399764060974,
-0.0025118673220276833,
0.004386351443827152,
-0.006103621795773506,
-0.10955791175365448,
-0.09231413900852203,
0.13354337215423584,
0.09230168163776398,
0.22450026869773865,
-0.05055658146739006,
0.1240030974149704,
0.09683694690465927,
0.024346256628632545,
-0.016887981444597244,
0.08345102518796921,
0.0038062359672039747,
-0.04162362962961197,
0.04210978001356125,
0.05056674778461456,
-0.0438418947160244,
0.015750324353575706,
-0.00036865449510514736,
-0.09267589449882507,
0.054167740046978,
-0.14858479797840118,
-0.07852102816104889,
-0.014318712055683136,
0.11673755943775177,
-0.03524182736873627,
0.1272352933883667,
0.20086845755577087,
-0.007761949673295021,
-0.04810497909784317,
-0.09963422268629074,
0.048391588032245636,
0.05218939855694771,
-0.019932586699724197,
-0.08831609040498734,
-0.06757796555757523,
-0.028846396133303642,
-0.17517250776290894,
-0.1148407980799675,
-0.20475557446479797,
-0.10142497718334198,
-0.049046099185943604,
-0.10541792958974838,
-0.04622698947787285,
0.051786474883556366,
0.13028065860271454,
0.05587467923760414,
-0.06746123731136322,
0.02721412107348442,
-0.02195967175066471,
0.06744783371686935,
-0.20140932500362396,
-0.11858446896076202
] |
null | null |
transformers
|
# RoBERTa base model for Hindi language
Pretrained model on Hindi language using a masked language modeling (MLM) objective. [A more interactive & comparison demo is available here](https://huggingface.co/spaces/flax-community/roberta-hindi).
> This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/pretrain-roberta-from-scratch-in-hindi/7091), organized by [Hugging Face](https://huggingface.co/) and TPU usage sponsored by Google.
## Model description
RoBERTa Hindi is a transformers model pretrained on a large corpus of Hindi data(a combination of **mc4, oscar and indic-nlp** datasets)
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='flax-community/roberta-hindi')
>>> unmasker("हम आपके सुखद <mask> की कामना करते हैं")
[{'score': 0.3310680091381073,
'sequence': 'हम आपके सुखद सफर की कामना करते हैं',
'token': 1349,
'token_str': ' सफर'},
{'score': 0.15317578613758087,
'sequence': 'हम आपके सुखद पल की कामना करते हैं',
'token': 848,
'token_str': ' पल'},
{'score': 0.07826550304889679,
'sequence': 'हम आपके सुखद समय की कामना करते हैं',
'token': 453,
'token_str': ' समय'},
{'score': 0.06304813921451569,
'sequence': 'हम आपके सुखद पहल की कामना करते हैं',
'token': 404,
'token_str': ' पहल'},
{'score': 0.058322224766016006,
'sequence': 'हम आपके सुखद अवसर की कामना करते हैं',
'token': 857,
'token_str': ' अवसर'}]
```
## Training data
The RoBERTa Hindi model was pretrained on the reunion of the following datasets:
- [OSCAR](https://huggingface.co/datasets/oscar) is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.
- [mC4](https://huggingface.co/datasets/mc4) is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus.
- [IndicGLUE](https://indicnlp.ai4bharat.org/indic-glue/) is a natural language understanding benchmark.
- [Samanantar](https://indicnlp.ai4bharat.org/samanantar/) is a parallel corpora collection for Indic language.
- [Hindi Text Short and Large Summarization Corpus](https://www.kaggle.com/disisbig/hindi-text-short-and-large-summarization-corpus) is a collection of ~180k articles with their headlines and summary collected from Hindi News Websites.
- [Hindi Text Short Summarization Corpus](https://www.kaggle.com/disisbig/hindi-text-short-summarization-corpus) is a collection of ~330k articles with their headlines collected from Hindi News Websites.
- [Old Newspapers Hindi](https://www.kaggle.com/crazydiv/oldnewspapershindi) is a cleaned subset of HC Corpora newspapers.
## Training procedure
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with `<s>` and the end of one by `</s>`.
- We had to perform cleanup of **mC4** and **oscar** datasets by removing all non hindi (non Devanagari) characters from the datasets.
- We tried to filter out evaluation set of WikiNER of [IndicGlue](https://indicnlp.ai4bharat.org/indic-glue/) benchmark by [manual labelling](https://github.com/amankhandelia/roberta_hindi/blob/master/wikiner_incorrect_eval_set.csv) where the actual labels were not correct and modifying the [downstream evaluation dataset](https://github.com/amankhandelia/roberta_hindi/blob/master/utils.py).
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores).A randomized shuffle of combined dataset of **mC4, oscar** and other datasets listed above was used to train the model. Training logs are present in [wandb](https://wandb.ai/wandb/hf-flax-roberta-hindi).
## Evaluation Results
RoBERTa Hindi is evaluated on various downstream tasks. The results are summarized below.
| Task | Task Type | IndicBERT | HindiBERTa | Indic Transformers Hindi BERT | RoBERTa Hindi Guj San | RoBERTa Hindi |
|-------------------------|----------------------|-----------|------------|-------------------------------|-----------------------|---------------|
| BBC News Classification | Genre Classification | **76.44** | 66.86 | **77.6** | 64.9 | 73.67 |
| WikiNER | Token Classification | - | 90.68 | **95.09** | 89.61 | **92.76** |
| IITP Product Reviews | Sentiment Analysis | **78.01** | 73.23 | **78.39** | 66.16 | 75.53 |
| IITP Movie Reviews | Sentiment Analysis | 60.97 | 52.26 | **70.65** | 49.35 | **61.29** |
## Team Members
- Aman K ([amankhandelia](https://huggingface.co/amankhandelia))
- Haswanth Aekula ([hassiahk](https://huggingface.co/hassiahk))
- Kartik Godawat ([dk-crazydiv](https://huggingface.co/dk-crazydiv))
- Prateek Agrawal ([prateekagrawal](https://huggingface.co/prateekagrawal))
- Rahul Dev ([mlkorra](https://huggingface.co/mlkorra))
## Credits
Huge thanks to Hugging Face 🤗 & Google Jax/Flax team for such a wonderful community week, especially for providing such massive computing resources. Big thanks to [Suraj Patil](https://huggingface.co/valhalla) & [Patrick von Platen](https://huggingface.co/patrickvonplaten) for mentoring during the whole week.
<img src=https://pbs.twimg.com/media/E443fPjX0AY1BsR.jpg:medium>
|
{"widget": [{"text": "\u092e\u0941\u091d\u0947 \u0909\u0928\u0938\u0947 \u092c\u093e\u0924 \u0915\u0930\u0928\u093e <mask> \u0905\u091a\u094d\u091b\u093e \u0932\u0917\u093e"}, {"text": "\u0939\u092e \u0906\u092a\u0915\u0947 \u0938\u0941\u0916\u0926 <mask> \u0915\u0940 \u0915\u093e\u092e\u0928\u093e \u0915\u0930\u0924\u0947 \u0939\u0948\u0902"}, {"text": "\u0938\u092d\u0940 \u0905\u091a\u094d\u091b\u0940 \u091a\u0940\u091c\u094b\u0902 \u0915\u093e \u090f\u0915 <mask> \u0939\u094b\u0924\u093e \u0939\u0948"}]}
|
fill-mask
|
flax-community/roberta-hindi
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us
|
RoBERTa base model for Hindi language
=====================================
Pretrained model on Hindi language using a masked language modeling (MLM) objective. A more interactive & comparison demo is available here.
>
> This is part of the
> Flax/Jax Community Week, organized by Hugging Face and TPU usage sponsored by Google.
>
>
>
Model description
-----------------
RoBERTa Hindi is a transformers model pretrained on a large corpus of Hindi data(a combination of mc4, oscar and indic-nlp datasets)
### How to use
You can use this model directly with a pipeline for masked language modeling:
Training data
-------------
The RoBERTa Hindi model was pretrained on the reunion of the following datasets:
* OSCAR is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.
* mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus.
* IndicGLUE is a natural language understanding benchmark.
* Samanantar is a parallel corpora collection for Indic language.
* Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles with their headlines and summary collected from Hindi News Websites.
* Hindi Text Short Summarization Corpus is a collection of ~330k articles with their headlines collected from Hindi News Websites.
* Old Newspapers Hindi is a cleaned subset of HC Corpora newspapers.
Training procedure
------------------
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of
the model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked
with '~~' and the end of one by '~~'.
* We had to perform cleanup of mC4 and oscar datasets by removing all non hindi (non Devanagari) characters from the datasets.
* We tried to filter out evaluation set of WikiNER of IndicGlue benchmark by manual labelling where the actual labels were not correct and modifying the downstream evaluation dataset.
The details of the masking procedure for each sentence are the following:
* 15% of the tokens are masked.
* In 80% of the cases, the masked tokens are replaced by ''.
* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
* In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).
### Pretraining
The model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores).A randomized shuffle of combined dataset of mC4, oscar and other datasets listed above was used to train the model. Training logs are present in wandb.
Evaluation Results
------------------
RoBERTa Hindi is evaluated on various downstream tasks. The results are summarized below.
Team Members
------------
* Aman K (amankhandelia)
* Haswanth Aekula (hassiahk)
* Kartik Godawat (dk-crazydiv)
* Prateek Agrawal (prateekagrawal)
* Rahul Dev (mlkorra)
Credits
-------
Huge thanks to Hugging Face & Google Jax/Flax team for such a wonderful community week, especially for providing such massive computing resources. Big thanks to Suraj Patil & Patrick von Platen for mentoring during the whole week.
<img src=URL
|
[
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data\n-------------\n\n\nThe RoBERTa Hindi model was pretrained on the reunion of the following datasets:\n\n\n* OSCAR is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n* mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus.\n* IndicGLUE is a natural language understanding benchmark.\n* Samanantar is a parallel corpora collection for Indic language.\n* Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles with their headlines and summary collected from Hindi News Websites.\n* Hindi Text Short Summarization Corpus is a collection of ~330k articles with their headlines collected from Hindi News Websites.\n* Old Newspapers Hindi is a cleaned subset of HC Corpora newspapers.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'.\n\n\n* We had to perform cleanup of mC4 and oscar datasets by removing all non hindi (non Devanagari) characters from the datasets.\n* We tried to filter out evaluation set of WikiNER of IndicGlue benchmark by manual labelling where the actual labels were not correct and modifying the downstream evaluation dataset.\n\n\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores).A randomized shuffle of combined dataset of mC4, oscar and other datasets listed above was used to train the model. Training logs are present in wandb.\n\n\nEvaluation Results\n------------------\n\n\nRoBERTa Hindi is evaluated on various downstream tasks. The results are summarized below.\n\n\n\nTeam Members\n------------\n\n\n* Aman K (amankhandelia)\n* Haswanth Aekula (hassiahk)\n* Kartik Godawat (dk-crazydiv)\n* Prateek Agrawal (prateekagrawal)\n* Rahul Dev (mlkorra)\n\n\nCredits\n-------\n\n\nHuge thanks to Hugging Face & Google Jax/Flax team for such a wonderful community week, especially for providing such massive computing resources. Big thanks to Suraj Patil & Patrick von Platen for mentoring during the whole week.\n\n\n<img src=URL"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data\n-------------\n\n\nThe RoBERTa Hindi model was pretrained on the reunion of the following datasets:\n\n\n* OSCAR is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n* mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus.\n* IndicGLUE is a natural language understanding benchmark.\n* Samanantar is a parallel corpora collection for Indic language.\n* Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles with their headlines and summary collected from Hindi News Websites.\n* Hindi Text Short Summarization Corpus is a collection of ~330k articles with their headlines collected from Hindi News Websites.\n* Old Newspapers Hindi is a cleaned subset of HC Corpora newspapers.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50265. The inputs of\nthe model take pieces of 512 contiguous token that may span over documents. The beginning of a new document is marked\nwith '~~' and the end of one by '~~'.\n\n\n* We had to perform cleanup of mC4 and oscar datasets by removing all non hindi (non Devanagari) characters from the datasets.\n* We tried to filter out evaluation set of WikiNER of IndicGlue benchmark by manual labelling where the actual labels were not correct and modifying the downstream evaluation dataset.\n\n\nThe details of the masking procedure for each sentence are the following:\n\n\n* 15% of the tokens are masked.\n* In 80% of the cases, the masked tokens are replaced by ''.\n* In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n* In the 10% remaining cases, the masked tokens are left as is.\nContrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed).",
"### Pretraining\n\n\nThe model was trained on Google Cloud Engine TPUv3-8 machine (with 335 GB of RAM, 1000 GB of hard drive, 96 CPU cores).A randomized shuffle of combined dataset of mC4, oscar and other datasets listed above was used to train the model. Training logs are present in wandb.\n\n\nEvaluation Results\n------------------\n\n\nRoBERTa Hindi is evaluated on various downstream tasks. The results are summarized below.\n\n\n\nTeam Members\n------------\n\n\n* Aman K (amankhandelia)\n* Haswanth Aekula (hassiahk)\n* Kartik Godawat (dk-crazydiv)\n* Prateek Agrawal (prateekagrawal)\n* Rahul Dev (mlkorra)\n\n\nCredits\n-------\n\n\nHuge thanks to Hugging Face & Google Jax/Flax team for such a wonderful community week, especially for providing such massive computing resources. Big thanks to Suraj Patil & Patrick von Platen for mentoring during the whole week.\n\n\n<img src=URL"
] |
[
48,
212,
281,
223
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nTraining data\n-------------\n\n\nThe RoBERTa Hindi model was pretrained on the reunion of the following datasets:\n\n\n* OSCAR is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n* mC4 is a multilingual colossal, cleaned version of Common Crawl's web crawl corpus.\n* IndicGLUE is a natural language understanding benchmark.\n* Samanantar is a parallel corpora collection for Indic language.\n* Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles with their headlines and summary collected from Hindi News Websites.\n* Hindi Text Short Summarization Corpus is a collection of ~330k articles with their headlines collected from Hindi News Websites.\n* Old Newspapers Hindi is a cleaned subset of HC Corpora newspapers.\n\n\nTraining procedure\n------------------"
] |
[
0.019967040047049522,
-0.005420254543423653,
0.00006892683450132608,
0.00880281999707222,
0.08746825903654099,
-0.0007148141739889979,
-0.001483110710978508,
0.1455020010471344,
-0.08815789967775345,
0.04185754433274269,
0.006540738977491856,
-0.04732917249202728,
0.04832825809717178,
0.07660277932882309,
0.052879609167575836,
-0.3452574908733368,
0.01560970488935709,
-0.025189964100718498,
-0.011923988349735737,
0.07503776997327805,
0.09849423170089722,
-0.05519304797053337,
0.046198200434446335,
-0.006627972703427076,
-0.058549195528030396,
0.019400112330913544,
0.023034512996673584,
-0.12463179230690002,
0.006454352289438248,
0.043218713253736496,
0.025761932134628296,
-0.05438166856765747,
-0.04734757915139198,
-0.025633392855525017,
0.04093223810195923,
0.025206122547388077,
0.018811965361237526,
0.0010847810190171003,
0.032672036439180374,
0.04224037006497383,
0.31372618675231934,
-0.135151207447052,
-0.02303418517112732,
-0.0101845758035779,
-0.08294469863176346,
-0.0964379534125328,
-0.035417068749666214,
-0.033450234681367874,
0.03633732721209526,
0.12323728948831558,
-0.008594539947807789,
0.06323537230491638,
-0.14686386287212372,
0.039536114782094955,
0.01154890563338995,
-0.1863117218017578,
-0.04899449646472931,
0.09299427270889282,
0.04457009956240654,
0.11649074405431747,
-0.07402867823839188,
0.037118442356586456,
-0.06317437440156937,
-0.016022462397813797,
0.0068204025737941265,
-0.0638907328248024,
-0.1473003625869751,
-0.0032627368345856667,
-0.09746768325567245,
0.008266723714768887,
0.21282914280891418,
-0.09703915566205978,
-0.026096949353814125,
0.002078317804262042,
-0.037018071860075,
0.08299586921930313,
-0.07290484756231308,
0.003937773872166872,
-0.04678110033273697,
0.05013285204768181,
0.014811160042881966,
-0.01827876828610897,
-0.07798848301172256,
0.016375241801142693,
-0.11748577654361725,
0.1169467493891716,
0.019782353192567825,
0.061978671699762344,
-0.12146319448947906,
-0.04688990116119385,
-0.10756277292966843,
-0.06741615384817123,
0.01856478489935398,
-0.014189491979777813,
0.03318627551198006,
0.033120620995759964,
-0.055123262107372284,
-0.31177541613578796,
0.017172247171401978,
-0.0023060007952153683,
-0.09589113295078278,
0.019131265580654144,
0.007875626906752586,
0.017742455005645752,
-0.0015021286671981215,
0.04836771637201309,
-0.02336973510682583,
-0.05374671891331673,
0.10127526521682739,
-0.07947879284620285,
0.11940444260835648,
-0.04557500034570694,
-0.02373644895851612,
0.10608644783496857,
-0.07774892449378967,
0.017304308712482452,
-0.005924306344240904,
0.06953971832990646,
0.03631927818059921,
-0.050782978534698486,
-0.0052093262784183025,
-0.08292780816555023,
-0.015782196074724197,
-0.03236863389611244,
-0.019441897049546242,
0.01018634531646967,
-0.019702039659023285,
-0.0376247763633728,
-0.07283564656972885,
0.017080800607800484,
-0.07436478137969971,
-0.015463260002434254,
-0.0968620628118515,
-0.09730860590934753,
-0.0008383321110159159,
-0.11150463670492172,
-0.052899330854415894,
-0.09272566437721252,
-0.18308913707733154,
-0.0013901672791689634,
-0.012292543426156044,
-0.07247918099164963,
-0.03623997047543526,
0.0009511255193501711,
0.011928065679967403,
0.06903322041034698,
0.023255396634340286,
0.029317103326320648,
-0.07991207391023636,
0.04598689824342728,
-0.11371570080518723,
0.09921938180923462,
-0.0950903668999672,
0.06007952615618706,
-0.11145899444818497,
0.010530992411077023,
-0.10188314318656921,
0.0705641582608223,
-0.032339371740818024,
0.030419891700148582,
-0.033048976212739944,
0.02317686937749386,
-0.12134852260351181,
0.04684317111968994,
-0.040151916444301605,
0.06857511401176453,
-0.12163858860731125,
-0.07730121165513992,
0.05305958166718483,
-0.1396009624004364,
-0.025734158232808113,
0.11931788921356201,
-0.01728469878435135,
0.17581836879253387,
0.08366037160158157,
0.12272725254297256,
0.08684804290533066,
0.11306219547986984,
-0.049541033804416656,
0.023036140948534012,
-0.01397817675024271,
0.02897607535123825,
0.06244784966111183,
-0.004242117051035166,
-0.032938648015260696,
0.02748802676796913,
0.04820028692483902,
0.05292678624391556,
0.013166976161301136,
-0.01708771288394928,
0.03385749086737633,
0.03310360759496689,
0.0021775325294584036,
-0.03018266335129738,
0.1364966332912445,
-0.028996240347623825,
-0.009370612911880016,
-0.01576145738363266,
0.054778363555669785,
0.0481538400053978,
0.04789510369300842,
-0.07961542904376984,
0.05880505219101906,
-0.0641077384352684,
0.07866732031106949,
-0.13484624028205872,
0.04697919636964798,
0.058481670916080475,
0.014121292158961296,
0.04438495263457298,
0.01180795393884182,
0.0027017502579838037,
-0.05214618891477585,
-0.0313020683825016,
0.0812193900346756,
0.07391659915447235,
-0.02426244504749775,
-0.05329805612564087,
-0.0775090903043747,
0.06909750401973724,
-0.03824257850646973,
0.08778228610754013,
-0.032674845308065414,
0.029090581461787224,
0.02325872890651226,
0.016793979331851006,
0.04565603286027908,
0.004316212609410286,
0.06031934544444084,
0.051812589168548584,
-0.005821531638503075,
0.03435831516981125,
0.042667098343372345,
0.03649556636810303,
-0.115785613656044,
0.14425961673259735,
-0.10139413177967072,
-0.024922631680965424,
0.06088363006711006,
-0.10126721113920212,
0.052544690668582916,
0.08352652192115784,
0.05027586221694946,
-0.04938644915819168,
-0.020389430224895477,
0.005318136420100927,
0.08452674001455307,
-0.046473611146211624,
0.10437004268169403,
-0.08081290125846863,
0.013649574480950832,
-0.039524808526039124,
-0.02364586479961872,
0.012210024520754814,
0.03326711058616638,
0.07772716879844666,
-0.23170551657676697,
0.0955573320388794,
0.12666316330432892,
-0.04010830819606781,
0.23272760212421417,
0.040051110088825226,
-0.057055819779634476,
0.02504861354827881,
-0.05909224599599838,
-0.018735341727733612,
-0.02279692329466343,
-0.17512749135494232,
0.013520617969334126,
0.04224836453795433,
0.028734680265188217,
0.02703184261918068,
-0.01840353198349476,
0.0187993086874485,
-0.0000014834557759968447,
0.001578868948854506,
0.019513128325343132,
0.08720632642507553,
0.014446581713855267,
0.09778235852718353,
0.06107722595334053,
0.12900139391422272,
0.03743808716535568,
-0.037197522819042206,
-0.08012712746858597,
0.16710005700588226,
-0.1419477015733719,
-0.3901168704032898,
-0.13096041977405548,
-0.05214940384030342,
-0.014044939540326595,
0.004539496265351772,
0.059484947472810745,
-0.12300576269626617,
-0.08596103638410568,
-0.11634311825037003,
0.08609470725059509,
0.03167705610394478,
0.02669854648411274,
-0.1292797029018402,
-0.016549795866012573,
-0.046233292669057846,
-0.07647138088941574,
0.026012640446424484,
-0.025578845292329788,
-0.032640282064676285,
0.07358423620462418,
-0.02214214950799942,
0.05647232383489609,
0.05985844507813454,
0.01033981703221798,
-0.03264269605278969,
-0.024865446612238884,
0.021447159349918365,
-0.08604946732521057,
0.08253449201583862,
0.13737212121486664,
-0.039391063153743744,
0.0016982052475214005,
0.10012386739253998,
-0.0006045778281986713,
0.005248646251857281,
0.050354842096567154,
0.02174057625234127,
-0.0819229707121849,
-0.14640453457832336,
-0.10741186141967773,
-0.11152546107769012,
0.016180219128727913,
-0.024552136659622192,
0.07424000650644302,
-0.03496525064110756,
0.031538501381874084,
-0.04369539022445679,
0.041636332869529724,
0.007595773786306381,
0.022836055606603622,
0.15707190334796906,
-0.029934553429484367,
0.09230881184339523,
-0.10014460235834122,
0.007292209193110466,
0.1152569055557251,
-0.11570562422275543,
0.18137146532535553,
-0.04617490991950035,
0.16730450093746185,
0.027374975383281708,
0.03474470600485802,
0.15223129093647003,
-0.0038715158589184284,
-0.05221842974424362,
-0.039284322410821915,
-0.06756699085235596,
-0.02163594961166382,
-0.09328463673591614,
0.08548237383365631,
-0.022928984835743904,
0.022007733583450317,
0.010638769716024399,
0.03284653648734093,
0.027957838028669357,
0.15216447412967682,
0.0024770000018179417,
-0.09943872690200806,
-0.1322488635778427,
0.08275125175714493,
-0.03254443407058716,
-0.053694672882556915,
0.018922604620456696,
0.11862009763717651,
-0.0633932575583458,
0.03472806513309479,
-0.018745871260762215,
0.07872043550014496,
-0.03615785390138626,
0.03057752177119255,
-0.069283626973629,
0.0024787881411612034,
-0.02060098946094513,
0.0749567523598671,
-0.1604519486427307,
0.09654819965362549,
0.03979979828000069,
0.005393010098487139,
-0.07291441410779953,
-0.028014929965138435,
0.06919311732053757,
0.004985862411558628,
-0.04389381408691406,
0.025234004482626915,
-0.17525909841060638,
-0.02643681690096855,
-0.07195081561803818,
0.031006542965769768,
0.06033693999052048,
0.03600986674427986,
0.08490000665187836,
-0.05103019252419472,
0.002078191377222538,
-0.016759056597948074,
-0.002087897388264537,
-0.004882678855210543,
-0.13045543432235718,
0.06179281324148178,
0.058813564479351044,
-0.0019982571247965097,
-0.0793466567993164,
-0.04753324016928673,
0.12057823687791824,
0.10335405170917511,
-0.0521533265709877,
-0.09057027101516724,
-0.07259494811296463,
0.11557488143444061,
0.03162388131022453,
-0.07031239569187164,
0.05590955913066864,
-0.028758184984326363,
0.0073152510449290276,
0.058398932218551636,
-0.09107739478349686,
0.055458247661590576,
-0.028545742854475975,
-0.06356807798147202,
-0.039168164134025574,
0.04121346026659012,
0.045008935034275055,
0.00665574474260211,
-0.05249656364321709,
0.07653569430112839,
0.026196179911494255,
-0.058471281081438065,
-0.08681684732437134,
0.1074788048863411,
-0.08240308612585068,
0.21150431036949158,
-0.1598687767982483,
-0.11976403743028641,
0.021667424589395523,
-0.09298714250326157,
0.19502203166484833,
0.05334386229515076,
0.0006284404662437737,
0.17714299261569977,
0.13198500871658325,
-0.11436186730861664,
-0.32150399684906006,
-0.02659774385392666,
-0.06930703669786453,
0.044555000960826874,
-0.029041629284620285,
-0.11652357131242752,
0.14531399309635162,
0.10875857621431351,
0.018717708066105843,
-0.04055802524089813,
-0.2076437771320343,
-0.10100343823432922,
0.03843056410551071,
-0.0012770823668688536,
0.3163171708583832,
-0.1265937238931656,
-0.08682232350111008,
-0.01043929997831583,
0.15883177518844604,
0.07280462235212326,
-0.1208675354719162,
0.09578245878219604,
-0.09974752366542816,
-0.08052241802215576,
-0.008033827878534794,
0.007853969931602478,
0.05279889702796936,
-0.06571618467569351,
-0.017473619431257248,
-0.13472752273082733,
0.04320472478866577,
0.15702925622463226,
0.016927532851696014,
0.0956612080335617,
-0.04552323371171951,
-0.027908626943826675,
-0.176945760846138,
0.019943537190556526,
-0.09654109179973602,
0.018674811348319054,
0.0046981628984212875,
0.0005527198081836104,
-0.08934566378593445,
0.06432436406612396,
-0.00483062956482172,
-0.0009796995436772704,
0.19775882363319397,
-0.042118143290281296,
0.16961877048015594,
0.005642341449856758,
0.09832441806793213,
-0.13929437100887299,
0.04009663313627243,
0.015589894726872444,
0.021187659353017807,
0.09045781195163727,
-0.10612652450799942,
-0.04172535613179207,
0.025412382557988167,
0.009116165339946747,
0.09586425870656967,
0.040494564920663834,
-0.043554455041885376,
0.09206126630306244,
0.13530823588371277,
-0.04203053563833237,
-0.0816357433795929,
-0.006089804228395224,
0.04874943941831589,
-0.07079260051250458,
-0.08619865030050278,
0.14421075582504272,
-0.03305625170469284,
-0.02205483987927437,
-0.004062592051923275,
0.07013975083827972,
0.014324459247291088,
0.07062331587076187,
0.000047754834668012336,
-0.0006030214135535061,
-0.05732756853103638,
0.101407989859581,
0.15417909622192383,
-0.23827598989009857,
-0.029376082122325897,
0.2962006628513336,
-0.13053040206432343,
-0.10205420106649399,
0.01441321149468422,
0.036269474774599075,
-0.022308865562081337,
0.017527395859360695,
-0.06428505480289459,
-0.1489100605249405,
0.0984482690691948,
0.10508561134338379,
-0.014262037351727486,
-0.0033271878492087126,
-0.10633483529090881,
0.006956149358302355,
0.03551490604877472,
0.10199379175901413,
-0.00934223085641861,
-0.01653571054339409,
0.023061830550432205,
-0.04199440777301788,
0.03248738497495651,
0.05138101428747177,
-0.07606377452611923,
-0.07893167436122894,
-0.11004582792520523,
-0.0010381796164438128,
-0.13652250170707703,
0.0434214249253273,
-0.11536351591348648,
-0.05712779983878136,
-0.0755230113863945,
0.009527011774480343,
-0.036288220435380936,
-0.03945594280958176,
-0.024274516850709915,
-0.029805364087224007,
-0.08041155338287354,
0.114314004778862,
0.008658716455101967,
-0.010267349891364574,
0.05393737182021141,
0.01472498755902052,
0.08258605003356934,
0.05567650869488716,
-0.03669755160808563,
0.03780724108219147,
0.0081959068775177,
-0.009952416643500328,
0.039081793278455734,
0.028504643589258194,
-0.003819922683760524,
-0.027836790308356285,
0.03089774027466774,
0.03265264630317688,
-0.02631552144885063,
0.06823447346687317,
0.08677864819765091,
-0.05761304870247841,
0.11649233102798462,
-0.09684503078460693,
0.02814752794802189,
-0.06942646205425262,
0.009529249742627144,
0.029248902574181557,
0.0490424782037735,
0.14961905777454376,
-0.09617777913808823,
-0.019606009125709534,
-0.04020260274410248,
-0.0067742266692221165,
-0.05607263743877411,
-0.09213954955339432,
-0.13720229268074036,
-0.055171310901641846,
0.03518693894147873,
0.0021948867943137884,
0.11831803619861603,
0.050244204699993134,
-0.12242305278778076,
0.0727720633149147,
0.09022018313407898,
0.06303104013204575,
-0.012553474865853786,
0.07835904508829117,
0.13446149230003357,
-0.05471044406294823,
0.012194733135402203,
0.012406992726027966,
0.011724933981895447,
0.03280293941497803,
0.12118081003427505,
0.12831702828407288,
0.23405684530735016,
0.0962677076458931,
-0.030204815790057182,
-0.03238704428076744,
-0.02652646042406559,
0.09374288469552994,
-0.08309086412191391,
0.09968317300081253,
0.027985645458102226,
-0.03626253828406334,
0.20361173152923584,
-0.12509512901306152,
0.0569886639714241,
-0.01102361548691988,
-0.012277582660317421,
-0.022430140525102615,
-0.11936409026384354,
-0.07163481414318085,
-0.0512598417699337,
-0.05173277482390404,
-0.10643547028303146,
0.0028938227333128452,
0.08438306301832199,
0.022505100816488266,
-0.015943702310323715,
0.13447368144989014,
-0.10760848224163055,
-0.1490480899810791,
0.04909175634384155,
-0.025264455005526543,
0.06772919744253159,
0.020232468843460083,
0.011016878299415112,
-0.05139949172735214,
0.05351491644978523,
0.000772605708334595,
0.047929588705301285,
0.07843264192342758,
0.061720170080661774,
-0.09639634937047958,
-0.08624907582998276,
0.015753235667943954,
0.058461885899305344,
0.06180901080369949,
0.12081094831228256,
0.04116673395037651,
-0.06209467351436615,
0.02749072201550007,
0.1740744262933731,
0.011770210228860378,
-0.021297356113791466,
-0.03406326100230217,
0.13657355308532715,
0.04305056110024452,
-0.027451571077108383,
-0.03467932343482971,
-0.05341299995779991,
-0.061726585030555725,
0.15857428312301636,
0.288745254278183,
-0.04232143238186836,
0.017642484977841377,
-0.039085131138563156,
0.0014982660068199039,
0.00257300678640604,
0.1629171222448349,
0.07873392850160599,
0.2767660319805145,
-0.05395478010177612,
0.013800973072648048,
-0.0677524283528328,
-0.0115415183827281,
-0.18948759138584137,
0.04124678298830986,
0.02063172124326229,
-0.024646854028105736,
0.018028030171990395,
0.08318208158016205,
-0.14386595785617828,
-0.09124400466680527,
-0.053477030247449875,
-0.17267073690891266,
-0.1188800111413002,
-0.062373921275138855,
-0.060377102345228195,
0.05522020906209946,
-0.006435181945562363,
0.039624013006687164,
0.03034958615899086,
-0.07524437457323074,
0.051172927021980286,
-0.09173765033483505,
-0.14500656723976135,
0.1823931187391281,
0.030231798067688942,
0.01471055019646883,
-0.015722328796982765,
-0.016151903197169304,
0.06318525224924088,
0.0306783989071846,
-0.04238158464431763,
0.07775124907493591,
-0.011668710969388485,
0.0065849656239151955,
0.019357915967702866,
0.08092647045850754,
0.0091845178976655,
0.19905424118041992,
0.1455405354499817,
0.010622144676744938,
0.035651497542858124,
-0.030442580580711365,
-0.025634992867708206,
-0.06434318423271179,
0.03317270055413246,
-0.11070238798856735,
0.14940616488456726,
0.2262648642063141,
0.010966918431222439,
-0.03106073848903179,
-0.031018592417240143,
-0.010706953704357147,
-0.03380805626511574,
0.026799559593200684,
-0.05531363934278488,
-0.12868373095989227,
0.027590923011302948,
-0.14343774318695068,
0.02943861484527588,
-0.2483748197555542,
-0.04246203228831291,
-0.05583127588033676,
0.034544266760349274,
-0.03439861908555031,
0.06997617334127426,
0.06821102648973465,
-0.012565549463033676,
-0.03957907110452652,
-0.19040122628211975,
0.04764802008867264,
0.054010551422834396,
-0.13789333403110504,
-0.07355105131864548
] |
null | null |
transformers
|
roberta-pretraining-hindi
|
{"widget": [{"text": "\u0936\u0941\u092d \u092a\u094d\u0930\u092d\u093e\u0924\u0964 \u0906\u0936\u093e \u0915\u0930\u0924\u093e \u0939\u0942\u0902 \u0915\u093f \u0906\u092a\u0915\u093e <mask> \u0936\u0941\u092d \u0939\u094b"}]}
|
fill-mask
|
flax-community/roberta-pretraining-hindi
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
roberta-pretraining-hindi
|
[] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.056054603308439255,
0.03248031064867973,
-0.008046153001487255,
0.051338013261556625,
0.1180385947227478,
0.02936614491045475,
0.1320868581533432,
0.102181576192379,
0.08733624964952469,
0.00779934786260128,
0.1502477079629898,
0.2344098836183548,
-0.01568196900188923,
0.09936822205781937,
-0.03942068666219711,
-0.27976280450820923,
0.02226502262055874,
0.06325343996286392,
-0.11206097155809402,
0.10821451991796494,
0.05136880278587341,
-0.08877619355916977,
0.0548054575920105,
-0.0033657930325716734,
-0.15610703825950623,
0.046753864735364914,
0.06813488900661469,
-0.10331186652183533,
0.11474744230508804,
0.03391389176249504,
0.21956931054592133,
0.036839134991168976,
-0.0514676459133625,
-0.06020799279212952,
0.058262307196855545,
0.018529953435063362,
-0.08653527498245239,
0.06662767380475998,
0.028980636969208717,
-0.08994876593351364,
-0.019165487959980965,
0.05364687740802765,
0.04671087488532066,
0.026846865192055702,
-0.13248224556446075,
-0.122489333152771,
-0.02253563515841961,
0.03765493258833885,
0.03383703529834747,
0.04617660120129585,
0.029503600671887398,
0.2025780826807022,
-0.09587440639734268,
0.09280231595039368,
0.15422777831554413,
-0.3269748389720917,
-0.028785092756152153,
0.10816463083028793,
0.09493216872215271,
-0.009806512854993343,
-0.056667108088731766,
0.053594622761011124,
0.010852220468223095,
0.017016617581248283,
0.07875027507543564,
-0.08242537826299667,
-0.06757722049951553,
0.015077394433319569,
-0.08309552818536758,
-0.011084277182817459,
0.11594939231872559,
-0.04974423721432686,
0.06572466343641281,
-0.01178821176290512,
-0.11704806238412857,
-0.05568913742899895,
-0.028509169816970825,
-0.02454548515379429,
-0.03183290362358093,
0.03153258189558983,
-0.052542295306921005,
-0.05398869886994362,
-0.11701438575983047,
0.01607360877096653,
-0.20378780364990234,
0.2291792631149292,
0.015989534556865692,
0.06484607607126236,
-0.1913691759109497,
0.04019235447049141,
-0.008810539729893208,
-0.12397590279579163,
0.08061125874519348,
-0.0778740793466568,
-0.01043014694005251,
-0.00025012591504491866,
-0.03194958716630936,
-0.15330010652542114,
0.08402062207460403,
0.14235012233257294,
0.0703987404704094,
0.05235375463962555,
0.04025684669613838,
0.11797744780778885,
0.0062722195871174335,
0.08332715183496475,
-0.0033328577410429716,
-0.020555660128593445,
0.058990221470594406,
-0.06935149431228638,
0.042393360286951065,
-0.06539654731750488,
-0.14305822551250458,
-0.019733933731913567,
0.024873333051800728,
0.06177109107375145,
0.0472954697906971,
0.05189374089241028,
-0.07901646941900253,
-0.01297581847757101,
0.05532194301486015,
-0.06181560829281807,
0.021915150806307793,
-0.028121696785092354,
0.05614781379699707,
0.09694098681211472,
0.02329915761947632,
-0.015683183446526527,
0.009050553664565086,
0.10954584926366806,
-0.09349292516708374,
-0.022346964105963707,
-0.08019722253084183,
-0.07115963101387024,
0.04550477862358093,
-0.15208832919597626,
0.03392788767814636,
-0.163294717669487,
-0.11616092920303345,
0.036480147391557693,
0.07727508246898651,
-0.01087186112999916,
-0.027444304898381233,
0.04069950059056282,
-0.017672257497906685,
0.04171934723854065,
-0.026001179590821266,
-0.02262156456708908,
-0.02280217967927456,
0.09067060798406601,
-0.019523262977600098,
0.12676511704921722,
-0.07864977419376373,
0.03291575238108635,
-0.06563568115234375,
0.020905161276459694,
-0.17323945462703705,
-0.04067065566778183,
-0.05730418488383293,
0.15274600684642792,
-0.004410047549754381,
-0.01597120799124241,
-0.13115207850933075,
0.036499302834272385,
0.004403534810990095,
0.13779743015766144,
-0.09854277223348618,
-0.12285254150629044,
0.23302029073238373,
-0.0938703641295433,
-0.12066173553466797,
0.09428577870130539,
-0.00585569255053997,
0.012807860970497131,
0.03407035022974014,
0.12017872184515,
0.06350307911634445,
-0.12561555206775665,
0.08130798488855362,
0.10102665424346924,
-0.1405840963125229,
-0.13989393413066864,
0.008247711695730686,
0.0014838920906186104,
-0.07189842313528061,
0.029673457145690918,
0.12153617292642593,
0.10942172259092331,
-0.06289353221654892,
-0.05540693923830986,
-0.014802910387516022,
-0.04390530660748482,
0.13305455446243286,
0.07422330230474472,
0.11936018615961075,
-0.06421703845262527,
-0.05641259625554085,
-0.010930903255939484,
-0.001581725082360208,
0.03534591943025589,
0.023735323920845985,
-0.08301463723182678,
0.13557825982570648,
-0.10046841949224472,
-0.009981892071664333,
-0.1833997219800949,
-0.1408158242702484,
-0.01896396093070507,
0.03614329919219017,
-0.005267035216093063,
0.14361442625522614,
0.1419670730829239,
-0.04958319664001465,
-0.016094328835606575,
-0.001771264709532261,
0.10361504554748535,
0.01744346134364605,
-0.05065019056200981,
-0.1390962153673172,
0.025869980454444885,
-0.10641948133707047,
0.0035627756733447313,
-0.03568409010767937,
0.015927081927657127,
0.003599299117922783,
0.13339167833328247,
0.03954165056347847,
0.028387578204274178,
-0.03511304780840874,
0.026251614093780518,
-0.043433476239442825,
-0.0007068906561471522,
0.08702710270881653,
0.003389613702893257,
-0.06495824456214905,
0.1567114144563675,
-0.1431649774312973,
0.33185818791389465,
0.18182213604450226,
-0.2438773661851883,
-0.038081273436546326,
0.03647970035672188,
-0.01340425107628107,
-0.007261103484779596,
0.04910612106323242,
0.007451684679836035,
0.0371720977127552,
0.0028892923146486282,
0.1355745792388916,
-0.013151844032108784,
-0.034933120012283325,
0.046050380915403366,
-0.0661000907421112,
-0.05983700230717659,
0.04253281280398369,
0.1302458643913269,
-0.11220794916152954,
0.17833472788333893,
0.22774462401866913,
-0.07071826606988907,
0.18834294378757477,
0.017805663868784904,
-0.002283287001773715,
0.0022935892920941114,
-0.039875928312540054,
-0.002615851117298007,
0.07621114701032639,
-0.1808532476425171,
-0.046407293528318405,
0.05705195292830467,
-0.05776510015130043,
0.04796215891838074,
-0.1355598270893097,
-0.04663200303912163,
0.0069245509803295135,
0.06652981787919998,
0.006473381072282791,
0.1275380253791809,
0.02095414698123932,
0.06925595551729202,
-0.019553450867533684,
-0.09946165233850479,
0.09221124649047852,
0.008467999286949635,
-0.0201379656791687,
0.15283440053462982,
-0.0967143103480339,
-0.3056011199951172,
-0.1209845244884491,
-0.13611142337322235,
0.009265034459531307,
0.013424806296825409,
0.045616377145051956,
-0.0761413648724556,
-0.04097984731197357,
0.07577920705080032,
-0.00812458898872137,
-0.029371723532676697,
0.07061979919672012,
-0.07266911119222641,
0.024432552978396416,
-0.04141318425536156,
-0.06628400832414627,
-0.05382229760289192,
-0.054113563150167465,
-0.024020100012421608,
0.13257233798503876,
-0.05321606993675232,
0.07944805175065994,
0.1491497904062271,
0.01205755490809679,
0.05353403463959694,
-0.0027496805414557457,
0.11586552858352661,
-0.08211545646190643,
0.016660600900650024,
0.1541241854429245,
-0.04000994935631752,
0.0920138955116272,
0.14974257349967957,
0.051779136061668396,
-0.032861221581697464,
-0.01734315976500511,
-0.01920335739850998,
-0.1296594738960266,
-0.1783415824174881,
-0.06051096320152283,
-0.13144588470458984,
-0.010760712437331676,
0.05260702967643738,
0.06244603917002678,
0.15693743526935577,
0.11204888671636581,
0.06889155507087708,
0.003974058199673891,
-0.064728282392025,
0.035249169915914536,
0.11515355110168457,
-0.012344387359917164,
0.14337371289730072,
-0.044457804411649704,
-0.15340383350849152,
0.04128626361489296,
0.0567547082901001,
0.11611785739660263,
0.09246822446584702,
0.03985978290438652,
0.0416528545320034,
0.17303939163684845,
0.15917858481407166,
0.11519143730401993,
-0.0013552854070439935,
-0.0817263051867485,
0.001369532779790461,
-0.010764461942017078,
-0.018506037071347237,
0.024643898010253906,
0.16488061845302582,
-0.07572177797555923,
0.00357465329580009,
-0.11025635153055191,
0.04436880722641945,
0.10707270354032516,
0.04875938966870308,
-0.24057722091674805,
0.01747376099228859,
0.06001284718513489,
0.009187494404613972,
-0.04879000782966614,
0.029154665768146515,
-0.01069580763578415,
-0.10265225172042847,
0.04736635088920593,
-0.09335082769393921,
0.0774553120136261,
0.005905484315007925,
0.03950373828411102,
-0.03354807570576668,
-0.015833422541618347,
0.020954512059688568,
0.04040065407752991,
-0.22027750313282013,
0.25253671407699585,
-0.0017404075479134917,
-0.027431095018982887,
-0.06831086426973343,
0.006171961780637503,
0.05629671737551689,
0.09116309881210327,
0.12283080816268921,
0.002265392569825053,
-0.04135163873434067,
-0.06996691972017288,
-0.023672625422477722,
0.014519180171191692,
0.07688470929861069,
-0.0303781908005476,
-0.020113827660679817,
-0.01414425764232874,
-0.051750313490629196,
0.00482820812612772,
0.03736349567770958,
0.0047228685580194,
-0.14571145176887512,
0.08523136377334595,
0.03405115380883217,
-0.10368743538856506,
0.005693832878023386,
-0.0906156525015831,
-0.12881214916706085,
0.22051775455474854,
-0.036054905503988266,
-0.03929142653942108,
-0.11688987165689468,
-0.08677396923303604,
0.08231160789728165,
-0.1003187969326973,
0.0907948836684227,
-0.08400740474462509,
0.009871154092252254,
-0.08544441312551498,
-0.18899314105510712,
0.16893191635608673,
-0.11482013016939163,
0.011679023504257202,
-0.10398916155099869,
0.12384915351867676,
-0.06425923854112625,
0.03634386882185936,
-0.010618667118251324,
0.03852801024913788,
-0.08605284243822098,
-0.043050676584243774,
0.044586215168237686,
-0.04458039626479149,
0.0225312951952219,
-0.022807754576206207,
-0.03959791362285614,
-0.030248722061514854,
0.029058517888188362,
0.0406133271753788,
0.22144711017608643,
0.2144334763288498,
-0.08375316858291626,
0.14107592403888702,
0.1542760729789734,
-0.027089660987257957,
-0.3429948091506958,
-0.06767362356185913,
-0.10662075132131577,
-0.002038518665358424,
0.0292994175106287,
-0.12421690672636032,
0.10439302772283554,
-0.013619315810501575,
-0.062008947134017944,
0.14743256568908691,
-0.20367805659770966,
-0.1039750874042511,
0.19479258358478546,
0.058817919343709946,
0.4338186979293823,
-0.13855479657649994,
-0.06851515918970108,
-0.005740564316511154,
-0.1065901517868042,
0.07769810408353806,
-0.03311660513281822,
0.09169909358024597,
-0.021223753690719604,
0.05765504762530327,
0.035844165831804276,
-0.09546727687120438,
0.08137611299753189,
-0.07327132672071457,
0.020815597847104073,
-0.09844761341810226,
-0.09033150225877762,
0.11295315623283386,
-0.011052154935896397,
-0.033259470015764236,
-0.02068001963198185,
-0.011131967417895794,
-0.03578278794884682,
-0.018166273832321167,
-0.094619981944561,
0.1094285175204277,
0.029992101714015007,
-0.06555511057376862,
0.024528294801712036,
-0.00469568045809865,
-0.03246724233031273,
-0.010688327252864838,
0.26251301169395447,
0.00740673765540123,
0.19858002662658691,
0.11181885004043579,
0.03713912144303322,
-0.11215758323669434,
-0.0963178277015686,
-0.048410024493932724,
-0.08148425817489624,
0.09272054582834244,
-0.054956406354904175,
0.03476840630173683,
0.09707683324813843,
0.003254994750022888,
0.03658732399344444,
0.10885944962501526,
-0.022786185145378113,
-0.006367347668856382,
0.16209717094898224,
-0.21533386409282684,
-0.027371011674404144,
-0.011076156981289387,
-0.03197417035698891,
0.05048466846346855,
0.08026017993688583,
0.105776347219944,
0.030726158991456032,
-0.03686424344778061,
0.015599449165165424,
-0.011029734276235104,
-0.04086773097515106,
0.04656952992081642,
0.09248721599578857,
0.03528885170817375,
-0.10671261698007584,
0.010058549232780933,
0.0021204573567956686,
-0.23764212429523468,
-0.019482091069221497,
0.11905904859304428,
-0.08317956328392029,
-0.11988458782434464,
0.027093907818198204,
0.1292128562927246,
-0.07335732132196426,
-0.014795921742916107,
-0.09503516554832458,
-0.08984415978193283,
0.031589578837156296,
0.2195563167333603,
0.08901014924049377,
0.0426330603659153,
-0.022536741569638252,
0.0005373752792365849,
-0.028745895251631737,
0.017907293513417244,
0.009444215334951878,
0.049821820110082626,
-0.11264432221651077,
-0.025068648159503937,
-0.002249453216791153,
0.14157554507255554,
-0.10781886428594589,
-0.03704838082194328,
-0.17573441565036774,
0.022272171452641487,
-0.033018261194229126,
-0.08313077688217163,
-0.08952318876981735,
-0.07307300716638565,
0.009413745254278183,
-0.06759088486433029,
-0.06402819603681564,
-0.03870067372918129,
-0.11781171709299088,
0.018196070566773415,
0.03698795661330223,
-0.017428312450647354,
-0.06797853112220764,
-0.04566580057144165,
0.10457863658666611,
-0.03494711220264435,
0.07986368238925934,
0.10415607690811157,
-0.036714281886816025,
0.09260773658752441,
-0.12875588238239288,
-0.0782090574502945,
0.08222851902246475,
0.010853096842765808,
0.09076497703790665,
0.07374084740877151,
0.0209487471729517,
0.022617587819695473,
0.0586252398788929,
0.03949703648686409,
0.03390705958008766,
-0.10528510063886642,
0.07766593992710114,
-0.0036784436088055372,
-0.18475206196308136,
-0.04264308884739876,
-0.04846673086285591,
0.07689227163791656,
0.0000564083456993103,
0.11577785015106201,
-0.04345632717013359,
0.0871499553322792,
-0.07589560747146606,
0.02142581343650818,
-0.0160426814109087,
-0.15075929462909698,
0.03391137346625328,
-0.025644728913903236,
0.006313334684818983,
-0.03843572363257408,
0.18327243626117706,
0.022004099562764168,
-0.02145870216190815,
0.029890315607190132,
0.050182685256004333,
0.01138403732329607,
0.014391258358955383,
0.1237821951508522,
0.0659276619553566,
-0.048964738845825195,
-0.0851173996925354,
0.10231930762529373,
0.0322895385324955,
-0.0015593968564644456,
0.12318528443574905,
0.06167507544159889,
0.025546416640281677,
0.09811484068632126,
0.03741743043065071,
0.046944696456193924,
-0.09669690579175949,
-0.11812370270490646,
-0.06329796463251114,
0.06559941917657852,
0.02080846019089222,
0.021437538787722588,
0.19806689023971558,
0.013027378357946873,
0.04757985472679138,
-0.04351503774523735,
-0.03361448273062706,
-0.18252640962600708,
-0.16818886995315552,
-0.08656346797943115,
-0.06012668088078499,
0.03250272199511528,
-0.00893042329698801,
-0.026426302269101143,
0.0779576227068901,
0.04128513112664223,
-0.024318739771842957,
0.13192857801914215,
0.07529094070196152,
0.0039122868329286575,
0.011309136636555195,
0.021922415122389793,
-0.008097113110125065,
0.001034525572322309,
0.0022887985687702894,
-0.1683558225631714,
-0.005163964349776506,
-0.06668347865343094,
-0.016059817746281624,
-0.06870714575052261,
0.028988828882575035,
-0.07447594404220581,
-0.1355765461921692,
-0.04676564037799835,
0.022212272509932518,
-0.021008780226111412,
0.05531160160899162,
0.0024794747587293386,
0.04048211872577667,
-0.006856914609670639,
0.1441042274236679,
-0.06974226236343384,
-0.026155387982726097,
-0.0609048493206501,
0.13574302196502686,
0.010319629684090614,
0.08230061829090118,
-0.017585260793566704,
0.01702001504600048,
-0.06836116313934326,
0.29205235838890076,
0.337017685174942,
-0.061491578817367554,
0.08024031668901443,
0.06408334523439407,
0.02148929052054882,
0.024402925744652748,
0.11077114194631577,
0.0520729124546051,
0.27871713042259216,
-0.10102465003728867,
-0.057044338434934616,
-0.042673613876104355,
-0.0297977477312088,
-0.10663390904664993,
0.029587889090180397,
0.04262523725628853,
-0.01878473535180092,
-0.052453380078077316,
0.07060423493385315,
-0.1627788543701172,
0.08529063314199448,
0.08852001279592514,
-0.22853469848632812,
-0.0828385129570961,
-0.023376956582069397,
0.18547679483890533,
-0.007760809268802404,
0.10319223254919052,
-0.033652711659669876,
-0.09106358140707016,
0.002748937113210559,
0.017562294378876686,
-0.2120169848203659,
-0.04143116995692253,
0.0858837142586708,
-0.012610984034836292,
0.09440646320581436,
-0.04714668169617653,
0.001621488481760025,
0.1070573553442955,
0.05599372461438179,
-0.018922947347164154,
-0.009089205414056778,
0.029872359707951546,
-0.10413407534360886,
-0.05462685599923134,
0.04770362749695778,
0.00493355467915535,
-0.11373955011367798,
0.04055871441960335,
-0.10064697265625,
0.04101647064089775,
-0.16733373701572418,
-0.016937868669629097,
-0.004471272695809603,
0.07673639059066772,
-0.035708729177713394,
0.06403952091932297,
0.06532040983438492,
0.039379414170980453,
-0.030999401584267616,
-0.040171388536691666,
-0.02427910827100277,
0.07377103716135025,
-0.06011706218123436,
-0.17939476668834686,
-0.08689197152853012,
-0.05579279735684395,
0.01125532016158104,
-0.010918475687503815,
-0.18686245381832123,
-0.051870957016944885,
-0.10107588022947311,
0.00037625167169608176,
-0.1624269038438797,
0.024618223309516907,
0.10606223344802856,
0.045619890093803406,
0.00951701682060957,
-0.03292389586567879,
0.03255080804228783,
0.030340315774083138,
-0.18025068938732147,
-0.08541256189346313
] |
null | null |
transformers
|
## Swahili News Classification with RoBERTa
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
This [model](https://huggingface.co/flax-community/roberta-swahili) was used as the base and fine-tuned for this task.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili-news-classification")
model = AutoModelForSequenceClassification.from_pretrained("flax-community/roberta-swahili-news-classification")
```
```
Eval metrics: {'accuracy': 0.9153416415986249}
```
|
{"language": "sw", "datasets": ["flax-community/swahili-safi"], "widget": [{"text": "Idris ameandika kwenye ukurasa wake wa Instagram akimkumbusha Diamond kutekeleza ahadi yake kumpigia Zari magoti kumuomba msamaha kama alivyowahi kueleza awali.Idris ameandika;"}]}
|
text-classification
|
flax-community/roberta-swahili-news-classification
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"sw",
"dataset:flax-community/swahili-safi",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sw"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #roberta #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us
|
## Swahili News Classification with RoBERTa
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
This model was used as the base and fine-tuned for this task.
## How to use
|
[
"## Swahili News Classification with RoBERTa\n\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.",
"## How to use"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"## Swahili News Classification with RoBERTa\n\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.",
"## How to use"
] |
[
73,
82,
4
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #text-classification #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us \n## Swahili News Classification with RoBERTa\n\n\nThis model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.\n\nThis model was used as the base and fine-tuned for this task.## How to use"
] |
[
-0.03726717457175255,
0.013864543288946152,
-0.001753952237777412,
0.053509559482336044,
0.1377323418855667,
0.00811732467263937,
0.13133417069911957,
0.05472130328416824,
-0.0008531400235369802,
-0.028183236718177795,
0.11948616057634354,
0.04761647805571556,
0.04654953256249428,
0.2193210870027542,
-0.0459139384329319,
-0.23904506862163544,
0.014485605992376804,
0.005742552224546671,
-0.05001702532172203,
0.1681201457977295,
0.1189761757850647,
-0.07379461824893951,
0.0853213220834732,
-0.027331465855240822,
-0.1439904272556305,
0.014645780436694622,
0.003670360194519162,
-0.10619594156742096,
0.12072554975748062,
0.060915831476449966,
0.08704899251461029,
0.07535385340452194,
0.03568371385335922,
-0.050451550632715225,
0.07239502668380737,
0.03302248939871788,
-0.0875164270401001,
0.04333953931927681,
0.04955322667956352,
-0.027134034782648087,
0.13064609467983246,
-0.015959352254867554,
-0.016278019174933434,
0.05480904132127762,
-0.13761602342128754,
-0.12269569933414459,
-0.04999127611517906,
0.032940980046987534,
0.10536542534828186,
0.054976049810647964,
-0.013997871428728104,
0.19318583607673645,
-0.12997081875801086,
0.12237279862165451,
0.07782002538442612,
-0.2431025207042694,
-0.10164003819227219,
0.18141645193099976,
0.09186754375696182,
0.031596940010786057,
-0.04266420379281044,
0.06634471565485,
0.03360912948846817,
0.00981906522065401,
0.08919362723827362,
-0.06456717848777771,
-0.1310693621635437,
-0.02433153986930847,
-0.09521623700857162,
-0.03683718666434288,
0.24296046793460846,
0.009010142646729946,
-0.00970494095236063,
-0.0810830220580101,
-0.06213098019361496,
0.09480319172143936,
-0.05822621285915375,
-0.04060915485024452,
0.006227813195437193,
0.0207012090831995,
-0.01697724126279354,
-0.015527766197919846,
-0.08813582360744476,
-0.08323536068201065,
-0.07762344181537628,
0.13863754272460938,
-0.007370052393525839,
0.05953439697623253,
-0.19409792125225067,
0.07737889140844345,
0.022969931364059448,
-0.09024100005626678,
0.029116494581103325,
-0.04338042065501213,
0.008489708416163921,
-0.013064919039607048,
0.0001542355603305623,
-0.16913214325904846,
0.12128405272960663,
0.06479258835315704,
-0.007983720861375332,
-0.0010684751905500889,
0.06967418640851974,
0.043724291026592255,
0.03896474093198776,
0.09844042360782623,
-0.072702556848526,
-0.11078542470932007,
0.11076012253761292,
-0.02993576042354107,
0.006868801545351744,
-0.040025029331445694,
-0.0715675950050354,
-0.03384821116924286,
0.042588971555233,
0.043546538800001144,
0.07663735002279282,
0.13885065913200378,
0.010488452389836311,
-0.02811763994395733,
0.04838567599654198,
-0.07802855968475342,
-0.027527227997779846,
0.013757656328380108,
-0.008019349537789822,
-0.03789057582616806,
0.06235014647245407,
-0.002844443079084158,
-0.04239514842629433,
0.007073352113366127,
-0.04014972224831581,
-0.028976397588849068,
-0.021398551762104034,
-0.08313274383544922,
0.03148871287703514,
-0.16197048127651215,
0.0020424521062523127,
-0.20352905988693237,
-0.22684520483016968,
-0.07237579673528671,
0.00794594269245863,
0.01414426974952221,
-0.06266231089830399,
-0.037824030965566635,
0.005910756066441536,
0.005591090302914381,
-0.034068115055561066,
0.10817171633243561,
-0.07108607888221741,
0.07380589097738266,
-0.06760099530220032,
0.08935193717479706,
-0.06906716525554657,
0.019141897559165955,
-0.06246546283364296,
0.04746512323617935,
-0.09586027264595032,
0.07171724736690521,
-0.031065374612808228,
0.15730871260166168,
-0.06440034508705139,
0.03593175485730171,
-0.033507391810417175,
0.03372465819120407,
0.010218980722129345,
0.19589924812316895,
-0.18505270779132843,
-0.04429357498884201,
0.15883642435073853,
-0.1150868833065033,
-0.14164522290229797,
0.1428978592157364,
-0.038152065128088,
0.16211852431297302,
0.07222027331590652,
0.20344996452331543,
0.05318367853760719,
-0.013529974035918713,
0.014378794468939304,
0.04054007679224014,
-0.14258620142936707,
-0.10127227008342743,
0.026391517370939255,
0.07137180864810944,
-0.14676053822040558,
0.04454796761274338,
-0.05415317416191101,
0.1344950944185257,
-0.061652056872844696,
-0.03960802033543587,
-0.030452612787485123,
-0.07102783769369125,
0.06331398338079453,
0.033517561852931976,
0.05369593948125839,
-0.053445059806108475,
-0.0073972707614302635,
-0.08002981543540955,
0.041156020015478134,
-0.028464367613196373,
0.017711704596877098,
-0.09942153841257095,
0.13550160825252533,
-0.04636508226394653,
0.07083473354578018,
-0.121615931391716,
-0.05358555167913437,
-0.03902651369571686,
0.1436750292778015,
-0.05151468887925148,
0.025610411539673805,
0.06623699516057968,
-0.08138556778430939,
-0.016316281631588936,
-0.013574193231761456,
0.08413652330636978,
0.010815085843205452,
-0.06301020085811615,
-0.0852351039648056,
0.004680273123085499,
-0.06804239749908447,
0.06783667206764221,
-0.15072913467884064,
0.04921122267842293,
0.018737252801656723,
0.0714288055896759,
-0.005105477757751942,
0.04934333637356758,
0.075436070561409,
0.023419220000505447,
-0.023884670808911324,
-0.03806217014789581,
0.09517665207386017,
0.028055671602487564,
-0.09939923882484436,
0.16828332841396332,
-0.09924409538507462,
0.18107326328754425,
0.11905062943696976,
-0.07447580248117447,
-0.03626616299152374,
0.07569948583841324,
-0.04757322743535042,
-0.002786913886666298,
-0.04647163301706314,
0.059662505984306335,
0.06476247310638428,
-0.011336673982441425,
0.14702847599983215,
-0.07608714699745178,
-0.021129664033651352,
0.034633927047252655,
-0.06767552345991135,
0.004695772659033537,
0.09793165326118469,
-0.05456332862377167,
-0.09907975792884827,
0.08662831783294678,
0.13160459697246552,
0.028508692979812622,
0.20526330173015594,
-0.016575055196881294,
0.029013467952609062,
-0.0034260619431734085,
-0.01246712263673544,
-0.011208826676011086,
0.033516012132167816,
-0.15206286311149597,
-0.06088994815945625,
0.03592824935913086,
-0.013652016408741474,
0.015934187918901443,
-0.06849037110805511,
-0.03960505127906799,
-0.009866795502603054,
-0.022278646007180214,
-0.031213169917464256,
0.11663679778575897,
-0.03998231887817383,
0.11183734238147736,
-0.009726314805448055,
-0.08616592735052109,
0.040904659777879715,
-0.01573016680777073,
-0.10919887572526932,
0.175087109208107,
0.017521629109978676,
-0.30274641513824463,
-0.03839769586920738,
-0.059364061802625656,
-0.023339904844760895,
-0.006772879511117935,
0.058277763426303864,
-0.04408028721809387,
-0.022140005603432655,
-0.07976821810007095,
-0.04815451800823212,
0.08167295157909393,
0.037677306681871414,
-0.025044888257980347,
0.022783422842621803,
0.033355917781591415,
-0.09290632605552673,
-0.004632098134607077,
-0.060487646609544754,
-0.05170604586601257,
0.13582508265972137,
-0.09399623423814774,
0.13366903364658356,
0.08678898960351944,
-0.05336447060108185,
0.06030398607254028,
-0.031533412635326385,
0.21758702397346497,
-0.10144355893135071,
0.07724272459745407,
0.20358173549175262,
-0.03125428408384323,
-0.0002620200684759766,
0.09823600202798843,
0.002217481378465891,
-0.05404749885201454,
0.03179265558719635,
-0.048804912716150284,
-0.13419115543365479,
-0.1852935403585434,
-0.10088197141885757,
-0.04726894199848175,
0.07914498448371887,
-0.0066178166307508945,
0.07913678884506226,
0.0034101002383977175,
0.0856769010424614,
0.02139277197420597,
0.009798762388527393,
0.03666858747601509,
0.03284694626927376,
-0.024379733949899673,
-0.012953034602105618,
0.09315235167741776,
-0.09122838079929352,
-0.05830443277955055,
0.07920219004154205,
-0.006035941652953625,
0.050162024796009064,
0.015274652279913425,
-0.009644976817071438,
0.06756306439638138,
0.1448659598827362,
0.11710884422063828,
0.11188660562038422,
0.0024158006999641657,
-0.03285061568021774,
-0.029558595269918442,
-0.018274618312716484,
-0.06443341076374054,
0.003955974243581295,
0.005043296609073877,
-0.019929729402065277,
-0.036307670176029205,
-0.07774490863084793,
0.07656892389059067,
0.2208690345287323,
0.0067024920135736465,
-0.21527676284313202,
-0.03865649178624153,
0.028653519228100777,
0.004562440328299999,
-0.06232975050806999,
0.034805167466402054,
0.09522610902786255,
-0.11341283470392227,
0.05826753377914429,
-0.01705266907811165,
0.12687352299690247,
-0.011848918162286282,
0.017898960039019585,
-0.09578081220388412,
-0.05399731174111366,
-0.04015516862273216,
0.09755449742078781,
-0.27137255668640137,
0.19892534613609314,
-0.0015845499001443386,
0.04459255933761597,
-0.04304247722029686,
-0.06138785183429718,
0.07126811891794205,
0.22891123592853546,
0.18269535899162292,
0.030090289190411568,
0.07239681482315063,
-0.047875139862298965,
-0.13426190614700317,
0.049667276442050934,
-0.049875810742378235,
-0.029232554137706757,
0.03491285443305969,
0.010669843293726444,
-0.012540670111775398,
-0.0013021412305533886,
0.05080997198820114,
-0.09154234826564789,
-0.0427619144320488,
-0.028727270662784576,
0.09319473803043365,
0.035973887890577316,
-0.028520755469799042,
-0.0720662996172905,
-0.13505688309669495,
0.07775506377220154,
0.039190519601106644,
-0.058755550533533096,
-0.154600590467453,
-0.03813749551773071,
-0.031128594651818275,
-0.0526326559484005,
-0.011275560595095158,
0.008990061469376087,
0.030792882665991783,
-0.040389545261859894,
-0.1259462833404541,
0.08645795285701752,
-0.10926920175552368,
-0.11550656706094742,
-0.009899859316647053,
0.03406812623143196,
0.02360282652080059,
-0.01955614797770977,
0.06755813211202621,
0.0033620072063058615,
-0.07012426853179932,
-0.08892624825239182,
0.02469000592827797,
0.07340090721845627,
0.003600128460675478,
0.0007893930887803435,
0.051506251096725464,
-0.13134729862213135,
0.0020639500580728054,
-0.009594559669494629,
0.14538176357746124,
0.1479806751012802,
-0.08202158659696579,
0.11408838629722595,
0.15271475911140442,
-0.03363831713795662,
-0.3584372103214264,
-0.11345522850751877,
-0.03365236520767212,
0.026332829147577286,
0.03317839279770851,
0.00207082973793149,
0.07314763963222504,
-0.021452348679304123,
-0.03191351145505905,
-0.021467724815011024,
-0.1883973777294159,
-0.11234412342309952,
0.14807479083538055,
0.08049877732992172,
0.4324425458908081,
-0.12375658005475998,
-0.03103826753795147,
-0.050150610506534576,
-0.10027118027210236,
0.09137998521327972,
-0.2080306112766266,
0.07334975153207779,
-0.052182137966156006,
-0.02327575348317623,
-0.011310522444546223,
-0.05292556807398796,
0.10075713694095612,
-0.008859576657414436,
0.05020016431808472,
-0.12139193713665009,
-0.07088039070367813,
0.1294572949409485,
-0.057741206139326096,
0.09251893311738968,
-0.039401762187480927,
0.04884908348321915,
-0.17059209942817688,
-0.04329735413193703,
-0.09505768120288849,
0.11890676617622375,
-0.009972920641303062,
-0.03277825936675072,
0.0014816989423707128,
0.01638386957347393,
0.0375606007874012,
0.01098683662712574,
0.12359467148780823,
-0.072150819003582,
0.12934541702270508,
0.1933973729610443,
0.14537030458450317,
-0.09409358352422714,
0.035187821835279465,
-0.028015105053782463,
-0.04615712910890579,
0.07570447772741318,
-0.15324799716472626,
0.01712065190076828,
0.04579702392220497,
0.00820198841392994,
0.0545017272233963,
0.0634438619017601,
-0.0342290960252285,
0.020734868943691254,
0.07553970068693161,
-0.18515989184379578,
-0.14143534004688263,
-0.05780414864420891,
-0.12686456739902496,
-0.013482112437486649,
0.08891253173351288,
0.12619169056415558,
-0.060975782573223114,
-0.025393376126885414,
-0.021741624921560287,
0.004127366468310356,
-0.025558078661561012,
0.09506700187921524,
0.0928911417722702,
0.03058864176273346,
-0.11803121119737625,
0.04989918693900108,
0.05331343412399292,
-0.033269546926021576,
-0.014710335060954094,
0.07568387687206268,
-0.15410539507865906,
-0.10359959304332733,
0.010753144510090351,
0.21947652101516724,
0.00042682269122451544,
-0.11398207396268845,
-0.10443202406167984,
-0.12382086366415024,
0.04783583804965019,
0.22351127862930298,
0.10088447481393814,
0.04801318794488907,
0.005671949125826359,
-0.0684765949845314,
-0.06319349259138107,
0.08068737387657166,
0.0945594310760498,
-0.00020644295727834105,
-0.11936940252780914,
0.07173185795545578,
0.015434630215168,
0.16744834184646606,
-0.11362528055906296,
-0.06757589429616928,
-0.14201290905475616,
0.031964126974344254,
-0.07119786739349365,
0.026668403297662735,
-0.062112271785736084,
0.013966912403702736,
-0.040475357323884964,
-0.04468048736453056,
-0.052802491933107376,
0.017690615728497505,
-0.0523097924888134,
0.034632325172424316,
0.005551137495785952,
0.04522736743092537,
-0.033900972455739975,
-0.054898571223020554,
0.007585377432405949,
-0.022960873320698738,
0.1422041356563568,
0.03840568661689758,
-0.07654791325330734,
0.010422714985907078,
-0.20301546156406403,
-0.027571655809879303,
0.03252070024609566,
-0.029836002737283707,
0.0438850000500679,
-0.0197085402905941,
0.020896444097161293,
0.013157867826521397,
-0.03712800517678261,
0.040995631366968155,
0.09648606181144714,
-0.07609348744153976,
0.09811737388372421,
-0.04149611294269562,
-0.05194871500134468,
-0.032026179134845734,
0.014819505624473095,
0.04360612481832504,
0.12510143220424652,
0.12266669422388077,
-0.08109451085329056,
0.037724319845438004,
-0.14133206009864807,
0.014189106412231922,
-0.017229905351996422,
-0.14362993836402893,
-0.1266842782497406,
-0.041550543159246445,
0.039955537766218185,
-0.023927122354507446,
0.14098504185676575,
0.15253694355487823,
-0.06413256376981735,
-0.010762464255094528,
0.08277232199907303,
-0.012560867704451084,
-0.035549212247133255,
0.1586294025182724,
-0.03368743509054184,
0.0062245819717645645,
0.02877768874168396,
0.052687399089336395,
0.028835393488407135,
-0.002163963858038187,
0.028764622285962105,
0.040108051151037216,
-0.01195804588496685,
0.11044975370168686,
-0.014522189274430275,
0.02742088958621025,
0.057685334235429764,
-0.056854136288166046,
-0.10779588669538498,
0.08191744238138199,
-0.04281419515609741,
-0.014208959415555,
0.17860761284828186,
-0.048939354717731476,
0.025610007345676422,
0.012220642529428005,
-0.03225453197956085,
-0.13563908636569977,
-0.18427136540412903,
-0.10337138921022415,
-0.09081487357616425,
0.0013303456362336874,
-0.057147469371557236,
-0.0643344596028328,
0.09688767045736313,
0.01692051813006401,
-0.026683056727051735,
0.04861140996217728,
0.008538489229977131,
-0.015719613060355186,
0.1596192717552185,
-0.04387368634343147,
-0.012126673012971878,
-0.06918761134147644,
0.01286117359995842,
-0.01797514222562313,
0.010944249108433723,
-0.03672507032752037,
0.003582444041967392,
0.024287275969982147,
0.05331087112426758,
-0.056435272097587585,
-0.1033530905842781,
-0.020486675202846527,
0.04729059711098671,
0.029361315071582794,
0.033761315047740936,
0.047538433223962784,
-0.02641245163977146,
0.036003533750772476,
0.2164086401462555,
0.010644923895597458,
-0.051032520830631256,
-0.09449341893196106,
0.04432579502463341,
-0.0062573798932135105,
0.015276932157576084,
-0.014693234115839005,
-0.04981878399848938,
0.046979889273643494,
0.26500204205513,
0.23886878788471222,
-0.03949255496263504,
0.061241619288921356,
-0.04846405237913132,
0.016387028619647026,
0.061843063682317734,
0.1588495522737503,
0.05457521229982376,
0.09897302836179733,
-0.053074948489665985,
-0.012012208811938763,
-0.06763732433319092,
-0.05053532496094704,
-0.04594933241605759,
0.0706845298409462,
0.012832120060920715,
-0.037857647985219955,
-0.039459407329559326,
0.1026928499341011,
-0.11139457672834396,
-0.04224025085568428,
0.0313270166516304,
-0.1258295625448227,
-0.07881912589073181,
-0.05988336354494095,
0.01136864721775055,
0.06700299680233002,
0.05859004706144333,
-0.025557653978466988,
0.05147729814052582,
0.061695367097854614,
-0.008182158693671227,
-0.0910552516579628,
-0.009939574636518955,
0.06872320920228958,
-0.07607993483543396,
0.14644555747509003,
-0.04487578570842743,
-0.010653669945895672,
0.05133730545639992,
-0.028078995645046234,
-0.11611268669366837,
0.12267929315567017,
-0.04465753212571144,
0.06945719569921494,
0.06150807440280914,
0.025117184966802597,
-0.003037560498341918,
-0.07320646196603775,
0.0032797956373542547,
-0.10213758051395416,
0.055118221789598465,
-0.022539576515555382,
-0.033160462975502014,
-0.09540673345327377,
0.10772889107465744,
-0.02075178362429142,
0.11144258826971054,
0.06663325428962708,
-0.04775632545351982,
-0.00235673482529819,
-0.07566308230161667,
0.004187189508229494,
-0.014378063380718231,
-0.05090392008423805,
-0.021622074767947197,
-0.10005341470241547,
-0.02071375586092472,
-0.06605666130781174,
-0.01950550451874733,
-0.14643970131874084,
0.022228779271245003,
-0.16246865689754486,
-0.06469936668872833,
-0.01446366123855114,
0.06640875339508057,
0.1169714480638504,
0.02093277871608734,
-0.003495319513604045,
0.03305476903915405,
0.020498406141996384,
0.11824291199445724,
-0.08091095834970474,
-0.10090981423854828
] |
null | null |
transformers
|
## RoBERTa in Swahili
This model was trained using HuggingFace's Flax framework and is part of the [JAX/Flax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104) organized by [HuggingFace](https://huggingface.co). All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili")
model = AutoModelForMaskedLM.from_pretrained("flax-community/roberta-swahili")
print(round((model.num_parameters())/(1000*1000)),"Million Parameters")
105 Million Parameters
```
#### **Training Data**:
This model was trained on [Swahili Safi](https://huggingface.co/datasets/flax-community/swahili-safi)
#### **Results**:
[MasakhaNER](https://github.com/masakhane-io/masakhane-ner) [](https://colab.research.google.com/drive/1OIurb4J91X7461NQXLCCGzjeEGJq_Tyl?usp=sharing)
```
Eval metrics: {'f1': 86%}
```
This [model](https://huggingface.co/flax-community/roberta-swahili-news-classification) was fine-tuned based off this model for the
[Zindi News Classification Challenge](https://zindi.africa/hackathons/ai4d-swahili-news-classification-challenge)
#### **More Details**:
For more details and Demo please check [HF Swahili Space](https://huggingface.co/spaces/flax-community/Swahili)
|
{"language": "sw", "datasets": ["flax-community/swahili-safi"], "widget": [{"text": "Si kila mwenye makucha <mask> simba."}]}
|
fill-mask
|
flax-community/roberta-swahili
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"sw",
"dataset:flax-community/swahili-safi",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sw"
] |
TAGS
#transformers #pytorch #jax #tensorboard #roberta #fill-mask #sw #dataset-flax-community/swahili-safi #autotrain_compatible #endpoints_compatible #has_space #region-us
|
## RoBERTa in Swahili
This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
## How to use
#### Training Data:
This model was trained on Swahili Safi
#### Results:
MasakhaNER , including training scripts.
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Team members
- Prateek Agrawal (prateekagrawal)
- Tanay Mehta (yotanay)
- Shreya Gupta (Sheyz-max)
- Ruchi Bhatia (ruchi798)
## Dataset :
[OSCAR](https://huggingface.co/datasets/oscar)
- config : **unshuffled_deduplicated_it**
- Size of downloaded dataset files: **26637.62 MB**
- Size of the generated dataset: **70661.48 MB**
- Total amount of disk used: **97299.10 MB**
## Useful links
- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/robit-pretrain-roberta-base-from-scratch-in-italian/7564)
- [Community Week channel](https://discord.gg/NTyQNUNs)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/robit-roberta-base-it/)
|
{}
|
fill-mask
|
flax-community/robit-roberta-base-it
|
[
"transformers",
"jax",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
# RobIt
RobIt is a RoBERTa-base model for Italian. It has been trained from scratch on the Italian portion of the OSCAR dataset using Flax, including training scripts.
This is part of the
Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## Team members
- Prateek Agrawal (prateekagrawal)
- Tanay Mehta (yotanay)
- Shreya Gupta (Sheyz-max)
- Ruchi Bhatia (ruchi798)
## Dataset :
OSCAR
- config : unshuffled_deduplicated_it
- Size of downloaded dataset files: 26637.62 MB
- Size of the generated dataset: 70661.48 MB
- Total amount of disk used: 97299.10 MB
## Useful links
- Community Week timeline
- Community Week README
- Community Week thread
- Community Week channel
- Masked Language Modelling example scripts
- Model Repository
|
[
"# RobIt\n\nRobIt is a RoBERTa-base model for Italian. It has been trained from scratch on the Italian portion of the OSCAR dataset using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Team members\n- Prateek Agrawal (prateekagrawal)\n- Tanay Mehta (yotanay)\n- Shreya Gupta (Sheyz-max)\n- Ruchi Bhatia (ruchi798)",
"## Dataset :\n\nOSCAR\n\n- config : unshuffled_deduplicated_it\n- Size of downloaded dataset files: 26637.62 MB\n- Size of the generated dataset: 70661.48 MB\n- Total amount of disk used: 97299.10 MB",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
"TAGS\n#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n",
"# RobIt\n\nRobIt is a RoBERTa-base model for Italian. It has been trained from scratch on the Italian portion of the OSCAR dataset using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Team members\n- Prateek Agrawal (prateekagrawal)\n- Tanay Mehta (yotanay)\n- Shreya Gupta (Sheyz-max)\n- Ruchi Bhatia (ruchi798)",
"## Dataset :\n\nOSCAR\n\n- config : unshuffled_deduplicated_it\n- Size of downloaded dataset files: 26637.62 MB\n- Size of the generated dataset: 70661.48 MB\n- Total amount of disk used: 97299.10 MB",
"## Useful links\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
36,
71,
46,
62,
36
] |
[
"passage: TAGS\n#transformers #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n# RobIt\n\nRobIt is a RoBERTa-base model for Italian. It has been trained from scratch on the Italian portion of the OSCAR dataset using Flax, including training scripts.\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## Team members\n- Prateek Agrawal (prateekagrawal)\n- Tanay Mehta (yotanay)\n- Shreya Gupta (Sheyz-max)\n- Ruchi Bhatia (ruchi798)## Dataset :\n\nOSCAR\n\n- config : unshuffled_deduplicated_it\n- Size of downloaded dataset files: 26637.62 MB\n- Size of the generated dataset: 70661.48 MB\n- Total amount of disk used: 97299.10 MB## Useful links\n- Community Week timeline\n- Community Week README\n- Community Week thread\n- Community Week channel\n- Masked Language Modelling example scripts\n- Model Repository"
] |
[
-0.0928887203335762,
0.129591166973114,
0.0005719857872463763,
0.08999704569578171,
0.09251450002193451,
0.017377953976392746,
0.12248920649290085,
0.10610749572515488,
0.07509791851043701,
0.0752435028553009,
0.017145493999123573,
0.010506292805075645,
0.06391222029924393,
0.23911245167255402,
-0.0058184051886200905,
-0.22942593693733215,
-0.002088642679154873,
-0.03974355757236481,
-0.15224547684192657,
0.09879333525896072,
0.1243363469839096,
-0.06856343150138855,
0.08379315584897995,
-0.06448511779308319,
-0.10492636263370514,
0.0575704425573349,
-0.009822194464504719,
-0.11025020480155945,
0.08899146318435669,
0.05115532502532005,
-0.0014473367482423782,
-0.021730860695242882,
0.024884717538952827,
-0.06279293447732925,
0.034657228738069534,
0.03232084959745407,
-0.03454214707016945,
-0.020116059109568596,
0.0960039272904396,
0.023466981947422028,
0.2230052649974823,
-0.002789219142869115,
-0.012760389596223831,
0.04172832891345024,
-0.12168791145086288,
-0.009157057851552963,
-0.09302698075771332,
-0.02270185574889183,
-0.028268322348594666,
0.10638934373855591,
-0.036887217313051224,
0.1911616325378418,
-0.21010975539684296,
0.03570351377129555,
0.1153760477900505,
-0.06663122028112411,
-0.08085230737924576,
0.12093105912208557,
0.06774827837944031,
0.01826772838830948,
-0.03140551969408989,
0.03993159905076027,
0.05800819396972656,
-0.02846159040927887,
0.014688456431031227,
-0.041544169187545776,
-0.08741825819015503,
-0.056496039032936096,
-0.026339398697018623,
-0.004984773229807615,
0.20865795016288757,
0.03809204697608948,
0.018562134355306625,
-0.07108043879270554,
0.001870966050773859,
0.09565220028162003,
-0.08264563977718353,
0.02604423277080059,
0.04594233259558678,
0.01706911250948906,
-0.03730767220258713,
0.04601048305630684,
-0.0643487349152565,
-0.030180932953953743,
0.005880114156752825,
0.11742416769266129,
0.012997294776141644,
0.036157943308353424,
-0.10088195651769638,
0.012454940006136894,
-0.13007991015911102,
-0.0777406096458435,
-0.04588650166988373,
-0.007739931344985962,
0.04470640420913696,
0.004626134876161814,
0.07595929503440857,
-0.03618666157126427,
0.10974635183811188,
0.23398295044898987,
0.022127823904156685,
-0.003690253011882305,
0.01609206572175026,
-0.0022330607753247023,
0.0631353110074997,
0.040842413902282715,
-0.07979113608598709,
-0.1512724608182907,
0.08122126758098602,
0.014473319984972477,
0.009495992213487625,
-0.016132444143295288,
-0.04314034432172775,
-0.004334680270403624,
-0.05470898374915123,
0.0466584637761116,
0.14973974227905273,
0.09377075731754303,
-0.05016264691948891,
-0.00231700437143445,
0.18547874689102173,
-0.10242832452058792,
0.022841932252049446,
0.022264819592237473,
-0.10346537083387375,
0.016294673085212708,
-0.07810685783624649,
-0.000016705183952581137,
-0.0037593082524836063,
0.030876576900482178,
-0.06693870574235916,
-0.03428575024008751,
-0.062960185110569,
-0.08688820898532867,
0.11112584918737411,
-0.22970998287200928,
-0.01670348457992077,
-0.19690439105033875,
-0.1745806783437729,
-0.054452504962682724,
-0.05326465144753456,
-0.10219021886587143,
-0.022359389811754227,
-0.03075864166021347,
0.0018738351063802838,
-0.02253424935042858,
-0.028157765045762062,
0.11653748154640198,
-0.09288496524095535,
0.047492630779743195,
-0.019371390342712402,
0.06763213872909546,
-0.06362754106521606,
0.053896911442279816,
-0.07931375503540039,
0.00017046096036210656,
-0.08430822938680649,
0.0686522051692009,
-0.10435859858989716,
0.12876033782958984,
-0.07379438728094101,
0.03680003434419632,
-0.0859452560544014,
0.03903472051024437,
0.03360866755247116,
0.1778203696012497,
-0.15419667959213257,
-0.043109044432640076,
0.1143990233540535,
-0.1024291142821312,
-0.08086921274662018,
0.1368849128484726,
-0.022192595526576042,
0.11842404305934906,
0.0688030868768692,
0.12249638140201569,
0.08460085839033127,
-0.054043449461460114,
-0.09198876470327377,
0.0076757995411753654,
-0.04529675096273422,
-0.10748915374279022,
0.0422658734023571,
0.07319847494363785,
0.09844803065061569,
0.04878067225217819,
-0.0415436327457428,
0.09405498951673508,
-0.05181615799665451,
-0.017408685758709908,
-0.007504851557314396,
-0.1177927702665329,
-0.011483627371490002,
0.0266953706741333,
0.06837765872478485,
-0.08455327898263931,
-0.02891506999731064,
-0.13175389170646667,
0.10103972256183624,
0.01936951093375683,
0.01715502142906189,
-0.05763242021203041,
0.177857905626297,
0.028479265049099922,
0.011833730153739452,
-0.13304893672466278,
0.014833514578640461,
0.07432099431753159,
0.14669999480247498,
-0.04161993786692619,
0.009006338194012642,
0.04950076341629028,
-0.07209160923957825,
-0.038364771753549576,
0.025492612272500992,
0.001690801465883851,
-0.007327169179916382,
-0.024167802184820175,
-0.06860678642988205,
0.017977502197027206,
-0.07308147847652435,
0.09684175997972488,
-0.13619036972522736,
0.010938011109828949,
0.13717906177043915,
0.12763594090938568,
0.0016968015115708113,
-0.004531778395175934,
0.024958662688732147,
0.02319779433310032,
-0.04651700705289841,
-0.09700625389814377,
0.04081320762634277,
-0.009012050926685333,
-0.026554429903626442,
0.17699527740478516,
-0.08739525824785233,
-0.04713383689522743,
0.07383044064044952,
0.05132374167442322,
-0.033696550875902176,
0.16229380667209625,
-0.04035671427845955,
-0.03504592180252075,
-0.07080741226673126,
-0.015785038471221924,
0.008240258321166039,
-0.011195738799870014,
0.11021828651428223,
-0.07791654765605927,
-0.018690943717956543,
0.024562248960137367,
-0.04894499480724335,
-0.0346686989068985,
0.08462253212928772,
0.029884571209549904,
-0.014908351004123688,
0.1340615302324295,
-0.01914801076054573,
0.08384725451469421,
0.28430935740470886,
0.014901241287589073,
-0.0543699711561203,
0.057083286345005035,
-0.025992311537265778,
0.02526889555156231,
-0.009154691360890865,
0.005099939648061991,
-0.044116005301475525,
0.008122914470732212,
0.03864305093884468,
0.01779763400554657,
-0.10568612813949585,
-0.028021108359098434,
-0.05406299605965614,
-0.007637134753167629,
-0.03154086694121361,
0.09948926419019699,
-0.012044479139149189,
0.09785397350788116,
0.060024723410606384,
-0.019204070791602135,
0.00828177947551012,
-0.0033161782193928957,
-0.07853889465332031,
0.14819355309009552,
-0.037291519343853,
-0.28126049041748047,
-0.06724581122398376,
-0.07808931171894073,
0.08349073678255081,
0.0015671583823859692,
0.07445576786994934,
-0.04891011118888855,
-0.053964681923389435,
-0.06819915026426315,
0.09070635586977005,
0.0020478249061852694,
-0.06119590625166893,
-0.0636780858039856,
0.06904380768537521,
-0.03307178616523743,
-0.10189958661794662,
-0.00266362726688385,
0.005437415558844805,
-0.09170860797166824,
0.09120026230812073,
-0.13557638227939606,
0.11688177287578583,
-0.02920110523700714,
-0.007860925048589706,
-0.0031155983451753855,
-0.01942736841738224,
0.17419812083244324,
-0.1498952955007553,
0.1253691166639328,
0.1658400446176529,
0.04262501746416092,
0.01784733682870865,
0.09368832409381866,
0.001297297072596848,
-0.0762939527630806,
0.02550545521080494,
0.0415402352809906,
-0.09239978343248367,
-0.2324618250131607,
-0.1075562983751297,
-0.10360721498727798,
0.12451337277889252,
0.07644478976726532,
0.06503263115882874,
-0.031343184411525726,
0.07918799668550491,
-0.031862206757068634,
0.018591834232211113,
-0.02722715400159359,
0.040511757135391235,
-0.020543018355965614,
0.04807547107338905,
0.04721914604306221,
-0.1075800284743309,
0.002521344693377614,
0.15239934623241425,
0.035936739295721054,
0.16866666078567505,
-0.04177124425768852,
0.10672168433666229,
0.04626822471618652,
0.05313178896903992,
0.06605786085128784,
0.016391942277550697,
0.03733881935477257,
0.010606932453811169,
-0.050654005259275436,
0.0023261578753590584,
-0.0189666748046875,
0.05789623782038689,
0.052465178072452545,
-0.16605480015277863,
0.021568115800619125,
-0.0007419802132062614,
0.048536449670791626,
0.2770492434501648,
-0.0038606440648436546,
-0.15950030088424683,
-0.03452439606189728,
0.031383417546749115,
0.010740639641880989,
-0.01224316842854023,
0.019961366429924965,
0.11211002618074417,
-0.13206490874290466,
0.062079206109046936,
0.021519247442483902,
0.11884063482284546,
-0.08440151810646057,
-0.03302668780088425,
-0.05779219791293144,
0.026071593165397644,
-0.028995539993047714,
0.0914543867111206,
-0.22751154005527496,
0.18611036241054535,
0.0025506147649139166,
0.07870602607727051,
-0.08670132607221603,
-0.02985537052154541,
0.024522528052330017,
0.07292896509170532,
0.08696182817220688,
0.0547337606549263,
-0.0688120573759079,
-0.10043449699878693,
-0.1411323994398117,
0.0021804363932460546,
-0.04497334361076355,
-0.04895506054162979,
0.02238224260509014,
-0.00008018037624424323,
-0.024857297539711,
-0.02186596766114235,
-0.18434911966323853,
-0.0859060287475586,
-0.11360781639814377,
0.05851997435092926,
0.12801161408424377,
-0.04752812907099724,
-0.07223466783761978,
-0.07576005160808563,
-0.07794899493455887,
0.03328601270914078,
0.07113886624574661,
0.007895609363913536,
-0.1329917311668396,
0.1197887659072876,
0.06866009533405304,
-0.06978608667850494,
0.057961564511060715,
0.0045899637043476105,
-0.022846674546599388,
-0.05390779674053192,
-0.011052628979086876,
0.09112603217363358,
-0.10368578881025314,
-0.13277454674243927,
-0.038954880088567734,
0.025466609746217728,
0.07562052458524704,
0.005423395894467831,
-0.000012487114872783422,
0.03553738445043564,
0.01607334241271019,
-0.007323423866182566,
-0.051748521625995636,
0.07654090225696564,
0.061074044555425644,
0.024008335545659065,
-0.02435494028031826,
-0.10251665115356445,
-0.01920820213854313,
-0.15213453769683838,
0.04875498265028,
0.18238744139671326,
-0.08896137028932571,
0.06750032305717468,
0.11009569466114044,
-0.016756342723965645,
-0.29282742738723755,
-0.006428196094930172,
0.0004214332439005375,
-0.00663274759426713,
0.04626724123954773,
-0.1232389584183693,
0.033650171011686325,
0.08986575156450272,
-0.014970112591981888,
0.10673839598894119,
-0.21252797544002533,
-0.05435001105070114,
-0.011605732142925262,
-0.0030147621873766184,
0.2827266454696655,
-0.07764005661010742,
-0.018684806302189827,
-0.0036612506955862045,
-0.21087364852428436,
-0.022141538560390472,
-0.06533914804458618,
0.08447561413049698,
-0.031360089778900146,
-0.009665027260780334,
-0.028625428676605225,
-0.042716801166534424,
0.048405710607767105,
0.07196333259344101,
0.0028635100461542606,
-0.07467389106750488,
-0.10009938478469849,
0.1117139533162117,
-0.04798004776239395,
0.08219143003225327,
-0.05051518976688385,
0.008035818114876747,
-0.10966461151838303,
-0.04034535586833954,
-0.02599049173295498,
0.12674380838871002,
-0.0006189321866258979,
0.010122602805495262,
-0.04988072067499161,
0.10746520757675171,
0.034212078899145126,
-0.0010403257329016924,
0.01476274710148573,
-0.07433609664440155,
0.03870290517807007,
0.04350787773728371,
0.04941864311695099,
0.04643908515572548,
0.014217113144695759,
-0.0895485207438469,
-0.0134660042822361,
0.06268497556447983,
-0.17366838455200195,
0.004054867196828127,
0.027272989973425865,
-0.008525462821125984,
0.0629706159234047,
-0.018439356237649918,
-0.1275537610054016,
0.05308254435658455,
0.10736560821533203,
-0.02275598794221878,
-0.015014839358627796,
0.03302004188299179,
-0.05018264427781105,
-0.02856048382818699,
-0.04892176762223244,
0.07570941746234894,
-0.06361956894397736,
-0.04988853633403778,
-0.04554144665598869,
0.02437509596347809,
0.026356860995292664,
0.1592441350221634,
0.030861495062708855,
0.006038669962435961,
-0.10097141563892365,
0.1147800013422966,
0.017062056809663773,
-0.07214407622814178,
-0.03204294294118881,
0.13260266184806824,
-0.11962796747684479,
-0.07996591925621033,
0.06549889594316483,
0.07318879663944244,
0.0036521588917821646,
-0.0998581275343895,
-0.07590879499912262,
-0.03784199804067612,
0.07226324081420898,
0.08251234143972397,
0.029921535402536392,
0.058404289186000824,
0.00683404179289937,
-0.013873836025595665,
-0.06540994346141815,
0.07028325647115707,
0.15862494707107544,
0.02998942695558071,
-0.019782261922955513,
-0.04300718382000923,
0.059375692158937454,
0.02944602072238922,
-0.03397609293460846,
-0.01267651654779911,
-0.11976034939289093,
-0.020053837448358536,
0.03164210543036461,
0.016865205019712448,
-0.0620923675596714,
0.016561511904001236,
-0.11121220141649246,
-0.048350654542446136,
-0.04328860342502594,
0.03084530681371689,
-0.07529918104410172,
-0.02463238686323166,
-0.08625093102455139,
0.06660763919353485,
-0.041397664695978165,
-0.023300519213080406,
0.08339905738830566,
-0.05663527920842171,
0.08700548857450485,
0.031042158603668213,
-0.04374045506119728,
-0.014081436209380627,
-0.14109647274017334,
-0.040269412100315094,
-0.0006843178998678923,
0.034881774336099625,
0.06969479471445084,
0.08188079297542572,
0.0257294699549675,
0.003714635269716382,
0.09597282111644745,
-0.007321355864405632,
-0.014922210946679115,
-0.12500576674938202,
0.07482164353132248,
-0.11715731024742126,
-0.14690084755420685,
-0.01276073046028614,
0.04048125073313713,
0.07751806825399399,
0.06766168028116226,
0.07308139652013779,
-0.08473218977451324,
0.03624185547232628,
-0.045362379401922226,
-0.0019030582625418901,
-0.010388915427029133,
-0.1507139503955841,
-0.09109503775835037,
-0.01690082438290119,
0.07572726905345917,
0.024336470291018486,
0.16914594173431396,
0.10314087569713593,
-0.061116669327020645,
0.021508285775780678,
0.012215008959174156,
-0.022467270493507385,
-0.03539086505770683,
0.15915784239768982,
-0.0001945571566466242,
-0.0005085640586912632,
-0.03176286444067955,
0.03950081393122673,
0.034873660653829575,
0.0487963892519474,
0.05271359160542488,
0.18121708929538727,
0.10622553527355194,
0.08112768083810806,
0.03234095126390457,
-0.050421372056007385,
0.048305317759513855,
0.03406108543276787,
-0.15601062774658203,
0.07308720797300339,
-0.10833492130041122,
0.07918800413608551,
0.12044527381658554,
-0.06710497289896011,
0.07164967060089111,
-0.030803849920630455,
-0.0001320644369116053,
-0.08298447728157043,
-0.10771968960762024,
-0.11723743379116058,
-0.1102227196097374,
0.046806178987026215,
-0.03845294192433357,
0.048766493797302246,
0.14174699783325195,
0.0416627936065197,
-0.029154596850275993,
0.10613708198070526,
-0.026379551738500595,
-0.0576871819794178,
0.10595300793647766,
-0.020524149760603905,
0.011492031626403332,
0.002465351950377226,
-0.04921647533774376,
-0.02455400489270687,
0.031790442764759064,
0.017175279557704926,
0.0009202137589454651,
0.007147705648094416,
0.07709415256977081,
-0.04762418195605278,
-0.09877470135688782,
-0.007631972432136536,
0.011278386227786541,
0.01967078447341919,
0.061461370438337326,
0.04006623104214668,
0.0109650744125247,
0.031999871134757996,
0.18472443521022797,
0.007123949471861124,
-0.07792440801858902,
-0.1263270080089569,
-0.038770876824855804,
-0.025186216458678246,
-0.015545796602964401,
-0.05478212237358093,
-0.12121246010065079,
0.018329579383134842,
0.23919124901294708,
0.24630513787269592,
0.0428168885409832,
0.001576376031152904,
-0.06959410011768341,
0.009152650833129883,
0.03733620420098305,
0.14381080865859985,
0.023444391787052155,
0.030666043981909752,
-0.011099469847977161,
-0.0953729972243309,
-0.07154694944620132,
-0.025392869487404823,
-0.10631048679351807,
0.1045379489660263,
0.01697608456015587,
-0.02180357277393341,
-0.08224116265773773,
0.10193590819835663,
0.04172057285904884,
-0.14449834823608398,
0.042433224618434906,
-0.12462761998176575,
-0.09293096512556076,
-0.03743067756295204,
-0.07109382748603821,
0.15282125771045685,
0.1277145892381668,
-0.012541968375444412,
-0.0016667685704305768,
-0.005319309886544943,
0.020783865824341774,
-0.14833581447601318,
-0.09916774183511734,
0.06041625142097473,
0.04962921515107155,
0.16752628982067108,
-0.09069264680147171,
0.023900825530290604,
0.0932207778096199,
-0.035178836435079575,
-0.1295183151960373,
0.06272242218255997,
0.03628924489021301,
0.03334507718682289,
0.03323642164468765,
0.03001481294631958,
-0.09107143431901932,
-0.014376240782439709,
0.04185214638710022,
-0.08500440418720245,
0.04301527515053749,
0.0006403724546544254,
0.019057802855968475,
-0.1655419021844864,
0.09013791382312775,
-0.07726090401411057,
0.11472677439451218,
0.09429837763309479,
-0.0420052669942379,
-0.08979642391204834,
-0.05817301571369171,
0.0064502316527068615,
0.010378281585872173,
0.016836458817124367,
-0.024184193462133408,
-0.20694968104362488,
-0.07912187278270721,
-0.0445554181933403,
0.07338659465312958,
-0.16702044010162354,
0.05539979785680771,
-0.15910552442073822,
-0.031587325036525726,
-0.007137480657547712,
0.06684107333421707,
0.06078125163912773,
0.003590247593820095,
-0.03770019859075546,
-0.20586542785167694,
0.009397969581186771,
0.09633462131023407,
-0.07832066714763641,
-0.0399772971868515
] |
null | null |
transformers
|
# Spanish T5 (small) trained on [large_spanish_corpus](https://huggingface.co/datasets/viewer/?dataset=large_spanish_corpus).
This is a Spanish **T5** (small arch) trained from scratch on the [large_spanish_corpus](https://huggingface.co/datasets/viewer/?dataset=large_spanish_corpus) aka BETO's corpus with [Flax](https://github.com/google/flax)
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Dataset
The dataset is about 20 GB. 95% of the data was used for training and the rest 5% for validation.
## [Metrics](https://huggingface.co/flax-community/spanish-t5-small/tensorboard) (on evaluation dataset)
- Accuracy: 0.675
## Team members
- Manuel Romero ([mrm8488](https://huggingface.co/mrm8488))
- María Grandury ([mariagrandury](https://huggingface.co/mariagrandury))
## Citation
If you want to cite this model you can use this:
```bibtex
@misc{mromero2021spanish-t5-small,
title={Spanish T5 (small) by Manuel Romero},
author={Romero, Manuel},
publisher={Hugging Face},
journal={Hugging Face Hub},
howpublished={\url{https://huggingface.co/flax-community/spanish-t5-small}},
year={2021}
}
```
|
{"language": "es", "license": "mit", "tags": ["T5", "Seq2Seq", "EconderDecoder", "Spanish"], "datasets": ["large_spanish_corpus"], "widgets": [{"text": "\u00c9rase un vez un"}]}
|
text2text-generation
|
flax-community/spanish-t5-small
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"T5",
"Seq2Seq",
"EconderDecoder",
"Spanish",
"es",
"dataset:large_spanish_corpus",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"es"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #T5 #Seq2Seq #EconderDecoder #Spanish #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Spanish T5 (small) trained on large_spanish_corpus.
This is a Spanish T5 (small arch) trained from scratch on the large_spanish_corpus aka BETO's corpus with Flax
This is part of the
Flax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.
## Dataset
The dataset is about 20 GB. 95% of the data was used for training and the rest 5% for validation.
## Metrics (on evaluation dataset)
- Accuracy: 0.675
## Team members
- Manuel Romero (mrm8488)
- María Grandury (mariagrandury)
If you want to cite this model you can use this:
|
[
"# Spanish T5 (small) trained on large_spanish_corpus.\n\nThis is a Spanish T5 (small arch) trained from scratch on the large_spanish_corpus aka BETO's corpus with Flax\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Dataset\nThe dataset is about 20 GB. 95% of the data was used for training and the rest 5% for validation.",
"## Metrics (on evaluation dataset)\n\n- Accuracy: 0.675",
"## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n\nIf you want to cite this model you can use this:"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #T5 #Seq2Seq #EconderDecoder #Spanish #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Spanish T5 (small) trained on large_spanish_corpus.\n\nThis is a Spanish T5 (small arch) trained from scratch on the large_spanish_corpus aka BETO's corpus with Flax\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.",
"## Dataset\nThe dataset is about 20 GB. 95% of the data was used for training and the rest 5% for validation.",
"## Metrics (on evaluation dataset)\n\n- Accuracy: 0.675",
"## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n\nIf you want to cite this model you can use this:"
] |
[
96,
82,
27,
18,
33
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #T5 #Seq2Seq #EconderDecoder #Spanish #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Spanish T5 (small) trained on large_spanish_corpus.\n\nThis is a Spanish T5 (small arch) trained from scratch on the large_spanish_corpus aka BETO's corpus with Flax\n\nThis is part of the\nFlax/Jax Community Week, organised by HuggingFace and TPU usage sponsored by Google.## Dataset\nThe dataset is about 20 GB. 95% of the data was used for training and the rest 5% for validation.## Metrics (on evaluation dataset)\n\n- Accuracy: 0.675## Team members\n- Manuel Romero (mrm8488)\n- María Grandury (mariagrandury)\n\nIf you want to cite this model you can use this:"
] |
[
-0.12679970264434814,
0.12267349660396576,
-0.0021154393907636404,
0.10418040305376053,
0.06750436127185822,
0.008084660395979881,
0.05275125429034233,
0.08365203440189362,
-0.015189818106591702,
0.10667584836483002,
0.03528446704149246,
0.08178529888391495,
0.0899406149983406,
0.11126191169023514,
-0.09900186955928802,
-0.10932587832212448,
-0.022927068173885345,
0.014332637190818787,
-0.052894026041030884,
0.09741571545600891,
0.07875479012727737,
-0.10926344245672226,
0.08667535334825516,
-0.037473902106285095,
-0.0659136101603508,
0.05628952756524086,
0.08204997330904007,
-0.14056722819805145,
0.13148319721221924,
0.11381973326206207,
0.0713387131690979,
0.05437386408448219,
0.01645803079009056,
-0.04795774072408676,
0.007535555399954319,
-0.005450512748211622,
-0.048567067831754684,
0.047447822988033295,
0.06395081430673599,
-0.0030253722798079252,
0.10344377905130386,
-0.12176020443439484,
0.03722865879535675,
0.05028577893972397,
-0.155678391456604,
-0.13310612738132477,
-0.0704750120639801,
-0.0513426847755909,
0.04750713333487511,
0.08425212651491165,
-0.008881918154656887,
0.2105274200439453,
-0.14832347631454468,
0.029452435672283173,
0.0954262763261795,
-0.19591517746448517,
-0.07260311394929886,
0.04898709058761597,
0.018409332260489464,
0.14947070181369781,
-0.02550710178911686,
0.03765494376420975,
0.11292153596878052,
0.028327591717243195,
0.06259473413228989,
-0.05571445822715759,
-0.05382934957742691,
-0.003324799472466111,
-0.07048776745796204,
-0.050819747149944305,
0.32454633712768555,
0.05022313818335533,
-0.017226455733180046,
-0.09360857307910919,
-0.06796684861183167,
0.02354595623910427,
0.0014208195498213172,
-0.05980371683835983,
0.03909043222665787,
-0.05945458635687828,
-0.04895370453596115,
0.0652233436703682,
-0.10992057621479034,
-0.03370828554034233,
-0.1646525263786316,
0.019154859706759453,
0.0010049750562757254,
0.036368779838085175,
-0.08955731242895126,
0.03443652763962746,
-0.07357756793498993,
-0.11052045971155167,
-0.00814547948539257,
-0.009704073891043663,
0.014402761124074459,
-0.043653134256601334,
0.012154323980212212,
-0.05357588455080986,
0.12726931273937225,
0.06901498883962631,
-0.12541063129901886,
-0.03844413161277771,
0.02968749776482582,
0.0020946222357451916,
0.07580084353685379,
0.07227923721075058,
-0.0656227245926857,
-0.08186878263950348,
0.053868941962718964,
-0.04368098825216293,
0.04095272347331047,
0.017465341836214066,
-0.05752662569284439,
-0.08871331810951233,
0.03560115024447441,
0.06285931915044785,
-0.0009425514726899564,
-0.01394481398165226,
-0.08148901164531708,
-0.035381365567445755,
0.05854202061891556,
-0.08317851275205612,
0.02489207498729229,
0.012302238494157791,
-0.10012298077344894,
0.0844375267624855,
-0.02917974255979061,
-0.02018877863883972,
-0.040584079921245575,
-0.08218152821063995,
-0.07227958738803864,
-0.08471459150314331,
-0.009807360358536243,
-0.11372732371091843,
0.10224365442991257,
-0.13187836110591888,
0.03431204333901405,
-0.20335672795772552,
-0.04326818883419037,
-0.10314428806304932,
-0.0641157478094101,
-0.1142449900507927,
-0.046939343214035034,
-0.08857277035713196,
-0.0035679421853274107,
0.014068092219531536,
-0.044426124542951584,
0.10071451216936111,
-0.071950763463974,
0.11428694427013397,
0.031022006645798683,
0.08185902237892151,
-0.10260842740535736,
0.02069968916475773,
-0.1162102073431015,
-0.03261636197566986,
-0.12170558422803879,
0.07008741050958633,
0.03260011970996857,
0.031139960512518883,
-0.1368715465068817,
-0.050494179129600525,
-0.06081971898674965,
0.031378842890262604,
0.029681451618671417,
0.19138921797275543,
-0.10831370949745178,
-0.07602131366729736,
0.18588879704475403,
-0.06365586817264557,
-0.1243879497051239,
0.15122301876544952,
-0.006092105060815811,
0.15926100313663483,
0.15968292951583862,
0.07237152755260468,
0.02547791227698326,
-0.13529056310653687,
-0.02872510254383087,
-0.02317880094051361,
0.023751186206936836,
-0.121625617146492,
0.09982885420322418,
-0.04351908713579178,
-0.02938481979072094,
0.026508409529924393,
0.0010604215785861015,
0.029359973967075348,
-0.0602618008852005,
-0.013848094269633293,
0.013098187744617462,
-0.037342775613069534,
-0.03338363394141197,
0.02269536443054676,
0.048022300004959106,
-0.09352771192789078,
-0.01944844238460064,
-0.034340642392635345,
0.006236776243895292,
0.003622874151915312,
-0.043431464582681656,
-0.05646578595042229,
0.18181009590625763,
-0.13593167066574097,
-0.030293425545096397,
-0.11427594721317291,
-0.02123310975730419,
0.023292364552617073,
0.08336301892995834,
-0.010938516817986965,
0.058724213391542435,
0.046191222965717316,
-0.044461943209171295,
-0.039918817579746246,
0.026004979386925697,
0.006747969426214695,
-0.029784878715872765,
-0.046726569533348083,
-0.1191437840461731,
0.07139977067708969,
-0.02575606107711792,
0.09859488159418106,
-0.18537038564682007,
-0.015915745869278908,
0.09395672380924225,
0.02282635308802128,
-0.003836251562461257,
0.02256881259381771,
0.001587695092894137,
0.058637067675590515,
-0.04484257847070694,
-0.040728818625211716,
0.06094704940915108,
0.01442889217287302,
-0.10829903185367584,
0.062342993915081024,
-0.13529017567634583,
0.08711590617895126,
0.12119650840759277,
0.07215206325054169,
-0.06516766548156738,
-0.07026325166225433,
-0.009988514706492424,
0.0028884673956781626,
-0.15001623332500458,
-0.029116766527295113,
0.10181403905153275,
-0.02351693995296955,
0.11971039324998856,
-0.18522778153419495,
-0.04726546257734299,
0.06764857470989227,
-0.04907332360744476,
-0.02254618890583515,
0.20145760476589203,
-0.006821739953011274,
-0.14347845315933228,
0.10691533237695694,
0.05101052299141884,
0.03925590589642525,
0.18748025596141815,
-0.00689244968816638,
-0.06910473853349686,
-0.0028900587931275368,
0.015858232975006104,
0.01920478604733944,
0.09517336636781693,
-0.12936878204345703,
0.031916543841362,
0.0369429774582386,
0.03196016326546669,
0.008223293349146843,
-0.12269129604101181,
-0.01799209974706173,
0.010970409959554672,
-0.044869713485240936,
-0.0470220223069191,
0.06688036024570465,
0.02064204029738903,
0.18385940790176392,
0.01634003035724163,
-0.04354679584503174,
0.005445685237646103,
-0.006526289973407984,
-0.12352173775434494,
0.1725900024175644,
-0.025745723396539688,
-0.24524256587028503,
-0.09171362966299057,
0.04459388554096222,
-0.00884031318128109,
0.025110160931944847,
0.06948046386241913,
-0.08389509469270706,
-0.0004417419258970767,
-0.03621430695056915,
0.046657368540763855,
0.05305171385407448,
-0.025477511808276176,
-0.0522327683866024,
0.04825817793607712,
0.0026184660382568836,
-0.10782834142446518,
-0.03580468147993088,
-0.04242385923862457,
-0.10447215288877487,
0.07657301425933838,
-0.13953784108161926,
0.14188967645168304,
0.13426004350185394,
-0.004893195815384388,
0.0058349198661744595,
-0.04448441416025162,
0.16468417644500732,
-0.07740984112024307,
0.04881557822227478,
0.2175690233707428,
0.11627666652202606,
-0.028923094272613525,
0.12111520022153854,
0.008856679312884808,
-0.10378491878509521,
0.00992388091981411,
0.005405755247920752,
-0.07678540050983429,
-0.25520166754722595,
-0.0868217796087265,
-0.04059828445315361,
-0.06333526968955994,
0.10586792230606079,
0.027515707537531853,
0.016457710415124893,
0.0877947211265564,
0.0003343910211697221,
-0.000269985175691545,
0.05271163955330849,
0.01858551986515522,
0.15072399377822876,
0.031804949045181274,
0.05329213663935661,
-0.0728025734424591,
-0.06594379991292953,
0.13724257051944733,
0.04756215587258339,
0.05020082741975784,
-0.01335174310952425,
0.08638042956590652,
0.028990641236305237,
0.060254283249378204,
-0.03495856374502182,
0.0680466890335083,
0.040898438543081284,
-0.008023033849895,
-0.06862514466047287,
-0.05314575508236885,
-0.0786052867770195,
-0.01828773133456707,
0.06091996282339096,
0.06810826808214188,
-0.18104976415634155,
-0.06086808443069458,
0.08735761046409607,
0.14300575852394104,
0.08874145150184631,
-0.2899791896343231,
-0.09678295254707336,
0.036570534110069275,
0.0013531410368159413,
-0.05581889674067497,
0.036737069487571716,
0.1993989497423172,
-0.08959175646305084,
0.06182553991675377,
0.038467783480882645,
0.10001730173826218,
-0.0006643066299147904,
-0.001458659302443266,
-0.06908596307039261,
0.016897518187761307,
-0.013034720905125141,
0.09427116066217422,
-0.26372241973876953,
0.23362243175506592,
0.01790662482380867,
0.07475581020116806,
-0.09312616288661957,
-0.03938357159495354,
-0.06377588957548141,
0.039410851895809174,
0.18896467983722687,
0.02905109152197838,
0.029120340943336487,
-0.11920572817325592,
-0.16190709173679352,
0.08308497816324234,
-0.07727260142564774,
-0.0414474718272686,
0.07797157019376755,
0.023567376658320427,
0.023146068677306175,
0.010079414583742619,
-0.051613450050354004,
-0.08087681978940964,
-0.08878766000270844,
-0.06099778413772583,
0.03112502209842205,
-0.07653515785932541,
-0.007549243979156017,
-0.1021428033709526,
-0.13101714849472046,
0.02324536442756653,
-0.02999112568795681,
-0.07601020485162735,
-0.1226651594042778,
0.11880405247211456,
0.06886861473321915,
-0.07460306584835052,
-0.0024624166544526815,
0.020322401076555252,
0.013892457820475101,
0.021686673164367676,
-0.005606413818895817,
0.1319933533668518,
-0.05773667246103287,
-0.12210280448198318,
-0.028368409723043442,
0.10716737061738968,
0.06908910721540451,
0.05300604924559593,
0.058936987072229385,
0.015365680679678917,
0.013823933899402618,
-0.07331446558237076,
-0.022167330607771873,
0.05655622109770775,
0.15653853118419647,
0.03898287191987038,
-0.07194562256336212,
-0.15352310240268707,
-0.03767085447907448,
-0.08024926483631134,
0.1571512371301651,
0.22081881761550903,
-0.07613269984722137,
0.0459371879696846,
0.11324386298656464,
-0.08667843788862228,
-0.1701343059539795,
-0.01868152990937233,
-0.0043608397245407104,
0.06883629411458969,
0.004310235381126404,
-0.0448472760617733,
0.0033100880682468414,
0.18212826550006866,
0.003281094366684556,
-0.06540507823228836,
-0.39706072211265564,
-0.10455314069986343,
0.04391926899552345,
0.08306311815977097,
0.2618710994720459,
-0.12419955432415009,
-0.05343737080693245,
-0.0451444536447525,
-0.08299531787633896,
0.09939297288656235,
-0.1530981957912445,
0.06607400625944138,
-0.025726376101374626,
0.0064626033417880535,
0.03773128241300583,
-0.0350186750292778,
0.13175547122955322,
0.00299960863776505,
0.030631696805357933,
-0.0362204872071743,
-0.06747045367956161,
0.04270318150520325,
-0.036211512982845306,
0.0998111218214035,
0.04440140351653099,
0.009735913947224617,
-0.10636482387781143,
-0.046248745173215866,
-0.10683529824018478,
0.09555014222860336,
-0.05599798262119293,
0.003000432625412941,
-0.02891046553850174,
0.03826160356402397,
0.056872446089982986,
-0.055911414325237274,
0.05940084531903267,
-0.10637393593788147,
0.14978398382663727,
0.07742658257484436,
0.19467446208000183,
-0.014634908176958561,
-0.004000975750386715,
0.007011879235506058,
-0.004491688217967749,
0.07352694869041443,
-0.0964992344379425,
0.057090289890766144,
0.12967181205749512,
0.02595779486000538,
0.059509143233299255,
0.01868707314133644,
-0.08804719895124435,
0.03166825324296951,
0.0981936901807785,
-0.06686722487211227,
-0.08273299783468246,
-0.018446967005729675,
-0.11021336168050766,
-0.0564410574734211,
-0.003994537517428398,
0.0951472744345665,
-0.02386264130473137,
-0.04381151497364044,
-0.05674416199326515,
0.0368042066693306,
-0.015072830021381378,
0.22171767055988312,
0.031032126396894455,
0.0020122823771089315,
-0.09758572280406952,
0.07940050959587097,
0.08305531740188599,
-0.1390237957239151,
0.02083607204258442,
0.0033665962982922792,
-0.08264810591936111,
-0.034642837941646576,
0.15310177206993103,
0.08652285486459732,
-0.12653380632400513,
-0.11634458601474762,
-0.1419612616300583,
-0.14276446402072906,
0.04063422232866287,
0.24365080893039703,
0.0322292186319828,
0.039138149470090866,
0.037189312279224396,
-0.08714747428894043,
-0.060650818049907684,
0.0551622174680233,
0.0870903804898262,
-0.01430489495396614,
-0.06782161444425583,
0.08014164119958878,
0.014704504981637001,
-0.000349772977642715,
-0.04516833275556564,
0.014835975132882595,
-0.21437686681747437,
0.0014313305728137493,
-0.07111162692308426,
0.07108575105667114,
-0.012742160819470882,
-0.004640480037778616,
-0.061136070638895035,
-0.0023132727947086096,
-0.04877990484237671,
-0.0026238812133669853,
-0.03809207305312157,
0.054770730435848236,
-0.0028304150328040123,
0.12963353097438812,
-0.05279921740293503,
-0.03709878399968147,
-0.0025199507363140583,
-0.03801751881837845,
0.06411159783601761,
0.013058433309197426,
-0.010252476669847965,
0.04000915214419365,
-0.21356914937496185,
0.03933582082390785,
0.07732831686735153,
0.05010051280260086,
0.08563647419214249,
-0.15567708015441895,
0.054568205028772354,
0.09475398063659668,
0.005508224945515394,
0.04311360418796539,
0.00558073353022337,
-0.07992838323116302,
0.012415523640811443,
-0.023422125726938248,
-0.10319272428750992,
-0.062318380922079086,
0.04977341368794441,
0.09076705574989319,
0.015084725804626942,
0.09481611102819443,
-0.12183445692062378,
-0.0016512080328539014,
-0.1898888349533081,
-0.018527889624238014,
-0.031046971678733826,
-0.04768736660480499,
-0.12181345373392105,
0.010500730946660042,
0.09877870231866837,
0.04245102033019066,
0.15726280212402344,
0.07133470475673676,
0.04866857826709747,
-0.001555242226459086,
0.023989886045455933,
0.003329779487103224,
-0.021056314930319786,
0.17130307853221893,
-0.00011898979573743418,
-0.001611968851648271,
0.015153192915022373,
0.07687634229660034,
0.06124328821897507,
0.032443877309560776,
0.18656796216964722,
0.09466442465782166,
0.05600585788488388,
0.05907798185944557,
0.06664659827947617,
-0.03505507856607437,
0.11077248305082321,
0.0017453968757763505,
-0.014728629030287266,
0.019993754103779793,
0.00029208968044258654,
-0.0012930443044751883,
0.13256904482841492,
-0.16270045936107635,
0.04540586471557617,
-0.09319101274013519,
-0.03180747479200363,
-0.1680823713541031,
-0.1356649398803711,
-0.1018892303109169,
-0.0651228278875351,
-0.04044539853930473,
-0.12487488240003586,
-0.027794454246759415,
0.021760474890470505,
0.04513250291347504,
-0.06527252495288849,
0.043554890900850296,
-0.03946653753519058,
-0.1047072783112526,
0.06113492697477341,
0.03494088351726532,
0.08896203339099884,
-0.11047175526618958,
0.040118973702192307,
0.02580319717526436,
0.0662754699587822,
0.01120783668011427,
0.04595748707652092,
0.0731501504778862,
0.0122448755428195,
-0.05423010140657425,
-0.047487933188676834,
-0.02628277987241745,
0.02805689536035061,
0.0455317385494709,
0.06916777789592743,
0.08112110197544098,
-0.049521058797836304,
0.024338679388165474,
0.21726715564727783,
-0.02140824683010578,
-0.03322743996977806,
-0.13836316764354706,
0.1684340387582779,
-0.040437210351228714,
0.031588003039360046,
0.04741516709327698,
-0.10448221862316132,
0.07212527841329575,
0.19460667669773102,
0.15112870931625366,
0.0016519122291356325,
0.005603268276900053,
-0.04310572147369385,
0.008328055031597614,
0.04565277695655823,
0.0690215602517128,
0.02792084775865078,
0.19910675287246704,
-0.04987571761012077,
0.040603157132864,
-0.019844824448227882,
0.09686345607042313,
-0.051566850394010544,
0.10487248003482819,
-0.0035954946652054787,
0.005336687434464693,
-0.024484781548380852,
0.13449309766292572,
0.011706506833434105,
-0.09795675426721573,
0.06625787168741226,
-0.13518626987934113,
-0.11335871368646622,
-0.056260719895362854,
-0.05105068162083626,
0.04719879850745201,
0.09350156784057617,
-0.020706264302134514,
0.007739451248198748,
0.025842178612947464,
0.027116771787405014,
-0.12903133034706116,
-0.11459358781576157,
0.10318861156702042,
0.061914268881082535,
0.2050948590040207,
-0.0007536289631389081,
0.09821224212646484,
0.11981874704360962,
-0.018141429871320724,
-0.08876120299100876,
0.0955410823225975,
0.01800183206796646,
0.0780523270368576,
0.10942606627941132,
-0.10273592919111252,
0.018804021179676056,
0.04907580837607384,
0.07405545562505722,
-0.05719248577952385,
0.037864215672016144,
-0.025235291570425034,
0.03223565220832825,
-0.1301800012588501,
0.12622880935668945,
-0.0665205791592598,
0.11037754267454147,
0.12470056861639023,
-0.0579301193356514,
0.010344757698476315,
-0.05546054616570473,
0.0797908827662468,
0.07230416685342789,
0.10077103227376938,
0.018697844818234444,
-0.207553431391716,
0.01070940401405096,
-0.07725443691015244,
0.0054601035080850124,
-0.25874942541122437,
0.02346372976899147,
-0.07218927890062332,
-0.0739009752869606,
-0.06562346965074539,
0.015774980187416077,
0.021914871409535408,
0.07585939764976501,
-0.03983094543218613,
-0.027554498985409737,
-0.010881916619837284,
0.15083631873130798,
-0.08350376039743423,
-0.058114610612392426
] |
null | null |
transformers
|
# GPT2-svenska-wikipedia
A swedish GPT2 style model trained using Flax CLM pipeline on the Swedish
part of the wiki40b dataset.
https://huggingface.co/datasets/wiki40b
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/
## Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki
# Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki
## Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki
## Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki
## Roberta models
## Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki
## Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar
## Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi
## Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish
## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
```python
from datasets import load_dataset
def load_and_clean_wiki():
dataset = load_dataset('wiki40b', 'sv', beam_runner='DirectRunner', split="train")
#dataset = load_dataset('wiki40b', 'sv', beam_runner='DirectRunner')
dataset = dataset.remove_columns(['wikidata_id', 'version_id'])
filtered_dataset = dataset.map(filter_wikipedia)
# filtered_dataset[:3]
# print(filtered_dataset[:3])
return filtered_dataset
def filter_wikipedia(batch):
batch["text"] = " ".join(batch["text"].split("\
_START_SECTION_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_ARTICLE_\
"))
batch["text"] = " ".join(batch["text"].split("\
_START_PARAGRAPH_\
"))
batch["text"] = " ".join(batch["text"].split("_NEWLINE_"))
batch["text"] = " ".join(batch["text"].split("\xa0"))
return batch
```
## Training script
The following training script was used to train the model.
```bash
./run_clm_flax.py --output_dir="${MODEL_DIR}" --model_type="gpt2" --config_name="${MODEL_DIR}" --tokenizer_name="${MODEL_DIR}" --dataset_name="wiki40b" --dataset_config_name="sv" --do_train --do_eval --block_size="512" --per_device_train_batch_size="64" --per_device_eval_batch_size="64" --learning_rate="5e-3" --warmup_steps="1000" --adam_beta1="0.9" --adam_beta2="0.98" --weight_decay="0.01" --overwrite_output_dir --num_train_epochs="20" --logging_steps="500" --save_steps="1000" --eval_steps="2500" --push_to_hub
```
|
{"language": "sv", "widget": [{"text": "Jag \u00e4r en svensk spr\u00e5kmodell."}]}
|
text-generation
|
flax-community/swe-gpt-wiki
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"sv",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sv"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# GPT2-svenska-wikipedia
A swedish GPT2 style model trained using Flax CLM pipeline on the Swedish
part of the wiki40b dataset.
URL
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
## Data cleaning and preprocessing
The data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.
## Training script
The following training script was used to train the model.
|
[
"# GPT2-svenska-wikipedia\nA swedish GPT2 style model trained using Flax CLM pipeline on the Swedish\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# GPT2-svenska-wikipedia\nA swedish GPT2 style model trained using Flax CLM pipeline on the Swedish\npart of the wiki40b dataset.\n\nURL",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL",
"## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.",
"## Training script\nThe following training script was used to train the model."
] |
[
65,
38,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6,
40,
14
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #gpt2 #text-generation #sv #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# GPT2-svenska-wikipedia\nA swedish GPT2 style model trained using Flax CLM pipeline on the Swedish\npart of the wiki40b dataset.\n\nURL## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.## Gpt models## Swedish Gpt\nURL## Swedish gpt wiki\nURL# Nordic gpt wiki\nURL## Dansk gpt wiki\nURL## Norsk gpt wiki\nURL## Roberta models## Nordic Roberta Wiki\nURL## Swe Roberta Wiki Oscar\nURL## Roberta Swedish Scandi\nURL## Roberta Swedish\nURL## Swedish T5 model\nURL## Data cleaning and preprocessing\nThe data was cleaned and preprocessed using the following script. Make sure to install depencies for beam_runner to make the dataset work.## Training script\nThe following training script was used to train the model."
] |
[
-0.09668188542127609,
0.21411584317684174,
-0.0011591449147090316,
0.07248411327600479,
0.0772741287946701,
0.00983052235096693,
0.057593055069446564,
0.1278069168329239,
0.0028539281338453293,
0.07264388352632523,
0.11146290600299835,
0.017252685502171516,
0.1016203761100769,
0.18153642117977142,
0.053605493158102036,
-0.2766001522541046,
0.08618922531604767,
-0.0522737056016922,
-0.040029335767030716,
0.1022174283862114,
0.08727754652500153,
-0.07034114003181458,
0.06332912296056747,
-0.0510951429605484,
-0.03864455223083496,
-0.02545119635760784,
-0.019566470757126808,
-0.05856682360172272,
0.11740833520889282,
0.058711811900138855,
0.060261763632297516,
0.08343237638473511,
0.11696860939264297,
-0.10658394545316696,
0.023514360189437866,
0.038035765290260315,
-0.0020827995613217354,
0.03770770877599716,
0.055006109178066254,
-0.004317047074437141,
0.17876598238945007,
-0.012057388201355934,
0.06398705393075943,
0.021346760913729668,
-0.08399830013513565,
-0.2472253143787384,
-0.08817458897829056,
0.05516371130943298,
0.06150044500827789,
0.11339002847671509,
-0.05206695571541786,
0.08472827076911926,
-0.11894610524177551,
0.08814723789691925,
0.1339721977710724,
-0.21087582409381866,
-0.05427336320281029,
0.13543762266635895,
0.087448351085186,
0.04382120072841644,
-0.0689595639705658,
0.07469753175973892,
-0.012676666490733624,
0.05129248648881912,
0.0708041712641716,
-0.01869097352027893,
-0.02202528715133667,
0.007357912138104439,
-0.12586160004138947,
-0.01096761878579855,
0.12214262783527374,
0.010792307555675507,
-0.04857314005494118,
-0.14438898861408234,
-0.054896630346775055,
-0.031182611361145973,
0.0008229283848777413,
-0.005312539171427488,
0.006992073263972998,
-0.00884677842259407,
-0.04165704920887947,
-0.0761605054140091,
-0.07866179943084717,
-0.08919204026460648,
0.04249955341219902,
0.09984803199768066,
0.04193439707159996,
0.007704088930040598,
-0.008735346607863903,
0.14475828409194946,
-0.06396777927875519,
-0.12689849734306335,
-0.07259678095579147,
-0.046433497220277786,
-0.07387236505746841,
-0.01818186417222023,
0.016776511445641518,
-0.16131922602653503,
0.03237669914960861,
0.2293410301208496,
0.03469584137201309,
0.029267147183418274,
0.05254731327295303,
-0.000988993444480002,
0.030561016872525215,
0.1104859933257103,
-0.15610918402671814,
-0.08935720473527908,
0.01150609366595745,
-0.02636331133544445,
0.014513510279357433,
-0.03144123777747154,
-0.006033788900822401,
-0.02109701558947563,
0.054032597690820694,
0.0932072326540947,
0.034004177898168564,
0.04645869880914688,
-0.03561701253056526,
0.005664404947310686,
0.0819319561123848,
-0.14528679847717285,
0.0014542551944032311,
-0.00451317336410284,
-0.027268918231129646,
0.00808511022478342,
0.07625394314527512,
-0.0055906325578689575,
-0.06864892691373825,
0.13725930452346802,
-0.05634898692369461,
-0.03223632648587227,
-0.023301225155591965,
-0.11901269853115082,
0.02864387072622776,
-0.09725385904312134,
0.0015250744763761759,
-0.08096477389335632,
-0.22210069000720978,
-0.07096340507268906,
0.07545077055692673,
-0.066322922706604,
-0.0036169032100588083,
-0.025737455114722252,
-0.04175720736384392,
0.03569754213094711,
-0.025052936747670174,
0.10199829936027527,
-0.06526773422956467,
0.0617465004324913,
-0.13952666521072388,
0.09082822501659393,
0.007272579241544008,
0.013534555211663246,
-0.14716988801956177,
0.025664933025836945,
-0.20773489773273468,
0.0476984903216362,
-0.1446041315793991,
0.007295609451830387,
-0.13124720752239227,
-0.068810373544693,
-0.014750687405467033,
0.027991725131869316,
0.03955592215061188,
0.18765006959438324,
-0.16860650479793549,
-0.015175129286944866,
0.29586276412010193,
-0.15757520496845245,
0.033384520560503006,
0.10753987729549408,
0.027041826397180557,
0.10288048535585403,
0.09383222460746765,
0.16610080003738403,
0.02514047548174858,
-0.10641211271286011,
-0.0211982112377882,
0.00163027283269912,
-0.03871334344148636,
0.05895492434501648,
0.08380524814128876,
-0.07978618890047073,
0.047940827906131744,
0.03605138882994652,
-0.10809796303510666,
0.048983048647642136,
-0.03118867613375187,
-0.05326700210571289,
0.00338826235383749,
-0.06650879979133606,
-0.021349821239709854,
0.04713410511612892,
-0.0011452403850853443,
-0.04663892462849617,
-0.15093347430229187,
-0.08883371204137802,
0.11153735965490341,
-0.1115226000547409,
0.03342524915933609,
-0.09011499583721161,
0.05070707947015762,
0.019691478461027145,
0.00266116950660944,
-0.10496021062135696,
-0.14557968080043793,
-0.03737359121441841,
-0.05514975264668465,
-0.07134761661291122,
-0.008176461793482304,
0.07564113289117813,
0.10258253663778305,
-0.048854779452085495,
-0.059009406715631485,
-0.02609020657837391,
0.001993902027606964,
0.0010893434518948197,
-0.17366252839565277,
-0.026737896725535393,
-0.060663528740406036,
0.14741215109825134,
-0.1595362424850464,
0.021284058690071106,
0.10469197481870651,
0.15258821845054626,
0.02142108790576458,
-0.09555133432149887,
0.0608539842069149,
-0.009458799846470356,
-0.01276187039911747,
-0.13037139177322388,
0.02111845277249813,
-0.01984049566090107,
-0.0452575758099556,
0.05865529179573059,
-0.014536332339048386,
-0.056736867874860764,
0.0982484519481659,
0.1665131151676178,
-0.12642216682434082,
0.1418156623840332,
-0.041876524686813354,
-0.01798458956182003,
-0.07934299856424332,
-0.011734181083738804,
0.0013795900158584118,
0.05033276975154877,
0.07947274297475815,
-0.057164374738931656,
0.00043006808846257627,
0.012781932018697262,
-0.023951226845383644,
-0.02193150855600834,
0.13390500843524933,
0.12501420080661774,
-0.16321003437042236,
0.09259138256311417,
-0.04369862750172615,
-0.011276846751570702,
0.26322269439697266,
0.03526253625750542,
-0.07281796634197235,
0.003360411385074258,
0.009389707818627357,
0.02566494233906269,
0.161432147026062,
-0.006326808128505945,
0.030839985236525536,
0.04029369726777077,
-0.008160754106938839,
0.020149270072579384,
-0.06412617862224579,
-0.060858555138111115,
-0.003342058276757598,
-0.07518181204795837,
0.035636305809020996,
0.07912620902061462,
-0.07983140647411346,
0.0633014366030693,
-0.015941867604851723,
-0.03952878341078758,
0.006431778892874718,
0.004598356317728758,
-0.103342205286026,
0.21201950311660767,
-0.061285775154829025,
-0.21247978508472443,
-0.12951232492923737,
0.029480746015906334,
0.023008089512586594,
-0.006777534261345863,
0.09062062203884125,
-0.08654869347810745,
-0.15036073327064514,
-0.10939669609069824,
0.09303290396928787,
0.0267250444740057,
-0.05001259222626686,
-0.10911716520786285,
-0.018237221986055374,
-0.02039068564772606,
-0.11747820675373077,
0.011855201795697212,
0.03975475952029228,
-0.05043477565050125,
0.07481548190116882,
-0.025950592011213303,
0.12186779826879501,
0.07953143119812012,
0.02958807535469532,
0.004487026948481798,
0.04242146387696266,
0.2138555943965912,
-0.11527488380670547,
0.1258469671010971,
0.09275226294994354,
-0.021669430658221245,
0.030638372525572777,
0.1203826516866684,
0.017580144107341766,
-0.06865539401769638,
-0.011696195229887962,
0.035095762461423874,
-0.08654377609491348,
-0.1985420137643814,
-0.09336800128221512,
0.015661118552088737,
0.04230054095387459,
0.08164965361356735,
0.09459680318832397,
-0.0746326595544815,
0.07621029764413834,
-0.07000141590833664,
-0.1682988554239273,
0.0816323384642601,
0.06426043063402176,
-0.07984356582164764,
-0.02080301195383072,
0.08776459097862244,
-0.0480092391371727,
0.05244595557451248,
0.09183032065629959,
-0.038409966975450516,
0.15256941318511963,
-0.03210972994565964,
0.079096719622612,
0.0700281411409378,
0.08381132781505585,
0.06266651302576065,
0.09694957733154297,
0.04904729500412941,
-0.024400372058153152,
0.018143359571695328,
-0.05205559730529785,
-0.006071250885725021,
0.041988298296928406,
-0.06269510090351105,
-0.08844345808029175,
-0.008664979599416256,
-0.009261870756745338,
-0.006006626877933741,
0.20851819217205048,
0.0725414901971817,
-0.23413829505443573,
-0.09684046357870102,
0.04177508130669594,
-0.03744811192154884,
-0.09244783967733383,
-0.023249687626957893,
0.09240420907735825,
-0.18818935751914978,
0.054631490260362625,
-0.05156721919775009,
0.06853783130645752,
-0.035319484770298004,
-0.04188721626996994,
0.03478435426950455,
0.024229664355516434,
-0.043299268931150436,
0.09900858253240585,
-0.16233442723751068,
0.1206701248884201,
-0.0065256766974925995,
0.11292029917240143,
-0.04931138828396797,
0.025683913379907608,
0.0007738939020782709,
0.11911981552839279,
0.2731969654560089,
0.027207696810364723,
-0.09639588743448257,
-0.10501415282487869,
-0.12337768822908401,
0.03787396103143692,
-0.022743867710232735,
-0.08343211561441422,
0.07892976701259613,
0.004789926111698151,
-0.010102140717208385,
-0.05071890726685524,
-0.00916123017668724,
-0.14322306215763092,
-0.09313152730464935,
0.008834702894091606,
-0.07843926548957825,
0.12408453971147537,
-0.07239080220460892,
-0.07250627130270004,
-0.13858062028884888,
0.23080696165561676,
-0.09637773036956787,
-0.13673192262649536,
-0.16237346827983856,
0.07162981480360031,
0.11663900315761566,
-0.08171838521957397,
0.04142755642533302,
-0.00013694813242182136,
0.07568523287773132,
-0.06554528325796127,
-0.03924375772476196,
0.07134905457496643,
-0.08211426436901093,
-0.16712559759616852,
-0.004538837354630232,
0.08258125185966492,
0.13505032658576965,
0.04957261681556702,
0.019545510411262512,
0.06737218052148819,
-0.0006925701163709164,
-0.12986595928668976,
0.04325408115983009,
0.1601795107126236,
0.021416792646050453,
-0.015266251750290394,
-0.08636698871850967,
-0.03193388134241104,
-0.019455034285783768,
-0.0817016214132309,
0.15278348326683044,
0.22403667867183685,
-0.09188259392976761,
0.1289060264825821,
0.1148560643196106,
-0.043077364563941956,
-0.306166410446167,
-0.05468752607703209,
-0.022579440847039223,
0.08122991770505905,
0.04424120858311653,
-0.2248510718345642,
0.09000089019536972,
0.1268351972103119,
-0.02396165020763874,
0.04103521257638931,
-0.2522878348827362,
-0.11773229390382767,
0.10158738493919373,
0.09498480707406998,
0.02384866587817669,
-0.06720730662345886,
-0.019043056294322014,
0.01453442219644785,
-0.14010052382946014,
0.05345257371664047,
-0.10214731097221375,
0.0877653956413269,
0.007358384784311056,
-0.013125528581440449,
0.02231740765273571,
-0.08758969604969025,
0.15206030011177063,
0.005460469052195549,
0.014246133156120777,
-0.08704636991024017,
0.11799635738134384,
0.11166763305664062,
-0.02080918475985527,
0.17981691658496857,
-0.035878997296094894,
0.020138081163167953,
-0.09598399698734283,
-0.07968325912952423,
-0.10043787211179733,
0.1171606034040451,
-0.0667889416217804,
-0.08013653010129929,
-0.05581340193748474,
0.12597374618053436,
0.07051347941160202,
-0.01405057217925787,
0.07456802576780319,
-0.10865606367588043,
0.037098757922649384,
0.05557049810886383,
0.0781264677643776,
0.015562442131340504,
-0.05942277982831001,
-0.008214176632463932,
-0.04653117433190346,
0.05708099156618118,
-0.09053031355142593,
0.030229270458221436,
0.07721836864948273,
0.03727895766496658,
0.06675027310848236,
-0.007721415255218744,
-0.13941644132137299,
-0.009197505190968513,
0.05899678170681,
-0.2269262820482254,
-0.11085407435894012,
-0.02139532007277012,
-0.12971343100070953,
0.0018533933907747269,
-0.05339285358786583,
0.1349574774503708,
-0.07754591852426529,
-0.000507446879055351,
-0.007014088332653046,
0.058165788650512695,
-0.01007205992937088,
0.19394931197166443,
0.04513377696275711,
0.05273322016000748,
-0.1193092092871666,
0.07865689694881439,
0.03668675199151039,
-0.07276209443807602,
0.04632958024740219,
0.150382861495018,
-0.16180574893951416,
-0.08237813413143158,
-0.005366461351513863,
0.11982041597366333,
-0.0635896697640419,
-0.06397543847560883,
-0.10351622104644775,
-0.050078168511390686,
0.020996876060962677,
0.020859509706497192,
0.038311079144477844,
0.0609123595058918,
0.03608768805861473,
-0.05319105461239815,
-0.08605074882507324,
0.07651826739311218,
0.07890597730875015,
0.0038736886344850063,
-0.0729900598526001,
0.1375882923603058,
0.0032675962429493666,
-0.00020727244555018842,
-0.04835306107997894,
0.03474403917789459,
-0.06037462502717972,
-0.02307766303420067,
-0.04948282614350319,
-0.030944377183914185,
-0.08123285323381424,
-0.017797034233808517,
-0.05918126180768013,
-0.03837549313902855,
0.025889063253998756,
0.020638655871152878,
-0.06135191395878792,
-0.047172270715236664,
-0.07388383150100708,
-0.011461146175861359,
-0.10946095734834671,
-0.009559296071529388,
0.0014943513087928295,
-0.06077532842755318,
0.08367963880300522,
0.006836957763880491,
0.020368145778775215,
0.0777338445186615,
-0.03438963368535042,
-0.015945998951792717,
-0.01569640450179577,
-0.04080349579453468,
0.010022399015724659,
-0.03955656662583351,
-0.053559549152851105,
-0.024740081280469894,
-0.047272056341171265,
0.0335686020553112,
0.014881638810038567,
-0.1027824804186821,
0.05968206748366356,
0.02385825850069523,
-0.020073873922228813,
-0.04170851781964302,
0.09667129814624786,
0.08139505237340927,
0.05393866077065468,
0.1288299262523651,
-0.08348249644041061,
0.06689315289258957,
-0.13646379113197327,
0.010713932104408741,
-0.0002085594751406461,
-0.010311855003237724,
-0.00036162592004984617,
0.04569941759109497,
0.047361090779304504,
-0.045876141637563705,
0.08107738941907883,
0.06372273713350296,
-0.043505631387233734,
0.055540818721055984,
-0.010925360955297947,
0.03828030824661255,
0.02645421400666237,
0.152988001704216,
0.02005559764802456,
0.008435425348579884,
0.01283939927816391,
0.00935362745076418,
-0.025589216500520706,
-0.002509790239855647,
0.1776236742734909,
0.1211642175912857,
0.11235001683235168,
0.06993476301431656,
-0.07595202326774597,
-0.09990376979112625,
-0.10763669013977051,
0.08292439579963684,
-0.013562546111643314,
0.07214602082967758,
-0.008257591165602207,
0.05499518662691116,
0.1691727638244629,
-0.16553880274295807,
0.02516682632267475,
0.06458059698343277,
-0.08290158957242966,
-0.1565151810646057,
-0.2795834541320801,
-0.11019085347652435,
0.01676345057785511,
0.05398360267281532,
-0.09340929985046387,
0.02607717178761959,
0.07680945098400116,
0.06845926493406296,
-0.012745709158480167,
0.14389006793498993,
-0.008202058263123035,
-0.09754982590675354,
0.03801245987415314,
0.03387659415602684,
-0.007948837243020535,
0.0069828457199037075,
-0.000020485140339587815,
0.0008831423474475741,
0.07788009941577911,
-0.002583146095275879,
0.02463047206401825,
0.005077775567770004,
0.03168531134724617,
-0.002840795088559389,
-0.06040512025356293,
-0.02622097171843052,
0.08595629036426544,
0.040805090218782425,
0.012134052813053131,
0.05594831332564354,
-0.023504985496401787,
0.00017629082140047103,
0.20170746743679047,
-0.0042609828524291515,
-0.06371301412582397,
-0.1177048534154892,
0.12658658623695374,
-0.005402004346251488,
0.03928578272461891,
0.024270469322800636,
-0.11571761965751648,
0.0486227311193943,
0.1612836867570877,
0.17327925562858582,
0.019928662106394768,
0.020263882353901863,
-0.026887554675340652,
-0.011643089354038239,
0.024470267817378044,
0.048053380101919174,
0.01800628751516342,
0.1818809062242508,
-0.10676765441894531,
0.052413713186979294,
-0.06591632962226868,
-0.07945626229047775,
-0.14079685509204865,
0.06725184619426727,
0.006400972604751587,
-0.009127339348196983,
-0.1022469624876976,
0.11578847467899323,
-0.07331321388483047,
-0.1393156349658966,
0.04532473534345627,
-0.05872299149632454,
-0.15270572900772095,
-0.04272190481424332,
0.0025132738519459963,
0.02564353682100773,
0.04575831815600395,
-0.0019483024952933192,
0.03626781702041626,
0.09647020697593689,
0.03883218765258789,
-0.09154698252677917,
-0.06956545263528824,
0.0686490386724472,
0.004152736626565456,
0.19516506791114807,
0.004815381485968828,
0.007166572380810976,
0.1046861857175827,
-0.04954134672880173,
-0.13603191077709198,
0.07231455296278,
0.01665087230503559,
-0.04599980264902115,
0.02959452196955681,
0.14900577068328857,
-0.03980705142021179,
0.09019661694765091,
0.007346068974584341,
-0.0987422913312912,
0.007627598941326141,
0.03844204545021057,
-0.08072482794523239,
-0.0501694455742836,
0.09986462444067001,
-0.08219125121831894,
0.12516270577907562,
0.17735153436660767,
-0.0485079400241375,
0.005860831588506699,
-0.09911683201789856,
0.11480655521154404,
0.02460426650941372,
0.017045609652996063,
0.042440153658390045,
-0.202226459980011,
-0.006295106373727322,
-0.1142844408750534,
0.030034756287932396,
-0.15488006174564362,
-0.02101951278746128,
-0.08675799518823624,
-0.005131102632731199,
-0.05563485622406006,
0.09013710916042328,
0.0464865118265152,
0.023324863985180855,
-0.01251999195665121,
0.0240899957716465,
0.027254024520516396,
0.08060724288225174,
-0.13987305760383606,
-0.07437391579151154
] |
null | null |
transformers
|
# Swe Roberta Wiki Oscar
## Description
This Roberta model was trained on the Swedish Wikipedia and Oscar datasets
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/
## Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki
# Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki
## Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki
## Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki
## Roberta models
## Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki
## Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar
## Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi
## Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish
## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish
|
{"language": "sv", "license": "cc-by-4.0", "tags": ["swedish", "roberta"], "pipeline_tag": "fill-mask", "widget": [{"text": "Meninged med livet \u00e4r <mask>."}]}
|
fill-mask
|
flax-community/swe-roberta-wiki-oscar
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"roberta",
"feature-extraction",
"swedish",
"fill-mask",
"sv",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"sv"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us
|
# Swe Roberta Wiki Oscar
## Description
This Roberta model was trained on the Swedish Wikipedia and Oscar datasets
## Model series
This model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.
## Gpt models
## Swedish Gpt
URL
## Swedish gpt wiki
URL
# Nordic gpt wiki
URL
## Dansk gpt wiki
URL
## Norsk gpt wiki
URL
## Roberta models
## Nordic Roberta Wiki
URL
## Swe Roberta Wiki Oscar
URL
## Roberta Swedish Scandi
URL
## Roberta Swedish
URL
## Swedish T5 model
URL
|
[
"# Swe Roberta Wiki Oscar",
"## Description\nThis Roberta model was trained on the Swedish Wikipedia and Oscar datasets",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL"
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# Swe Roberta Wiki Oscar",
"## Description\nThis Roberta model was trained on the Swedish Wikipedia and Oscar datasets",
"## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.",
"## Gpt models",
"## Swedish Gpt\nURL",
"## Swedish gpt wiki\nURL",
"# Nordic gpt wiki\nURL",
"## Dansk gpt wiki\nURL",
"## Norsk gpt wiki\nURL",
"## Roberta models",
"## Nordic Roberta Wiki\nURL",
"## Swe Roberta Wiki Oscar\nURL",
"## Roberta Swedish Scandi\nURL",
"## Roberta Swedish\nURL",
"## Swedish T5 model\nURL"
] |
[
62,
6,
18,
32,
4,
5,
6,
6,
6,
6,
4,
6,
7,
7,
5,
6
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #roberta #feature-extraction #swedish #fill-mask #sv #license-cc-by-4.0 #endpoints_compatible #region-us \n# Swe Roberta Wiki Oscar## Description\nThis Roberta model was trained on the Swedish Wikipedia and Oscar datasets## Model series\nThis model is part of a series of models training on TPU with Flax Jax during Huggingface Flax/Jax challenge.## Gpt models## Swedish Gpt\nURL## Swedish gpt wiki\nURL# Nordic gpt wiki\nURL## Dansk gpt wiki\nURL## Norsk gpt wiki\nURL## Roberta models## Nordic Roberta Wiki\nURL## Swe Roberta Wiki Oscar\nURL## Roberta Swedish Scandi\nURL## Roberta Swedish\nURL## Swedish T5 model\nURL"
] |
[
-0.015538287349045277,
0.22421276569366455,
-0.0017817980842664838,
0.15448543429374695,
0.06970682740211487,
0.008472277782857418,
0.10451719164848328,
0.13012462854385376,
0.0635758638381958,
0.0266462080180645,
0.15075722336769104,
0.06230730563402176,
0.1111670508980751,
0.04233327880501747,
0.06922225654125214,
-0.3917400538921356,
0.04091313108801842,
-0.04869639873504639,
-0.05549576133489609,
0.07067295908927917,
0.09068812429904938,
-0.019619256258010864,
0.08874266594648361,
-0.0007160553941503167,
-0.015756797045469284,
0.016884280368685722,
0.02372702956199646,
-0.04416784644126892,
0.11404605954885483,
0.066959448158741,
-0.0018656738102436066,
0.07275336235761642,
0.1383771002292633,
-0.18035772442817688,
0.0300517026335001,
-0.017199480906128883,
-0.042023371905088425,
0.021506991237401962,
0.021134953945875168,
-0.0524289645254612,
0.22440408170223236,
0.05651650205254555,
0.043850358575582504,
0.04139912128448486,
-0.07349327951669693,
-0.31078147888183594,
-0.06055225431919098,
0.13117428123950958,
-0.04938877746462822,
0.08358170837163925,
-0.04389245808124542,
0.10146953910589218,
-0.10549011826515198,
0.05784033238887787,
0.2196972817182541,
-0.29612022638320923,
-0.05019534006714821,
0.13067491352558136,
0.144399955868721,
0.0312962606549263,
-0.10365109145641327,
0.12503626942634583,
0.00647688889876008,
0.061201781034469604,
0.12068162113428116,
-0.07016992568969727,
-0.012996378354728222,
-0.025373637676239014,
-0.1083446592092514,
0.08638317137956619,
0.1705828309059143,
0.08237731456756592,
-0.03160249814391136,
-0.0688667967915535,
0.04860719293355942,
0.044519297778606415,
-0.027279341593384743,
-0.013195322826504707,
0.03851819038391113,
-0.02621692791581154,
-0.023270240053534508,
-0.04080905392765999,
-0.0893317237496376,
-0.034524623304605484,
0.06037203222513199,
0.04996274411678314,
-0.005440911743789911,
0.04182988032698631,
-0.002134706825017929,
0.023954011499881744,
-0.14637133479118347,
-0.12430503964424133,
-0.03743544965982437,
-0.060828130692243576,
-0.0159933939576149,
0.021409397944808006,
0.061743758618831635,
-0.08206382393836975,
0.08125423640012741,
0.17320165038108826,
0.015855535864830017,
0.029375892132520676,
0.08662407100200653,
0.07879749685525894,
0.010529231280088425,
0.11803237348794937,
-0.1923753023147583,
-0.13807891309261322,
-0.037850089371204376,
-0.051838215440511703,
0.015135427936911583,
-0.03767898306250572,
-0.04489395394921303,
-0.04932998865842819,
0.010750964283943176,
0.017647581174969673,
0.029943160712718964,
0.06806701421737671,
-0.0650707259774208,
0.05755028873682022,
-0.014751003123819828,
-0.0379280261695385,
0.005931554827839136,
-0.00642759632319212,
-0.029900649562478065,
0.037974849343299866,
0.0690230205655098,
0.027769818902015686,
0.003854513866826892,
0.1756393313407898,
-0.09531410038471222,
-0.00791989080607891,
0.05658572539687157,
-0.06960704177618027,
0.07468580454587936,
-0.08345549553632736,
0.0404164120554924,
-0.11554066091775894,
-0.04475000873208046,
-0.06517531722784042,
0.08067774772644043,
-0.0973893404006958,
-0.005026484373956919,
-0.047993484884500504,
-0.07791628688573837,
0.03817548602819443,
-0.0038958752993494272,
-0.009269917383790016,
-0.07704003900289536,
0.04524093493819237,
-0.13595300912857056,
0.11798552423715591,
-0.010090126655995846,
-0.0393989197909832,
-0.11123790591955185,
-0.003094232175499201,
-0.23980079591274261,
0.01608741283416748,
-0.18175312876701355,
0.10144972801208496,
-0.0932285338640213,
-0.10053882002830505,
0.017703408375382423,
0.04966023564338684,
0.033466044813394547,
0.24959637224674225,
-0.13558271527290344,
-0.022232837975025177,
0.29340678453445435,
-0.10772592574357986,
-0.031462397426366806,
0.11332541704177856,
0.024788962677121162,
0.08518189191818237,
0.05656813457608223,
0.22945548593997955,
-0.0031335195526480675,
-0.09493909776210785,
0.027898825705051422,
0.03884023800492287,
-0.03399733081459999,
0.02252277173101902,
0.081931933760643,
-0.028066178783774376,
0.01882128417491913,
0.018713857978582382,
-0.12349646538496017,
0.03309072181582451,
-0.030413741245865822,
-0.0411418192088604,
0.06519468128681183,
-0.07464933395385742,
0.04450618475675583,
0.08464091271162033,
-0.021054238080978394,
-0.08701241761445999,
-0.16773492097854614,
-0.14723695814609528,
0.028147943317890167,
-0.0575125589966774,
0.018568167462944984,
-0.06258517503738403,
0.09919052571058273,
0.06291429698467255,
-0.02438092790544033,
-0.01859937608242035,
-0.07286952435970306,
-0.060771726071834564,
0.03717216104269028,
0.0469105988740921,
0.04048655554652214,
0.12194596230983734,
0.049263015389442444,
-0.06746310740709305,
0.009256292134523392,
0.005606547463685274,
0.008739124983549118,
0.0049230302684009075,
-0.25692516565322876,
0.018863476812839508,
-0.043203212320804596,
0.09642674028873444,
-0.16961579024791718,
-0.025215735659003258,
0.09044088423252106,
0.13822294771671295,
0.03031817637383938,
-0.08434440195560455,
0.021049655973911285,
-0.018901947885751724,
-0.0011632245732471347,
-0.04832496494054794,
0.061341769993305206,
-0.04248509556055069,
-0.15448597073554993,
0.14515702426433563,
0.04272822290658951,
0.007668149657547474,
0.051096443086862564,
-0.02667173370718956,
-0.17076191306114197,
0.11509161442518234,
-0.02390219271183014,
0.035525184124708176,
0.017055589705705643,
0.008483132347464561,
-0.022525209933519363,
0.011232638731598854,
0.0642993301153183,
-0.05475633963942528,
0.0040773553773760796,
0.056567538529634476,
-0.10970267653465271,
-0.05386048182845116,
0.18533165752887726,
0.14044879376888275,
-0.18648262321949005,
0.09468493610620499,
-0.02420848235487938,
-0.016513872891664505,
0.28849923610687256,
0.05276753753423691,
0.019423138350248337,
0.00009753551421454176,
-0.07418833673000336,
0.04261893406510353,
0.20675325393676758,
-0.034649599343538284,
0.03575829416513443,
0.010193995200097561,
-0.04008137807250023,
-0.05842315033078194,
-0.08924753963947296,
-0.13420487940311432,
-0.014727815985679626,
-0.060166895389556885,
0.012813791632652283,
0.07794109731912613,
-0.10908582806587219,
0.07903485000133514,
0.04568987712264061,
-0.13262322545051575,
-0.02743782475590706,
-0.007861021906137466,
-0.10833828896284103,
0.22223146259784698,
-0.03251742944121361,
-0.12647224962711334,
-0.1178680956363678,
-0.0338803268969059,
0.030854810029268265,
0.010812056250870228,
0.095937080681324,
-0.09943641722202301,
-0.0961313471198082,
-0.08086210489273071,
0.04981067776679993,
0.004823038820177317,
-0.029202237725257874,
-0.11865469068288803,
-0.020267600193619728,
-0.029705503955483437,
-0.06687164306640625,
-0.023840276524424553,
-0.017808862030506134,
-0.003196420380845666,
0.031039157882332802,
-0.06232349947094917,
0.11326045542955399,
0.026154551655054092,
0.00434837443754077,
-0.0035680574364960194,
0.0406775064766407,
0.2115263193845749,
-0.11538204550743103,
0.11012225598096848,
0.040763627737760544,
-0.05540534853935242,
0.07221204787492752,
0.14559894800186157,
0.07872500270605087,
-0.031571533530950546,
-0.06052367389202118,
0.01644708402454853,
-0.0803229883313179,
-0.1806693971157074,
-0.05691644921898842,
0.028581153601408005,
0.12162672728300095,
0.04119393974542618,
0.06675611436367035,
-0.07094550877809525,
0.15216833353042603,
-0.0013797464780509472,
-0.1778634786605835,
0.04161369800567627,
0.03343873471021652,
-0.0910605639219284,
-0.031402427703142166,
0.06099896878004074,
-0.07942695170640945,
-0.01716344989836216,
0.0631600022315979,
-0.03259584307670593,
0.02858818881213665,
0.02230573259294033,
-0.034148599952459335,
0.09081969410181046,
0.1121092289686203,
0.037926290184259415,
0.0025725318118929863,
0.06565414369106293,
-0.07103089243173599,
0.02455024980008602,
-0.06785514205694199,
0.0565820038318634,
0.08493519574403763,
-0.04627709835767746,
-0.08538980036973953,
-0.04742799326777458,
-0.044297367334365845,
0.011138979345560074,
0.11038368940353394,
0.0704926997423172,
-0.17197412252426147,
-0.056330643594264984,
0.060364626348018646,
-0.04873877018690109,
-0.025569789111614227,
0.003421213710680604,
0.06151372194290161,
-0.19511498510837555,
0.1536482572555542,
-0.045065633952617645,
0.0659826248884201,
-0.051185350865125656,
-0.006495407782495022,
0.04584427922964096,
0.010205080732703209,
-0.04604359343647957,
0.09846657514572144,
-0.058912694454193115,
0.21532627940177917,
-0.0055946228094398975,
0.03372225910425186,
-0.07238251715898514,
-0.043212998658418655,
0.014400327578186989,
0.15440863370895386,
0.36251869797706604,
0.04013080149888992,
-0.03100721165537834,
-0.055210504680871964,
-0.022727496922016144,
-0.0026608293410390615,
0.008190426975488663,
-0.1195063591003418,
0.07436737418174744,
0.021607857197523117,
-0.02318098209798336,
-0.07529810070991516,
-0.007365057710558176,
-0.05031377077102661,
-0.041710373014211655,
-0.00031941873021423817,
-0.06432890146970749,
0.026721999049186707,
-0.03849973902106285,
-0.08158737421035767,
-0.2597286105155945,
0.1324005275964737,
-0.07592499256134033,
-0.08140066266059875,
-0.12370522320270538,
0.055245064198970795,
0.039645642042160034,
-0.08002180606126785,
0.017190899699926376,
0.030934849753975868,
0.02336609922349453,
-0.05491753667593002,
-0.002178285038098693,
0.08486799150705338,
-0.06265292316675186,
-0.12134718894958496,
-0.03286312147974968,
0.08308716863393784,
0.0970052182674408,
0.04596920311450958,
0.044581230729818344,
0.0477941632270813,
0.006188975181430578,
-0.13682271540164948,
0.08542653918266296,
0.01116051897406578,
-0.03205486014485359,
-0.01892288215458393,
-0.04270100221037865,
-0.04107149690389633,
-0.07198111712932587,
-0.058023903518915176,
0.12184508889913559,
0.31257373094558716,
-0.15741266310214996,
0.09651748090982437,
0.11645561456680298,
-0.03772112727165222,
-0.3379194140434265,
-0.028936322778463364,
-0.04253680631518364,
0.00607201037928462,
0.1959017813205719,
-0.09516096860170364,
0.11894629150629044,
0.04013366997241974,
-0.03710218891501427,
0.0435498021543026,
-0.18826048076152802,
-0.09734725952148438,
0.02444004826247692,
0.14388754963874817,
0.10121089220046997,
-0.07050049304962158,
-0.04102987423539162,
-0.008386291563510895,
-0.18437246978282928,
0.0319066159427166,
-0.017622360959649086,
0.09382423013448715,
-0.01314979325979948,
-0.011194048449397087,
0.0301218219101429,
-0.03667724132537842,
0.12738928198814392,
-0.06554712355136871,
-0.029724739491939545,
-0.14595116674900055,
0.05291707068681717,
0.03178425505757332,
-0.0018743295222520828,
0.12274479866027832,
-0.13646265864372253,
-0.028286868706345558,
-0.0674927607178688,
-0.03126389905810356,
-0.10246223211288452,
0.12361083179712296,
-0.046221133321523666,
-0.08965091407299042,
-0.08930408954620361,
0.11931764334440231,
0.0705236867070198,
-0.008295728825032711,
0.09543292224407196,
-0.16693952679634094,
0.12405770272016525,
-0.006899604108184576,
0.17874009907245636,
0.07597632706165314,
-0.06650771200656891,
-0.04260312765836716,
-0.07780676335096359,
0.04074879735708237,
-0.18262778222560883,
0.00043917971197515726,
0.09566576033830643,
0.031784266233444214,
0.06185941398143768,
-0.00556974159553647,
-0.1507023423910141,
-0.01662827841937542,
0.15784548223018646,
-0.21014238893985748,
-0.17205700278282166,
-0.07295642048120499,
-0.1955942064523697,
0.05920606106519699,
-0.049229636788368225,
0.12352046370506287,
-0.10824498534202576,
-0.0007995072519406676,
0.000018473401723895222,
0.04874381050467491,
-0.06090633198618889,
0.06358036398887634,
0.10390591621398926,
0.04416387900710106,
-0.08196855336427689,
0.09818104654550552,
-0.017992472276091576,
0.012872528284788132,
0.042761679738759995,
0.26238328218460083,
-0.08068066090345383,
-0.07423721998929977,
0.003108171746134758,
0.18213488161563873,
-0.2072010636329651,
-0.017409488558769226,
-0.11152031272649765,
-0.0845668613910675,
-0.04391172155737877,
0.11599617451429367,
0.011541914194822311,
-0.013360860757529736,
0.07202890515327454,
-0.044098690152168274,
-0.026356322690844536,
0.07798004150390625,
0.03300132974982262,
-0.01663885824382305,
-0.1506669521331787,
0.035664550960063934,
-0.027152862399816513,
0.07072224467992783,
-0.05205162242054939,
0.08943726867437363,
-0.09953013807535172,
-0.01564939133822918,
0.009056160226464272,
-0.034449558705091476,
-0.06876132637262344,
0.008457513526082039,
-0.05785166099667549,
-0.05802800506353378,
-0.0038931546732783318,
-0.012378886342048645,
-0.09064257889986038,
-0.017735760658979416,
-0.016228651627898216,
-0.027290694415569305,
-0.08312804251909256,
-0.03515426814556122,
0.03799984231591225,
-0.03484848514199257,
0.07810384780168533,
-0.07620422542095184,
0.005779811181128025,
0.1152617335319519,
-0.18789876997470856,
0.06879006326198578,
-0.01696176454424858,
-0.06248045340180397,
0.007764958310872316,
0.0780150294303894,
-0.0660623237490654,
-0.07428118586540222,
0.00833046343177557,
0.03844381123781204,
-0.05843852832913399,
-0.08932053297758102,
-0.04105516895651817,
0.05266016721725464,
-0.05150190368294716,
-0.04343380406498909,
0.08127694576978683,
0.10900039225816727,
-0.00678782444447279,
0.05450960993766785,
-0.07062849402427673,
0.0822513997554779,
-0.08753921836614609,
0.02535775676369667,
0.019035005941987038,
-0.07386250048875809,
0.022367505356669426,
0.019726984202861786,
0.026998961344361305,
-0.07817578315734863,
0.0459810309112072,
0.09596235305070877,
-0.08003760874271393,
0.047465573996305466,
-0.002680493053048849,
0.10858391970396042,
0.0003866883635055274,
0.13054180145263672,
-0.007808718364685774,
-0.04092508181929588,
-0.11706583946943283,
0.039675548672676086,
-0.050402067601680756,
0.03132938966155052,
0.11062856018543243,
0.04931100830435753,
0.0751660093665123,
0.0698479413986206,
0.05678778514266014,
0.053940970450639725,
-0.007242575287818909,
0.026047978550195694,
0.07475411146879196,
0.03863967955112457,
-0.0017846951959654689,
0.03514494374394417,
0.2346104085445404,
-0.08994586020708084,
0.03700796514749527,
0.02304154634475708,
-0.061531804502010345,
-0.19123698770999908,
-0.30554017424583435,
-0.10050421208143234,
-0.02408517897129059,
0.07069093734025955,
-0.08100539445877075,
0.03637940064072609,
0.03727363422513008,
0.06658035516738892,
-0.05992275848984718,
0.08039248734712601,
-0.05430727079510689,
-0.05956963449716568,
0.10289386659860611,
0.054870277643203735,
-0.03670457378029823,
0.05068579688668251,
0.03438868001103401,
0.011089641600847244,
0.02676379308104515,
-0.08562357723712921,
0.003636700101196766,
-0.06348980218172073,
0.00837736390531063,
-0.012848093174397945,
-0.11223825067281723,
0.010295557789504528,
0.027354126796126366,
0.06520579010248184,
0.03202441334724426,
0.021785134449601173,
0.0015455654356628656,
-0.02684931829571724,
0.2103676050901413,
-0.01704443246126175,
0.06657008826732635,
-0.1035083681344986,
0.13569936156272888,
-0.09449324756860733,
0.0761333554983139,
-0.03286859765648842,
-0.095611572265625,
0.10511968284845352,
0.2544477581977844,
0.22462098300457,
-0.004010146018117666,
0.03799464553594589,
-0.02240440808236599,
-0.016613798215985298,
-0.0017147184116765857,
0.026360757648944855,
-0.019556768238544464,
0.16460584104061127,
-0.11118490248918533,
0.08129171282052994,
-0.0581989586353302,
-0.010465037077665329,
-0.06246888265013695,
0.02362331561744213,
0.08070658147335052,
0.016060542315244675,
-0.14186237752437592,
0.1651943176984787,
-0.16977377235889435,
-0.13386455178260803,
0.12166177481412888,
-0.08688587695360184,
-0.12529368698596954,
-0.07878502458333969,
-0.00260347593575716,
0.10815545171499252,
0.05525753274559975,
0.01944921165704727,
0.02123451791703701,
-0.0768449604511261,
0.07665541023015976,
-0.1318567395210266,
-0.07734135538339615,
-0.03578639775514603,
0.007082671392709017,
0.23605673015117645,
-0.026326224207878113,
-0.017150219529867172,
0.11685895919799805,
-0.027953753247857094,
-0.060903400182724,
0.038836825639009476,
0.04409594088792801,
-0.0515332855284214,
-0.03494146093726158,
0.18419286608695984,
-0.007535606622695923,
0.03541035205125809,
0.032034751027822495,
-0.04239647090435028,
0.01835133694112301,
0.02149340510368347,
-0.10737241804599762,
0.011734124273061752,
0.14862944185733795,
-0.11027013510465622,
0.09812162071466446,
0.17948870360851288,
-0.031015219166874886,
-0.031791239976882935,
-0.08111965656280518,
0.047083426266908646,
0.044691216200590134,
-0.11030808836221695,
0.04806400090456009,
-0.087727390229702,
-0.050793714821338654,
-0.06603129953145981,
0.007720781024545431,
-0.1367984116077423,
0.004689094610512257,
-0.14569583535194397,
0.018398448824882507,
-0.044299956411123276,
0.04716886207461357,
0.15317018330097198,
0.020789479836821556,
-0.020185524597764015,
-0.11231408268213272,
0.061138931661844254,
0.0973283052444458,
-0.11300713568925858,
-0.07636130601167679
] |
null | null |
transformers
|
# Model
This model is fine-tuned from https://huggingface.co/flax-community/t5-base-openwebtext, fine-tuned on cnn_dailymail.
|
{"language": "en", "license": "apache-2.0", "tags": ["summarization"], "datasets": ["cnn_dailymail"], "model-index": [{"name": "flax-community/t5-base-cnn-dm", "results": [{"task": {"type": "summarization", "name": "Summarization"}, "dataset": {"name": "cnn_dailymail", "type": "cnn_dailymail", "config": "3.0.0", "split": "test"}, "metrics": [{"type": "rouge", "value": 24.1585, "name": "ROUGE-1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2Q0Nzk3ZTNkNTFjMTM2YjliNzcxYTVlMDgyNDE4MzZjNzgzZjgzYjI1NWFjZTE2YjE4MWE3NGRiNGZiMmVhNyIsInZlcnNpb24iOjF9.H2oS1cN5A3wY8oFZTVtCMwnbDPAdUhNwjTSDocqQinhDq7aSee_AvIVn-7m84Ke8qaMTAvHB9e56MDAAVT8XBA"}, {"type": "rouge", "value": 11.0688, "name": "ROUGE-2", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGIyMmYzZTFhNjgwMmU5YWQ1MTZjM2ZlNjEwYmVmODkyMGQwZDQ2MjM1YmRkYjM2NTEyNjE5N2ExYzc0ZTcyYSIsInZlcnNpb24iOjF9.6GtmrXTD0EnrXx02enbLdbeiLh--I9u0GfrPdXZ_CKHeYgpFs0Gk1F0c75QBfGoMilodGymS15A9Bjvt00baBw"}, {"type": "rouge", "value": 19.7293, "name": "ROUGE-L", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzc4MGQyNmYwNDk5NDE0MDk2ZjE2NmVkZDIwN2NmYzQxZTI0NWZhZjkxOGFkMWZmNjQ5NzRkODViNzg5Zjc5MiIsInZlcnNpb24iOjF9.rOgFJeHsW74nQiKc3DPoMIB9aWKqWTRtnweYP3DCp4duJN5jq32PPNyXo3EYuskGgTSp4KWwf7-Hl2MYwDrSCQ"}, {"type": "rouge", "value": 22.6394, "name": "ROUGE-LSUM", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTA4M2JlZDliMmFlZDgwM2E2MDZjY2ZjZGUwZTcxNDM0NGU3NzdlYzJlZTEzNDEyZDE0OWFiMjUzMmYwNjRhNyIsInZlcnNpb24iOjF9.Mq9ltLQ5YAZfLLaGsPtSOe6KCRLRwjT_2nSAH9KWvOiyagJ16F5xQ1m9uUx9mhiu_UOmpjDaAtD3y4AOy4L0Dg"}, {"type": "loss", "value": 2.516355514526367, "name": "loss", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOGQwNTIyZmU5ZjU3OWM1NGMwYzJiYTA0ZGVmOTA2MjcxYzZmZDRjZDViZDg0NGNlOWNjODkxYTc1ZTJhMmYyMiIsInZlcnNpb24iOjF9.mh6ZVu82CFnb5g92Uj-99wjyvoSQQI-gO-PDBdH4JZyc8mVPJYzV-S7jyXwC_XsOfD1OsR9XKTxM1NUirfBKAw"}, {"type": "gen_len", "value": 18.9993, "name": "gen_len", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGY5YTYxZmZiYmY4NTZjNmMzMjllNWE1M2M2ZjA0MWM1MzBhZjc0MDM5ZGFiYTAzNjFiZjg5ZjMxYzlmOGYwMyIsInZlcnNpb24iOjF9.eXiPrQ-CeB3BWzlQzkTIA1q0xYP1GtFGIK9XyIneEmh5ajN5pCATxNDvn6n09d84OEr5432SoPJfdpNCd_UyCA"}]}]}]}
|
summarization
|
flax-community/t5-base-cnn-dm
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"summarization",
"en",
"dataset:cnn_dailymail",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #en #dataset-cnn_dailymail #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Model
This model is fine-tuned from URL fine-tuned on cnn_dailymail.
|
[
"# Model\nThis model is fine-tuned from URL fine-tuned on cnn_dailymail."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #en #dataset-cnn_dailymail #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Model\nThis model is fine-tuned from URL fine-tuned on cnn_dailymail."
] |
[
91,
22
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #en #dataset-cnn_dailymail #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Model\nThis model is fine-tuned from URL fine-tuned on cnn_dailymail."
] |
[
-0.02220364846289158,
0.11965040862560272,
-0.004139941185712814,
0.018960604444146156,
0.052181556820869446,
-0.0125108128413558,
0.1866803616285324,
0.06993172317743301,
-0.12011056393384933,
-0.0516694001853466,
0.12670034170150757,
0.11589266359806061,
-0.016567569226026535,
0.17374201118946075,
-0.050719425082206726,
-0.2034701555967331,
0.0848919078707695,
0.042550649493932724,
-0.047914646565914154,
0.14030061662197113,
0.09267492592334747,
-0.04369166120886803,
0.047764651477336884,
-0.01252071838825941,
-0.030562499538064003,
0.013853984884917736,
0.006511744111776352,
-0.10435554385185242,
0.06941482424736023,
0.04786181077361107,
0.03994227945804596,
0.120115727186203,
0.03251727297902107,
-0.12124917656183243,
0.04895482212305069,
0.04459528252482414,
-0.06473147869110107,
0.0800866112112999,
0.038953617215156555,
-0.024956705048680305,
0.1525430828332901,
-0.05289750173687935,
-0.005875607021152973,
0.0017677161376923323,
-0.06934317201375961,
0.009207584895193577,
-0.04073190316557884,
0.10882551968097687,
0.10581940412521362,
0.1142650693655014,
-0.0248578954488039,
0.17807802557945251,
-0.06881412118673325,
0.08588168025016785,
-0.013366233557462692,
-0.29340213537216187,
-0.013360503129661083,
0.16992098093032837,
-0.013920046389102936,
0.04909174144268036,
0.0467950813472271,
0.06759035587310791,
0.058134421706199646,
-0.020032159984111786,
0.0702492743730545,
-0.006871043238788843,
-0.07845304906368256,
0.0411703996360302,
-0.12678062915802002,
-0.0646902397274971,
0.233938068151474,
0.051966093480587006,
0.04045792669057846,
-0.09168269485235214,
-0.07720852643251419,
0.03518391028046608,
-0.06835657358169556,
-0.0055527775548398495,
0.012838400900363922,
0.06477491557598114,
-0.03378886729478836,
-0.06133338063955307,
-0.1481357216835022,
0.009342958219349384,
-0.1829972267150879,
0.08792850375175476,
0.032668519765138626,
0.040547680109739304,
-0.21996422111988068,
0.06870247423648834,
0.1087212786078453,
-0.1436004489660263,
0.09162839502096176,
-0.050938718020915985,
0.06488966941833496,
-0.020212259143590927,
-0.05162986367940903,
-0.13954363763332367,
0.09915510565042496,
0.08438099920749664,
0.019044874235987663,
-0.02870073728263378,
-0.01142207533121109,
0.06429623067378998,
0.048711974173784256,
0.022811798378825188,
-0.1456764042377472,
-0.021217888221144676,
0.08156652748584747,
0.036727603524923325,
0.05426039174199104,
-0.002657074946910143,
-0.09373386949300766,
0.042752742767333984,
0.0120466323569417,
0.02868255227804184,
0.0922195091843605,
0.1687309741973877,
0.020543182268738747,
-0.006505769677460194,
0.08700819313526154,
-0.08512815833091736,
-0.02033975161612034,
-0.004285718314349651,
-0.017158618196845055,
0.059535346925258636,
0.020258113741874695,
0.03292466700077057,
-0.08295253664255142,
0.026675613597035408,
-0.10000058263540268,
-0.045412153005599976,
-0.004062229301780462,
-0.08786696195602417,
0.05254121497273445,
-0.0017920678947120905,
-0.017050517722964287,
-0.13278324902057648,
-0.17723700404167175,
-0.03305840119719505,
0.011027831584215164,
0.015240170992910862,
-0.10710171610116959,
-0.047475192695856094,
-0.06918691843748093,
0.031638253480196,
-0.048238616436719894,
0.08693310618400574,
-0.0972866415977478,
0.03201461210846901,
-0.11867984384298325,
0.03190475329756737,
-0.05050628259778023,
0.020097535103559494,
-0.12549875676631927,
-0.057183001190423965,
0.01906302198767662,
0.05238616093993187,
-0.016634803265333176,
0.08832871913909912,
-0.08340884000062943,
0.027757802978157997,
-0.07585028558969498,
0.0351467989385128,
-0.025439079850912094,
0.2035713940858841,
-0.17730647325515747,
-0.05103471875190735,
0.20356754958629608,
-0.09035706520080566,
-0.09221183508634567,
0.08129136264324188,
-0.0312749519944191,
0.07008715718984604,
0.13984283804893494,
0.09757276624441147,
0.020632591098546982,
-0.022145265713334084,
0.01114035863429308,
0.08960144221782684,
-0.07847851514816284,
-0.08419760316610336,
0.0020557784009724855,
0.06072213128209114,
-0.13637761771678925,
0.029348595067858696,
0.006548791658133268,
0.08776289969682693,
-0.048610933125019073,
-0.03756539523601532,
-0.03377094864845276,
-0.03841772302985191,
0.02334720827639103,
-0.03264271095395088,
0.07597019523382187,
-0.023172838613390923,
-0.0728427842259407,
-0.022862957790493965,
0.0713789090514183,
-0.010696924291551113,
0.027206018567085266,
-0.11104993522167206,
0.016821108758449554,
-0.09284321963787079,
0.08079227060079575,
-0.13254515826702118,
-0.05702807009220123,
-0.049997564405202866,
0.06878442317247391,
0.040268395096063614,
0.04811194911599159,
0.037885017693042755,
-0.08280330151319504,
-0.025005539879202843,
0.008803488686680794,
0.13981205224990845,
0.045599374920129776,
-0.07228654623031616,
-0.18743184208869934,
0.08205891400575638,
-0.06745800375938416,
0.12169316411018372,
-0.10155747830867767,
0.05234145373106003,
-0.008687715046107769,
0.13263261318206787,
0.0031885304488241673,
0.0777919813990593,
0.09772437810897827,
-0.005814761389046907,
-0.03195977956056595,
-0.014329423196613789,
0.05755353718996048,
0.039795320481061935,
-0.09901174902915955,
0.2176220864057541,
-0.13959507644176483,
0.17011897265911102,
0.17927543818950653,
-0.024028826504945755,
0.016807997599244118,
0.04225471243262291,
-0.02990841679275036,
0.002635176759213209,
-0.03100113570690155,
0.020484741777181625,
0.06652025133371353,
-0.01325862854719162,
0.15503093600273132,
-0.08467483520507812,
-0.03295509144663811,
0.012010461650788784,
-0.03532388433814049,
-0.030171720311045647,
0.0574587881565094,
0.0769348293542862,
-0.15811210870742798,
0.10119718313217163,
0.13772231340408325,
0.03004015050828457,
0.12314724177122116,
-0.06010590121150017,
-0.033017776906490326,
0.044460855424404144,
-0.06311167776584625,
-0.06324228644371033,
-0.028887402266263962,
-0.030061764642596245,
0.007219065446406603,
0.06836751848459244,
0.019807172939181328,
0.07294488698244095,
-0.10628874599933624,
-0.032890040427446365,
0.04020004719495773,
-0.041017234325408936,
-0.06467355042695999,
0.059129975736141205,
-0.01447273138910532,
0.1489046812057495,
-0.06486191600561142,
-0.14135916531085968,
0.06304847449064255,
0.0025778149720281363,
-0.10284299403429031,
0.15170450508594513,
-0.07198319584131241,
-0.2862929105758667,
-0.062575563788414,
-0.05703503638505936,
-0.03499896079301834,
-0.03594508022069931,
0.057134419679641724,
-0.09310325980186462,
-0.04764720797538757,
-0.16713306307792664,
-0.05883481726050377,
0.0785696879029274,
0.037997808307409286,
-0.012783529236912727,
0.03953680768609047,
0.006984995678067207,
-0.11617513000965118,
0.006623094901442528,
-0.07255540788173676,
-0.043911367654800415,
0.07194941490888596,
-0.07767266780138016,
0.054194383323192596,
0.1824636310338974,
0.013047282584011555,
0.03528225049376488,
-0.017055831849575043,
0.14782431721687317,
-0.044973596930503845,
0.03632071241736412,
0.1814919263124466,
0.04120219126343727,
0.030501220375299454,
0.17028722167015076,
0.01611328311264515,
-0.0011880587553605437,
0.042172618210315704,
0.014152160845696926,
-0.048635754734277725,
-0.23630386590957642,
-0.14323663711547852,
-0.030836408957839012,
0.01721898280084133,
0.03011918254196644,
0.0945214256644249,
0.08895041048526764,
0.06792736053466797,
-0.060976818203926086,
-0.01471066102385521,
0.05325538292527199,
0.04482119530439377,
0.12245864421129227,
0.010066979564726353,
0.09441186487674713,
-0.09513621032238007,
-0.12763036787509918,
0.10048855096101761,
0.021420994773507118,
-0.0005216068821027875,
0.04022334888577461,
0.027062878012657166,
0.053729139268398285,
0.019254004582762718,
0.10869334638118744,
0.09714647382497787,
0.022623060271143913,
-0.04657872021198273,
-0.02457248978316784,
-0.054751280695199966,
0.039464544504880905,
0.013204159215092659,
-0.062410153448581696,
-0.12455575913190842,
-0.025374308228492737,
0.0232023186981678,
0.057503778487443924,
0.1698177605867386,
0.10299930721521378,
-0.27684617042541504,
0.04585052281618118,
0.013955317437648773,
-0.029408922418951988,
-0.011454358696937561,
0.04824032261967659,
0.009306699968874454,
0.00035735583514906466,
0.0943133682012558,
0.024618610739707947,
0.12604525685310364,
-0.05437850207090378,
0.050305139273405075,
-0.0798015221953392,
-0.05911360681056976,
-0.02207694761455059,
0.10606653988361359,
-0.19903025031089783,
0.21006689965724945,
0.007053140550851822,
0.01671450212597847,
-0.03928764909505844,
-0.028396714478731155,
0.04919308423995972,
0.1614781767129898,
0.0790640264749527,
-0.0065043168142437935,
0.05348551273345947,
0.05375935509800911,
-0.21767979860305786,
0.06995441019535065,
-0.038530655205249786,
-0.013319033198058605,
0.04359033331274986,
-0.017699776217341423,
-0.04693607985973358,
0.047862738370895386,
-0.005454917438328266,
-0.11480230838060379,
-0.14712916314601898,
0.016841957345604897,
0.15465007722377777,
-0.035173576325178146,
-0.02823890559375286,
-0.09158239513635635,
-0.13585594296455383,
0.17118503153324127,
0.07322397828102112,
-0.10010088980197906,
-0.10668224841356277,
0.01864476501941681,
0.029287660494446754,
-0.02293875813484192,
-0.03036261536180973,
0.013726422563195229,
0.08437703549861908,
-0.017199771478772163,
-0.20312272012233734,
0.04009988158941269,
-0.07698974758386612,
-0.04948485642671585,
-0.06023747846484184,
0.09453904628753662,
-0.034349437803030014,
0.0010310227517038584,
0.049711406230926514,
-0.011479964479804039,
-0.06720995157957077,
-0.07964016497135162,
-0.06561107188463211,
0.01037801243364811,
0.06114102527499199,
0.01226903311908245,
-0.03464636206626892,
-0.16455009579658508,
-0.018601149320602417,
0.020033733919262886,
0.12962520122528076,
0.158694326877594,
-0.06942492723464966,
0.07730787992477417,
0.18680207431316376,
-0.02463061548769474,
-0.28047433495521545,
-0.1643676906824112,
-0.0904083251953125,
-0.03315593674778938,
-0.03304554522037506,
-0.07979091256856918,
0.11251845210790634,
0.001209238194860518,
-0.0643579363822937,
-0.0362074077129364,
-0.2909107506275177,
-0.10070136189460754,
0.18318282067775726,
-0.011094877496361732,
0.3117479383945465,
-0.09447695314884186,
-0.05519465357065201,
-0.008490217849612236,
-0.07100384682416916,
0.17225927114486694,
-0.2527979910373688,
0.03484967723488808,
0.0014094390207901597,
0.014282135292887688,
0.009971469640731812,
-0.04326261579990387,
0.05973184108734131,
0.04879613593220711,
0.036524951457977295,
-0.09209311008453369,
-0.00510859489440918,
0.12872569262981415,
-0.0453028529882431,
0.12844519317150116,
-0.12048810720443726,
0.0862051323056221,
-0.054295677691698074,
-0.05561712384223938,
-0.04750533401966095,
-0.01814878359436989,
0.003812137758359313,
-0.023759109899401665,
-0.03159348666667938,
-0.018836429342627525,
0.07445468753576279,
-0.024957507848739624,
0.19589942693710327,
-0.013101617805659771,
0.06251810491085052,
0.14561031758785248,
0.1168738603591919,
-0.159169003367424,
-0.0059718359261751175,
-0.07903144508600235,
-0.07883146405220032,
0.06535214930772781,
-0.19556152820587158,
0.03979439660906792,
0.04427284002304077,
-0.03272678330540657,
0.05949648097157478,
0.04991919547319412,
0.011907928623259068,
-0.06254731863737106,
0.1265212744474411,
-0.2048017978668213,
-0.07881419360637665,
-0.019569939002394676,
-0.0237837266176939,
-0.02155587449669838,
0.07363394647836685,
0.18353816866874695,
-0.029711181297898293,
-0.016816318035125732,
0.04436216875910759,
0.034979552030563354,
-0.0012470035580918193,
0.10072744637727737,
0.03220757842063904,
0.013998670503497124,
-0.1524903029203415,
0.09353911876678467,
0.060527246445417404,
-0.09529443085193634,
-0.0013039605692029,
0.05773026496171951,
-0.18367427587509155,
-0.11888955533504486,
0.04134047031402588,
0.11771103739738464,
-0.03872758150100708,
-0.09029208868741989,
-0.08441007882356644,
-0.10764884203672409,
0.08860941976308823,
0.12267462909221649,
0.06599357724189758,
0.0819774717092514,
-0.04663237929344177,
-0.11795619875192642,
-0.040195032954216,
0.06910254061222076,
0.08121275156736374,
-0.012059492990374565,
-0.10024966299533844,
0.022980010136961937,
-0.008878044784069061,
0.0915549248456955,
-0.09809372574090958,
-0.031525399535894394,
-0.03450636938214302,
-0.0010387233924120665,
-0.13048624992370605,
0.026642346754670143,
-0.07410980015993118,
-0.030220618471503258,
-0.028328487649559975,
-0.04591742157936096,
-0.07555092871189117,
-0.014167972840368748,
-0.05940089374780655,
0.007569728884845972,
-0.015307622030377388,
0.0801071897149086,
-0.08531280606985092,
0.01029248721897602,
-0.01771269552409649,
-0.027157645672559738,
0.09732698649168015,
0.003694934071972966,
-0.10113745182752609,
0.06098999083042145,
-0.2006482481956482,
-0.061754561960697174,
0.07435759156942368,
0.03103948011994362,
0.031966883689165115,
0.0581764318048954,
0.03975636884570122,
0.0973130613565445,
-0.04963264614343643,
0.009361198171973228,
-0.024292360991239548,
-0.06989341974258423,
0.023168498650193214,
-0.05105789378285408,
0.008617627434432507,
-0.037332549691200256,
-0.06394099444150925,
0.0541539303958416,
0.04623296484351158,
0.19322921335697174,
-0.06352445483207703,
-0.0419817790389061,
-0.14162684977054596,
0.037457071244716644,
-0.02598581276834011,
-0.1440296173095703,
-0.0912342295050621,
-0.016925720497965813,
0.02970905601978302,
-0.03627637401223183,
0.22796869277954102,
0.06821950525045395,
-0.11912117898464203,
0.03377637267112732,
0.08074425905942917,
0.08900340646505356,
0.014130567200481892,
0.2119348794221878,
0.04861440509557724,
-0.02016405202448368,
-0.0801914632320404,
-0.017578212544322014,
0.07536313682794571,
0.040342994034290314,
0.06315253674983978,
0.0504663810133934,
0.08889832347631454,
0.12110847979784012,
0.02820459194481373,
0.025137394666671753,
0.017738154157996178,
-0.07918743044137955,
-0.13869550824165344,
0.07713087648153305,
-0.005613971501588821,
0.04616113007068634,
0.21180157363414764,
0.040215399116277695,
-0.010773735120892525,
0.048344291746616364,
0.011354402638971806,
-0.12533153593540192,
-0.23423488438129425,
-0.10037439316511154,
-0.09970106184482574,
-0.018370242789387703,
-0.09454023092985153,
-0.041328880935907364,
0.08536982536315918,
0.08421601355075836,
-0.028102107346057892,
-0.02663649618625641,
0.06474071741104126,
-0.06364867091178894,
0.09749618917703629,
-0.03715630993247032,
-0.011679569259285927,
-0.039163410663604736,
-0.02784048579633236,
-0.017575552687048912,
0.03827902674674988,
-0.02963314764201641,
0.04638218507170677,
0.08081117272377014,
0.10625340044498444,
-0.09791623800992966,
-0.08156990259885788,
-0.03125058487057686,
0.0009438975248485804,
0.05154938995838165,
0.06759770214557648,
0.061537109315395355,
-0.02499059960246086,
0.05981278046965599,
0.2311759889125824,
-0.004066220484673977,
-0.17406810820102692,
-0.1106032133102417,
0.14545436203479767,
0.028373997658491135,
0.05899551883339882,
0.03165331482887268,
-0.052538976073265076,
-0.04040508717298508,
0.2667728066444397,
0.3497774302959442,
-0.04649325832724571,
0.026242125779390335,
-0.04405990242958069,
0.0016356484265998006,
0.057756565511226654,
0.08711659908294678,
0.05132467672228813,
0.24481770396232605,
-0.04939525946974754,
-0.021454961970448494,
-0.04538126662373543,
-0.006198301445692778,
-0.07026929408311844,
0.06642156094312668,
0.031958483159542084,
-0.0864473432302475,
-0.020667897537350655,
0.12396161258220673,
-0.1343723088502884,
0.042840033769607544,
-0.13920752704143524,
-0.0464104488492012,
-0.080757275223732,
-0.010235089808702469,
0.10692920535802841,
0.019899113103747368,
0.052706506103277206,
-0.02946244738996029,
0.012867323122918606,
-0.02962774969637394,
-0.06382463872432709,
-0.11223820596933365,
-0.01448618620634079,
0.042288172990083694,
-0.06114137917757034,
0.13990820944309235,
0.030098581686615944,
0.060695163905620575,
0.0938444435596466,
-0.00848854798823595,
-0.09420226514339447,
0.13449712097644806,
-0.04450647160410881,
0.019741380587220192,
0.08809399604797363,
-0.07114176452159882,
0.01722976565361023,
-0.08448746055364609,
0.07367463409900665,
-0.06544432044029236,
0.03939517214894295,
-0.12697918713092804,
-0.05909173935651779,
-0.06360692530870438,
0.04415229335427284,
0.019458312541246414,
0.07898586988449097,
0.07167943567037582,
-0.05469195172190666,
0.04194822534918785,
-0.01795077510178089,
-0.012376257218420506,
-0.0025477067101746798,
-0.11029352992773056,
0.01020791195333004,
-0.06668969988822937,
-0.034823838621377945,
-0.02560441382229328,
0.04737420380115509,
-0.17551705241203308,
0.010664130561053753,
-0.13514646887779236,
-0.039570897817611694,
-0.09144775569438934,
0.05767855793237686,
0.20190462470054626,
0.02705530822277069,
-0.014492861926555634,
0.03537670895457268,
0.050573449581861496,
0.048767030239105225,
-0.13390924036502838,
-0.12187453359365463
] |
null | null |
transformers
|
# t5-base-dutch-demo 📰
Created by [Yeb Havinga](https://www.linkedin.com/in/yeb-havinga-86530825/) & [Dat Nguyen](https://www.linkedin.com/in/dat-nguyen-49a641138/) during the [Hugging Face community week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104)
This model is based on [t5-base-dutch](https://huggingface.co/flax-community/t5-base-dutch)
and fine-tuned to create summaries of news articles.
For a demo of the model, head over to the Hugging Face Spaces for the **[Netherformer 📰](https://huggingface.co/spaces/flax-community/netherformer)** example application!
## Dataset
`t5-base-dutch-demo` is fine-tuned on three mixed news sources:
1. **CNN DailyMail** translated to Dutch with MarianMT.
2. **XSUM** translated to Dutch with MarianMt.
3. News article summaries distilled from the nu.nl website.
The total number of training examples in this dataset is 1366592.
## Training
Training consisted of fine-tuning [t5-base-dutch](https://huggingface.co/flax-community/t5-base-dutch) with
the following parameters:
* Constant learning rate 0.0005
* Batch size 8
* 1 epoch (170842 steps)
## Evaluation
The performance of the summarization model is measured with the Rouge metric from the
Huggingface Datasets library.
```
"rouge{n}" (e.g. `"rouge1"`, `"rouge2"`) where: {n} is the n-gram based scoring,
"rougeL": Longest common subsequence based scoring.
```
* Rouge1: 23.8
* Rouge2: 6.9
* RougeL: 19.7
These scores are expected to improve if the model is trained with evaluation configured
for the CNN DM and XSUM datasets (translated to Dutch) individually.
|
{"language": ["dutch"], "tags": ["summarization", "seq2seq", "text-generation"], "datasets": ["cnn_dailymail", "xsum"], "pipeline_tag": "text2text-generation", "widget": [{"text": "Onderzoekers ontdekten dat vier van de vijf kinderen in Engeland die op school lunches hadden gegeten, op school voedsel hadden geprobeerd dat ze thuis niet hadden geprobeerd.De helft van de ondervraagde ouders zei dat hun kinderen hadden gevraagd om voedsel dat ze op school hadden gegeten om thuis te worden gekookt.De enqu\u00eate, van ongeveer 1.000 ouders, vond dat de meest populaire groenten wortelen, suikerma\u00efs en erwten waren.Aubergine, kikkererwten en spinazie waren een van de minst populaire.Van de ondervraagde ouders, 628 hadden kinderen die lunches op school aten. (% duidt op een deel van de ouders die zeiden dat hun kind elke groente zou eten) England's School Food Trust gaf opdracht tot het onderzoek na een onderzoek door de Mumsnet-website suggereerde dat sommige ouders hun kinderen lunchpakket gaven omdat ze dachten dat ze te kieskeurig waren om iets anders te eten. \"Schoolmaaltijden kunnen een geweldige manier zijn om ouders te helpen hun kinderen aan te moedigen om nieuw voedsel te proberen en om de verscheidenheid van voedsel in hun dieet te verhogen. \"Mumsnet medeoprichter, Carrie Longton, zei: \"Het krijgen van kinderen om gezond te eten is de droom van elke ouder, maar maaltijdtijden thuis kan vaak een slagveld en emotioneel geladen zijn. \"Vanuit Mumsnetters' ervaring lijkt het erop dat eenmaal op school is er een verlangen om in te passen bij iedereen anders en zelfs een aantal positieve peer pressure om op te scheppen over de verscheidenheid van wat voedsel je kunt eten. \"Schoolmaaltijden zijn ook verplaatst op nogal een beetje van toen Mumsnetters op school waren, met gezondere opties en meer afwisseling. \"Schoolmaaltijden in Engeland moeten nu voldoen aan strenge voedingsrichtlijnen.Ongeveer vier op de tien basisschoolkinderen in Engeland eten nu schoollunches, iets meer dan op middelbare scholen.Meer kinderen in Schotland eten schoollunches - ongeveer 46%.Het onderzoek werd online uitgevoerd tussen 26 februari en 5 maart onder een panel van ouders die ten minste \u00e9\u00e9n kind op school hadden van 4-17 jaar oud."}, {"text": "Het Londense trio staat klaar voor de beste Britse act en beste album, evenals voor twee nominaties in de beste song categorie. \"We kregen te horen zoals vanmorgen 'Oh I think you're genomineerd',\" zei Dappy. \"En ik was als 'Oh yeah, what one?' En nu zijn we genomineerd voor vier awards. Ik bedoel, wow! \"Bandmate Fazer voegde eraan toe: \"We dachten dat het het beste van ons was om met iedereen naar beneden te komen en hallo te zeggen tegen de camera's.En nu vinden we dat we vier nominaties hebben. \"De band heeft twee shots bij de beste song prijs, het krijgen van het knikje voor hun Tyncy Stryder samenwerking nummer \u00e9\u00e9n, en single Strong Again.Their album Uncle B zal ook gaan tegen platen van Beyonce en Kany \"Aan het eind van de dag zijn we dankbaar om te zijn waar we zijn in onze carri\u00e8res. \"Als het niet gebeurt dan gebeurt het niet - live om te vechten een andere dag en blijven maken albums en hits voor de fans. \"Dappy onthulde ook dat ze kunnen worden optreden live op de avond.De groep zal doen Nummer Een en ook een mogelijke uitlevering van de War Child single, I Got Soul.Het liefdadigheidslied is een re-working van The Killers' All These Things That I've Done en is ingesteld op artiesten als Chipmunk, Ironik en Pixie Lott.Dit jaar zal Mobos worden gehouden buiten Londen voor de eerste keer, in Glasgow op 30 september.N-Dubz zei dat ze op zoek waren naar optredens voor hun Schotse fans en bogen over hun recente shows ten noorden van de Londense We hebben Aberdeen ongeveer drie of vier maanden geleden gedaan - we hebben die show daar verbrijzeld! Overal waar we heen gaan slaan we hem in elkaar!\""}]}
|
text2text-generation
|
flax-community/t5-base-dutch-demo
|
[
"transformers",
"pytorch",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"summarization",
"seq2seq",
"text-generation",
"dataset:cnn_dailymail",
"dataset:xsum",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"dutch"
] |
TAGS
#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #seq2seq #text-generation #dataset-cnn_dailymail #dataset-xsum #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# t5-base-dutch-demo
Created by Yeb Havinga & Dat Nguyen during the Hugging Face community week
This model is based on t5-base-dutch
and fine-tuned to create summaries of news articles.
For a demo of the model, head over to the Hugging Face Spaces for the Netherformer example application!
## Dataset
't5-base-dutch-demo' is fine-tuned on three mixed news sources:
1. CNN DailyMail translated to Dutch with MarianMT.
2. XSUM translated to Dutch with MarianMt.
3. News article summaries distilled from the URL website.
The total number of training examples in this dataset is 1366592.
## Training
Training consisted of fine-tuning t5-base-dutch with
the following parameters:
* Constant learning rate 0.0005
* Batch size 8
* 1 epoch (170842 steps)
## Evaluation
The performance of the summarization model is measured with the Rouge metric from the
Huggingface Datasets library.
* Rouge1: 23.8
* Rouge2: 6.9
* RougeL: 19.7
These scores are expected to improve if the model is trained with evaluation configured
for the CNN DM and XSUM datasets (translated to Dutch) individually.
|
[
"# t5-base-dutch-demo \n\nCreated by Yeb Havinga & Dat Nguyen during the Hugging Face community week\n\nThis model is based on t5-base-dutch \nand fine-tuned to create summaries of news articles.\n\nFor a demo of the model, head over to the Hugging Face Spaces for the Netherformer example application!",
"## Dataset\n\n\n't5-base-dutch-demo' is fine-tuned on three mixed news sources:\n\n 1. CNN DailyMail translated to Dutch with MarianMT.\n 2. XSUM translated to Dutch with MarianMt.\n 3. News article summaries distilled from the URL website.\n\nThe total number of training examples in this dataset is 1366592.",
"## Training\n\nTraining consisted of fine-tuning t5-base-dutch with\nthe following parameters:\n\n * Constant learning rate 0.0005\n * Batch size 8\n * 1 epoch (170842 steps)",
"## Evaluation\n\nThe performance of the summarization model is measured with the Rouge metric from the\nHuggingface Datasets library.\n\n\n\n * Rouge1: 23.8\n * Rouge2: 6.9\n * RougeL: 19.7\n\nThese scores are expected to improve if the model is trained with evaluation configured\nfor the CNN DM and XSUM datasets (translated to Dutch) individually."
] |
[
"TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #seq2seq #text-generation #dataset-cnn_dailymail #dataset-xsum #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# t5-base-dutch-demo \n\nCreated by Yeb Havinga & Dat Nguyen during the Hugging Face community week\n\nThis model is based on t5-base-dutch \nand fine-tuned to create summaries of news articles.\n\nFor a demo of the model, head over to the Hugging Face Spaces for the Netherformer example application!",
"## Dataset\n\n\n't5-base-dutch-demo' is fine-tuned on three mixed news sources:\n\n 1. CNN DailyMail translated to Dutch with MarianMT.\n 2. XSUM translated to Dutch with MarianMt.\n 3. News article summaries distilled from the URL website.\n\nThe total number of training examples in this dataset is 1366592.",
"## Training\n\nTraining consisted of fine-tuning t5-base-dutch with\nthe following parameters:\n\n * Constant learning rate 0.0005\n * Batch size 8\n * 1 epoch (170842 steps)",
"## Evaluation\n\nThe performance of the summarization model is measured with the Rouge metric from the\nHuggingface Datasets library.\n\n\n\n * Rouge1: 23.8\n * Rouge2: 6.9\n * RougeL: 19.7\n\nThese scores are expected to improve if the model is trained with evaluation configured\nfor the CNN DM and XSUM datasets (translated to Dutch) individually."
] |
[
94,
74,
79,
43,
84
] |
[
"passage: TAGS\n#transformers #pytorch #jax #tensorboard #safetensors #t5 #text2text-generation #summarization #seq2seq #text-generation #dataset-cnn_dailymail #dataset-xsum #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# t5-base-dutch-demo \n\nCreated by Yeb Havinga & Dat Nguyen during the Hugging Face community week\n\nThis model is based on t5-base-dutch \nand fine-tuned to create summaries of news articles.\n\nFor a demo of the model, head over to the Hugging Face Spaces for the Netherformer example application!## Dataset\n\n\n't5-base-dutch-demo' is fine-tuned on three mixed news sources:\n\n 1. CNN DailyMail translated to Dutch with MarianMT.\n 2. XSUM translated to Dutch with MarianMt.\n 3. News article summaries distilled from the URL website.\n\nThe total number of training examples in this dataset is 1366592.## Training\n\nTraining consisted of fine-tuning t5-base-dutch with\nthe following parameters:\n\n * Constant learning rate 0.0005\n * Batch size 8\n * 1 epoch (170842 steps)## Evaluation\n\nThe performance of the summarization model is measured with the Rouge metric from the\nHuggingface Datasets library.\n\n\n\n * Rouge1: 23.8\n * Rouge2: 6.9\n * RougeL: 19.7\n\nThese scores are expected to improve if the model is trained with evaluation configured\nfor the CNN DM and XSUM datasets (translated to Dutch) individually."
] |
[
-0.08694397658109665,
0.09009997546672821,
-0.005064936354756355,
0.03558187931776047,
0.09012531489133835,
-0.004205352161079645,
0.05697113648056984,
0.06598088890314102,
-0.1093936339020729,
0.09265348315238953,
0.049437008798122406,
0.014947623014450073,
0.05770313739776611,
0.22532472014427185,
0.037677664309740067,
-0.2256479561328888,
0.0992063358426094,
-0.05557316541671753,
-0.15780630707740784,
0.08514700829982758,
0.10739921778440475,
-0.05112992972135544,
0.05162021145224571,
-0.035757437348365784,
-0.024223342537879944,
0.03557266294956207,
-0.0457698255777359,
-0.07111344486474991,
0.0797642171382904,
0.04147200658917427,
0.023651806637644768,
0.07670529186725616,
0.07974539697170258,
-0.18436585366725922,
0.0025394526310265064,
0.07624083757400513,
0.014376591891050339,
0.02633500099182129,
0.075124092400074,
0.04896429181098938,
0.2024315595626831,
-0.16703017055988312,
0.04958271235227585,
0.018844304606318474,
-0.10570712387561798,
-0.18810750544071198,
-0.1362687349319458,
-0.011271456256508827,
0.14912863075733185,
0.08231526613235474,
-0.03704285994172096,
0.11715731024742126,
-0.08670125901699066,
0.05942566320300102,
0.12993519008159637,
-0.20075416564941406,
-0.051971159875392914,
0.08600734919309616,
-0.013377121649682522,
0.04478255286812782,
-0.0678732767701149,
0.03507373481988907,
0.05886311084032059,
0.01110916305333376,
0.061744607985019684,
-0.0027198982425034046,
0.12024656683206558,
-0.025403065606951714,
-0.12317099422216415,
-0.07247112691402435,
0.1430145800113678,
0.060376767069101334,
-0.05280783399939537,
-0.12078408896923065,
0.014989216811954975,
0.015002526342868805,
-0.0626152753829956,
-0.06584491580724716,
0.01032746210694313,
-0.02495865896344185,
-0.0599750317633152,
-0.011122527532279491,
-0.09067951142787933,
0.003593785921111703,
-0.023286353796720505,
0.058080319315195084,
0.022051922976970673,
-0.0037239408120512962,
-0.006266923621296883,
0.09706715494394302,
0.015750017017126083,
-0.0895235538482666,
-0.032973308116197586,
-0.04489946365356445,
-0.09101191908121109,
-0.03499387577176094,
0.033224280923604965,
-0.06097017973661423,
0.0027689426206052303,
0.06953361630439758,
-0.17269372940063477,
0.02818002924323082,
0.010799898765981197,
-0.025689028203487396,
0.024846507236361504,
0.14268440008163452,
-0.0878102034330368,
-0.09997275471687317,
-0.007654089946299791,
0.06470125168561935,
-0.00972028449177742,
0.017499102279543877,
0.04061174765229225,
0.07921097427606583,
0.06877753138542175,
0.07209189236164093,
0.016901757568120956,
0.06647104769945145,
-0.03739836439490318,
-0.016534868627786636,
0.11677827686071396,
-0.11207444220781326,
-0.018168874084949493,
0.01223031710833311,
0.0014197661075741053,
0.05621178820729256,
0.015444595366716385,
-0.015993833541870117,
-0.09367981553077698,
0.08858524262905121,
-0.0634710043668747,
-0.07291115075349808,
-0.02947401814162731,
-0.08278527855873108,
0.022612443193793297,
0.03228139877319336,
-0.1038748100399971,
-0.04691176488995552,
-0.07353263348340988,
-0.06135716289281845,
0.003669566009193659,
-0.05522454157471657,
-0.03128163516521454,
-0.0860101580619812,
-0.0822371169924736,
0.020260317251086235,
0.04829544946551323,
0.08602424710988998,
-0.0377214290201664,
0.05909495800733566,
-0.04111086577177048,
0.04611986130475998,
0.09790688008069992,
0.03211040794849396,
-0.07758276164531708,
0.009732209146022797,
-0.13074365258216858,
0.16899339854717255,
-0.10579745471477509,
0.017931582406163216,
-0.1528337150812149,
-0.08577976375818253,
-0.08288521319627762,
0.03993833810091019,
0.0629686564207077,
0.20529715716838837,
-0.19000685214996338,
-0.08007015287876129,
0.14920644462108612,
-0.1128445416688919,
-0.029454680159687996,
0.10784751921892166,
-0.00010289451165590435,
0.053698040544986725,
0.07601138204336166,
0.07028281688690186,
0.010305658914148808,
-0.10692945122718811,
-0.00017771257262211293,
-0.012909036129713058,
0.017746469005942345,
0.07900332659482956,
0.058858346194028854,
-0.0493326410651207,
-0.007466645445674658,
0.03458603471517563,
-0.07801755517721176,
0.01120680756866932,
-0.04619188234210014,
-0.023039979860186577,
0.05212932080030441,
0.015476156957447529,
0.001183839631266892,
0.006359749473631382,
-0.007970268838107586,
-0.04986279830336571,
-0.12005481123924255,
-0.022291431203484535,
0.0835084468126297,
-0.053995486348867416,
0.012638024054467678,
-0.0577787421643734,
0.007010691333562136,
0.0010956411715596914,
0.03970810025930405,
-0.09617234021425247,
-0.08042696118354797,
0.03588109835982323,
-0.048288844525814056,
-0.005202172789722681,
0.0431026965379715,
0.018184233456850052,
0.005377922207117081,
-0.06575648486614227,
-0.01943128928542137,
-0.035484593361616135,
0.0038884105160832405,
-0.028615545481443405,
-0.12858478724956512,
0.04604624956846237,
-0.0467086061835289,
0.08819947391748428,
-0.16860629618167877,
0.00200048740953207,
0.11106070876121521,
0.0993533581495285,
0.0672893226146698,
-0.0755506232380867,
0.07317507266998291,
0.039650242775678635,
0.01020904816687107,
-0.08274198323488235,
0.010145949199795723,
0.018415218219161034,
-0.033506523817777634,
0.0907418429851532,
-0.1242833063006401,
-0.08466098457574844,
0.04820787161588669,
0.11080175638198853,
-0.08464197814464569,
0.054513394832611084,
-0.07113083451986313,
-0.006874241400510073,
-0.09305460005998611,
-0.04359446093440056,
0.04889075085520744,
0.04413465037941933,
0.0954984724521637,
-0.08559147268533707,
-0.06227588281035423,
0.012441971339285374,
0.024527661502361298,
-0.10913451015949249,
0.11468680948019028,
0.0369037501513958,
-0.09874983876943588,
0.037651047110557556,
0.04952890798449516,
0.08873296529054642,
0.09736845642328262,
-0.03654608502984047,
-0.062128446996212006,
-0.026715192943811417,
0.04370618984103203,
0.015529461205005646,
0.08089450001716614,
-0.04166838526725769,
0.03981003165245056,
0.03845028951764107,
0.0553034283220768,
0.0265030600130558,
-0.07826146483421326,
0.0070906043983995914,
0.010313528589904308,
-0.060726750642061234,
-0.002376013435423374,
0.08151600509881973,
0.009825828485190868,
0.08551611751317978,
0.008091716095805168,
-0.003340773982927203,
-0.039631932973861694,
-0.039643194526433945,
-0.10407731682062149,
0.20213720202445984,
-0.0777304619550705,
-0.29909709095954895,
-0.10084431618452072,
0.07158645987510681,
-0.025134284049272537,
-0.04889938235282898,
0.06074187159538269,
-0.11760330945253372,
-0.10306699573993683,
-0.15077388286590576,
0.05952627956867218,
0.06066343933343887,
0.01144044566899538,
0.05724309757351875,
0.03850088268518448,
0.047906406223773956,
-0.1469763070344925,
-0.003915990237146616,
-0.07122237235307693,
-0.09395547211170197,
0.039836227893829346,
-0.055795375257730484,
0.07369108498096466,
0.09685161709785461,
0.04754827916622162,
0.01415104791522026,
-0.05475423112511635,
0.22605985403060913,
-0.12736628949642181,
0.07892346382141113,
0.07273906469345093,
0.0899762511253357,
0.06175539642572403,
0.15488989651203156,
0.015874162316322327,
-0.09771459549665451,
0.05467357113957405,
0.08349506556987762,
-0.0229574516415596,
-0.22680321335792542,
-0.09848053753376007,
-0.05247122049331665,
-0.003445659065619111,
0.035825930535793304,
0.05156160145998001,
-0.07258161157369614,
-0.0220708679407835,
-0.07030495256185532,
-0.020526550710201263,
0.07965706288814545,
0.05308663472533226,
0.05553806573152542,
0.027398379519581795,
0.06411362439393997,
-0.07519283145666122,
-0.014905299060046673,
0.13179801404476166,
-0.04815635457634926,
0.21507282555103302,
-0.08102184534072876,
0.03414541482925415,
0.07387426495552063,
0.0147264888510108,
-0.01060481183230877,
0.08307542651891708,
-0.016288550570607185,
-0.01759406551718712,
-0.044530417770147324,
-0.07873677462339401,
-0.0017526103183627129,
0.017325198277831078,
-0.010081624612212181,
-0.06679415702819824,
-0.10888546705245972,
0.05630762130022049,
0.07825537025928497,
0.20442992448806763,
0.10586094111204147,
-0.11507659405469894,
-0.06561676412820816,
0.02518399804830551,
-0.01701546274125576,
-0.008662361651659012,
-0.002925658831372857,
0.07412812113761902,
-0.10733912140130997,
0.07460682094097137,
-0.014974181540310383,
0.062287572771310806,
-0.05735085904598236,
0.01237434335052967,
0.008758858777582645,
0.06265471130609512,
-0.0546126663684845,
0.06110765412449837,
-0.16525878012180328,
0.2585907280445099,
-0.025967540219426155,
0.09604179859161377,
-0.004729658365249634,
0.01755680702626705,
0.0061284638941287994,
0.03490101918578148,
0.14985878765583038,
0.04900965839624405,
-0.13204549252986908,
-0.07857886701822281,
-0.10246331989765167,
0.026158900931477547,
0.019114037975668907,
-0.03412942588329315,
0.056781262159347534,
-0.008712654933333397,
-0.012476268224418163,
-0.008458811789751053,
-0.09908495843410492,
-0.13956433534622192,
-0.1412467658519745,
0.07819736003875732,
-0.08967109769582748,
-0.06756959110498428,
-0.10355129837989807,
-0.09070960432291031,
-0.038324836641550064,
0.18772679567337036,
-0.027261702343821526,
-0.06753792613744736,
-0.14797605574131012,
0.09550894796848297,
0.10556261241436005,
-0.07398878782987595,
-0.0003113396232947707,
0.0064771296456456184,
0.13880011439323425,
0.005506196990609169,
-0.03814828023314476,
0.030935222283005714,
-0.07556802034378052,
-0.16075803339481354,
-0.01477651298046112,
0.14839224517345428,
0.06097529083490372,
0.007750913966447115,
0.018202846869826317,
0.005023098550736904,
0.08795582503080368,
-0.1234104111790657,
-0.014433939941227436,
0.06261173635721207,
-0.028943190351128578,
0.026323603466153145,
-0.09109984338283539,
-0.17000654339790344,
-0.11672244966030121,
-0.058041930198669434,
0.04841950535774231,
0.2782059907913208,
-0.06639081239700317,
0.10090989619493484,
0.14127598702907562,
-0.06610693782567978,
-0.27041536569595337,
-0.05208427459001541,
0.08723798394203186,
-0.003505383851006627,
0.0514027401804924,
-0.1126745343208313,
0.05216557905077934,
0.08482512086629868,
0.006811044178903103,
-0.1304292231798172,
-0.31562432646751404,
-0.12993884086608887,
0.043429721146821976,
0.015512621961534023,
0.11426267772912979,
-0.05188269540667534,
-0.028939180076122284,
0.015662480145692825,
-0.052822478115558624,
0.043926700949668884,
0.06347818672657013,
0.04910605400800705,
0.003772746305912733,
-0.02381913736462593,
0.03656967729330063,
-0.02356542833149433,
0.11278470605611801,
0.0470169298350811,
0.023359864950180054,
-0.04862150177359581,
-0.0028577724006026983,
0.059888869524002075,
-0.06870685517787933,
0.10309956222772598,
-0.005281010176986456,
0.07171397656202316,
-0.15618644654750824,
-0.011016000993549824,
-0.07794442027807236,
0.049349889159202576,
-0.06974209100008011,
-0.03165782615542412,
-0.0718543529510498,
0.10322501510381699,
0.10972613096237183,
-0.03068709746003151,
0.032257623970508575,
0.004538856912404299,
-0.01724596507847309,
0.021937211975455284,
0.07971028983592987,
-0.03554842621088028,
-0.04907606169581413,
-0.011065071448683739,
-0.0031425124034285545,
-0.003768330905586481,
-0.14417555928230286,
0.03984834626317024,
0.08261936157941818,
0.004446148406714201,
0.10040256381034851,
0.018442781642079353,
-0.11372973024845123,
0.0017065401189029217,
0.11089392006397247,
-0.10087468475103378,
-0.20345991849899292,
-0.022594619542360306,
0.03624463081359863,
-0.07784096896648407,
0.036069244146347046,
0.1337537318468094,
0.007143415976315737,
-0.010101222433149815,
0.0013682304415851831,
0.08527680486440659,
0.024573178961873055,
0.16157646477222443,
-0.04154406860470772,
0.06397977471351624,
-0.06810696423053741,
0.13864950835704803,
0.08040151000022888,
-0.12139549851417542,
0.01550350058823824,
0.13823716342449188,
-0.11905507743358612,
-0.03271733596920967,
0.0999869704246521,
0.05988150089979172,
0.014641528949141502,
0.0018979826709255576,
-0.040404703468084335,
-0.08964505791664124,
0.04752922058105469,
0.020354952663183212,
0.04020761325955391,
0.05602920800447464,
-0.02312101610004902,
-0.03907206654548645,
-0.05703875422477722,
0.10327977687120438,
0.15292418003082275,
-0.010339009575545788,
-0.03076479583978653,
0.024097401648759842,
0.005516627803444862,
0.03169854357838631,
-0.03699098899960518,
-0.044972989708185196,
-0.07957561314105988,
-0.029683930799365044,
0.03130131959915161,
-0.027862267568707466,
-0.058774933218955994,
-0.015129242092370987,
-0.046391163021326065,
-0.06959153711795807,
-0.026480967178940773,
0.042965952306985855,
-0.041066985577344894,
-0.024654138833284378,
-0.051810137927532196,
0.09783881157636642,
-0.09674324840307236,
-0.013045142404735088,
0.03375309333205223,
-0.12944157421588898,
0.12213190644979477,
-0.04228111356496811,
-0.002582271583378315,
0.048806242644786835,
-0.1258920431137085,
-0.00047765503404662013,
0.017166264355182648,
0.011111048981547356,
0.04622351750731468,
-0.10666216909885406,
0.020455555990338326,
0.0026476543862372637,
-0.011845053173601627,
0.011323180049657822,
0.0041961874812841415,
-0.08242937922477722,
0.05540861934423447,
-0.036312852054834366,
-0.06566176563501358,
-0.03884324058890343,
0.07777044922113419,
0.0675734356045723,
0.03973431885242462,
0.1517656445503235,
-0.08911052346229553,
-0.02322242595255375,
-0.14953331649303436,
-0.025247639045119286,
0.008559378795325756,
-0.04292953759431839,
-0.039580024778842926,
0.004612737335264683,
0.08074142783880234,
-0.005113013554364443,
0.05498750880360603,
-0.010848105885088444,
-0.01738967001438141,
0.056854359805583954,
-0.1135951429605484,
0.007717492524534464,
0.03777873516082764,
0.04310727119445801,
0.01874169334769249,
0.010540701448917389,
-0.04880743473768234,
-0.060746561735868454,
0.012509921565651894,
0.07757332921028137,
0.18677717447280884,
0.11499558389186859,
0.10211070626974106,
0.07966413348913193,
-0.011782030574977398,
-0.05302633345127106,
-0.09088335931301117,
0.06547656655311584,
-0.07131029665470123,
0.08848199993371964,
-0.035622987896203995,
0.05426929518580437,
0.12375278770923615,
-0.1593620777130127,
0.061650823801755905,
-0.003652657847851515,
-0.059866636991500854,
-0.06143150478601456,
-0.0906953290104866,
-0.05891703814268112,
-0.1003086194396019,
0.019643811509013176,
-0.11903689056634903,
0.020418012514710426,
0.002428682753816247,
0.08626382052898407,
0.009630339220166206,
0.08189523965120316,
-0.06455240398645401,
-0.05372026935219765,
0.11776255071163177,
0.012530364096164703,
-0.011008543893694878,
0.069597989320755,
0.05949777364730835,
-0.0011259421007707715,
0.11622431129217148,
0.05455930158495903,
0.0671243965625763,
0.008234702050685883,
0.04562067613005638,
-0.03499975800514221,
-0.0730450227856636,
0.013338061049580574,
-0.018322687596082687,
-0.007025860249996185,
0.12723705172538757,
0.08526630699634552,
-0.04616600647568703,
0.01928633265197277,
0.16737855970859528,
0.00218193419277668,
-0.12150060385465622,
-0.17634615302085876,
0.20439554750919342,
0.0582445003092289,
0.05486937239766121,
0.013506118208169937,
-0.11542773246765137,
-0.012416915968060493,
0.15681764483451843,
0.16246861219406128,
0.049523722380399704,
-0.028962673619389534,
-0.03959430381655693,
-0.007585427723824978,
-0.02496405504643917,
0.1104101911187172,
0.043653879314661026,
0.20812495052814484,
-0.03183465451002121,
-0.0012897744309157133,
-0.04482242092490196,
-0.035900477319955826,
-0.03846953064203262,
0.09856975823640823,
0.03481893986463547,
0.010199972428381443,
-0.06675451248884201,
0.16506525874137878,
0.007111360318958759,
-0.24339820444583893,
-0.03181913495063782,
-0.06791973859071732,
-0.1346437782049179,
-0.0018488584319129586,
0.005625027231872082,
0.0013696160167455673,
0.0598292201757431,
0.03651988506317139,
0.00975891761481762,
0.11686684191226959,
0.005341415759176016,
0.01969805732369423,
-0.11168280243873596,
0.05188777670264244,
-0.059412699192762375,
0.22227057814598083,
0.014105224050581455,
-0.00044793970300816,
0.09294530004262924,
-0.028444988653063774,
-0.11666177213191986,
0.06449639052152634,
0.03048834204673767,
0.05415520444512367,
0.056822434067726135,
0.09627702087163925,
-0.032066553831100464,
0.05779959261417389,
0.06552864611148834,
-0.026538792997598648,
0.041215382516384125,
-0.0892598107457161,
-0.041974037885665894,
-0.057221971452236176,
0.09514301270246506,
-0.04088132828474045,
0.09394808113574982,
0.1818912923336029,
-0.021500397473573685,
0.06819095462560654,
-0.059121690690517426,
0.03849335014820099,
-0.04058211296796799,
0.07826007902622223,
0.03693636879324913,
-0.221441388130188,
0.053388532251119614,
-0.07906446605920792,
0.04800129681825638,
-0.14696909487247467,
-0.03404881805181503,
-0.031553301960229874,
-0.0020299642346799374,
-0.008928796276450157,
0.18611180782318115,
0.015785152092576027,
-0.012925993651151657,
-0.02924167364835739,
-0.04001842439174652,
-0.0211784727871418,
0.12543095648288727,
-0.07903125882148743,
-0.025019478052854538
] |
null | null |
transformers
|
# t5-base-dutch
Created by [Yeb Havinga](https://www.linkedin.com/in/yeb-havinga-86530825/)
& [Dat Nguyen](https://www.linkedin.com/in/dat-nguyen-49a641138/) during the [Hugging Face community week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google, for the project [Pre-train T5 from scratch in Dutch](https://discuss.huggingface.co/t/pretrain-t5-from-scratch-in-dutch/8109).
See also the fine-tuned [t5-base-dutch-demo](https://huggingface.co/flax-community/t5-base-dutch-demo) model,
and the demo application **[Netherformer 📰](https://huggingface.co/spaces/flax-community/netherformer)**,
that are based on this model.
**5 jan 2022: Model updated. Evaluation accuracy increased from 0.64 to 0.70.**
**11 jan 2022: See also [yhavinga/t5-v1.1-base-dutch-cased](https://huggingface.co/yhavinga/t5-v1.1-base-dutch-cased) with eval acc 0.78**
## Model
* Configuration based on `google/t5-base`
* 12 layers, 12 heads
* Dropout set to 0.1
## Dataset
This model was trained on the `full` configuration of [cleaned Dutch mC4](https://huggingface.co/datasets/yhavinga/mc4_nl_cleaned),
which is the original mC4, except
* Documents that contained words from a selection of the Dutch and English [List of Dirty Naught Obscene and Otherwise Bad Words](https://github.com/LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words) are removed
* Sentences with less than 3 words are removed
* Sentences with a word of more than 1000 characters are removed
* Documents with less than 5 sentences are removed
* Documents with "javascript", "lorum ipsum", "terms of use", "privacy policy", "cookie policy", "uses cookies",
"use of cookies", "use cookies", "elementen ontbreken", "deze printversie" are removed.
## Tokenization
A SentencePiece tokenizer was trained from scratch on this dataset.
The total tokens of the `full` configuration is 34B
## Training
The model was trained on the `full` mc4_nl_cleaned dataset configuration for 1 epoch, consisting of 34B tokens,
for 528 482 steps with a batch size of 128 and took 57 hours.
A triangle learning rate schedule was used, with peak learning rate 0.005.
## Evaluation
* Loss: 1.38
* Accuracy: 0.70
|
{"language": ["dutch"], "license": "apache-2.0", "tags": ["seq2seq", "lm-head"], "datasets": ["yhavinga/mc4_nl_cleaned"], "inference": false}
|
text2text-generation
|
flax-community/t5-base-dutch
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"t5",
"text2text-generation",
"seq2seq",
"lm-head",
"dataset:yhavinga/mc4_nl_cleaned",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"dutch"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #seq2seq #lm-head #dataset-yhavinga/mc4_nl_cleaned #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
# t5-base-dutch
Created by Yeb Havinga
& Dat Nguyen during the Hugging Face community week, organized by HuggingFace and TPU usage sponsored by Google, for the project Pre-train T5 from scratch in Dutch.
See also the fine-tuned t5-base-dutch-demo model,
and the demo application Netherformer ,
that are based on this model.
5 jan 2022: Model updated. Evaluation accuracy increased from 0.64 to 0.70.
11 jan 2022: See also yhavinga/t5-v1.1-base-dutch-cased with eval acc 0.78
## Model
* Configuration based on 'google/t5-base'
* 12 layers, 12 heads
* Dropout set to 0.1
## Dataset
This model was trained on the 'full' configuration of cleaned Dutch mC4,
which is the original mC4, except
* Documents that contained words from a selection of the Dutch and English List of Dirty Naught Obscene and Otherwise Bad Words are removed
* Sentences with less than 3 words are removed
* Sentences with a word of more than 1000 characters are removed
* Documents with less than 5 sentences are removed
* Documents with "javascript", "lorum ipsum", "terms of use", "privacy policy", "cookie policy", "uses cookies",
"use of cookies", "use cookies", "elementen ontbreken", "deze printversie" are removed.
## Tokenization
A SentencePiece tokenizer was trained from scratch on this dataset.
The total tokens of the 'full' configuration is 34B
## Training
The model was trained on the 'full' mc4_nl_cleaned dataset configuration for 1 epoch, consisting of 34B tokens,
for 528 482 steps with a batch size of 128 and took 57 hours.
A triangle learning rate schedule was used, with peak learning rate 0.005.
## Evaluation
* Loss: 1.38
* Accuracy: 0.70
|
[
"# t5-base-dutch \n\nCreated by Yeb Havinga\n& Dat Nguyen during the Hugging Face community week, organized by HuggingFace and TPU usage sponsored by Google, for the project Pre-train T5 from scratch in Dutch.\n\nSee also the fine-tuned t5-base-dutch-demo model,\nand the demo application Netherformer ,\nthat are based on this model.\n\n5 jan 2022: Model updated. Evaluation accuracy increased from 0.64 to 0.70.\n\n11 jan 2022: See also yhavinga/t5-v1.1-base-dutch-cased with eval acc 0.78",
"## Model\n\n* Configuration based on 'google/t5-base'\n* 12 layers, 12 heads\n* Dropout set to 0.1",
"## Dataset\n\nThis model was trained on the 'full' configuration of cleaned Dutch mC4,\nwhich is the original mC4, except\n\n * Documents that contained words from a selection of the Dutch and English List of Dirty Naught Obscene and Otherwise Bad Words are removed\n * Sentences with less than 3 words are removed\n * Sentences with a word of more than 1000 characters are removed\n * Documents with less than 5 sentences are removed\n * Documents with \"javascript\", \"lorum ipsum\", \"terms of use\", \"privacy policy\", \"cookie policy\", \"uses cookies\",\n \"use of cookies\", \"use cookies\", \"elementen ontbreken\", \"deze printversie\" are removed.",
"## Tokenization\n\nA SentencePiece tokenizer was trained from scratch on this dataset.\nThe total tokens of the 'full' configuration is 34B",
"## Training\n\nThe model was trained on the 'full' mc4_nl_cleaned dataset configuration for 1 epoch, consisting of 34B tokens,\nfor 528 482 steps with a batch size of 128 and took 57 hours.\nA triangle learning rate schedule was used, with peak learning rate 0.005.",
"## Evaluation\n\n* Loss: 1.38\n* Accuracy: 0.70"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #seq2seq #lm-head #dataset-yhavinga/mc4_nl_cleaned #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n",
"# t5-base-dutch \n\nCreated by Yeb Havinga\n& Dat Nguyen during the Hugging Face community week, organized by HuggingFace and TPU usage sponsored by Google, for the project Pre-train T5 from scratch in Dutch.\n\nSee also the fine-tuned t5-base-dutch-demo model,\nand the demo application Netherformer ,\nthat are based on this model.\n\n5 jan 2022: Model updated. Evaluation accuracy increased from 0.64 to 0.70.\n\n11 jan 2022: See also yhavinga/t5-v1.1-base-dutch-cased with eval acc 0.78",
"## Model\n\n* Configuration based on 'google/t5-base'\n* 12 layers, 12 heads\n* Dropout set to 0.1",
"## Dataset\n\nThis model was trained on the 'full' configuration of cleaned Dutch mC4,\nwhich is the original mC4, except\n\n * Documents that contained words from a selection of the Dutch and English List of Dirty Naught Obscene and Otherwise Bad Words are removed\n * Sentences with less than 3 words are removed\n * Sentences with a word of more than 1000 characters are removed\n * Documents with less than 5 sentences are removed\n * Documents with \"javascript\", \"lorum ipsum\", \"terms of use\", \"privacy policy\", \"cookie policy\", \"uses cookies\",\n \"use of cookies\", \"use cookies\", \"elementen ontbreken\", \"deze printversie\" are removed.",
"## Tokenization\n\nA SentencePiece tokenizer was trained from scratch on this dataset.\nThe total tokens of the 'full' configuration is 34B",
"## Training\n\nThe model was trained on the 'full' mc4_nl_cleaned dataset configuration for 1 epoch, consisting of 34B tokens,\nfor 528 482 steps with a batch size of 128 and took 57 hours.\nA triangle learning rate schedule was used, with peak learning rate 0.005.",
"## Evaluation\n\n* Loss: 1.38\n* Accuracy: 0.70"
] |
[
85,
139,
29,
154,
35,
73,
17
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #seq2seq #lm-head #dataset-yhavinga/mc4_nl_cleaned #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n# t5-base-dutch \n\nCreated by Yeb Havinga\n& Dat Nguyen during the Hugging Face community week, organized by HuggingFace and TPU usage sponsored by Google, for the project Pre-train T5 from scratch in Dutch.\n\nSee also the fine-tuned t5-base-dutch-demo model,\nand the demo application Netherformer ,\nthat are based on this model.\n\n5 jan 2022: Model updated. Evaluation accuracy increased from 0.64 to 0.70.\n\n11 jan 2022: See also yhavinga/t5-v1.1-base-dutch-cased with eval acc 0.78## Model\n\n* Configuration based on 'google/t5-base'\n* 12 layers, 12 heads\n* Dropout set to 0.1## Dataset\n\nThis model was trained on the 'full' configuration of cleaned Dutch mC4,\nwhich is the original mC4, except\n\n * Documents that contained words from a selection of the Dutch and English List of Dirty Naught Obscene and Otherwise Bad Words are removed\n * Sentences with less than 3 words are removed\n * Sentences with a word of more than 1000 characters are removed\n * Documents with less than 5 sentences are removed\n * Documents with \"javascript\", \"lorum ipsum\", \"terms of use\", \"privacy policy\", \"cookie policy\", \"uses cookies\",\n \"use of cookies\", \"use cookies\", \"elementen ontbreken\", \"deze printversie\" are removed.## Tokenization\n\nA SentencePiece tokenizer was trained from scratch on this dataset.\nThe total tokens of the 'full' configuration is 34B"
] |
[
-0.02609148994088173,
0.05992508679628372,
-0.004669238347560167,
0.05377131327986717,
0.03349453955888748,
-0.03453216329216957,
0.04492230713367462,
0.08323096483945847,
-0.016901981085538864,
0.09782382845878601,
0.012205538339912891,
0.07388671487569809,
0.04824754223227501,
0.08968400955200195,
0.004811753053218126,
-0.1469578742980957,
0.1241278350353241,
-0.0490647554397583,
0.0316852331161499,
0.09244462847709656,
0.10124563425779343,
-0.05640101432800293,
0.08786030858755112,
-0.018534615635871887,
-0.038164447993040085,
-0.004273226950317621,
-0.03853404149413109,
-0.0541730597615242,
0.11438145488500595,
0.03312968835234642,
0.06746478378772736,
0.07773826271295547,
-0.0445459820330143,
-0.179036945104599,
0.008741793222725391,
0.033611148595809937,
-0.04979657754302025,
0.019762840121984482,
0.04408106952905655,
0.09461098164319992,
0.1549825370311737,
-0.09253855794668198,
-0.0030468173790723085,
0.047614723443984985,
-0.07984378933906555,
-0.11070674657821655,
-0.1375126987695694,
-0.0046797143295407295,
0.09921355545520782,
0.09469719976186752,
-0.03353031724691391,
0.09799836575984955,
-0.060840871185064316,
0.0354296900331974,
0.1510700285434723,
-0.16967090964317322,
-0.000028500047847046517,
0.01754247583448887,
-0.016634061932563782,
-0.04316689446568489,
-0.06645230203866959,
0.04415512457489967,
-0.04063912481069565,
0.0454244427382946,
-0.0060995942912995815,
-0.032483067363500595,
-0.015940755605697632,
-0.061146173626184464,
-0.02234060876071453,
-0.019562527537345886,
0.14089056849479675,
0.06014196574687958,
-0.041622985154390335,
-0.15639524161815643,
-0.07179657369852066,
-0.08790625631809235,
-0.04516759514808655,
-0.043888092041015625,
-0.013890751637518406,
0.019531622529029846,
-0.011667169630527496,
-0.11408179998397827,
-0.07252293080091476,
0.050890397280454636,
-0.025559822097420692,
0.028333816677331924,
0.024081623181700706,
-0.036492008715867996,
0.0308060422539711,
0.06675597280263901,
-0.018089942634105682,
-0.1173902302980423,
-0.0722925215959549,
-0.03456614539027214,
0.002332557924091816,
0.03935328498482704,
-0.018495993688702583,
-0.10461282730102539,
0.03236931189894676,
0.136882483959198,
-0.02561502903699875,
0.01689283549785614,
-0.09895720332860947,
0.04107214882969856,
-0.04660755395889282,
0.10261981189250946,
0.000698418531101197,
-0.04534250125288963,
0.0500103160738945,
-0.03632792457938194,
0.08775056153535843,
-0.006422397214919329,
-0.06088707968592644,
0.0244870875030756,
0.08728650957345963,
0.08890660107135773,
0.044893983751535416,
0.06056232377886772,
-0.036739248782396317,
-0.02455793134868145,
0.22882312536239624,
-0.10849557816982269,
0.03845784440636635,
-0.027980687096714973,
0.048318542540073395,
-0.00033259764313697815,
0.013877272605895996,
0.030387938022613525,
-0.07919088751077652,
0.0816924050450325,
-0.048523493111133575,
-0.023005686700344086,
-0.05344847962260246,
-0.119414322078228,
0.04667907580733299,
-0.037918947637081146,
-0.09226355701684952,
-0.1179460734128952,
-0.13097555935382843,
-0.02675805799663067,
-0.009306327439844608,
-0.03362739831209183,
0.043157368898391724,
-0.05238749086856842,
-0.08006157726049423,
-0.02942889928817749,
0.011570683680474758,
-0.015180348418653011,
-0.029743153601884842,
0.006713998969644308,
-0.10954385250806808,
0.023593036457896233,
-0.09503688663244247,
0.01920773647725582,
-0.11935584247112274,
0.02475801482796669,
-0.23463211953639984,
0.033737026154994965,
-0.07289224863052368,
0.031168727204203606,
-0.1188260018825531,
-0.08524631708860397,
-0.10681050270795822,
0.020972339436411858,
-0.00968756154179573,
0.11289195716381073,
-0.18325233459472656,
-0.07322409749031067,
0.1421695351600647,
-0.17241644859313965,
-0.013179605826735497,
0.1387801617383957,
0.04048147425055504,
-0.005062907002866268,
0.07033471018075943,
0.0927986353635788,
-0.026195822283625603,
-0.0867973268032074,
-0.07987987995147705,
-0.08976268023252487,
0.06745675206184387,
0.0633014664053917,
0.0015044589526951313,
-0.15012213587760925,
0.04175255075097084,
0.07044103741645813,
-0.05270383879542351,
-0.008610378950834274,
0.024878576397895813,
-0.04192575067281723,
-0.014788289554417133,
0.00596350384876132,
0.06270454078912735,
-0.042631153017282486,
-0.025808367878198624,
-0.03203500807285309,
-0.10404667258262634,
-0.06269245594739914,
0.07451238483190536,
-0.04435870423913002,
0.10805705189704895,
-0.05116843059659004,
0.055936139076948166,
0.03887267783284187,
-0.003931209910660982,
-0.1271684169769287,
-0.1358344554901123,
0.05395345389842987,
-0.10740868002176285,
-0.032727062702178955,
0.015877438709139824,
0.014872457832098007,
0.06241550296545029,
-0.09279131144285202,
-0.008279111236333847,
-0.11390132457017899,
-0.025919172912836075,
-0.01444853562861681,
-0.07668214291334152,
-0.04213154688477516,
-0.010740119963884354,
0.06683751195669174,
-0.04937612637877464,
-0.0029822925571352243,
0.1437138468027115,
0.13756515085697174,
0.06673593819141388,
-0.09113137423992157,
0.014708458445966244,
0.04897405207157135,
-0.014581521041691303,
-0.05679932236671448,
-0.00799490325152874,
0.008669438771903515,
0.0088497931137681,
0.07529116421937943,
-0.15726421773433685,
-0.09343566745519638,
0.10843871533870697,
0.09203153103590012,
0.004594715312123299,
0.05934975668787956,
-0.07197464257478714,
-0.003966740798205137,
-0.04884113371372223,
-0.10219259560108185,
0.04254467785358429,
0.07096675038337708,
0.0872742310166359,
-0.03623150661587715,
-0.06029369309544563,
-0.019757676869630814,
0.0732274204492569,
-0.11031562089920044,
0.021926909685134888,
0.08533234894275665,
-0.0672290101647377,
0.07213321328163147,
0.1129983589053154,
0.051346972584724426,
0.16561192274093628,
0.02349723130464554,
-0.05380229651927948,
-0.04003807157278061,
0.024079788476228714,
0.017312543466687202,
0.053866803646087646,
-0.11664465814828873,
0.02633260190486908,
0.07280436903238297,
0.019895540550351143,
0.06043358892202377,
-0.02787168137729168,
0.03356996551156044,
0.011558298952877522,
-0.04770555719733238,
0.1317843645811081,
0.0766536295413971,
-0.0022513391450047493,
0.059518326073884964,
-0.035751521587371826,
-0.050670161843299866,
-0.014282629825174809,
-0.007912879809737206,
-0.05446070432662964,
0.12786954641342163,
-0.1271311640739441,
-0.30791354179382324,
-0.11148595809936523,
0.039030641317367554,
-0.14859186112880707,
-0.007330820430070162,
0.08537375926971436,
-0.0500442311167717,
-0.16192437708377838,
-0.1343635767698288,
0.060737673193216324,
0.04501013830304146,
-0.028919193893671036,
-0.004441540222615004,
0.0025911505799740553,
0.024827508255839348,
-0.13513144850730896,
-0.03123403526842594,
0.008390378206968307,
-0.03590162843465805,
0.02727988176047802,
-0.020898552611470222,
0.11931762844324112,
0.0639934241771698,
0.025849735364317894,
0.005494101904332638,
-0.007893688045442104,
0.16855017840862274,
-0.0550132617354393,
0.08904088288545609,
0.08215772360563278,
0.03820722550153732,
0.039869602769613266,
0.09785345941781998,
-0.01243522483855486,
-0.07123690098524094,
0.042060256004333496,
0.09602908045053482,
0.0018598682945594192,
-0.1681983470916748,
-0.13419315218925476,
-0.06796790659427643,
-0.04541919752955437,
0.042588986456394196,
0.0998682826757431,
0.14660340547561646,
-0.009662538766860962,
-0.11437755823135376,
-0.040307071059942245,
0.027941929176449776,
0.09898748248815536,
0.15044665336608887,
-0.0027687421534210443,
0.06813295930624008,
-0.02969057857990265,
-0.030646836385130882,
0.11292809993028641,
-0.057249803096055984,
0.15362504124641418,
0.06728723645210266,
0.20066656172275543,
0.08217416703701019,
0.09860491007566452,
0.006117698736488819,
0.06895720958709717,
-0.00533674331381917,
0.04193577542901039,
-0.015996482223272324,
-0.09866509586572647,
0.024807855486869812,
0.04389865696430206,
-0.021001741290092468,
-0.06961442530155182,
-0.03445854410529137,
-0.023345807567238808,
0.13149802386760712,
0.15147800743579865,
0.03997494652867317,
-0.07561191916465759,
-0.05948019400238991,
0.006094809155911207,
-0.020535750314593315,
-0.05501437559723854,
-0.04858816787600517,
0.12200285494327545,
-0.07431924343109131,
0.0423397496342659,
-0.02765459194779396,
0.053870219737291336,
-0.05858878791332245,
0.05150964856147766,
0.012556719593703747,
0.03285586088895798,
-0.01893678307533264,
0.03615296632051468,
-0.22018775343894958,
0.1768513321876526,
0.011199476197361946,
0.008699952624738216,
-0.06340149790048599,
-0.00165649747941643,
-0.007194580975919962,
0.017542583867907524,
0.14817680418491364,
0.023741330951452255,
-0.2611905634403229,
-0.018226563930511475,
-0.13428865373134613,
0.02081558294594288,
0.07018131017684937,
-0.07964973151683807,
0.018072379752993584,
0.002736118156462908,
-0.060912832617759705,
0.011997154913842678,
0.15446075797080994,
-0.17221443355083466,
-0.12067569047212601,
0.12537053227424622,
0.05998653173446655,
0.06407343596220016,
-0.05301254242658615,
-0.07566498219966888,
-0.045945990830659866,
0.23959191143512726,
-0.1215720996260643,
-0.04076649993658066,
-0.14417867362499237,
0.008061425760388374,
0.11028092354536057,
-0.0591646246612072,
0.0713995099067688,
0.009189164265990257,
0.12908047437667847,
-0.05960676074028015,
-0.0590493381023407,
0.08329359441995621,
-0.0673397108912468,
-0.1674952507019043,
-0.006181286182254553,
0.10364381223917007,
0.022466113790869713,
0.020383285358548164,
0.01713750697672367,
0.01341979019343853,
0.03920033201575279,
-0.1374836266040802,
-0.03452932462096214,
0.1036020889878273,
0.08959729969501495,
-0.03405635058879852,
-0.13839082419872284,
-0.013193435966968536,
-0.06180903688073158,
-0.05621325597167015,
0.11099909245967865,
0.18984781205654144,
-0.015482728369534016,
0.10658213496208191,
0.08612459897994995,
-0.03517342358827591,
-0.17958758771419525,
-0.08656685799360275,
-0.030429009348154068,
-0.003900223644450307,
-0.02633645385503769,
-0.1435670554637909,
-0.008011975325644016,
0.14463354647159576,
-0.026241760700941086,
-0.02093002200126648,
-0.2032993733882904,
-0.18546079099178314,
0.09202392399311066,
0.0057496726512908936,
0.12797845900058746,
-0.04991636052727699,
-0.057202115654945374,
-0.06688625365495682,
-0.1570548266172409,
0.07961592078208923,
-0.0066597540862858295,
0.049426157027482986,
0.016766060143709183,
-0.023806758224964142,
0.017736591398715973,
-0.0465308278799057,
0.14410638809204102,
0.11573424190282822,
0.1099909320473671,
-0.05130089819431305,
-0.006760764867067337,
0.11376257240772247,
-0.028135892003774643,
0.12500068545341492,
-0.0073181455954909325,
0.022785916924476624,
-0.11825691163539886,
-0.03610506281256676,
-0.11999289691448212,
0.12613002955913544,
-0.06487662345170975,
-0.03493581339716911,
-0.014383700676262379,
0.07309085875749588,
0.08541355282068253,
0.021761154755949974,
0.13924042880535126,
-0.012685444205999374,
0.037922538816928864,
0.16417740285396576,
0.025427237153053284,
-0.03784993663430214,
-0.028739454224705696,
0.010288411751389503,
-0.03044707328081131,
0.04295893758535385,
0.07922646403312683,
0.07782796025276184,
0.055608753114938736,
-0.020337028428912163,
0.11229027807712555,
0.004634760320186615,
-0.07285072654485703,
-0.03953155502676964,
0.0689106285572052,
-0.11133437603712082,
-0.10796226561069489,
-0.020872723311185837,
0.05026792362332344,
-0.01647171378135681,
-0.030552102252840996,
0.15147803723812103,
-0.01169915497303009,
-0.011410985141992569,
0.0008859529625624418,
0.05562209337949753,
0.03173358365893364,
0.083909772336483,
-0.00626158993691206,
0.02423657476902008,
-0.11611515283584595,
0.0928477793931961,
0.08417869359254837,
-0.10655759274959564,
0.03456302359700203,
0.026443425565958023,
-0.13676194846630096,
-0.09359044581651688,
-0.009388512000441551,
0.05772486701607704,
-0.032287854701280594,
-0.02525189332664013,
-0.06668290495872498,
-0.0653189867734909,
0.013962040655314922,
0.0024023512378335,
0.002527134958654642,
0.09719767421483994,
0.022773263975977898,
-0.06947660446166992,
-0.027213403955101967,
0.04879375547170639,
-0.023062754422426224,
0.014933973550796509,
0.0070121861062943935,
0.0490826815366745,
-0.030253110453486443,
-0.02241000346839428,
0.0030246367678046227,
0.041240815073251724,
0.006060516927391291,
-0.061376024037599564,
-0.039454586803913116,
-0.025639213621616364,
-0.0717587023973465,
-0.0632844865322113,
-0.0048885042779147625,
-0.061914753168821335,
-0.0025935263838618994,
0.015546808950603008,
-0.07094193249940872,
-0.08638150244951248,
-0.07451044768095016,
0.019386358559131622,
-0.08565343171358109,
0.0063674901612102985,
0.043034881353378296,
-0.10820209234952927,
0.11674562096595764,
0.02946503460407257,
0.01581886224448681,
0.07431606948375702,
-0.08208387345075607,
-0.03838308900594711,
0.03393585979938507,
0.043902602046728134,
0.014394056983292103,
-0.098220594227314,
-0.03173907846212387,
0.02535533346235752,
-0.019648894667625427,
-0.014830009080469608,
0.05110001936554909,
-0.13931193947792053,
-0.0060347034595906734,
-0.0027461752761155367,
-0.059486135840415955,
0.01082393154501915,
0.044223327189683914,
0.044570352882146835,
-0.06583967059850693,
0.11955274641513824,
-0.06136634200811386,
-0.03031987138092518,
-0.10156789422035217,
0.013065463863313198,
0.008371704258024693,
-0.055059149861335754,
0.016255097463726997,
-0.029303645715117455,
0.04262867942452431,
-0.0030272728763520718,
0.09422682225704193,
-0.01125037670135498,
0.024526622146368027,
0.04926849901676178,
-0.001614876207895577,
-0.1018376499414444,
0.015173492953181267,
0.12841281294822693,
0.010232304222881794,
0.02060038596391678,
0.03163808211684227,
-0.06941675394773483,
0.042486656457185745,
0.057418569922447205,
0.23789532482624054,
0.15070322155952454,
0.12959790229797363,
0.0440342091023922,
0.035243939608335495,
-0.10030312091112137,
-0.11693727970123291,
0.10213157534599304,
-0.06394340097904205,
0.14732517302036285,
-0.03755909577012062,
-0.033271174877882004,
0.11016923934221268,
-0.18053802847862244,
0.11569993942975998,
0.08467146754264832,
-0.022363858297467232,
-0.06130985543131828,
-0.20349811017513275,
-0.07484464347362518,
-0.07558770477771759,
0.017441829666495323,
-0.0891626626253128,
0.009100466035306454,
0.03450467064976692,
0.026270469650626183,
-0.028249790892004967,
0.10589241236448288,
-0.05458831787109375,
-0.09378427267074585,
0.10190899670124054,
0.00856463611125946,
-0.05281714349985123,
-0.04451881721615791,
0.00248723104596138,
-0.004056496080011129,
0.10978904366493225,
0.056450121104717255,
0.08075584471225739,
0.07535623013973236,
-0.004564523696899414,
-0.030971622094511986,
-0.08112632483243942,
-0.026003792881965637,
-0.011031274683773518,
0.008695563301444054,
0.13179919123649597,
0.02844645269215107,
-0.048782482743263245,
0.040676429867744446,
0.10814139991998672,
-0.009220314212143421,
-0.11868374794721603,
-0.10547923296689987,
0.02792567014694214,
0.11631646752357483,
0.006043411325663328,
0.021186629310250282,
-0.11438222974538803,
0.06606799364089966,
0.13739073276519775,
0.15747587382793427,
0.08249496668577194,
0.027194794267416,
0.0368104986846447,
0.004768299870193005,
0.023637626320123672,
0.0136635173112154,
0.07373959571123123,
0.12379451841115952,
-0.008908011019229889,
0.010351150296628475,
0.0018506835913285613,
-0.031920213252305984,
-0.10817395895719528,
0.1251055747270584,
-0.04051881283521652,
-0.07946743071079254,
-0.03442978486418724,
0.031010733917355537,
0.0032684223260730505,
-0.13174331188201904,
0.02981705404818058,
-0.12536241114139557,
-0.08520019799470901,
-0.003708479693159461,
0.034348566085100174,
0.08574341982603073,
0.04660404473543167,
0.0009317577350884676,
-0.03324770927429199,
0.24179339408874512,
-0.027832981199026108,
-0.07743442058563232,
-0.07346908003091812,
0.047516703605651855,
-0.02919921651482582,
0.11746163666248322,
0.05044633150100708,
-0.05225273221731186,
0.09082388132810593,
-0.028164591640233994,
-0.08352326601743698,
0.12365560978651047,
0.03619587421417236,
0.02576705627143383,
0.05852245166897774,
0.11065647751092911,
-0.010995544493198395,
0.08658917248249054,
0.03995169326663017,
-0.04696645215153694,
0.03821534663438797,
0.11762753874063492,
-0.03909141942858696,
-0.022280046716332436,
0.11111516505479813,
-0.0717213898897171,
0.10848496854305267,
0.1620589643716812,
-0.030128126963973045,
0.002327479887753725,
-0.04471912980079651,
0.0525384247303009,
-0.03689837455749512,
0.05045854300260544,
-0.028240157291293144,
-0.1443939059972763,
0.10131111741065979,
-0.008457077667117119,
0.02842782437801361,
-0.1935928612947464,
-0.056456610560417175,
0.034096572548151016,
-0.04246388003230095,
-0.1056070625782013,
0.09536967426538467,
-0.022951746359467506,
-0.024855704978108406,
-0.0064163352362811565,
0.03764095902442932,
0.0413014218211174,
0.04644594341516495,
-0.05693095922470093,
-0.021975617855787277
] |
null | null | null |
# Covid19 Related Question Answering (Closed book question answering)
In 2020, COVID-19 which is caused by a coronavirus called SARS-CoV-2 took over the world. It touched the lives of many people and caused a lot of hardship for humanity. There are still many questions in regards to COVID-19 and it is often difficult to get the right answers. The aim of this project is to finetune models for closed book question answering. In closed-book QA, we feed the model a question *without any context or access to external knowledge* and train it to predict the answer. Since the model doesn't receive any context, the primary way it can learn to answer these questions is based on the "knowledge" it obtained during pre-training [[1]](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/master/notebooks/t5-trivia.ipynb#scrollTo=zSeyoqE7WMwu) [[2]](https://arxiv.org/abs/2002.08910).
The main goals of this project are:
1. Train a model for question answering in regards to COVID-19
2. Release the top performing models for further research and enhancement
3. Release all of the preprocessing and postprocessing scripts and findings for future research.
## TO DO LIST:
- [x] Team members met and the following was discussed:
- Data preparation script is prepared that mixes CORD-19 and Pubmed.
- Agreed to finalize the training scripts by 9pm PDT 7/9/2021.
- Tokenizer is now trained.
- [ ] Setup the pretraining script
- [ ] Prepare the finetuning tasks inspired from [T5 Trivia Colab](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/master/notebooks/t5-trivia.ipynb)
- What datasets we want to go with?
- [Covid-QA](https://huggingface.co/datasets/covid_qa_deepset) (Maybe as test set?)
- [Trivia](https://huggingface.co/datasets/covid_qa_deepset)
- [CDC-QA](https://www.cdc.gov/coronavirus/2019-ncov/faq.html) (We can scrape quickly using beautiful soup or something)
- [More Medical Datasets](https://aclanthology.org/2020.findings-emnlp.289.pdf) (See the dataset section for inspiratio)
## 1. Model
We will be using T5 model.
## 2. Datasets
The following datasets would be used for finetuning the model. Note that the last dataset is optional and the model is evaluated only using Covid-QA.
For **Intermediate Pre-Training**:
1. [CORD-19](https://allenai.org/data/cord-19)
For **Fine-Tuning** :
1. [Covid-QA](https://huggingface.co/datasets/covid_qa_deepset)
2. [CDC-QA](https://www.cdc.gov/coronavirus/2019-ncov/faq.html)
4. Optional - [Trivia-QA](https://nlp.cs.washington.edu/triviaqa/)
## 3. Training Scripts
We can make use of :
1. [For preprocessing and mixing datasets](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/master/notebooks/t5-trivia.ipynb#:~:text=In%20this%20notebook%2C%20we'll,it%20to%20predict%20the%20answer.)
2. [For T5 training](https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/modeling_flax_t5.py)
## 4. Additional Reading
- [How Much Knowledge Can You Pack Into the Parameters of a Language Model?](https://arxiv.org/pdf/2002.08910.pdf)
|
{}
| null |
flax-community/t5-covid-qa
|
[
"arxiv:2002.08910",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2002.08910"
] |
[] |
TAGS
#arxiv-2002.08910 #region-us
|
# Covid19 Related Question Answering (Closed book question answering)
In 2020, COVID-19 which is caused by a coronavirus called SARS-CoV-2 took over the world. It touched the lives of many people and caused a lot of hardship for humanity. There are still many questions in regards to COVID-19 and it is often difficult to get the right answers. The aim of this project is to finetune models for closed book question answering. In closed-book QA, we feed the model a question *without any context or access to external knowledge* and train it to predict the answer. Since the model doesn't receive any context, the primary way it can learn to answer these questions is based on the "knowledge" it obtained during pre-training [[1]](URL [[2]](URL
The main goals of this project are:
1. Train a model for question answering in regards to COVID-19
2. Release the top performing models for further research and enhancement
3. Release all of the preprocessing and postprocessing scripts and findings for future research.
## TO DO LIST:
- [x] Team members met and the following was discussed:
- Data preparation script is prepared that mixes CORD-19 and Pubmed.
- Agreed to finalize the training scripts by 9pm PDT 7/9/2021.
- Tokenizer is now trained.
- [ ] Setup the pretraining script
- [ ] Prepare the finetuning tasks inspired from T5 Trivia Colab
- What datasets we want to go with?
- Covid-QA (Maybe as test set?)
- Trivia
- CDC-QA (We can scrape quickly using beautiful soup or something)
- More Medical Datasets (See the dataset section for inspiratio)
## 1. Model
We will be using T5 model.
## 2. Datasets
The following datasets would be used for finetuning the model. Note that the last dataset is optional and the model is evaluated only using Covid-QA.
For Intermediate Pre-Training:
1. CORD-19
For Fine-Tuning :
1. Covid-QA
2. CDC-QA
4. Optional - Trivia-QA
## 3. Training Scripts
We can make use of :
1. For preprocessing and mixing datasets
2. For T5 training
## 4. Additional Reading
- How Much Knowledge Can You Pack Into the Parameters of a Language Model?
|
[
"# Covid19 Related Question Answering (Closed book question answering)\n\nIn 2020, COVID-19 which is caused by a coronavirus called SARS-CoV-2 took over the world. It touched the lives of many people and caused a lot of hardship for humanity. There are still many questions in regards to COVID-19 and it is often difficult to get the right answers. The aim of this project is to finetune models for closed book question answering. In closed-book QA, we feed the model a question *without any context or access to external knowledge* and train it to predict the answer. Since the model doesn't receive any context, the primary way it can learn to answer these questions is based on the \"knowledge\" it obtained during pre-training [[1]](URL [[2]](URL\n\nThe main goals of this project are:\n\n1. Train a model for question answering in regards to COVID-19\n2. Release the top performing models for further research and enhancement\n3. Release all of the preprocessing and postprocessing scripts and findings for future research.",
"## TO DO LIST:\n- [x] Team members met and the following was discussed:\n\t- Data preparation script is prepared that mixes CORD-19 and Pubmed.\n\t- Agreed to finalize the training scripts by 9pm PDT 7/9/2021.\n\t- Tokenizer is now trained.\n- [ ] Setup the pretraining script\n- [ ] Prepare the finetuning tasks inspired from T5 Trivia Colab\n\t- What datasets we want to go with?\n\t\t- Covid-QA (Maybe as test set?)\n\t\t- Trivia\n\t\t- CDC-QA (We can scrape quickly using beautiful soup or something)\n\t\t- More Medical Datasets (See the dataset section for inspiratio)",
"## 1. Model\n\nWe will be using T5 model.",
"## 2. Datasets\n\nThe following datasets would be used for finetuning the model. Note that the last dataset is optional and the model is evaluated only using Covid-QA.\n\nFor Intermediate Pre-Training:\n1. CORD-19\n\n\nFor Fine-Tuning :\n\n1. Covid-QA\n2. CDC-QA\n4. Optional - Trivia-QA",
"## 3. Training Scripts\n\nWe can make use of : \n\n1. For preprocessing and mixing datasets \n2. For T5 training",
"## 4. Additional Reading\n\n- How Much Knowledge Can You Pack Into the Parameters of a Language Model?"
] |
[
"TAGS\n#arxiv-2002.08910 #region-us \n",
"# Covid19 Related Question Answering (Closed book question answering)\n\nIn 2020, COVID-19 which is caused by a coronavirus called SARS-CoV-2 took over the world. It touched the lives of many people and caused a lot of hardship for humanity. There are still many questions in regards to COVID-19 and it is often difficult to get the right answers. The aim of this project is to finetune models for closed book question answering. In closed-book QA, we feed the model a question *without any context or access to external knowledge* and train it to predict the answer. Since the model doesn't receive any context, the primary way it can learn to answer these questions is based on the \"knowledge\" it obtained during pre-training [[1]](URL [[2]](URL\n\nThe main goals of this project are:\n\n1. Train a model for question answering in regards to COVID-19\n2. Release the top performing models for further research and enhancement\n3. Release all of the preprocessing and postprocessing scripts and findings for future research.",
"## TO DO LIST:\n- [x] Team members met and the following was discussed:\n\t- Data preparation script is prepared that mixes CORD-19 and Pubmed.\n\t- Agreed to finalize the training scripts by 9pm PDT 7/9/2021.\n\t- Tokenizer is now trained.\n- [ ] Setup the pretraining script\n- [ ] Prepare the finetuning tasks inspired from T5 Trivia Colab\n\t- What datasets we want to go with?\n\t\t- Covid-QA (Maybe as test set?)\n\t\t- Trivia\n\t\t- CDC-QA (We can scrape quickly using beautiful soup or something)\n\t\t- More Medical Datasets (See the dataset section for inspiratio)",
"## 1. Model\n\nWe will be using T5 model.",
"## 2. Datasets\n\nThe following datasets would be used for finetuning the model. Note that the last dataset is optional and the model is evaluated only using Covid-QA.\n\nFor Intermediate Pre-Training:\n1. CORD-19\n\n\nFor Fine-Tuning :\n\n1. Covid-QA\n2. CDC-QA\n4. Optional - Trivia-QA",
"## 3. Training Scripts\n\nWe can make use of : \n\n1. For preprocessing and mixing datasets \n2. For T5 training",
"## 4. Additional Reading\n\n- How Much Knowledge Can You Pack Into the Parameters of a Language Model?"
] |
[
14,
237,
153,
11,
79,
27,
24
] |
[
"passage: TAGS\n#arxiv-2002.08910 #region-us \n# Covid19 Related Question Answering (Closed book question answering)\n\nIn 2020, COVID-19 which is caused by a coronavirus called SARS-CoV-2 took over the world. It touched the lives of many people and caused a lot of hardship for humanity. There are still many questions in regards to COVID-19 and it is often difficult to get the right answers. The aim of this project is to finetune models for closed book question answering. In closed-book QA, we feed the model a question *without any context or access to external knowledge* and train it to predict the answer. Since the model doesn't receive any context, the primary way it can learn to answer these questions is based on the \"knowledge\" it obtained during pre-training [[1]](URL [[2]](URL\n\nThe main goals of this project are:\n\n1. Train a model for question answering in regards to COVID-19\n2. Release the top performing models for further research and enhancement\n3. Release all of the preprocessing and postprocessing scripts and findings for future research.## TO DO LIST:\n- [x] Team members met and the following was discussed:\n\t- Data preparation script is prepared that mixes CORD-19 and Pubmed.\n\t- Agreed to finalize the training scripts by 9pm PDT 7/9/2021.\n\t- Tokenizer is now trained.\n- [ ] Setup the pretraining script\n- [ ] Prepare the finetuning tasks inspired from T5 Trivia Colab\n\t- What datasets we want to go with?\n\t\t- Covid-QA (Maybe as test set?)\n\t\t- Trivia\n\t\t- CDC-QA (We can scrape quickly using beautiful soup or something)\n\t\t- More Medical Datasets (See the dataset section for inspiratio)## 1. Model\n\nWe will be using T5 model.## 2. Datasets\n\nThe following datasets would be used for finetuning the model. Note that the last dataset is optional and the model is evaluated only using Covid-QA.\n\nFor Intermediate Pre-Training:\n1. CORD-19\n\n\nFor Fine-Tuning :\n\n1. Covid-QA\n2. CDC-QA\n4. Optional - Trivia-QA"
] |
[
-0.09639745950698853,
0.031074710190296173,
-0.003654302330687642,
-0.01797298714518547,
0.06502848118543625,
0.06723436713218689,
0.03183657303452492,
0.11565753817558289,
-0.0015627547400072217,
0.141109898686409,
0.02989324741065502,
-0.0396944060921669,
0.10859262943267822,
0.18182457983493805,
0.1152510792016983,
-0.16323083639144897,
0.05456741154193878,
-0.13515838980674744,
-0.01171999704092741,
0.08639176189899445,
0.08668863773345947,
-0.04877995327115059,
0.03187616914510727,
-0.02181701734662056,
-0.05980883911252022,
-0.06430970132350922,
-0.04423876106739044,
0.00449766730889678,
0.0786924660205841,
0.046883221715688705,
0.05726075917482376,
0.048895303159952164,
0.08981003612279892,
-0.22102907299995422,
0.029357770457863808,
0.047850221395492554,
0.011802627705037594,
0.05826844647526741,
0.06384391337633133,
0.05733989551663399,
0.07170826196670532,
-0.06332297623157501,
0.05938902124762535,
0.045470867305994034,
-0.103873610496521,
-0.10510550439357758,
-0.16301898658275604,
0.03238936886191368,
0.08589738607406616,
0.09354782849550247,
-0.04538754001259804,
0.07412856817245483,
-0.006046896800398827,
0.0033784559927880764,
0.04197118058800697,
-0.29720091819763184,
-0.041016675531864166,
0.056523390114307404,
0.10531051456928253,
0.06781703233718872,
-0.02824520319700241,
-0.030246376991271973,
-0.019949210807681084,
-0.020432839170098305,
-0.03370434790849686,
0.021983597427606583,
0.04662071168422699,
-0.09893511235713959,
-0.15625408291816711,
-0.051943399012088776,
0.15990732610225677,
0.011494878679513931,
-0.104889877140522,
-0.11341461539268494,
-0.028133731335401535,
-0.021176433190703392,
0.009098496288061142,
-0.10438281297683716,
0.02559696137905121,
0.004288197495043278,
0.11691296100616455,
-0.05756887421011925,
-0.1427602618932724,
-0.035472068935632706,
-0.017368894070386887,
0.06253562122583389,
0.0362464040517807,
0.01370217278599739,
0.04446252062916756,
0.0730566754937172,
-0.0734204649925232,
-0.06759163737297058,
-0.08287148922681808,
-0.04565711319446564,
-0.178559809923172,
-0.0967913344502449,
-0.013224830850958824,
-0.07852344959974289,
0.00016202594269998372,
0.20420420169830322,
-0.04701625183224678,
0.02941896766424179,
-0.00874221883714199,
0.009336099028587341,
0.07321681827306747,
0.10335320234298706,
-0.09598066657781601,
0.044635411351919174,
0.026034818962216377,
0.021967682987451553,
0.054639529436826706,
0.001611219486221671,
-0.026267411187291145,
0.038914211094379425,
0.014772300608456135,
0.10797668248414993,
0.06995537877082825,
-0.019534355029463768,
-0.02244959957897663,
-0.029183849692344666,
0.2178410142660141,
-0.09879540652036667,
-0.03571007400751114,
0.02761397324502468,
0.03761209920048714,
0.000693057372700423,
0.05814813822507858,
0.04107610881328583,
-0.02870054915547371,
0.03761592134833336,
-0.01531688217073679,
-0.05007461830973625,
-0.03861980512738228,
-0.02503477782011032,
0.07477312535047531,
-0.013321981765329838,
-0.07828466594219208,
-0.08906617015600204,
-0.013116116635501385,
-0.09144242852926254,
0.02910323441028595,
-0.07494226098060608,
-0.0488506555557251,
0.013555778190493584,
-0.00518307788297534,
0.03726855665445328,
-0.011878976598381996,
0.10795226693153381,
-0.03938305750489235,
0.010926353745162487,
-0.0704813078045845,
0.03234495595097542,
0.04660341888666153,
0.008764874190092087,
-0.027309440076351166,
0.03473101556301117,
-0.13644400238990784,
0.044160161167383194,
-0.011447657831013203,
0.004393415991216898,
-0.15741509199142456,
0.00807894952595234,
-0.07226375490427017,
-0.016581941395998,
0.03418533876538277,
0.12591207027435303,
-0.22227764129638672,
0.04832115396857262,
0.188670352101326,
-0.09610921889543533,
-0.09076384454965591,
0.08353740721940994,
0.021882863715291023,
0.07533631473779678,
0.001601732219569385,
0.016707047820091248,
0.04671167954802513,
-0.1764773279428482,
-0.05621637776494026,
-0.02906596101820469,
-0.028504015877842903,
0.1119411438703537,
0.10240153223276138,
-0.04963316023349762,
0.02244669944047928,
-0.011919855140149593,
-0.11952421814203262,
-0.024434318765997887,
-0.03801054134964943,
-0.05098064988851547,
-0.006618069019168615,
-0.038944799453020096,
-0.07814750075340271,
-0.004351341165602207,
-0.026396460831165314,
0.01097994577139616,
-0.14462710916996002,
-0.14520089328289032,
0.09965162724256516,
-0.007584026083350182,
0.06335774809122086,
-0.12784096598625183,
0.06354109197854996,
-0.03574509918689728,
0.0222039595246315,
-0.177055224776268,
-0.12312757223844528,
0.0800844132900238,
0.03906933590769768,
0.08356335759162903,
0.005643636919558048,
0.009645882062613964,
0.059852536767721176,
-0.0035910026635974646,
-0.010408148169517517,
0.032752349972724915,
-0.04844149947166443,
-0.08895304799079895,
-0.1748320609331131,
-0.04900900647044182,
-0.08320695161819458,
0.266399085521698,
-0.061540234833955765,
-0.0386996865272522,
-0.030664557591080666,
0.0702190026640892,
-0.018626729026436806,
-0.06374192982912064,
0.047053221613168716,
-0.006297578569501638,
-0.016302084550261497,
-0.03164529800415039,
0.041215650737285614,
0.006065721623599529,
-0.09818818420171738,
0.023945750668644905,
-0.15994274616241455,
-0.2127857804298401,
0.024435436353087425,
0.07286276668310165,
-0.09211926907300949,
-0.06293094158172607,
-0.04963614046573639,
0.0019095417810603976,
-0.03152233362197876,
-0.023949047550559044,
0.15242260694503784,
0.04931259900331497,
0.040969062596559525,
-0.05822509899735451,
-0.038647450506687164,
0.004772594664245844,
0.021575165912508965,
-0.06791263818740845,
0.024154316633939743,
0.10724631696939468,
-0.13769860565662384,
-0.012488789856433868,
0.07106786966323853,
0.11751023679971695,
0.0561039075255394,
0.0037784622982144356,
-0.11547508090734482,
-0.034796327352523804,
0.06132137402892113,
0.047622956335544586,
0.026711851358413696,
0.028407160192728043,
0.05373280495405197,
0.05868321657180786,
-0.001015758141875267,
0.008600072003901005,
-0.06970738619565964,
0.034609660506248474,
-0.005662796553224325,
-0.019055200740695,
0.046490345150232315,
0.0034112941939383745,
-0.04432642459869385,
0.11598814278841019,
0.10327380150556564,
0.07049630582332611,
-0.05284544825553894,
-0.06524169445037842,
-0.15639524161815643,
0.15577365458011627,
-0.11756707727909088,
-0.21534636616706848,
-0.09962999075651169,
0.05290551483631134,
0.04636353999376297,
0.01362601201981306,
0.017331669107079506,
-0.09889212250709534,
-0.08313952386379242,
-0.10984392464160919,
-0.021864820271730423,
0.039679475128650665,
-0.02843657322227955,
-0.05067843571305275,
-0.02986610122025013,
0.03961431235074997,
-0.08722303807735443,
0.03755531832575798,
-0.035521067678928375,
-0.14701271057128906,
0.02452341839671135,
0.012232714332640171,
0.07666479051113129,
0.14303089678287506,
0.0916242003440857,
-0.047313448041677475,
-0.03082672320306301,
0.1868888884782791,
-0.139983668923378,
0.0701235681772232,
0.1398458629846573,
-0.051896389573812485,
0.022830909118056297,
0.13645274937152863,
-0.02600781060755253,
-0.0705282911658287,
0.021379675716161728,
0.0006772498600184917,
-0.04804951697587967,
-0.2646017074584961,
-0.05686427280306816,
-0.05270770937204361,
-0.09201345592737198,
0.03397193178534508,
0.06230510026216507,
-0.01937263458967209,
0.02061588503420353,
-0.11901944130659103,
-0.10332844406366348,
0.005356208886951208,
0.07012330740690231,
-0.016574984416365623,
-0.001754005323164165,
0.06835267692804337,
-0.04044118896126747,
0.06388284265995026,
0.11812786757946014,
0.06208076328039169,
0.1992885172367096,
0.06032174453139305,
0.2027525156736374,
0.08805961161851883,
0.10724662989377975,
0.004912985488772392,
-0.00509894173592329,
-0.0018258999334648252,
0.03683802857995033,
-0.014333694241940975,
-0.07592321187257767,
-0.049610964953899384,
0.08631273359060287,
0.009379221126437187,
-0.0899638831615448,
-0.04578385874629021,
0.0501713901758194,
0.009630455635488033,
0.22760488092899323,
0.03956208750605583,
-0.08000493794679642,
-0.05003959313035011,
0.0525486059486866,
-0.05006589740514755,
-0.06674996018409729,
-0.026511486619710922,
0.014547545462846756,
-0.11159646511077881,
0.07318633049726486,
-0.02466137520968914,
0.0619746558368206,
-0.02806045487523079,
-0.04088328033685684,
0.037219200283288956,
-0.027026649564504623,
0.015491596423089504,
0.06674811244010925,
-0.046022139489650726,
0.11488968133926392,
0.0061676036566495895,
0.06350245326757431,
-0.0362548828125,
0.041824184358119965,
-0.054456766694784164,
0.0920862928032875,
0.18193036317825317,
0.00461792666465044,
-0.039940185844898224,
0.012416256591677666,
-0.12667830288410187,
0.048509370535612106,
0.09780381619930267,
-0.10470183938741684,
0.1156691461801529,
-0.017139747738838196,
-0.006081860978156328,
-0.035527996718883514,
0.023029694333672523,
-0.1546422392129898,
-0.15582899749279022,
0.13782231509685516,
-0.1603677123785019,
0.04596780985593796,
-0.06060054525732994,
-0.014018665999174118,
-0.11719392985105515,
0.1817711889743805,
-0.18717451393604279,
-0.028714124113321304,
-0.13495087623596191,
0.06122869998216629,
0.13100337982177734,
-0.049548644572496414,
0.04348528012633324,
-0.04062139242887497,
0.14731620252132416,
-0.030135784298181534,
-0.0908578634262085,
0.0164559967815876,
-0.08711754530668259,
-0.22329413890838623,
-0.06873150169849396,
0.21337975561618805,
0.07004231214523315,
0.06491492688655853,
0.026533789932727814,
0.0626242384314537,
0.014720974490046501,
-0.05257067829370499,
0.06197672337293625,
0.16220784187316895,
0.08292144536972046,
-0.05488070100545883,
-0.06751994043588638,
0.023163441568613052,
-0.08355652540922165,
-0.0666194036602974,
0.11576125025749207,
0.23571911454200745,
-0.07661829888820648,
0.16371285915374756,
0.12326172739267349,
-0.09601335972547531,
-0.18030884861946106,
-0.03732309862971306,
0.12007597833871841,
0.03171592205762863,
-0.04981420561671257,
-0.19732461869716644,
0.0710906982421875,
0.0878005400300026,
-0.034356724470853806,
-0.005172633565962315,
-0.2080182582139969,
-0.1502903401851654,
0.020255470648407936,
-0.0032779062166810036,
-0.03383878245949745,
-0.16290879249572754,
-0.04531583562493324,
-0.006749095395207405,
-0.05870102718472481,
0.11752502620220184,
-0.04023561626672745,
0.07378579676151276,
-0.0011037287767976522,
-0.044867441058158875,
0.03782094269990921,
-0.031587179750204086,
0.10816086083650589,
-0.011718479916453362,
-0.01621679589152336,
-0.04957471415400505,
0.03379151597619057,
-0.028730541467666626,
-0.05523870140314102,
0.07600972801446915,
0.1046362966299057,
0.009167593903839588,
-0.08111690729856491,
-0.04313461482524872,
-0.036992501467466354,
-0.024166187271475792,
-0.062300801277160645,
-0.03164251148700714,
-0.0791768953204155,
0.08917529135942459,
0.04728243499994278,
-0.02386033721268177,
0.0031496998853981495,
-0.09438253194093704,
0.03219323232769966,
0.0734964981675148,
0.2367590218782425,
0.036404866725206375,
0.0034655972849577665,
0.05669785663485527,
-0.037948623299598694,
0.03064059279859066,
-0.11663728207349777,
0.017659271135926247,
0.08767742663621902,
-0.02745719812810421,
0.08253362029790878,
-0.015487094409763813,
-0.14899475872516632,
-0.02348898909986019,
0.09113753587007523,
-0.08221159875392914,
-0.2687959671020508,
0.029748957604169846,
0.14175651967525482,
-0.15516936779022217,
-0.04248302802443504,
0.14103887975215912,
0.013813490979373455,
-0.05925363302230835,
0.002553770085796714,
0.09252777695655823,
0.026754317805171013,
0.06135312467813492,
-0.000028810101866838522,
0.02132924273610115,
-0.05256808176636696,
0.08503448218107224,
0.11330034583806992,
-0.0795598104596138,
0.015476219356060028,
0.12640096247196198,
-0.0974658951163292,
-0.07288188487291336,
-0.024564430117607117,
0.0306384339928627,
-0.0481221079826355,
-0.015234358608722687,
0.09720653295516968,
-0.08675297349691391,
0.030406635254621506,
0.10263586044311523,
-0.025356799364089966,
0.0016806190833449364,
0.025297343730926514,
0.03402402624487877,
-0.05313105136156082,
0.05520936846733093,
-0.0616392120718956,
0.029126839712262154,
0.008203042671084404,
0.08622955530881882,
-0.0029505884740501642,
0.013772591948509216,
-0.009451089426875114,
-0.042296525090932846,
-0.0348416268825531,
-0.026824794709682465,
-0.06258822977542877,
-0.01837659627199173,
-0.010645882226526737,
-0.036064110696315765,
0.004444011487066746,
-0.022251470014452934,
0.01924203522503376,
-0.011657391674816608,
-0.012587305158376694,
-0.04905448481440544,
-0.10936455428600311,
0.03310571238398552,
-0.10963225364685059,
0.016749804839491844,
0.02010272443294525,
-0.07270108163356781,
0.1351228803396225,
0.0081170117482543,
0.05841056630015373,
0.005931125488132238,
-0.16143329441547394,
0.040990717709064484,
0.002713853493332863,
0.055832140147686005,
0.0037269273307174444,
-0.16549503803253174,
-0.006354690063744783,
-0.030171450227499008,
-0.1154002696275711,
0.007721132133156061,
0.08272484689950943,
-0.05919186770915985,
0.05135994777083397,
0.03482314944267273,
0.002359089208766818,
-0.07533074170351028,
0.015680480748414993,
0.03226648271083832,
0.06223390996456146,
0.0694078728556633,
-0.021551216021180153,
0.04720662161707878,
-0.14182230830192566,
-0.024929268285632133,
0.002503321273252368,
0.06938035786151886,
-0.05255986750125885,
-0.06251125782728195,
0.07051826268434525,
0.010639593936502934,
0.048378705978393555,
-0.09844468533992767,
0.11008288711309433,
0.05476081743836403,
-0.0007794284028932452,
0.0256950706243515,
0.04444395750761032,
-0.047570303082466125,
0.08664779365062714,
0.02014055848121643,
0.0033620677422732115,
-0.010759275406599045,
-0.04306379705667496,
-0.08819831162691116,
0.1694355458021164,
0.06461749225854874,
0.1543320268392563,
0.041519422084093094,
0.010035574436187744,
-0.1706477254629135,
-0.0002053140924545005,
0.021629028022289276,
-0.08452152460813522,
0.01734202541410923,
-0.03314393013715744,
0.039931610226631165,
0.13380901515483856,
-0.14686588943004608,
0.02455735392868519,
-0.08792608976364136,
-0.04012703150510788,
-0.06537222862243652,
-0.0891653448343277,
-0.03710956871509552,
-0.007950020022690296,
-0.00846698135137558,
-0.09671083092689514,
0.02996533177793026,
-0.02705003321170807,
0.02148762159049511,
0.012571060098707676,
0.09009116888046265,
-0.05902846157550812,
-0.03619559854269028,
0.006215913221240044,
0.03830917924642563,
0.09202539175748825,
0.019904758781194687,
0.045174725353717804,
0.042112112045288086,
0.029036561027169228,
0.07768110930919647,
0.0922168716788292,
0.08059151470661163,
0.08424505591392517,
-0.02028251439332962,
-0.08955468982458115,
0.026093758642673492,
0.012507929466664791,
0.020537350326776505,
0.1867934614419937,
0.03535934165120125,
0.045114919543266296,
0.023865435272455215,
0.115727998316288,
0.022220123559236526,
0.002507078228518367,
-0.14690683782100677,
0.058675214648246765,
-0.03677980974316597,
0.0023851143196225166,
0.0068894498981535435,
-0.09190087020397186,
0.016004415228962898,
0.20803973078727722,
0.10219169408082962,
-0.05565880611538887,
-0.021895509213209152,
0.049913715571165085,
0.006231190636754036,
0.011617839336395264,
0.12392069399356842,
0.08296307176351547,
0.23826919496059418,
-0.08967714011669159,
0.058231987059116364,
0.007709866389632225,
0.0005139143322594464,
0.007933246903121471,
0.12409862130880356,
-0.03352174162864685,
0.0473557747900486,
-0.017200034111738205,
0.07000462710857391,
-0.011728132143616676,
-0.2381414771080017,
0.008038646541535854,
-0.012971417978405952,
-0.12170656025409698,
-0.019259219989180565,
-0.01185512077063322,
-0.06068877875804901,
0.033385735005140305,
0.007630200125277042,
-0.0699460431933403,
0.19620491564273834,
0.0159595999866724,
-0.03195951133966446,
-0.05376281216740608,
0.1021944060921669,
-0.03669893741607666,
0.2034796178340912,
0.023535920307040215,
0.11091568320989609,
0.04214023798704147,
0.008005179464817047,
-0.16740423440933228,
0.0021318483632057905,
0.05371294170618057,
-0.07576226443052292,
0.07483623176813126,
0.14399975538253784,
0.01277354545891285,
0.07950159162282944,
0.1091962456703186,
-0.06341998279094696,
0.04886777326464653,
-0.03891769424080849,
-0.006897923536598682,
-0.08284942060709,
0.13672035932540894,
-0.12386024743318558,
0.17433694005012512,
0.12441850453615189,
-0.03884283825755119,
0.0009284138213843107,
-0.05100547894835472,
0.0027164933271706104,
-0.0017097815871238708,
0.17829160392284393,
0.01884533278644085,
-0.18989600241184235,
0.06579122692346573,
-0.042923200875520706,
0.06310124695301056,
-0.1354820430278778,
-0.098996601998806,
0.04061923176050186,
0.009019529446959496,
-0.02074531838297844,
0.12428522855043411,
-0.027273062616586685,
-0.009261884726583958,
-0.062091533094644547,
0.014591348357498646,
0.039692047983407974,
0.06805263459682465,
-0.120806485414505,
-0.04884306713938713
] |
null | null |
transformers
|
# T5 model for sentence splitting in English
Sentence Split is the task of dividing a long sentence into multiple sentences.
E.g.:
```
Mary likes to play football in her freetime whenever she meets with her friends that are very nice people.
```
could be split into
```
Mary likes to play football in her freetime whenever she meets with her friends.
```
```
Her friends are very nice people.
```
## How to use it in your code:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("flax-community/t5-large-wikisplit")
model = AutoModelForSeq2SeqLM.from_pretrained("flax-community/t5-large-wikisplit")
complex_sentence = "This comedy drama is produced by Tidy , the company she co-founded in 2008 with her husband David Peet , who is managing director ."
sample_tokenized = tokenizer(complex_sentence, return_tensors="pt")
answer = model.generate(sample_tokenized['input_ids'], attention_mask = sample_tokenized['attention_mask'], max_length=256, num_beams=5)
gene_sentence = tokenizer.decode(answer[0], skip_special_tokens=True)
gene_sentence
"""
Output:
This comedy drama is produced by Tidy. She co-founded Tidy in 2008 with her husband David Peet, who is managing director.
"""
```
## Datasets:
[Wiki_Split](https://research.google/tools/datasets/wiki-split/)
## Current Basline from [paper](https://arxiv.org/abs/1907.12461)

## Our Results:
| Model | Exact | SARI | BLEU |
| --- | --- | --- | --- |
| [t5-base-wikisplit](https://huggingface.co/flax-community/t5-base-wikisplit) | 17.93 | 67.5438 | 76.9 |
| [t5-v1_1-base-wikisplit](https://huggingface.co/flax-community/t5-v1_1-base-wikisplit) | 18.1207 | 67.4873 | 76.9478 |
| [byt5-base-wikisplit](https://huggingface.co/flax-community/byt5-base-wikisplit) | 11.3582 | 67.2685 | 73.1682 |
| [t5-large-wikisplit](https://huggingface.co/flax-community/t5-large-wikisplit) | 18.6632 | 68.0501 | 77.1881 |
|
{"datasets": ["wiki_split"], "widget": [{"text": "Mary likes to play football in her freetime whenever she meets with her friends that are very nice people."}]}
|
text2text-generation
|
flax-community/t5-large-wikisplit
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"t5",
"text2text-generation",
"dataset:wiki_split",
"arxiv:1907.12461",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.12461"
] |
[] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
T5 model for sentence splitting in English
==========================================
Sentence Split is the task of dividing a long sentence into multiple sentences.
E.g.:
could be split into
How to use it in your code:
---------------------------
Datasets:
---------
Wiki\_Split
Current Basline from paper
--------------------------
!baseline
Our Results:
------------
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] |
[
78
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #t5 #text2text-generation #dataset-wiki_split #arxiv-1907.12461 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.054091524332761765,
0.0542466975748539,
-0.003917199559509754,
0.07314296811819077,
0.12152725458145142,
0.020550377666950226,
0.10853024572134018,
0.1559319645166397,
-0.03173980861902237,
0.012865097261965275,
0.1506926566362381,
0.2050236463546753,
0.021642426028847694,
0.07378768920898438,
-0.09869818389415741,
-0.2516929507255554,
0.007596876006573439,
0.08262121677398682,
-0.05127881094813347,
0.1142696738243103,
0.09031347930431366,
-0.09197524935007095,
0.09444545954465866,
-0.023146994411945343,
-0.2121983766555786,
0.04042460396885872,
0.062147729098796844,
-0.12261903285980225,
0.12529130280017853,
0.08412886410951614,
0.12839040160179138,
0.04792720451951027,
-0.04130849614739418,
-0.06437116116285324,
0.04620690271258354,
0.06873558461666107,
-0.06707559525966644,
0.11171973496675491,
0.09338714927434921,
-0.055515091866254807,
0.062428515404462814,
0.02182653360068798,
-0.009489152580499649,
0.03736306354403496,
-0.13244841992855072,
-0.049224477261304855,
-0.05636393651366234,
0.04603753611445427,
0.02480015531182289,
0.08324939012527466,
0.006005041301250458,
0.15026472508907318,
-0.04251919314265251,
0.13116170465946198,
0.16857308149337769,
-0.33315810561180115,
-0.031173473224043846,
0.05863296240568161,
0.06149054691195488,
0.053166646510362625,
-0.05672023445367813,
0.05018463358283043,
0.0478191152215004,
0.03361127898097038,
0.07257293164730072,
-0.06634916365146637,
-0.23975525796413422,
0.06060967221856117,
-0.09941611438989639,
-0.004805934149771929,
0.2415175586938858,
-0.04065817594528198,
0.0808558538556099,
-0.021017998456954956,
-0.13321453332901,
-0.09684083610773087,
0.011794619262218475,
-0.023575885221362114,
-0.046481192111968994,
0.024218419566750526,
-0.0011176153784617782,
-0.07092445343732834,
-0.14368489384651184,
0.015125391073524952,
-0.21351732313632965,
0.11100820451974869,
-0.01074826531112194,
0.06038156524300575,
-0.2003597617149353,
0.07618185132741928,
0.02780172973871231,
-0.1315949261188507,
0.07396715879440308,
-0.06035524606704712,
-0.017688099294900894,
-0.021825416013598442,
-0.04894294962286949,
-0.20355544984340668,
0.037855133414268494,
0.05551072955131531,
-0.009847280569374561,
0.014865336939692497,
-0.06692354381084442,
0.08494123816490173,
0.028828153386712074,
0.0696452409029007,
-0.09890166670084,
-0.029596345499157906,
0.04729402810335159,
-0.0706622451543808,
-0.003954171668738127,
-0.08072460442781448,
-0.15297754108905792,
-0.0711742639541626,
0.0691080316901207,
0.05250438302755356,
0.05499910190701485,
0.0927906408905983,
-0.029643448069691658,
-0.04352520406246185,
0.02981484867632389,
-0.08237192034721375,
0.028231412172317505,
-0.013645087368786335,
-0.004464960657060146,
0.1051597148180008,
0.03829680383205414,
0.00719510717317462,
-0.1012299433350563,
0.056216854602098465,
-0.10645037144422531,
-0.002641826868057251,
-0.033779844641685486,
-0.12765537202358246,
0.06086770072579384,
-0.12501512467861176,
0.009416323155164719,
-0.15577617287635803,
-0.10178450495004654,
0.017330821603536606,
0.020625721663236618,
-0.04939204826951027,
-0.04253098741173744,
0.017453914508223534,
-0.05908932164311409,
0.08964676409959793,
-0.041779085993766785,
0.06623664498329163,
-0.03520813584327698,
0.0719374492764473,
-0.07711956650018692,
0.09267158806324005,
-0.12970465421676636,
0.05734909325838089,
-0.06759779155254364,
-0.010214134119451046,
-0.09488590061664581,
0.033024027943611145,
-0.004749231040477753,
0.09074202179908752,
-0.057653799653053284,
-0.0032892960589379072,
-0.10353036224842072,
0.019485142081975937,
0.02690202184021473,
0.14879123866558075,
-0.1908351480960846,
-0.060565464198589325,
0.16037528216838837,
-0.05970688536763191,
-0.15510524809360504,
0.10852904617786407,
-0.01632392592728138,
0.031672339886426926,
0.034380942583084106,
0.20729683339595795,
0.059842534363269806,
-0.04831957817077637,
0.01946238800883293,
0.11166657507419586,
-0.07910478860139847,
-0.0904320627450943,
0.012638706713914871,
0.016280798241496086,
-0.02936547063291073,
0.02400064654648304,
0.13564303517341614,
0.07409612834453583,
-0.05034414306282997,
-0.03622829169034958,
-0.062479641288518906,
-0.03599068894982338,
0.09270497411489487,
0.029255492612719536,
0.12876936793327332,
-0.055209893733263016,
-0.035966191440820694,
0.03360329568386078,
0.004460329655557871,
-0.010913772508502007,
0.043235018849372864,
-0.034205760806798935,
0.12969772517681122,
-0.09935666620731354,
0.020564135164022446,
-0.20101788640022278,
-0.10367655009031296,
-0.020972713828086853,
0.1716228872537613,
-0.005040242802351713,
0.13450407981872559,
0.07394368201494217,
-0.038283102214336395,
-0.02299177274107933,
0.006554566323757172,
0.15790924429893494,
0.024213453754782677,
-0.1278318166732788,
-0.12830302119255066,
0.04678022116422653,
-0.08242470026016235,
-0.04543207213282585,
-0.11513121426105499,
0.018938392400741577,
0.055258139967918396,
0.14903278648853302,
0.03611363470554352,
0.05461227148771286,
0.004956717602908611,
0.010877309367060661,
-0.11403125524520874,
-0.0106528140604496,
0.06520667672157288,
0.002186574274674058,
-0.03385019674897194,
0.2225428968667984,
-0.15560142695903778,
0.264072060585022,
0.18908952176570892,
-0.21588264405727386,
-0.024519402533769608,
-0.012474254705011845,
-0.0268000066280365,
-0.007076207548379898,
0.027341898530721664,
-0.037182025611400604,
0.03366916626691818,
-0.02463691495358944,
0.17605248093605042,
-0.06185947358608246,
-0.060714609920978546,
0.0233097355812788,
-0.029996545985341072,
-0.052237533032894135,
0.07901527732610703,
0.0415564700961113,
-0.20885153114795685,
0.17956528067588806,
0.20831261575222015,
-0.00019354945106897503,
0.19416001439094543,
0.008053299970924854,
-0.03996630012989044,
0.01574731059372425,
-0.032629311084747314,
-0.01988411881029606,
0.00289511657319963,
-0.15147297084331512,
-0.01484331302344799,
0.0821484699845314,
0.014512495137751102,
0.05921853333711624,
-0.11798300594091415,
-0.030493957921862602,
0.009494836442172527,
0.03082062304019928,
0.021234553307294846,
0.11120659112930298,
0.05274521932005882,
0.15295445919036865,
-0.015846576541662216,
-0.025754794478416443,
0.08419202268123627,
0.029406173154711723,
-0.09178223460912704,
0.18305206298828125,
-0.13842575252056122,
-0.2950862646102905,
-0.11126967519521713,
-0.1324317306280136,
-0.04175080731511116,
0.005185818765312433,
0.07173512130975723,
-0.0815841406583786,
-0.016895247623324394,
-0.03184857591986656,
0.0235027763992548,
-0.10304247587919235,
0.048961102962493896,
-0.056515634059906006,
0.01766899973154068,
-0.04539283737540245,
-0.09634710848331451,
-0.018531208857893944,
-0.007635773625224829,
0.021486371755599976,
0.11305253952741623,
-0.04846429452300072,
0.08372193574905396,
0.18277081847190857,
-0.02552066184580326,
0.04578663781285286,
-0.03588191419839859,
0.1540408879518509,
-0.06594735383987427,
0.06236090138554573,
0.17258916795253754,
-0.05385587364435196,
0.06492358446121216,
0.1279502958059311,
0.007239476777613163,
-0.03103998489677906,
-0.0011397473281249404,
-0.0019310822244733572,
-0.07156974077224731,
-0.2616536319255829,
-0.07491833716630936,
-0.13641253113746643,
0.07429588586091995,
0.05874287709593773,
0.06036551296710968,
0.11212068796157837,
0.07875613868236542,
0.01878046616911888,
0.058616168797016144,
-0.049405813217163086,
0.04035713151097298,
0.1544860154390335,
-0.02534695900976658,
0.14042562246322632,
-0.06864281743764877,
-0.08165214955806732,
0.09534048289060593,
0.07465358823537827,
0.09217369556427002,
-0.0038158483803272247,
0.08155863732099533,
0.005149644799530506,
0.14129585027694702,
0.11080431193113327,
0.11493474990129471,
0.0005673647974617779,
-0.04404551163315773,
-0.001971018500626087,
-0.02528292126953602,
0.03140036761760712,
0.03316730260848999,
0.02701352909207344,
-0.08434956520795822,
-0.0432971753180027,
-0.06349622458219528,
0.06702439486980438,
0.10863126069307327,
0.09796968102455139,
-0.25883573293685913,
-0.009208711795508862,
0.04368530958890915,
-0.026912014931440353,
-0.10889310389757156,
0.041304588317871094,
0.09036613255739212,
-0.06835026293992996,
0.04594910517334938,
-0.06961318105459213,
0.1001378521323204,
-0.008535493165254593,
0.042058032006025314,
-0.030353976413607597,
-0.03235561400651932,
-0.02053607441484928,
0.07811267673969269,
-0.2908307909965515,
0.2014446258544922,
0.021176494657993317,
-0.0705946758389473,
-0.10001597553491592,
-0.007615803740918636,
0.011707446537911892,
0.11922676116228104,
0.10081683099269867,
0.0005430803867056966,
-0.05546121299266815,
-0.0302293598651886,
-0.02517375722527504,
-0.003961814101785421,
0.08987507224082947,
0.010007034055888653,
0.001080428366549313,
-0.0265347920358181,
-0.008666996844112873,
0.038247670978307724,
0.06953103095293045,
-0.017439553514122963,
-0.17779485881328583,
0.08586076647043228,
0.05779404938220978,
-0.04271590709686279,
0.03209555521607399,
-0.07382776588201523,
-0.1336817443370819,
0.21953722834587097,
-0.004649460315704346,
-0.050924625247716904,
-0.14625035226345062,
0.008644639514386654,
0.08405449986457825,
-0.07572992891073227,
0.03246466815471649,
-0.06211898475885391,
0.04434727877378464,
-0.05932064354419708,
-0.21609817445278168,
0.14458777010440826,
-0.08856247365474701,
-0.03401299938559532,
-0.0763106495141983,
0.10288351029157639,
-0.09060652554035187,
0.026217849925160408,
-0.0013771641533821821,
0.020361294969916344,
-0.09901085495948792,
-0.049771495163440704,
0.030903389677405357,
-0.026350416243076324,
0.08049477636814117,
-0.011794251389801502,
-0.06828172504901886,
-0.06265829503536224,
0.011577934958040714,
-0.004040571860969067,
0.2869510054588318,
0.10677577555179596,
-0.10715001821517944,
0.12513746321201324,
0.07824663072824478,
-0.05960369110107422,
-0.3140694797039032,
-0.01414004061371088,
-0.07732377201318741,
0.007201776374131441,
-0.0009391981293447316,
-0.11788276582956314,
0.061914607882499695,
-0.006313619669526815,
-0.01863078400492668,
0.14577169716358185,
-0.24852167069911957,
-0.10386232286691666,
0.12913817167282104,
0.02433968521654606,
0.29068589210510254,
-0.1390686333179474,
-0.06322996318340302,
-0.017277676612138748,
-0.10000073909759521,
0.1862722486257553,
-0.14250048995018005,
0.08389150351285934,
-0.02733985148370266,
0.07659818232059479,
0.0454447865486145,
-0.056824881583452225,
0.07027503848075867,
-0.04210817068815231,
0.020061463117599487,
-0.11806898564100266,
-0.07848414778709412,
0.11247947812080383,
-0.03622947633266449,
0.02969958633184433,
-0.0619988776743412,
0.03674528747797012,
-0.12183187156915665,
-0.006179662887006998,
-0.088393434882164,
0.066009521484375,
0.010616278275847435,
-0.06702639162540436,
-0.03312410041689873,
-0.03479190915822983,
0.0136539526283741,
-0.03453262522816658,
0.2290375828742981,
-0.0007185599533841014,
0.16778632998466492,
0.19854803383350372,
0.1006682813167572,
-0.08965437859296799,
0.009555666707456112,
-0.02925935573875904,
-0.06290508806705475,
0.0797010138630867,
-0.18559284508228302,
0.0413036085665226,
0.11039074510335922,
-0.01551123522222042,
0.05828157812356949,
0.09575796872377396,
-0.02745962142944336,
-0.014295919798314571,
0.12076889723539352,
-0.2328796088695526,
-0.03546580299735069,
-0.05905376374721527,
-0.05027206987142563,
-0.009591135196387768,
0.06000497192144394,
0.17193086445331573,
-0.0061661782674491405,
-0.016712496057152748,
0.015401695854961872,
0.000517066684551537,
-0.03148071840405464,
0.10267528891563416,
0.07334750890731812,
0.026570294052362442,
-0.10685678571462631,
0.07709777355194092,
0.05316973850131035,
-0.16103769838809967,
0.0355229526758194,
0.17030446231365204,
-0.10547646880149841,
-0.14222145080566406,
0.013206150382757187,
0.09807595610618591,
-0.11280970275402069,
-0.010329852811992168,
-0.055497948080301285,
-0.09105776995420456,
0.07874622195959091,
0.23001696169376373,
0.026759307831525803,
0.06342972069978714,
-0.04674892500042915,
-0.07641062140464783,
-0.04528716579079628,
0.057864852249622345,
0.012015648186206818,
0.04700418934226036,
-0.13249799609184265,
0.09081622958183289,
-0.05579787865281105,
0.16141220927238464,
-0.08440055698156357,
0.003097970737144351,
-0.1431298851966858,
-0.011925321072340012,
-0.1425136923789978,
-0.0412367545068264,
-0.053913429379463196,
-0.059111542999744415,
-0.020800860598683357,
-0.05195561796426773,
-0.057161495089530945,
-0.03312088921666145,
-0.10377342253923416,
0.016095615923404694,
-0.021204164251685143,
0.028383590281009674,
-0.07142974436283112,
-0.030955689027905464,
0.023945383727550507,
-0.013524364680051804,
0.12722744047641754,
0.0809924453496933,
-0.08344567567110062,
0.07967376708984375,
-0.10112171620130539,
-0.07686116546392441,
0.0726490318775177,
0.01809898018836975,
0.08856453746557236,
0.06361684203147888,
0.012787225656211376,
0.03876994177699089,
0.03189300000667572,
0.03893894702196121,
0.043305546045303345,
-0.08859635889530182,
0.016252834349870682,
-0.05185114964842796,
-0.11355207860469818,
-0.06583977490663528,
0.016292713582515717,
0.05119398608803749,
0.01587948575615883,
0.09134725481271744,
-0.03688149154186249,
0.07390119135379791,
-0.11871259659528732,
0.022003138437867165,
-0.005280979909002781,
-0.16984862089157104,
0.014799268916249275,
-0.03063509799540043,
0.042306236922740936,
-0.04092045873403549,
0.15113124251365662,
0.057214394211769104,
-0.047679804265499115,
0.02865987829864025,
0.04853489249944687,
-0.04349779710173607,
0.03225291892886162,
0.18173661828041077,
0.018298888579010963,
-0.06456810235977173,
-0.11278977990150452,
0.08425231277942657,
0.03909173607826233,
0.14213769137859344,
0.13424383103847504,
0.056157466024160385,
-0.014209616929292679,
0.09504678100347519,
0.006435318849980831,
-0.04645068570971489,
-0.08901507407426834,
-0.07856203615665436,
-0.07743586599826813,
0.09934249520301819,
-0.028963536024093628,
0.017084121704101562,
0.19509874284267426,
-0.01293879933655262,
0.02593855746090412,
-0.05109889805316925,
-0.06425133347511292,
-0.1427771896123886,
-0.18648415803909302,
-0.08367202430963516,
-0.06961840391159058,
-0.029833845794200897,
-0.0944860577583313,
0.06604978442192078,
0.05062674731016159,
0.07435821741819382,
-0.03845137730240822,
0.09867610037326813,
0.10887577384710312,
-0.08609076589345932,
0.07412785291671753,
0.024308761581778526,
0.041390303522348404,
-0.0482846200466156,
0.016754990443587303,
-0.07713307440280914,
-0.008183755911886692,
-0.0505823977291584,
0.015799546614289284,
-0.004712796304374933,
0.03665170073509216,
-0.10916425287723541,
-0.11298542469739914,
-0.031590282917022705,
0.053133003413677216,
0.0007679650443606079,
0.10671335458755493,
0.029597407206892967,
0.006785859353840351,
0.008992406539618969,
0.21775200963020325,
-0.08097578585147858,
-0.01794593036174774,
-0.07579086720943451,
0.1313355565071106,
0.0082362936809659,
0.05568346753716469,
-0.016611427068710327,
-0.023718055337667465,
-0.05545951798558235,
0.3065294027328491,
0.31450149416923523,
-0.09399987757205963,
0.03771369159221649,
0.03193311393260956,
0.01429405715316534,
0.05259588733315468,
0.1170811876654625,
0.07791467756032944,
0.21399876475334167,
-0.07704032212495804,
-0.03129518777132034,
-0.029766026884317398,
-0.0008117803954519331,
-0.07084468752145767,
0.12977534532546997,
0.047447673976421356,
-0.050803422927856445,
-0.011947968043386936,
0.06082959473133087,
-0.1570030301809311,
0.04963085800409317,
-0.04518890008330345,
-0.21882985532283783,
-0.0931319147348404,
-0.009278361685574055,
0.11734651774168015,
-0.027453163638710976,
0.07317055016756058,
-0.03708506375551224,
-0.03579844534397125,
0.036574117839336395,
-0.001423215726390481,
-0.17687676846981049,
0.03402823209762573,
0.05932340398430824,
-0.1384349763393402,
0.005762534681707621,
-0.03319886699318886,
0.024423805996775627,
0.11788998544216156,
0.05980615317821503,
-0.0791991725564003,
-0.0024907109327614307,
0.004976906813681126,
-0.016706649214029312,
0.01844685524702072,
0.04612421616911888,
0.022904209792613983,
-0.07181128859519958,
0.06701023131608963,
-0.11563875526189804,
0.03605322167277336,
-0.07619356364011765,
-0.010596762411296368,
-0.015590619295835495,
0.01669064536690712,
-0.021063946187496185,
0.10835874080657959,
0.11700756847858429,
-0.01922539807856083,
-0.0048932889476418495,
-0.061394937336444855,
-0.06418779492378235,
0.01601111888885498,
-0.048868078738451004,
-0.08777875453233719,
-0.12741802632808685,
-0.05819657817482948,
0.07367229461669922,
0.010093386285007,
-0.21722428500652313,
-0.009690569713711739,
-0.07296476513147354,
-0.010821342468261719,
-0.1512511819601059,
0.06484867632389069,
0.140232115983963,
0.012737712822854519,
-0.00892702117562294,
0.012986225076019764,
0.03401736170053482,
0.06699150800704956,
-0.11706308275461197,
-0.07094868272542953
] |
null | null |
transformers
|

# Chef Transformer (T5)
> This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/recipe-generation-model/7475), organized by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
Want to give it a try? Then what's the wait, head over to Hugging Face Spaces [here](https://huggingface.co/spaces/flax-community/chef-transformer).
## Team Members
- Mehrdad Farahani ([m3hrdadfi](https://huggingface.co/m3hrdadfi))
- Kartik Godawat ([dk-crazydiv](https://huggingface.co/dk-crazydiv))
- Haswanth Aekula ([hassiahk](https://huggingface.co/hassiahk))
- Deepak Pandian ([rays2pix](https://huggingface.co/rays2pix))
- Nicholas Broad ([nbroad](https://huggingface.co/nbroad))
## Dataset
[RecipeNLG: A Cooking Recipes Dataset for Semi-Structured Text Generation](https://recipenlg.cs.put.poznan.pl/). This dataset contains **2,231,142** cooking recipes (>2 millions) with size of **2.14 GB**. It's processed in more careful way.
### Example
```json
{
"NER": [
"oyster crackers",
"salad dressing",
"lemon pepper",
"dill weed",
"garlic powder",
"salad oil"
],
"directions": [
"Combine salad dressing mix and oil.",
"Add dill weed, garlic powder and lemon pepper.",
"Pour over crackers; stir to coat.",
"Place in warm oven.",
"Use very low temperature for 15 to 20 minutes."
],
"ingredients": [
"12 to 16 oz. plain oyster crackers",
"1 pkg. Hidden Valley Ranch salad dressing mix",
"1/4 tsp. lemon pepper",
"1/2 to 1 tsp. dill weed",
"1/4 tsp. garlic powder",
"3/4 to 1 c. salad oil"
],
"link": "www.cookbooks.com/Recipe-Details.aspx?id=648947",
"source": "Gathered",
"title": "Hidden Valley Ranch Oyster Crackers"
}
```
## How To Use
```bash
# Installing requirements
pip install transformers
```
```python
from transformers import FlaxAutoModelForSeq2SeqLM
from transformers import AutoTokenizer
MODEL_NAME_OR_PATH = "flax-community/t5-recipe-generation"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME_OR_PATH, use_fast=True)
model = FlaxAutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME_OR_PATH)
prefix = "items: "
# generation_kwargs = {
# "max_length": 512,
# "min_length": 64,
# "no_repeat_ngram_size": 3,
# "early_stopping": True,
# "num_beams": 5,
# "length_penalty": 1.5,
# }
generation_kwargs = {
"max_length": 512,
"min_length": 64,
"no_repeat_ngram_size": 3,
"do_sample": True,
"top_k": 60,
"top_p": 0.95
}
special_tokens = tokenizer.all_special_tokens
tokens_map = {
"<sep>": "--",
"<section>": "\n"
}
def skip_special_tokens(text, special_tokens):
for token in special_tokens:
text = text.replace(token, "")
return text
def target_postprocessing(texts, special_tokens):
if not isinstance(texts, list):
texts = [texts]
new_texts = []
for text in texts:
text = skip_special_tokens(text, special_tokens)
for k, v in tokens_map.items():
text = text.replace(k, v)
new_texts.append(text)
return new_texts
def generation_function(texts):
_inputs = texts if isinstance(texts, list) else [texts]
inputs = [prefix + inp for inp in _inputs]
inputs = tokenizer(
inputs,
max_length=256,
padding="max_length",
truncation=True,
return_tensors="jax"
)
input_ids = inputs.input_ids
attention_mask = inputs.attention_mask
output_ids = model.generate(
input_ids=input_ids,
attention_mask=attention_mask,
**generation_kwargs
)
generated = output_ids.sequences
generated_recipe = target_postprocessing(
tokenizer.batch_decode(generated, skip_special_tokens=False),
special_tokens
)
return generated_recipe
```
```python
items = [
"macaroni, butter, salt, bacon, milk, flour, pepper, cream corn",
"provolone cheese, bacon, bread, ginger"
]
generated = generation_function(items)
for text in generated:
sections = text.split("\n")
for section in sections:
section = section.strip()
if section.startswith("title:"):
section = section.replace("title:", "")
headline = "TITLE"
elif section.startswith("ingredients:"):
section = section.replace("ingredients:", "")
headline = "INGREDIENTS"
elif section.startswith("directions:"):
section = section.replace("directions:", "")
headline = "DIRECTIONS"
if headline == "TITLE":
print(f"[{headline}]: {section.strip().capitalize()}")
else:
section_info = [f" - {i+1}: {info.strip().capitalize()}" for i, info in enumerate(section.split("--"))]
print(f"[{headline}]:")
print("\n".join(section_info))
print("-" * 130)
```
Output:
```text
[TITLE]: Macaroni and corn
[INGREDIENTS]:
- 1: 2 c. macaroni
- 2: 2 tbsp. butter
- 3: 1 tsp. salt
- 4: 4 slices bacon
- 5: 2 c. milk
- 6: 2 tbsp. flour
- 7: 1/4 tsp. pepper
- 8: 1 can cream corn
[DIRECTIONS]:
- 1: Cook macaroni in boiling salted water until tender.
- 2: Drain.
- 3: Melt butter in saucepan.
- 4: Blend in flour, salt and pepper.
- 5: Add milk all at once.
- 6: Cook and stir until thickened and bubbly.
- 7: Stir in corn and bacon.
- 8: Pour over macaroni and mix well.
----------------------------------------------------------------------------------------------------------------------------------
[TITLE]: Grilled provolone and bacon sandwich
[INGREDIENTS]:
- 1: 2 slices provolone cheese
- 2: 2 slices bacon
- 3: 2 slices sourdough bread
- 4: 2 slices pickled ginger
[DIRECTIONS]:
- 1: Place a slice of provolone cheese on one slice of bread.
- 2: Top with a slice of bacon.
- 3: Top with a slice of pickled ginger.
- 4: Top with the other slice of bread.
- 5: Heat a skillet over medium heat.
- 6: Place the sandwich in the skillet and cook until the cheese is melted and the bread is golden brown.
----------------------------------------------------------------------------------------------------------------------------------
```
## Evaluation
Since the test set is not available, we will evaluate the model based on a shared test set. This test set consists of 5% of the whole test (*= 5,000 records*),
and we will generate five recipes for each input(*= 25,000 records*).
The following table summarizes the scores obtained by the **Chef Transformer** and **RecipeNLG** as our baseline.
| Model | COSIM | WER | ROUGE-2 | BLEU | GLEU | METEOR |
|:------------------------------------------------------------------------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|
| [RecipeNLG](https://huggingface.co/mbien/recipenlg) | 0.5723 | 1.2125 | 0.1354 | 0.1164 | 0.1503 | 0.2309 |
| [Chef Transformer](huggingface.co/flax-community/t5-recipe-generation) * | **0.7282** | **0.7613** | **0.2470** | **0.3245** | **0.2624** | **0.4150** |
*From the 5 generated recipes corresponding to each NER (food items), only the highest score was taken into account in the WER, COSIM, and ROUGE metrics. At the same time, BLEU, GLEU, Meteor were designed to have many possible references.*
## Copyright
Special thanks to those who provided these fantastic materials.
- [Anatomy](https://www.flaticon.com/free-icon)
- [Chef Hat](https://www.vecteezy.com/members/jellyfishwater)
- [Moira Nazzari](https://pixabay.com/photos/food-dessert-cake-eggs-butter-3048440/)
- [Instagram Post](https://www.freepik.com/free-psd/recipes-ad-social-media-post-template_11520617.htm)
|
{"language": "en", "tags": ["seq2seq", "t5", "text-generation", "recipe-generation"], "pipeline_tag": "text2text-generation", "widget": [{"text": "provolone cheese, bacon, bread, ginger"}, {"text": "sugar, crunchy jif peanut butter, cornflakes"}, {"text": "sweet butter, confectioners sugar, flaked coconut, condensed milk, nuts, vanilla, dipping chocolate"}, {"text": "macaroni, butter, salt, bacon, milk, flour, pepper, cream corn"}, {"text": "hamburger, sausage, onion, regular, american cheese, colby cheese"}, {"text": "chicken breasts, onion, garlic, great northern beans, black beans, green chilies, broccoli, garlic oil, butter, cajun seasoning, salt, oregano, thyme, black pepper, basil, worcestershire sauce, chicken broth, sour cream, chardonnay wine"}, {"text": "serrano peppers, garlic, celery, oregano, canola oil, vinegar, water, kosher salt, salt, black pepper"}]}
|
text2text-generation
|
flax-community/t5-recipe-generation
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"seq2seq",
"text-generation",
"recipe-generation",
"en",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #seq2seq #text-generation #recipe-generation #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
!avatar
Chef Transformer (T5)
=====================
>
> This is part of the
> Flax/Jax Community Week, organized by HuggingFace and TPU usage sponsored by Google.
>
>
>
Want to give it a try? Then what's the wait, head over to Hugging Face Spaces here.
Team Members
------------
* Mehrdad Farahani (m3hrdadfi)
* Kartik Godawat (dk-crazydiv)
* Haswanth Aekula (hassiahk)
* Deepak Pandian (rays2pix)
* Nicholas Broad (nbroad)
Dataset
-------
RecipeNLG: A Cooking Recipes Dataset for Semi-Structured Text Generation. This dataset contains 2,231,142 cooking recipes (>2 millions) with size of 2.14 GB. It's processed in more careful way.
### Example
How To Use
----------
Output:
Evaluation
----------
Since the test set is not available, we will evaluate the model based on a shared test set. This test set consists of 5% of the whole test (*= 5,000 records*),
and we will generate five recipes for each input(*= 25,000 records*).
The following table summarizes the scores obtained by the Chef Transformer and RecipeNLG as our baseline.
*From the 5 generated recipes corresponding to each NER (food items), only the highest score was taken into account in the WER, COSIM, and ROUGE metrics. At the same time, BLEU, GLEU, Meteor were designed to have many possible references.*
Copyright
---------
Special thanks to those who provided these fantastic materials.
* Anatomy
* Chef Hat
* Moira Nazzari
* Instagram Post
|
[
"### Example\n\n\nHow To Use\n----------\n\n\nOutput:\n\n\nEvaluation\n----------\n\n\nSince the test set is not available, we will evaluate the model based on a shared test set. This test set consists of 5% of the whole test (*= 5,000 records*),\nand we will generate five recipes for each input(*= 25,000 records*).\nThe following table summarizes the scores obtained by the Chef Transformer and RecipeNLG as our baseline.\n\n\n\n*From the 5 generated recipes corresponding to each NER (food items), only the highest score was taken into account in the WER, COSIM, and ROUGE metrics. At the same time, BLEU, GLEU, Meteor were designed to have many possible references.*\n\n\nCopyright\n---------\n\n\nSpecial thanks to those who provided these fantastic materials.\n\n\n* Anatomy\n* Chef Hat\n* Moira Nazzari\n* Instagram Post"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #seq2seq #text-generation #recipe-generation #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### Example\n\n\nHow To Use\n----------\n\n\nOutput:\n\n\nEvaluation\n----------\n\n\nSince the test set is not available, we will evaluate the model based on a shared test set. This test set consists of 5% of the whole test (*= 5,000 records*),\nand we will generate five recipes for each input(*= 25,000 records*).\nThe following table summarizes the scores obtained by the Chef Transformer and RecipeNLG as our baseline.\n\n\n\n*From the 5 generated recipes corresponding to each NER (food items), only the highest score was taken into account in the WER, COSIM, and ROUGE metrics. At the same time, BLEU, GLEU, Meteor were designed to have many possible references.*\n\n\nCopyright\n---------\n\n\nSpecial thanks to those who provided these fantastic materials.\n\n\n* Anatomy\n* Chef Hat\n* Moira Nazzari\n* Instagram Post"
] |
[
86,
188
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #tensorboard #safetensors #t5 #text2text-generation #seq2seq #text-generation #recipe-generation #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### Example\n\n\nHow To Use\n----------\n\n\nOutput:\n\n\nEvaluation\n----------\n\n\nSince the test set is not available, we will evaluate the model based on a shared test set. This test set consists of 5% of the whole test (*= 5,000 records*),\nand we will generate five recipes for each input(*= 25,000 records*).\nThe following table summarizes the scores obtained by the Chef Transformer and RecipeNLG as our baseline.\n\n\n\n*From the 5 generated recipes corresponding to each NER (food items), only the highest score was taken into account in the WER, COSIM, and ROUGE metrics. At the same time, BLEU, GLEU, Meteor were designed to have many possible references.*\n\n\nCopyright\n---------\n\n\nSpecial thanks to those who provided these fantastic materials.\n\n\n* Anatomy\n* Chef Hat\n* Moira Nazzari\n* Instagram Post"
] |
[
-0.04939565062522888,
0.04121997952461243,
-0.0010625106515362859,
0.04456145316362381,
0.051054880023002625,
-0.028174765408039093,
0.11523275822401047,
0.07181643694639206,
0.08532054722309113,
0.07216399908065796,
0.04580174759030342,
0.07044488191604614,
0.020284896716475487,
0.14733098447322845,
-0.016199523583054543,
-0.10595278441905975,
-0.02289941906929016,
-0.002082717837765813,
-0.18227411806583405,
0.201212540268898,
0.0870518684387207,
-0.059259869158267975,
0.11305882781744003,
0.021342838183045387,
-0.1489686220884323,
0.020655229687690735,
0.004761978052556515,
-0.03240581974387169,
0.08066006004810333,
0.00423195818439126,
0.09203577041625977,
0.13091807067394257,
-0.0566016361117363,
-0.17237548530101776,
0.04675564542412758,
0.0676538348197937,
-0.030346939340233803,
0.0060340166091918945,
0.04435441270470619,
-0.07411981374025345,
0.030571579933166504,
-0.04526408016681671,
0.01964518055319786,
0.08833763748407364,
-0.14608894288539886,
-0.06668315827846527,
-0.12493088096380234,
-0.006577601190656424,
0.060398686677217484,
0.025797061622142792,
-0.015671290457248688,
-0.01222522184252739,
-0.10721688717603683,
0.067406564950943,
0.2580280303955078,
-0.1988217681646347,
-0.07611241936683655,
0.06766021251678467,
0.03524647653102875,
0.09424595534801483,
-0.02440796047449112,
0.09423692524433136,
0.09262643754482269,
-0.02240191027522087,
-0.018056975677609444,
0.0026493961922824383,
0.1797047108411789,
-0.024033598601818085,
-0.11557260900735855,
-0.09619788825511932,
0.23408031463623047,
0.01740439422428608,
-0.0868973433971405,
-0.00822731014341116,
-0.060404222458601,
-0.0966043770313263,
-0.04379548877477646,
-0.09801916033029556,
-0.03859506547451019,
-0.059607554227113724,
-0.033500030636787415,
0.044832151383161545,
-0.12536336481571198,
-0.07252626866102219,
0.10262994468212128,
0.12445433437824249,
0.03799803555011749,
0.01069557759910822,
-0.13931244611740112,
0.1084822341799736,
-0.011005190201103687,
-0.11039870232343674,
-0.01520667877048254,
-0.08278211951255798,
-0.06757941097021103,
0.02568621002137661,
-0.04885776340961456,
-0.16414999961853027,
0.10815472155809402,
0.16062520444393158,
0.01768580824136734,
0.04171396791934967,
0.011239361017942429,
0.014634411782026291,
0.06553363054990768,
0.06845550984144211,
0.05277060344815254,
-0.14790990948677063,
-0.03364631533622742,
0.026744527742266655,
-0.0772513672709465,
-0.05572210252285004,
0.017223864793777466,
0.015458009205758572,
0.11120124161243439,
0.014669200405478477,
0.03553121164441109,
0.02595251053571701,
-0.027806900441646576,
-0.07080540806055069,
0.20266707241535187,
-0.06269370019435883,
-0.04848809912800789,
0.03393963351845741,
-0.04500284418463707,
-0.005004475358873606,
0.04144304245710373,
-0.01618051528930664,
0.0009972651023417711,
0.06169271841645241,
-0.05586667358875275,
-0.05997234582901001,
-0.04119223728775978,
-0.003719401080161333,
0.012406628578901291,
-0.09188621491193771,
-0.04696614295244217,
-0.08237764984369278,
-0.04276963323354721,
-0.1208822950720787,
0.04324577376246452,
-0.007785844150930643,
-0.03429679572582245,
0.017215518280863762,
0.06641369313001633,
0.00374964508228004,
0.01986301876604557,
-0.058197420090436935,
-0.04791901260614395,
0.10939374566078186,
-0.046449679881334305,
0.051116734743118286,
0.07171636819839478,
0.05713389068841934,
-0.08607683330774307,
0.07656834274530411,
-0.16843388974666595,
0.01988156884908676,
-0.037304431200027466,
-0.01354297250509262,
-0.12728767096996307,
-0.02412586286664009,
-0.08390713483095169,
-0.04889724776148796,
-0.02641359157860279,
0.12543097138404846,
-0.1509648859500885,
-0.04064420610666275,
0.1611848920583725,
-0.12833638489246368,
-0.09207586199045181,
0.18342605233192444,
-0.010189777240157127,
0.012562724761664867,
0.06946784257888794,
0.09198696166276932,
0.17518730461597443,
-0.06511309742927551,
0.04056572914123535,
-0.09996779263019562,
0.042844220995903015,
0.08023584634065628,
0.05001659318804741,
0.04659441113471985,
-0.15338292717933655,
-0.008410324342548847,
0.04353589564561844,
-0.018977735191583633,
-0.06640789657831192,
-0.06444705277681351,
-0.02466350793838501,
-0.03429307043552399,
0.08621318638324738,
0.007741125300526619,
0.02640657126903534,
-0.05324692279100418,
-0.042100075632333755,
-0.06635474413633347,
0.044057801365852356,
-0.025290392339229584,
0.012760638259351254,
0.0201751496642828,
0.17113013565540314,
-0.0813215896487236,
-0.014326803386211395,
-0.09257015585899353,
-0.024844739586114883,
0.038294967263936996,
-0.030270185321569443,
0.08390214294195175,
0.08795635402202606,
0.023239655420184135,
0.026175813749432564,
0.024807265028357506,
0.02484975941479206,
-0.015384973958134651,
0.016558822244405746,
-0.09826280176639557,
-0.12994074821472168,
-0.006165704689919949,
-0.07252520322799683,
0.2584919333457947,
-0.12291441857814789,
0.007207164540886879,
0.06301449984312057,
0.05773894116282463,
0.015618840232491493,
-0.059977784752845764,
0.030824339017271996,
0.013442723080515862,
-0.046541135758161545,
0.0706968754529953,
-0.005024001467972994,
-0.05243772640824318,
-0.046610500663518906,
0.06649188697338104,
-0.13373878598213196,
-0.00414419686421752,
0.048076312988996506,
-0.020805900916457176,
-0.09236836433410645,
-0.07359368354082108,
-0.053068194538354874,
-0.06127938628196716,
-0.029149474576115608,
0.039142243564128876,
0.0153206130489707,
-0.008533286862075329,
0.10567720979452133,
-0.08419103175401688,
-0.03312503173947334,
0.07248430699110031,
-0.04619302228093147,
-0.0500774160027504,
0.10251422226428986,
0.03602771461009979,
-0.2189893126487732,
0.08078205585479736,
0.21629223227500916,
0.10591074079275131,
0.13796967267990112,
0.04366283118724823,
-0.042775947600603104,
-0.0024480156134814024,
0.0829743891954422,
-0.015715738758444786,
0.06962794810533524,
-0.03193929046392441,
0.04937629774212837,
0.017906034365296364,
-0.01310079824179411,
0.008011248894035816,
-0.13077306747436523,
0.012925839982926846,
0.07694154232740402,
-0.0592634342610836,
0.010236694477498531,
0.02961895614862442,
-0.004503350704908371,
0.06917522847652435,
0.026658909395337105,
-0.042070627212524414,
-0.029947863891720772,
-0.0231715627014637,
-0.06733237951993942,
0.19687296450138092,
-0.00877457857131958,
-0.12885525822639465,
-0.09306773543357849,
0.012957507744431496,
-0.11623604595661163,
-0.005217031110078096,
0.11198695003986359,
-0.08850299566984177,
-0.022327447310090065,
-0.04277276247739792,
-0.0029896399937570095,
0.049612030386924744,
0.01039288379251957,
-0.08479590713977814,
-0.04471329227089882,
-0.04887567088007927,
-0.0951465368270874,
-0.005630246363580227,
-0.0679936408996582,
-0.02975144237279892,
0.08486073464155197,
-0.1011219471693039,
0.12752526998519897,
0.12382552772760391,
-0.036950454115867615,
0.02956790290772915,
0.005953596439212561,
0.23421820998191833,
-0.14561887085437775,
0.026359504088759422,
0.14259018003940582,
0.03452974185347557,
0.10740987956523895,
0.0554659478366375,
0.03297804668545723,
-0.04863407090306282,
0.023072678595781326,
0.08541285991668701,
-0.09890757501125336,
-0.1966102421283722,
-0.07822688668966293,
-0.05452787131071091,
-0.024869389832019806,
0.07137295603752136,
0.07709036022424698,
0.15416105091571808,
0.07134939730167389,
-0.12193457782268524,
-0.04040440917015076,
0.05968887731432915,
0.011637577787041664,
-0.03460237756371498,
0.0482264906167984,
0.044385865330696106,
-0.05955031141638756,
-0.043410155922174454,
0.08135730028152466,
0.07575017213821411,
0.18415884673595428,
0.025581400841474533,
0.021917538717389107,
0.17029735445976257,
0.07337740808725357,
0.06877929717302322,
0.010310587473213673,
-0.05406066030263901,
0.0268055759370327,
-0.039542052894830704,
-0.09816146641969681,
-0.027580073103308678,
0.06896646320819855,
-0.037028614431619644,
-0.09528886526823044,
-0.045251764357089996,
-0.04441137984395027,
0.023373644798994064,
0.2707272171974182,
0.05659962818026543,
-0.12445797026157379,
-0.018913038074970245,
0.0069081904366612434,
0.020170897245407104,
-0.023070724681019783,
0.040271710604429245,
0.002285260008648038,
-0.0878010019659996,
0.08035807311534882,
-0.08071722090244293,
0.08875539153814316,
0.04458523541688919,
0.008505755104124546,
0.08435767143964767,
-0.08586635440587997,
-0.04725763946771622,
0.061476822942495346,
-0.17779205739498138,
0.18553569912910461,
-0.0034787477925419807,
-0.08340813964605331,
-0.10809388011693954,
-0.04018928110599518,
-0.0344083346426487,
0.0871616005897522,
0.11067136377096176,
0.01846851408481598,
-0.06841282546520233,
-0.13618656992912292,
-0.05606228485703468,
0.006178210489451885,
0.11608513444662094,
-0.09604943543672562,
0.06991361081600189,
0.0018638207111507654,
0.0059394510462880135,
-0.019799424335360527,
0.03641961142420769,
-0.12031777203083038,
-0.11850816756486893,
0.05638803914189339,
-0.1306564211845398,
-0.12657417356967926,
-0.013754378072917461,
-0.04305138438940048,
-0.10129136592149734,
0.09710114449262619,
-0.14320777356624603,
-0.0290522500872612,
-0.11803969740867615,
-0.04197406768798828,
0.05107378959655762,
-0.07083385437726974,
0.0324099175632,
-0.025614233687520027,
0.019731920212507248,
-0.05173472687602043,
-0.13697874546051025,
0.039118703454732895,
-0.04064064472913742,
-0.19739699363708496,
-0.02669806033372879,
0.09460194408893585,
0.08695533871650696,
-0.0037521652411669493,
0.02520250715315342,
0.025966361165046692,
-0.026765126734972,
-0.05504380539059639,
-0.07565198838710785,
-0.07095939666032791,
0.009210778400301933,
0.09549351781606674,
0.014398067258298397,
-0.013239551335573196,
-0.12152691185474396,
-0.009532247669994831,
-0.045907162129879,
0.2437194138765335,
-0.05594644695520401,
0.15360340476036072,
0.10845101624727249,
-0.021141519770026207,
-0.1391870528459549,
-0.09963914752006531,
-0.058762967586517334,
-0.010412383824586868,
0.0266709066927433,
-0.07642188668251038,
0.10829930007457733,
0.08447162061929703,
-0.056827276945114136,
0.08937203139066696,
-0.35021308064460754,
-0.13060081005096436,
0.025474784895777702,
0.07733418792486191,
0.27554166316986084,
-0.0945606678724289,
-0.054913874715566635,
-0.022461803629994392,
-0.06769172102212906,
0.08502913266420364,
-0.059104837477207184,
0.09750277549028397,
-0.08922285586595535,
-0.06978273391723633,
0.041551172733306885,
-0.04288726672530174,
0.08756950497627258,
0.045382604002952576,
0.12835103273391724,
-0.08013945072889328,
0.03959681838750839,
-0.04911372810602188,
-0.042656656354665756,
0.036290328949689865,
-0.09683749824762344,
0.02641010284423828,
-0.08348704874515533,
-0.02058369107544422,
-0.09459274262189865,
0.07423971593379974,
-0.06342701613903046,
-0.032716915011405945,
-0.13409464061260223,
0.025410400703549385,
0.0444488525390625,
-0.0001872210850706324,
0.09592829644680023,
-0.02045087330043316,
0.17535515129566193,
0.16305780410766602,
-0.006519755814224482,
0.01595528982579708,
-0.06290882080793381,
-0.048042651265859604,
0.014557893387973309,
0.08397310972213745,
-0.1463574469089508,
0.055052973330020905,
0.0406363271176815,
0.051444314420223236,
0.09252266585826874,
-0.003914900589734316,
-0.10323961079120636,
-0.014305654913187027,
0.09029312431812286,
-0.19584724307060242,
-0.17370843887329102,
-0.017250100150704384,
-0.10294783860445023,
0.02697896584868431,
0.045680150389671326,
0.16114076972007751,
-0.044923461973667145,
-0.00886798556894064,
-0.060864873230457306,
0.08440672606229782,
-0.01412631943821907,
0.192726731300354,
0.0688745528459549,
0.04597819596529007,
-0.0663258358836174,
0.030658448114991188,
0.15894868969917297,
-0.0664203017950058,
-0.0017404532991349697,
0.04534699022769928,
-0.06088592857122421,
-0.056180454790592194,
0.03978656232357025,
0.07846656441688538,
-0.10650571435689926,
-0.02742316573858261,
-0.11088483780622482,
-0.028613775968551636,
0.021011915057897568,
0.2403421401977539,
0.0870886743068695,
0.08672786504030228,
0.05939194932579994,
-0.06252342462539673,
-0.069528728723526,
0.09880427271127701,
0.07206463813781738,
0.10724592953920364,
-0.08121442794799805,
-0.09558231383562088,
0.01833740435540676,
0.14084361493587494,
-0.07712750136852264,
-0.05132037773728371,
-0.09143046289682388,
-0.001881801406852901,
-0.010716265998780727,
0.028204597532749176,
-0.03896776959300041,
0.00667613698169589,
-0.0528009794652462,
-0.029857909306883812,
-0.04653732851147652,
0.013123458251357079,
-0.06357346475124359,
0.045247968286275864,
-0.02993578463792801,
0.09516184777021408,
-0.1708076000213623,
-0.014555692672729492,
0.10069619119167328,
-0.044097643345594406,
0.06857837736606598,
-0.03912997245788574,
-0.046345241367816925,
-0.024281980469822884,
-0.20169658958911896,
0.03920019790530205,
-0.009569792076945305,
0.04121337831020355,
-0.004210662562400103,
-0.07150562107563019,
0.05245858430862427,
-0.028820473700761795,
-0.004395654890686274,
0.01735498756170273,
0.09761321544647217,
-0.14315000176429749,
-0.03230774402618408,
-0.03961775824427605,
-0.10390124469995499,
-0.09387156367301941,
0.013875056058168411,
-0.004598838277161121,
0.016596438363194466,
0.07566922158002853,
-0.05862806737422943,
0.03922317177057266,
-0.1380823254585266,
-0.0475190207362175,
0.05048953369259834,
-0.09005051106214523,
-0.009503875859081745,
-0.0359981395304203,
0.02122197300195694,
0.015203055925667286,
0.13527482748031616,
0.04698656499385834,
0.057831771671772,
0.01739264652132988,
-0.059786126017570496,
0.043593864887952805,
-0.012086240574717522,
-0.030735626816749573,
-0.021681100130081177,
0.003842050675302744,
-0.03161602467298508,
0.020137449726462364,
-0.007817781530320644,
0.06217349320650101,
0.14291420578956604,
0.23773591220378876,
-0.02193581685423851,
-0.020467283204197884,
0.0014431915478780866,
0.0022250302135944366,
-0.025826027616858482,
-0.056008052080869675,
-0.10811026394367218,
0.06214998662471771,
-0.004406902939081192,
-0.04755651578307152,
0.1509205400943756,
-0.14492736756801605,
0.07317732274532318,
-0.016469191759824753,
-0.016084065660834312,
-0.08302667737007141,
-0.1480555385351181,
-0.06064671650528908,
-0.11519043892621994,
0.004681606777012348,
-0.03516874462366104,
0.05282876640558243,
0.10561903566122055,
0.05848982185125351,
0.013643262907862663,
0.16653361916542053,
-0.02269342541694641,
0.04127725213766098,
0.020013563334941864,
-0.00714880833402276,
0.009194843471050262,
0.060674943029880524,
-0.01883741095662117,
0.04856318235397339,
0.009593467228114605,
0.02458530105650425,
-0.00004833593629882671,
0.051101211458444595,
0.05517081916332245,
0.001055214204825461,
-0.1101590171456337,
0.004019527696073055,
-0.03974579647183418,
0.10829752683639526,
0.06983906775712967,
0.028480475768446922,
0.02950979396700859,
0.009736519306898117,
0.1158105731010437,
0.030507272109389305,
-0.02528909221291542,
-0.05518430471420288,
0.2116963416337967,
0.08683229237794876,
-0.01518926676362753,
0.003909112885594368,
-0.07229363173246384,
0.052535105496644974,
0.10365161299705505,
0.17261280119419098,
0.051500104367733,
0.019180014729499817,
-0.03676050528883934,
0.02973386086523533,
0.05767647922039032,
0.08771170675754547,
-0.04138198867440224,
0.19977828860282898,
-0.014059067703783512,
0.031911127269268036,
-0.015483890660107136,
-0.022363267838954926,
-0.08276388794183731,
0.1681567132472992,
0.05271675065159798,
-0.05163883790373802,
-0.031124474480748177,
0.18907979130744934,
0.025844013318419456,
-0.10399112105369568,
0.08133640885353088,
-0.08298647403717041,
-0.12627224624156952,
-0.03911187872290611,
0.08354788273572922,
-0.04637893661856651,
0.05606445297598839,
-0.030792778357863426,
-0.07598376274108887,
-0.020847221836447716,
0.03834189847111702,
-0.08231393247842789,
-0.026422591879963875,
0.0927404910326004,
-0.0742049291729927,
0.213650643825531,
0.01676211878657341,
0.04944299906492233,
0.08071670681238174,
-0.028713498264551163,
-0.030797986313700676,
0.00236510019749403,
0.059113509953022,
-0.007886914536356926,
0.013976102694869041,
0.13824570178985596,
0.013958634808659554,
-0.023038309067487717,
0.04466860741376877,
-0.026037970557808876,
0.029918638989329338,
-0.0067755235359072685,
0.09093775600194931,
-0.07655855268239975,
0.06026814132928848,
-0.030385823920369148,
0.13591034710407257,
0.1239856481552124,
-0.020947132259607315,
0.05382898449897766,
-0.07684797048568726,
0.0013798834988847375,
0.010227161459624767,
0.01884869858622551,
-0.04251110553741455,
-0.16968382894992828,
0.036704279482364655,
-0.06441232562065125,
-0.006361457984894514,
-0.20360812544822693,
-0.050050221383571625,
-0.0020111396443098783,
-0.017194217070937157,
0.0050082337111234665,
0.13958261907100677,
0.06507307291030884,
0.026928436011075974,
-0.030108483508229256,
-0.19006949663162231,
-0.05563170090317726,
0.07747600972652435,
-0.15717920660972595,
-0.0570698007941246
] |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.