| Column | Type |
| --- | --- |
| sha | null |
| last_modified | null |
| library_name | stringclasses (154 values) |
| text | stringlengths (1–900k) |
| metadata | stringlengths (2–348k) |
| pipeline_tag | stringclasses (45 values) |
| id | stringlengths (5–122) |
| tags | listlengths (1–1.84k) |
| created_at | stringlengths (25–25) |
| arxiv | listlengths (0–201) |
| languages | listlengths (0–1.83k) |
| tags_str | stringlengths (17–9.34k) |
| text_str | stringlengths (0–389k) |
| text_lists | listlengths (0–722) |
| processed_texts | listlengths (1–723) |
| tokens_length | listlengths (1–723) |
| input_texts | listlengths (1–61) |
| embeddings | listlengths (768–768) |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish BERT model
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources an uncased model for Turkish 🎉
# 🇹🇷 BERTurk
BERTurk is a community-driven uncased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
## Stats
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train an uncased model
on a TPU v3-8 for 2M steps.
For this model, we use a vocabulary size of 128k.
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| -------------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-turkish-128k-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-128k-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-128k-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-128k-uncased/vocab.txt)
## Usage
With Transformers >= 2.3, our BERTurk uncased model can be loaded like this:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-128k-uncased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-128k-uncased")
```
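Beyond just loading, the model can be used to produce contextual embeddings right away. Below is a minimal sketch, assuming a recent Transformers version that returns model outputs as objects; the example sentence is an arbitrary choice:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-128k-uncased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-128k-uncased")
model.eval()

# Tokenize an arbitrary Turkish sentence into PyTorch tensors.
inputs = tokenizer("Merhaba dünya!", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the per-token representations into one fixed-size sentence vector.
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```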
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models, just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/bert-base-turkish-128k-uncased
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"tr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us
|
dbmdz Turkish BERT model
========================
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources an uncased model for Turkish
🇹🇷 BERTurk
==========
BERTurk is a community-driven uncased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
Stats
-----
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train an uncased model
on a TPU v3-8 for 2M steps.
For this model, we use a vocabulary size of 128k.
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
Usage
-----
With Transformers >= 2.3, our BERTurk uncased model can be loaded like this:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT models, just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC.
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
40
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
0.018921364098787308,
-0.00019106945546809584,
-0.006394296884536743,
0.03337884694337845,
0.054041218012571335,
0.031650494784116745,
0.06662475317716599,
0.10807959735393524,
0.05830428749322891,
-0.019409330561757088,
0.13395871222019196,
0.18363657593727112,
-0.042950283735990524,
0.03756747394800186,
-0.033980563282966614,
-0.23781342804431915,
0.0514480322599411,
0.052651047706604004,
-0.07216320186853409,
0.11049067229032516,
0.07754318416118622,
-0.07751429826021194,
0.05530031770467758,
-0.009757393039762974,
-0.12986992299556732,
0.03262612968683243,
0.04725905507802963,
-0.07637171447277069,
0.1480608433485031,
0.044089175760746,
0.12649399042129517,
0.09060006588697433,
-0.03512771427631378,
-0.0813177078962326,
0.034379344433546066,
0.012670991010963917,
-0.12737680971622467,
0.04784052446484566,
0.0038243152666836977,
-0.03693895414471626,
0.13288435339927673,
0.05445465072989464,
0.0063299741595983505,
0.03950441628694534,
-0.16893337666988373,
-0.2548118233680725,
-0.07381822168827057,
0.08884354680776596,
-0.02052173763513565,
0.04091643914580345,
0.03058725595474243,
0.20716851949691772,
-0.13842296600341797,
0.05934154987335205,
0.19126549363136292,
-0.39465510845184326,
-0.011868304572999477,
0.16808973252773285,
0.12622633576393127,
0.032379381358623505,
-0.06192849576473236,
0.06399940699338913,
0.05275721848011017,
0.019478803500533104,
0.12466363608837128,
-0.07277891039848328,
-0.04720394313335419,
0.10058879852294922,
-0.10866198688745499,
-0.08265569061040878,
0.22826968133449554,
-0.020683445036411285,
0.03560272604227066,
0.03850972279906273,
-0.07584118098020554,
-0.07571636140346527,
0.025144880637526512,
-0.022743092849850655,
0.0026006216648966074,
0.07900294661521912,
-0.01369452103972435,
-0.04454221948981285,
-0.15518058836460114,
0.025484146550297737,
-0.22992919385433197,
0.13037264347076416,
-0.0037297243252396584,
0.08252513408660889,
-0.17930646240711212,
0.0790943130850792,
-0.019592057913541794,
-0.07786344736814499,
0.03198960795998573,
-0.09400814026594162,
0.05977398529648781,
0.004003958310931921,
-0.05471600219607353,
0.07530807703733444,
0.05432100594043732,
0.14611782133579254,
0.012070882134139538,
-0.01876658946275711,
0.017675457522273064,
0.12192820757627487,
-0.024760860949754715,
0.04574089124798775,
-0.015430393628776073,
0.02416994422674179,
0.01457999087870121,
-0.11165327578783035,
-0.008331339806318283,
-0.04006664454936981,
-0.13114413619041443,
-0.053011540323495865,
-0.00046386977192014456,
0.06673956662416458,
0.05116226524114609,
0.05029689148068428,
-0.04164477437734604,
0.04405054822564125,
0.0799972265958786,
-0.014050222001969814,
0.007079362403601408,
-0.018930165097117424,
0.07033229619264603,
0.044898953288793564,
0.004350714851170778,
-0.013392960652709007,
0.06401839852333069,
0.06933358311653137,
-0.11029591411352158,
-0.032753556966781616,
-0.024227159097790718,
-0.08324852585792542,
0.07242018729448318,
-0.09464021027088165,
0.04510653018951416,
-0.19161643087863922,
-0.03341130539774895,
0.04895780235528946,
0.06587915122509003,
-0.003518822602927685,
-0.02106628008186817,
0.09844736754894257,
-0.07420367002487183,
0.04550398886203766,
-0.05749610438942909,
-0.023689400404691696,
-0.061869945377111435,
0.11284130066633224,
-0.08427777886390686,
0.09290680289268494,
-0.15307174623012543,
0.041453663259744644,
-0.07665122300386429,
0.013973030261695385,
-0.05626176297664642,
-0.07523717731237411,
-0.04507468268275261,
0.15355731546878815,
0.008108905516564846,
-0.06740334630012512,
-0.14346280694007874,
0.046696294099092484,
-0.04539621248841286,
0.09800661355257034,
-0.12722671031951904,
-0.058843065053224564,
0.17224109172821045,
-0.07735142111778259,
-0.1869661957025528,
0.05597550794482231,
0.013961449265480042,
0.05645507201552391,
0.013563055545091629,
0.22008489072322845,
0.05232677236199379,
-0.14414089918136597,
0.03595493361353874,
0.14881235361099243,
-0.13480418920516968,
-0.1392943114042282,
0.07309679687023163,
0.005714767146855593,
-0.08493904769420624,
-0.008863232098519802,
0.007384501863270998,
0.10543231666088104,
-0.05483245849609375,
-0.04142352193593979,
-0.04210564121603966,
-0.00836840271949768,
0.0716710314154625,
0.03725959360599518,
0.08052113652229309,
-0.09881141036748886,
-0.056291595101356506,
0.05348818004131317,
0.0003750566393136978,
0.09270141273736954,
0.0430111289024353,
-0.050984617322683334,
0.10983968526124954,
0.001625535194762051,
-0.039218783378601074,
-0.12669385969638824,
-0.08412294834852219,
-0.041719261556863785,
0.06811828911304474,
0.0077491686679422855,
0.28311729431152344,
0.06907472014427185,
-0.08319556713104248,
-0.017015645280480385,
-0.006065546069294214,
0.0952838882803917,
0.07185361534357071,
-0.022340446710586548,
-0.09672259539365768,
-0.0047282068990170956,
-0.06517353653907776,
-0.08806779235601425,
-0.04047058895230293,
0.026650212705135345,
0.13822844624519348,
0.12135589867830276,
-0.02114042267203331,
0.06477739661931992,
-0.0375421978533268,
0.01081872172653675,
-0.04319320619106293,
-0.01595330238342285,
0.08564510196447372,
0.023633481934666634,
-0.0436147004365921,
0.22958077490329742,
-0.08547714352607727,
0.3939151465892792,
0.22446800768375397,
-0.1657782644033432,
-0.031030558049678802,
0.06123265251517296,
-0.051767002791166306,
0.03748093545436859,
0.06632548570632935,
-0.0631106048822403,
-0.040214505046606064,
-0.05183693394064903,
0.1000824049115181,
-0.04166794940829277,
-0.0760674849152565,
0.005690592806786299,
-0.03536844626069069,
-0.07515683025121689,
0.046796247363090515,
0.07047999650239944,
-0.2172105759382248,
0.19383907318115234,
0.3774738013744354,
0.05013132467865944,
0.14449995756149292,
-0.03993186727166176,
0.012442865408957005,
-0.032865963876247406,
-0.03054577298462391,
-0.04821619763970375,
0.10719674080610275,
-0.16407407820224762,
-0.037437137216329575,
0.07295597344636917,
0.006247274111956358,
0.034409135580062866,
-0.15331147611141205,
-0.10955478996038437,
0.04336130619049072,
0.029929859563708305,
-0.09580295532941818,
0.1635720133781433,
0.010531868785619736,
0.10101879388093948,
-0.012210089713335037,
-0.12064392119646072,
0.09066576510667801,
0.014052140526473522,
-0.05228690803050995,
0.09404075145721436,
-0.13077005743980408,
-0.2045978605747223,
-0.06662975996732712,
-0.06628576666116714,
0.073656365275383,
-0.01600862666964531,
0.1302538514137268,
-0.026411157101392746,
-0.0007316036499105394,
0.020127976313233376,
-0.026053689420223236,
-0.1503445953130722,
0.06380217522382736,
-0.08932972699403763,
0.02396491914987564,
-0.06136816740036011,
-0.10904357582330704,
-0.09035322815179825,
-0.014093480072915554,
-0.07301399111747742,
0.1116311103105545,
-0.05141761153936386,
0.0725216194987297,
0.09947564452886581,
-0.028806697577238083,
0.053319692611694336,
-0.06826431304216385,
0.20720802247524261,
-0.059331413358449936,
0.030779873952269554,
0.15294189751148224,
0.054447006434202194,
0.07210250198841095,
0.15391798317432404,
0.07699142396450043,
-0.03385859355330467,
-0.011553134769201279,
-0.05508512631058693,
-0.11758847534656525,
-0.1600302904844284,
-0.05947500467300415,
-0.1385406106710434,
0.0024164041969925165,
-0.005456132814288139,
0.08605465292930603,
0.13780304789543152,
0.03569338470697403,
0.028745347633957863,
-0.046719420701265335,
-0.05642664059996605,
0.056320443749427795,
0.2243598997592926,
-0.058458078652620316,
0.10283274203538895,
-0.08308514952659607,
-0.07471421360969543,
0.10732393711805344,
0.02800784818828106,
0.08840981125831604,
0.1018141582608223,
-0.031360968947410583,
0.10143575817346573,
0.2378867119550705,
0.11593813449144363,
0.08610434830188751,
-0.00764454435557127,
-0.058286938816308975,
-0.04331653192639351,
-0.03456863388419151,
0.03303278237581253,
0.0610777847468853,
0.10405449569225311,
-0.11127109825611115,
0.0004154023772571236,
-0.2587382197380066,
0.047090817242860794,
0.04537611082196236,
0.06576401740312576,
-0.15773995220661163,
0.014254134148359299,
0.06556764245033264,
0.011166803538799286,
-0.02129744365811348,
0.07596481591463089,
0.0803239718079567,
-0.07936139404773712,
0.03128942474722862,
-0.0033386985305696726,
0.07891964167356491,
0.141982302069664,
0.08221472799777985,
0.026959864422678947,
-0.14622759819030762,
0.018382223322987556,
0.05415626987814903,
-0.29122474789619446,
0.2603740990161896,
-0.016298852860927582,
-0.08506055176258087,
-0.017716290429234505,
-0.051800116896629333,
0.02599170058965683,
0.17704296112060547,
0.11179764568805695,
0.047093238681554794,
-0.08303944021463394,
-0.09443166106939316,
0.06625372916460037,
0.006351374089717865,
0.06221204996109009,
-0.048077523708343506,
-0.039894506335258484,
-0.038912419229745865,
-0.006282614544034004,
0.03132876753807068,
0.1985524743795395,
-0.00042567605851218104,
-0.09110472351312637,
0.05488983541727066,
0.033157818019390106,
-0.012798459269106388,
-0.04242902994155884,
-0.040151726454496384,
-0.1195252388715744,
0.07019954174757004,
-0.012865433469414711,
-0.01390661671757698,
-0.11006497591733932,
-0.1635427623987198,
0.0876099169254303,
-0.054369423538446426,
0.06406673043966293,
-0.03894845396280289,
-0.07413246482610703,
-0.0853787437081337,
-0.1807815432548523,
0.14632569253444672,
-0.10285159945487976,
0.005677650682628155,
-0.07375694811344147,
0.1498737931251526,
-0.10922247916460037,
0.07183880358934402,
0.012159033678472042,
0.04959312453866005,
-0.11737602204084396,
-0.09297776967287064,
0.02451600506901741,
-0.09403116255998611,
0.04706693813204765,
-0.09736547619104385,
-0.04247691482305527,
0.05206327140331268,
0.07581953704357147,
-0.009907481260597706,
0.18349303305149078,
0.22155191004276276,
-0.12401971220970154,
0.1636885553598404,
0.07030263543128967,
-0.02478925697505474,
-0.2698127031326294,
-0.08735955506563187,
-0.18858814239501953,
-0.037029486149549484,
0.09425146132707596,
-0.04421720281243324,
0.016336563974618912,
0.026114607229828835,
-0.0652138963341713,
0.11724712699651718,
-0.24414044618606567,
-0.06941452622413635,
0.12150342017412186,
-0.030888468027114868,
0.3945734202861786,
-0.15146198868751526,
-0.043057575821876526,
0.05751958116889,
-0.23613524436950684,
0.15220879018306732,
-0.06982243061065674,
0.059932176023721695,
-0.024687664583325386,
0.009625919163227081,
0.012956679798662663,
-0.058646511286497116,
0.11852645874023438,
-0.028607038781046867,
0.03917904198169708,
-0.0995291993021965,
-0.14747604727745056,
0.15192294120788574,
-0.010339842177927494,
-0.0039353701286017895,
-0.0560183972120285,
0.0014862052630633116,
-0.14887772500514984,
0.02288043312728405,
-0.14148478209972382,
0.09924518316984177,
-0.013591835275292397,
-0.08339280635118484,
-0.06405390053987503,
0.030689138919115067,
-0.0006512186955660582,
-0.0655183494091034,
0.22220633924007416,
-0.01301235519349575,
0.23067015409469604,
0.09011214226484299,
-0.001718861167319119,
-0.16529352962970734,
-0.1028117686510086,
-0.0182523000985384,
-0.06786543875932693,
0.07782507687807083,
-0.14629456400871277,
0.00777800939977169,
0.10193872451782227,
0.0008055730140767992,
0.03828044235706329,
0.10511858016252518,
-0.018135055899620056,
-0.01799452118575573,
0.17199650406837463,
-0.17657452821731567,
-0.10651668906211853,
-0.0373256579041481,
0.004737530369311571,
0.11901739239692688,
0.04452887549996376,
0.08069372177124023,
-0.035874053835868835,
-0.00571437506005168,
0.006929844152182341,
-0.04200324788689613,
-0.10143157839775085,
-0.026781678199768066,
0.09098425507545471,
0.04007735103368759,
-0.08146607875823975,
-0.014658465050160885,
0.018745744600892067,
-0.14638830721378326,
-0.04493904858827591,
0.09320402890443802,
-0.0856664702296257,
-0.1383364200592041,
-0.03634100779891014,
-0.028726182878017426,
-0.1471620500087738,
0.005056874826550484,
0.021310318261384964,
-0.11028612405061722,
0.046193405985832214,
0.23917366564273834,
0.07942447811365128,
0.12003888934850693,
-0.00624223193153739,
-0.014540894888341427,
0.06047806888818741,
-0.03106739930808544,
-0.09156856685876846,
0.028075164183974266,
-0.13165976107120514,
0.06659668684005737,
-0.015198924578726292,
0.13404253125190735,
-0.09143192321062088,
-0.013847830705344677,
-0.16979002952575684,
0.01005585864186287,
-0.03643380478024483,
-0.11661911755800247,
-0.1018761694431305,
-0.0664786621928215,
0.03516659885644913,
-0.12889797985553741,
-0.06750722229480743,
-0.014512577094137669,
-0.1366327852010727,
0.030165666714310646,
0.040125224739313126,
0.08488902449607849,
-0.084175243973732,
-0.04602627828717232,
0.0952710434794426,
-0.007860500365495682,
0.07812531292438507,
0.0796566754579544,
-0.04677276313304901,
0.09265077859163284,
-0.0742633044719696,
-0.09033126384019852,
0.07081255316734314,
0.008014141581952572,
0.07401663064956665,
0.06573578715324402,
-0.005506083834916353,
0.033054232597351074,
0.011137601919472218,
0.060951173305511475,
-0.06692736595869064,
-0.09784119576215744,
0.002001343760639429,
0.012363482266664505,
-0.1226206123828888,
0.005838602315634489,
-0.0842478796839714,
0.16003870964050293,
-0.0031541911885142326,
0.09600365906953812,
0.014077738858759403,
0.019272007048130035,
-0.11455035954713821,
-0.0005564488237723708,
-0.031815413385629654,
-0.16153185069561005,
-0.020841067656874657,
-0.04984382167458534,
-0.014035704545676708,
-0.014970787800848484,
0.21103821694850922,
0.052751973271369934,
-0.12153859436511993,
0.06305060535669327,
0.06090157851576805,
0.014134160242974758,
-0.024568676948547363,
0.22517432272434235,
0.04472561925649643,
-0.04426772892475128,
-0.11429143697023392,
0.06277164816856384,
-0.031958941370248795,
-0.0988655537366867,
0.1007656380534172,
0.12398026138544083,
0.05795013904571533,
0.04358082637190819,
0.0883537083864212,
-0.016972195357084274,
-0.11261337250471115,
-0.2075732797384262,
0.04702022299170494,
0.05454430729150772,
-0.0723206102848053,
0.0690259262919426,
0.19904738664627075,
-0.04785175248980522,
0.054873671382665634,
-0.061130229383707047,
0.03585329279303551,
-0.14385195076465607,
-0.09793546795845032,
-0.02680368348956108,
-0.112136609852314,
0.00018097730935551226,
-0.025985533371567726,
0.05662226676940918,
0.1552053987979889,
0.05158377066254616,
-0.0050090644508600235,
0.013797912746667862,
0.034431930631399155,
-0.05215965583920479,
0.00695808045566082,
0.01798609457910061,
0.0237100999802351,
-0.097680002450943,
0.021873442456126213,
-0.0972055122256279,
-0.09044606238603592,
-0.07261839509010315,
0.003794738557189703,
-0.04609888419508934,
-0.02652757056057453,
-0.1192617192864418,
-0.08353544026613235,
-0.05491027981042862,
0.03775608912110329,
-0.020447196438908577,
0.09760937839746475,
0.0010796304559335113,
0.03233664855360985,
0.017549393698573112,
0.2191435843706131,
-0.07920964062213898,
-0.041005779057741165,
0.005028486717492342,
0.1873423457145691,
0.03238807991147041,
0.09078659117221832,
-0.00025332922814413905,
0.026695407927036285,
-0.06992001086473465,
0.21113532781600952,
0.3371700942516327,
-0.03200918808579445,
0.08686735481023788,
0.04957680404186249,
0.014318463392555714,
0.06563498079776764,
0.10330832004547119,
0.10295400023460388,
0.23794005811214447,
-0.11865480244159698,
0.003639715723693371,
-0.0632590800523758,
0.03249784931540489,
-0.060046788305044174,
0.04283567890524864,
0.03338109329342842,
-0.07934872806072235,
-0.02819124050438404,
0.06159615516662598,
-0.08641811460256577,
0.055679332464933395,
0.07948239147663116,
-0.20277303457260132,
-0.0322968028485775,
-0.012311164289712906,
0.1655222773551941,
-0.0081708999350667,
0.09296835213899612,
-0.058473922312259674,
-0.07403064519166946,
0.013284390792250633,
0.010752661153674126,
-0.24578692018985748,
-0.05873711407184601,
0.1534961313009262,
0.02293015830218792,
0.060544077306985855,
-0.06414378434419632,
0.025068385526537895,
0.09329359978437424,
0.06718573719263077,
-0.06130964681506157,
0.05137915909290314,
0.05769737809896469,
-0.09846795350313187,
-0.11652769893407822,
-0.10557594150304794,
0.024011142551898956,
-0.06896069645881653,
0.04533424600958824,
-0.14160817861557007,
0.049556341022253036,
-0.004323633853346109,
-0.011176121421158314,
-0.027610979974269867,
0.02008398436009884,
-0.029410140588879585,
0.08877939730882645,
0.033858463168144226,
-0.008892212063074112,
-0.047436561435461044,
-0.04151783511042595,
-0.06487058103084564,
0.10949592292308807,
-0.09368491172790527,
-0.118156798183918,
0.02916806749999523,
-0.047728683799505234,
0.015931548550724983,
-0.028221987187862396,
-0.07404676824808121,
-0.07388848811388016,
-0.03162632882595062,
0.030543042346835136,
-0.11053670197725296,
0.04174854978919029,
0.08706703037023544,
0.030031858012080193,
0.012773646973073483,
-0.06605713814496994,
0.01675252430140972,
0.042071882635354996,
-0.13691921532154083,
-0.03748135268688202
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish BERT model
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources a cased model for Turkish 🎉
# 🇹🇷 BERTurk
BERTurk is a community-driven cased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
## Stats
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train a cased model
on a TPU v3-8 for 2M steps.
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| --------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-turkish-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/vocab.txt)
## Usage
With Transformers >= 2.3, our BERTurk cased model can be loaded like this:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-cased")
```
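Since BERTurk is pretrained with a masked-language-modeling objective, a quick sanity check is to let the model fill in a masked token. Below is a minimal sketch using the `fill-mask` pipeline, assuming the checkpoint ships with its MLM head; the example sentence is an arbitrary choice:

```python
from transformers import pipeline

# Load the cased BERTurk checkpoint into a fill-mask pipeline.
unmasker = pipeline("fill-mask", model="dbmdz/bert-base-turkish-cased")

# BERT-style models use [MASK] as the placeholder token.
for prediction in unmasker("Bugün hava çok [MASK]."):
    print(prediction["token_str"], prediction["score"])
```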
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models, just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/bert-base-turkish-cased
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"tr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us
|
dbmdz Turkish BERT model
========================
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources a cased model for Turkish
🇹🇷 BERTurk
==========
BERTurk is a community-driven cased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
Stats
-----
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train a cased model
on a TPU v3-8 for 2M steps.
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
Usage
-----
With Transformers >= 2.3, our BERTurk cased model can be loaded like this:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT models, just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC.
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
40
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
0.018921364098787308,
-0.00019106945546809584,
-0.006394296884536743,
0.03337884694337845,
0.054041218012571335,
0.031650494784116745,
0.06662475317716599,
0.10807959735393524,
0.05830428749322891,
-0.019409330561757088,
0.13395871222019196,
0.18363657593727112,
-0.042950283735990524,
0.03756747394800186,
-0.033980563282966614,
-0.23781342804431915,
0.0514480322599411,
0.052651047706604004,
-0.07216320186853409,
0.11049067229032516,
0.07754318416118622,
-0.07751429826021194,
0.05530031770467758,
-0.009757393039762974,
-0.12986992299556732,
0.03262612968683243,
0.04725905507802963,
-0.07637171447277069,
0.1480608433485031,
0.044089175760746,
0.12649399042129517,
0.09060006588697433,
-0.03512771427631378,
-0.0813177078962326,
0.034379344433546066,
0.012670991010963917,
-0.12737680971622467,
0.04784052446484566,
0.0038243152666836977,
-0.03693895414471626,
0.13288435339927673,
0.05445465072989464,
0.0063299741595983505,
0.03950441628694534,
-0.16893337666988373,
-0.2548118233680725,
-0.07381822168827057,
0.08884354680776596,
-0.02052173763513565,
0.04091643914580345,
0.03058725595474243,
0.20716851949691772,
-0.13842296600341797,
0.05934154987335205,
0.19126549363136292,
-0.39465510845184326,
-0.011868304572999477,
0.16808973252773285,
0.12622633576393127,
0.032379381358623505,
-0.06192849576473236,
0.06399940699338913,
0.05275721848011017,
0.019478803500533104,
0.12466363608837128,
-0.07277891039848328,
-0.04720394313335419,
0.10058879852294922,
-0.10866198688745499,
-0.08265569061040878,
0.22826968133449554,
-0.020683445036411285,
0.03560272604227066,
0.03850972279906273,
-0.07584118098020554,
-0.07571636140346527,
0.025144880637526512,
-0.022743092849850655,
0.0026006216648966074,
0.07900294661521912,
-0.01369452103972435,
-0.04454221948981285,
-0.15518058836460114,
0.025484146550297737,
-0.22992919385433197,
0.13037264347076416,
-0.0037297243252396584,
0.08252513408660889,
-0.17930646240711212,
0.0790943130850792,
-0.019592057913541794,
-0.07786344736814499,
0.03198960795998573,
-0.09400814026594162,
0.05977398529648781,
0.004003958310931921,
-0.05471600219607353,
0.07530807703733444,
0.05432100594043732,
0.14611782133579254,
0.012070882134139538,
-0.01876658946275711,
0.017675457522273064,
0.12192820757627487,
-0.024760860949754715,
0.04574089124798775,
-0.015430393628776073,
0.02416994422674179,
0.01457999087870121,
-0.11165327578783035,
-0.008331339806318283,
-0.04006664454936981,
-0.13114413619041443,
-0.053011540323495865,
-0.00046386977192014456,
0.06673956662416458,
0.05116226524114609,
0.05029689148068428,
-0.04164477437734604,
0.04405054822564125,
0.0799972265958786,
-0.014050222001969814,
0.007079362403601408,
-0.018930165097117424,
0.07033229619264603,
0.044898953288793564,
0.004350714851170778,
-0.013392960652709007,
0.06401839852333069,
0.06933358311653137,
-0.11029591411352158,
-0.032753556966781616,
-0.024227159097790718,
-0.08324852585792542,
0.07242018729448318,
-0.09464021027088165,
0.04510653018951416,
-0.19161643087863922,
-0.03341130539774895,
0.04895780235528946,
0.06587915122509003,
-0.003518822602927685,
-0.02106628008186817,
0.09844736754894257,
-0.07420367002487183,
0.04550398886203766,
-0.05749610438942909,
-0.023689400404691696,
-0.061869945377111435,
0.11284130066633224,
-0.08427777886390686,
0.09290680289268494,
-0.15307174623012543,
0.041453663259744644,
-0.07665122300386429,
0.013973030261695385,
-0.05626176297664642,
-0.07523717731237411,
-0.04507468268275261,
0.15355731546878815,
0.008108905516564846,
-0.06740334630012512,
-0.14346280694007874,
0.046696294099092484,
-0.04539621248841286,
0.09800661355257034,
-0.12722671031951904,
-0.058843065053224564,
0.17224109172821045,
-0.07735142111778259,
-0.1869661957025528,
0.05597550794482231,
0.013961449265480042,
0.05645507201552391,
0.013563055545091629,
0.22008489072322845,
0.05232677236199379,
-0.14414089918136597,
0.03595493361353874,
0.14881235361099243,
-0.13480418920516968,
-0.1392943114042282,
0.07309679687023163,
0.005714767146855593,
-0.08493904769420624,
-0.008863232098519802,
0.007384501863270998,
0.10543231666088104,
-0.05483245849609375,
-0.04142352193593979,
-0.04210564121603966,
-0.00836840271949768,
0.0716710314154625,
0.03725959360599518,
0.08052113652229309,
-0.09881141036748886,
-0.056291595101356506,
0.05348818004131317,
0.0003750566393136978,
0.09270141273736954,
0.0430111289024353,
-0.050984617322683334,
0.10983968526124954,
0.001625535194762051,
-0.039218783378601074,
-0.12669385969638824,
-0.08412294834852219,
-0.041719261556863785,
0.06811828911304474,
0.0077491686679422855,
0.28311729431152344,
0.06907472014427185,
-0.08319556713104248,
-0.017015645280480385,
-0.006065546069294214,
0.0952838882803917,
0.07185361534357071,
-0.022340446710586548,
-0.09672259539365768,
-0.0047282068990170956,
-0.06517353653907776,
-0.08806779235601425,
-0.04047058895230293,
0.026650212705135345,
0.13822844624519348,
0.12135589867830276,
-0.02114042267203331,
0.06477739661931992,
-0.0375421978533268,
0.01081872172653675,
-0.04319320619106293,
-0.01595330238342285,
0.08564510196447372,
0.023633481934666634,
-0.0436147004365921,
0.22958077490329742,
-0.08547714352607727,
0.3939151465892792,
0.22446800768375397,
-0.1657782644033432,
-0.031030558049678802,
0.06123265251517296,
-0.051767002791166306,
0.03748093545436859,
0.06632548570632935,
-0.0631106048822403,
-0.040214505046606064,
-0.05183693394064903,
0.1000824049115181,
-0.04166794940829277,
-0.0760674849152565,
0.005690592806786299,
-0.03536844626069069,
-0.07515683025121689,
0.046796247363090515,
0.07047999650239944,
-0.2172105759382248,
0.19383907318115234,
0.3774738013744354,
0.05013132467865944,
0.14449995756149292,
-0.03993186727166176,
0.012442865408957005,
-0.032865963876247406,
-0.03054577298462391,
-0.04821619763970375,
0.10719674080610275,
-0.16407407820224762,
-0.037437137216329575,
0.07295597344636917,
0.006247274111956358,
0.034409135580062866,
-0.15331147611141205,
-0.10955478996038437,
0.04336130619049072,
0.029929859563708305,
-0.09580295532941818,
0.1635720133781433,
0.010531868785619736,
0.10101879388093948,
-0.012210089713335037,
-0.12064392119646072,
0.09066576510667801,
0.014052140526473522,
-0.05228690803050995,
0.09404075145721436,
-0.13077005743980408,
-0.2045978605747223,
-0.06662975996732712,
-0.06628576666116714,
0.073656365275383,
-0.01600862666964531,
0.1302538514137268,
-0.026411157101392746,
-0.0007316036499105394,
0.020127976313233376,
-0.026053689420223236,
-0.1503445953130722,
0.06380217522382736,
-0.08932972699403763,
0.02396491914987564,
-0.06136816740036011,
-0.10904357582330704,
-0.09035322815179825,
-0.014093480072915554,
-0.07301399111747742,
0.1116311103105545,
-0.05141761153936386,
0.0725216194987297,
0.09947564452886581,
-0.028806697577238083,
0.053319692611694336,
-0.06826431304216385,
0.20720802247524261,
-0.059331413358449936,
0.030779873952269554,
0.15294189751148224,
0.054447006434202194,
0.07210250198841095,
0.15391798317432404,
0.07699142396450043,
-0.03385859355330467,
-0.011553134769201279,
-0.05508512631058693,
-0.11758847534656525,
-0.1600302904844284,
-0.05947500467300415,
-0.1385406106710434,
0.0024164041969925165,
-0.005456132814288139,
0.08605465292930603,
0.13780304789543152,
0.03569338470697403,
0.028745347633957863,
-0.046719420701265335,
-0.05642664059996605,
0.056320443749427795,
0.2243598997592926,
-0.058458078652620316,
0.10283274203538895,
-0.08308514952659607,
-0.07471421360969543,
0.10732393711805344,
0.02800784818828106,
0.08840981125831604,
0.1018141582608223,
-0.031360968947410583,
0.10143575817346573,
0.2378867119550705,
0.11593813449144363,
0.08610434830188751,
-0.00764454435557127,
-0.058286938816308975,
-0.04331653192639351,
-0.03456863388419151,
0.03303278237581253,
0.0610777847468853,
0.10405449569225311,
-0.11127109825611115,
0.0004154023772571236,
-0.2587382197380066,
0.047090817242860794,
0.04537611082196236,
0.06576401740312576,
-0.15773995220661163,
0.014254134148359299,
0.06556764245033264,
0.011166803538799286,
-0.02129744365811348,
0.07596481591463089,
0.0803239718079567,
-0.07936139404773712,
0.03128942474722862,
-0.0033386985305696726,
0.07891964167356491,
0.141982302069664,
0.08221472799777985,
0.026959864422678947,
-0.14622759819030762,
0.018382223322987556,
0.05415626987814903,
-0.29122474789619446,
0.2603740990161896,
-0.016298852860927582,
-0.08506055176258087,
-0.017716290429234505,
-0.051800116896629333,
0.02599170058965683,
0.17704296112060547,
0.11179764568805695,
0.047093238681554794,
-0.08303944021463394,
-0.09443166106939316,
0.06625372916460037,
0.006351374089717865,
0.06221204996109009,
-0.048077523708343506,
-0.039894506335258484,
-0.038912419229745865,
-0.006282614544034004,
0.03132876753807068,
0.1985524743795395,
-0.00042567605851218104,
-0.09110472351312637,
0.05488983541727066,
0.033157818019390106,
-0.012798459269106388,
-0.04242902994155884,
-0.040151726454496384,
-0.1195252388715744,
0.07019954174757004,
-0.012865433469414711,
-0.01390661671757698,
-0.11006497591733932,
-0.1635427623987198,
0.0876099169254303,
-0.054369423538446426,
0.06406673043966293,
-0.03894845396280289,
-0.07413246482610703,
-0.0853787437081337,
-0.1807815432548523,
0.14632569253444672,
-0.10285159945487976,
0.005677650682628155,
-0.07375694811344147,
0.1498737931251526,
-0.10922247916460037,
0.07183880358934402,
0.012159033678472042,
0.04959312453866005,
-0.11737602204084396,
-0.09297776967287064,
0.02451600506901741,
-0.09403116255998611,
0.04706693813204765,
-0.09736547619104385,
-0.04247691482305527,
0.05206327140331268,
0.07581953704357147,
-0.009907481260597706,
0.18349303305149078,
0.22155191004276276,
-0.12401971220970154,
0.1636885553598404,
0.07030263543128967,
-0.02478925697505474,
-0.2698127031326294,
-0.08735955506563187,
-0.18858814239501953,
-0.037029486149549484,
0.09425146132707596,
-0.04421720281243324,
0.016336563974618912,
0.026114607229828835,
-0.0652138963341713,
0.11724712699651718,
-0.24414044618606567,
-0.06941452622413635,
0.12150342017412186,
-0.030888468027114868,
0.3945734202861786,
-0.15146198868751526,
-0.043057575821876526,
0.05751958116889,
-0.23613524436950684,
0.15220879018306732,
-0.06982243061065674,
0.059932176023721695,
-0.024687664583325386,
0.009625919163227081,
0.012956679798662663,
-0.058646511286497116,
0.11852645874023438,
-0.028607038781046867,
0.03917904198169708,
-0.0995291993021965,
-0.14747604727745056,
0.15192294120788574,
-0.010339842177927494,
-0.0039353701286017895,
-0.0560183972120285,
0.0014862052630633116,
-0.14887772500514984,
0.02288043312728405,
-0.14148478209972382,
0.09924518316984177,
-0.013591835275292397,
-0.08339280635118484,
-0.06405390053987503,
0.030689138919115067,
-0.0006512186955660582,
-0.0655183494091034,
0.22220633924007416,
-0.01301235519349575,
0.23067015409469604,
0.09011214226484299,
-0.001718861167319119,
-0.16529352962970734,
-0.1028117686510086,
-0.0182523000985384,
-0.06786543875932693,
0.07782507687807083,
-0.14629456400871277,
0.00777800939977169,
0.10193872451782227,
0.0008055730140767992,
0.03828044235706329,
0.10511858016252518,
-0.018135055899620056,
-0.01799452118575573,
0.17199650406837463,
-0.17657452821731567,
-0.10651668906211853,
-0.0373256579041481,
0.004737530369311571,
0.11901739239692688,
0.04452887549996376,
0.08069372177124023,
-0.035874053835868835,
-0.00571437506005168,
0.006929844152182341,
-0.04200324788689613,
-0.10143157839775085,
-0.026781678199768066,
0.09098425507545471,
0.04007735103368759,
-0.08146607875823975,
-0.014658465050160885,
0.018745744600892067,
-0.14638830721378326,
-0.04493904858827591,
0.09320402890443802,
-0.0856664702296257,
-0.1383364200592041,
-0.03634100779891014,
-0.028726182878017426,
-0.1471620500087738,
0.005056874826550484,
0.021310318261384964,
-0.11028612405061722,
0.046193405985832214,
0.23917366564273834,
0.07942447811365128,
0.12003888934850693,
-0.00624223193153739,
-0.014540894888341427,
0.06047806888818741,
-0.03106739930808544,
-0.09156856685876846,
0.028075164183974266,
-0.13165976107120514,
0.06659668684005737,
-0.015198924578726292,
0.13404253125190735,
-0.09143192321062088,
-0.013847830705344677,
-0.16979002952575684,
0.01005585864186287,
-0.03643380478024483,
-0.11661911755800247,
-0.1018761694431305,
-0.0664786621928215,
0.03516659885644913,
-0.12889797985553741,
-0.06750722229480743,
-0.014512577094137669,
-0.1366327852010727,
0.030165666714310646,
0.040125224739313126,
0.08488902449607849,
-0.084175243973732,
-0.04602627828717232,
0.0952710434794426,
-0.007860500365495682,
0.07812531292438507,
0.0796566754579544,
-0.04677276313304901,
0.09265077859163284,
-0.0742633044719696,
-0.09033126384019852,
0.07081255316734314,
0.008014141581952572,
0.07401663064956665,
0.06573578715324402,
-0.005506083834916353,
0.033054232597351074,
0.011137601919472218,
0.060951173305511475,
-0.06692736595869064,
-0.09784119576215744,
0.002001343760639429,
0.012363482266664505,
-0.1226206123828888,
0.005838602315634489,
-0.0842478796839714,
0.16003870964050293,
-0.0031541911885142326,
0.09600365906953812,
0.014077738858759403,
0.019272007048130035,
-0.11455035954713821,
-0.0005564488237723708,
-0.031815413385629654,
-0.16153185069561005,
-0.020841067656874657,
-0.04984382167458534,
-0.014035704545676708,
-0.014970787800848484,
0.21103821694850922,
0.052751973271369934,
-0.12153859436511993,
0.06305060535669327,
0.06090157851576805,
0.014134160242974758,
-0.024568676948547363,
0.22517432272434235,
0.04472561925649643,
-0.04426772892475128,
-0.11429143697023392,
0.06277164816856384,
-0.031958941370248795,
-0.0988655537366867,
0.1007656380534172,
0.12398026138544083,
0.05795013904571533,
0.04358082637190819,
0.0883537083864212,
-0.016972195357084274,
-0.11261337250471115,
-0.2075732797384262,
0.04702022299170494,
0.05454430729150772,
-0.0723206102848053,
0.0690259262919426,
0.19904738664627075,
-0.04785175248980522,
0.054873671382665634,
-0.061130229383707047,
0.03585329279303551,
-0.14385195076465607,
-0.09793546795845032,
-0.02680368348956108,
-0.112136609852314,
0.00018097730935551226,
-0.025985533371567726,
0.05662226676940918,
0.1552053987979889,
0.05158377066254616,
-0.0050090644508600235,
0.013797912746667862,
0.034431930631399155,
-0.05215965583920479,
0.00695808045566082,
0.01798609457910061,
0.0237100999802351,
-0.097680002450943,
0.021873442456126213,
-0.0972055122256279,
-0.09044606238603592,
-0.07261839509010315,
0.003794738557189703,
-0.04609888419508934,
-0.02652757056057453,
-0.1192617192864418,
-0.08353544026613235,
-0.05491027981042862,
0.03775608912110329,
-0.020447196438908577,
0.09760937839746475,
0.0010796304559335113,
0.03233664855360985,
0.017549393698573112,
0.2191435843706131,
-0.07920964062213898,
-0.041005779057741165,
0.005028486717492342,
0.1873423457145691,
0.03238807991147041,
0.09078659117221832,
-0.00025332922814413905,
0.026695407927036285,
-0.06992001086473465,
0.21113532781600952,
0.3371700942516327,
-0.03200918808579445,
0.08686735481023788,
0.04957680404186249,
0.014318463392555714,
0.06563498079776764,
0.10330832004547119,
0.10295400023460388,
0.23794005811214447,
-0.11865480244159698,
0.003639715723693371,
-0.0632590800523758,
0.03249784931540489,
-0.060046788305044174,
0.04283567890524864,
0.03338109329342842,
-0.07934872806072235,
-0.02819124050438404,
0.06159615516662598,
-0.08641811460256577,
0.055679332464933395,
0.07948239147663116,
-0.20277303457260132,
-0.0322968028485775,
-0.012311164289712906,
0.1655222773551941,
-0.0081708999350667,
0.09296835213899612,
-0.058473922312259674,
-0.07403064519166946,
0.013284390792250633,
0.010752661153674126,
-0.24578692018985748,
-0.05873711407184601,
0.1534961313009262,
0.02293015830218792,
0.060544077306985855,
-0.06414378434419632,
0.025068385526537895,
0.09329359978437424,
0.06718573719263077,
-0.06130964681506157,
0.05137915909290314,
0.05769737809896469,
-0.09846795350313187,
-0.11652769893407822,
-0.10557594150304794,
0.024011142551898956,
-0.06896069645881653,
0.04533424600958824,
-0.14160817861557007,
0.049556341022253036,
-0.004323633853346109,
-0.011176121421158314,
-0.027610979974269867,
0.02008398436009884,
-0.029410140588879585,
0.08877939730882645,
0.033858463168144226,
-0.008892212063074112,
-0.047436561435461044,
-0.04151783511042595,
-0.06487058103084564,
0.10949592292308807,
-0.09368491172790527,
-0.118156798183918,
0.02916806749999523,
-0.047728683799505234,
0.015931548550724983,
-0.028221987187862396,
-0.07404676824808121,
-0.07388848811388016,
-0.03162632882595062,
0.030543042346835136,
-0.11053670197725296,
0.04174854978919029,
0.08706703037023544,
0.030031858012080193,
0.012773646973073483,
-0.06605713814496994,
0.01675252430140972,
0.042071882635354996,
-0.13691921532154083,
-0.03748135268688202
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish BERT model
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources an uncased model for Turkish 🎉
# 🇹🇷 BERTurk
BERTurk is a community-driven uncased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
## Stats
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train an uncased model
on a TPU v3-8 for 2M steps.
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| --------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-turkish-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-uncased/vocab.txt)
## Usage
With Transformers >= 2.3, our BERTurk uncased model can be loaded like this:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-uncased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-uncased")
```
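To see how the uncased WordPiece vocabulary segments Turkish text, the tokenizer can be inspected directly. Below is a minimal sketch; the sentence is an arbitrary choice and the exact splits depend on the shipped vocabulary:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-uncased")

# The uncased tokenizer lower-cases input before WordPiece segmentation.
print(tokenizer.tokenize("İstanbul Boğazı'nda yürüyüş yaptık."))

# Encode to vocabulary ids and map back, including the [CLS]/[SEP] special tokens.
encoded = tokenizer("İstanbul Boğazı'nda yürüyüş yaptık.")
print(encoded["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
```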
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models, just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/bert-base-turkish-uncased
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"tr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us
|
dbmdz Turkish BERT model
========================
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources an uncased model for Turkish
🇹🇷 BERTurk
==========
BERTurk is a community-driven uncased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
Stats
-----
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and contains 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train an uncased model
on a TPU v3-8 for 2M steps.
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
Usage
-----
With Transformers >= 2.3, our BERTurk uncased model can be loaded like this:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT models, just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC.
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
40
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #tr #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
0.018921364098787308,
-0.00019106945546809584,
-0.006394296884536743,
0.03337884694337845,
0.054041218012571335,
0.031650494784116745,
0.06662475317716599,
0.10807959735393524,
0.05830428749322891,
-0.019409330561757088,
0.13395871222019196,
0.18363657593727112,
-0.042950283735990524,
0.03756747394800186,
-0.033980563282966614,
-0.23781342804431915,
0.0514480322599411,
0.052651047706604004,
-0.07216320186853409,
0.11049067229032516,
0.07754318416118622,
-0.07751429826021194,
0.05530031770467758,
-0.009757393039762974,
-0.12986992299556732,
0.03262612968683243,
0.04725905507802963,
-0.07637171447277069,
0.1480608433485031,
0.044089175760746,
0.12649399042129517,
0.09060006588697433,
-0.03512771427631378,
-0.0813177078962326,
0.034379344433546066,
0.012670991010963917,
-0.12737680971622467,
0.04784052446484566,
0.0038243152666836977,
-0.03693895414471626,
0.13288435339927673,
0.05445465072989464,
0.0063299741595983505,
0.03950441628694534,
-0.16893337666988373,
-0.2548118233680725,
-0.07381822168827057,
0.08884354680776596,
-0.02052173763513565,
0.04091643914580345,
0.03058725595474243,
0.20716851949691772,
-0.13842296600341797,
0.05934154987335205,
0.19126549363136292,
-0.39465510845184326,
-0.011868304572999477,
0.16808973252773285,
0.12622633576393127,
0.032379381358623505,
-0.06192849576473236,
0.06399940699338913,
0.05275721848011017,
0.019478803500533104,
0.12466363608837128,
-0.07277891039848328,
-0.04720394313335419,
0.10058879852294922,
-0.10866198688745499,
-0.08265569061040878,
0.22826968133449554,
-0.020683445036411285,
0.03560272604227066,
0.03850972279906273,
-0.07584118098020554,
-0.07571636140346527,
0.025144880637526512,
-0.022743092849850655,
0.0026006216648966074,
0.07900294661521912,
-0.01369452103972435,
-0.04454221948981285,
-0.15518058836460114,
0.025484146550297737,
-0.22992919385433197,
0.13037264347076416,
-0.0037297243252396584,
0.08252513408660889,
-0.17930646240711212,
0.0790943130850792,
-0.019592057913541794,
-0.07786344736814499,
0.03198960795998573,
-0.09400814026594162,
0.05977398529648781,
0.004003958310931921,
-0.05471600219607353,
0.07530807703733444,
0.05432100594043732,
0.14611782133579254,
0.012070882134139538,
-0.01876658946275711,
0.017675457522273064,
0.12192820757627487,
-0.024760860949754715,
0.04574089124798775,
-0.015430393628776073,
0.02416994422674179,
0.01457999087870121,
-0.11165327578783035,
-0.008331339806318283,
-0.04006664454936981,
-0.13114413619041443,
-0.053011540323495865,
-0.00046386977192014456,
0.06673956662416458,
0.05116226524114609,
0.05029689148068428,
-0.04164477437734604,
0.04405054822564125,
0.0799972265958786,
-0.014050222001969814,
0.007079362403601408,
-0.018930165097117424,
0.07033229619264603,
0.044898953288793564,
0.004350714851170778,
-0.013392960652709007,
0.06401839852333069,
0.06933358311653137,
-0.11029591411352158,
-0.032753556966781616,
-0.024227159097790718,
-0.08324852585792542,
0.07242018729448318,
-0.09464021027088165,
0.04510653018951416,
-0.19161643087863922,
-0.03341130539774895,
0.04895780235528946,
0.06587915122509003,
-0.003518822602927685,
-0.02106628008186817,
0.09844736754894257,
-0.07420367002487183,
0.04550398886203766,
-0.05749610438942909,
-0.023689400404691696,
-0.061869945377111435,
0.11284130066633224,
-0.08427777886390686,
0.09290680289268494,
-0.15307174623012543,
null | null |
transformers
|
# Historic Language Models (HLMs)
## Languages
Our Historic Language Models Zoo supports the following languages, incl. their training data source:
| Language | Training data | Size
| -------- | ------------- | ----
| German | [Europeana](http://www.europeana-newspapers.eu/) | 13-28GB (filtered)
| French | [Europeana](http://www.europeana-newspapers.eu/) | 11-31GB (filtered)
| English | [British Library](https://data.bl.uk/digbks/db14.html) | 24GB (year filtered)
| Finnish | [Europeana](http://www.europeana-newspapers.eu/) | 1.2GB
| Swedish | [Europeana](http://www.europeana-newspapers.eu/) | 1.1GB
## Models
At the moment, the following models are available on the model hub:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-multilingual-cased)
| `dbmdz/bert-base-historic-english-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-english-cased)
| `dbmdz/bert-base-finnish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-finnish-europeana-cased)
| `dbmdz/bert-base-swedish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-swedish-europeana-cased)
We also released smaller models for the multilingual model:
| Model identifier | Model Hub link
| ----------------------------------------------- | ---------------------------------------------------------------------------
| `dbmdz/bert-tiny-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-tiny-historic-multilingual-cased)
| `dbmdz/bert-mini-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-mini-historic-multilingual-cased)
| `dbmdz/bert-small-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-small-historic-multilingual-cased)
| `dbmdz/bert-medium-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-medium-historic-multilingual-cased)
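All checkpoints can be used directly with the [Transformers](https://github.com/huggingface/transformers) library. As a minimal sketch - the model identifier and the example sentence (taken from the widget examples of this card) are just one possible choice - a masked token in historic text can be filled like this:
```python
from transformers import pipeline

# Any of the model identifiers above works here; we pick the medium model.
fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/bert-medium-historic-multilingual-cased",
)

# Long-s spellings such as "reafon" are typical of the historic training data.
print(fill_mask("and I cannot conceive the reafon why [MASK] hath"))
```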
**Notice**: We have previously released language models for Historic German and French that were trained on noisier data - see
[this repo](https://github.com/stefan-it/europeana-bert) for more information:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-german-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-german-europeana-cased)
| `dbmdz/bert-base-french-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-french-europeana-cased)
# Corpora Stats
## German Europeana Corpus
We provide some statistics using different OCR confidence thresholds, which we use to shrink the corpus size
and reduce noise:
| OCR confidence | Size
| -------------- | ----
| **0.60** | 28GB
| 0.65 | 18GB
| 0.70 | 13GB
For the final corpus we use an OCR confidence threshold of 0.6 (28GB). The following plot shows the tokens-per-year distribution:

## French Europeana Corpus
As for German, we use different OCR confidence thresholds:
| OCR confidence | Size
| -------------- | ----
| 0.60 | 31GB
| 0.65 | 27GB
| **0.70** | 27GB
| 0.75 | 23GB
| 0.80 | 11GB
For the final corpus we use an OCR confidence threshold of 0.7 (27GB). The following plot shows the tokens-per-year distribution:

## British Library Corpus
Metadata is taken from [here](https://data.bl.uk/digbks/DB21.html). Stats incl. year filtering:
| Years | Size
| ----------------- | ----
| ALL | 24GB
| >= 1800 && < 1900 | 24GB
We use the year-filtered variant. The following plot shows the tokens-per-year distribution:

## Finnish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.2GB
The following plot shows the tokens-per-year distribution:

## Swedish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.1GB
The following plot shows the tokens-per-year distribution:

## All Corpora
The following plot shows the tokens-per-year distribution of the complete training corpus:

# Multilingual Vocab generation
For the first attempt, we use the first 10GB of each pretraining corpus. We upsample both Finnish and Swedish to ~10GB.
The following table shows the exact sizes used for generating the 32k and 64k subword vocabs (a minimal tokenizer-training sketch follows the table):
| Language | Size
| -------- | ----
| German | 10GB
| French | 10GB
| English | 10GB
| Finnish | 9.5GB
| Swedish | 9.7GB
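With the [tokenizers](https://github.com/huggingface/tokenizers) library, a cased 32k WordPiece vocab over these corpora can be built roughly as follows. The input file names are hypothetical, and the original vocab may have been generated with different settings:
```python
from tokenizers import BertWordPieceTokenizer

# Cased vocab: keep casing and accents, as in the released models.
tokenizer = BertWordPieceTokenizer(lowercase=False, strip_accents=False)
tokenizer.train(
    files=[
        "german_10gb.txt", "french_10gb.txt", "english_10gb.txt",
        "finnish_upsampled.txt", "swedish_upsampled.txt",  # hypothetical names
    ],
    vocab_size=32_000,  # use 64_000 for the larger vocab
    min_frequency=2,
)
tokenizer.save_model(".")  # writes vocab.txt
```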
We then calculate the subword fertility rate and the portion of `[UNK]`s over the following NER corpora (a minimal computation sketch follows the table):
| Language | NER corpora
| -------- | ------------------
| German | CLEF-HIPE, NewsEye
| French | CLEF-HIPE, NewsEye
| English | CLEF-HIPE
| Finnish | NewsEye
| Swedish | NewsEye
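Both statistics can be computed with any of the trained tokenizers. A minimal sketch over the whitespace-separated tokens of a NER corpus - the checkpoint name is just an example - could look like this:
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("dbmdz/bert-base-historic-multilingual-cased")

def fertility_and_unk_portion(words):
    # Fertility: average number of subword tokens produced per word.
    # Unknown portion: share of subword tokens that are [UNK].
    pieces = [tok.tokenize(word) for word in words]
    n_subwords = sum(len(p) for p in pieces)
    n_unk = sum(p.count(tok.unk_token) for p in pieces)
    return n_subwords / len(words), n_unk / n_subwords
```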
Breakdown of subword fertility rate and unknown portion per language for the 32k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.43 | 0.0004
| French | 1.25 | 0.0001
| English | 1.25 | 0.0
| Finnish | 1.69 | 0.0007
| Swedish | 1.43 | 0.0
Breakdown of subword fertility rate and unknown portion per language for the 64k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.31 | 0.0004
| French | 1.16 | 0.0001
| English | 1.17 | 0.0
| Finnish | 1.54 | 0.0007
| Swedish | 1.32 | 0.0
# Final pretraining corpora
We upsample Swedish and Finnish to ~27GB. The final stats for all pretraining corpora can be seen here:
| Language | Size
| -------- | ----
| German | 28GB
| French | 27GB
| English | 24GB
| Finnish | 27GB
| Swedish | 27GB
Total size is 130GB.
# Smaller multilingual models
Inspired by the ["Well-Read Students Learn Better: On the Importance of Pre-training Compact Models"](https://arxiv.org/abs/1908.08962)
paper, we train smaller models (with different numbers of layers and hidden sizes) and report the number of parameters and pre-training costs (a parameter-count sketch follows the table):
| Model (Layer / Hidden size) | Parameters | Pre-Training time
| --------------------------- | ----------: | ----------------------:
| hmBERT Tiny ( 2/128) | 4.58M | 4.3 sec / 1,000 steps
| hmBERT Mini ( 4/256) | 11.55M | 10.5 sec / 1,000 steps
| hmBERT Small ( 4/512) | 29.52M | 20.7 sec / 1,000 steps
| hmBERT Medium ( 8/512) | 42.13M | 35.0 sec / 1,000 steps
| hmBERT Base (12/768) | 110.62M | 80.0 sec / 1,000 steps
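The parameter counts follow directly from the architecture. A back-of-the-envelope sketch - assuming the 32k vocab, 512 positions, two segment types and an FFN intermediate size of 4x the hidden size - reproduces the numbers in the table above:
```python
def bert_params(layers, hidden, vocab=32_000, max_pos=512, type_vocab=2):
    # Token, position and segment embeddings plus their LayerNorm.
    embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Per encoder layer: 4 attention projections (4*H^2), an FFN with
    # intermediate size 4*H (8*H^2), plus biases and two LayerNorms (~13*H).
    per_layer = 12 * hidden ** 2 + 13 * hidden
    pooler = hidden * hidden + hidden
    return embeddings + layers * per_layer + pooler

for name, (n_layers, n_hidden) in {
    "Tiny": (2, 128), "Mini": (4, 256), "Small": (4, 512),
    "Medium": (8, 512), "Base": (12, 768),
}.items():
    print(f"hmBERT {name:<6}: {bert_params(n_layers, n_hidden) / 1e6:.2f}M")
```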
We then perform downstream evaluations on the multilingual [NewsEye](https://zenodo.org/record/4573313#.Ya3oVr-ZNzU) dataset:

# Pretraining
## Multilingual model - hmBERT Base
We train a multilingual BERT model using the 32k vocab with the official BERT implementation
on a v3-32 TPU using the following parameters:
```bash
python3 run_pretraining.py --input_file gs://histolectra/historic-multilingual-tfrecords/*.tfrecord \
--output_dir gs://histolectra/bert-base-historic-multilingual-cased \
--bert_config_file ./config.json \
--max_seq_length=512 \
--max_predictions_per_seq=75 \
--do_train=True \
--train_batch_size=128 \
--num_train_steps=3000000 \
--learning_rate=1e-4 \
--save_checkpoints_steps=100000 \
--keep_checkpoint_max=20 \
--use_tpu=True \
--tpu_name=electra-2 \
--num_tpu_cores=32
```
The following plot shows the pretraining loss curve:

## Smaller multilingual models
We use the same parameters as used for training the base model.
### hmBERT Tiny
The following plot shows the pretraining loss curve for the tiny model:

### hmBERT Mini
The following plot shows the pretraining loss curve for the mini model:

### hmBERT Small
The following plot shows the pretraining loss curve for the small model:

### hmBERT Medium
The following plot shows the pretraining loss curve for the medium model:

## English model
The English BERT model - with texts from the British Library corpus - was trained with the Hugging Face
JAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-historic-english-cased/ \
--tokenizer_name /mnt/datasets/bert-base-historic-english-cased/ \
--train_file /mnt/datasets/bl-corpus/bl_1800-1900_extracted.txt \
--validation_file /mnt/datasets/bl-corpus/english_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 10 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-historic-english-cased-512-noadafactor-10e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Finnish model
The BERT model - with texts from the Finnish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Finnish_0.6.txt \
--validation_file /mnt/datasets/hlms/finnish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-finnish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Swedish model
The BERT model - with texts from the Swedish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Swedish_0.6.txt \
--validation_file /mnt/datasets/hlms/swedish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-swedish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

# Acknowledgments
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as
TensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "multilingual", "license": "mit", "widget": [{"text": "and I cannot conceive the reafon why [MASK] hath"}, {"text": "T\u00e4k\u00e4l\u00e4inen sanomalehdist\u00f6 [MASK] erit - t\u00e4in"}, {"text": "Det vore [MASK] h\u00e4ller n\u00f6dv\u00e4ndigt att be"}, {"text": "Comme, \u00e0 cette \u00e9poque [MASK] \u00e9tait celle de la"}, {"text": "In [MASK] an atmosph\u00e4rischen Nahrungsmitteln"}]}
|
fill-mask
|
dbmdz/bert-medium-historic-multilingual-cased
|
[
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"arxiv:1908.08962",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[
"multilingual"
] |
null | null |
transformers
|
# Historic Language Models (HLMs)
## Languages
Our Historic Language Models Zoo contains support for the following languages - incl. their training data source:
| Language | Training data | Size
| -------- | ------------- | ----
| German | [Europeana](http://www.europeana-newspapers.eu/) | 13-28GB (filtered)
| French | [Europeana](http://www.europeana-newspapers.eu/) | 11-31GB (filtered)
| English | [British Library](https://data.bl.uk/digbks/db14.html) | 24GB (year filtered)
| Finnish | [Europeana](http://www.europeana-newspapers.eu/) | 1.2GB
| Swedish | [Europeana](http://www.europeana-newspapers.eu/) | 1.1GB
## Models
At the moment, the following models are available on the model hub:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-multilingual-cased)
| `dbmdz/bert-base-historic-english-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-english-cased)
| `dbmdz/bert-base-finnish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-finnish-europeana-cased)
| `dbmdz/bert-base-swedish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-swedish-europeana-cased)
We also released smaller models for the multilingual model:
| Model identifier | Model Hub link
| ----------------------------------------------- | ---------------------------------------------------------------------------
| `dbmdz/bert-tiny-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-tiny-historic-multilingual-cased)
| `dbmdz/bert-mini-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-mini-historic-multilingual-cased)
| `dbmdz/bert-small-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-small-historic-multilingual-cased)
| `dbmdz/bert-medium-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-multilingual-cased)
**Notice**: We have released language models for Historic German and French trained on more noisier data earlier - see
[this repo](https://github.com/stefan-it/europeana-bert) for more information:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-german-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-german-europeana-cased)
| `dbmdz/bert-base-french-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-french-europeana-cased)
# Corpora Stats
## German Europeana Corpus
We provide some statistics using different thresholds of ocr confidences, in order to shrink down the corpus size
and use less-noisier data:
| OCR confidence | Size
| -------------- | ----
| **0.60** | 28GB
| 0.65 | 18GB
| 0.70 | 13GB
For the final corpus we use a OCR confidence of 0.6 (28GB). The following plot shows a tokens per year distribution:

## French Europeana Corpus
Like German, we use different ocr confidence thresholds:
| OCR confidence | Size
| -------------- | ----
| 0.60 | 31GB
| 0.65 | 27GB
| **0.70** | 27GB
| 0.75 | 23GB
| 0.80 | 11GB
For the final corpus we use a OCR confidence of 0.7 (27GB). The following plot shows a tokens per year distribution:

## British Library Corpus
Metadata is taken from [here](https://data.bl.uk/digbks/DB21.html). Stats incl. year filtering:
| Years | Size
| ----------------- | ----
| ALL | 24GB
| >= 1800 && < 1900 | 24GB
We use the year filtered variant. The following plot shows a tokens per year distribution:

## Finnish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.2GB
The following plot shows a tokens per year distribution:

## Swedish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.1GB
The following plot shows a tokens per year distribution:

## All Corpora
The following plot shows a tokens per year distribution of the complete training corpus:

# Multilingual Vocab generation
For the first attempt, we use the first 10GB of each pretraining corpus. We upsample both Finnish and Swedish to ~10GB.
The following tables shows the exact size that is used for generating a 32k and 64k subword vocabs:
| Language | Size
| -------- | ----
| German | 10GB
| French | 10GB
| English | 10GB
| Finnish | 9.5GB
| Swedish | 9.7GB
We then calculate the subword fertility rate and portion of `[UNK]`s over the following NER corpora:
| Language | NER corpora
| -------- | ------------------
| German | CLEF-HIPE, NewsEye
| French | CLEF-HIPE, NewsEye
| English | CLEF-HIPE
| Finnish | NewsEye
| Swedish | NewsEye
Breakdown of subword fertility rate and unknown portion per language for the 32k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.43 | 0.0004
| French | 1.25 | 0.0001
| English | 1.25 | 0.0
| Finnish | 1.69 | 0.0007
| Swedish | 1.43 | 0.0
Breakdown of subword fertility rate and unknown portion per language for the 64k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.31 | 0.0004
| French | 1.16 | 0.0001
| English | 1.17 | 0.0
| Finnish | 1.54 | 0.0007
| Swedish | 1.32 | 0.0
# Final pretraining corpora
We upsample Swedish and Finnish to ~27GB. The final stats for all pretraining corpora can be seen here:
| Language | Size
| -------- | ----
| German | 28GB
| French | 27GB
| English | 24GB
| Finnish | 27GB
| Swedish | 27GB
Total size is 130GB.
# Smaller multilingual models
Inspired by the ["Well-Read Students Learn Better: On the Importance of Pre-training Compact Models"](https://arxiv.org/abs/1908.08962)
paper, we train smaller models (different layers and hidden sizes), and report number of parameters and pre-training costs:
| Model (Layer / Hidden size) | Parameters | Pre-Training time
| --------------------------- | ----------: | ----------------------:
| hmBERT Tiny ( 2/128) | 4.58M | 4.3 sec / 1,000 steps
| hmBERT Mini ( 4/256) | 11.55M | 10.5 sec / 1,000 steps
| hmBERT Small ( 4/512) | 29.52M | 20.7 sec / 1,000 steps
| hmBERT Medium ( 8/512) | 42.13M | 35.0 sec / 1,000 steps
| hmBERT Base (12/768) | 110.62M | 80.0 sec / 1,000 steps
We then perform downstream evaluations on the multilingual [NewsEye](https://zenodo.org/record/4573313#.Ya3oVr-ZNzU) dataset:

# Pretraining
## Multilingual model - hmBERT Base
We train a multilingual BERT model using the 32k vocab with the official BERT implementation
on a v3-32 TPU using the following parameters:
```bash
python3 run_pretraining.py --input_file gs://histolectra/historic-multilingual-tfrecords/*.tfrecord \
--output_dir gs://histolectra/bert-base-historic-multilingual-cased \
--bert_config_file ./config.json \
--max_seq_length=512 \
--max_predictions_per_seq=75 \
--do_train=True \
--train_batch_size=128 \
--num_train_steps=3000000 \
--learning_rate=1e-4 \
--save_checkpoints_steps=100000 \
--keep_checkpoint_max=20 \
--use_tpu=True \
--tpu_name=electra-2 \
--num_tpu_cores=32
```
The following plot shows the pretraining loss curve:

## Smaller multilingual models
We use the same parameters as for training the base model.
### hmBERT Tiny
The following plot shows the pretraining loss curve for the tiny model:

### hmBERT Mini
The following plot shows the pretraining loss curve for the mini model:

### hmBERT Small
The following plot shows the pretraining loss curve for the small model:

### hmBERT Medium
The following plot shows the pretraining loss curve for the medium model:

## English model
The English BERT model - with texts from the British Library corpus - was trained with the Hugging Face
JAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-historic-english-cased/ \
--tokenizer_name /mnt/datasets/bert-base-historic-english-cased/ \
--train_file /mnt/datasets/bl-corpus/bl_1800-1900_extracted.txt \
--validation_file /mnt/datasets/bl-corpus/english_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 10 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-historic-english-cased-512-noadafactor-10e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
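The resulting JAX/FLAX checkpoint can then be converted to PyTorch weights with `transformers` (a sketch; it assumes `flax` is installed alongside `torch`):

```python
from transformers import BertForMaskedLM

# Load the FLAX checkpoint and re-save it as a PyTorch checkpoint.
model = BertForMaskedLM.from_pretrained(
    "/mnt/datasets/bert-base-historic-english-cased-512-noadafactor-10e",
    from_flax=True,
)
model.save_pretrained("bert-base-historic-english-cased")
```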
The following plot shows the pretraining loss curve:

## Finnish model
The BERT model - with texts from the Finnish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Finnish_0.6.txt \
--validation_file /mnt/datasets/hlms/finnish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-finnish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Swedish model
The BERT model - with texts from the Swedish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Swedish_0.6.txt \
--validation_file /mnt/datasets/hlms/swedish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-swedish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

# Acknowledgments
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as
TensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "multilingual", "license": "mit", "widget": [{"text": "and I cannot conceive the reafon why [MASK] hath"}, {"text": "T\u00e4k\u00e4l\u00e4inen sanomalehdist\u00f6 [MASK] erit - t\u00e4in"}, {"text": "Det vore [MASK] h\u00e4ller n\u00f6dv\u00e4ndigt att be"}, {"text": "Comme, \u00e0 cette \u00e9poque [MASK] \u00e9tait celle de la"}, {"text": "In [MASK] an atmosph\u00e4rischen Nahrungsmitteln"}]}
|
fill-mask
|
dbmdz/bert-mini-historic-multilingual-cased
|
[
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"arxiv:1908.08962",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[
"multilingual"
] |
null | null |
transformers
|
# Historic Language Models (HLMs)
## Languages
Our Historic Language Models Zoo contains support for the following languages, including their training data sources:
| Language | Training data | Size
| -------- | ------------- | ----
| German | [Europeana](http://www.europeana-newspapers.eu/) | 13-28GB (filtered)
| French | [Europeana](http://www.europeana-newspapers.eu/) | 11-31GB (filtered)
| English | [British Library](https://data.bl.uk/digbks/db14.html) | 24GB (year filtered)
| Finnish | [Europeana](http://www.europeana-newspapers.eu/) | 1.2GB
| Swedish | [Europeana](http://www.europeana-newspapers.eu/) | 1.1GB
## Models
At the moment, the following models are available on the model hub:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-multilingual-cased)
| `dbmdz/bert-base-historic-english-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-english-cased)
| `dbmdz/bert-base-finnish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-finnish-europeana-cased)
| `dbmdz/bert-base-swedish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-swedish-europeana-cased)
We also released smaller models for the multilingual model (a short usage example follows the table):
| Model identifier | Model Hub link
| ----------------------------------------------- | ---------------------------------------------------------------------------
| `dbmdz/bert-tiny-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-tiny-historic-multilingual-cased)
| `dbmdz/bert-mini-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-mini-historic-multilingual-cased)
| `dbmdz/bert-small-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-small-historic-multilingual-cased)
| `dbmdz/bert-medium-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-medium-historic-multilingual-cased)
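Any of the checkpoints above can be used directly with the `transformers` fill-mask pipeline; a minimal example using the base multilingual model and one of the widget sentences from this model card:

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dbmdz/bert-base-historic-multilingual-cased")

# Historic English spelling ("reafon" with a long s) from the widget examples.
print(fill_mask("and I cannot conceive the reafon why [MASK] hath"))
```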
**Notice**: We have previously released language models for Historic German and French that were trained on noisier data - see
[this repo](https://github.com/stefan-it/europeana-bert) for more information:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-german-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-german-europeana-cased)
| `dbmdz/bert-base-french-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-french-europeana-cased)
# Corpora Stats
## German Europeana Corpus
We provide some statistics using different OCR confidence thresholds, in order to shrink the corpus size
and use less noisy data:
| OCR confidence | Size
| -------------- | ----
| **0.60** | 28GB
| 0.65 | 18GB
| 0.70 | 13GB
For the final corpus we use an OCR confidence of 0.6 (28GB). The following plot shows the tokens-per-year distribution:

## French Europeana Corpus
As for German, we use different OCR confidence thresholds:
| OCR confidence | Size
| -------------- | ----
| 0.60 | 31GB
| 0.65 | 27GB
| **0.70** | 27GB
| 0.75 | 23GB
| 0.80 | 11GB
For the final corpus we use an OCR confidence of 0.7 (27GB). The following plot shows the tokens-per-year distribution:

## British Library Corpus
Metadata is taken from [here](https://data.bl.uk/digbks/DB21.html). Stats including year filtering:
| Years | Size
| ----------------- | ----
| ALL | 24GB
| >= 1800 && < 1900 | 24GB
We use the year-filtered variant. The following plot shows the tokens-per-year distribution:

## Finnish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.2GB
The following plot shows the tokens-per-year distribution:

## Swedish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.1GB
The following plot shows the tokens-per-year distribution:

## All Corpora
The following plot shows the tokens-per-year distribution of the complete training corpus:

# Multilingual Vocab generation
For the first attempt, we use the first 10GB of each pretraining corpus. We upsample both Finnish and Swedish to ~10GB.
The following table shows the exact sizes used for generating the 32k and 64k subword vocabs:
| Language | Size
| -------- | ----
| German | 10GB
| French | 10GB
| English | 10GB
| Finnish | 9.5GB
| Swedish | 9.7GB
We then calculate the subword fertility rate and the portion of `[UNK]`s over the following NER corpora:
| Language | NER corpora
| -------- | ------------------
| German | CLEF-HIPE, NewsEye
| French | CLEF-HIPE, NewsEye
| English | CLEF-HIPE
| Finnish | NewsEye
| Swedish | NewsEye
Breakdown of subword fertility rate and unknown portion per language for the 32k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.43 | 0.0004
| French | 1.25 | 0.0001
| English | 1.25 | 0.0
| Finnish | 1.69 | 0.0007
| Swedish | 1.43 | 0.0
Breakdown of subword fertility rate and unknown portion per language for the 64k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.31 | 0.0004
| French | 1.16 | 0.0001
| English | 1.17 | 0.0
| Finnish | 1.54 | 0.0007
| Swedish | 1.32 | 0.0
# Final pretraining corpora
We upsample Swedish and Finnish to ~27GB. The final stats for all pretraining corpora can be seen here (a sketch of the upsampling follows below):
| Language | Size
| -------- | ----
| German | 28GB
| French | 27GB
| English | 24GB
| Finnish | 27GB
| Swedish | 27GB
Total size is 130GB.
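The exact upsampling procedure is not documented here; one simple possibility is plain concatenation, sketched below with hypothetical paths (22 copies of the ~1.2GB Finnish corpus give roughly 26.4GB):

```python
import shutil

# Repeat the small Finnish corpus until it roughly matches the large corpora.
with open("finnish_upsampled.txt", "wb") as out:
    for _ in range(22):
        with open("extracted_content_Finnish_0.6.txt", "rb") as src:
            shutil.copyfileobj(src, out)
```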
# Smaller multilingual models
Inspired by the ["Well-Read Students Learn Better: On the Importance of Pre-training Compact Models"](https://arxiv.org/abs/1908.08962)
paper, we train smaller models (with different numbers of layers and hidden sizes) and report the number of parameters and pre-training cost:
| Model (Layer / Hidden size) | Parameters | Pre-Training time
| --------------------------- | ----------: | ----------------------:
| hmBERT Tiny ( 2/128) | 4.58M | 4.3 sec / 1,000 steps
| hmBERT Mini ( 4/256) | 11.55M | 10.5 sec / 1,000 steps
| hmBERT Small ( 4/512) | 29.52M | 20.7 sec / 1,000 steps
| hmBERT Medium ( 8/512) | 42.13M | 35.0 sec / 1,000 steps
| hmBERT Base (12/768) | 110.62M | 80.0 sec / 1,000 steps
We then perform downstream evaluations on the multilingual [NewsEye](https://zenodo.org/record/4573313#.Ya3oVr-ZNzU) dataset:

# Pretraining
## Multilingual model - hmBERT Base
We train a multilingual BERT model with the 32k vocab, using the official BERT implementation
on a v3-32 TPU and the following parameters:
```bash
python3 run_pretraining.py --input_file gs://histolectra/historic-multilingual-tfrecords/*.tfrecord \
--output_dir gs://histolectra/bert-base-historic-multilingual-cased \
--bert_config_file ./config.json \
--max_seq_length=512 \
--max_predictions_per_seq=75 \
--do_train=True \
--train_batch_size=128 \
--num_train_steps=3000000 \
--learning_rate=1e-4 \
--save_checkpoints_steps=100000 \
--keep_checkpoint_max=20 \
--use_tpu=True \
--tpu_name=electra-2 \
--num_tpu_cores=32
```
The following plot shows the pretraining loss curve:

## Smaller multilingual models
We use the same parameters as for training the base model.
### hmBERT Tiny
The following plot shows the pretraining loss curve for the tiny model:

### hmBERT Mini
The following plot shows the pretraining loss curve for the mini model:

### hmBERT Small
The following plot shows the pretraining loss curve for the small model:

### hmBERT Medium
The following plot shows the pretraining loss curve for the medium model:

## English model
The English BERT model - with texts from the British Library corpus - was trained with the Hugging Face
JAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-historic-english-cased/ \
--tokenizer_name /mnt/datasets/bert-base-historic-english-cased/ \
--train_file /mnt/datasets/bl-corpus/bl_1800-1900_extracted.txt \
--validation_file /mnt/datasets/bl-corpus/english_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 10 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-historic-english-cased-512-noadafactor-10e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Finnish model
The BERT model - with texts from the Finnish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Finnish_0.6.txt \
--validation_file /mnt/datasets/hlms/finnish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-finnish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Swedish model
The BERT model - with texts from the Swedish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Swedish_0.6.txt \
--validation_file /mnt/datasets/hlms/swedish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-swedish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

# Acknowledgments
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as
TensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "multilingual", "license": "mit", "widget": [{"text": "and I cannot conceive the reafon why [MASK] hath"}, {"text": "T\u00e4k\u00e4l\u00e4inen sanomalehdist\u00f6 [MASK] erit - t\u00e4in"}, {"text": "Det vore [MASK] h\u00e4ller n\u00f6dv\u00e4ndigt att be"}, {"text": "Comme, \u00e0 cette \u00e9poque [MASK] \u00e9tait celle de la"}, {"text": "In [MASK] an atmosph\u00e4rischen Nahrungsmitteln"}]}
|
fill-mask
|
dbmdz/bert-small-historic-multilingual-cased
|
[
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"arxiv:1908.08962",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[
"multilingual"
] |
0.024713460355997086,
0.08890194445848465,
-0.17414288222789764,
-0.04899533465504646,
0.1714445799589157,
-0.09474573284387589,
-0.002242365386337042,
0.1442134976387024,
0.003990788012742996,
0.027375195175409317,
0.027852144092321396,
0.1334686577320099,
0.04412377253174782,
-0.14020144939422607,
-0.044648177921772,
-0.06190132722258568,
-0.05038272961974144,
0.147560253739357,
0.06303991377353668,
-0.12529557943344116,
0.03476065769791603,
0.01511664129793644,
-0.0928725153207779,
-0.06102854013442993,
0.01619136892259121,
-0.009231594391167164,
0.02897515892982483,
-0.019016604870557785,
0.0019346270710229874,
-0.0037865471094846725,
-0.049945980310440063,
-0.08919233828783035,
-0.11897703260183334,
-0.006689291447401047,
0.09475875645875931,
-0.02355268783867359,
0.0748969316482544,
-0.056626494973897934,
-0.01113037671893835,
0.020184379070997238,
-0.03496541455388069,
-0.07251078635454178,
-0.10067931562662125,
0.08108916878700256,
-0.14647723734378815,
0.019197965040802956,
-0.03843053802847862,
0.059079382568597794,
0.0791577398777008,
-0.04065806046128273,
-0.023166347295045853,
-0.013846770860254765,
-0.018708376213908195,
-0.04894305393099785,
-0.15186434984207153,
-0.03388196974992752,
-0.012137913145124912,
0.10956902801990509,
-0.04277980700135231,
-0.017493732273578644,
0.03304886445403099,
0.14915940165519714,
0.0495106503367424,
-0.05695953220129013,
-0.0009552006958983839,
0.010290311649441719,
0.007390175014734268,
-0.05511286109685898,
0.01396782137453556,
-0.04304596036672592,
-0.008544501848518848,
0.09625044465065002,
-0.1230364516377449,
-0.17937225103378296,
0.07885769754648209,
0.13410720229148865,
-0.09971778839826584,
0.005583271384239197,
-0.06032155454158783,
-0.03603639453649521,
-0.07150408625602722,
-0.087070532143116,
0.14423224329948425,
0.04508695378899574,
0.08474673330783844,
-0.07967165112495422,
-0.08038293570280075,
-0.0015841529238969088,
-0.03652593865990639,
-0.08957476913928986,
0.12280010432004929,
-0.024162983521819115,
-0.09752131998538971,
0.06307537108659744,
-0.02167368307709694,
0.05711343511939049,
0.15706712007522583,
0.02176528237760067,
-0.10685811191797256,
-0.03281952813267708,
0.02877911925315857,
0.08813615888357162,
0.09000173211097717,
0.0035563369747251272,
0.03690013661980629,
0.05070043355226517,
0.020603828132152557,
0.006292149890214205,
-0.04952035844326019,
0.02139190584421158,
0.0073630125261843204,
-0.03865039721131325,
0.04656580463051796,
0.01181273814290762,
0.012281140312552452,
0.08929583430290222,
0.029386257752776146,
0.049766477197408676,
-0.04818480461835861,
-0.04621870443224907,
-0.09485994279384613,
0.10117428004741669,
-0.15671852231025696,
-0.25755253434181213,
-0.1571509689092636,
0.015771346166729927,
-0.031071770936250687,
-0.01897772215306759,
0.031087210401892662,
-0.06490792334079742,
-0.12198474258184433,
-0.12990815937519073,
0.07952530682086945,
0.021082352846860886,
-0.06551789492368698,
-0.0519949346780777,
0.010261857882142067,
0.046790238469839096,
-0.10674012452363968,
-0.0008368181297555566,
-0.0018373981583863497,
-0.08390948176383972,
-0.02464928664267063,
0.0495821014046669,
0.06632652133703232,
0.009783849120140076,
0.04045012593269348,
-0.01987491548061371,
-0.016492094844579697,
0.1272875815629959,
-0.10731779783964157,
0.12936800718307495,
0.040774937719106674,
-0.03718724101781845,
0.0787333995103836,
0.12205632776021957,
0.0005583474412560463,
-0.04694322124123573,
0.026539001613855362,
0.059327322989702225,
-0.009934182278811932,
-0.19217784702777863,
-0.07273104041814804,
-0.005397085566073656,
0.0800570696592331,
0.07311416417360306,
0.07692285627126694,
-0.03743058815598488,
-0.0395035482943058,
-0.05649024248123169,
-0.052111390978097916,
0.07605570554733276,
0.07018332928419113,
-0.03506311774253845,
0.002545110648497939,
0.02077251486480236,
-0.029770629480481148,
0.030265918001532555,
0.15279091894626617,
-0.006687644403427839,
0.14844508469104767,
-0.04340033233165741,
0.204502135515213,
0.038611166179180145,
0.07517001032829285,
0.0007572589674964547,
0.09116441756486893,
-0.020160265266895294,
0.02726554125547409,
-0.01694844476878643,
-0.048810336738824844,
0.0036235428415238857,
0.057950813323259354,
0.11489176005125046,
-0.014287389814853668,
-0.04418133571743965,
-0.03403383120894432,
0.09645664691925049,
0.29461750388145447,
0.05363300070166588,
-0.0730837881565094,
-0.015456943772733212,
0.020612338557839394,
-0.11504577100276947,
-0.057371094822883606,
0.000878174148965627,
0.0718030035495758,
-0.19257500767707825,
0.08470744639635086,
-0.024735575541853905,
0.02729548141360283,
-0.07196321338415146,
-0.04436325281858444,
0.07887852191925049,
0.07532859593629837,
-0.006973697803914547,
0.046409640461206436,
-0.19605618715286255,
0.07133689522743225,
-0.007497962098568678,
0.05017570033669472,
-0.028938259929418564,
0.025135373696684837,
0.005642155185341835,
-0.060727279633283615,
0.2177029401063919,
0.039179686456918716,
-0.0060600396245718,
-0.017742758616805077,
-0.15911316871643066,
-0.04035639017820358,
0.11310891807079315,
-0.16096965968608856,
0.06993868947029114,
-0.04904988035559654,
-0.05508439615368843,
-0.06287791579961777,
-0.025131095200777054,
-0.06492875516414642,
-0.1570369452238083,
0.029878079891204834,
-0.11359013617038727,
0.0002901718544308096,
-0.08747215569019318,
-0.01414119079709053,
-0.11259045451879501,
0.12672153115272522,
-0.09415681660175323,
-0.0597437247633934,
-0.1314782202243805,
-0.06134991720318794,
0.14393338561058044,
-0.04764938727021217,
0.06130359694361687,
-0.0180063359439373,
0.10339190810918808,
-0.03469991683959961,
-0.020652316510677338,
0.06225379928946495,
-0.10041819512844086,
-0.22944556176662445,
-0.03873603045940399,
0.1914825141429901,
0.07758241146802902,
0.03545099124312401,
-0.02346174791455269,
0.06784546375274658,
0.014724667184054852,
-0.062163520604372025,
0.0140851940959692,
0.19028399884700775,
0.005599664524197578,
0.09301856905221939,
-0.033275384455919266,
-0.043601151555776596,
-0.08216627687215805,
-0.04096629470586777,
0.11256449669599533,
0.23705719411373138,
-0.07057871669530869,
0.20767341554164886,
0.05980931967496872,
-0.08724096417427063,
-0.22206172347068787,
-0.06833211332559586,
0.08250375837087631,
-0.016818897798657417,
0.04607631638646126,
-0.10268574208021164,
0.0649406760931015,
0.0836360901594162,
-0.0003731143951881677,
0.0075850714929401875,
-0.19650597870349884,
-0.13132430613040924,
-0.00009504318586550653,
0.022485485300421715,
-0.12776856124401093,
-0.09017539769411087,
-0.03638658672571182,
-0.03802711144089699,
-0.03171171620488167,
0.09119751304388046,
0.02176234871149063,
0.06574343889951706,
0.0161287821829319,
-0.018230311572551727,
0.04010692238807678,
-0.05274474248290062,
0.1315503567457199,
0.0006615432212129235,
0.0008700272883288562,
-0.07788953930139542,
-0.006720277946442366,
0.011849350295960903,
-0.029129017144441605,
0.08881589025259018,
0.023957794532179832,
-0.01478641852736473,
-0.10074609518051147,
-0.025975273922085762,
-0.08064210414886475,
0.08047845214605331,
-0.09317826479673386,
-0.014880192466080189,
-0.06574171781539917,
0.09193854033946991,
0.07885187119245529,
-0.007170373108237982,
-0.05163529887795448,
-0.03615882992744446,
-0.032467544078826904,
0.12290814518928528,
0.14394697546958923,
0.14960549771785736,
-0.10638763010501862,
0.009623134508728981,
-0.02266727387905121,
0.022725705057382584,
0.0006025979528203607,
0.060287635773420334,
0.057062651962041855,
-0.02514052949845791,
0.08479106426239014,
-0.03233977407217026,
-0.1877731829881668,
-0.01477011851966381,
0.07108699530363083,
-0.1142258271574974,
-0.1848980337381363,
0.0027300126384943724,
-0.05806051939725876,
-0.03311494365334511,
-0.054740361869335175,
0.14488957822322845,
-0.02601415477693081,
-0.05373754724860191,
0.0015331106260418892,
0.08311688154935837,
-0.002947279019281268,
0.06711946427822113,
0.01123325526714325,
0.01234008464962244,
-0.08239259570837021,
0.1798771321773529,
0.037067487835884094,
-0.070403091609478,
0.024714896455407143,
0.18392956256866455,
-0.060240112245082855,
-0.04635774344205856,
0.01238054409623146,
0.0685010775923729,
-0.02067599818110466,
-0.02920771762728691,
0.008610808290541172,
-0.042542655020952225,
0.03182251378893852,
0.027357513085007668,
0.00405572634190321,
-0.020990174263715744,
0.045929521322250366,
0.030377009883522987,
-0.06215943023562431,
0.09900863468647003,
0.02937569096684456,
0.019046641886234283,
-0.029101882129907608,
0.06740113347768784,
-0.00785854458808899,
-0.02934357523918152,
0.015818582847714424,
0.03407582268118858,
-0.07262527197599411,
-0.05504230037331581,
0.009665303863584995,
-0.009872435592114925,
-0.03676630184054375,
-0.0368829108774662,
0.011984419077634811,
0.0029200143180787563,
0.037966225296258926,
0.009964853525161743,
-0.038416579365730286,
-0.09426009654998779,
-0.04130178689956665,
0.056379880756139755,
-0.12840798497200012,
-0.013053863309323788,
0.07383227348327637,
-0.07784447073936462,
0.1616670787334442,
0.028151391074061394,
0.0369388721883297,
0.014219528995454311,
-0.042326994240283966,
-0.004574701189994812,
-0.06890713423490524,
-0.005315940361469984,
0.050130292773246765,
-0.12536831200122833,
-0.017062537372112274,
-0.07755958288908005,
-0.0415453277528286,
-0.0043003442697227,
0.0745561420917511,
-0.09432709962129593,
0.07073409110307693,
0.04107894003391266,
-0.06596718728542328,
-0.07844087481498718,
-0.008957783691585064,
0.045746736228466034,
0.02081238478422165,
0.053385864943265915,
-0.06615421921014786,
0.06893545389175415,
-0.0711255818605423,
0.004025696776807308,
0.0003294531488791108,
0.000547576230019331,
-0.008706079795956612,
0.04077719897031784,
0.047532785683870316,
-0.014299001544713974,
0.07903705537319183,
0.0030690613202750683,
-0.011152289807796478,
0.050828102976083755,
-0.0533759742975235,
-0.07456227391958237,
0.05758000910282135,
-0.034995973110198975,
-0.05231659486889839,
0.0009630967397242785,
-0.05408048257231712,
-0.030563313513994217,
-0.018295438960194588,
-0.08191109448671341,
0.13703787326812744,
0.14033427834510803,
0.10663381218910217,
0.029856018722057343,
0.0014520130353048444,
-0.13258697092533112,
-0.14132624864578247,
0.06212732195854187,
-0.04013478383421898,
0.0671432763338089,
-0.051220931112766266,
0.12502215802669525,
0.06776692718267441,
-0.21412335336208344,
0.10079759359359741,
-0.02154095098376274,
-0.04872625693678856,
-0.061198651790618896,
-0.13281942903995514,
-0.05466828867793083,
-0.019822154194116592,
0.004205089993774891,
-0.09153120219707489,
0.08638890087604523,
0.05275481939315796,
0.04308638721704483,
0.02252444066107273,
0.08840763568878174,
-0.1633525937795639,
-0.06294316053390503,
0.10780304670333862,
0.054057925939559937,
0.03180251643061638,
0.0644444078207016,
0.00007095551700331271,
-0.05848664045333862,
0.028906812891364098,
0.047613002359867096,
0.07179775834083557,
0.010908284224569798,
0.004417906980961561,
-0.015637483447790146,
-0.08978146314620972,
0.027938678860664368,
-0.027230313047766685,
-0.021034106612205505,
0.10000819712877274,
0.05768291652202606,
0.007310186512768269,
-0.03264622390270233,
0.2302917242050171,
-0.061509035527706146,
-0.027476301416754723,
-0.16131579875946045,
0.11148304492235184,
-0.04467588663101196,
0.0508061908185482,
-0.016360467299818993,
-0.0902116671204567,
-0.03026147000491619,
0.18582548201084137,
0.14042215049266815,
-0.0268323365598917,
0.018725818023085594,
0.025976354256272316,
-0.008945228531956673,
0.03137330710887909,
0.0821147933602333,
0.05842192843556404,
0.16061681509017944,
-0.06543948501348495,
0.09932872653007507,
-0.02220502868294716,
-0.038988903164863586,
-0.09744497388601303,
0.19688546657562256,
-0.057210639119148254,
0.006156980991363525,
-0.04606388881802559,
0.05331490933895111,
0.010959283448755741,
-0.3100702464580536,
0.043515466153621674,
-0.09794948250055313,
-0.13634485006332397,
0.012481434270739555,
0.06696493178606033,
0.03506696969270706,
0.08163053542375565,
0.07605531811714172,
-0.015224069356918335,
0.22355005145072937,
0.037145934998989105,
-0.06211879476904869,
-0.022114593535661697,
0.03087041899561882,
-0.14987362921237946,
0.2555731534957886,
0.02411094307899475,
-0.021345950663089752,
0.08581890910863876,
-0.019428851082921028,
-0.14025336503982544,
-0.011787205003201962,
0.044839974492788315,
-0.09226434677839279,
0.03943173959851265,
0.22652697563171387,
-0.0030909525230526924,
0.053718939423561096,
0.038558028638362885,
-0.07765743136405945,
0.06320123374462128,
0.07155612111091614,
-0.004466915037482977,
-0.06386460363864899,
0.11380192637443542,
-0.12245496362447739,
0.13287103176116943,
0.1905631124973297,
-0.03742115944623947,
0.021784896031022072,
-0.07720133662223816,
-0.013400997035205364,
-0.012626060284674168,
0.13845470547676086,
0.048798974603414536,
-0.12940466403961182,
0.019900813698768616,
-0.05042421445250511,
0.06816644221544266,
-0.13737063109874725,
-0.046657636761665344,
-0.012370054610073566,
-0.04045470431447029,
-0.053929831832647324,
0.09804452210664749,
0.03951381891965866,
0.011623288504779339,
0.0010270479833707213,
0.09554577618837357,
0.014831733889877796,
0.09901735931634903,
-0.06622068583965302,
-0.00982118770480156
] |
null | null |
transformers
|
# Historic Language Models (HLMs)
## Languages
Our Historic Language Models Zoo contains support for the following languages - incl. their training data source:
| Language | Training data | Size
| -------- | ------------- | ----
| German | [Europeana](http://www.europeana-newspapers.eu/) | 13-28GB (filtered)
| French | [Europeana](http://www.europeana-newspapers.eu/) | 11-31GB (filtered)
| English | [British Library](https://data.bl.uk/digbks/db14.html) | 24GB (year filtered)
| Finnish | [Europeana](http://www.europeana-newspapers.eu/) | 1.2GB
| Swedish | [Europeana](http://www.europeana-newspapers.eu/) | 1.1GB
## Models
At the moment, the following models are available on the model hub:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-multilingual-cased)
| `dbmdz/bert-base-historic-english-cased` | [here](https://huggingface.co/dbmdz/bert-base-historic-english-cased)
| `dbmdz/bert-base-finnish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-finnish-europeana-cased)
| `dbmdz/bert-base-swedish-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-swedish-europeana-cased)
We also released smaller models for the multilingual model:
| Model identifier | Model Hub link
| ----------------------------------------------- | ---------------------------------------------------------------------------
| `dbmdz/bert-tiny-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-tiny-historic-multilingual-cased)
| `dbmdz/bert-mini-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-mini-historic-multilingual-cased)
| `dbmdz/bert-small-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-small-historic-multilingual-cased)
| `dbmdz/bert-medium-historic-multilingual-cased` | [here](https://huggingface.co/dbmdz/bert-medium-historic-multilingual-cased)
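All models above can be loaded with the standard `transformers` Auto classes or pipelines. A minimal, hedged usage sketch (the masked sentence is one of the widget examples from this card; any of the model identifiers above can be substituted):
```python
from transformers import pipeline

# Hedged sketch: fill a masked token with the base multilingual historic model.
fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/bert-base-historic-multilingual-cased",
)
print(fill_mask("and I cannot conceive the reafon why [MASK] hath"))
```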
**Notice**: We have previously released language models for Historic German and French that were trained on noisier data - see
[this repo](https://github.com/stefan-it/europeana-bert) for more information:
| Model identifier | Model Hub link
| --------------------------------------------- | --------------------------------------------------------------------------
| `dbmdz/bert-base-german-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-german-europeana-cased)
| `dbmdz/bert-base-french-europeana-cased` | [here](https://huggingface.co/dbmdz/bert-base-french-europeana-cased)
# Corpora Stats
## German Europeana Corpus
We provide some statistics using different OCR confidence thresholds, in order to shrink the corpus size
and use less noisy data:
| OCR confidence | Size
| -------------- | ----
| **0.60** | 28GB
| 0.65 | 18GB
| 0.70 | 13GB
For the final corpus we use an OCR confidence of 0.6 (28GB). The following plot shows the tokens-per-year distribution:

## French Europeana Corpus
As for German, we use different OCR confidence thresholds:
| OCR confidence | Size
| -------------- | ----
| 0.60 | 31GB
| 0.65 | 27GB
| **0.70** | 27GB
| 0.75 | 23GB
| 0.80 | 11GB
For the final corpus we use an OCR confidence of 0.7 (27GB). The following plot shows the tokens-per-year distribution:

## British Library Corpus
Metadata is taken from [here](https://data.bl.uk/digbks/DB21.html). Stats incl. year filtering:
| Years | Size
| ----------------- | ----
| ALL | 24GB
| >= 1800 && < 1900 | 24GB
We use the year-filtered variant. The following plot shows the tokens-per-year distribution:

## Finnish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.2GB
The following plot shows the tokens-per-year distribution:

## Swedish Europeana Corpus
| OCR confidence | Size
| -------------- | ----
| 0.60 | 1.1GB
The following plot shows the tokens-per-year distribution:

## All Corpora
The following plot shows the tokens-per-year distribution of the complete training corpus:

# Multilingual Vocab generation
For the first attempt, we use the first 10GB of each pretraining corpus. We upsample both Finnish and Swedish to ~10GB.
The following table shows the exact sizes used for generating the 32k and 64k subword vocabs:
| Language | Size
| -------- | ----
| German | 10GB
| French | 10GB
| English | 10GB
| Finnish | 9.5GB
| Swedish | 9.7GB
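The exact vocab generation command is not shown in this card; the following is a minimal sketch, assuming the Hugging Face `tokenizers` library and placeholder file names, of how a cased 32k WordPiece vocab could be trained over these samples:
```python
from tokenizers import BertWordPieceTokenizer

# Sketch: train a cased WordPiece vocab over the sampled pretraining corpora.
# File names are placeholders, not the actual corpus paths.
tokenizer = BertWordPieceTokenizer(lowercase=False, strip_accents=False)
tokenizer.train(
    files=["german.txt", "french.txt", "english.txt", "finnish.txt", "swedish.txt"],
    vocab_size=32_000,
    min_frequency=2,
)
tokenizer.save_model(".", "hmbert-32k")  # writes ./hmbert-32k-vocab.txt
```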
We then calculate the subword fertility rate and portion of `[UNK]`s over the following NER corpora:
| Language | NER corpora
| -------- | ------------------
| German | CLEF-HIPE, NewsEye
| French | CLEF-HIPE, NewsEye
| English | CLEF-HIPE
| Finnish | NewsEye
| Swedish | NewsEye
Breakdown of subword fertility rate and unknown portion per language for the 32k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.43 | 0.0004
| French | 1.25 | 0.0001
| English | 1.25 | 0.0
| Finnish | 1.69 | 0.0007
| Swedish | 1.43 | 0.0
Breakdown of subword fertility rate and unknown portion per language for the 64k vocab:
| Language | Subword fertility | Unknown portion
| -------- | ------------------ | ---------------
| German | 1.31 | 0.0004
| French | 1.16 | 0.0001
| English | 1.17 | 0.0
| Finnish | 1.54 | 0.0007
| Swedish | 1.32 | 0.0
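These statistics are straightforward to reproduce; a minimal sketch, assuming pre-tokenized NER corpus tokens as input (corpus loading and CoNLL parsing omitted):
```python
from transformers import AutoTokenizer

def fertility_and_unk_portion(tokens, tokenizer):
    """Fertility = subwords per input token; unknown portion = share of [UNK] subwords."""
    n_subwords = n_unks = 0
    for token in tokens:
        subwords = tokenizer.tokenize(token)
        n_subwords += len(subwords)
        n_unks += subwords.count(tokenizer.unk_token)
    return n_subwords / len(tokens), n_unks / n_subwords

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-historic-multilingual-cased")
print(fertility_and_unk_portion(["Tidningen", "trycktes", "1898"], tokenizer))  # toy input
```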
# Final pretraining corpora
We upsample Swedish and Finnish to ~27GB. The final stats for all pretraining corpora can be seen here:
| Language | Size
| -------- | ----
| German | 28GB
| French | 27GB
| English | 24GB
| Finnish | 27GB
| Swedish | 27GB
Total size is 130GB.
# Smaller multilingual models
Inspired by the ["Well-Read Students Learn Better: On the Importance of Pre-training Compact Models"](https://arxiv.org/abs/1908.08962)
paper, we train smaller models (with different numbers of layers and hidden sizes) and report the number of parameters and pre-training costs; a hedged config sketch follows the table below:
| Model (Layer / Hidden size) | Parameters | Pre-Training time
| --------------------------- | ----------: | ----------------------:
| hmBERT Tiny ( 2/128) | 4.58M | 4.3 sec / 1,000 steps
| hmBERT Mini ( 4/256) | 11.55M | 10.5 sec / 1,000 steps
| hmBERT Small ( 4/512) | 29.52M | 20.7 sec / 1,000 steps
| hmBERT Medium ( 8/512) | 42.13M | 35.0 sec / 1,000 steps
| hmBERT Base (12/768) | 110.62M | 80.0 sec / 1,000 steps
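The layer/hidden-size pairs above map directly onto a `BertConfig`; a hedged sketch for hmBERT Tiny, where head count and feed-forward size are assumptions following the usual H/64 and 4H conventions from the compact-models paper:
```python
from transformers import BertConfig, BertForMaskedLM

# hmBERT Tiny: 2 layers, hidden size 128, 32k multilingual vocab.
tiny_config = BertConfig(
    vocab_size=32_000,
    num_hidden_layers=2,
    hidden_size=128,
    num_attention_heads=2,  # assumption: hidden_size / 64
    intermediate_size=512,  # assumption: 4 * hidden_size
)
model = BertForMaskedLM(tiny_config)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.2f}M parameters")
```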
We then perform downstream evaluations on the multilingual [NewsEye](https://zenodo.org/record/4573313#.Ya3oVr-ZNzU) dataset:

# Pretraining
## Multilingual model - hmBERT Base
We train a multilingual BERT model using the 32k vocab with the official BERT implementation
on a v3-32 TPU using the following parameters:
```bash
python3 run_pretraining.py --input_file gs://histolectra/historic-multilingual-tfrecords/*.tfrecord \
--output_dir gs://histolectra/bert-base-historic-multilingual-cased \
--bert_config_file ./config.json \
--max_seq_length=512 \
--max_predictions_per_seq=75 \
--do_train=True \
--train_batch_size=128 \
--num_train_steps=3000000 \
--learning_rate=1e-4 \
--save_checkpoints_steps=100000 \
--keep_checkpoint_max=20 \
--use_tpu=True \
--tpu_name=electra-2 \
--num_tpu_cores=32
```
The following plot shows the pretraining loss curve:

## Smaller multilingual models
We use the same parameters as used for training the base model.
### hmBERT Tiny
The following plot shows the pretraining loss curve for the tiny model:

### hmBERT Mini
The following plot shows the pretraining loss curve for the mini model:

### hmBERT Small
The following plot shows the pretraining loss curve for the small model:

### hmBERT Medium
The following plot shows the pretraining loss curve for the medium model:

## English model
The English BERT model - with texts from the British Library corpus - was trained with the Hugging Face
JAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-historic-english-cased/ \
--tokenizer_name /mnt/datasets/bert-base-historic-english-cased/ \
--train_file /mnt/datasets/bl-corpus/bl_1800-1900_extracted.txt \
--validation_file /mnt/datasets/bl-corpus/english_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 10 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-historic-english-cased-512-noadafactor-10e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Finnish model
The BERT model - with texts from the Finnish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-finnish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Finnish_0.6.txt \
--validation_file /mnt/datasets/hlms/finnish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-finnish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

## Swedish model
The BERT model - with texts from the Swedish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:
```bash
python3 run_mlm_flax.py --model_type bert \
--config_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--tokenizer_name /mnt/datasets/bert-base-swedish-europeana-cased/ \
--train_file /mnt/datasets/hlms/extracted_content_Swedish_0.6.txt \
--validation_file /mnt/datasets/hlms/swedish_validation.txt \
--max_seq_length 512 \
--per_device_train_batch_size 16 \
--learning_rate 1e-4 \
--num_train_epochs 40 \
--preprocessing_num_workers 96 \
--output_dir /mnt/datasets/bert-base-swedish-europeana-cased-512-dupe1-noadafactor-40e \
--save_steps 2500 \
--eval_steps 2500 \
--warmup_steps 10000 \
--line_by_line \
--pad_to_max_length
```
The following plot shows the pretraining loss curve:

# Acknowledgments
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as
TensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "multilingual", "license": "mit", "widget": [{"text": "and I cannot conceive the reafon why [MASK] hath"}, {"text": "T\u00e4k\u00e4l\u00e4inen sanomalehdist\u00f6 [MASK] erit - t\u00e4in"}, {"text": "Det vore [MASK] h\u00e4ller n\u00f6dv\u00e4ndigt att be"}, {"text": "Comme, \u00e0 cette \u00e9poque [MASK] \u00e9tait celle de la"}, {"text": "In [MASK] an atmosph\u00e4rischen Nahrungsmitteln"}]}
|
fill-mask
|
dbmdz/bert-tiny-historic-multilingual-cased
|
[
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"arxiv:1908.08962",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.08962"
] |
[
"multilingual"
] |
TAGS
#transformers #pytorch #tf #tensorboard #safetensors #bert #fill-mask #multilingual #arxiv-1908.08962 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
Historic Language Models (HLMs)
===============================
Languages
---------
Our Historic Language Models Zoo contains support for the following languages - incl. their training data source:
Language: German, Training data: Europeana, Size: 13-28GB (filtered)
Language: French, Training data: Europeana, Size: 11-31GB (filtered)
Language: English, Training data: British Library, Size: 24GB (year filtered)
Language: Finnish, Training data: Europeana, Size: 1.2GB
Language: Swedish, Training data: Europeana, Size: 1.1GB
Models
------
At the moment, the following models are available on the model hub:
We also released smaller models for the multilingual model:
Notice: We have previously released language models for Historic German and French that were trained on noisier data - see
this repo for more information:
Corpora Stats
=============
German Europeana Corpus
-----------------------
We provide some statistics using different OCR confidence thresholds, in order to shrink the corpus size
and use less noisy data:
For the final corpus we use an OCR confidence of 0.6 (28GB). The following plot shows the tokens-per-year distribution:
!German Europeana Corpus Stats
French Europeana Corpus
-----------------------
As for German, we use different OCR confidence thresholds:
For the final corpus we use an OCR confidence of 0.7 (27GB). The following plot shows the tokens-per-year distribution:
!French Europeana Corpus Stats
British Library Corpus
----------------------
Metadata is taken from here. Stats incl. year filtering:
We use the year-filtered variant. The following plot shows the tokens-per-year distribution:
!British Library Corpus Stats
Finnish Europeana Corpus
------------------------
The following plot shows the tokens-per-year distribution:
!Finnish Europeana Corpus Stats
Swedish Europeana Corpus
------------------------
The following plot shows the tokens-per-year distribution:
!Swedish Europeana Corpus Stats
All Corpora
-----------
The following plot shows the tokens-per-year distribution of the complete training corpus:
!All Corpora Stats
Multilingual Vocab generation
=============================
For the first attempt, we use the first 10GB of each pretraining corpus. We upsample both Finnish and Swedish to ~10GB.
The following table shows the exact sizes used for generating the 32k and 64k subword vocabs:
We then calculate the subword fertility rate and portion of '[UNK]'s over the following NER corpora:
Breakdown of subword fertility rate and unknown portion per language for the 32k vocab:
Language: German, Subword fertility: 1.43, Unknown portion: 0.0004
Language: French, Subword fertility: 1.25, Unknown portion: 0.0001
Language: English, Subword fertility: 1.25, Unknown portion: 0.0
Language: Finnish, Subword fertility: 1.69, Unknown portion: 0.0007
Language: Swedish, Subword fertility: 1.43, Unknown portion: 0.0
Breakdown of subword fertility rate and unknown portion per language for the 64k vocab:
Language: German, Subword fertility: 1.31, Unknown portion: 0.0004
Language: French, Subword fertility: 1.16, Unknown portion: 0.0001
Language: English, Subword fertility: 1.17, Unknown portion: 0.0
Language: Finnish, Subword fertility: 1.54, Unknown portion: 0.0007
Language: Swedish, Subword fertility: 1.32, Unknown portion: 0.0
Final pretraining corpora
=========================
We upsample Swedish and Finnish to ~27GB. The final stats for all pretraining corpora can be seen here:
Total size is 130GB.
Smaller multilingual models
===========================
Inspired by the "Well-Read Students Learn Better: On the Importance of Pre-training Compact Models"
paper, we train smaller models (with different numbers of layers and hidden sizes) and report the number of parameters and pre-training costs:
We then perform downstream evaluations on the multilingual NewsEye dataset:
!NewsEye hmBERT Evaluation
Pretraining
===========
Multilingual model - hmBERT Base
--------------------------------
We train a multilingual BERT model using the 32k vocab with the official BERT implementation
on a v3-32 TPU using the following parameters:
The following plot shows the pretraining loss curve:
!Training loss curve
Smaller multilingual models
---------------------------
We use the same parameters as used for training the base model.
### hmBERT Tiny
The following plot shows the pretraining loss curve for the tiny model:
!Training loss curve
### hmBERT Mini
The following plot shows the pretraining loss curve for the mini model:
!Training loss curve
### hmBERT Small
The following plot shows the pretraining loss curve for the small model:
!Training loss curve
### hmBERT Medium
The following plot shows the pretraining loss curve for the medium model:
!Training loss curve
English model
-------------
The English BERT model - with texts from the British Library corpus - was trained with the Hugging Face
JAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
The following plot shows the pretraining loss curve:
!Training loss curve
Finnish model
-------------
The BERT model - with texts from the Finnish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:
The following plot shows the pretraining loss curve:
!Training loss curve
Swedish model
-------------
The BERT model - with texts from the Swedish part of Europeana - was trained with the Hugging Face
JAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:
The following plot shows the pretraining loss curve:
!Training loss curve
Acknowledgments
===============
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as
TensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"### hmBERT Tiny\n\n\nThe following plot shows the pretraining loss curve for the tiny model:\n\n\n!Training loss curve",
"### hmBERT Mini\n\n\nThe following plot shows the pretraining loss curve for the mini model:\n\n\n!Training loss curve",
"### hmBERT Small\n\n\nThe following plot shows the pretraining loss curve for the small model:\n\n\n!Training loss curve",
"### hmBERT Medium\n\n\nThe following plot shows the pretraining loss curve for the medium model:\n\n\n!Training loss curve\n\n\nEnglish model\n-------------\n\n\nThe English BERT model - with texts from British Library corpus - was trained with the Hugging Face\nJAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nFinnish model\n-------------\n\n\nThe BERT model - with texts from Finnish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nSwedish model\n-------------\n\n\nThe BERT model - with texts from Swedish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nAcknowledgments\n===============\n\n\nResearch supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as\nTensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ️\n\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #tensorboard #safetensors #bert #fill-mask #multilingual #arxiv-1908.08962 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### hmBERT Tiny\n\n\nThe following plot shows the pretraining loss curve for the tiny model:\n\n\n!Training loss curve",
"### hmBERT Mini\n\n\nThe following plot shows the pretraining loss curve for the mini model:\n\n\n!Training loss curve",
"### hmBERT Small\n\n\nThe following plot shows the pretraining loss curve for the small model:\n\n\n!Training loss curve",
"### hmBERT Medium\n\n\nThe following plot shows the pretraining loss curve for the medium model:\n\n\n!Training loss curve\n\n\nEnglish model\n-------------\n\n\nThe English BERT model - with texts from British Library corpus - was trained with the Hugging Face\nJAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nFinnish model\n-------------\n\n\nThe BERT model - with texts from Finnish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nSwedish model\n-------------\n\n\nThe BERT model - with texts from Swedish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nAcknowledgments\n===============\n\n\nResearch supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as\nTensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ️\n\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
66,
30,
28,
28,
347
] |
[
"passage: TAGS\n#transformers #pytorch #tf #tensorboard #safetensors #bert #fill-mask #multilingual #arxiv-1908.08962 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### hmBERT Tiny\n\n\nThe following plot shows the pretraining loss curve for the tiny model:\n\n\n!Training loss curve### hmBERT Mini\n\n\nThe following plot shows the pretraining loss curve for the mini model:\n\n\n!Training loss curve### hmBERT Small\n\n\nThe following plot shows the pretraining loss curve for the small model:\n\n\n!Training loss curve### hmBERT Medium\n\n\nThe following plot shows the pretraining loss curve for the medium model:\n\n\n!Training loss curve\n\n\nEnglish model\n-------------\n\n\nThe English BERT model - with texts from British Library corpus - was trained with the Hugging Face\nJAX/FLAX implementation for 10 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nFinnish model\n-------------\n\n\nThe BERT model - with texts from Finnish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 1M steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nSwedish model\n-------------\n\n\nThe BERT model - with texts from Swedish part of Europeana - was trained with the Hugging Face\nJAX/FLAX implementation for 40 epochs (approx. 660K steps) on a v3-8 TPU, using the following command:\n\n\nThe following plot shows the pretraining loss curve:\n\n\n!Training loss curve\n\n\nAcknowledgments\n===============\n\n\nResearch supported with Cloud TPUs from Google's TPU Research Cloud (TRC) program, previously known as\nTensorFlow Research Cloud (TFRC). Many thanks for providing access to the TRC ️\n\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
-0.01838250458240509,
0.07497698813676834,
-0.004591826349496841,
0.0935814306139946,
0.08459219336509705,
0.0281265489757061,
0.0513935424387455,
0.11914435029029846,
0.019784854725003242,
0.12505578994750977,
0.01800212636590004,
-0.004408061970025301,
0.10708323866128922,
0.07002894580364227,
0.10031522065401077,
-0.26725950837135315,
0.015286745503544807,
-0.14339587092399597,
-0.012879506684839725,
0.061703287065029144,
0.13286221027374268,
-0.049308016896247864,
0.0773063525557518,
0.005427641794085503,
0.00662423437461257,
0.02681286446750164,
-0.029497727751731873,
-0.036411575973033905,
0.08129750192165375,
0.06282371282577515,
0.02853299304842949,
-0.03729385510087013,
0.057260334491729736,
-0.2171691656112671,
0.0006818408728577197,
0.0776631310582161,
-0.009753323160111904,
0.03736304119229317,
0.10214542597532272,
-0.0279656033962965,
0.08662102371454239,
-0.11414434760808945,
0.05029687657952309,
0.045086249709129333,
-0.08234736323356628,
-0.14702413976192474,
-0.13204717636108398,
0.09344872832298279,
0.11585770547389984,
0.05216670408844948,
-0.020233305171132088,
0.07557069510221481,
-0.0856800451874733,
0.05004337057471275,
0.2793901264667511,
-0.2027529627084732,
-0.04622967913746834,
-0.05339162424206734,
0.05674716830253601,
0.034474875777959824,
-0.07952550053596497,
0.013355844654142857,
0.0012370070908218622,
0.010445362888276577,
-0.01488783210515976,
0.002163159428164363,
0.03620881959795952,
-0.008577418513596058,
-0.10514488071203232,
-0.019028153270483017,
0.10807893425226212,
-0.004645194858312607,
-0.0795569121837616,
-0.18938635289669037,
-0.018536098301410675,
-0.03773390129208565,
-0.01827855035662651,
0.004385469015687704,
0.06355904042720795,
-0.03177975118160248,
0.08268830925226212,
-0.06494207680225372,
-0.07094825804233551,
-0.0010663216235116124,
0.017042797058820724,
0.09598168730735779,
0.02850382588803768,
0.0019258481916040182,
0.06261249631643295,
0.07416264712810516,
-0.1009078249335289,
-0.03536380082368851,
-0.024320369586348534,
-0.02747233957052231,
-0.09559712558984756,
-0.013390122912824154,
-0.0076389857567846775,
-0.06646184623241425,
0.001987295923754573,
0.21341915428638458,
0.03402634337544441,
0.024493880569934845,
-0.03990111127495766,
0.008588647469878197,
0.029462207108736038,
0.13073284924030304,
-0.11216874420642853,
-0.13901303708553314,
-0.018986279144883156,
-0.013873172923922539,
0.059853531420230865,
0.002015564125031233,
-0.01200065203011036,
0.028986893594264984,
0.01434057392179966,
0.0787043645977974,
0.029499389231204987,
0.016802452504634857,
-0.08529693633317947,
-0.03680713474750519,
0.15668834745883942,
-0.1264379769563675,
0.071449875831604,
0.015350238420069218,
-0.03899960219860077,
0.024907097220420837,
0.042215049266815186,
0.012826773338019848,
-0.0705830380320549,
0.11708971112966537,
-0.050523821264505386,
-0.009749419055879116,
-0.045183200389146805,
-0.10202311724424362,
0.046891745179891586,
-0.033271461725234985,
-0.061858199536800385,
-0.07730357348918915,
-0.02072875201702118,
-0.06297934055328369,
0.03946205601096153,
-0.06353288888931274,
-0.01744234375655651,
-0.02939661033451557,
-0.056464456021785736,
0.038754090666770935,
0.015417465008795261,
0.0595666877925396,
-0.01732446253299713,
0.01403512991964817,
-0.12174932658672333,
0.05614302307367325,
0.0541803278028965,
0.02246646210551262,
-0.051134880632162094,
-0.008520944975316525,
-0.28358370065689087,
0.06606274843215942,
-0.132695734500885,
0.004836035892367363,
-0.11571432650089264,
-0.028689511120319366,
0.08277411013841629,
0.03743920475244522,
0.024713460355997086,
0.08890194445848465,
-0.17414288222789764,
-0.04899533465504646,
0.1714445799589157,
-0.09474573284387589,
-0.002242365386337042,
0.1442134976387024,
0.003990788012742996,
0.027375195175409317,
0.027852144092321396,
0.1334686577320099,
0.04412377253174782,
-0.14020144939422607,
-0.044648177921772,
-0.06190132722258568,
-0.05038272961974144,
0.147560253739357,
0.06303991377353668,
-0.12529557943344116,
0.03476065769791603,
0.01511664129793644,
-0.0928725153207779,
-0.06102854013442993,
0.01619136892259121,
-0.009231594391167164,
0.02897515892982483,
-0.019016604870557785,
0.0019346270710229874,
-0.0037865471094846725,
-0.049945980310440063,
-0.08919233828783035,
-0.11897703260183334,
-0.006689291447401047,
0.09475875645875931,
-0.02355268783867359,
0.0748969316482544,
-0.056626494973897934,
-0.01113037671893835,
0.020184379070997238,
-0.03496541455388069,
-0.07251078635454178,
-0.10067931562662125,
0.08108916878700256,
-0.14647723734378815,
0.019197965040802956,
-0.03843053802847862,
0.059079382568597794,
0.0791577398777008,
-0.04065806046128273,
-0.023166347295045853,
-0.013846770860254765,
-0.018708376213908195,
-0.04894305393099785,
-0.15186434984207153,
-0.03388196974992752,
-0.012137913145124912,
0.10956902801990509,
-0.04277980700135231,
-0.017493732273578644,
0.03304886445403099,
0.14915940165519714,
0.0495106503367424,
-0.05695953220129013,
-0.0009552006958983839,
0.010290311649441719,
0.007390175014734268,
-0.05511286109685898,
0.01396782137453556,
-0.04304596036672592,
-0.008544501848518848,
0.09625044465065002,
-0.1230364516377449,
-0.17937225103378296,
0.07885769754648209,
0.13410720229148865,
-0.09971778839826584,
0.005583271384239197,
-0.06032155454158783,
-0.03603639453649521,
-0.07150408625602722,
-0.087070532143116,
0.14423224329948425,
0.04508695378899574,
0.08474673330783844,
-0.07967165112495422,
-0.08038293570280075,
-0.0015841529238969088,
-0.03652593865990639,
-0.08957476913928986,
0.12280010432004929,
-0.024162983521819115,
-0.09752131998538971,
0.06307537108659744,
-0.02167368307709694,
0.05711343511939049,
0.15706712007522583,
0.02176528237760067,
-0.10685811191797256,
-0.03281952813267708,
0.02877911925315857,
0.08813615888357162,
0.09000173211097717,
0.0035563369747251272,
0.03690013661980629,
0.05070043355226517,
0.020603828132152557,
0.006292149890214205,
-0.04952035844326019,
0.02139190584421158,
0.0073630125261843204,
-0.03865039721131325,
0.04656580463051796,
0.01181273814290762,
0.012281140312552452,
0.08929583430290222,
0.029386257752776146,
0.049766477197408676,
-0.04818480461835861,
-0.04621870443224907,
-0.09485994279384613,
0.10117428004741669,
-0.15671852231025696,
-0.25755253434181213,
-0.1571509689092636,
0.015771346166729927,
-0.031071770936250687,
-0.01897772215306759,
0.031087210401892662,
-0.06490792334079742,
-0.12198474258184433,
-0.12990815937519073,
0.07952530682086945,
0.021082352846860886,
-0.06551789492368698,
-0.0519949346780777,
0.010261857882142067,
0.046790238469839096,
-0.10674012452363968,
-0.0008368181297555566,
-0.0018373981583863497,
-0.08390948176383972,
-0.02464928664267063,
0.0495821014046669,
0.06632652133703232,
0.009783849120140076,
0.04045012593269348,
-0.01987491548061371,
-0.016492094844579697,
0.1272875815629959,
-0.10731779783964157,
0.12936800718307495,
0.040774937719106674,
-0.03718724101781845,
0.0787333995103836,
0.12205632776021957,
0.0005583474412560463,
-0.04694322124123573,
0.026539001613855362,
0.059327322989702225,
-0.009934182278811932,
-0.19217784702777863,
-0.07273104041814804,
-0.005397085566073656,
0.0800570696592331,
0.07311416417360306,
0.07692285627126694,
-0.03743058815598488,
-0.0395035482943058,
-0.05649024248123169,
-0.052111390978097916,
0.07605570554733276,
0.07018332928419113,
-0.03506311774253845,
0.002545110648497939,
0.02077251486480236,
-0.029770629480481148,
0.030265918001532555,
0.15279091894626617,
-0.006687644403427839,
0.14844508469104767,
-0.04340033233165741,
0.204502135515213,
0.038611166179180145,
0.07517001032829285,
0.0007572589674964547,
0.09116441756486893,
-0.020160265266895294,
0.02726554125547409,
-0.01694844476878643,
-0.048810336738824844,
0.0036235428415238857,
0.057950813323259354,
0.11489176005125046,
-0.014287389814853668,
-0.04418133571743965,
-0.03403383120894432,
0.09645664691925049,
0.29461750388145447,
0.05363300070166588,
-0.0730837881565094,
-0.015456943772733212,
0.020612338557839394,
-0.11504577100276947,
-0.057371094822883606,
0.000878174148965627,
0.0718030035495758,
-0.19257500767707825,
0.08470744639635086,
-0.024735575541853905,
0.02729548141360283,
-0.07196321338415146,
-0.04436325281858444,
0.07887852191925049,
0.07532859593629837,
-0.006973697803914547,
0.046409640461206436,
-0.19605618715286255,
0.07133689522743225,
-0.007497962098568678,
0.05017570033669472,
-0.028938259929418564,
0.025135373696684837,
0.005642155185341835,
-0.060727279633283615,
0.2177029401063919,
0.039179686456918716,
-0.0060600396245718,
-0.017742758616805077,
-0.15911316871643066,
-0.04035639017820358,
0.11310891807079315,
-0.16096965968608856,
0.06993868947029114,
-0.04904988035559654,
-0.05508439615368843,
-0.06287791579961777,
-0.025131095200777054,
-0.06492875516414642,
-0.1570369452238083,
0.029878079891204834,
-0.11359013617038727,
0.0002901718544308096,
-0.08747215569019318,
-0.01414119079709053,
-0.11259045451879501,
0.12672153115272522,
-0.09415681660175323,
-0.0597437247633934,
-0.1314782202243805,
-0.06134991720318794,
0.14393338561058044,
-0.04764938727021217,
0.06130359694361687,
-0.0180063359439373,
0.10339190810918808,
-0.03469991683959961,
-0.020652316510677338,
0.06225379928946495,
-0.10041819512844086,
-0.22944556176662445,
-0.03873603045940399,
0.1914825141429901,
0.07758241146802902,
0.03545099124312401,
-0.02346174791455269,
0.06784546375274658,
0.014724667184054852,
-0.062163520604372025,
0.0140851940959692,
0.19028399884700775,
0.005599664524197578,
0.09301856905221939,
-0.033275384455919266,
-0.043601151555776596,
-0.08216627687215805,
-0.04096629470586777,
0.11256449669599533,
0.23705719411373138,
-0.07057871669530869,
0.20767341554164886,
0.05980931967496872,
-0.08724096417427063,
-0.22206172347068787,
-0.06833211332559586,
0.08250375837087631,
-0.016818897798657417,
0.04607631638646126,
-0.10268574208021164,
0.0649406760931015,
0.0836360901594162,
-0.0003731143951881677,
0.0075850714929401875,
-0.19650597870349884,
-0.13132430613040924,
-0.00009504318586550653,
0.022485485300421715,
-0.12776856124401093,
-0.09017539769411087,
-0.03638658672571182,
-0.03802711144089699,
-0.03171171620488167,
0.09119751304388046,
0.02176234871149063,
0.06574343889951706,
0.0161287821829319,
-0.018230311572551727,
0.04010692238807678,
-0.05274474248290062,
0.1315503567457199,
0.0006615432212129235,
0.0008700272883288562,
-0.07788953930139542,
-0.006720277946442366,
0.011849350295960903,
-0.029129017144441605,
0.08881589025259018,
0.023957794532179832,
-0.01478641852736473,
-0.10074609518051147,
-0.025975273922085762,
-0.08064210414886475,
0.08047845214605331,
-0.09317826479673386,
-0.014880192466080189,
-0.06574171781539917,
0.09193854033946991,
0.07885187119245529,
-0.007170373108237982,
-0.05163529887795448,
-0.03615882992744446,
-0.032467544078826904,
0.12290814518928528,
0.14394697546958923,
0.14960549771785736,
-0.10638763010501862,
0.009623134508728981,
-0.02266727387905121,
0.022725705057382584,
0.0006025979528203607,
0.060287635773420334,
0.057062651962041855,
-0.02514052949845791,
0.08479106426239014,
-0.03233977407217026,
-0.1877731829881668,
-0.01477011851966381,
0.07108699530363083,
-0.1142258271574974,
-0.1848980337381363,
0.0027300126384943724,
-0.05806051939725876,
-0.03311494365334511,
-0.054740361869335175,
0.14488957822322845,
-0.02601415477693081,
-0.05373754724860191,
0.0015331106260418892,
0.08311688154935837,
-0.002947279019281268,
0.06711946427822113,
0.01123325526714325,
0.01234008464962244,
-0.08239259570837021,
0.1798771321773529,
0.037067487835884094,
-0.070403091609478,
0.024714896455407143,
0.18392956256866455,
-0.060240112245082855,
-0.04635774344205856,
0.01238054409623146,
0.0685010775923729,
-0.02067599818110466,
-0.02920771762728691,
0.008610808290541172,
-0.042542655020952225,
0.03182251378893852,
0.027357513085007668,
0.00405572634190321,
-0.020990174263715744,
0.045929521322250366,
0.030377009883522987,
-0.06215943023562431,
0.09900863468647003,
0.02937569096684456,
0.019046641886234283,
-0.029101882129907608,
0.06740113347768784,
-0.00785854458808899,
-0.02934357523918152,
0.015818582847714424,
0.03407582268118858,
-0.07262527197599411,
-0.05504230037331581,
0.009665303863584995,
-0.009872435592114925,
-0.03676630184054375,
-0.0368829108774662,
0.011984419077634811,
0.0029200143180787563,
0.037966225296258926,
0.009964853525161743,
-0.038416579365730286,
-0.09426009654998779,
-0.04130178689956665,
0.056379880756139755,
-0.12840798497200012,
-0.013053863309323788,
0.07383227348327637,
-0.07784447073936462,
0.1616670787334442,
0.028151391074061394,
0.0369388721883297,
0.014219528995454311,
-0.042326994240283966,
-0.004574701189994812,
-0.06890713423490524,
-0.005315940361469984,
0.050130292773246765,
-0.12536831200122833,
-0.017062537372112274,
-0.07755958288908005,
-0.0415453277528286,
-0.0043003442697227,
0.0745561420917511,
-0.09432709962129593,
0.07073409110307693,
0.04107894003391266,
-0.06596718728542328,
-0.07844087481498718,
-0.008957783691585064,
0.045746736228466034,
0.02081238478422165,
0.053385864943265915,
-0.06615421921014786,
0.06893545389175415,
-0.0711255818605423,
0.004025696776807308,
0.0003294531488791108,
0.000547576230019331,
-0.008706079795956612,
0.04077719897031784,
0.047532785683870316,
-0.014299001544713974,
0.07903705537319183,
0.0030690613202750683,
-0.011152289807796478,
0.050828102976083755,
-0.0533759742975235,
-0.07456227391958237,
0.05758000910282135,
-0.034995973110198975,
-0.05231659486889839,
0.0009630967397242785,
-0.05408048257231712,
-0.030563313513994217,
-0.018295438960194588,
-0.08191109448671341,
0.13703787326812744,
0.14033427834510803,
0.10663381218910217,
0.029856018722057343,
0.0014520130353048444,
-0.13258697092533112,
-0.14132624864578247,
0.06212732195854187,
-0.04013478383421898,
0.0671432763338089,
-0.051220931112766266,
0.12502215802669525,
0.06776692718267441,
-0.21412335336208344,
0.10079759359359741,
-0.02154095098376274,
-0.04872625693678856,
-0.061198651790618896,
-0.13281942903995514,
-0.05466828867793083,
-0.019822154194116592,
0.004205089993774891,
-0.09153120219707489,
0.08638890087604523,
0.05275481939315796,
0.04308638721704483,
0.02252444066107273,
0.08840763568878174,
-0.1633525937795639,
-0.06294316053390503,
0.10780304670333862,
0.054057925939559937,
0.03180251643061638,
0.0644444078207016,
0.00007095551700331271,
-0.05848664045333862,
0.028906812891364098,
0.047613002359867096,
0.07179775834083557,
0.010908284224569798,
0.004417906980961561,
-0.015637483447790146,
-0.08978146314620972,
0.027938678860664368,
-0.027230313047766685,
-0.021034106612205505,
0.10000819712877274,
0.05768291652202606,
0.007310186512768269,
-0.03264622390270233,
0.2302917242050171,
-0.061509035527706146,
-0.027476301416754723,
-0.16131579875946045,
0.11148304492235184,
-0.04467588663101196,
0.0508061908185482,
-0.016360467299818993,
-0.0902116671204567,
-0.03026147000491619,
0.18582548201084137,
0.14042215049266815,
-0.0268323365598917,
0.018725818023085594,
0.025976354256272316,
-0.008945228531956673,
0.03137330710887909,
0.0821147933602333,
0.05842192843556404,
0.16061681509017944,
-0.06543948501348495,
0.09932872653007507,
-0.02220502868294716,
-0.038988903164863586,
-0.09744497388601303,
0.19688546657562256,
-0.057210639119148254,
0.006156980991363525,
-0.04606388881802559,
0.05331490933895111,
0.010959283448755741,
-0.3100702464580536,
0.043515466153621674,
-0.09794948250055313,
-0.13634485006332397,
0.012481434270739555,
0.06696493178606033,
0.03506696969270706,
0.08163053542375565,
0.07605531811714172,
-0.015224069356918335,
0.22355005145072937,
0.037145934998989105,
-0.06211879476904869,
-0.022114593535661697,
0.03087041899561882,
-0.14987362921237946,
0.2555731534957886,
0.02411094307899475,
-0.021345950663089752,
0.08581890910863876,
-0.019428851082921028,
-0.14025336503982544,
-0.011787205003201962,
0.044839974492788315,
-0.09226434677839279,
0.03943173959851265,
0.22652697563171387,
-0.0030909525230526924,
0.053718939423561096,
0.038558028638362885,
-0.07765743136405945,
0.06320123374462128,
0.07155612111091614,
-0.004466915037482977,
-0.06386460363864899,
0.11380192637443542,
-0.12245496362447739,
0.13287103176116943,
0.1905631124973297,
-0.03742115944623947,
0.021784896031022072,
-0.07720133662223816,
-0.013400997035205364,
-0.012626060284674168,
0.13845470547676086,
0.048798974603414536,
-0.12940466403961182,
0.019900813698768616,
-0.05042421445250511,
0.06816644221544266,
-0.13737063109874725,
-0.046657636761665344,
-0.012370054610073566,
-0.04045470431447029,
-0.053929831832647324,
0.09804452210664749,
0.03951381891965866,
0.011623288504779339,
0.0010270479833707213,
0.09554577618837357,
0.014831733889877796,
0.09901735931634903,
-0.06622068583965302,
-0.00982118770480156
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz ConvBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a German Europeana ConvBERT model 🎉
# German Europeana ConvBERT
We use the open source [Europeana newspapers](http://www.europeana-newspapers.eu/)
that were provided by *The European Library*. The final
training corpus has a size of 51GB and consists of 8,035,986,369 tokens.
Detailed information about the data and pretraining steps can be found in
[this repository](https://github.com/stefan-it/europeana-bert).
## Results
For results on Historic NER, please refer to [this repository](https://github.com/stefan-it/europeana-bert).
## Usage
With Transformers >= 4.3 our German Europeana ConvBERT model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/convbert-base-german-europeana-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
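Once loaded, the model can be used to extract contextual token embeddings. The following is a minimal sketch, assuming the standard Transformers feature-extraction workflow (the example sentence is arbitrary):
```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "dbmdz/convbert-base-german-europeana-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Arbitrary German example sentence
sentence = "Die Zeitung berichtet über die Ereignisse in der Stadt."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Token-level embeddings with shape (batch_size, sequence_length, hidden_size)
token_embeddings = outputs.last_hidden_state
```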
# Huggingface model hub
All other German Europeana models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion
[here](https://github.com/stefan-it/europeana-bert/discussions) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "de", "license": "mit", "tags": ["historic german"]}
|
feature-extraction
|
dbmdz/convbert-base-german-europeana-cased
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"convbert",
"feature-extraction",
"historic german",
"de",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #convbert #feature-extraction #historic german #de #license-mit #endpoints_compatible #region-us
|
# + dbmdz ConvBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a German Europeana ConvBERT model
# German Europeana ConvBERT
We use the open source Europeana newspapers
that were provided by *The European Library*. The final
training corpus has a size of 51GB and consists of 8,035,986,369 tokens.
Detailed information about the data and pretraining steps can be found in
this repository.
## Results
For results on Historic NER, please refer to this repository.
## Usage
With Transformers >= 4.3 our German Europeana ConvBERT model can be loaded like:
# Huggingface model hub
All other German Europeana models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# + dbmdz ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana ConvBERT model",
"# German Europeana ConvBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 4.3 our German Europeana ConvBERT model can be loaded like:",
"# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #convbert #feature-extraction #historic german #de #license-mit #endpoints_compatible #region-us \n",
"# + dbmdz ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana ConvBERT model",
"# German Europeana ConvBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 4.3 our German Europeana ConvBERT model can be loaded like:",
"# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
50,
44,
70,
17,
25,
22,
37,
70
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #convbert #feature-extraction #historic german #de #license-mit #endpoints_compatible #region-us \n# + dbmdz ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana ConvBERT model# German Europeana ConvBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.## Results\n\nFor results on Historic NER, please refer to this repository.## Usage\n\nWith Transformers >= 4.3 our German Europeana ConvBERT model can be loaded like:# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
-0.08759792894124985,
0.13573436439037323,
0.0016784329200163484,
0.10921324044466019,
0.031112024560570717,
0.016719017177820206,
0.18901410698890686,
0.04793982580304146,
0.16597464680671692,
0.062013156712055206,
-0.0365915410220623,
-0.12791267037391663,
0.004820250440388918,
0.12435613572597504,
0.0895557776093483,
-0.24732860922813416,
-0.003855593502521515,
-0.0715281143784523,
0.01693497784435749,
0.030132560059428215,
0.09097416698932648,
-0.027807937934994698,
0.05300406739115715,
-0.008610048331320286,
-0.062047477811574936,
0.09690680354833603,
-0.09301220625638962,
-0.05432366579771042,
0.15414077043533325,
0.0809021145105362,
0.009081421419978142,
-0.04912656545639038,
0.037834443151950836,
-0.05113466829061508,
0.024340083822607994,
0.0451543964445591,
-0.0855061337351799,
0.04898151755332947,
0.09672976285219193,
-0.07884421199560165,
0.20290981233119965,
-0.011616610921919346,
-0.006453210487961769,
0.05906454101204872,
-0.015543574467301369,
-0.021945104002952576,
-0.14507552981376648,
0.11075924336910248,
-0.05189434066414833,
0.054673682898283005,
0.02405623532831669,
0.2005259096622467,
-0.07481271028518677,
0.034854236990213394,
0.15522603690624237,
-0.17618755996227264,
-0.060480449348688126,
0.08268444985151291,
-0.024103734642267227,
0.03298234939575195,
-0.08520253747701645,
0.0660635232925415,
0.03152909874916077,
0.05499815568327904,
-0.06742186844348907,
0.004593360237777233,
-0.11510591953992844,
-0.07366727292537689,
-0.0597994290292263,
-0.0110222352668643,
0.16734367609024048,
0.03720312565565109,
-0.09237650781869888,
-0.14404915273189545,
-0.04257446527481079,
0.1952485591173172,
-0.0030439263209700584,
0.05945134535431862,
0.031215885654091835,
-0.03662429749965668,
0.08297096937894821,
-0.11579971015453339,
-0.0747026577591896,
-0.01756342686712742,
-0.027918964624404907,
0.293258935213089,
0.01950133591890335,
0.04922927916049957,
0.062355887144804,
0.07258488982915878,
-0.1402793526649475,
-0.051118336617946625,
-0.0408683605492115,
-0.045596398413181305,
-0.07015293836593628,
-0.01308414712548256,
-0.01595262996852398,
-0.07479782402515411,
0.011365477927029133,
0.21555779874324799,
-0.014300134032964706,
-0.06820189952850342,
-0.09475567191839218,
0.0147702656686306,
0.05227696895599365,
0.06266815960407257,
-0.12295159697532654,
-0.15586361289024353,
0.08006199449300766,
-0.16459998488426208,
0.11108265072107315,
0.0004868963733315468,
-0.05940379202365875,
-0.04368112236261368,
-0.059118568897247314,
0.05721419304609299,
-0.008733142167329788,
0.016328034922480583,
0.0033599864691495895,
-0.03854481130838394,
0.3259747624397278,
-0.07052045315504074,
-0.028987381607294083,
0.002677513984963298,
-0.048676908016204834,
-0.03249048814177513,
0.07805357873439789,
-0.023527273908257484,
-0.0393700934946537,
0.052258752286434174,
-0.08269160985946655,
-0.05461498722434044,
-0.004603159613907337,
-0.11708510667085648,
0.11153911054134369,
-0.09392213076353073,
-0.006947872694581747,
-0.19167280197143555,
-0.0960233137011528,
-0.004707809071987867,
-0.013960776850581169,
-0.04187816381454468,
0.035559818148612976,
0.049209196120500565,
-0.027692075818777084,
-0.03933316469192505,
0.005864178296178579,
0.020171163603663445,
-0.052817609161138535,
-0.06527752429246902,
-0.15025487542152405,
0.02034689113497734,
-0.13141381740570068,
-0.037452906370162964,
-0.10150345414876938,
-0.0699547603726387,
-0.30889248847961426,
0.034949205815792084,
-0.19096820056438446,
0.0036190191749483347,
-0.07856187224388123,
0.017025204375386238,
0.06342267245054245,
-0.009572754614055157,
0.038761526346206665,
0.0878063514828682,
-0.15042251348495483,
-0.0361618846654892,
0.1778988391160965,
-0.1471031904220581,
-0.1009877547621727,
0.1669466346502304,
-0.02471226267516613,
-0.021753011271357536,
0.0032216922845691442,
0.17358821630477905,
0.09942129999399185,
-0.17397068440914154,
-0.13900186121463776,
-0.021275566890835762,
0.008964124135673046,
0.00693944888189435,
0.03888824209570885,
-0.04045489430427551,
0.004668372683227062,
0.03675966337323189,
-0.055467668920755386,
0.04320240020751953,
-0.006759802345186472,
0.01960357464849949,
-0.033008474856615067,
-0.10056320577859879,
0.047628313302993774,
0.03595763444900513,
0.02798392064869404,
-0.06901612877845764,
-0.028130950406193733,
0.06726490706205368,
0.1153007224202156,
-0.038115810602903366,
0.02791168913245201,
0.041972097009420395,
0.06158876791596413,
-0.055426158010959625,
-0.023690607398748398,
-0.13306334614753723,
-0.2589685916900635,
0.09784151613712311,
-0.11209724843502045,
0.07554952055215836,
0.015856564044952393,
0.07742217928171158,
0.14084352552890778,
-0.06344491243362427,
0.038831084966659546,
-0.06138407811522484,
0.007532359100878239,
0.012376445345580578,
-0.09423404932022095,
-0.03947572410106659,
-0.06769338250160217,
0.05746950954198837,
-0.011869457550346851,
-0.043572958558797836,
0.03906754404306412,
0.24186916649341583,
0.021028796210885048,
-0.07710177451372147,
0.03405344486236572,
0.004284611903131008,
-0.04385629668831825,
-0.03986188396811485,
-0.04369005933403969,
-0.02654150500893593,
-0.038608286529779434,
0.17310912907123566,
0.025713440030813217,
0.03125173598527908,
0.05361805856227875,
0.22204183042049408,
-0.08667779713869095,
-0.022944916039705276,
-0.11398281902074814,
-0.005990275181829929,
-0.07970355451107025,
-0.08138266950845718,
0.17852172255516052,
0.04527730122208595,
0.07811763882637024,
-0.06469764560461044,
-0.06631677597761154,
0.04093126952648163,
0.06760982424020767,
-0.07186735421419144,
0.13156229257583618,
-0.04586346074938774,
-0.007699266541749239,
0.07588372379541397,
-0.01927746832370758,
0.01215269323438406,
0.21613530814647675,
0.004644603934139013,
-0.10861498862504959,
0.03349637612700462,
-0.052223727107048035,
0.050213832408189774,
0.18942411243915558,
-0.009652705863118172,
0.025102511048316956,
0.05506366118788719,
0.0023137081880122423,
0.02159120701253414,
-0.0029108079615980387,
-0.0003314759233035147,
-0.060470882803201675,
-0.04394010826945305,
0.018264126032590866,
0.12374506890773773,
-0.023583823814988136,
0.09472642838954926,
-0.03675904497504234,
-0.02920220047235489,
-0.07569348067045212,
-0.02963658794760704,
-0.07188882678747177,
0.12846915423870087,
-0.08999035507440567,
-0.1901596337556839,
-0.11444368213415146,
0.12166020274162292,
-0.16240130364894867,
0.009417260065674782,
0.012468996457755566,
0.055013734847307205,
-0.11417863517999649,
-0.13440953195095062,
0.025290990248322487,
0.11106712371110916,
-0.060269102454185486,
-0.07875537872314453,
-0.025239313021302223,
0.006133216433227062,
-0.1752503663301468,
0.008782763965427876,
-0.030347445979714394,
-0.07647736370563507,
-0.014700842089951038,
0.05359003692865372,
0.14717800915241241,
-0.042602360248565674,
-0.040912337601184845,
-0.03331116586923599,
-0.03631453961133957,
0.118754543364048,
-0.08054423332214355,
0.14922305941581726,
0.0393543615937233,
0.0008945506997406483,
-0.003887196071445942,
0.021890969946980476,
0.061331622302532196,
-0.03835253044962883,
0.0179713387042284,
0.04741168022155762,
-0.05331258103251457,
-0.20751707255840302,
-0.21630653738975525,
-0.03884514793753624,
0.04019090533256531,
-0.004503802862018347,
0.08170110732316971,
0.021739808842539787,
0.0009107901132665575,
-0.07962968200445175,
-0.02620396390557289,
-0.027516108006238937,
0.04140131175518036,
0.0062973638996481895,
0.007195500191301107,
-0.011033820919692516,
-0.07182728499174118,
0.026775622740387917,
0.17959196865558624,
0.0524393692612648,
0.06944464147090912,
-0.08680567890405655,
0.08845506608486176,
-0.016537098214030266,
0.04967968538403511,
-0.01212678849697113,
0.1405184417963028,
0.06885799020528793,
-0.019021879881620407,
-0.025713089853525162,
-0.08501525968313217,
0.047849707305431366,
0.05694441497325897,
0.02482304535806179,
-0.03858695179224014,
-0.03411409631371498,
-0.16291283071041107,
0.08536748588085175,
0.1500934213399887,
0.023237770423293114,
-0.06225629523396492,
-0.12691441178321838,
-0.027264906093478203,
-0.11330150067806244,
-0.03433453291654587,
0.012547241523861885,
0.1334281861782074,
-0.16980023682117462,
0.09375662356615067,
0.009905613027513027,
0.04142642766237259,
-0.06759055703878403,
-0.06080767139792442,
-0.035978998988866806,
0.05334649980068207,
-0.03648688644170761,
0.058056894689798355,
-0.08346814662218094,
0.18228654563426971,
-0.0010955097386613488,
0.006657373160123825,
-0.1025504395365715,
-0.0008220684831030667,
0.050391435623168945,
-0.018665356561541557,
0.1956101655960083,
0.0017725344514474273,
0.09291692823171616,
0.05762584134936333,
-0.20312152802944183,
0.03691628575325012,
0.012834632769227028,
-0.14338096976280212,
-0.01566128432750702,
0.06698856502771378,
-0.06203016638755798,
-0.07847586274147034,
-0.00020003083045594394,
-0.18499477207660675,
-0.11548829078674316,
-0.00939803384244442,
-0.037079837173223495,
-0.07497656345367432,
-0.04962652549147606,
-0.10540429502725601,
-0.14071092009544373,
0.10678429156541824,
0.11877860128879547,
-0.019931023940443993,
-0.15411630272865295,
-0.05418701842427254,
0.14600782096385956,
-0.0622977688908577,
0.07243908196687698,
0.008558796718716621,
0.15508215129375458,
-0.10558804869651794,
-0.11249522119760513,
0.01846228912472725,
-0.17825278639793396,
-0.12287311255931854,
-0.037460263818502426,
0.11731492727994919,
0.0742102712392807,
0.024136580526828766,
0.02024218626320362,
0.017877917736768723,
0.0854845866560936,
-0.111073337495327,
0.001332728425040841,
0.09276337921619415,
0.05220995470881462,
0.11241167038679123,
-0.02376486547291279,
-0.054445091634988785,
0.04509681835770607,
0.014703566208481789,
0.07322955876588821,
0.17614318430423737,
-0.07707662880420685,
0.18520425260066986,
0.1176833063364029,
-0.02529679611325264,
-0.27658167481422424,
0.03165964409708977,
0.07248269021511078,
-0.025550397112965584,
0.055326517671346664,
-0.06711917370557785,
0.12102938443422318,
0.08510440587997437,
-0.044000644236803055,
0.01937582716345787,
-0.0898989662528038,
-0.12282711267471313,
0.06278324127197266,
0.01719215326011181,
-0.03569287061691284,
-0.02010190300643444,
-0.02241465263068676,
-0.022215772420167923,
-0.15272092819213867,
0.07671848684549332,
-0.06543461978435516,
0.02310348115861416,
-0.012891186401247978,
0.007264983840286732,
0.024062659591436386,
-0.05158922076225281,
0.046479061245918274,
0.10366984456777573,
-0.012625017203390598,
-0.062378644943237305,
0.08661094307899475,
0.10205347836017609,
0.027918167412281036,
0.14441925287246704,
0.02402099221944809,
0.028088746592402458,
-0.09780379384756088,
-0.002654526848345995,
-0.08588892966508865,
0.21753685176372528,
-0.023041201755404472,
-0.0719003900885582,
-0.021101869642734528,
0.08010771125555038,
0.0012454392854124308,
0.005900141783058643,
0.002340662758797407,
0.03297211602330208,
0.03524557128548622,
0.14047330617904663,
0.08206768333911896,
0.05237162113189697,
-0.07279865443706512,
0.03747471049427986,
-0.07593671977519989,
0.010835589841008186,
-0.00799093209207058,
0.02031422220170498,
0.08249524235725403,
0.027462126687169075,
0.0018543967744335532,
-0.05642133206129074,
-0.15259844064712524,
0.009517437778413296,
0.0594971738755703,
-0.1896011233329773,
-0.15668220818042755,
-0.05781092494726181,
-0.13648244738578796,
0.06761601567268372,
0.09960795193910599,
0.19845527410507202,
-0.140523299574852,
-0.03249191492795944,
-0.007390380837023258,
-0.0057989452034235,
0.030622093006968498,
0.05416455492377281,
0.07049296796321869,
-0.017977464944124222,
-0.06931706517934799,
0.18690551817417145,
0.0021163783967494965,
-0.10105325281620026,
0.04744793102145195,
0.05899444594979286,
-0.06942813098430634,
-0.062035463750362396,
-0.029962662607431412,
0.018236422911286354,
-0.1895642727613449,
-0.045941878110170364,
-0.021534966304898262,
0.008503112941980362,
-0.01266150176525116,
0.1627682000398636,
0.055531129240989685,
-0.008254065178334713,
-0.0273049995303154,
-0.0024692509323358536,
-0.028685614466667175,
0.056022677570581436,
0.032866448163986206,
0.08198348432779312,
0.0036784519907087088,
-0.014964393340051174,
-0.06907514482736588,
-0.003553735790774226,
-0.041971806436777115,
-0.008157925680279732,
-0.052161455154418945,
-0.10276374965906143,
-0.18821761012077332,
-0.005599474534392357,
-0.02166791260242462,
0.022518794983625412,
-0.04151495173573494,
-0.0666477382183075,
0.040127914398908615,
0.03672628477215767,
-0.04376707226037979,
-0.011283944360911846,
0.006334651727229357,
0.09441966563463211,
-0.1320425271987915,
0.0021674034651368856,
0.057881854474544525,
-0.04304127022624016,
0.21916703879833221,
0.06969907879829407,
0.007761708460748196,
0.04275289550423622,
-0.1411694437265396,
-0.00866562221199274,
-0.0005541134160012007,
0.03994743898510933,
0.09677331894636154,
-0.0011074859648942947,
0.008280664682388306,
-0.009936128742992878,
0.03242255374789238,
0.026997504755854607,
0.04129255190491676,
-0.0534520149230957,
0.0526939332485199,
0.08388998359441757,
-0.03692464902997017,
-0.05609361454844475,
0.019781095907092094,
0.11621880531311035,
0.03478805720806122,
-0.01571720838546753,
-0.04279852285981178,
0.02789182960987091,
-0.06417450308799744,
-0.00610258849337697,
0.0026294123381376266,
-0.08072351664304733,
-0.05207562819123268,
0.08828603476285934,
0.02824067696928978,
-0.004829296376556158,
0.2646486163139343,
0.021326640620827675,
0.14342157542705536,
0.03475299850106239,
0.053929366171360016,
-0.04662027582526207,
0.010652188211679459,
0.04621386155486107,
-0.057927124202251434,
0.04682456701993942,
-0.08956019580364227,
0.008133909665048122,
-0.027814475819468498,
-0.13195785880088806,
0.10039491951465607,
0.01996210590004921,
-0.055105384439229965,
0.018866781145334244,
0.10930345952510834,
-0.054391104727983475,
-0.10009303689002991,
-0.13965268433094025,
0.02701537497341633,
0.10472343116998672,
-0.0920834168791771,
-0.0038743193726986647,
0.021834643557667732,
-0.1768030822277069,
0.07093437761068344,
0.05952132120728493,
-0.0020420688670128584,
-0.021012399345636368,
-0.05854165181517601,
0.003452819539234042,
-0.05011427029967308,
0.07048103213310242,
-0.09010662138462067,
0.045194100588560104,
0.0011243068147450686,
0.06857416033744812,
0.009690258651971817,
0.12751992046833038,
-0.13175292313098907,
-0.01339570339769125,
0.09785601496696472,
0.05741577968001366,
0.016994686797261238,
0.01665153168141842,
-0.029118653386831284,
-0.0902634710073471,
0.08834889531135559,
-0.0009653858141973615,
0.058072786778211594,
0.046569518744945526,
-0.01855204999446869,
-0.0240036528557539,
-0.04764629527926445,
-0.028384415432810783,
-0.02498197928071022,
-0.02062234841287136,
0.13173618912696838,
0.08876005560159683,
0.00461227260529995,
-0.003063639858737588,
0.22923801839351654,
-0.04955301433801651,
-0.005359914153814316,
-0.12844312191009521,
0.1093369871377945,
-0.07735291868448257,
0.060231998562812805,
0.029907410964369774,
-0.08753592520952225,
-0.04734288156032562,
0.10995390266180038,
0.2338513284921646,
0.04716771841049194,
0.04306062310934067,
0.039316825568675995,
-0.026764987036585808,
0.05249360203742981,
0.07403096556663513,
-0.025577755644917488,
0.2953537404537201,
-0.027286730706691742,
0.054417166858911514,
0.011785736307501793,
0.002807979704812169,
-0.049882542341947556,
0.20773062109947205,
-0.12649470567703247,
-0.09356299042701721,
-0.0429438017308712,
0.059036869555711746,
-0.06539111584424973,
-0.34643441438674927,
-0.036795925348997116,
-0.10019909590482712,
-0.12507331371307373,
-0.03836090490221977,
-0.05691374093294144,
0.08782519400119781,
0.06146647408604622,
0.03276389092206955,
-0.008076800964772701,
0.21741290390491486,
0.0034250032622367144,
-0.10081280767917633,
-0.0808749794960022,
0.019595090299844742,
-0.08529576659202576,
0.28807637095451355,
-0.011430428363382816,
0.04402824491262436,
0.04323887825012207,
0.02899441495537758,
-0.15536551177501678,
-0.070104219019413,
0.07215142250061035,
-0.15733222663402557,
0.00741841085255146,
0.169663667678833,
-0.06957123428583145,
0.007400862406939268,
-0.0020649083890020847,
-0.08821424096822739,
0.058026328682899475,
0.10156270116567612,
0.03863610327243805,
-0.0893935114145279,
0.1356329768896103,
-0.08404704183340073,
0.12177488207817078,
0.10370007157325745,
-0.02525501884520054,
-0.022006429731845856,
-0.06532586365938187,
0.028728071600198746,
0.02824695035815239,
0.016020050272345543,
0.040726736187934875,
-0.0903848186135292,
0.02489134669303894,
-0.09622021019458771,
-0.008430879563093185,
-0.1593611091375351,
-0.02807604894042015,
-0.04160788282752037,
0.00040436454582959414,
0.007498357445001602,
0.05801275372505188,
0.12860575318336487,
-0.007139122113585472,
0.017158735543489456,
0.05788516625761986,
0.003068088088184595,
0.004279440268874168,
-0.05794911086559296,
-0.05029609054327011
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish ConvBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ConvBERT model for Turkish 🎉
# 🇹🇷 ConvBERTurk
ConvBERTurk is a community-driven cased ConvBERT model for Turkish.
In addition to the BERT- and ELECTRA-based models, we also trained a ConvBERT model. The ConvBERT architecture is presented
in the ["ConvBERT: Improving BERT with Span-based Dynamic Convolution"](https://arxiv.org/abs/2008.02496) paper.
We follow a different training procedure: instead of using a two-phase approach that pre-trains the model for 90% of the steps with a
sequence length of 128 and for the remaining 10% with a sequence length of 512, we pre-train the model with a sequence length of 512 for 1M steps on a v3-32 TPU.
## Stats
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-32!
## Usage
With Transformers >= 4.3 our cased ConvBERT model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/convbert-base-turkish-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
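For downstream feature extraction, a sentence-level representation can then be derived from the token embeddings, for example by mean-pooling. A minimal sketch, assuming the standard Transformers workflow (the example sentence is arbitrary):
```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "dbmdz/convbert-base-turkish-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Merhaba dünya!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings over the sequence dimension,
# ignoring padding positions via the attention mask
mask = inputs["attention_mask"].unsqueeze(-1)
sentence_embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
```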
## Results
For results on PoS tagging, NER and Question Answering downstream tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our DBMDZ BERT models in general, just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
|
feature-extraction
|
dbmdz/convbert-base-turkish-cased
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"convbert",
"feature-extraction",
"tr",
"arxiv:2008.02496",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2008.02496"
] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #safetensors #convbert #feature-extraction #tr #arxiv-2008.02496 #license-mit #endpoints_compatible #region-us
|
# + dbmdz Turkish ConvBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ConvBERT model for Turkish
# 🇹🇷 ConvBERTurk
ConvBERTurk is a community-driven cased ConvBERT model for Turkish.
In addition to the BERT- and ELECTRA-based models, we also trained a ConvBERT model. The ConvBERT architecture is presented
in the "ConvBERT: Improving BERT with Span-based Dynamic Convolution" paper.
We follow a different training procedure: instead of using a two-phase approach that pre-trains the model for 90% of the steps with a
sequence length of 128 and for the remaining 10% with a sequence length of 512, we pre-train the model with a sequence length of 512 for 1M steps on a v3-32 TPU.
## Stats
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-32!
## Usage
With Transformers >= 4.3 our cased ConvBERT model can be loaded like:
## Results
For results on PoS tagging, NER and Question Answering downstream tasks, please refer to
this repository.
# Huggingface model hub
All models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our DBMDZ BERT models in general, just open an issue
here
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# + dbmdz Turkish ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a cased ConvBERT model for Turkish",
"# 🇹🇷 ConvBERTurk\n\nConvBERTurk is a community-driven cased ConvBERT model for Turkish.\n\nIn addition to the BERT and ELECTRA based models, we also trained a ConvBERT model. The ConvBERT architecture is presented\nin the \"ConvBERT: Improving BERT with Span-based Dynamic Convolution\" paper.\n\nWe follow a different training procedure: instead of using a two-phase approach, that pre-trains the model for 90% with 128\nsequence length and 10% with 512 sequence length, we pre-train the model with 512 sequence length for 1M steps on a v3-32 TPU.",
"## Stats\n\nThe current version of the model is trained on a filtered and sentence\nsegmented version of the Turkish OSCAR corpus,\na recent Wikipedia dump, various OPUS corpora and a\nspecial corpus provided by Kemal Oflazer.\n\nThe final training corpus has a size of 35GB and 44,04,976,662 tokens.\n\nThanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model\non a TPU v3-32!",
"## Usage\n\nWith Transformers >= 4.3 our cased ConvBERT model can be loaded like:",
"## Results\n\nFor results on PoS tagging, NER and Question Answering downstream tasks, please refer to\nthis repository.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our DBMDZ BERT models in general, just open an issue\nhere",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #convbert #feature-extraction #tr #arxiv-2008.02496 #license-mit #endpoints_compatible #region-us \n",
"# + dbmdz Turkish ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a cased ConvBERT model for Turkish",
"# 🇹🇷 ConvBERTurk\n\nConvBERTurk is a community-driven cased ConvBERT model for Turkish.\n\nIn addition to the BERT and ELECTRA based models, we also trained a ConvBERT model. The ConvBERT architecture is presented\nin the \"ConvBERT: Improving BERT with Span-based Dynamic Convolution\" paper.\n\nWe follow a different training procedure: instead of using a two-phase approach, that pre-trains the model for 90% with 128\nsequence length and 10% with 512 sequence length, we pre-train the model with 512 sequence length for 1M steps on a v3-32 TPU.",
"## Stats\n\nThe current version of the model is trained on a filtered and sentence\nsegmented version of the Turkish OSCAR corpus,\na recent Wikipedia dump, various OPUS corpora and a\nspecial corpus provided by Kemal Oflazer.\n\nThe final training corpus has a size of 35GB and 44,04,976,662 tokens.\n\nThanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model\non a TPU v3-32!",
"## Usage\n\nWith Transformers >= 4.3 our cased ConvBERT model can be loaded like:",
"## Results\n\nFor results on PoS tagging, NER and Question Answering downstream tasks, please refer to\nthis repository.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our DBMDZ BERT models in general, just open an issue\nhere",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
54,
48,
154,
101,
24,
29,
18,
31,
107
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #convbert #feature-extraction #tr #arxiv-2008.02496 #license-mit #endpoints_compatible #region-us \n# + dbmdz Turkish ConvBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a cased ConvBERT model for Turkish# 🇹🇷 ConvBERTurk\n\nConvBERTurk is a community-driven cased ConvBERT model for Turkish.\n\nIn addition to the BERT and ELECTRA based models, we also trained a ConvBERT model. The ConvBERT architecture is presented\nin the \"ConvBERT: Improving BERT with Span-based Dynamic Convolution\" paper.\n\nWe follow a different training procedure: instead of using a two-phase approach, that pre-trains the model for 90% with 128\nsequence length and 10% with 512 sequence length, we pre-train the model with 512 sequence length for 1M steps on a v3-32 TPU.## Stats\n\nThe current version of the model is trained on a filtered and sentence\nsegmented version of the Turkish OSCAR corpus,\na recent Wikipedia dump, various OPUS corpora and a\nspecial corpus provided by Kemal Oflazer.\n\nThe final training corpus has a size of 35GB and 44,04,976,662 tokens.\n\nThanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model\non a TPU v3-32!## Usage\n\nWith Transformers >= 4.3 our cased ConvBERT model can be loaded like:## Results\n\nFor results on PoS tagging, NER and Question Answering downstream tasks, please refer to\nthis repository.# Huggingface model hub\n\nAll models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our DBMDZ BERT models in general, just open an issue\nhere"
] |
[
-0.016178198158740997,
0.08993861079216003,
-0.0030220807529985905,
0.01249473076313734,
0.06163255497813225,
0.022621899843215942,
0.13583983480930328,
0.09474103897809982,
-0.07179528474807739,
0.0585663728415966,
-0.014422102831304073,
-0.06711450964212418,
0.12912318110466003,
0.008211729116737843,
0.07244995981454849,
-0.31381380558013916,
0.03638835996389389,
-0.09625512361526489,
-0.0128175625577569,
0.06723100692033768,
0.11275537312030792,
-0.05554887652397156,
0.08018694818019867,
0.017217103391885757,
-0.04174557328224182,
0.04151593893766403,
-0.09222695231437683,
-0.09001997858285904,
0.08961845189332962,
0.02702181413769722,
0.08210118114948273,
-0.03381764516234398,
0.04698936268687248,
-0.1309213489294052,
0.02070692554116249,
0.0764213353395462,
0.009878663346171379,
0.06101846322417259,
0.14150306582450867,
0.01920877769589424,
0.19214211404323578,
-0.11176241189241409,
0.022607926279306412,
-0.00536688044667244,
-0.09180743992328644,
-0.014218815602362156,
-0.15654423832893372,
0.13687793910503387,
0.10207759588956833,
0.05528122931718826,
-0.016194723546504974,
-0.003728439100086689,
-0.0357399545609951,
0.0576162189245224,
0.11933520436286926,
-0.26394423842430115,
-0.059337910264730453,
0.007198737468570471,
0.0415581539273262,
0.06190510466694832,
-0.06388214975595474,
0.011126060970127583,
0.007830522954463959,
-0.028358343988656998,
-0.029588069766759872,
-0.04219139367341995,
0.045750491321086884,
-0.06565941870212555,
-0.06452064216136932,
-0.029011795297265053,
0.14252498745918274,
0.002157544484362006,
-0.06867797672748566,
-0.07779791951179504,
-0.02185019478201866,
-0.0028445147909224033,
-0.015410646796226501,
-0.013431061990559101,
0.00730300135910511,
-0.008976666256785393,
0.045194391161203384,
-0.06489986181259155,
-0.09269072115421295,
-0.0010718600824475288,
-0.06384183466434479,
0.15469494462013245,
0.06856895238161087,
0.007973656989634037,
0.0026313087437301874,
0.08405262231826782,
-0.08736153692007065,
-0.08829038590192795,
0.012140961363911629,
-0.01877153292298317,
-0.13469257950782776,
-0.035258032381534576,
-0.05516715720295906,
-0.18572071194648743,
-0.0003386833122931421,
0.08238685131072998,
-0.015311775729060173,
0.015243344008922577,
-0.006827015429735184,
-0.002518187277019024,
0.03231886401772499,
0.09692001342773438,
-0.09008877724409103,
-0.0828203335404396,
-0.014116090722382069,
-0.052753325551748276,
0.004315531812608242,
0.018566211685538292,
-0.01970534399151802,
-0.013241423293948174,
0.0036383825354278088,
0.05313031002879143,
-0.009392378851771355,
0.08131608366966248,
0.0449364148080349,
-0.0339963473379612,
0.15990440547466278,
-0.11917572468519211,
0.008904848247766495,
0.0010251072235405445,
-0.061149511486291885,
0.015883997082710266,
-0.015275429002940655,
-0.013828149996697903,
-0.08909600228071213,
0.13339583575725555,
-0.03652720898389816,
-0.015166981145739555,
-0.054785579442977905,
-0.1265905648469925,
0.013330045156180859,
-0.13420645892620087,
-0.04020493105053902,
-0.11242091655731201,
-0.09360441565513611,
-0.05089794844388962,
0.0077150072902441025,
-0.0449015237390995,
0.04180525243282318,
-0.00741223618388176,
-0.04634742811322212,
-0.014708167873322964,
0.02274363487958908,
0.06458435207605362,
-0.028477860614657402,
0.028265750035643578,
-0.11655078828334808,
0.05188978835940361,
-0.016994306817650795,
0.01434670202434063,
-0.061470434069633484,
0.005869035609066486,
-0.06641598790884018,
0.07723917067050934,
-0.019951147958636284,
0.043663494288921356,
-0.10856786370277405,
-0.0011529813054949045,
-0.09662376344203949,
0.0007344786426983774,
0.023895731195807457,
0.0630962997674942,
-0.206770122051239,
-0.00027679066988639534,
0.17073725163936615,
-0.07729905843734741,
0.0021736861672252417,
0.11249791085720062,
0.0016232628840953112,
-0.007162856869399548,
0.0937172919511795,
0.13026472926139832,
0.12339998781681061,
-0.06841479241847992,
-0.05340617150068283,
-0.05586579442024231,
-0.004022261127829552,
0.13359671831130981,
0.0550575815141201,
-0.06973318755626678,
0.10373663157224655,
0.029235156252980232,
-0.033830273896455765,
-0.013983500190079212,
0.024878788739442825,
-0.020645059645175934,
0.03698788210749626,
-0.0360393263399601,
-0.0030932731460779905,
-0.02390018105506897,
0.02720976248383522,
-0.06343214213848114,
-0.08561909198760986,
-0.031308673322200775,
0.11527453362941742,
-0.024742504581809044,
0.018008075654506683,
-0.042202819138765335,
0.017269961535930634,
-0.05625883489847183,
0.009697974659502506,
-0.1440666764974594,
-0.014575899578630924,
0.04834889620542526,
-0.051475442945957184,
0.07607515901327133,
0.06388920545578003,
0.04952302947640419,
0.06820952147245407,
-0.029014309868216515,
0.034026049077510834,
0.0045058708637952805,
0.009053429588675499,
-0.061669912189245224,
-0.06900139153003693,
-0.028140392154455185,
-0.025901254266500473,
0.07083933800458908,
-0.0004883331712335348,
0.010386526584625244,
0.059895943850278854,
0.12911853194236755,
0.0008102054125629365,
-0.043313201516866684,
-0.007775394711643457,
0.02808268554508686,
0.012839321047067642,
-0.06573417037725449,
-0.007871824316680431,
0.018117845058441162,
0.022470487281680107,
0.024558525532484055,
-0.17043958604335785,
-0.17647011578083038,
0.08097590506076813,
0.1592407524585724,
-0.09991918504238129,
-0.005390645936131477,
-0.061568763107061386,
-0.024167053401470184,
-0.008553698658943176,
-0.06932086497545242,
0.23279976844787598,
0.02489740587770939,
0.08052676916122437,
-0.06983144581317902,
-0.05563065782189369,
-0.0094208475202322,
-0.03397835046052933,
-0.032938402146101,
0.058480601757764816,
-0.01809946447610855,
-0.15391825139522552,
0.07195119559764862,
0.019444506615400314,
0.06777780503034592,
0.13721340894699097,
0.016468580812215805,
-0.131612628698349,
-0.0085085928440094,
0.021729841828346252,
0.0376703254878521,
0.0924484059214592,
-0.048718392848968506,
-0.035156238824129105,
0.030820108950138092,
0.01645880937576294,
0.04244634881615639,
-0.06532153487205505,
0.07754981517791748,
0.03326532617211342,
-0.02726013772189617,
0.03759065642952919,
-0.009757054969668388,
0.0019131660228595138,
0.09456362575292587,
0.04407663643360138,
0.030574794858694077,
-0.033874232321977615,
-0.056639377027750015,
-0.05663156136870384,
0.10373304784297943,
-0.11298371851444244,
-0.2470809668302536,
-0.1342838853597641,
0.0006907528731971979,
-0.0641707330942154,
0.025163833051919937,
0.03398118540644646,
-0.07687601447105408,
-0.059859052300453186,
-0.06369839608669281,
0.09582473337650299,
0.007126529701054096,
-0.0053568570874631405,
-0.07730171829462051,
0.0319671593606472,
-0.010017575696110725,
-0.1115955114364624,
-0.001957840984687209,
-0.023686040192842484,
-0.13504213094711304,
-0.03016446903347969,
0.02337457612156868,
0.05267053842544556,
0.04711534082889557,
-0.06344345957040787,
-0.018302174285054207,
-0.0002507481840439141,
0.05236496403813362,
-0.11396504938602448,
0.11456696689128876,
0.0545680969953537,
-0.07539798319339752,
0.042903196066617966,
0.12056589871644974,
0.007928670383989811,
-0.01035033818334341,
0.029023809358477592,
0.04861648380756378,
-0.006747809704393148,
-0.20453400909900665,
-0.06794086843729019,
-0.05658751353621483,
0.0029582930728793144,
0.06246209144592285,
0.07145041972398758,
0.038748811930418015,
0.049556490033864975,
-0.10816701501607895,
0.039436161518096924,
0.03477099537849426,
0.05438394472002983,
0.01915839873254299,
0.01136233564466238,
0.0069617945700883865,
-0.08886736631393433,
-0.04401812329888344,
0.11227124184370041,
0.04708752781152725,
0.2127448469400406,
-0.01029955968260765,
0.16053611040115356,
0.053806062787771225,
0.054003745317459106,
0.025926345959305763,
0.07933733612298965,
-0.04290099814534187,
0.03943997994065285,
-0.0296565480530262,
-0.07340632379055023,
-0.0025461064651608467,
0.04791723191738129,
0.028218934312462807,
-0.04181385785341263,
-0.0033627392258495092,
-0.048514001071453094,
0.09267020970582962,
0.2084396779537201,
0.009399870410561562,
-0.16933585703372955,
-0.0712183341383934,
0.00014856822963338345,
-0.10359445959329605,
-0.058209799230098724,
-0.037009235471487045,
0.12175590544939041,
-0.11800559610128403,
0.020287279039621353,
-0.028022367507219315,
0.08023102581501007,
-0.13971169292926788,
0.009957087226212025,
0.07001033425331116,
0.10440133512020111,
-0.014471993781626225,
0.090048648416996,
-0.11168695241212845,
0.11172662675380707,
0.021447697654366493,
0.10665721446275711,
-0.10229793190956116,
0.05273023992776871,
0.05394717678427696,
-0.06015487015247345,
0.08960070461034775,
0.005051181185990572,
0.004640601109713316,
0.0010062435176223516,
-0.17833460867404938,
0.0037139500491321087,
0.10686054825782776,
-0.10402628034353256,
0.07672551274299622,
-0.01426281314343214,
-0.004494940862059593,
-0.056638121604919434,
0.020508509129285812,
-0.09448100626468658,
-0.1882617324590683,
0.06201796978712082,
-0.11644083261489868,
0.0018354385392740369,
-0.05691120773553848,
0.009436911903321743,
-0.09250766038894653,
0.2018512487411499,
0.002294125035405159,
-0.09347469359636307,
-0.08582951128482819,
-0.021669350564479828,
0.1226440891623497,
-0.04480096697807312,
-0.011117891408503056,
0.021068159490823746,
0.11064606159925461,
-0.04767509177327156,
-0.08797863870859146,
-0.014827181585133076,
-0.054837483912706375,
-0.09960418194532394,
-0.06042632833123207,
0.15460407733917236,
0.048462316393852234,
0.04162677004933357,
0.016353007405996323,
0.024413572624325752,
0.022491924464702606,
-0.07032608240842819,
-0.0785144567489624,
0.060445744544267654,
0.028308888897299767,
0.034678928554058075,
-0.062175825238227844,
-0.07641860097646713,
-0.11302739381790161,
0.01148812286555767,
0.03994987532496452,
0.11883112788200378,
-0.013040339574217796,
0.10407925397157669,
0.1244010403752327,
-0.09101735055446625,
-0.1591789573431015,
-0.04251929372549057,
0.039665818214416504,
0.03737267106771469,
-0.03681156784296036,
-0.14372214674949646,
0.05313193425536156,
0.1307847499847412,
0.007326016668230295,
0.026125555858016014,
-0.27831771969795227,
-0.09187622368335724,
0.10121917724609375,
0.012301581911742687,
-0.07418835163116455,
-0.12784422934055328,
-0.06284107267856598,
-0.03843938559293747,
-0.017972217872738838,
0.07428309321403503,
-0.07089360803365707,
0.08368612825870514,
0.00538506917655468,
-0.013653559610247612,
0.04933764412999153,
-0.015035318210721016,
0.1276727318763733,
0.04252706468105316,
0.03423658758401871,
-0.12633608281612396,
0.014851766638457775,
0.08185182511806488,
-0.014882978051900864,
0.08885335177183151,
0.02940456196665764,
0.02260025404393673,
-0.10115736722946167,
-0.017911836504936218,
-0.023155897855758667,
0.04618195816874504,
-0.03870205208659172,
-0.02748957648873329,
-0.08537936210632324,
0.07915013283491135,
0.03914119303226471,
0.006489292718470097,
-0.030482495203614235,
0.007887878455221653,
-0.029095172882080078,
0.021103372797369957,
0.09639662504196167,
0.07001246511936188,
-0.054295361042022705,
-0.003555424278602004,
0.005906775128096342,
0.035931747406721115,
-0.018909746780991554,
-0.020130423828959465,
0.08207379281520844,
-0.0035792607814073563,
0.0907515361905098,
-0.01976100727915764,
-0.14587722718715668,
0.009015655145049095,
0.08228285610675812,
-0.1560891717672348,
-0.06862477213144302,
-0.017387591302394867,
-0.04283633828163147,
-0.04272100701928139,
-0.018114637583494186,
0.10491736978292465,
-0.07164439558982849,
-0.019393250346183777,
0.0023288200609385967,
0.06762810051441193,
-0.027294808998703957,
0.11258182674646378,
-0.0034398448187857866,
-0.011279880069196224,
-0.06083216518163681,
0.1648123413324356,
0.11460530757904053,
-0.07930167764425278,
0.008573494851589203,
0.1265341341495514,
-0.09847903251647949,
-0.03490303456783295,
-0.07796459645032883,
0.04036422073841095,
0.07384322583675385,
-0.06598798930644989,
-0.00005561931902775541,
-0.07970893383026123,
-0.03239130973815918,
-0.03323264420032501,
-0.00936066173017025,
0.07366891950368881,
-0.0435333251953125,
0.01440015435218811,
-0.1082807406783104,
0.05662350729107857,
0.041240375488996506,
0.027213988825678825,
-0.029103022068738937,
0.11555853486061096,
-0.04590114578604698,
-0.035947006195783615,
-0.001939576817676425,
0.0022386277560144663,
-0.023951886221766472,
-0.05461433157324791,
-0.11827737092971802,
-0.018159328028559685,
-0.12445779144763947,
-0.027945350855588913,
0.026180827990174294,
0.009329798631370068,
0.0044214483350515366,
-0.021855641156435013,
-0.03893491253256798,
-0.06071070209145546,
-0.05775297060608864,
0.06878817081451416,
-0.09030882269144058,
0.014525157399475574,
0.030500169843435287,
-0.074483223259449,
0.08953583240509033,
0.05638345703482628,
0.017581095919013023,
0.030296608805656433,
-0.04103723540902138,
-0.007878021337091923,
-0.0005930764600634575,
0.029209652915596962,
0.033610545098781586,
-0.06100205332040787,
-0.006371187977492809,
0.017589684575796127,
-0.03830981254577637,
-0.015386511571705341,
0.04012589901685715,
-0.08161015808582306,
0.023600488901138306,
-0.01818646304309368,
0.004639789927750826,
-0.07242424786090851,
0.03343048319220543,
0.13429157435894012,
0.027817439287900925,
0.06067143380641937,
-0.0636771023273468,
0.04166892170906067,
-0.09388905018568039,
0.01939205825328827,
-0.004179828334599733,
-0.05618133023381233,
0.03965034708380699,
-0.028045086190104485,
0.06320016831159592,
-0.001323479926213622,
0.10717197507619858,
0.035375479608774185,
0.02098018489778042,
0.025466227903962135,
-0.024646786972880363,
-0.026291128247976303,
0.03788226842880249,
0.11762520670890808,
0.022054383531212807,
-0.043254490941762924,
-0.1145893931388855,
-0.027341371402144432,
-0.03783056139945984,
-0.10880536586046219,
0.11196617782115936,
0.11630846560001373,
0.12454588711261749,
0.053360987454652786,
0.08229092508554459,
-0.07607706636190414,
-0.012261754833161831,
0.03823405131697655,
-0.023581480607390404,
0.05518433451652527,
-0.03066415712237358,
0.050395336002111435,
0.14420010149478912,
-0.18663936853408813,
0.07755482196807861,
0.03100830689072609,
-0.07608577609062195,
0.014018693007528782,
-0.12395872175693512,
-0.005935492925345898,
0.014806782826781273,
-0.008033095858991146,
-0.10890735685825348,
0.04283328726887703,
0.056684475392103195,
0.02266686223447323,
-0.05197003483772278,
0.16668052971363068,
-0.14091673493385315,
-0.0995153933763504,
0.11754802614450455,
0.03336922451853752,
0.056564811617136,
0.010010556317865849,
-0.03554808348417282,
-0.00401315139606595,
0.07671300321817398,
0.0691133439540863,
0.06206079572439194,
0.08975109457969666,
-0.0096803680062294,
-0.021376173943281174,
-0.04718005284667015,
0.007518712896853685,
-0.057782191783189774,
0.04273451492190361,
0.1610747128725052,
0.08886870741844177,
-0.04931972920894623,
-0.017142100259661674,
0.13801540434360504,
-0.019733991473913193,
-0.05220051482319832,
-0.1289718896150589,
0.002520631765946746,
0.03530203551054001,
0.004196751397103071,
0.05278773233294487,
-0.13405780494213104,
-0.013498889282345772,
0.17796596884727478,
0.20072515308856964,
0.03891834616661072,
0.017662525177001953,
0.01959952525794506,
0.007126001641154289,
-0.014946025796234608,
0.08424977958202362,
-0.005096520762890577,
0.19917526841163635,
-0.020065058022737503,
0.0781509205698967,
0.028014803305268288,
-0.012633328326046467,
-0.10090936720371246,
0.11358457058668137,
-0.01839648000895977,
-0.030476335436105728,
-0.01604214310646057,
0.10393786430358887,
-0.03679816424846649,
-0.38733842968940735,
0.017001822590827942,
-0.022538864985108376,
-0.13461647927761078,
0.0437784343957901,
0.04521271213889122,
0.07103989273309708,
0.10044273734092712,
0.026237962767481804,
0.02891492284834385,
0.15196986496448517,
0.03165700286626816,
-0.043892789632081985,
-0.06890063732862473,
0.09230794757604599,
-0.07617344707250595,
0.18971647322177887,
0.00717175891622901,
0.030751947313547134,
0.04444683715701103,
0.0007002113270573318,
-0.12264310568571091,
0.011538106948137283,
0.01512227300554514,
-0.06535422056913376,
0.05524085462093353,
0.13959188759326935,
0.0050145345740020275,
-0.052785806357860565,
0.08245474845170975,
0.020098639652132988,
0.03322684019804001,
0.01606866903603077,
0.07180671393871307,
-0.06334741413593292,
0.10488824546337128,
-0.07506869733333588,
0.15426108241081238,
0.21031834185123444,
-0.007845797576010227,
0.05373536795377731,
-0.03774441033601761,
-0.03248673677444458,
0.020552221685647964,
0.02073719911277294,
-0.012887215241789818,
-0.13444948196411133,
-0.05114121362566948,
-0.10348222404718399,
0.05197857320308685,
-0.12914808094501495,
-0.051207177340984344,
-0.00810165237635374,
-0.031453803181648254,
-0.03719692304730415,
0.05160919576883316,
0.06030537188053131,
0.010136528871953487,
-0.028455186635255814,
0.03522142395377159,
-0.005375167354941368,
0.05605535954236984,
-0.1235198825597763,
-0.05580493435263634
] |
null | null |
transformers
|
# 🇹🇷 Turkish ConvBERT model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've trained a cased ConvBERT model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ConvBERT
In addition to the ELEC**TR**A base model, we also trained a ConvBERT model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/convbert-base-turkish-mc4-cased")
model = AutoModel.from_pretrained("dbmdz/convbert-base-turkish-mc4-cased")
```
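Since this checkpoint is a masked language model, it can also be queried directly with the fill-mask pipeline. A minimal sketch (the example sentence is arbitrary, and we assume the tokenizer uses the standard `[MASK]` token):
```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dbmdz/convbert-base-turkish-mc4-cased")

# Arbitrary Turkish example; [MASK] is assumed to be the model's mask token
for prediction in fill_mask("Türkiye'nin başkenti [MASK]."):
    print(prediction["token_str"], prediction["score"])
```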
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"]}
|
fill-mask
|
dbmdz/convbert-base-turkish-mc4-cased
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"convbert",
"fill-mask",
"tr",
"dataset:allenai/c4",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# 🇹🇷 Turkish ConvBERT model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
 ConvBERT model on the recently released Turkish part of the
multiligual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ConvBERT
In addition to the ELECTRA base model, we also trained a ConvBERT model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with /Transformers:
# Citation
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
|
[
"# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
62,
132,
93,
70,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.014608503319323063,
0.10861615091562271,
-0.003868376836180687,
-0.005123086739331484,
0.07657518982887268,
0.03239569440484047,
0.1600884646177292,
0.09151376783847809,
-0.036877408623695374,
0.0602322481572628,
0.03905978426337242,
-0.03070518560707569,
0.07464195042848587,
0.06886129081249237,
0.07858123630285263,
-0.19142912328243256,
-0.022386547178030014,
-0.14188525080680847,
0.006962202955037355,
0.0779029056429863,
0.10313951969146729,
-0.04321799427270889,
0.10555645823478699,
0.016429061070084572,
-0.06112184748053551,
0.008821859024465084,
-0.05459201708436012,
-0.08791137486696243,
0.09299401938915253,
0.03923759609460831,
0.0675307959318161,
-0.03655888885259628,
0.007139767985790968,
-0.14992354810237885,
0.026940610259771347,
0.05250145122408867,
-0.0073155686259269714,
0.05791153013706207,
0.11779791116714478,
-0.03198866918683052,
0.2146250605583191,
-0.13267506659030914,
-0.01833295449614525,
0.0036171881947666407,
-0.08458024263381958,
-0.020965715870261192,
-0.2150341272354126,
0.11533346772193909,
0.0325191468000412,
0.0648418739438057,
0.011044664308428764,
0.06730056554079056,
0.013897406868636608,
0.05601321533322334,
0.06798357516527176,
-0.3029722571372986,
-0.06881897151470184,
-0.04590325802564621,
0.011955064721405506,
0.03521343693137169,
-0.01369324978441,
0.09109324961900711,
-0.00813621561974287,
-0.010398761369287968,
-0.03658496215939522,
-0.0334416963160038,
-0.02750202640891075,
-0.08331514894962311,
-0.05471426993608475,
-0.03798798471689224,
0.22866597771644592,
0.0035984511487185955,
-0.07297354191541672,
-0.0882299616932869,
0.007105475291609764,
0.06917806714773178,
0.01577582210302353,
-0.026357563212513924,
0.003632832085713744,
-0.02042292058467865,
0.0293032955378294,
-0.1438303291797638,
-0.12194816023111343,
0.032490309327840805,
-0.06835637241601944,
0.19200552999973297,
0.06718454509973526,
0.04330902174115181,
-0.006665385328233242,
0.0759013295173645,
-0.052242957055568695,
-0.015828190371394157,
-0.01927902176976204,
-0.022858325392007828,
-0.06363154202699661,
-0.027482222765684128,
-0.04856749624013901,
-0.18509414792060852,
0.013150815851986408,
0.07887046784162521,
-0.03258557990193367,
0.006089219357818365,
-0.035136107355356216,
0.01832915097475052,
0.0015915179392322898,
0.07034275680780411,
-0.08998564630746841,
-0.06395523250102997,
0.03628986328840256,
-0.12521618604660034,
0.05524267256259918,
-0.004117083735764027,
-0.0009949784725904465,
0.0353124663233757,
-0.007135172374546528,
0.10625936836004257,
0.021281862631440163,
0.05584336444735527,
0.03194449841976166,
-0.008277405053377151,
0.19682282209396362,
-0.10235876590013504,
-0.024718889966607094,
-0.002386106178164482,
-0.07445266842842102,
0.01126846857368946,
0.046455927193164825,
-0.024475205689668655,
-0.04350533336400986,
0.05124964565038681,
-0.016716433688998222,
-0.005710795521736145,
-0.0041121831163764,
-0.07524845004081726,
0.07215829938650131,
-0.048941753804683685,
-0.04235117509961128,
-0.15726391971111298,
-0.0838865339756012,
-0.033587921410799026,
0.024859189987182617,
-0.05589937046170235,
0.047085799276828766,
0.03946198895573616,
-0.028435954824090004,
0.018907517194747925,
-0.004343926906585693,
0.054355062544345856,
-0.005835069343447685,
0.0023344929795712233,
-0.11932054162025452,
0.023232348263263702,
-0.0765925794839859,
0.004929588176310062,
-0.04628797248005867,
0.01594795286655426,
-0.07946418970823288,
0.08733309805393219,
-0.018868835642933846,
0.02771236002445221,
-0.14509837329387665,
-0.025015907362103462,
-0.06554137915372849,
-0.04611136391758919,
0.045315712690353394,
0.07417693734169006,
-0.15223795175552368,
-0.022621920332312584,
0.10218507051467896,
-0.0655931830406189,
-0.05639371648430824,
0.10716703534126282,
0.0046185897663235664,
0.052761323750019073,
0.10808522254228592,
0.10213534533977509,
0.05633069947361946,
-0.08700024336576462,
-0.1466643065214157,
-0.09782759100198746,
-0.049305059015750885,
0.07411859184503555,
0.04819609969854355,
-0.05384651944041252,
0.09411299973726273,
0.024316469207406044,
-0.042430151253938675,
0.032073974609375,
-0.03841378539800644,
-0.03336295485496521,
0.037366386502981186,
-0.07726762443780899,
-0.018157586455345154,
-0.061423685401678085,
0.0574260912835598,
-0.06410586833953857,
-0.05723188817501068,
-0.07350222021341324,
0.08800589293241501,
-0.008752615191042423,
0.05619647726416588,
-0.07978080958127975,
0.09881051629781723,
-0.08678868412971497,
0.02002835087478161,
-0.1295628398656845,
-0.13751690089702606,
0.053440358489751816,
-0.06284869462251663,
0.06182451918721199,
-0.03531400114297867,
0.03522666171193123,
0.0533473826944828,
-0.02704019844532013,
0.03795745223760605,
0.021469516679644585,
-0.014451543800532818,
-0.07189658284187317,
-0.11071135103702545,
-0.01862507313489914,
-0.010694860480725765,
0.10885760188102722,
0.02609114535152912,
-0.002407771535217762,
0.07734504342079163,
0.11595027893781662,
0.014477649703621864,
-0.06380896270275116,
0.030184363946318626,
0.025501517578959465,
0.019777987152338028,
-0.04407258704304695,
-0.010701839812099934,
0.005609442014247179,
-0.07669039070606232,
0.16623108088970184,
-0.16050155460834503,
-0.14899300038814545,
0.05884584039449692,
0.0971515104174614,
-0.08040834218263626,
0.08939719945192337,
-0.05839347839355469,
-0.05160043388605118,
0.03767281770706177,
0.005437167827039957,
0.18742330372333527,
0.03921642154455185,
0.0646989494562149,
-0.0483904629945755,
-0.08284276723861694,
0.020644402131438255,
-0.025932956486940384,
-0.06206590309739113,
0.0413593128323555,
-0.041430991142988205,
-0.1950531005859375,
0.016806073486804962,
0.09917043894529343,
0.10180336236953735,
0.1725667268037796,
0.05566626042127609,
-0.11135855317115784,
-0.08339683711528778,
0.025890041142702103,
0.05800549313426018,
0.08497131615877151,
0.013430093415081501,
-0.023302700370550156,
0.03493841737508774,
0.009268728084862232,
0.04428979381918907,
-0.011852824129164219,
0.06199934333562851,
0.01114191859960556,
-0.04400767385959625,
0.05951164662837982,
0.06079820170998573,
-0.03892350569367409,
0.09470667690038681,
0.021628398448228836,
0.07577510178089142,
-0.011405598372220993,
-0.027779873460531235,
-0.05880807712674141,
0.08635088056325912,
-0.10590530931949615,
-0.2284560203552246,
-0.10982997715473175,
0.01594051904976368,
-0.05113598704338074,
-0.018443260341882706,
-0.00015224277740344405,
-0.05159316211938858,
-0.02547791786491871,
-0.07321593910455704,
-0.006378255784511566,
0.0075734746642410755,
-0.028980419039726257,
-0.04371403157711029,
-0.026255527511239052,
0.05108419805765152,
-0.09794724732637405,
0.0025557707995176315,
-0.012810196727514267,
-0.14075522124767303,
-0.0020332017447799444,
0.09570257365703583,
0.09433066844940186,
-0.011368677951395512,
-0.062024928629398346,
-0.025826087221503258,
-0.0045527867041528225,
0.07453256100416183,
-0.11576331406831741,
0.09078231453895569,
0.03928973898291588,
-0.04206768795847893,
0.010107212699949741,
0.04317250847816467,
0.02262566052377224,
-0.05903669819235802,
0.0203858632594347,
0.05243149399757385,
-0.030123936012387276,
-0.2742656171321869,
-0.10942733287811279,
-0.03479883447289467,
-0.028414083644747734,
-0.045288071036338806,
0.06796929985284805,
0.03662289306521416,
0.012409068644046783,
-0.08683855086565018,
0.010546060279011726,
-0.018290799111127853,
0.029765943065285683,
0.07094047218561172,
0.0021897684782743454,
0.0004955621552653611,
-0.09091012924909592,
0.010799459181725979,
0.15364199876785278,
0.010765843093395233,
0.20602978765964508,
0.001761126797646284,
0.1891775280237198,
0.07904516905546188,
0.08805657178163528,
-0.01079739909619093,
0.03854329511523247,
0.01100203674286604,
0.0534835048019886,
0.00592444883659482,
-0.08184274286031723,
0.0010203607380390167,
0.013137128204107285,
0.06917881965637207,
-0.045967139303684235,
-0.0186943169683218,
0.011508212424814701,
0.1126873791217804,
0.22722938656806946,
-0.006993657909333706,
-0.1278754472732544,
-0.0934210792183876,
-0.024148743599653244,
-0.07789316028356552,
0.009202076122164726,
-0.06137578561902046,
0.15765893459320068,
-0.14577940106391907,
0.07922890037298203,
-0.031189918518066406,
0.03501889854669571,
-0.16157346963882446,
-0.03504062443971634,
0.07329721748828888,
0.00522925378754735,
-0.017068976536393166,
0.050786398351192474,
-0.0770844891667366,
0.13505111634731293,
0.04048332944512367,
0.08812812715768814,
-0.06463787704706192,
0.01973181590437889,
0.069742351770401,
-0.03815045207738876,
0.10161057859659195,
0.018163708969950676,
-0.10722481459379196,
0.0003162930079270154,
-0.24486951529979706,
0.04084155336022377,
0.10649221390485764,
-0.12101436406373978,
0.06760662794113159,
0.02170608565211296,
-0.009978791698813438,
-0.05932864174246788,
0.037900183349847794,
-0.20001770555973053,
-0.1794581264257431,
0.07845849543809891,
0.006262894719839096,
0.014760836027562618,
-0.05463588610291481,
-0.02763688750565052,
-0.05706256255507469,
0.181535542011261,
-0.027706898748874664,
-0.06701112538576126,
-0.07001785188913345,
0.002491031540557742,
0.11203400790691376,
-0.07174061238765717,
0.0035327873192727566,
0.0012629891280084848,
0.01623709499835968,
-0.019488349556922913,
-0.08634750545024872,
-0.03447195887565613,
-0.06830748170614243,
-0.1512758582830429,
-0.03385165333747864,
0.13574069738388062,
0.08420698344707489,
0.04957210272550583,
0.019246982410550117,
0.018922360613942146,
0.061075836420059204,
-0.0857396349310875,
0.00967585388571024,
0.10361272841691971,
0.034081634134054184,
0.03980090469121933,
-0.01603890210390091,
-0.017927179113030434,
-0.08174783736467361,
-0.03427492454648018,
0.10543622076511383,
0.11304491013288498,
-0.033241402357816696,
0.14592629671096802,
0.08541925251483917,
-0.11986815184354782,
-0.11566270142793655,
-0.07456021010875702,
0.0077958377078175545,
0.011736869812011719,
-0.03114011138677597,
-0.148337721824646,
0.01940177194774151,
0.09669925272464752,
-0.007494942285120487,
0.06519508361816406,
-0.2356046736240387,
-0.10030616819858551,
0.013957195915281773,
-0.011858334764838219,
-0.003918189089745283,
-0.10831113904714584,
-0.07003410160541534,
-0.01696854457259178,
0.01120334304869175,
0.06555834412574768,
-0.06921408325433731,
0.07969402521848679,
-0.011836275458335876,
0.05248074233531952,
0.02382073365151882,
-0.022040007635951042,
0.12053212523460388,
0.00831710547208786,
0.02122987061738968,
-0.11661584675312042,
0.03702949360013008,
0.1174282357096672,
-0.014993316493928432,
0.10033761709928513,
0.026081768795847893,
0.02388186566531658,
-0.030458761379122734,
-0.004847757052630186,
-0.05891041457653046,
0.09620478749275208,
-0.05021415278315544,
-0.0032134985085576773,
-0.043528541922569275,
0.0789342075586319,
0.025393273681402206,
0.026656465604901314,
-0.05903157964348793,
0.02119726873934269,
0.00018291854939889163,
0.02359175868332386,
0.11328819394111633,
0.10396048426628113,
0.022360550239682198,
-0.02737952023744583,
-0.024945925921201706,
0.050006479024887085,
-0.01716342568397522,
-0.020386120304465294,
0.07582402229309082,
-0.0473364032804966,
0.04384201765060425,
-0.0485159233212471,
-0.18065011501312256,
0.09088750928640366,
0.09466163069009781,
-0.18650640547275543,
-0.15586450695991516,
0.002695744624361396,
-0.045422378927469254,
0.020411381497979164,
0.00911044143140316,
0.18873773515224457,
-0.0800262838602066,
-0.013855590485036373,
-0.029725132510066032,
0.054268140345811844,
-0.0435197614133358,
0.10996798425912857,
0.004399989265948534,
-0.05906743183732033,
-0.050254013389348984,
0.1596190482378006,
0.11228705942630768,
-0.07078631222248077,
0.056826457381248474,
0.0667840763926506,
-0.07879973948001862,
-0.04367108643054962,
-0.07347094267606735,
0.04057195782661438,
0.009013312868773937,
-0.10000744462013245,
0.030056102201342583,
-0.0100442199036479,
-0.011110533960163593,
0.022811539471149445,
-0.030782556161284447,
0.1087699756026268,
-0.010102426633238792,
-0.009357347153127193,
-0.05467559024691582,
0.07270064949989319,
0.014401871711015701,
-0.03971024975180626,
-0.04668007791042328,
0.08269199728965759,
-0.03177746757864952,
0.002757389098405838,
0.0009310297900810838,
-0.024242866784334183,
0.03506292775273323,
-0.06432376056909561,
-0.10993368178606033,
-0.039119377732276917,
-0.08738411962985992,
-0.027298305183649063,
0.025770466774702072,
0.022679518908262253,
-0.026279864832758904,
0.0019209107849746943,
-0.03551606833934784,
-0.056124620139598846,
-0.027992215007543564,
0.1147976964712143,
-0.14891552925109863,
-0.0039799450896680355,
0.009598668664693832,
-0.031941790133714676,
0.15078631043434143,
0.11336199939250946,
0.012901829555630684,
0.029957210645079613,
-0.09925730526447296,
0.017885228618979454,
-0.04056490212678909,
-0.007212512660771608,
0.03584831953048706,
-0.11396415531635284,
-0.010385444387793541,
0.037589527666568756,
-0.035987116396427155,
0.023923924192786217,
0.09345020353794098,
-0.07518526911735535,
0.003940113820135593,
0.0326211117208004,
0.03660011291503906,
-0.061609264463186264,
0.02486586943268776,
0.04119550436735153,
0.05084901303052902,
0.040063973516225815,
-0.08542782813310623,
0.03934755176305771,
-0.06288702040910721,
0.02493288367986679,
-0.016277847811579704,
-0.03884388133883476,
-0.04398267716169357,
0.006756545044481754,
0.05785427242517471,
-0.013412666507065296,
0.12233859300613403,
-0.022540511563420296,
0.0020566368475556374,
0.012585360556840897,
-0.00637133838608861,
-0.11748607456684113,
-0.011756404303014278,
-0.0024915102403610945,
0.0004444026853889227,
-0.03492256999015808,
-0.0473603755235672,
-0.027690431103110313,
-0.040123846381902695,
-0.14199869334697723,
0.1491318941116333,
0.08618997037410736,
0.15460778772830963,
0.06444744765758514,
0.055956948548555374,
-0.09690867364406586,
-0.04489211365580559,
-0.004734141286462545,
-0.09531692415475845,
0.10903006047010422,
-0.03280304744839668,
-0.013220647349953651,
0.09551675617694855,
-0.16902579367160797,
0.03126935660839081,
-0.00042961453436873853,
-0.03565756976604462,
0.007402670104056597,
-0.1483084112405777,
-0.012971502728760242,
0.01756804622709751,
0.005514930933713913,
-0.08694586902856827,
0.06059124693274498,
0.030622486025094986,
0.04179936647415161,
-0.05120435357093811,
0.14914782345294952,
-0.15586701035499573,
-0.0652812272310257,
0.10239467024803162,
0.01631028763949871,
0.015629451721906662,
-0.04906881973147392,
-0.019131530076265335,
-0.062933050096035,
0.07122641056776047,
0.050661850720644,
0.08855241537094116,
0.08441364765167236,
0.05449390038847923,
-0.019927045330405235,
-0.08561991900205612,
-0.00749943545088172,
-0.009386654943227768,
0.08605975657701492,
0.08831248432397842,
0.05163254961371422,
-0.021453330293297768,
-0.008401819504797459,
0.09754869341850281,
0.017271744087338448,
0.021133124828338623,
-0.12885527312755585,
-0.0227699913084507,
-0.01691094972193241,
-0.059466246515512466,
0.022983120754361153,
-0.08438479900360107,
-0.007215564139187336,
0.1243681013584137,
0.2686361074447632,
0.03219232335686684,
0.027754301205277443,
0.002682383870705962,
0.005746318958699703,
-0.02274651639163494,
0.06648614257574081,
-0.023338520899415016,
0.1454637348651886,
-0.01628004014492035,
0.06128636747598648,
-0.02755867876112461,
0.027844127267599106,
-0.04338744282722473,
0.1388605386018753,
-0.08722271770238876,
-0.02989467978477478,
0.009347363375127316,
0.10134126245975494,
-0.049196455627679825,
-0.3706440329551697,
0.06373330950737,
-0.07112623751163483,
-0.0977201834321022,
0.04984457045793533,
0.029050396755337715,
0.07870636880397797,
0.08574561774730682,
0.0228646881878376,
-0.027997082099318504,
0.20759667456150055,
0.03139939531683922,
-0.11061582714319229,
-0.07347746193408966,
0.08018562197685242,
-0.06161421164870262,
0.23879526555538177,
0.013683153316378593,
0.027919117361307144,
0.055467430502176285,
-0.0043976143933832645,
-0.1430547535419464,
-0.002442875411361456,
0.03935942426323891,
-0.022467544302344322,
0.05213124305009842,
0.16029462218284607,
0.013549686409533024,
0.05933927372097969,
0.07581674307584763,
0.05798887461423874,
0.07429176568984985,
0.01790344901382923,
0.05832236632704735,
-0.06342461705207825,
0.08173669129610062,
-0.06914912164211273,
0.15902552008628845,
0.198895663022995,
-0.027325604110956192,
0.03544316068291664,
-0.02370268665254116,
-0.015340888872742653,
0.05166540667414665,
0.019076891243457794,
-0.00017694449343252927,
-0.1479925513267517,
0.023761330172419548,
-0.1062898263335228,
0.0794682651758194,
-0.09789346158504486,
-0.060082919895648956,
0.010479463264346123,
-0.031027911230921745,
-0.00994597002863884,
0.07407109439373016,
0.06129796430468559,
0.005591957829892635,
-0.01389998011291027,
0.0970330610871315,
-0.0018091555684804916,
0.024381587281823158,
-0.10218505561351776,
-0.033725008368492126
] |
null | null |
transformers
|
# 🇹🇷 Turkish ConvBERT model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've trained an (uncased) ConvBERT model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB, resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ConvBERT
In addition to the ELEC**TR**A base model, we also trained a ConvBERT model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
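For orientation, the two settings named above map directly onto a Hugging Face `ConvBertConfig`. This is only an illustrative sketch: every field not mentioned in the text (hidden size, layer count, and so on) is left at the library default and should not be read as a confirmed training hyperparameter.
```python
from transformers import ConvBertConfig, ConvBertModel

# 32k vocab (reused from BERTurk) and a 512-token context window, as stated above.
config = ConvBertConfig(vocab_size=32000, max_position_embeddings=512)

model = ConvBertModel(config)  # randomly initialized skeleton, not the pretrained weights
print(model.config.max_position_embeddings)  # 512
```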
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/convbert-base-turkish-mc4-uncased")
model = AutoModel.from_pretrained("dbmdz/convbert-base-turkish-mc4-uncased")
```
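Since the record is tagged `fill-mask`, the checkpoint can also be exercised end-to-end with the pipeline API. The Turkish prompt below is our own illustrative example, not taken from the card; note the lowercase input for the uncased model:
```python
from transformers import pipeline

unmasker = pipeline("fill-mask", model="dbmdz/convbert-base-turkish-mc4-uncased")

# ConvBERT uses BERT-style [MASK] tokens.
for prediction in unmasker("türkiye'nin başkenti [MASK] şehridir."):
    print(prediction["token_str"], round(prediction["score"], 3))
```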
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"]}
|
fill-mask
|
dbmdz/convbert-base-turkish-mc4-uncased
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"convbert",
"fill-mask",
"tr",
"dataset:allenai/c4",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# 🇹🇷 Turkish ConvBERT model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
We've trained an (uncased) ConvBERT model on the recently released Turkish part of the
multilingual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ConvBERT
In addition to the ELECTRA base model, we also trained a ConvBERT model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with 🤗/Transformers:
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
[
"# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
66,
132,
94,
70,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #convbert #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# 🇹🇷 Turkish ConvBERT model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n ConvBERT model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ConvBERT\n\nIn addition to the ELECTRA base model, we also trained an ConvBERT model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.0006194667075760663,
0.08928524702787399,
-0.004405145067721605,
-0.023110108450055122,
0.06879320740699768,
0.022348521277308464,
0.13827666640281677,
0.1066908985376358,
-0.06024784967303276,
0.06917782872915268,
0.033990342170000076,
-0.01821107044816017,
0.0798141360282898,
0.07533922046422958,
0.09181834757328033,
-0.21773409843444824,
-0.006501324474811554,
-0.12202563881874084,
0.026611629873514175,
0.07867784053087234,
0.10336297005414963,
-0.04848133400082588,
0.0919693112373352,
0.029366077855229378,
-0.05695505440235138,
0.005420955363661051,
-0.0844937413930893,
-0.1044435128569603,
0.07660851627588272,
0.005128880497068167,
0.07060707360506058,
-0.045282043516635895,
-0.023039836436510086,
-0.16211244463920593,
0.029223201796412468,
0.051766544580459595,
0.0017753493739292026,
0.05224082991480827,
0.12146809697151184,
-0.05197407305240631,
0.22214297950267792,
-0.15031756460666656,
-0.02515283413231373,
-0.0015879691345617175,
-0.08568015694618225,
-0.034220244735479355,
-0.21242621541023254,
0.140152245759964,
0.029686618596315384,
0.05119040608406067,
0.012514579109847546,
0.04121527448296547,
0.009899305179715157,
0.03943914920091629,
0.12177011370658875,
-0.28988489508628845,
-0.06681648641824722,
-0.07931288331747055,
-0.0014943493297323585,
0.037584491074085236,
-0.02439570240676403,
0.09253094345331192,
-0.018726017326116562,
-0.010123738087713718,
-0.023296967148780823,
-0.012626934796571732,
0.01500470656901598,
-0.0871228575706482,
-0.06304820626974106,
-0.034337420016527176,
0.17778511345386505,
0.0006940091261640191,
-0.06652665883302689,
-0.09128084778785706,
0.012022647075355053,
0.03250076621770859,
-0.022922629490494728,
-0.04678743705153465,
-0.0026924186386168003,
-0.030240222811698914,
0.03896646201610565,
-0.16867950558662415,
-0.12228771299123764,
0.05464484170079231,
-0.06316716969013214,
0.2185460478067398,
0.06456681340932846,
0.042120784521102905,
0.0012130667455494404,
0.07947666943073273,
-0.08870568126440048,
-0.019335366785526276,
-0.03229543939232826,
-0.0057057966478168964,
-0.04559357091784477,
-0.03208030387759209,
-0.038704898208379745,
-0.21602804958820343,
0.016607698053121567,
0.06664810329675674,
-0.013921927660703659,
-0.0029316116124391556,
-0.048732779920101166,
0.015697935596108437,
-0.006471812725067139,
0.0788310244679451,
-0.049899350851774216,
-0.08651762455701828,
0.02609908953309059,
-0.09711991250514984,
0.06614430993795395,
-0.008701282553374767,
0.013170532882213593,
0.030337240546941757,
0.019563226029276848,
0.09389594942331314,
0.04529265686869621,
0.04410329461097717,
0.05230480059981346,
-0.003198279533535242,
0.19595806300640106,
-0.10070061683654785,
-0.018521718680858612,
-0.011332719586789608,
-0.07686428725719452,
-0.004316920414566994,
0.024110915139317513,
-0.01742180623114109,
-0.055497732013463974,
0.08705099672079086,
-0.01755574345588684,
0.01531213242560625,
-0.004859549459069967,
-0.05322576314210892,
0.0620482973754406,
-0.06793755292892456,
-0.06028711423277855,
-0.12925203144550323,
-0.1006503701210022,
-0.03193086385726929,
0.016159115359187126,
-0.05547470971941948,
0.04431436210870743,
0.030280714854598045,
-0.04101833701133728,
0.02539880946278572,
0.0032550792675465345,
0.05161162465810776,
-0.008585273288190365,
0.013677893206477165,
-0.10872010886669159,
0.016181878745555878,
-0.07029181718826294,
0.022089513018727303,
-0.06089748069643974,
0.0038614668883383274,
-0.08856042474508286,
0.10947999358177185,
-0.030319375917315483,
0.027697667479515076,
-0.14841140806674957,
-0.01827932335436344,
-0.10046640783548355,
-0.03780434653162956,
0.0326915979385376,
0.0586610771715641,
-0.15296198427677155,
-0.04222412034869194,
0.10567456483840942,
-0.07757163792848587,
-0.057582855224609375,
0.1007642075419426,
-0.002228908007964492,
0.06185055151581764,
0.12030018866062164,
0.11942068487405777,
0.06589997559785843,
-0.06528972834348679,
-0.15342050790786743,
-0.11058198660612106,
-0.04474420100450516,
0.0764869675040245,
0.04318242147564888,
-0.06145545095205307,
0.14055660367012024,
0.03082038275897503,
-0.06264049559831619,
0.018788320943713188,
-0.03465387597680092,
-0.04501134157180786,
0.027813630178570747,
-0.0739695355296135,
-0.01989557035267353,
-0.07975653558969498,
0.07077907025814056,
-0.07360342890024185,
-0.08559636771678925,
-0.03484455868601799,
0.08673185110092163,
0.0175460297614336,
0.05663470923900604,
-0.07152694463729858,
0.1107369214296341,
-0.0837417021393776,
0.01716608554124832,
-0.11364061385393143,
-0.11625268310308456,
0.0433712936937809,
-0.0843779593706131,
0.07537887990474701,
-0.029110226780176163,
0.027667922899127007,
0.04214169830083847,
-0.023149989545345306,
0.04714550822973251,
-0.00015465127944480628,
-0.000002518856945243897,
-0.07729192823171616,
-0.07691793888807297,
-0.012257578782737255,
-0.0067221312783658504,
0.0867621973156929,
0.04363898187875748,
0.0029595117084681988,
0.074323870241642,
0.11562766134738922,
0.025157645344734192,
-0.06629540771245956,
0.010018659755587578,
0.03040490113198757,
0.029746903106570244,
-0.031199513003230095,
-0.012649931013584137,
-0.0017712661065161228,
-0.059755805879831314,
0.1648479402065277,
-0.16651536524295807,
-0.1322716772556305,
0.05830378457903862,
0.08512697368860245,
-0.06872570514678955,
0.08126914501190186,
-0.058963313698768616,
-0.05641211196780205,
0.048112653195858,
0.011649224907159805,
0.1432383805513382,
0.04771193861961365,
0.05659736320376396,
-0.04602638632059097,
-0.06636609137058258,
0.009232479147613049,
-0.023232953622937202,
-0.06952511519193649,
0.03999895602464676,
-0.03977128118276596,
-0.1891416609287262,
0.029230551794171333,
0.12098852545022964,
0.10144402831792831,
0.16483452916145325,
0.04990970715880394,
-0.1236235648393631,
-0.08256690204143524,
0.02802438847720623,
0.043861839920282364,
0.06537981331348419,
-0.00268910126760602,
-0.023631663993000984,
0.030538838356733322,
0.013252726756036282,
0.05261215195059776,
-0.022790394723415375,
0.054712604731321335,
0.020949887111783028,
-0.04908130690455437,
0.07845040410757065,
0.058836668729782104,
-0.023401929065585136,
0.0848013311624527,
0.020455801859498024,
0.08864156901836395,
-0.013829431496560574,
-0.019036559388041496,
-0.03741868585348129,
0.06946329772472382,
-0.09072227776050568,
-0.250682532787323,
-0.11083699017763138,
0.015597530640661716,
-0.046895984560251236,
-0.02768135443329811,
0.022797875106334686,
-0.03729531913995743,
-0.0014756349846720695,
-0.07459229230880737,
0.01944892667233944,
0.00535289291292429,
-0.03211181238293648,
-0.04863526299595833,
-0.03218569606542587,
0.03279368579387665,
-0.09730493277311325,
-0.0041443826630711555,
-0.02469624951481819,
-0.1424034833908081,
-0.01629338227212429,
0.12178391218185425,
0.08302808552980423,
-0.03206641972064972,
-0.07850413024425507,
-0.02233210578560829,
-0.0069105904549360275,
0.08193395286798477,
-0.12490294873714447,
0.09489136934280396,
0.03259719908237457,
-0.033761583268642426,
0.03708219900727272,
0.03681282699108124,
0.02946501225233078,
-0.02932431921362877,
-0.0017584286397323012,
0.0793415904045105,
-0.02406931295990944,
-0.24806047976016998,
-0.08981133252382278,
-0.02207513526082039,
-0.03392544761300087,
-0.04847637191414833,
0.08044261485338211,
0.06920604407787323,
0.012781083583831787,
-0.08933474123477936,
-0.023730415850877762,
-0.030339982360601425,
0.025058994069695473,
0.0863848552107811,
0.016214780509471893,
-0.0015785181894898415,
-0.10099215060472488,
-0.015642859041690826,
0.14631220698356628,
0.010643402114510536,
0.18311014771461487,
-0.010994005016982555,
0.17727404832839966,
0.07902399450540543,
0.08680195361375809,
-0.020920362323522568,
0.017637161538004875,
0.011354698799550533,
0.04141530767083168,
0.00037942087510600686,
-0.08582355827093124,
0.0004678952682297677,
0.012880725786089897,
0.07667887210845947,
-0.048812512308359146,
0.003182691056281328,
0.007575989235192537,
0.12089144438505173,
0.2440653145313263,
-0.015362096019089222,
-0.10147206485271454,
-0.08175420761108398,
-0.030214708298444748,
-0.08783356845378876,
0.010414575226604939,
-0.07107255607843399,
0.11253124475479126,
-0.13258486986160278,
0.08225147426128387,
-0.03637014701962471,
0.043327875435352325,
-0.1386645883321762,
-0.01735108532011509,
0.06384587287902832,
0.018837831914424896,
-0.017748937010765076,
0.05996483191847801,
-0.07883143424987793,
0.10971474647521973,
0.03771670535206795,
0.10915868729352951,
-0.072910375893116,
0.00965657364577055,
0.0721948891878128,
-0.05923854559659958,
0.08207923173904419,
0.009926669299602509,
-0.09735610336065292,
0.02320302464067936,
-0.25698789954185486,
0.02658168226480484,
0.10380563884973526,
-0.12462799996137619,
0.07079779356718063,
0.024153735488653183,
0.002810221165418625,
-0.052424754947423935,
0.06261811405420303,
-0.18892939388751984,
-0.18576925992965698,
0.10117163509130478,
-0.005576455499976873,
-0.0011407291749492288,
-0.05581791326403618,
-0.03897596523165703,
-0.04242022708058357,
0.17451055347919464,
-0.003698509419336915,
-0.06090087816119194,
-0.06439138203859329,
0.012492509558796883,
0.12428540736436844,
-0.07450810819864273,
-0.0004377785371616483,
-0.009638141840696335,
0.02230212092399597,
-0.024840952828526497,
-0.08186353743076324,
-0.04854338616132736,
-0.05012650042772293,
-0.14412237703800201,
-0.01635649986565113,
0.13900847733020782,
0.09331099689006805,
0.0384523831307888,
0.018682917580008507,
0.017329270020127296,
0.05342068150639534,
-0.08466628193855286,
0.02241690270602703,
0.06876909732818604,
-0.00979180820286274,
0.036420151591300964,
-0.0013343064347282052,
-0.07244006544351578,
-0.1048058271408081,
-0.02789640985429287,
0.11154189705848694,
0.1272355318069458,
-0.027514049783349037,
0.1611231416463852,
0.07276293635368347,
-0.1264108121395111,
-0.11487843841314316,
-0.081000916659832,
0.012601400725543499,
0.0041635967791080475,
-0.023827290162444115,
-0.14555267989635468,
0.04214174672961235,
0.1206027939915657,
-0.019708070904016495,
0.09385797381401062,
-0.25726431608200073,
-0.09618046134710312,
0.02670738846063614,
-0.027892952784895897,
0.019268549978733063,
-0.12503060698509216,
-0.0791834145784378,
-0.0018240615027025342,
0.042933180928230286,
0.08199051022529602,
-0.02720475010573864,
0.08548644185066223,
-0.0149774681776762,
0.07172940671443939,
0.037055324763059616,
-0.0141014838591218,
0.12507279217243195,
-0.01190950907766819,
0.01501549407839775,
-0.13116082549095154,
0.03607216104865074,
0.10315191000699997,
-0.02674943394958973,
0.10756140947341919,
0.016094107180833817,
0.0072075920179486275,
-0.024042529985308647,
0.004492230713367462,
-0.06792504340410233,
0.12461408972740173,
-0.04765550419688225,
0.010902750305831432,
-0.0489182323217392,
0.060298383235931396,
0.03089796006679535,
0.028670301660895348,
-0.017905404791235924,
0.0291157066822052,
0.027979426085948944,
0.003967728465795517,
0.06377915292978287,
0.10717971622943878,
-0.0033068873453885317,
-0.04965093359351158,
-0.019692547619342804,
0.029602674767374992,
-0.008675365708768368,
-0.02641282044351101,
0.06858482211828232,
-0.04896286129951477,
0.04400382563471794,
-0.05548442527651787,
-0.19778993725776672,
0.08302924782037735,
0.1171383336186409,
-0.16383393108844757,
-0.17494380474090576,
0.0015696687623858452,
-0.01149073988199234,
0.01982194557785988,
0.015514872036874294,
0.18748673796653748,
-0.09262003749608994,
-0.023505734279751778,
-0.029133779928088188,
0.06388977915048599,
-0.04285561293363571,
0.08564998209476471,
0.008117555640637875,
-0.06398459523916245,
-0.039220813661813736,
0.15019027888774872,
0.11548496782779694,
-0.07313533127307892,
0.05714452639222145,
0.07190942019224167,
-0.05532711371779442,
-0.035306889563798904,
-0.04958134889602661,
0.027243366464972496,
-0.000770001148339361,
-0.07959044724702835,
0.04035205766558647,
0.004557819105684757,
-0.005239423830062151,
0.04169381409883499,
-0.03148047998547554,
0.10267417132854462,
-0.008296612650156021,
0.012064797803759575,
-0.03498104214668274,
0.07509001344442368,
0.011351312510669231,
-0.031437233090400696,
-0.03756101429462433,
0.094242624938488,
-0.03292542323470116,
0.00803983211517334,
-0.0028906497173011303,
-0.026542723178863525,
0.06174912303686142,
-0.06776075065135956,
-0.11531026661396027,
-0.035808760672807693,
-0.10742447525262833,
-0.03212559595704079,
0.035311322659254074,
0.011002453044056892,
-0.0266873762011528,
-0.012260576710104942,
-0.048367246985435486,
-0.060360491275787354,
-0.03952537104487419,
0.11073247343301773,
-0.17575575411319733,
0.013050799258053303,
0.01612347550690174,
-0.02487174980342388,
0.13216936588287354,
0.11234206706285477,
0.013196202926337719,
0.03968999907374382,
-0.08693349361419678,
0.008687814697623253,
-0.035876888781785965,
-0.023166922852396965,
0.028825344517827034,
-0.08341571688652039,
-0.012339076958596706,
0.04605237767100334,
-0.03071746788918972,
0.022457405924797058,
0.08200802654027939,
-0.09246636927127838,
-0.006806624121963978,
0.04044116288423538,
0.019235339015722275,
-0.03995930030941963,
0.033338285982608795,
0.06315106898546219,
0.02547532692551613,
0.050835613161325455,
-0.07831798493862152,
0.03021259605884552,
-0.05705879256129265,
0.03008783422410488,
-0.02734586037695408,
-0.05008862167596817,
-0.02277335338294506,
0.00153882906306535,
0.04437512904405594,
-0.0034964806400239468,
0.11016634106636047,
-0.0009967550868168473,
-0.02433563396334648,
0.01567198522388935,
-0.03376531973481178,
-0.1340586096048355,
-0.0022046708036214113,
0.023485353216528893,
0.010787621140480042,
-0.05204612761735916,
-0.04411725327372551,
-0.048756372183561325,
-0.048997219651937485,
-0.16180510818958282,
0.12557843327522278,
0.10899632424116135,
0.16783665120601654,
0.06627435982227325,
0.059054892510175705,
-0.08996549993753433,
-0.0640285387635231,
-0.010621820576488972,
-0.0871456041932106,
0.10571213811635971,
-0.029046442359685898,
0.02913481369614601,
0.10558254271745682,
-0.1432063728570938,
0.036974262446165085,
0.0064361486583948135,
-0.03462066873908043,
0.014264614321291447,
-0.15062515437602997,
-0.01445572916418314,
0.03053336963057518,
-0.002619859529659152,
-0.0746152400970459,
0.07300437986850739,
0.04015757516026497,
0.03124125674366951,
-0.06272613257169724,
0.15172456204891205,
-0.1393996775150299,
-0.0809316635131836,
0.12805981934070587,
0.006591596640646458,
0.01933741755783558,
-0.04849867895245552,
-0.022023582831025124,
-0.06267987191677094,
0.1117587685585022,
0.06699772924184799,
0.09666409343481064,
0.07234121859073639,
0.062325283885002136,
-0.04573244974017143,
-0.08470416814088821,
0.013709742575883865,
-0.0244598425924778,
0.0994153618812561,
0.07863008230924606,
0.04200576990842819,
-0.03162005543708801,
-0.016839515417814255,
0.12533287703990936,
0.03222407028079033,
0.01877140812575817,
-0.10330269485712051,
-0.04946591332554817,
0.013784803450107574,
-0.048844676464796066,
0.005262566730380058,
-0.08760989457368851,
-0.008435606025159359,
0.09299562871456146,
0.2803167700767517,
0.041712936013936996,
0.0301201231777668,
0.0037042477633804083,
0.010758953168988228,
-0.022513536736369133,
0.03767361864447594,
-0.046135492622852325,
0.1440381556749344,
-0.013312635011970997,
0.10776727646589279,
-0.03686473146080971,
0.027026589959859848,
-0.062482595443725586,
0.12734878063201904,
-0.0919162705540657,
-0.01695050485432148,
-0.001497575081884861,
0.11058934032917023,
-0.06243814527988434,
-0.40326738357543945,
0.03436494991183281,
-0.0791175588965416,
-0.12246226519346237,
0.061465825885534286,
0.05535004287958145,
0.08964069932699203,
0.07869401574134827,
0.04115821048617363,
-0.031171955168247223,
0.1800081729888916,
0.021958129480481148,
-0.09131679683923721,
-0.07765235751867294,
0.08394493162631989,
-0.050405871123075485,
0.24543780088424683,
0.021525243297219276,
-0.014112714678049088,
0.04954199492931366,
-0.012698723934590816,
-0.1300182342529297,
-0.0018698744243010879,
0.024549275636672974,
-0.005107825621962547,
0.04869025945663452,
0.16498281061649323,
0.014855774119496346,
0.0346120223402977,
0.09599509835243225,
0.04951309412717819,
0.05414086580276489,
0.023916056379675865,
0.045264050364494324,
-0.04972413182258606,
0.07277698814868927,
-0.06267926096916199,
0.14182919263839722,
0.21144478023052216,
-0.022985918447375298,
0.048263102769851685,
-0.030773578211665154,
-0.020657243207097054,
0.05373062193393707,
0.026438772678375244,
-0.01065940409898758,
-0.1478409469127655,
0.02412445843219757,
-0.07695819437503815,
0.07817774266004562,
-0.10426158457994461,
-0.057778384536504745,
0.02233465202152729,
-0.0232715904712677,
0.00016746402252465487,
0.06684762239456177,
0.0688614696264267,
-0.013853979296982288,
-0.01226273737847805,
0.07325570285320282,
-0.0057492246851325035,
0.02916436828672886,
-0.10141441226005554,
-0.017560925334692
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz DistilBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a German Europeana DistilBERT model 🎉
# German Europeana DistilBERT
We use the open source [Europeana newspapers](http://www.europeana-newspapers.eu/)
that were provided by *The European Library*. The final
training corpus has a size of 51GB and consists of 8,035,986,369 tokens.
Detailed information about the data and pretraining steps can be found in
[this repository](https://github.com/stefan-it/europeana-bert).
## Results
For results on Historic NER, please refer to [this repository](https://github.com/stefan-it/europeana-bert).
## Usage
With Transformers >= 4.3 our German Europeana DistilBERT model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/distilbert-base-german-europeana-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
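Building on the snippet above, a quick sanity check is to encode a sentence and inspect the hidden states; the German sentence here is an illustrative placeholder, not a line from the Europeana corpus:
```python
import torch

sentence = "Die Zeitung berichtet über die Ereignisse des Tages."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per subword token; hidden size is 768 for DistilBERT-base.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 14, 768])
```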
# Huggingface model hub
All other German Europeana models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion
[here](https://github.com/stefan-it/europeana-bert/discussions) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "de", "license": "mit", "tags": ["historic german"]}
| null |
dbmdz/distilbert-base-german-europeana-cased
|
[
"transformers",
"pytorch",
"tf",
"distilbert",
"historic german",
"de",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #distilbert #historic german #de #license-mit #endpoints_compatible #region-us
|
# 🤗 + 📚 dbmdz DistilBERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a German Europeana DistilBERT model
# German Europeana DistilBERT
We use the open source Europeana newspapers
that were provided by *The European Library*. The final
training corpus has a size of 51GB and consists of 8,035,986,369 tokens.
Detailed information about the data and pretraining steps can be found in
this repository.
## Results
For results on Historic NER, please refer to this repository.
## Usage
With Transformers >= 4.3 our German Europeana DistilBERT model can be loaded like:
# Huggingface model hub
All other German Europeana models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# + dbmdz DistilBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana DistilBERT model",
"# German Europeana DistilBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 4.3 our German Europeana DistilBERT model can be loaded like:",
"# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #distilbert #historic german #de #license-mit #endpoints_compatible #region-us \n",
"# + dbmdz DistilBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana DistilBERT model",
"# German Europeana DistilBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 4.3 our German Europeana DistilBERT model can be loaded like:",
"# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
39,
44,
70,
17,
25,
22,
37,
70
] |
[
"passage: TAGS\n#transformers #pytorch #tf #distilbert #historic german #de #license-mit #endpoints_compatible #region-us \n# + dbmdz DistilBERT model\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources a German Europeana DistilBERT model# German Europeana DistilBERT\n\nWe use the open source Europeana newspapers\nthat were provided by *The European Library*. The final\ntraining corpus has a size of 51GB and consists of 8,035,986,369 tokens.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.## Results\n\nFor results on Historic NER, please refer to this repository.## Usage\n\nWith Transformers >= 4.3 our German Europeana DistilBERT model can be loaded like:# Huggingface model hub\n\nAll other German Europeana models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our Europeana BERT, ELECTRA and ConvBERT models just open a new discussion\nhere# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
-0.047181639820337296,
0.13383325934410095,
-0.0008128026966005564,
0.0979742556810379,
0.06203222647309303,
0.009627137333154678,
0.13947592675685883,
0.06828118860721588,
0.1463353931903839,
0.07539530098438263,
-0.022148316726088524,
-0.12832897901535034,
0.041588105261325836,
0.07550684362649918,
0.04574454575777054,
-0.23915834724903107,
-0.016874272376298904,
-0.04925859719514847,
-0.025011880323290825,
0.030142735689878464,
0.10581546276807785,
-0.051149629056453705,
0.10353279113769531,
-0.027030620723962784,
-0.08133580535650253,
0.09884287416934967,
-0.06858991831541061,
-0.005905987694859505,
0.1368345469236374,
0.08334119617938995,
0.009755455888807774,
-0.04645688831806183,
0.03622320666909218,
-0.06397140026092529,
0.019219381734728813,
0.04263587296009064,
-0.05751055106520653,
0.0711759477853775,
0.08065512776374817,
-0.05045400559902191,
0.17984293401241302,
-0.08195875585079193,
-0.021601559594273567,
0.059910908341407776,
-0.040394049137830734,
-0.0036710002459585667,
-0.1625254601240158,
0.14373628795146942,
0.014310121536254883,
0.07091724872589111,
0.01739712618291378,
0.14686432480812073,
-0.04507070779800415,
0.05036122724413872,
0.18741269409656525,
-0.190469428896904,
-0.05699342116713524,
0.1191563680768013,
0.011004127562046051,
0.014675818383693695,
-0.08284485340118408,
0.06035446375608444,
0.04563150182366371,
0.034189049154520035,
-0.03509689122438431,
-0.023486262187361717,
-0.04635946825146675,
-0.048251111060380936,
-0.06623527407646179,
-0.030342472717165947,
0.2142179310321808,
0.021224578842520714,
-0.09813419729471207,
-0.11391312628984451,
-0.0640682727098465,
0.14642852544784546,
0.01916070654988289,
0.016826175153255463,
0.026890886947512627,
-0.024500999599695206,
0.048156123608350754,
-0.13584911823272705,
-0.06468039005994797,
-0.004906981252133846,
-0.056414760649204254,
0.18766944110393524,
0.0032902006059885025,
0.04578441381454468,
0.06443189084529877,
0.08876945823431015,
-0.12134174257516861,
-0.06988395005464554,
-0.017825482413172722,
-0.013368761166930199,
-0.0622067004442215,
-0.04875855892896652,
-0.019375979900360107,
-0.05897867679595947,
0.011509724892675877,
0.20144279301166534,
-0.034443460404872894,
-0.05672513693571091,
-0.08131596446037292,
0.013520292937755585,
0.0683823823928833,
0.10742179304361343,
-0.12484380602836609,
-0.17901864647865295,
0.07365371286869049,
-0.17952783405780792,
0.07696951925754547,
0.008235830813646317,
-0.06297081708908081,
-0.06967880576848984,
-0.026390505954623222,
0.03845944255590439,
0.028193149715662003,
0.03719009459018707,
0.00996781513094902,
-0.038640595972537994,
0.27220964431762695,
-0.06014035642147064,
0.008345493115484715,
0.006794847082346678,
-0.09619007259607315,
-0.006386764347553253,
0.04052659124135971,
-0.02104279212653637,
-0.08712150901556015,
0.0744827538728714,
-0.09732677787542343,
-0.05966026335954666,
-0.029591891914606094,
-0.12802116572856903,
0.10198628157377243,
-0.12581337988376617,
-0.01784447953104973,
-0.14316314458847046,
-0.12103225290775299,
-0.011227589100599289,
0.022631721571087837,
-0.0670093446969986,
0.004079257138073444,
0.029325300827622414,
-0.03880566358566284,
-0.011382807046175003,
0.013454691506922245,
-0.005631760228425264,
-0.021675633266568184,
-0.03497105464339256,
-0.17919957637786865,
0.035664577037096024,
-0.13949190080165863,
-0.009187680669128895,
-0.08541834354400635,
-0.025874126702547073,
-0.2795655131340027,
0.035851940512657166,
-0.14492955803871155,
0.02161250449717045,
-0.10671740770339966,
-0.0026486471761018038,
0.047758083790540695,
-0.02854744903743267,
0.05150725692510605,
0.1428116261959076,
-0.11873297393321991,
-0.015852024778723717,
0.14221440255641937,
-0.08752015978097916,
-0.07714294642210007,
0.1989780068397522,
-0.021159762516617775,
-0.014541479758918285,
0.008170717395842075,
0.15245677530765533,
0.07301980257034302,
-0.17230604588985443,
-0.11479723453521729,
0.004665126092731953,
-0.04329201579093933,
0.019540950655937195,
0.018416164442896843,
-0.06227198615670204,
0.020182490348815918,
-0.004271595273166895,
-0.029417194426059723,
0.0457291416823864,
-0.023492960259318352,
0.03877104073762894,
-0.03695813566446304,
-0.05774648115038872,
0.025496000424027443,
0.02397635392844677,
0.018387848511338234,
-0.052531275898218155,
-0.0218630563467741,
0.09319666028022766,
0.06832452118396759,
-0.019042005762457848,
0.0331575982272625,
0.02279617078602314,
0.079380102455616,
-0.057723142206668854,
-0.015334839932620525,
-0.09379009902477264,
-0.1808125376701355,
0.10564016550779343,
-0.04601989686489105,
0.08235009759664536,
0.06077666953206062,
0.07822313159704208,
0.09860026091337204,
-0.06644031405448914,
0.01740316115319729,
0.007448429241776466,
-0.013535193167626858,
-0.012875756248831749,
-0.1027410551905632,
-0.011011160910129547,
-0.02703416720032692,
0.08538971841335297,
-0.006346453912556171,
-0.02337721362709999,
0.07103618234395981,
0.219583198428154,
0.019117223098874092,
-0.06017635390162468,
0.0026402745861560106,
0.0007285316823981702,
-0.04474102333188057,
-0.04287048429250717,
-0.022334733977913857,
-0.02005355805158615,
-0.04565173760056496,
0.1370963752269745,
0.0197006743401289,
0.0160234197974205,
0.06944767385721207,
0.15383471548557281,
-0.060538001358509064,
-0.018743544816970825,
-0.09094907343387604,
-0.012127568945288658,
-0.04881249740719795,
-0.10060582309961319,
0.20714308321475983,
0.02918243035674095,
0.07640668004751205,
-0.07047305256128311,
-0.08184321969747543,
0.019391411915421486,
0.009472687728703022,
-0.07827376574277878,
0.10507961362600327,
-0.04482854902744293,
-0.06521470844745636,
0.10780712217092514,
-0.011496935039758682,
0.02667531743645668,
0.21756194531917572,
0.007946177385747433,
-0.09001758694648743,
0.010099879465997219,
-0.02597135119140148,
0.023724175989627838,
0.16502590477466583,
-0.037997204810380936,
0.026182474568486214,
0.042418282479047775,
-0.0008698965539224446,
0.026817001402378082,
-0.02238648012280464,
0.02232862450182438,
-0.043236829340457916,
-0.04306451603770256,
0.031276118010282516,
0.10033325105905533,
-0.01376179326325655,
0.10480012744665146,
0.029564909636974335,
0.007077073212713003,
-0.04410203918814659,
-0.035152632743120193,
-0.1024581640958786,
0.1212688684463501,
-0.10775220394134521,
-0.20504732429981232,
-0.17023442685604095,
0.06043115258216858,
-0.152998149394989,
0.027124660089612007,
0.001871870132163167,
0.02785135991871357,
-0.0982930064201355,
-0.11163554340600967,
0.09566229581832886,
0.09408169239759445,
-0.05848510563373566,
-0.06642576307058334,
-0.034428294748067856,
0.005580772180110216,
-0.15352989733219147,
-0.008437435142695904,
-0.00825402420014143,
-0.10936588793992996,
-0.014464152976870537,
0.00855244044214487,
0.12982501089572906,
-0.02002859301865101,
-0.05226648598909378,
-0.01735130324959755,
-0.023048002272844315,
0.16660740971565247,
-0.10458088666200638,
0.17499001324176788,
0.07720377296209335,
-0.009821638464927673,
0.027677390724420547,
0.07708287984132767,
0.05053187161684036,
-0.02662544883787632,
0.017918340861797333,
0.04029490798711777,
-0.039920102804899216,
-0.20081296563148499,
-0.19194240868091583,
-0.04609719663858414,
0.057961348444223404,
-0.006014753598719835,
0.07518795132637024,
0.06186259165406227,
0.006389440968632698,
-0.061752621084451675,
0.00741503294557333,
-0.013676444999873638,
0.06590370833873749,
0.005867017898708582,
-0.002482742303982377,
-0.019436726346611977,
-0.07874218374490738,
-0.0023836749605834484,
0.18683750927448273,
0.02826288901269436,
0.07812841981649399,
-0.05505206808447838,
0.08416654169559479,
-0.017601564526557922,
0.0586070716381073,
-0.02178136818110943,
0.1421145349740982,
0.015587701462209225,
-0.027545059099793434,
-0.026910217478871346,
-0.06306570768356323,
0.03222304955124855,
0.05814836546778679,
0.05964520201086998,
-0.00010663704597391188,
-0.021135924383997917,
-0.15078958868980408,
0.09932290017604828,
0.2452668398618698,
0.030504347756505013,
-0.06890402734279633,
-0.13954657316207886,
-0.004738884977996349,
-0.09296666085720062,
-0.060901347547769547,
0.001629998441785574,
0.15802717208862305,
-0.18586528301239014,
0.04734155163168907,
0.012252306565642357,
0.06333910673856735,
-0.062257442623376846,
-0.03998306393623352,
-0.015038437210023403,
0.05631150305271149,
-0.05130413919687271,
0.05877083167433739,
-0.15481390058994293,
0.14995567500591278,
0.02681673876941204,
0.008765806443989277,
-0.10559498518705368,
-0.0010016495361924171,
0.03902069106698036,
0.022344818338751793,
0.17913897335529327,
0.034636929631233215,
0.12026048451662064,
0.011037606745958328,
-0.14379702508449554,
0.004847855307161808,
0.05822623148560524,
-0.1300937831401825,
0.012113526463508606,
0.021962758153676987,
-0.041929032653570175,
-0.05933389440178871,
0.04073977470397949,
-0.11841458082199097,
-0.09698829799890518,
-0.010479678399860859,
-0.07223939895629883,
-0.04528310149908066,
-0.053431518375873566,
-0.0417044423520565,
-0.07529452443122864,
0.13221611082553864,
0.009643876925110817,
-0.0345076285302639,
-0.15215465426445007,
-0.06342171132564545,
0.13536006212234497,
-0.06319908052682877,
0.08251161128282547,
0.005928881000727415,
0.11925511062145233,
-0.09302746504545212,
-0.11676819622516632,
0.046899836510419846,
-0.1585758775472641,
-0.11649373173713684,
-0.02438945695757866,
0.16053830087184906,
0.09826184064149857,
0.04025770351290703,
0.015379478223621845,
0.02478085458278656,
0.04671024531126022,
-0.11101603507995605,
0.011054256930947304,
0.046530451625585556,
0.04778182506561279,
0.12930305302143097,
-0.038412176072597504,
-0.10100357979536057,
0.014415761455893517,
0.005683576688170433,
0.07273602485656738,
0.16009929776191711,
-0.05842108279466629,
0.14735101163387299,
0.10604110360145569,
-0.032140735536813736,
-0.29264065623283386,
0.06380373984575272,
0.0309025589376688,
-0.016528036445379257,
-0.005777915474027395,
-0.07977747172117233,
0.09926974028348923,
0.059326183050870895,
-0.02166251465678215,
0.04587579891085625,
-0.10740894824266434,
-0.08756104856729507,
0.05283842980861664,
0.005366567522287369,
-0.004939737264066935,
-0.04288988560438156,
-0.03822036460042,
-0.029081914573907852,
-0.16245467960834503,
0.07862986624240875,
-0.06528613716363907,
0.020358724519610405,
-0.013757561333477497,
0.05497533082962036,
0.04200524464249611,
-0.04888661578297615,
0.05738537013530731,
0.07225541770458221,
0.02918567880988121,
-0.0861549824476242,
0.013516047969460487,
0.08538106083869934,
0.01734132133424282,
0.15130658447742462,
0.05784393101930618,
0.03903598338365555,
-0.11717402189970016,
0.017506010830402374,
-0.0864865854382515,
0.20126597583293915,
-0.040666837245225906,
-0.09777325391769409,
-0.0712871178984642,
0.09229933470487595,
0.02712351270020008,
0.006290029734373093,
-0.029639096930623055,
0.0117953447625041,
0.03292741999030113,
0.14660242199897766,
0.12631632387638092,
0.07156294584274292,
-0.004940677899867296,
0.056690435856580734,
-0.061233386397361755,
0.01167488656938076,
-0.030236054211854935,
0.03715799003839493,
0.10912392288446426,
0.01513934601098299,
0.05372902378439903,
-0.03762613236904144,
-0.13604411482810974,
-0.01691669598221779,
0.04523728787899017,
-0.21693332493305206,
-0.16037322580814362,
-0.0895623043179512,
-0.17389711737632751,
0.05649837106466293,
0.07628599554300308,
0.1740996241569519,
-0.09597158432006836,
-0.022034527733922005,
0.0002739445772022009,
0.01165713183581829,
0.0026102305855602026,
0.03370635583996773,
0.05966696888208389,
0.0010620984248816967,
-0.0559755377471447,
0.18113909661769867,
0.006800456438213587,
-0.13196803629398346,
0.042396191507577896,
0.09840328246355057,
-0.07754707336425781,
-0.04163180664181709,
-0.04405296966433525,
0.037414614111185074,
-0.16817250847816467,
-0.02233905717730522,
-0.016724538058042526,
-0.029031511396169662,
-0.008057359606027603,
0.12824086844921112,
0.02919771894812584,
-0.0011372893350198865,
-0.01964849792420864,
0.024120347574353218,
-0.06752986460924149,
0.056745495647192,
0.027561718598008156,
0.043309178203344345,
0.0012678519124165177,
0.03358246758580208,
-0.0785190761089325,
-0.0019603720866143703,
-0.0534985288977623,
0.014500961638987064,
-0.08074019104242325,
-0.10783465951681137,
-0.14677190780639648,
-0.061814527958631516,
-0.03964248672127724,
-0.012307258322834969,
-0.048917632550001144,
-0.07420363277196884,
0.02482098899781704,
0.025112828239798546,
-0.042331960052251816,
-0.004759057890623808,
-0.002034979173913598,
0.08655548095703125,
-0.12213532626628876,
-0.02208956889808178,
0.07732046395540237,
-0.031925372779369354,
0.22199051082134247,
0.0827653631567955,
0.006140329409390688,
0.022099776193499565,
-0.1465168595314026,
-0.020138589665293694,
-0.023292090743780136,
0.020829619839787483,
0.11443524062633514,
-0.01948051154613495,
-0.0060844095423817635,
-0.019710173830389977,
-0.0025377802085131407,
0.007073880173265934,
0.023438259959220886,
-0.04008099436759949,
0.06717701256275177,
0.044646695256233215,
-0.043255243450403214,
-0.07998980581760406,
0.034693554043769836,
0.13079749047756195,
0.03442814201116562,
-0.0230462197214365,
-0.059582605957984924,
0.045634813606739044,
-0.07838205248117447,
0.004258478060364723,
0.02626059763133526,
-0.06753139942884445,
-0.06836149841547012,
0.03608955815434456,
0.031137263402342796,
-0.0044159130193293095,
0.21730133891105652,
0.033882491290569305,
0.06187283992767334,
0.047104109078645706,
0.06207645311951637,
-0.037488214671611786,
-0.009354740381240845,
0.05856868997216225,
-0.07596277445554733,
0.015298794023692608,
-0.09795442223548889,
0.014251567423343658,
-0.03986726701259613,
-0.10581403970718384,
0.10234527289867401,
0.024957260116934776,
-0.035124074667692184,
0.027127081528306007,
0.0753478854894638,
-0.085907943546772,
-0.07993423938751221,
-0.10088689625263214,
0.04253770783543587,
0.086404949426651,
-0.08364193141460419,
0.07632573693990707,
0.052839815616607666,
-0.22174876928329468,
0.0695161446928978,
0.04010084271430969,
-0.009245662949979305,
-0.01917813904583454,
-0.06197114288806915,
-0.003151754615828395,
-0.05307108536362648,
0.05225084349513054,
-0.12296023219823837,
0.06833997368812561,
-0.024461902678012848,
0.053667277097702026,
-0.02928350120782852,
0.15740624070167542,
-0.1806102842092514,
-0.04056822136044502,
0.10474368929862976,
0.06577914208173752,
0.01712477207183838,
0.0030183037742972374,
-0.007021427154541016,
-0.08653797209262848,
0.0684436559677124,
0.0235518217086792,
0.07449858635663986,
0.054783254861831665,
-0.000808218726888299,
-0.011509370058774948,
-0.057715434581041336,
-0.02385965920984745,
-0.019102541729807854,
-0.029152020812034607,
0.13559755682945251,
0.049614161252975464,
0.014343533664941788,
-0.017819257453083992,
0.22101916372776031,
-0.09334424138069153,
0.013097543269395828,
-0.11315133422613144,
0.14031612873077393,
-0.04698117822408676,
0.05460106208920479,
0.02444707788527012,
-0.1011001318693161,
-0.04056668281555176,
0.1420658528804779,
0.2352137416601181,
-0.0008398953941650689,
0.025059180334210396,
0.02678600884974003,
-0.006897301413118839,
0.054949529469013214,
0.07737362384796143,
0.01445457711815834,
0.28721657395362854,
-0.035698700696229935,
0.07294000685214996,
-0.009856605902314186,
-0.003208651440218091,
-0.051086802035570145,
0.210401251912117,
-0.07777485251426697,
-0.07156779617071152,
-0.03896746039390564,
0.07498469203710556,
-0.10227189213037491,
-0.373468816280365,
-0.023887602612376213,
-0.12014970183372498,
-0.09542703628540039,
-0.027698364108800888,
-0.06498345732688904,
0.07222495973110199,
0.09459338337182999,
0.028787193819880486,
0.028834111988544464,
0.21023541688919067,
0.014486330561339855,
-0.13146474957466125,
-0.023459594696760178,
0.05654755234718323,
-0.12939369678497314,
0.24149422347545624,
-0.00864708237349987,
0.011099743656814098,
0.06890342384576797,
-0.004795527551323175,
-0.13687239587306976,
-0.057288337498903275,
0.06541245430707932,
-0.1771710067987442,
-0.021351002156734467,
0.1718759536743164,
-0.021424496546387672,
0.00032032790477387607,
0.01733074150979519,
-0.10886699706315994,
0.06634863466024399,
0.1249169111251831,
0.0270331259816885,
-0.11226733773946762,
0.13195019960403442,
-0.10270390659570694,
0.15540319681167603,
0.14659731090068817,
-0.02080059051513672,
0.013225270435214043,
-0.0841173604130745,
-0.0034122925717383623,
0.03548581153154373,
0.043795712292194366,
0.03227859362959862,
-0.08666087687015533,
0.03740533068776131,
-0.12897849082946777,
-0.024497246369719505,
-0.16455812752246857,
-0.014352143742144108,
-0.05779113247990608,
0.011171816848218441,
-0.03411228582262993,
0.07683634012937546,
0.07264566421508789,
0.007081110496073961,
0.027156872674822807,
0.023462750017642975,
-0.005131975281983614,
0.028647081926465034,
-0.04840829595923424,
-0.03560050204396248
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Distilled Turkish BERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a (cased) distilled model for Turkish 🎉
# 🇹🇷 DistilBERTurk
DistilBERTurk is a community-driven cased distilled BERT model for Turkish.
DistilBERTurk was trained on 7GB of the original training data that was used
for training [BERTurk](https://github.com/stefan-it/turkish-bert/tree/master#stats),
using the cased version of BERTurk as teacher model.
*DistilBERTurk* was trained with the official Hugging Face implementation from
[here](https://github.com/huggingface/transformers/tree/master/examples/distillation)
for 5 days on 4 RTX 2080 Ti GPUs.
More details about distillation can be found in the
["DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"](https://arxiv.org/abs/1910.01108)
paper by Sanh et al. (2019).
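For intuition, the distillation objective pairs the usual masked language modeling loss with a soft-target loss that pulls the student's output distribution towards the teacher's (the official script additionally uses a cosine embedding loss). The sketch below shows only the soft-target component; the `temperature` value is a hypothetical placeholder, not the hyperparameter actually used for DistilBERTurk:

```python
import torch.nn.functional as F

def soft_target_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 as in Sanh et al. (2019)
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
```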
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue in the [BERTurk](https://github.com/stefan-it/turkish-bert) repository!
| Model | Downloads
| --------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/distilbert-base-turkish-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/vocab.txt)
## Usage
With Transformers >= 2.3 our DistilBERTurk model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/distilbert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/distilbert-base-turkish-cased")
```
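As a quick sanity check, the loaded model can embed a Turkish sentence. This is a minimal sketch under two assumptions: a reasonably recent Transformers version (where the forward pass returns an output object with `last_hidden_state`), and mean pooling as a crude sentence representation; the example sentence is arbitrary.

```python
import torch

sentence = "Merhaba dünya!"
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One vector per subword token; mean-pool them for a sentence embedding
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # e.g. torch.Size([1, 768])
```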
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
For PoS tagging, DistilBERTurk outperforms the 24-layer XLM-RoBERTa model.
The overall performance difference between DistilBERTurk and the original
(teacher) BERTurk model is ~1.18%.
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/distilbert-base-turkish-cased
|
[
"transformers",
"pytorch",
"tf",
"distilbert",
"tr",
"arxiv:1910.01108",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1910.01108"
] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #distilbert #tr #arxiv-1910.01108 #license-mit #endpoints_compatible #has_space #region-us
|
+ dbmdz Distilled Turkish BERT model
====================================
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a (cased) distilled model for Turkish
🇹🇷 DistilBERTurk
================
DistilBERTurk is a community-driven cased distilled BERT model for Turkish.
DistilBERTurk was trained on 7GB of the original training data that was used
for training BERTurk,
using the cased version of BERTurk as teacher model.
*DistilBERTurk* was trained with the official Hugging Face implementation from
here
for 5 days on 4 RTX 2080 Ti GPUs.
More details about distillation can be found in the
"DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"
paper by Sanh et al. (2019).
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue in the BERTurk repository!
Usage
-----
With Transformers >= 2.3 our DistilBERTurk model can be loaded like:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
For PoS tagging, DistilBERTurk outperforms the 24-layer XLM-RoBERTa model.
The overall performance difference between DistilBERTurk and the original
(teacher) BERTurk model is ~1.18%.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT models just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[] |
[
"TAGS\n#transformers #pytorch #tf #distilbert #tr #arxiv-1910.01108 #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
47
] |
[
"passage: TAGS\n#transformers #pytorch #tf #distilbert #tr #arxiv-1910.01108 #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
-0.028796138241887093,
0.018791792914271355,
-0.0058485474437475204,
0.032257843762636185,
0.05639076232910156,
0.023025711998343468,
0.07191447168588638,
0.10738883912563324,
0.06761540472507477,
0.008744793944060802,
0.15214738249778748,
0.18979613482952118,
-0.0451444648206234,
0.025295911356806755,
-0.06331414729356766,
-0.2218925654888153,
0.05437302216887474,
0.07307272404432297,
-0.05802389606833458,
0.09643100947141647,
0.08903917670249939,
-0.0807376354932785,
0.06185227632522583,
-0.0075125135481357574,
-0.11638721078634262,
0.033688053488731384,
0.04003065079450607,
-0.08907569944858551,
0.15654213726520538,
0.04644240438938141,
0.12594984471797943,
0.09369229525327682,
-0.029401713982224464,
-0.07836096733808517,
0.02804378606379032,
0.0022588169667869806,
-0.11661078035831451,
0.0673084557056427,
0.018050681799650192,
-0.03133590891957283,
0.13362333178520203,
0.01945680007338524,
-0.015420153737068176,
0.03687094524502754,
-0.16700105369091034,
-0.24886742234230042,
-0.09718117862939835,
0.11091413348913193,
-0.02830539643764496,
0.060269784182310104,
0.04066579043865204,
0.18981479108333588,
-0.10366086661815643,
0.06050742790102959,
0.20552483201026917,
-0.3846036195755005,
-0.0032186475582420826,
0.17034433782100677,
0.08868414163589478,
0.01333855651319027,
-0.03609370067715645,
0.06308554857969284,
0.06636637449264526,
0.023881487548351288,
0.07984708249568939,
-0.06029867008328438,
-0.05457402393221855,
0.09447858482599258,
-0.10628462582826614,
-0.08334476500749588,
0.23458510637283325,
-0.012616634368896484,
0.042090415954589844,
0.037929993122816086,
-0.08895429223775864,
-0.06217467412352562,
0.031395234167575836,
-0.005073964595794678,
0.009254229255020618,
0.06798069179058075,
0.020922604948282242,
-0.028299735859036446,
-0.16671879589557648,
0.03232439234852791,
-0.24515298008918762,
0.137983039021492,
-0.013350717723369598,
0.0773947685956955,
-0.14915694296360016,
0.0705673098564148,
-0.04869712144136429,
-0.08848100900650024,
0.02525363303720951,
-0.08797046542167664,
0.08202160149812698,
0.020955046638846397,
-0.06812657415866852,
0.05689162015914917,
0.03645069897174835,
0.17053422331809998,
-0.03334579989314079,
-0.017744841054081917,
0.03789909556508064,
0.12314551323652267,
-0.01782347820699215,
0.05049201473593712,
-0.04007204249501228,
0.033935293555259705,
0.007034557871520519,
-0.10682413727045059,
0.019991302862763405,
-0.03864450380206108,
-0.13813716173171997,
-0.09208427369594574,
-0.008775757625699043,
0.06299877911806107,
0.05951394885778427,
0.034544043242931366,
-0.04542369395494461,
0.04500081390142441,
0.12104863673448563,
-0.019903501495718956,
0.004863600712269545,
-0.024759212508797646,
0.042304959148168564,
0.06749161332845688,
0.030662167817354202,
-0.02086256444454193,
0.04571756720542908,
0.09016440808773041,
-0.1105017438530922,
-0.015574198216199875,
-0.012434959411621094,
-0.09625445306301117,
0.08350016921758652,
-0.12183640897274017,
0.0484306775033474,
-0.19672313332557678,
-0.02810242772102356,
0.03850819915533066,
0.04544910416007042,
-0.0100906016305089,
-0.010720651596784592,
0.09742990881204605,
-0.07031485438346863,
0.05784321203827858,
-0.05506496503949165,
-0.041948460042476654,
-0.06823138892650604,
0.09994581341743469,
-0.06650130450725555,
0.08639438450336456,
-0.18501736223697662,
0.04045084863901138,
-0.07773283123970032,
0.029013240709900856,
-0.06032376363873482,
-0.07055474072694778,
-0.060709115117788315,
0.13501319289207458,
-0.011644684709608555,
-0.09411598742008209,
-0.1306745707988739,
0.03287101536989212,
0.004212420433759689,
0.10105686634778976,
-0.14095811545848846,
-0.04309830069541931,
0.13243307173252106,
-0.09650322049856186,
-0.17287662625312805,
0.05644373223185539,
0.019988728687167168,
0.01798800565302372,
-0.0033574718981981277,
0.22064277529716492,
0.0288486760109663,
-0.14133672416210175,
-0.01833302527666092,
0.15328764915466309,
-0.12244162708520889,
-0.15810704231262207,
0.06935185194015503,
-0.003807975910604,
-0.028457244858145714,
-0.007273054216057062,
-0.008386502973735332,
0.08375562727451324,
-0.05375339463353157,
-0.04768030345439911,
-0.06136876344680786,
-0.014895819127559662,
0.0876709520816803,
0.03642263263463974,
0.07145731896162033,
-0.08751039206981659,
-0.038997091352939606,
0.04727339744567871,
0.01649761013686657,
0.09280308336019516,
0.05207027122378349,
-0.049237873405218124,
0.12649992108345032,
-0.008876802399754524,
-0.04200785234570503,
-0.14368776977062225,
-0.04907916113734245,
-0.059825599193573,
0.07091864198446274,
0.015792516991496086,
0.27087512612342834,
0.06145933270454407,
-0.07684283703565598,
-0.010428335517644882,
-0.017598815262317657,
0.07925828546285629,
0.07812637090682983,
-0.024662643671035767,
-0.08260612189769745,
-0.007157038431614637,
-0.056452326476573944,
-0.080600805580616,
-0.053946252912282944,
0.01966710388660431,
0.134922593832016,
0.10471006482839584,
-0.03479505702853203,
0.08020766824483871,
-0.03387932851910591,
0.029304709285497665,
-0.048566389828920364,
-0.017754144966602325,
0.08816797286272049,
0.022907063364982605,
-0.04691559076309204,
0.21599438786506653,
-0.08631763607263565,
0.35214805603027344,
0.21669615805149078,
-0.18511046469211578,
-0.029473140835762024,
0.04453452304005623,
-0.050013381987810135,
0.03507017344236374,
0.0809539332985878,
-0.04062055051326752,
-0.050767309963703156,
-0.045901596546173096,
0.08896350860595703,
-0.03929679095745087,
-0.036915287375450134,
0.009161877445876598,
-0.05821862071752548,
-0.08065300434827805,
0.06100437790155411,
0.07299795001745224,
-0.191167414188385,
0.1882409304380417,
0.35987594723701477,
0.0372968427836895,
0.10365879535675049,
-0.04211646318435669,
0.0001965597621165216,
-0.03158848360180855,
-0.031098980456590652,
-0.04188893735408783,
0.10765834152698517,
-0.13682323694229126,
-0.024463526904582977,
0.08853607624769211,
0.008172310888767242,
0.048264358192682266,
-0.16370104253292084,
-0.09968932718038559,
0.04299686849117279,
0.032012008130550385,
-0.11187383532524109,
0.16103751957416534,
0.01335754245519638,
0.11517011374235153,
-0.00944913737475872,
-0.10192197561264038,
0.07876617461442947,
0.024347439408302307,
-0.04466827213764191,
0.10999071598052979,
-0.1415635198354721,
-0.19187407195568085,
-0.09854782372713089,
-0.06355006247758865,
0.04934727028012276,
0.0044456119649112225,
0.1254831850528717,
-0.03614011034369469,
-0.021417761221528053,
0.010228685103356838,
-0.03887404128909111,
-0.14491501450538635,
0.03345983475446701,
-0.038930922746658325,
0.019816827028989792,
-0.04157043993473053,
-0.12560486793518066,
-0.08482847362756729,
-0.018998512998223305,
-0.0436585359275341,
0.12077901512384415,
-0.03274943307042122,
0.08725887537002563,
0.11039380729198456,
-0.02288505993783474,
0.03601061925292015,
-0.052940353751182556,
0.19734196364879608,
-0.04217619076371193,
0.029747415333986282,
0.19072377681732178,
0.058280784636735916,
0.08497686684131622,
0.13604331016540527,
0.07220536470413208,
-0.04490915313363075,
-0.019264942035079002,
-0.05791366845369339,
-0.11588389426469803,
-0.17562194168567657,
-0.09907028824090958,
-0.12833689153194427,
0.007048081140965223,
0.009738799184560776,
0.08094573765993118,
0.11868428438901901,
0.05768217891454697,
0.025461949408054352,
-0.045504290610551834,
-0.08681117743253708,
0.05350175499916077,
0.28026148676872253,
-0.05574388429522514,
0.09676481783390045,
-0.09931284934282303,
-0.039351936429739,
0.11735668033361435,
0.04087769240140915,
0.12972335517406464,
0.08230341970920563,
-0.01640842668712139,
0.10253634303808212,
0.22048816084861755,
0.09853833168745041,
0.09883766621351242,
0.007072823122143745,
-0.051850929856300354,
-0.03584688529372215,
-0.03405007719993591,
0.022573187947273254,
0.06798422336578369,
0.06591513007879257,
-0.1328027993440628,
-0.00513141043484211,
-0.20952391624450684,
0.041022852063179016,
0.04554961249232292,
0.07381933927536011,
-0.16781432926654816,
-0.020176412537693977,
0.04110340029001236,
0.025178514420986176,
-0.019149556756019592,
0.06265444308519363,
0.06806887686252594,
-0.05909973010420799,
0.05174482613801956,
0.007630275562405586,
0.06268762797117233,
0.12735743820667267,
0.07213150709867477,
-0.015541791915893555,
-0.14492501318454742,
0.018593234941363335,
0.06107315793633461,
-0.2992587685585022,
0.2801213264465332,
-0.023636650294065475,
-0.08736107498407364,
-0.025378184393048286,
-0.0480777882039547,
0.027274826541543007,
0.1772812455892563,
0.10580656677484512,
0.056802112609148026,
-0.08523674309253693,
-0.11292316764593124,
0.05716406926512718,
-0.009556567296385765,
0.056987252086400986,
0.006541332229971886,
-0.04193434491753578,
-0.02605378068983555,
0.00868297927081585,
0.034398093819618225,
0.1964482069015503,
0.0051878527738153934,
-0.09557031840085983,
0.06884507834911346,
0.03718537464737892,
-0.018248066306114197,
-0.04535442218184471,
-0.040114689618349075,
-0.1529960185289383,
0.05892161652445793,
-0.04437188059091568,
-0.027390340343117714,
-0.10707230865955353,
-0.14199785888195038,
0.11296907812356949,
-0.07085983455181122,
0.08134586364030838,
-0.04725591465830803,
-0.08026835322380066,
-0.09651661664247513,
-0.1704423874616623,
0.13555674254894257,
-0.09734343737363815,
-0.018907487392425537,
-0.034917451441287994,
0.13562479615211487,
-0.10379330068826675,
0.060607898980379105,
0.005328942555934191,
0.06594595313072205,
-0.12989352643489838,
-0.10571377724409103,
0.04037603735923767,
-0.08324673026800156,
0.07197019457817078,
-0.07237901538610458,
-0.024293020367622375,
0.027334073558449745,
0.06855535507202148,
-0.016754284501075745,
0.1833694726228714,
0.23963415622711182,
-0.12587016820907593,
0.14256423711776733,
0.08016061037778854,
-0.014586275443434715,
-0.2722010016441345,
-0.08383254706859589,
-0.18582600355148315,
-0.043543435633182526,
0.08057991415262222,
-0.05877820774912834,
0.02672952599823475,
0.0546986348927021,
-0.06429395079612732,
0.12703834474086761,
-0.2528819739818573,
-0.07890746742486954,
0.12919771671295166,
-0.045079249888658524,
0.4020475447177887,
-0.1348555088043213,
-0.0466737262904644,
0.04345942661166191,
-0.2724064886569977,
0.133653461933136,
-0.0065351142548024654,
0.05872255563735962,
-0.04203185439109802,
0.0336601547896862,
0.014752411283552647,
-0.06337738037109375,
0.1469561755657196,
-0.004904669709503651,
0.05304063484072685,
-0.1153087392449379,
-0.1551879495382309,
0.1472109854221344,
-0.022132879123091698,
0.013329868204891682,
-0.032869771122932434,
0.02076846919953823,
-0.15207043290138245,
0.028400858864188194,
-0.1361599564552307,
0.09882563352584839,
-0.00617360370233655,
-0.082045778632164,
-0.08404068648815155,
0.024024218320846558,
-0.010076033882796764,
-0.05581261217594147,
0.2233877032995224,
-0.010199197567999363,
0.18858453631401062,
0.1062169000506401,
0.003590245498344302,
-0.1904318630695343,
-0.07959571480751038,
-0.01680329442024231,
-0.0667799562215805,
0.07357816398143768,
-0.15050293505191803,
0.0029462496750056744,
0.12015228718519211,
0.015262117609381676,
0.022420275956392288,
0.09317607432603836,
-0.008366502821445465,
-0.006941669154912233,
0.16377338767051697,
-0.1744050830602646,
-0.09048138558864594,
-0.013548712246119976,
0.00008269589307019487,
0.123392753303051,
0.05709776282310486,
0.1018705740571022,
-0.02549796551465988,
0.003238013247027993,
0.0030141028109937906,
-0.037850670516490936,
-0.09716330468654633,
0.004425255581736565,
0.09089329838752747,
0.03353582322597504,
-0.08684520423412323,
0.01795049197971821,
0.014458871446549892,
-0.14347712695598602,
-0.03886202722787857,
0.07606793195009232,
-0.08502962440252304,
-0.13762904703617096,
-0.09951069205999374,
-0.07171061635017395,
-0.20734919607639313,
-0.019962839782238007,
0.013655360788106918,
-0.10189872980117798,
0.05308688059449196,
0.22171597182750702,
0.07607028633356094,
0.11249185353517532,
-0.01243214588612318,
-0.029161332175135612,
0.04846811294555664,
-0.05403296276926994,
-0.06531398743391037,
0.04188190773129463,
-0.13381992280483246,
0.09364768862724304,
-0.021464521065354347,
0.13227447867393494,
-0.0720115602016449,
-0.0021835758816450834,
-0.13673989474773407,
0.008987195789813995,
-0.08420194685459137,
-0.08278658986091614,
-0.12503556907176971,
-0.06282275170087814,
0.01535169966518879,
-0.12404896318912506,
-0.05931549891829491,
-0.0007072818116284907,
-0.13365069031715393,
0.027626270428299904,
0.032196637243032455,
0.07783098518848419,
-0.10175912827253342,
-0.04311203956604004,
0.08272729068994522,
0.004842462949454784,
0.08460462838411331,
0.08475963771343231,
-0.05462734401226044,
0.06705179065465927,
-0.07949727028608322,
-0.10295063257217407,
0.0914677083492279,
0.011556446552276611,
0.06138286367058754,
0.04779620096087456,
-0.011588303372263908,
0.05326477065682411,
0.0009500401793047786,
0.046993885189294815,
-0.04401238635182381,
-0.09948858618736267,
-0.0008824806427583098,
-0.01242859661579132,
-0.11465257406234741,
-0.008754389360547066,
-0.07860584557056427,
0.17246079444885254,
0.01630975492298603,
0.0965508222579956,
0.009140157140791416,
0.008076341822743416,
-0.09585660696029663,
-0.002524862764403224,
-0.031031522899866104,
-0.15439090132713318,
-0.017667721956968307,
-0.04632223770022392,
-0.006081562489271164,
-0.010006184689700603,
0.21118295192718506,
0.011729669757187366,
-0.14535674452781677,
0.0622311495244503,
0.08995471149682999,
-0.040511373430490494,
-0.03308720141649246,
0.22791795432567596,
0.035579171031713486,
-0.03375000134110451,
-0.12775498628616333,
0.07699105143547058,
-0.03415732830762863,
-0.07690280675888062,
0.11875925213098526,
0.13814370334148407,
0.059115540236234665,
0.045935939997434616,
0.07099784165620804,
-0.04360431432723999,
-0.13498377799987793,
-0.17440974712371826,
0.015719149261713028,
0.05475200340151787,
-0.058938149362802505,
0.07777425646781921,
0.18426501750946045,
-0.04617936536669731,
0.054908379912376404,
-0.05482592433691025,
0.03135564178228378,
-0.11782129108905792,
-0.10924076288938522,
-0.02249673567712307,
-0.11202114075422287,
0.0010017147287726402,
-0.024986889213323593,
0.069613516330719,
0.1761329472064972,
0.04933534935116768,
-0.020382599905133247,
0.024977445602416992,
0.0037377169355750084,
-0.056898605078458786,
0.008892450481653214,
0.021068040281534195,
0.02133559249341488,
-0.09187827259302139,
0.0011876538628712296,
-0.07159138470888138,
-0.07446403801441193,
-0.05960380658507347,
0.012548336759209633,
-0.027535440400242805,
-0.03948225826025009,
-0.09953977912664413,
-0.06432266533374786,
-0.061437275260686874,
0.07065217941999435,
-0.009252597577869892,
0.12712964415550232,
-0.005072451196610928,
0.04766632616519928,
0.018607797101140022,
0.19616436958312988,
-0.07258570939302444,
-0.07154285162687302,
0.012773037888109684,
0.17573393881320953,
0.029951151460409164,
0.07562959939241409,
-0.0012732072500512004,
0.012395484372973442,
-0.052675049751996994,
0.17903929948806763,
0.3033868372440338,
-0.03527120128273964,
0.08384135365486145,
0.03180139139294624,
0.019167553633451462,
0.061774998903274536,
0.09287252277135849,
0.11415939033031464,
0.24196197092533112,
-0.10597437620162964,
-0.030839210376143456,
-0.08660688251256943,
0.046357084065675735,
-0.07465177774429321,
0.04477201774716377,
0.017005689442157745,
-0.09272098541259766,
-0.026320841163396835,
0.06113772839307785,
-0.08003387600183487,
0.047727350145578384,
0.09092786908149719,
-0.20063647627830505,
-0.026431875303387642,
0.005046670325100422,
0.15132734179496765,
-0.01058141142129898,
0.09591232240200043,
-0.07400581985712051,
-0.07253431528806686,
0.029091987758874893,
0.017914609983563423,
-0.22648583352565765,
-0.025464128702878952,
0.12522616982460022,
0.0178629569709301,
0.04545022174715996,
-0.05561722069978714,
0.03222864121198654,
0.09572771191596985,
0.09389525651931763,
-0.0793735533952713,
0.08243429660797119,
0.05533774569630623,
-0.09106063842773438,
-0.09280429780483246,
-0.12296050786972046,
0.02381976693868637,
-0.07491230964660645,
0.05129282549023628,
-0.16627784073352814,
0.05411877483129501,
0.0894838199019432,
-0.014970095828175545,
-0.026801010593771935,
0.005465998314321041,
-0.0414668470621109,
0.07756529003381729,
0.018131665885448456,
-0.0204940102994442,
-0.0580744631588459,
-0.03965522348880768,
-0.058009929955005646,
0.09357071667909622,
-0.11500599980354309,
-0.1048152968287468,
0.008065590634942055,
-0.01575504243373871,
0.013655954971909523,
-0.009221644140779972,
-0.047606851905584335,
-0.06731243431568146,
-0.04865060746669769,
0.032992154359817505,
-0.09008665382862091,
0.03894711285829544,
0.0670764297246933,
0.014603203162550926,
0.016779610887169838,
-0.02979346178472042,
0.012751450762152672,
0.056277111172676086,
-0.12351168692111969,
-0.03683582320809364
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources French Europeana ELECTRA models 🎉
# French Europeana ELECTRA
We extracted all French texts using the `language` metadata attribute from the Europeana corpus.
The resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.
Based on the metadata information, texts from the 18th - 20th century are mainly included in the
training corpus.
Detailed information about the data and pretraining steps can be found in
[this repository](https://github.com/stefan-it/europeana-bert).
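The exact extraction pipeline is documented in the linked repository; purely as an illustration, filtering a corpus by such a metadata attribute could look like the sketch below, where the record layout is hypothetical and not the actual Europeana schema.

```python
# Hypothetical record layout: one dict per Europeana document
records = [
    {"language": "fr", "text": "Un texte d'exemple en français."},
    {"language": "de", "text": "Ein deutsches Beispiel."},
]

# Keep only documents whose language metadata attribute is French
french_texts = [r["text"] for r in records if r["language"] == "fr"]
```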
## Model weights
ELECTRA model weights for PyTorch and TensorFlow are available.
* French Europeana ELECTRA (discriminator): `dbmdz/electra-base-french-europeana-cased-discriminator` - [model hub page](https://huggingface.co/dbmdz/electra-base-french-europeana-cased-discriminator/tree/main)
* French Europeana ELECTRA (generator): `dbmdz/electra-base-french-europeana-cased-generator` - [model hub page](https://huggingface.co/dbmdz/electra-base-french-europeana-cased-generator/tree/main)
## Results
For results on Historic NER, please refer to [this repository](https://github.com/stefan-it/europeana-bert).
## Usage
With Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-base-french-europeana-cased-discriminator")
model = AutoModel.from_pretrained("dbmdz/electra-base-french-europeana-cased-discriminator")
```
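Since this checkpoint is the discriminator, it can also be loaded with `ElectraForPreTraining` to score tokens for replaced-token detection. The sketch below assumes a reasonably recent Transformers version (where the forward pass returns an output object with per-token `logits`); the example sentence is arbitrary.

```python
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_name = "dbmdz/electra-base-french-europeana-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

inputs = tokenizer("Paris est la capitale de la France.", return_tensors="pt")
with torch.no_grad():
    logits = discriminator(**inputs).logits  # one score per token

# Positive logits mark tokens the discriminator flags as "replaced"
print((logits > 0).long())
```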
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download our models from their S3 storage 🤗
|
{"language": "fr", "license": "mit", "tags": ["historic french"]}
| null |
dbmdz/electra-base-french-europeana-cased-discriminator
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"historic french",
"fr",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #historic french #fr #license-mit #endpoints_compatible #region-us
|
# + dbmdz ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources French Europeana ELECTRA models
# French Europeana ELECTRA
We extracted all French texts using the 'language' metadata attribute from the Europeana corpus.
The resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.
Based on the metadata information, texts from the 18th - 20th century are mainly included in the
training corpus.
Detailed information about the data and pretraining steps can be found in
this repository.
## Model weights
ELECTRA model weights for PyTorch and TensorFlow are available.
* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page
* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page
## Results
For results on Historic NER, please refer to this repository.
## Usage
With Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:
# Huggingface model hub
All models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
Thanks to the generous support from the Hugging Face team,
it is possible to download our models from their S3 storage
|
[
"# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models",
"# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #historic french #fr #license-mit #endpoints_compatible #region-us \n",
"# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models",
"# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[
42,
41,
99,
100,
17,
24,
18,
26,
64
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #historic french #fr #license-mit #endpoints_compatible #region-us \n# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page## Results\n\nFor results on Historic NER, please refer to this repository.## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:# Huggingface model hub\n\nAll models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[
-0.03021431155502796,
0.21530263125896454,
-0.0043554469011723995,
0.06360071152448654,
0.052812933921813965,
0.005696864798665047,
0.10630401223897934,
0.07484256476163864,
0.09075875580310822,
0.06378594785928726,
-0.017115343362092972,
-0.08211016654968262,
0.03680016100406647,
0.05180789902806282,
0.06144724041223526,
-0.184141606092453,
0.021630823612213135,
-0.02522849850356579,
-0.007651675026863813,
0.04405369237065315,
0.10321523994207382,
-0.04096711799502373,
0.11619094759225845,
0.02016601338982582,
-0.1007719337940216,
0.12093591690063477,
-0.05927006155252457,
-0.04262919723987579,
0.11858702450990677,
0.06649605929851532,
0.03234478458762169,
-0.07303804904222488,
0.034922171384096146,
-0.15662571787834167,
0.021783968433737755,
0.05821441113948822,
-0.034324925392866135,
0.061081402003765106,
0.09381268173456192,
-0.08521807193756104,
0.2104933261871338,
-0.06184590980410576,
0.0019826143980026245,
0.030356390401721,
-0.0472208708524704,
-0.014891386963427067,
-0.15913358330726624,
0.09980355948209763,
-0.017810355871915817,
0.09504333883523941,
0.02913174219429493,
0.08902158588171005,
-0.07158626616001129,
-0.0074594891630113125,
0.19751396775245667,
-0.11046555638313293,
-0.03529684618115425,
0.03392369672656059,
-0.0003494994598440826,
0.029639506712555885,
-0.09682633727788925,
0.048252955079078674,
0.006452804431319237,
0.03767534717917442,
0.02157716080546379,
-0.04885846748948097,
-0.04127924516797066,
-0.02004043012857437,
-0.05355755239725113,
-0.040877748280763626,
0.20747099816799164,
-0.0028179879300296307,
-0.07683033496141434,
-0.13793671131134033,
-0.08343125879764557,
0.11746621876955032,
-0.012963793240487576,
-0.0068233925849199295,
0.043243926018476486,
-0.017937593162059784,
0.0917261466383934,
-0.12485163658857346,
-0.047881416976451874,
-0.018612174317240715,
-0.07731328904628754,
0.18270689249038696,
-0.009390688501298428,
0.03762657195329666,
0.06621792167425156,
0.08632401376962662,
-0.113230399787426,
-0.03818259388208389,
-0.006550005171447992,
-0.031241510063409805,
-0.049722976982593536,
-0.032130271196365356,
-0.01673196628689766,
-0.030849160626530647,
0.016210095956921577,
0.18160903453826904,
-0.13829119503498077,
-0.041136499494314194,
-0.08997738361358643,
-0.01170649379491806,
0.05315793305635452,
0.12223472446203232,
-0.09681108593940735,
-0.18055331707000732,
0.02659447491168976,
-0.11854727566242218,
0.04324741289019585,
0.04441557452082634,
-0.030534543097019196,
-0.07464664429426193,
-0.006175908260047436,
0.038977500051259995,
-0.0005037490627728403,
0.0019019785104319453,
-0.004454044159501791,
-0.033920228481292725,
0.22776924073696136,
-0.09421496093273163,
0.020884808152914047,
0.017558950930833817,
-0.10478939116001129,
0.021198585629463196,
0.03572020307183266,
-0.011169317178428173,
-0.10867369920015335,
0.05980311334133148,
-0.08937706053256989,
-0.031247910112142563,
-0.06823136657476425,
-0.12134496122598648,
0.05702684447169304,
-0.06526562571525574,
-0.02654438279569149,
-0.10646834224462509,
-0.14168328046798706,
-0.04123268648982048,
0.05108780041337013,
-0.05943848937749863,
0.014087338000535965,
0.0035211110953241587,
-0.032124411314725876,
-0.0179747324436903,
0.013420850038528442,
-0.03447068855166435,
-0.04361429065465927,
-0.03639915585517883,
-0.1789259910583496,
0.0028611624147742987,
-0.06968221068382263,
0.02047164924442768,
-0.08386323601007462,
-0.02659515105187893,
-0.2657482922077179,
0.053540781140327454,
-0.07856893539428711,
-0.023335710167884827,
-0.09325318038463593,
0.015442294999957085,
-0.016606425866484642,
0.013999919407069683,
0.04613777622580528,
0.11702712625265121,
-0.12266232073307037,
-0.0479913130402565,
0.1671195775270462,
-0.10050783306360245,
-0.06376270949840546,
0.17005731165409088,
-0.02237357571721077,
0.00789773091673851,
0.04861338436603546,
0.140485942363739,
0.08788108080625534,
-0.14328685402870178,
-0.0853324681520462,
-0.03710181638598442,
-0.026442429050803185,
0.007474827114492655,
0.028624944388866425,
-0.10699919611215591,
0.0690772607922554,
-0.01652706041932106,
-0.053492553532123566,
0.024772465229034424,
0.008859412744641304,
0.0315435454249382,
-0.01488643977791071,
-0.036388665437698364,
0.009283283725380898,
0.037644099444150925,
-0.012006152421236038,
-0.024963704869151115,
-0.04050542786717415,
0.12094075232744217,
0.07576131075620651,
-0.02516499161720276,
0.050610702484846115,
0.017725568264722824,
0.05910416319966316,
-0.0035981102846562862,
-0.009486449882388115,
-0.13238441944122314,
-0.10202930122613907,
0.07977120578289032,
-0.06910086423158646,
0.0835157185792923,
-0.025248132646083832,
0.05611950904130936,
0.07768667489290237,
-0.041773002594709396,
-0.0000644818865112029,
0.00018035956600215286,
-0.024760302156209946,
-0.0076097180135548115,
-0.08727087825536728,
0.025328271090984344,
-0.024521220475435257,
0.13813963532447815,
-0.040690891444683075,
-0.015141502022743225,
0.030656397342681885,
0.1984003782272339,
0.007806084584444761,
-0.034974630922079086,
-0.03403966501355171,
0.018624335527420044,
-0.013613943941891193,
-0.03628441318869591,
-0.02077118307352066,
-0.03728371486067772,
-0.011066926643252373,
0.11346378922462463,
-0.0470619797706604,
0.0330662839114666,
0.08691596239805222,
0.12294089794158936,
-0.053172774612903595,
-0.05737388879060745,
-0.05757357180118561,
-0.00919745396822691,
-0.047360312193632126,
-0.0918133333325386,
0.18169473111629486,
0.05889468267560005,
0.09801849722862244,
-0.08767557144165039,
-0.09023799002170563,
0.0010474385926499963,
0.023048538714647293,
-0.0954202339053154,
0.07096090912818909,
0.010634591802954674,
-0.04566633328795433,
0.09836024791002274,
0.003362385556101799,
0.006398183759301901,
0.2080121785402298,
0.0014647325733676553,
-0.08108191192150116,
0.023717263713479042,
-0.04822195693850517,
-0.016384728252887726,
0.13315580785274506,
0.022604195401072502,
0.014268492348492146,
0.026308121159672737,
0.003708956763148308,
0.038813650608062744,
-0.037907250225543976,
0.05150526762008667,
-0.042184289544820786,
-0.05653245374560356,
0.057109083980321884,
0.05293120816349983,
0.026315301656723022,
0.11741631478071213,
0.04410495609045029,
0.007904539816081524,
-0.07923048734664917,
-0.04330320656299591,
-0.09068235009908676,
0.1278483122587204,
-0.1497541069984436,
-0.13425204157829285,
-0.16617490351200104,
0.055715370923280716,
-0.16182607412338257,
0.022470474243164062,
-0.016122549772262573,
0.014871898107230663,
-0.08134914189577103,
-0.12295365333557129,
0.06165105476975441,
0.04503758251667023,
-0.08526327461004257,
-0.05080365017056465,
-0.02455538883805275,
-0.029086964204907417,
-0.14991749823093414,
-0.012936961837112904,
-0.03679949790239334,
-0.05707906186580658,
-0.05270335078239441,
-0.015564623288810253,
0.10555607080459595,
0.012831883504986763,
-0.06085076928138733,
-0.020589709281921387,
-0.01716352067887783,
0.13774025440216064,
-0.10630448162555695,
0.13464856147766113,
0.10216210782527924,
0.02958696149289608,
0.05467663332819939,
0.09419365227222443,
0.038955312222242355,
0.013860656879842281,
0.01225157268345356,
0.08417466282844543,
-0.014213616028428078,
-0.17541435360908508,
-0.18906639516353607,
-0.01994411088526249,
-0.006615976803004742,
0.03375992551445961,
0.06574933975934982,
0.07586881518363953,
0.014611317776143551,
-0.09080848842859268,
-0.042105913162231445,
0.04338512197136879,
0.062456853687763214,
0.10766535997390747,
0.03319643810391426,
-0.018611958250403404,
-0.059368349611759186,
-0.012798607349395752,
0.15375182032585144,
0.025536242872476578,
0.08399933576583862,
-0.05171588435769081,
0.09665828198194504,
-0.006723790895193815,
0.0064435251988470554,
-0.03838491067290306,
0.13219738006591797,
0.00025291857309639454,
0.014584941789507866,
-0.022729549556970596,
-0.053785309195518494,
-0.0002723136858548969,
0.030745431780815125,
0.03856489434838295,
0.006963169202208519,
-0.03417932242155075,
-0.11133967339992523,
0.10435830801725388,
0.1745835542678833,
-0.050881896167993546,
-0.0897163674235344,
-0.07572835683822632,
0.01360153965651989,
-0.08380551636219025,
-0.09377406537532806,
0.018804777413606644,
0.0951431393623352,
-0.18878623843193054,
0.04987809434533119,
0.008536049164831638,
0.07749038934707642,
-0.04672412946820259,
-0.007709348574280739,
0.007228046655654907,
0.08749771863222122,
-0.0472843311727047,
0.027975203469395638,
-0.15433666110038757,
0.11714768409729004,
0.02614176832139492,
0.021298682317137718,
-0.080587238073349,
0.023046346381306648,
0.03317886218428612,
0.02074735052883625,
0.1627863645553589,
0.015302376821637154,
0.03708944469690323,
0.0557752400636673,
-0.05972324311733246,
-0.00943142268806696,
0.028143147006630898,
-0.12561656534671783,
0.03146054223179817,
-0.007617647293955088,
-0.037025805562734604,
-0.06823678314685822,
-0.016417494043707848,
-0.10532762110233307,
-0.14396223425865173,
-0.023412708193063736,
-0.04596610739827156,
0.004613006021827459,
-0.03788438066840172,
-0.01061095017939806,
-0.10276157408952713,
0.13667987287044525,
-0.014961336739361286,
-0.03029448166489601,
-0.136987566947937,
-0.09957608580589294,
0.15428030490875244,
-0.056179478764534,
0.08134812861680984,
-0.025526631623506546,
0.13912460207939148,
-0.11300729215145111,
-0.09115546941757202,
0.014796334318816662,
-0.1359567642211914,
-0.05090036615729332,
-0.020974433049559593,
0.14783425629138947,
0.10322962701320648,
0.01711081713438034,
0.021117785945534706,
0.041557326912879944,
0.009116042405366898,
-0.11957754194736481,
-0.017758535221219063,
0.04735337570309639,
0.05352669209241867,
0.1369745135307312,
-0.07762216031551361,
-0.1384640634059906,
-0.021779242902994156,
-0.0007379152812063694,
0.07713103294372559,
0.10581442713737488,
-0.03390112146735191,
0.1295660138130188,
0.1345202624797821,
-0.08267062157392502,
-0.25132668018341064,
0.02534724958240986,
0.018103836104273796,
-0.03389259800314903,
0.002379197860136628,
-0.13533300161361694,
…768-dimensional embedding vector omitted…
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources French Europeana ELECTRA models 🎉
# French Europeana ELECTRA
We extracted all French texts using the `language` metadata attribute from the Europeana corpus.
The resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.
Based on the metadata information, texts from the 18th - 20th century are mainly included in the
training corpus.
Detailed information about the data and pretraining steps can be found in
[this repository](https://github.com/stefan-it/europeana-bert).
## Model weights
ELECTRA model weights for PyTorch and TensorFlow are available.
* French Europeana ELECTRA (discriminator): `dbmdz/electra-base-french-europeana-cased-discriminator` - [model hub page](https://huggingface.co/dbmdz/electra-base-french-europeana-cased-discriminator/tree/main)
* French Europeana ELECTRA (generator): `dbmdz/electra-base-french-europeana-cased-generator` - [model hub page](https://huggingface.co/dbmdz/electra-base-french-europeana-cased-generator/tree/main)
## Results
For results on Historic NER, please refer to [this repository](https://github.com/stefan-it/europeana-bert).
## Usage
With Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-base-french-europeana-cased-discriminator")
model = AutoModel.from_pretrained("dbmdz/electra-base-french-europeana-cased-discriminator")
```
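The generator checkpoint listed above carries the masked-LM head, so it can also be tried for masked-token prediction. A minimal sketch (the example sentence and the use of the `fill-mask` pipeline are ours, assuming the tokenizer uses the BERT-style `[MASK]` token):
```python
from transformers import pipeline

# The generator (not the discriminator) has a masked-LM head,
# which is what the fill-mask pipeline needs.
fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/electra-base-french-europeana-cased-generator",
)

# Prints the top-scoring completions for the masked position.
print(fill_mask("Paris est la [MASK] de la France."))
```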
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download our models from their S3 storage 🤗
|
{"language": "fr", "license": "mit", "tags": ["historic french"]}
|
fill-mask
|
dbmdz/electra-base-french-europeana-cased-generator
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"fill-mask",
"historic french",
"fr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #fill-mask #historic french #fr #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# + dbmdz ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources French Europeana ELECTRA models
# French Europeana ELECTRA
We extracted all French texts using the 'language' metadata attribute from the Europeana corpus.
The resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.
Based on the metadata information, texts from the 18th - 20th century are mainly included in the
training corpus.
Detailed information about the data and pretraining steps can be found in
this repository.
## Model weights
ELECTRA model weights for PyTorch and TensorFlow are available.
* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page
* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page
## Results
For results on Historic NER, please refer to this repository.
## Usage
With Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:
# Huggingface model hub
All models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download our models from their S3 storage
|
[
"# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models",
"# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #historic french #fr #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models",
"# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.",
"## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page",
"## Results\n\nFor results on Historic NER, please refer to this repository.",
"## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[
57,
41,
99,
100,
17,
24,
18,
26,
64
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #historic french #fr #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# + dbmdz ELECTRA models\n\nIn this repository the MDZ Digital Library team (dbmdz) at the Bavarian State\nLibrary open sources French Europeana ELECTRA models# French Europeana ELECTRA\n\nWe extracted all French texts using the 'language' metadata attribute from the Europeana corpus.\n\nThe resulting corpus has a size of 63GB and consists of 11,052,528,456 tokens.\n\nBased on the metadata information, texts from the 18th - 20th century are mainly included in the\ntraining corpus.\n\nDetailed information about the data and pretraining steps can be found in\nthis repository.## Model weights\n\nELECTRA model weights for PyTorch and TensorFlow are available.\n\n* French Europeana ELECTRA (discriminator): 'dbmdz/electra-base-french-europeana-cased-discriminator' - model hub page\n* French Europeana ELECTRA (generator): 'dbmdz/electra-base-french-europeana-cased-generator' - model hub page## Results\n\nFor results on Historic NER, please refer to this repository.## Usage\n\nWith Transformers >= 2.3 our French Europeana ELECTRA model can be loaded like:# Huggingface model hub\n\nAll models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our ELECTRA models just open an issue\nhere# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download our models from their S3 storage"
] |
[ …768-dimensional embedding vector omitted… ] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz BERT and ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources Italian BERT and ELECTRA models 🎉
# Italian BERT
The source data for the Italian BERT model consists of a recent Wikipedia dump and
various texts from the [OPUS corpora](http://opus.nlpl.eu/) collection. The final
training corpus has a size of 13GB and 2,050,057,573 tokens.
For sentence splitting, we use NLTK (which is faster than spaCy).
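As a rough illustration of this preprocessing step (a sketch only; the example text is ours and the exact pipeline used for pretraining is not shown here), NLTK's Punkt tokenizer splits sentences like so:
```python
import nltk
from nltk.tokenize import sent_tokenize

nltk.download("punkt")  # one-time download of the Punkt sentence models

text = "Roma è la capitale d'Italia. Milano è nota per la moda."
# "italian" selects the pretrained Punkt model for Italian.
print(sent_tokenize(text, language="italian"))
```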
Our cased and uncased models are trained with an initial sequence length of 512
subwords for ~2-3M steps.
For the XXL Italian models, we use the same training data from OPUS and extend
it with data from the Italian part of the [OSCAR corpus](https://traces1.inria.fr/oscar/).
Thus, the final training corpus has a size of 81GB and 13,138,379,147 tokens.
Note: Unfortunately, a wrong vocab size was used when training the XXL models.
This explains the mismatch between the "real" vocab size of 31102 and the
vocab size specified in `config.json`. However, the model works, and all
evaluations were done under those circumstances.
See [this issue](https://github.com/dbmdz/berts/issues/7) for more information.
The Italian ELECTRA model was trained on the "XXL" corpus for 1M steps in total using a batch
size of 128. We largely followed the ELECTRA training procedure used for
[BERTurk](https://github.com/stefan-it/turkish-bert/tree/master/electra).
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| ---------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-italian-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/vocab.txt)
| `dbmdz/bert-base-italian-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-discriminator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-generator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-generator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/vocab.txt)
## Results
For results on downstream tasks like NER or PoS tagging, please refer to
[this repository](https://github.com/stefan-it/italian-bertelectra).
## Usage
With Transformers >= 2.3 our Italian BERT models can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
To load the (recommended) Italian XXL BERT models, just use:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-xxl-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
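As a quick sanity check, the `tokenizer` and `model` objects from the snippet above can be used to produce contextual embeddings (a sketch; the example sentence and the shape check are ours):
```python
import torch

# Encode a sentence and inspect the contextual token representations.
inputs = tokenizer("Una frase di esempio.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# For a base-sized BERT: (batch, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```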
To load the Italian XXL ELECTRA model (discriminator), just use:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/electra-base-italian-xxl-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
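For the discriminator's actual pretraining objective, replaced-token detection, the head-specific `ElectraForPreTraining` class can be used instead of the generic `AutoModel`. A minimal sketch (the example sentence and the thresholding at zero are our own illustration):
```python
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_name = "dbmdz/electra-base-italian-xxl-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = ElectraForPreTraining.from_pretrained(model_name)

inputs = tokenizer("Roma è la capitale d'Italia.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # one score per input token

# Positive logits mark tokens the discriminator flags as "replaced".
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print(list(zip(tokens, (logits > 0).int().squeeze().tolist())))
```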
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT/ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "it", "license": "mit", "datasets": ["wikipedia"]}
| null |
dbmdz/electra-base-italian-xxl-cased-discriminator
|
[
"transformers",
"pytorch",
"electra",
"pretraining",
"it",
"dataset:wikipedia",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"it"
] |
TAGS
#transformers #pytorch #electra #pretraining #it #dataset-wikipedia #license-mit #endpoints_compatible #has_space #region-us
|
+ dbmdz BERT and ELECTRA models
===============================
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources Italian BERT and ELECTRA models
Italian BERT
============
The source data for the Italian BERT model consists of a recent Wikipedia dump and
various texts from the OPUS corpora collection. The final
training corpus has a size of 13GB and 2,050,057,573 tokens.
For sentence splitting, we use NLTK (which is faster than spaCy).
Our cased and uncased models are trained with an initial sequence length of 512
subwords for ~2-3M steps.
For the XXL Italian models, we use the same training data from OPUS and extend
it with data from the Italian part of the OSCAR corpus.
Thus, the final training corpus has a size of 81GB and 13,138,379,147 tokens.
Note: Unfortunately, a wrong vocab size was used when training the XXL models.
This explains the mismatch between the "real" vocab size of 31102 and the
vocab size specified in 'URL'. However, the model works, and all
evaluations were done under those circumstances.
See this issue for more information.
The Italian ELECTRA model was trained on the "XXL" corpus for 1M steps in total using a batch
size of 128. We largely followed the ELECTRA training procedure used for
BERTurk.
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
Results
-------
For results on downstream tasks like NER or PoS tagging, please refer to
this repository.
Usage
-----
With Transformers >= 2.3 our Italian BERT models can be loaded like:
To load the (recommended) Italian XXL BERT models, just use:
To load the Italian XXL ELECTRA model (discriminator), just use:
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT/ELECTRA models just open an issue
here
Acknowledgments
===============
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[] |
[
"TAGS\n#transformers #pytorch #electra #pretraining #it #dataset-wikipedia #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[
43
] |
[
"passage: TAGS\n#transformers #pytorch #electra #pretraining #it #dataset-wikipedia #license-mit #endpoints_compatible #has_space #region-us \n"
] |
[ …768-dimensional embedding vector omitted… ] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz BERT and ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources Italian BERT and ELECTRA models 🎉
# Italian BERT
The source data for the Italian BERT model consists of a recent Wikipedia dump and
various texts from the [OPUS corpora](http://opus.nlpl.eu/) collection. The final
training corpus has a size of 13GB and 2,050,057,573 tokens.
For sentence splitting, we use NLTK (which is faster than spaCy).
Our cased and uncased models are trained with an initial sequence length of 512
subwords for ~2-3M steps.
For the XXL Italian models, we use the same training data from OPUS and extend
it with data from the Italian part of the [OSCAR corpus](https://traces1.inria.fr/oscar/).
Thus, the final training corpus has a size of 81GB and 13,138,379,147 tokens.
Note: Unfortunately, a wrong vocab size was used when training the XXL models.
This explains the mismatch between the "real" vocab size of 31102 and the
vocab size specified in `config.json`. However, the model works, and all
evaluations were done under those circumstances.
See [this issue](https://github.com/dbmdz/berts/issues/7) for more information.
The Italian ELECTRA model was trained on the "XXL" corpus for 1M steps in total using a batch
size of 128. We largely followed the ELECTRA training procedure used for
[BERTurk](https://github.com/stefan-it/turkish-bert/tree/master/electra).
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| ---------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-italian-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/vocab.txt)
| `dbmdz/bert-base-italian-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-discriminator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-generator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-generator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/vocab.txt)
## Results
For results on downstream tasks like NER or PoS tagging, please refer to
[this repository](https://github.com/stefan-it/italian-bertelectra).
## Usage
With Transformers >= 2.3 our Italian BERT models can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
To load the (recommended) Italian XXL BERT models, just use:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-xxl-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
To load the Italian XXL ELECTRA model (discriminator), just use:
```python
from transformers import AutoModelWithLMHead, AutoTokenizer
model_name = "dbmdz/electra-base-italian-xxl-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)
```
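For masked-token prediction, the generator checkpoint is the natural fit, since the discriminator has no language-modeling head. A minimal sketch using the `fill-mask` pipeline follows; the example sentence is our own and the predictions shown will depend on the model:
```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/electra-base-italian-xxl-cased-generator",
)

# Print the top predictions for the masked token.
for prediction in fill_mask("Umberto Eco è stato un grande [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```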
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT/ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "it", "license": "mit", "datasets": ["wikipedia"]}
|
fill-mask
|
dbmdz/electra-base-italian-xxl-cased-generator
|
[
"transformers",
"pytorch",
"safetensors",
"electra",
"fill-mask",
"it",
"dataset:wikipedia",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"it"
] |
TAGS
#transformers #pytorch #safetensors #electra #fill-mask #it #dataset-wikipedia #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
+ dbmdz BERT and ELECTRA models
===============================
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources Italian BERT and ELECTRA models
Italian BERT
============
The source data for the Italian BERT model consists of a recent Wikipedia dump and
various texts from the OPUS corpora collection. The final
training corpus has a size of 13GB and 2,050,057,573 tokens.
For sentence splitting, we use NLTK (which is faster than spaCy).
Our cased and uncased models were trained with an initial sequence length of 512
subwords for ~2-3M steps.
For the XXL Italian models, we use the same training data from OPUS and extend
it with data from the Italian part of the OSCAR corpus.
Thus, the final training corpus has a size of 81GB and 13,138,379,147 tokens.
Note: Unfortunately, a wrong vocab size was used when training the XXL models.
This explains the mismatch of the "real" vocab size of 31102, compared to the
vocab size specified in 'URL'. However, the model is working and all
evaluations were done under those circumstances.
See this issue for more information.
The Italian ELECTRA model was trained on the "XXL" corpus for 1M steps in total using a batch
size of 128. We pretty much followed the ELECTRA training procedure used for
BERTurk.
Model weights
-------------
Currently only PyTorch-Transformers
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
Results
-------
For results on downstream tasks like NER or PoS tagging, please refer to
this repository.
Usage
-----
With Transformers >= 2.3 our Italian BERT models can be loaded like:
To load the (recommended) Italian XXL BERT models, just use:
To load the Italian XXL ELECTRA model (discriminator), just use:
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our BERT/ELECTRA models just open an issue
here
Acknowledgments
===============
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[] |
[
"TAGS\n#transformers #pytorch #safetensors #electra #fill-mask #it #dataset-wikipedia #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
54
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #electra #fill-mask #it #dataset-wikipedia #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.07164481282234192,
0.08390630036592484,
-0.006444564089179039,
0.025752101093530655,
0.1141873449087143,
0.04456048086285591,
0.11181459575891495,
0.08508214354515076,
0.12026519328355789,
-0.03483906760811806,
0.15949897468090057,
0.23163852095603943,
0.002101130783557892,
0.16099703311920166,
-0.07206914573907852,
-0.21764367818832397,
0.09519460797309875,
0.042331960052251816,
-0.04583408683538437,
0.1061888188123703,
0.10416077077388763,
-0.09442903846502304,
0.06017204374074936,
-0.0171620212495327,
-0.06896607577800751,
0.016371332108974457,
0.06710274517536163,
-0.13526302576065063,
0.16328075528144836,
0.027571355924010277,
0.18252764642238617,
0.03491882234811783,
-0.011141941882669926,
-0.1329667866230011,
0.04780164361000061,
-0.004255296196788549,
-0.08032695204019547,
0.06066456809639931,
-0.01497195940464735,
-0.019874459132552147,
0.009002108126878738,
0.05053425952792168,
0.037666093558073044,
0.03696652874350548,
-0.12435700744390488,
-0.17052577435970306,
-0.04985405504703522,
0.04841252416372299,
0.04383603483438492,
0.053227610886096954,
0.023200178518891335,
0.2140571027994156,
-0.11232849955558777,
0.07040533423423767,
0.11505935341119766,
-0.2923436462879181,
0.020197473466396332,
0.04885675758123398,
0.07028531283140182,
-0.07595013082027435,
-0.035197701305150986,
0.06224706768989563,
0.038019418716430664,
0.020605813711881638,
0.07922566682100296,
-0.049244921654462814,
-0.05028173699975014,
0.012738744728267193,
-0.048900894820690155,
-0.07860102504491806,
0.2314886748790741,
-0.009563463740050793,
0.03445986285805702,
-0.033382970839738846,
-0.10409917682409286,
0.017329327762126923,
-0.005029608495533466,
0.006080280989408493,
-0.012294305488467216,
0.05067014694213867,
0.01846706122159958,
-0.009187447838485241,
-0.14446212351322174,
0.021905936300754547,
-0.22443346679210663,
0.2158396691083908,
0.03643697872757912,
0.08866722881793976,
-0.1374357044696808,
0.035705775022506714,
0.0036871773190796375,
-0.11292128264904022,
0.006195473484694958,
-0.079037144780159,
0.061993177980184555,
-0.024255139753222466,
-0.028710244223475456,
0.013782051391899586,
0.15056075155735016,
0.2448902577161789,
0.017126036807894707,
-0.029358068481087685,
0.022864622995257378,
0.10192660987377167,
0.055454548448324203,
0.013787687756121159,
-0.023882532492280006,
-0.013443047180771828,
0.09629437327384949,
-0.10468938946723938,
0.06553760170936584,
-0.034330882132053375,
-0.09809435904026031,
-0.03672118857502937,
0.02843409590423107,
0.11595556885004044,
0.0682237297296524,
0.043626345694065094,
-0.08692498505115509,
0.03452272340655327,
0.13200189173221588,
-0.06987054646015167,
0.010999185964465141,
-0.011298940517008305,
0.02933676540851593,
0.051758404821157455,
0.0179911982268095,
-0.001458696904592216,
-0.008933208882808685,
0.10916203260421753,
-0.08533191680908203,
-0.038254816085100174,
-0.01992128975689411,
-0.07249121367931366,
0.07343397289514542,
-0.1254330575466156,
0.08430378884077072,
-0.2013050764799118,
-0.19844557344913483,
0.05818287283182144,
0.06580080091953278,
0.006020589265972376,
-0.03413975611329079,
0.06185217574238777,
0.0016167920548468828,
0.008859941735863686,
-0.06487464904785156,
-0.10077943652868271,
-0.08667583763599396,
0.10081929713487625,
-0.030178196728229523,
0.0759277269244194,
-0.15787756443023682,
0.022548379376530647,
-0.12025101482868195,
-0.005640014074742794,
-0.0769556313753128,
-0.08763238787651062,
-0.037417683750391006,
0.130620077252388,
-0.019774887710809708,
-0.027073802426457405,
-0.052901577204465866,
0.03376547992229462,
-0.0071620880626142025,
0.17319603264331818,
-0.08987627178430557,
-0.07106024026870728,
0.1951095014810562,
-0.1668645590543747,
-0.18284372985363007,
0.09554353356361389,
-0.0033552884124219418,
0.035096462815999985,
0.08050233870744705,
0.07257667928934097,
0.0168368648737669,
-0.1550779640674591,
0.005083343014121056,
0.07478640228509903,
-0.1444101780653,
-0.16055965423583984,
0.029474789276719093,
-0.008400899358093739,
-0.07834123075008392,
0.043621234595775604,
0.0500563308596611,
0.1080569326877594,
-0.061634473502635956,
-0.06585105508565903,
-0.04595291614532471,
-0.04906250536441803,
0.09975557774305344,
0.028966085985302925,
0.03770129010081291,
-0.07753617316484451,
-0.026611648499965668,
-0.0028386388439685106,
0.027267033234238625,
0.04738442227244377,
0.015035084448754787,
-0.11098185926675797,
0.14037059247493744,
-0.025039348751306534,
-0.003729368792846799,
-0.13405561447143555,
-0.09800941497087479,
-0.03005128726363182,
0.019810888916254044,
-0.03882335126399994,
0.05130109190940857,
0.06498022377490997,
-0.01735883206129074,
0.0032981070689857006,
-0.033049412071704865,
0.09322422742843628,
0.06970331817865372,
0.018872302025556564,
-0.1172449141740799,
0.02792869322001934,
-0.059524763375520706,
-0.018395977094769478,
-0.034526124596595764,
-0.0006635918980464339,
-0.024413762614130974,
0.12934662401676178,
-0.02579839713871479,
0.06042550504207611,
-0.05918736010789871,
0.0028256322257220745,
-0.05757541209459305,
0.010099183768033981,
0.0794532373547554,
0.025744648650288582,
-0.039294060319662094,
0.12763191759586334,
-0.09993383288383484,
0.3389323949813843,
0.20165316760540009,
-0.18236932158470154,
-0.012875119224190712,
0.014182084240019321,
-0.005048954859375954,
-0.0025798422284424305,
-0.017248403280973434,
-0.018147265538573265,
-0.04383418709039688,
0.00016214518109336495,
0.1244504451751709,
-0.037763841450214386,
0.005606768652796745,
0.036746665835380554,
-0.08188390731811523,
-0.04256030544638634,
0.0022801924496889114,
0.17884959280490875,
-0.20162628591060638,
0.18311960995197296,
0.2231827974319458,
0.02182730659842491,
0.12390073388814926,
-0.02993578091263771,
0.006347449496388435,
-0.0068907239474356174,
-0.060944024473428726,
-0.02607082948088646,
0.1086297333240509,
-0.1120503768324852,
0.03199790418148041,
0.07869045436382294,
-0.034200720489025116,
0.018218345940113068,
-0.13856984674930573,
-0.058677371591329575,
0.007769498974084854,
0.02447950281202793,
-0.07288563996553421,
0.09545168280601501,
0.017643995583057404,
0.09538061916828156,
-0.03867210075259209,
-0.10801302641630173,
0.10213685780763626,
0.01908387988805771,
-0.043898288160562515,
0.176002636551857,
-0.12109147757291794,
-0.31997957825660706,
-0.10578775405883789,
-0.10427101701498032,
0.056374046951532364,
0.024213943630456924,
0.0970819815993309,
-0.008532261475920677,
-0.08278770744800568,
0.04495816305279732,
-0.028892870992422104,
0.021064136177301407,
0.03655693680047989,
-0.05016443133354187,
0.04123619198799133,
-0.03983192890882492,
-0.083858422935009,
-0.07131709903478622,
0.0012565512442961335,
0.00649913540109992,
0.14704009890556335,
-0.045833393931388855,
0.070428766310215,
0.06190032139420509,
-0.004120092373341322,
0.025146935135126114,
-0.03797362744808197,
0.16083894670009613,
-0.061506204307079315,
0.04146070033311844,
0.20096847414970398,
-0.02838037721812725,
0.070624440908432,
0.21726390719413757,
0.04906821995973587,
-0.049734026193618774,
0.0027185159269720316,
-0.07692739367485046,
-0.09788897633552551,
-0.20805487036705017,
-0.1253647357225418,
-0.08829623460769653,
0.028565002605319023,
0.06616827845573425,
0.06422335654497147,
0.11362926661968231,
0.10559533536434174,
0.008709810674190521,
-0.07425529509782791,
-0.030325816944241524,
0.04447789862751961,
0.21492040157318115,
-0.005532060284167528,
0.10213708877563477,
-0.07710713893175125,
-0.09814510494470596,
0.07607772201299667,
0.04200562834739685,
0.12554484605789185,
0.11270707845687866,
0.03034055233001709,
0.06739816069602966,
0.18063780665397644,
0.11693930625915527,
0.15952514111995697,
0.07127469778060913,
-0.06158847734332085,
0.0009357071248814464,
0.0017306958325207233,
-0.02747580036520958,
0.011807351373136044,
0.06247422844171524,
-0.09083818644285202,
-0.03469359874725342,
-0.10394633561372757,
0.03899924457073212,
0.11785395443439484,
0.05779986456036568,
-0.25252652168273926,
-0.003381576621904969,
0.048853419721126556,
0.0006989024695940316,
-0.07417427003383636,
0.0460122749209404,
-0.031216496601700783,
-0.1028204932808876,
0.08724599331617355,
-0.05155908688902855,
0.06205407530069351,
0.02743891254067421,
0.03554137051105499,
-0.03508397564291954,
-0.06543984264135361,
0.027888629585504532,
0.07274032384157181,
-0.23243016004562378,
0.2779034376144409,
-0.006802688352763653,
0.026503371074795723,
-0.07738614827394485,
-0.018990717828273773,
0.029038656502962112,
0.14920498430728912,
0.1378416121006012,
0.01628807745873928,
-0.10685724765062332,
-0.11551901698112488,
-0.02898523025214672,
0.041869696229696274,
0.0030161465983837843,
0.013786929659545422,
-0.01483883149921894,
-0.0606110617518425,
-0.01302117295563221,
0.008467990905046463,
0.05874527990818024,
-0.04839164391160011,
-0.11936832964420319,
0.01981090009212494,
0.09641915559768677,
0.07277114689350128,
-0.033816564828157425,
-0.05965152382850647,
-0.10057482123374939,
0.14040878415107727,
-0.0803583487868309,
-0.07114725559949875,
-0.1087188571691513,
-0.06154245510697365,
0.0892554447054863,
-0.09463495016098022,
0.13025352358818054,
-0.07660584151744843,
0.023493483662605286,
-0.0956566110253334,
-0.15092197060585022,
0.13393794000148773,
-0.15492883324623108,
-0.03420400992035866,
-0.06711332499980927,
0.11288080364465714,
-0.036504365503787994,
0.009557296521961689,
0.06333157420158386,
0.04389437288045883,
-0.08677178621292114,
-0.05662073194980621,
0.03232797607779503,
-0.03223571181297302,
0.07697843015193939,
0.032244693487882614,
-0.05144922435283661,
-0.12453377991914749,
0.049526333808898926,
-0.04464603587985039,
0.219641774892807,
0.2588515877723694,
-0.06113922968506813,
0.11108298599720001,
0.21924051642417908,
-0.046835824847221375,
-0.3241901099681854,
-0.16295310854911804,
-0.17172938585281372,
-0.018509840592741966,
0.05311264842748642,
-0.11247405409812927,
0.09865482896566391,
0.03713472932577133,
-0.07952933758497238,
0.13157695531845093,
-0.14990174770355225,
-0.0772489383816719,
0.24831229448318481,
0.025423618033528328,
0.3647112548351288,
-0.11730469018220901,
-0.04283427074551582,
-0.05207574740052223,
-0.12956362962722778,
0.13583329319953918,
0.003451200667768717,
0.06134973466396332,
-0.019270289689302444,
0.03586742654442787,
0.0011483147973194718,
-0.0873611569404602,
0.11834454536437988,
-0.0817040354013443,
0.026440085843205452,
-0.10852675884962082,
-0.04333198443055153,
0.04919243976473808,
-0.0015060871373862028,
0.056245993822813034,
-0.028453262522816658,
0.013491474092006683,
-0.0047068106941878796,
-0.02932867966592312,
-0.08560873568058014,
0.14595893025398254,
0.019487185403704643,
-0.07359539717435837,
-0.006126811727881432,
0.006742860656231642,
-0.020615076646208763,
-0.0220117699354887,
0.21699009835720062,
0.041155342012643814,
0.16385364532470703,
0.0686965137720108,
0.042682401835918427,
-0.13206684589385986,
-0.05967201292514801,
-0.05340658500790596,
-0.09554421156644821,
0.0598098449409008,
-0.0043433369137346745,
0.03878066688776016,
0.09722063690423965,
-0.010956340469419956,
0.06741933524608612,
0.08105374872684479,
-0.01936442032456398,
-0.01169817429035902,
0.14973556995391846,
-0.1861867606639862,
0.004684587009251118,
0.024126358330249786,
0.04101283475756645,
0.04585626721382141,
0.057952962815761566,
0.09609351307153702,
-0.005568975582718849,
-0.0453629344701767,
-0.017901049926877022,
0.032842472195625305,
-0.031694598495960236,
0.061723217368125916,
0.07792439311742783,
0.039660271257162094,
-0.12765412032604218,
0.06318164616823196,
-0.02447199448943138,
-0.1547856479883194,
0.0024616862647235394,
0.031887710094451904,
-0.12279517203569412,
-0.12604065239429474,
0.004327827598899603,
0.03685592859983444,
-0.1474475860595703,
-0.11688199639320374,
-0.08174624294042587,
-0.12394677847623825,
0.03730838745832443,
0.19688917696475983,
0.09169767051935196,
0.07365425676107407,
0.01468422170728445,
-0.05260758474469185,
-0.031119929626584053,
0.02200881391763687,
-0.03298991546034813,
0.005195057485252619,
-0.10552626848220825,
-0.04083465039730072,
0.010988393798470497,
0.09445114433765411,
-0.08263596892356873,
-0.021108614280819893,
-0.14029166102409363,
0.029826192185282707,
-0.06042696163058281,
-0.010523905046284199,
-0.13640843331813812,
-0.05211547389626503,
0.028145235031843185,
-0.08198749274015427,
-0.040122538805007935,
-0.02479340322315693,
-0.089480921626091,
0.022782159969210625,
0.04700721800327301,
0.02286529541015625,
-0.11754554510116577,
-0.05826861038804054,
0.09704995900392532,
-0.018550768494606018,
0.08403932303190231,
0.07576822489500046,
-0.06362449377775192,
0.08615317940711975,
-0.18211349844932556,
-0.09883502125740051,
0.10314951092004776,
0.003713635727763176,
0.05384219437837601,
-0.03854231536388397,
0.028302345424890518,
0.10455779731273651,
-0.017266253009438515,
0.03351753577589989,
-0.017015691846609116,
-0.12800739705562592,
0.017437437549233437,
0.009323198348283768,
-0.12957769632339478,
-0.010349939577281475,
-0.11918876320123672,
0.10215594619512558,
-0.06857030093669891,
0.1911027878522873,
-0.0477205291390419,
0.0470740906894207,
-0.06490593403577805,
0.022377654910087585,
-0.05214659497141838,
-0.1367795765399933,
-0.0705442875623703,
-0.017493251711130142,
-0.019502149894833565,
-0.009097239933907986,
0.2627105712890625,
0.004880605731159449,
-0.03430989757180214,
0.07134558260440826,
0.07875377684831619,
-0.04719090834259987,
0.009511142037808895,
0.22870735824108124,
0.03952619433403015,
-0.048348210752010345,
-0.10308077186346054,
0.05377376452088356,
0.0011957904789596796,
-0.11940407752990723,
0.10018827766180038,
0.08757133781909943,
0.06303229928016663,
0.04478562995791435,
0.043998271226882935,
0.0058700330555438995,
-0.06758537143468857,
-0.2185790240764618,
-0.023247193545103073,
0.038567133247852325,
0.07135266065597534,
0.018168030306696892,
0.16530872881412506,
-0.02340870536863804,
0.02540036477148533,
-0.059763990342617035,
-0.012238415889441967,
-0.18288107216358185,
-0.17566335201263428,
-0.09337212890386581,
-0.05427420511841774,
0.04160875454545021,
-0.04127614200115204,
-0.0223892442882061,
0.14146284759044647,
0.02245190553367138,
-0.06483624875545502,
0.03827252984046936,
0.04876866191625595,
-0.021059466525912285,
-0.00326066417619586,
0.016829462721943855,
0.005440781824290752,
-0.0022678235545754433,
-0.04066353291273117,
-0.13687016069889069,
-0.002952427137643099,
-0.04335709288716316,
0.011200220324099064,
-0.05858976021409035,
0.07021981477737427,
-0.11136733740568161,
-0.10026732832193375,
-0.05168849602341652,
0.03071122244000435,
-0.02683607116341591,
0.0762396976351738,
0.006521879695355892,
0.058941617608070374,
0.07124900072813034,
0.14932821691036224,
-0.011830699630081654,
-0.15302477777004242,
-0.05432307347655296,
0.10901343077421188,
0.047360632568597794,
0.06928540021181107,
0.025290846824645996,
0.010212186723947525,
-0.04953363537788391,
0.2451527863740921,
0.29004931449890137,
0.018304554745554924,
0.06913518160581589,
-0.019583705812692642,
0.011936456896364689,
0.04133912920951843,
0.10496164858341217,
0.07826672494411469,
0.24394410848617554,
-0.08339714258909225,
-0.005155437160283327,
-0.08619523793458939,
-0.006809898652136326,
-0.12957650423049927,
-0.015295607969164848,
0.014185091480612755,
-0.02648928202688694,
-0.04264300316572189,
0.11563468724489212,
-0.13489307463169098,
0.04407075420022011,
0.05936067923903465,
-0.1293560117483139,
-0.04303991049528122,
-0.02820858173072338,
0.17299355566501617,
0.047323133796453476,
0.05286646634340286,
-0.06374029815196991,
-0.04835252836346626,
0.04710131511092186,
0.011386924423277378,
-0.21304340660572052,
-0.054945509880781174,
0.07935195416212082,
0.0474301353096962,
0.136273592710495,
-0.013210734352469444,
0.08441753685474396,
0.08901706337928772,
0.04030478745698929,
-0.07630271464586258,
0.08668774366378784,
0.041246168315410614,
-0.05108494684100151,
0.019051596522331238,
-0.0879296213388443,
0.017232952639460564,
-0.08498325198888779,
0.043469272553920746,
-0.0899803638458252,
0.04810729995369911,
-0.06652039289474487,
-0.055596042424440384,
-0.025187646970152855,
0.11078274995088577,
-0.04118617996573448,
0.0729590430855751,
-0.01388965081423521,
-0.021132029592990875,
-0.036853939294815063,
-0.06393906474113464,
0.024594470858573914,
0.08184278756380081,
-0.13625110685825348,
-0.09772065281867981,
-0.07980719208717346,
-0.021789947524666786,
0.044518060982227325,
0.007959896698594093,
-0.12809856235980988,
-0.03548061475157738,
-0.13671760261058807,
-0.020516321063041687,
-0.16799350082874298,
0.016543256118893623,
0.0935162827372551,
0.04042918235063553,
-0.004762436728924513,
0.025757255032658577,
0.010194357484579086,
0.029773375019431114,
-0.1340470016002655,
-0.11068984866142273
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish ELECTRA model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ELECTRA base model for Turkish 🎉
# Turkish ELECTRA model
We release a base ELEC**TR**A model for Turkish that was trained on the same data as *BERTurk*.
> ELECTRA is a new method for self-supervised language representation learning. It can be used to
> pre-train transformer networks using relatively little compute. ELECTRA models are trained to
> distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to
> the discriminator of a GAN.
More details about ELECTRA can be found in the [ICLR paper](https://openreview.net/forum?id=r1xMH1BtvB)
or in the [official ELECTRA repository](https://github.com/google-research/electra) on GitHub.
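As an illustration of what the discriminator learns, here is a minimal sketch (the Turkish sentence is our own and the scores are illustrative) that asks the model whether each input token looks original or replaced:
```python
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_name = "dbmdz/electra-base-turkish-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

inputs = tokenizer("Ankara Türkiye'nin başkentidir.", return_tensors="pt")
with torch.no_grad():
    logits = discriminator(**inputs).logits

# A positive logit means the discriminator believes the token was replaced.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, score in zip(tokens, logits[0]):
    print(f"{token}\t{score.item():+.2f}")
```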
## Stats
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-8 for 1M steps.
## Model weights
[Transformers](https://github.com/huggingface/transformers)
compatible weights for both PyTorch and TensorFlow are available.
| Model | Downloads
| ------------------------------------------------ | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/electra-base-turkish-cased-discriminator` | [`config.json`](https://cdn.huggingface.co/dbmdz/electra-base-turkish-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-turkish-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-turkish-cased-discriminator/vocab.txt)
## Usage
With Transformers >= 2.8 our ELECTRA base cased model can be loaded like:
```python
from transformers import AutoModelWithLMHead, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-base-turkish-cased-discriminator")
model = AutoModelWithLMHead.from_pretrained("dbmdz/electra-base-turkish-cased-discriminator")
```
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert/electra).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/electra-base-turkish-cased-discriminator
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"tr",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us
|
+ dbmdz Turkish ELECTRA model
=============================
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ELECTRA base model for Turkish
Turkish ELECTRA model
=====================
We release a base ELECTRA model for Turkish that was trained on the same data as *BERTurk*.
>
> ELECTRA is a new method for self-supervised language representation learning. It can be used to
> pre-train transformer networks using relatively little compute. ELECTRA models are trained to
> distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to
> the discriminator of a GAN.
>
>
>
More details about ELECTRA can be found in the ICLR paper
or in the official ELECTRA repository on GitHub.
Stats
-----
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-8 for 1M steps.
Model weights
-------------
Transformers
compatible weights for both PyTorch and TensorFlow are available.
Usage
-----
With Transformers >= 2.8 our ELECTRA base cased model can be loaded like:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our ELECTRA models just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us \n"
] |
[
37
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us \n"
] |
[
-0.05861099809408188,
0.005883581005036831,
-0.007683257572352886,
0.01050401572138071,
0.1385493129491806,
0.032392919063568115,
0.054023899137973785,
0.08081942051649094,
0.035687435418367386,
-0.04102092236280441,
0.15276306867599487,
0.232399120926857,
-0.0239882729947567,
0.028588522225618362,
-0.061818547546863556,
-0.27025726437568665,
0.041802648454904556,
0.09078606963157654,
-0.06467315554618835,
0.12250131368637085,
0.09992603957653046,
-0.0769590437412262,
0.055778585374355316,
-0.0011740676127374172,
-0.12458879500627518,
0.007188597694039345,
0.011086559854447842,
-0.09724237024784088,
0.17676666378974915,
0.07856621593236923,
0.1313212811946869,
0.06332097202539444,
-0.025700319558382034,
-0.1541755199432373,
0.03964284434914589,
-0.011196048930287361,
-0.1172606572508812,
0.03243924304842949,
0.0014171745860949159,
-0.04069536551833153,
0.166208878159523,
0.10752841830253601,
0.016911009326577187,
0.07205304503440857,
-0.20554456114768982,
-0.1624647080898285,
-0.0745515525341034,
0.08227041363716125,
0.05714043229818344,
0.0768786296248436,
0.024753326550126076,
0.20003430545330048,
-0.14801184833049774,
0.0695260763168335,
0.08220566809177399,
-0.3500400185585022,
-0.006324237212538719,
0.06154241785407066,
0.10042374581098557,
0.0054606045596301556,
-0.03618277236819267,
0.02890080399811268,
0.05691437050700188,
0.02767961286008358,
0.010218902491033077,
-0.06416044384241104,
-0.019355522468686104,
0.06984327733516693,
-0.12443090230226517,
-0.08097978681325912,
0.2509234547615051,
-0.011438771151006222,
0.009712930768728256,
0.047727297991514206,
-0.059253040701150894,
-0.05274515599012375,
0.022144461050629616,
-0.028856048360466957,
-0.022323323413729668,
0.09137516468763351,
0.0335056334733963,
-0.02145870216190815,
-0.14971984922885895,
0.021736452355980873,
-0.23197415471076965,
0.1910589188337326,
0.03386012464761734,
0.05724876746535301,
-0.16956987977027893,
0.0964091569185257,
0.025797298178076744,
-0.04511306807398796,
-0.020063506439328194,
-0.08362678438425064,
0.035964783281087875,
-0.048469338566064835,
-0.03162158653140068,
0.0625,
0.041315145790576935,
0.22322091460227966,
0.03946147486567497,
-0.002135366201400757,
0.01262836903333664,
0.1152714267373085,
-0.031889159232378006,
0.04206826165318489,
0.01616571843624115,
0.07046030461788177,
0.012387840077280998,
-0.1317649632692337,
-0.008984189480543137,
0.012339488603174686,
-0.10951486229896545,
-0.06378389894962311,
-0.03154749423265457,
0.12982891499996185,
-0.003205276792868972,
0.03561646491289139,
-0.07834482938051224,
0.02764315716922283,
0.05200176686048508,
-0.012534755282104015,
-0.04485943913459778,
-0.020614488050341606,
0.02160545252263546,
0.08195780962705612,
0.028419047594070435,
0.012121075764298439,
-0.020690428093075752,
0.0985795333981514,
-0.09141883254051208,
-0.04166746884584427,
-0.0019621718674898148,
-0.021910952404141426,
0.07833003997802734,
-0.14589586853981018,
0.09448972344398499,
-0.17583277821540833,
-0.12421081215143204,
0.03839629516005516,
0.06442876905202866,
0.0142550989985466,
-0.015422634780406952,
0.03343493491411209,
-0.04222394526004791,
-0.03841765224933624,
-0.05294029787182808,
-0.05860840901732445,
-0.0686926618218422,
0.12931139767169952,
-0.03727142512798309,
0.007449334487318993,
-0.11858849227428436,
0.04384096711874008,
-0.08292753249406815,
-0.000030275525205070153,
-0.03650026023387909,
-0.03608031943440437,
-0.012376311235129833,
0.18659330904483795,
-0.023658154532313347,
-0.09258444607257843,
-0.12711112201213837,
0.04868162423372269,
-0.03746766597032547,
0.13997432589530945,
-0.06437350809574127,
-0.08124920725822449,
0.2185097187757492,
-0.12629152834415436,
-0.20666983723640442,
0.06788549572229385,
-0.004485560581088066,
0.09793362766504288,
0.07633427530527115,
0.13724525272846222,
0.06975381076335907,
-0.17459318041801453,
0.09122815728187561,
0.09996297955513,
-0.17746597528457642,
-0.1806361973285675,
0.05717217177152634,
-0.02587689459323883,
-0.08864537626504898,
0.0005505099543370306,
-0.004762877244502306,
0.12460057437419891,
-0.08444449305534363,
-0.04104410111904144,
-0.012544686906039715,
-0.002469615312293172,
0.04392916336655617,
0.06086611747741699,
0.03924363851547241,
-0.1011456549167633,
-0.05136562138795853,
0.02984556555747986,
0.016124237328767776,
0.058323610574007034,
0.029178159311413765,
-0.11423160135746002,
0.042107030749320984,
0.04165387153625488,
-0.03973345085978508,
-0.14576828479766846,
-0.058662496507167816,
-0.05485336482524872,
0.07964149862527847,
0.030870774760842323,
0.24981987476348877,
0.04672081768512726,
-0.04008972644805908,
-0.009696691296994686,
-0.021264798939228058,
0.08329746872186661,
0.026441285386681557,
-0.011117080226540565,
-0.08014033734798431,
0.0249272882938385,
-0.04416198283433914,
-0.02650575526058674,
-0.10898125171661377,
0.03524579852819443,
0.11520867049694061,
0.10302460193634033,
-0.013859816826879978,
0.07446960359811783,
-0.05287126079201698,
0.03337080404162407,
-0.025858500972390175,
-0.006216938141733408,
0.10796436667442322,
0.05370244011282921,
-0.05870413780212402,
0.11610135436058044,
-0.09346634894609451,
0.33908188343048096,
0.1807519942522049,
-0.19452568888664246,
-0.05104917287826538,
-0.008406221866607666,
-0.07843230664730072,
0.013892152346670628,
0.06361021846532822,
0.0032344337087124586,
0.07565552741289139,
-0.0021452864166349173,
0.12087003141641617,
-0.035743772983551025,
-0.06681826710700989,
0.018154511228203773,
-0.03493683040142059,
-0.035763468593358994,
0.06294766813516617,
0.1556628793478012,
-0.24024388194084167,
0.1468108594417572,
0.2272484004497528,
0.0749354213476181,
0.1280985325574875,
-0.08369775861501694,
-0.0014803250087425113,
0.0005325355450622737,
0.03187645226716995,
-0.011886157095432281,
0.07212597131729126,
-0.1976832151412964,
0.0001717980339890346,
0.0461178794503212,
-0.005102223716676235,
0.04561777785420418,
-0.18541298806667328,
-0.11017479747533798,
0.030533863231539726,
0.00235375901684165,
-0.0685095340013504,
0.137012779712677,
-0.028816062957048416,
0.08894412964582443,
-0.010273819789290428,
-0.10678882151842117,
0.12516643106937408,
0.021041443571448326,
-0.09212040156126022,
0.1564621478319168,
-0.10842899233102798,
-0.20250974595546722,
-0.10983860492706299,
-0.028664320707321167,
0.10637716948986053,
0.0024925731122493744,
0.12420479953289032,
-0.03002515621483326,
-0.039359550923109055,
0.06774091720581055,
-0.002766458783298731,
-0.12545546889305115,
0.011840625666081905,
-0.04191145673394203,
0.06808041781187057,
-0.0808519646525383,
-0.11032489687204361,
-0.07526286691427231,
-0.032326363027095795,
-0.06532617658376694,
0.10300689935684204,
-0.10158174484968185,
0.05282466113567352,
0.11051923781633377,
0.002821343019604683,
0.055243536829948425,
-0.07603069394826889,
0.18844035267829895,
-0.08936873078346252,
0.01774509996175766,
0.17959026992321014,
-0.005757506471127272,
0.07511201500892639,
0.13402609527111053,
0.05142059549689293,
-0.07230348885059357,
0.007274297531694174,
-0.0674576386809349,
-0.10292960703372955,
-0.26747000217437744,
-0.0851212814450264,
-0.12939418852329254,
0.01998973824083805,
0.00364684802480042,
0.07993164658546448,
0.13616091012954712,
0.08434935659170151,
0.004557931795716286,
-0.07797807455062866,
-0.005770571529865265,
0.06387504935264587,
0.2793932259082794,
-0.015556451864540577,
0.07204017788171768,
-0.1167387068271637,
-0.06947706639766693,
0.09143878519535065,
0.014418491162359715,
0.18382707238197327,
0.1289607435464859,
0.005192040931433439,
0.1303403377532959,
0.15892836451530457,
0.1049061268568039,
0.09420283138751984,
0.016589021310210228,
-0.051918528974056244,
-0.03040500171482563,
0.008910728618502617,
-0.018939509987831116,
0.03289582207798958,
0.049791377037763596,
-0.11020305752754211,
-0.054079752415418625,
-0.17222437262535095,
0.02111734449863434,
0.11667342483997345,
0.02563261054456234,
-0.1887086033821106,
-0.02259455807507038,
0.03071954846382141,
-0.019485916942358017,
-0.04911273717880249,
0.09637565910816193,
-0.026065826416015625,
-0.12888571619987488,
0.07847193628549576,
-0.04871496930718422,
0.07929779589176178,
0.06563542783260345,
0.06417816132307053,
0.03808039054274559,
-0.1156422421336174,
0.025719434022903442,
0.09404852241277695,
-0.31257253885269165,
0.3077980577945709,
-0.006809825077652931,
0.011994853615760803,
-0.04334330931305885,
-0.03944801539182663,
0.0064111389219760895,
0.20608985424041748,
0.13563716411590576,
0.010539514012634754,
-0.0728854089975357,
-0.08863142132759094,
0.0865129604935646,
0.02651088498532772,
0.07468948513269424,
-0.01873064413666725,
-0.04570149630308151,
-0.04483257979154587,
0.02062719315290451,
0.02277568355202675,
0.033870160579681396,
-0.031164279207587242,
-0.10378522425889969,
0.03912532702088356,
-0.007347439881414175,
0.0782884731888771,
-0.04456476494669914,
-0.02841477282345295,
-0.05493967607617378,
0.07182090729475021,
-0.15645691752433777,
-0.051118407398462296,
-0.10272152721881866,
-0.13893668353557587,
0.051439106464385986,
-0.08829248696565628,
0.08523569256067276,
-0.04295189678668976,
-0.07763536274433136,
-0.07065507024526596,
-0.14803582429885864,
0.1382049173116684,
-0.11951083689928055,
-0.021533850580453873,
-0.04775160551071167,
0.18714779615402222,
-0.03373394161462784,
0.03114781528711319,
0.011781677603721619,
0.0095354113727808,
-0.05576273426413536,
-0.08631312102079391,
0.0009413790539838374,
-0.09073225408792496,
0.07072651386260986,
-0.047334909439086914,
-0.0674246996641159,
0.05711057037115097,
0.030822236090898514,
-0.05806637555360794,
0.19977843761444092,
0.23948143422603607,
-0.053435128182172775,
0.12802354991436005,
0.14840000867843628,
-0.05945441871881485,
-0.23797965049743652,
-0.04731312766671181,
-0.15530382096767426,
-0.043426766991615295,
0.03283531218767166,
-0.15546724200248718,
0.03742826357483864,
0.08683471381664276,
-0.03302839770913124,
0.1288791298866272,
-0.27221187949180603,
-0.06400095671415329,
0.15461160242557526,
0.002583677414804697,
0.3781931698322296,
-0.13022807240486145,
-0.045894134789705276,
0.055980514734983444,
-0.23525001108646393,
0.13002179563045502,
-0.000507331860717386,
0.0567743182182312,
-0.011408084072172642,
-0.010513070039451122,
-0.0020655517000705004,
-0.059702955186367035,
0.11817490309476852,
0.027159979566931725,
0.04377518594264984,
-0.1096094399690628,
-0.0783454179763794,
0.09588336199522018,
0.024232594296336174,
0.01918364129960537,
0.014331398531794548,
0.023586994037032127,
-0.16334541141986847,
-0.025969577953219414,
-0.10034101456403732,
0.09853698313236237,
0.001341068185865879,
-0.09256350994110107,
-0.04338794946670532,
0.026196593418717384,
-0.013371402397751808,
-0.04627860337495804,
0.1874774992465973,
-0.013359828852117062,
0.18923340737819672,
0.01906132698059082,
0.15405185520648956,
-0.16067762672901154,
-0.06776446104049683,
-0.08994469791650772,
-0.05440984293818474,
0.048213131725788116,
-0.06260605901479721,
0.004598570521920919,
0.15415364503860474,
0.0006949116359464824,
0.06505808979272842,
0.09880312532186508,
-0.00664450041949749,
0.001606155768968165,
0.13660547137260437,
-0.15568186342716217,
-0.10053468495607376,
-0.047706425189971924,
0.032070375978946686,
0.09687922894954681,
0.1050226166844368,
0.0850791484117508,
-0.00791708193719387,
0.002020999789237976,
0.0019751053769141436,
-0.04119418188929558,
-0.08169881254434586,
0.01022866927087307,
0.09833689779043198,
0.021932465955615044,
-0.08829081803560257,
0.020862668752670288,
-0.006525723729282618,
-0.1852513998746872,
-0.0324665866792202,
0.09789416939020157,
-0.12203185260295868,
-0.11600426584482193,
-0.058960165828466415,
0.03837605565786362,
-0.2484338879585266,
-0.05497286841273308,
-0.020266149193048477,
-0.12955501675605774,
0.09360747039318085,
0.28902438282966614,
0.06786639243364334,
0.11616130918264389,
-0.021566400304436684,
-0.0013668235624209046,
-0.0022938260808587074,
-0.042854130268096924,
-0.038682691752910614,
0.00426688976585865,
-0.12265821546316147,
0.08366742730140686,
0.0037358489353209734,
0.13106955587863922,
-0.06905385106801987,
-0.07619559019804001,
-0.13937772810459137,
0.07819639891386032,
-0.07285302132368088,
-0.06436524540185928,
-0.10806272178888321,
-0.03112115152180195,
0.02448553591966629,
-0.11287640035152435,
-0.046837836503982544,
-0.0169993843883276,
-0.1260463148355484,
0.081724151968956,
0.06479965895414352,
0.05749804899096489,
-0.059798769652843475,
-0.03691520169377327,
0.09532593190670013,
-0.01607115939259529,
0.10467593371868134,
0.06305039674043655,
-0.05756572261452675,
0.10770197957754135,
-0.09382795542478561,
-0.05528214946389198,
0.09551241248846054,
-0.0042087542824447155,
0.032034654170274734,
0.03954492136836052,
0.03175737336277962,
0.06422111392021179,
-0.018816864117980003,
0.0591857023537159,
-0.0912519097328186,
-0.1251654475927353,
0.02031831257045269,
0.023685423657298088,
-0.1325400471687317,
-0.04516003653407097,
-0.07999031990766525,
0.11155621707439423,
-0.012353334575891495,
0.14478126168251038,
-0.008559519425034523,
0.03160591423511505,
-0.06785569339990616,
-0.0016739077400416136,
-0.0049416315741837025,
-0.10955435037612915,
-0.010171468369662762,
-0.08227506279945374,
-0.028599973767995834,
-0.005865692161023617,
0.20030300319194794,
-0.022435178980231285,
-0.0506771020591259,
0.08601896464824677,
0.04089942201972008,
-0.021788785234093666,
-0.01613939180970192,
0.248198002576828,
0.05444559082388878,
-0.01808932237327099,
-0.1440548449754715,
0.05017741024494171,
-0.04315269738435745,
-0.1890053153038025,
0.15995106101036072,
0.11648929119110107,
-0.022368496283888817,
0.048295799642801285,
0.029820607975125313,
0.0093945087864995,
-0.08485109359025955,
-0.18416666984558105,
0.06984227150678635,
0.013523516245186329,
0.003167259506881237,
0.10881166905164719,
0.20763033628463745,
-0.07750587910413742,
0.016894595697522163,
-0.042349591851234436,
-0.0036280525382608175,
-0.1642192155122757,
-0.13105911016464233,
-0.022578762844204903,
-0.0727090984582901,
0.051097311079502106,
-0.030553782358765602,
0.01841375231742859,
0.15214644372463226,
0.06584577262401581,
-0.051947738975286484,
0.021414224058389664,
0.02917594648897648,
-0.0526249073445797,
0.005096277222037315,
-0.0004534249019343406,
0.050447601824998856,
-0.10685541480779648,
-0.019434280693531036,
-0.09905914217233658,
-0.0876966342329979,
-0.06408388167619705,
-0.00012773348134942353,
-0.07047653943300247,
-0.013428273610770702,
-0.12114056944847107,
-0.05371568351984024,
-0.03273484483361244,
0.08361051231622696,
0.00412009097635746,
0.086480051279068,
-0.006343976128846407,
0.0252322219312191,
0.058051832020282745,
0.16941778361797333,
-0.0205397829413414,
-0.1145453006029129,
-0.033476945012807846,
0.14142508804798126,
0.08873463422060013,
0.0797295942902565,
0.029802607372403145,
0.01780109293758869,
-0.0022786473855376244,
0.26137542724609375,
0.23033958673477173,
-0.015446661040186882,
0.05509280040860176,
0.027154341340065002,
0.028465047478675842,
0.1023523285984993,
0.12468189001083374,
0.08850894123315811,
0.25036174058914185,
-0.13312506675720215,
-0.04953673854470253,
-0.06366101652383804,
0.0467824712395668,
-0.06575153768062592,
0.049131881445646286,
0.013948683626949787,
-0.06580666452646255,
-0.010579888708889484,
0.1309019923210144,
-0.1290314793586731,
0.03941289708018303,
0.06998421251773834,
-0.12955467402935028,
-0.03608769550919533,
-0.045644134283065796,
0.11567727476358414,
0.031160173937678337,
0.08868777006864548,
-0.05869605019688606,
-0.10865940898656845,
0.04806296527385712,
0.05674128979444504,
-0.2633470296859741,
-0.08255340158939362,
0.14011400938034058,
0.08798375725746155,
0.03361082077026367,
-0.04807840660214424,
0.06403250992298126,
0.07459012418985367,
0.06269991397857666,
-0.04021207243204117,
0.07417741417884827,
0.059080976992845535,
-0.028697647154331207,
-0.06297820061445236,
-0.11308249086141586,
0.024638449773192406,
-0.0658857524394989,
0.02725204825401306,
-0.1673862338066101,
0.0688876137137413,
-0.017041780054569244,
-0.05683276057243347,
-0.04214475676417351,
0.08546668291091919,
-0.04455987364053726,
0.08360309153795242,
0.029924502596259117,
0.00001848252759373281,
-0.04375547543168068,
-0.0745522752404213,
-0.025419410318136215,
0.10643206536769867,
-0.10300762206315994,
-0.05566250905394554,
-0.02315659262239933,
-0.04442897066473961,
0.0038731112144887447,
-0.010689720511436462,
-0.10320927202701569,
-0.05836177617311478,
-0.07618073374032974,
0.016821207478642464,
-0.1454089730978012,
0.05880630388855934,
0.05806706100702286,
0.041929736733436584,
0.006459210533648729,
-0.009131353348493576,
0.0034616088960319757,
0.043670497834682465,
-0.1203746348619461,
-0.047705162316560745
] |
null | null |
transformers
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've also trained an ELECTRA (cased) model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELEC**TR**A base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-base-turkish-mc4-cased-discriminator")
model = AutoModel.from_pretrained("dbmdz/electra-base-turkish-mc4-cased-discriminator")
```
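The discriminator's encoder can also serve as a plain feature extractor. A minimal sketch follows; the mean-pooling step is our own choice for turning token states into one sentence vector, not part of the released model:
```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "dbmdz/electra-base-turkish-mc4-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Merhaba dünya!", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)

# Mean-pool over the real (non-padding) tokens to get one 768-dim vector.
mask = inputs["attention_mask"].unsqueeze(-1)
sentence_embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```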
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"]}
| null |
dbmdz/electra-base-turkish-mc4-cased-discriminator
|
[
"transformers",
"pytorch",
"tf",
"tensorboard",
"electra",
"pretraining",
"tr",
"dataset:allenai/c4",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #tensorboard #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
 model on the recently released Turkish part of the
multiligual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with Transformers:
# Citation
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
|
[
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #tensorboard #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us \n",
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
50,
131,
93,
68,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #tensorboard #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us \n# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.015353217720985413,
0.0706590786576271,
-0.004084402695298195,
0.014056571759283543,
0.12302916496992111,
0.046651441603899,
0.17830675840377808,
0.0852278620004654,
0.011632305569946766,
0.023099545389413834,
0.021349512040615082,
-0.03841232508420944,
0.09261178970336914,
0.04138152301311493,
0.08457323908805847,
-0.23043794929981232,
-0.022715678438544273,
-0.12837377190589905,
-0.04571488872170448,
0.07260657101869583,
0.11958690732717514,
-0.04630047455430031,
0.11782906204462051,
0.022643152624368668,
-0.05244692787528038,
0.006832714192569256,
-0.04713035002350807,
-0.07535584270954132,
0.10010150074958801,
0.06587109714746475,
0.0808052122592926,
-0.06238127872347832,
-0.029217826202511787,
-0.13236607611179352,
0.03592567890882492,
0.06315209716558456,
-0.011140085756778717,
0.058727774769067764,
0.11417823284864426,
-0.05407039076089859,
0.21789458394050598,
-0.11789069324731827,
-0.023137586191296577,
-0.004930658265948296,
-0.12309285253286362,
-0.036589812487363815,
-0.18345405161380768,
0.1044553890824318,
0.05694212764501572,
0.05261601135134697,
0.009012776426970959,
0.04578174650669098,
-0.020467489957809448,
0.0510634183883667,
0.06167716532945633,
-0.2860781252384186,
-0.06354992091655731,
-0.04624636843800545,
0.014559753239154816,
0.07513267546892166,
-0.03267272934317589,
0.07711820304393768,
-0.012268650345504284,
-0.01801150292158127,
-0.020642785355448723,
-0.024593381211161613,
-0.012218408286571503,
-0.07293029129505157,
-0.027664542198181152,
-0.04538474231958389,
0.2505740821361542,
-0.0013945205137133598,
-0.06388100981712341,
-0.06303966045379639,
0.03332702815532684,
0.04046317934989929,
0.023705540224909782,
-0.009452342055737972,
-0.011272524483501911,
-0.024671047925949097,
0.0785852000117302,
-0.12156688421964645,
-0.13565921783447266,
0.022594284266233444,
-0.048160724341869354,
0.18434883654117584,
0.04718007892370224,
0.061947301030159,
-0.047321617603302,
0.08154740184545517,
-0.026746930554509163,
-0.027064982801675797,
-0.010659630410373211,
-0.035420335829257965,
-0.10721463710069656,
-0.027225954458117485,
-0.024478795006871223,
-0.16071105003356934,
-0.018629558384418488,
0.1067938283085823,
-0.007687716279178858,
0.015504647977650166,
0.017081404104828835,
0.03606081381440163,
0.013941876590251923,
0.043502118438482285,
-0.07341040670871735,
-0.060640960931777954,
0.02143915183842182,
-0.12388115376234055,
0.026666538789868355,
-0.02087983675301075,
-0.01438554935157299,
0.015606301836669445,
-0.03251925855875015,
0.09196501970291138,
0.00019687942403834313,
0.051883846521377563,
0.001608652644790709,
-0.033486731350421906,
0.17554214596748352,
-0.08268675208091736,
0.011542029678821564,
0.025020355358719826,
-0.06308761984109879,
-0.007177313789725304,
0.033650726079940796,
-0.02296234667301178,
-0.053981978446245193,
0.0738188773393631,
-0.015558306127786636,
-0.00639431644231081,
-0.04384640231728554,
-0.08275288343429565,
0.07710140198469162,
-0.058673080056905746,
-0.02317926660180092,
-0.14325451850891113,
-0.09987371414899826,
-0.038734812289476395,
0.043524619191884995,
-0.054989419877529144,
0.04060383513569832,
0.047792911529541016,
-0.03628901392221451,
0.013480397872626781,
0.006552941165864468,
0.06843549013137817,
-0.009646416641771793,
0.0027643884532153606,
-0.12436560541391373,
0.048965997993946075,
-0.0873728021979332,
0.0161582063883543,
-0.045885223895311356,
0.004924141801893711,
-0.12213189899921417,
0.0743866115808487,
-0.01787940226495266,
0.009939844720065594,
-0.11241696029901505,
0.012738998048007488,
-0.10178574174642563,
-0.02906651981174946,
0.029706189408898354,
0.05590950697660446,
-0.15987275540828705,
-0.02330896072089672,
0.06684620678424835,
-0.07943151146173477,
-0.0511338971555233,
0.12766575813293457,
-0.01747187413275242,
0.08351438492536545,
0.09485476464033127,
0.12226693332195282,
0.08062300831079483,
-0.03232046216726303,
-0.1273641288280487,
-0.09732189029455185,
-0.08436455577611923,
0.05289338901638985,
0.06425704061985016,
-0.01442097406834364,
0.04503645747900009,
0.0178506039083004,
-0.04826060310006142,
0.029647717252373695,
-0.03564317896962166,
-0.04206978529691696,
0.056346043944358826,
-0.05309755727648735,
-0.03649942949414253,
-0.057458218187093735,
0.0507168285548687,
-0.06701560318470001,
-0.05148712545633316,
-0.07346824556589127,
0.07520438730716705,
-0.02503027394413948,
0.028307732194662094,
-0.06617637723684311,
0.08286471664905548,
-0.08876803517341614,
0.024631725624203682,
-0.11375229060649872,
-0.07190698385238647,
0.03284044191241264,
-0.06091029942035675,
0.0666344091296196,
-0.02712583728134632,
0.03127974271774292,
0.04545062407851219,
-0.029421333223581314,
0.05420367792248726,
0.02915799990296364,
-0.018800262361764908,
-0.07580113410949707,
-0.1406620591878891,
-0.026279935613274574,
-0.00904848612844944,
0.13324572145938873,
-0.049169111996889114,
0.010325567796826363,
0.04911334067583084,
0.11145606637001038,
0.03576640412211418,
-0.052852071821689606,
0.02456546574831009,
0.04704799875617027,
0.009900939650833607,
-0.03799193352460861,
-0.0025118025951087475,
0.012986022047698498,
-0.072711281478405,
0.14020729064941406,
-0.1723378300666809,
-0.1165672093629837,
0.04939243197441101,
0.05777019262313843,
-0.07752229273319244,
0.06378733366727829,
-0.025113394483923912,
-0.06003841012716293,
-0.0021532466635107994,
-0.024915479123592377,
0.1968662291765213,
0.03789934143424034,
0.07151278108358383,
-0.0506603829562664,
-0.05382464826107025,
0.01581753045320511,
-0.04369482398033142,
-0.03163427487015724,
0.07327736169099808,
-0.01506806630641222,
-0.17307086288928986,
0.04222065582871437,
0.09091759473085403,
0.06562303006649017,
0.18530547618865967,
0.06694359332323074,
-0.11122636497020721,
-0.07252626866102219,
0.01884777471423149,
0.04793041944503784,
0.11588773876428604,
-0.029487621039152145,
-0.0022582467645406723,
0.04511306434869766,
0.005955032538622618,
0.04304767772555351,
-0.035775694996118546,
0.04294014722108841,
0.0232132188975811,
-0.05149226635694504,
-0.007202604319900274,
0.03348766639828682,
-0.019245360046625137,
0.09112740308046341,
-0.01152096502482891,
0.06888390332460403,
0.0055063581094145775,
-0.02671034447848797,
-0.07643425464630127,
0.11260158568620682,
-0.07856699079275131,
-0.29490041732788086,
-0.14459407329559326,
0.0709172859787941,
-0.017613572999835014,
-0.011143514886498451,
0.025151466950774193,
-0.04212069883942604,
-0.014837007038295269,
-0.05596068874001503,
0.012472867034375668,
0.05465739965438843,
-0.04516956955194473,
-0.052337490022182465,
-0.01778716780245304,
0.020688770338892937,
-0.09911222010850906,
-0.003150729928165674,
-0.011425250209867954,
-0.08295445889234543,
0.051424745470285416,
0.07258657366037369,
0.09565964341163635,
0.006754525005817413,
-0.09549834579229355,
-0.026811549440026283,
0.01040564477443695,
0.09063441306352615,
-0.1384500414133072,
0.09927744418382645,
0.03628368675708771,
-0.06551122665405273,
0.026175210252404213,
0.05723726749420166,
0.03528900817036629,
-0.022585850208997726,
0.012076891027390957,
0.05093735456466675,
-0.047663360834121704,
-0.28916072845458984,
-0.07451970130205154,
-0.04858814552426338,
-0.06118819862604141,
-0.03950709104537964,
0.062252163887023926,
0.030188074335455894,
0.02771560288965702,
-0.07509420812129974,
0.003407470416277647,
-0.010191153734922409,
0.03474484011530876,
0.024900376796722412,
0.013267884962260723,
0.0050429352559149265,
-0.11387097090482712,
0.01516776904463768,
0.12637832760810852,
0.06939530372619629,
0.20524421334266663,
0.03189757466316223,
0.20992451906204224,
0.07338611036539078,
0.04732027277350426,
0.02537371963262558,
0.051964111626148224,
0.002589720766991377,
0.03563852980732918,
-0.01827472634613514,
-0.06475375592708588,
-0.015275864861905575,
0.008585273288190365,
0.03374801203608513,
-0.0367417074739933,
-0.029800355434417725,
0.04262378439307213,
0.12478148192167282,
0.23777909576892853,
-0.030778978019952774,
-0.15359842777252197,
-0.09291515499353409,
-0.02167581580579281,
-0.05994647368788719,
-0.026617391034960747,
-0.08550078421831131,
0.15995121002197266,
-0.14636793732643127,
0.02800040878355503,
-0.038557011634111404,
0.05513251572847366,
-0.16577236354351044,
-0.007571104448288679,
0.04224010556936264,
0.026270365342497826,
-0.03172242268919945,
0.06499659270048141,
-0.08454542607069016,
0.12911932170391083,
0.04165562614798546,
0.12143213301897049,
-0.10549259185791016,
-0.006763789802789688,
0.07096565514802933,
-0.07063619047403336,
0.08264877647161484,
0.028924908488988876,
-0.10914872586727142,
-0.049960069358348846,
-0.22078602015972137,
0.04841354861855507,
0.07574980705976486,
-0.1205284595489502,
0.06985443830490112,
0.01848679780960083,
0.019398439675569534,
-0.05163652077317238,
0.0102284736931324,
-0.18738970160484314,
-0.18586906790733337,
0.07135029882192612,
-0.003895194735378027,
0.01302376389503479,
-0.033382598310709,
-0.009733052924275398,
-0.034301936626434326,
0.1477932333946228,
-0.07211335748434067,
-0.07787860929965973,
-0.06654085218906403,
-0.007026298902928829,
0.09344084560871124,
-0.07063519954681396,
-0.03160277381539345,
0.03667712211608887,
0.029057521373033524,
-0.020835475996136665,
-0.058593567460775375,
-0.038240719586610794,
-0.07484469562768936,
-0.11679290980100632,
-0.033813830465078354,
0.10619182884693146,
0.08107076585292816,
0.060152098536491394,
0.028397666290402412,
-0.01855011098086834,
0.05108131468296051,
-0.09650480002164841,
-0.001822377205826342,
0.07690597325563431,
0.017577337101101875,
0.07707288861274719,
-0.03559001535177231,
-0.00016677600797265768,
-0.09482382237911224,
-0.0552225336432457,
0.10484801977872849,
0.15480774641036987,
-0.03997400775551796,
0.14854015409946442,
0.1085360199213028,
-0.12989184260368347,
-0.1018804982304573,
-0.044391728937625885,
0.001975922379642725,
0.04767467454075813,
-0.005200899206101894,
-0.14906319975852966,
0.03217953070998192,
0.1491633504629135,
0.0001294437824981287,
0.06461231410503387,
-0.267721563577652,
-0.09035040438175201,
0.043442558497190475,
-0.012072013691067696,
0.032126858830451965,
-0.1265491545200348,
-0.04697998985648155,
-0.018338941037654877,
0.013898339122533798,
0.07926159352064133,
-0.0875345766544342,
0.10068202018737793,
-0.014320483431220055,
0.0007570015732198954,
0.02104957588016987,
-0.01467438880354166,
0.15801061689853668,
0.01538836769759655,
0.059092938899993896,
-0.09125010669231415,
0.020260997116565704,
0.1302034556865692,
-0.01772347278892994,
0.11662312597036362,
-0.005337150767445564,
-0.003784543601796031,
-0.08814831078052521,
-0.00877359788864851,
-0.05480990186333656,
0.08193080872297287,
-0.04176896810531616,
0.008362102322280407,
-0.03613396733999252,
0.07238929718732834,
0.01681596226990223,
0.04309959337115288,
-0.06966650485992432,
0.002011675387620926,
-0.018876539543271065,
0.03280990943312645,
0.1604432761669159,
0.055339522659778595,
-0.009135653264820576,
-0.006774278823286295,
-0.01659386046230793,
0.08489032089710236,
-0.0213856752961874,
0.004154998809099197,
0.08709152042865753,
-0.05078154057264328,
0.0803270936012268,
-0.01884971559047699,
-0.1663840264081955,
0.08640161901712418,
0.12796707451343536,
-0.15044519305229187,
-0.15856514871120453,
-0.007563361432403326,
-0.038850393146276474,
-0.00673679681494832,
-0.020858779549598694,
0.160866379737854,
-0.08089442551136017,
-0.0037818215787410736,
-0.017881879583001137,
0.048221297562122345,
-0.026159893721342087,
0.11056793481111526,
0.005688616074621677,
-0.03438163921236992,
-0.054935723543167114,
0.17535267770290375,
0.11998314410448074,
-0.09769613295793533,
0.05421917140483856,
0.08055257797241211,
-0.06757596135139465,
-0.06080389767885208,
-0.07444477081298828,
0.06535673141479492,
-0.013857333920896053,
-0.11492225527763367,
0.02282675914466381,
-0.02428179793059826,
-0.021469902247190475,
-0.0024046110920608044,
-0.035331081598997116,
0.08879624307155609,
-0.018138080835342407,
-0.010251805186271667,
-0.07741117477416992,
0.0736459344625473,
0.05441407486796379,
-0.025231311097741127,
-0.04753093048930168,
0.07019658386707306,
-0.026171527802944183,
-0.01774005964398384,
-0.00033027821336872876,
-0.030744854360818863,
-0.02017943561077118,
-0.05565847083926201,
-0.10587912797927856,
-0.01932542212307453,
-0.09299620985984802,
-0.02312896028161049,
0.046851955354213715,
0.03313005715608597,
-0.0243497584015131,
-0.005698830354958773,
-0.04891668260097504,
-0.052059173583984375,
-0.015402302145957947,
0.11179280281066895,
-0.10964354127645493,
-0.023493601009249687,
0.039352402091026306,
-0.03118024580180645,
0.13368159532546997,
0.10500887781381607,
0.008681056089699268,
-0.018577344715595245,
-0.1133384183049202,
0.037168264389038086,
-0.03215954825282097,
-0.01162358932197094,
0.02006012387573719,
-0.14425137639045715,
-0.004948064684867859,
0.019358988851308823,
-0.026140153408050537,
0.02294875867664814,
0.0971502959728241,
-0.08056224882602692,
0.037908848375082016,
-0.00196098699234426,
0.027865875512361526,
-0.07873573154211044,
0.038415078073740005,
0.05419863015413284,
0.04177350550889969,
0.07424160838127136,
-0.06988667696714401,
0.023155920207500458,
-0.07855058461427689,
0.04360990971326828,
-0.019068924710154533,
-0.03726378455758095,
-0.03928866982460022,
-0.0011273445561528206,
0.053945183753967285,
-0.01760941557586193,
0.12210100144147873,
0.012620910070836544,
-0.01869392953813076,
-0.009073671884834766,
-0.01647898182272911,
-0.1326337456703186,
-0.004351872485131025,
0.02094823122024536,
-0.005529812537133694,
-0.05281207710504532,
-0.06622340530157089,
-0.018269577994942665,
-0.03507132828235626,
-0.07041770964860916,
0.15843898057937622,
0.12560148537158966,
0.12229717522859573,
0.0752287209033966,
0.043896839022636414,
-0.08462843298912048,
-0.05589193478226662,
0.015159035101532936,
-0.055218007415533066,
0.07974547892808914,
-0.0173474308103323,
-0.046850115060806274,
0.17627225816249847,
-0.16312089562416077,
0.0413600355386734,
-0.004293443635106087,
-0.05178172141313553,
-0.01102098636329174,
-0.18957030773162842,
-0.005944435019046068,
0.02461143583059311,
-0.011717166751623154,
-0.11618474870920181,
0.04477978125214577,
0.07118552178144455,
0.032877907156944275,
-0.06056239455938339,
0.14550194144248962,
-0.12188341468572617,
-0.06144820526242256,
0.08560939878225327,
0.002502489136531949,
0.054810781031847,
-0.05301060527563095,
-0.012543471530079842,
-0.044703349471092224,
0.06289248168468475,
0.052413493394851685,
0.08521361649036407,
0.11231080442667007,
0.04332151263952255,
-0.018684007227420807,
-0.09810750931501389,
-0.008906429633498192,
-0.007338983938097954,
0.08217482268810272,
0.13120751082897186,
0.06435685604810715,
-0.03730400279164314,
-0.0019808534998446703,
0.11160098016262054,
0.020187899470329285,
0.028899118304252625,
-0.13561499118804932,
0.01768442615866661,
-0.015098849311470985,
-0.035767119377851486,
-0.0004601407272275537,
-0.09518396109342575,
0.002413801848888397,
0.11015386134386063,
0.285123348236084,
0.01921176351606846,
0.012611798010766506,
-0.022166116163134575,
0.0026643567252904177,
-0.01836979202926159,
0.10333222150802612,
-0.03253224864602089,
0.14235515892505646,
-0.04599380120635033,
0.11732365190982819,
-0.015297899954020977,
0.0316501222550869,
-0.06428724527359009,
0.15942569077014923,
-0.054388899356126785,
-0.04939822107553482,
0.010182485915720463,
0.10225541144609451,
-0.06975550204515457,
-0.3409145176410675,
0.016538627445697784,
-0.055527329444885254,
-0.10467559099197388,
0.03660969063639641,
-0.02960924059152603,
0.06800540536642075,
0.09437350183725357,
-0.008879750967025757,
-0.03598632663488388,
0.16656777262687683,
0.03368375450372696,
-0.10076765716075897,
-0.058457206934690475,
0.09052867442369461,
-0.051730360835790634,
0.21113546192646027,
0.019363146275281906,
0.05783810839056969,
0.05871045961976051,
0.009923632256686687,
-0.13789591193199158,
-0.027413705363869667,
0.0005810051807202399,
0.025351809337735176,
0.07440166920423508,
0.1428762823343277,
0.026410117745399475,
0.041355427354574203,
0.09421931207180023,
0.03622359782457352,
0.06486745923757553,
-0.0011425013653934002,
0.07687072455883026,
-0.03787244111299515,
0.05924210324883461,
-0.04782306030392647,
0.15263143181800842,
0.19453676044940948,
-0.017078272998332977,
0.02918095700442791,
-0.0043367380276322365,
-0.023871244862675667,
0.07268532365560532,
0.017813144251704216,
-0.028777219355106354,
-0.13685348629951477,
0.018131906166672707,
-0.08044248819351196,
0.06426879018545151,
-0.10822007060050964,
-0.07777261734008789,
-0.0009092087857425213,
-0.03995027765631676,
0.010805975645780563,
0.06108001247048378,
0.09180983901023865,
0.0453421026468277,
-0.0245176013559103,
0.023023102432489395,
-0.01358070969581604,
0.04903988540172577,
-0.0966326892375946,
-0.05110722780227661
] |
null | null |
transformers
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've also trained an ELECTRA (cased) model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELEC**TR**A base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-base-turkish-mc4-cased-generator")
model = AutoModel.from_pretrained("dbmdz/electra-base-turkish-mc4-cased-generator")
```
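Since this row tags the generator checkpoint as `fill-mask` (see the metadata below), a masked-token query is the most direct smoke test. The following sketch is our own addition, assuming a recent 🤗 Transformers release with the `pipeline` API; the example sentence is taken from the row's widget configuration:

```python
from transformers import pipeline

# Hedged sketch, not from the original card: the masked sentence is the
# widget example from this row's metadata ("[MASK] sözcüğü Türkçe kökenlidir").
fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/electra-base-turkish-mc4-cased-generator",
)

for prediction in fill_mask("[MASK] sözcüğü Türkçe kökenlidir", top_k=5):
    print(prediction["token_str"], round(prediction["score"], 4))
```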
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"], "widget": [{"text": "[MASK] s\u00f6zc\u00fc\u011f\u00fc T\u00fcrk\u00e7e k\u00f6kenlidir"}]}
|
fill-mask
|
dbmdz/electra-base-turkish-mc4-cased-generator
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"fill-mask",
"tr",
"dataset:allenai/c4",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
 model on the recently released Turkish part of the
multilingual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with /Transformers:
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
|
[
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
61,
131,
93,
68,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ELECTRA\n\nIn addition to the ELECTRA base model, we also trained an ELECTRA model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.01672000251710415,
0.09372977167367935,
-0.003923008218407631,
0.0070352754555642605,
0.10476633161306381,
0.03672634810209274,
0.15382041037082672,
0.08253436535596848,
-0.021457986906170845,
0.040962353348731995,
0.023216454312205315,
-0.018155701458454132,
0.08214781433343887,
0.06956004351377487,
0.08375296741724014,
-0.20446577668190002,
-0.033680692315101624,
-0.14561600983142853,
-0.02144704759120941,
0.07392977178096771,
0.10125170648097992,
-0.04602678492665291,
0.11439166218042374,
0.011667541228234768,
-0.03759496286511421,
0.012885739095509052,
-0.05022271350026131,
-0.08570186048746109,
0.09663894772529602,
0.0554770790040493,
0.04370167851448059,
-0.05377057194709778,
-0.0019995816983282566,
-0.1326538473367691,
0.029473204165697098,
0.05371446907520294,
-0.0220214631408453,
0.0518602654337883,
0.11435277760028839,
-0.03524867445230484,
0.21525302529335022,
-0.12919054925441742,
-0.021024594083428383,
-0.0036945606116205454,
-0.08174058049917221,
-0.03418556600809097,
-0.20093590021133423,
0.10519306361675262,
0.05641228333115578,
0.07415110617876053,
0.021561546251177788,
0.08257348835468292,
0.014892791397869587,
0.07246048003435135,
0.08690948784351349,
-0.3017558455467224,
-0.06327587366104126,
-0.06383150070905685,
0.022362416610121727,
0.041417330503463745,
-0.01686815917491913,
0.08171547949314117,
-0.01987661048769951,
0.0060342298820614815,
-0.0036310418508946896,
-0.03192133083939552,
-0.035727061331272125,
-0.07549142837524414,
-0.04730463773012161,
-0.0353408046066761,
0.25754067301750183,
-0.005259188823401928,
-0.06539314240217209,
-0.06837940216064453,
0.011559943668544292,
0.06867852061986923,
0.024181604385375977,
-0.015028800815343857,
0.001960998633876443,
-0.02781122364103794,
0.05167512595653534,
-0.14554081857204437,
-0.12560749053955078,
0.037760213017463684,
-0.07474644482135773,
0.18781505525112152,
0.06340206414461136,
0.04595698416233063,
-0.009153647348284721,
0.07366229593753815,
-0.023200316354632378,
-0.014426031149923801,
-0.0024338995572179556,
-0.012976568192243576,
-0.07229921966791153,
-0.02179354429244995,
-0.053922638297080994,
-0.2066117227077484,
-0.013126589357852936,
0.05657076835632324,
-0.015447390265762806,
0.004366040695458651,
-0.007402101065963507,
0.02799496427178383,
0.0026054526679217815,
0.06297473609447479,
-0.09320288896560669,
-0.06649258732795715,
0.04402996599674225,
-0.14330239593982697,
0.04744730144739151,
-0.011076892726123333,
-0.009000151418149471,
0.03907022252678871,
-0.005126158241182566,
0.10346788167953491,
0.028745261952280998,
0.06027107685804367,
0.039071984589099884,
-0.026682326570153236,
0.19844864308834076,
-0.10615748912096024,
-0.018036292865872383,
-0.006919822655618191,
-0.08167019486427307,
-0.014525972306728363,
0.057696737349033356,
-0.030276987701654434,
-0.07450945675373077,
0.05802752822637558,
-0.021298848092556,
-0.002287951298058033,
-0.009130824357271194,
-0.07703916728496552,
0.07583460956811905,
-0.03710291534662247,
-0.04301554337143898,
-0.15381044149398804,
-0.1101386770606041,
-0.022879892960190773,
0.03023587167263031,
-0.051752619445323944,
0.040246348828077316,
0.049539804458618164,
-0.031834058463573456,
0.02415546216070652,
-0.005697699263691902,
0.051397230476140976,
-0.0030400631949305534,
0.012540780939161777,
-0.13707013428211212,
0.016938399523496628,
-0.08749556541442871,
0.012560325674712658,
-0.03035222738981247,
0.008469360880553722,
-0.11801698058843613,
0.0648927092552185,
-0.0008518446120433509,
0.010713815689086914,
-0.13416078686714172,
-0.030151233077049255,
-0.052389319986104965,
-0.04442353546619415,
0.03559103608131409,
0.07048359513282776,
-0.13222584128379822,
-0.015736395493149757,
0.08905016630887985,
-0.06262413412332535,
-0.05347316339612007,
0.11239825189113617,
0.010037416592240334,
0.04586944356560707,
0.10909971594810486,
0.09456173330545425,
0.061492059379816055,
-0.08038511872291565,
-0.13830019533634186,
-0.09033206105232239,
-0.06855320185422897,
0.046650663018226624,
0.0505681075155735,
-0.03727903962135315,
0.082373708486557,
0.020975271239876747,
-0.023362359032034874,
0.049370259046554565,
-0.03245919197797775,
-0.021925998851656914,
0.04881636053323746,
-0.07559411227703094,
-0.020939316600561142,
-0.0671585276722908,
0.055933043360710144,
-0.05152374878525734,
-0.045052897185087204,
-0.0723605677485466,
0.07116608321666718,
-0.011617275886237621,
0.05356412008404732,
-0.06321647763252258,
0.08898834884166718,
-0.0874297246336937,
0.018804408609867096,
-0.11254438012838364,
-0.14305900037288666,
0.04527220129966736,
-0.0616026297211647,
0.058516502380371094,
-0.04142879694700241,
0.03157021850347519,
0.05838119611144066,
-0.03821302205324173,
0.04212374985218048,
0.02593228593468666,
-0.016140678897500038,
-0.07089013606309891,
-0.1193559318780899,
-0.02038835734128952,
-0.00521680386736989,
0.07322821021080017,
0.025308601558208466,
0.004272509831935167,
0.04295848309993744,
0.11616692692041397,
0.020401855930685997,
-0.057512301951646805,
0.02850605919957161,
0.02115708403289318,
0.02152201347053051,
-0.0410708449780941,
-0.023043859750032425,
0.012001682072877884,
-0.08309482783079147,
0.17942751944065094,
-0.17497946321964264,
-0.08126439154148102,
0.0550386942923069,
0.08428830653429031,
-0.07668517529964447,
0.07993388921022415,
-0.03770117834210396,
-0.06302274763584137,
0.040379807353019714,
-0.010670389048755169,
0.1844514161348343,
0.04085429385304451,
0.0828152596950531,
-0.054449018090963364,
-0.0827837809920311,
0.02970297448337078,
-0.020836293697357178,
-0.05394819378852844,
0.038825128227472305,
-0.021116705611348152,
-0.19856153428554535,
0.020847823470830917,
0.0979209914803505,
0.10407094657421112,
0.18100175261497498,
0.059294961392879486,
-0.11009439080953598,
-0.08780615776777267,
0.025987323373556137,
0.0466981865465641,
0.0981002002954483,
0.012227004393935204,
-0.004632954485714436,
0.035172585397958755,
0.015555773861706257,
0.046696215867996216,
-0.007531259674578905,
0.051050394773483276,
0.00712093710899353,
-0.0438474677503109,
0.034702278673648834,
0.054328467696905136,
-0.021539291366934776,
0.09374265372753143,
0.01035561878234148,
0.09523916244506836,
0.001158855389803648,
-0.01636340096592903,
-0.07464735209941864,
0.0985645279288292,
-0.1213923990726471,
-0.25299057364463806,
-0.12418920546770096,
0.03249407187104225,
-0.04777771607041359,
-0.022425659000873566,
0.011657008901238441,
-0.028183814138174057,
-0.020442912355065346,
-0.07509162276983261,
0.01961929351091385,
0.011169607751071453,
-0.03280707448720932,
-0.06101971119642258,
-0.03159321844577789,
0.04198265075683594,
-0.0975642055273056,
0.007298773154616356,
-0.0009006889304146171,
-0.11503062397241592,
0.013491246849298477,
0.09542801976203918,
0.09092191606760025,
-0.02031002938747406,
-0.07054702937602997,
-0.029069462791085243,
-0.010841569863259792,
0.09456225484609604,
-0.11714190989732742,
0.11547857522964478,
0.04516015574336052,
-0.0436222143471241,
0.02597915008664131,
0.03321487456560135,
0.023366764187812805,
-0.04376351833343506,
0.017413495108485222,
0.04981851577758789,
-0.02772895246744156,
-0.27994513511657715,
-0.10236381739377975,
-0.04664693400263786,
-0.03813289478421211,
-0.05419289693236351,
0.07732013612985611,
0.01798873208463192,
0.011288790963590145,
-0.08787402510643005,
0.013337753713130951,
-0.023941339924931526,
0.033743515610694885,
0.0704137310385704,
-0.00012346140283625573,
-0.006481501739472151,
-0.1020096093416214,
0.011284452863037586,
0.1529211401939392,
0.0331893190741539,
0.2148015797138214,
-0.005936353467404842,
0.153509721159935,
0.07025478780269623,
0.08281944692134857,
-0.016246328130364418,
0.0611637644469738,
0.010805733501911163,
0.04650915414094925,
-0.00969260185956955,
-0.0752577930688858,
0.010728107765316963,
0.005486363545060158,
0.07954731583595276,
-0.04162142425775528,
-0.01913115195930004,
0.00713209668174386,
0.10802914202213287,
0.24871212244033813,
-0.038860220462083817,
-0.1267874836921692,
-0.09754932671785355,
-0.03984180837869644,
-0.08542450517416,
-0.005765189882367849,
-0.06620462983846664,
0.16157466173171997,
-0.15820014476776123,
0.08680573105812073,
-0.03945225104689598,
0.026686299592256546,
-0.13384385406970978,
-0.035305481404066086,
0.0841377004981041,
-0.001745319226756692,
-0.02360270544886589,
0.054443810135126114,
-0.061968665570020676,
0.12465329468250275,
0.04703715071082115,
0.08262094110250473,
-0.08517675846815109,
0.012035658583045006,
0.07183974981307983,
-0.04442722722887993,
0.09557461738586426,
0.029056169092655182,
-0.10369039326906204,
-0.021025661379098892,
-0.22215937077999115,
0.046154238283634186,
0.10749910771846771,
-0.11603628844022751,
0.06842772662639618,
0.01812242530286312,
0.00040669305599294603,
-0.0580846332013607,
0.039910558611154556,
-0.17884229123592377,
-0.18075264990329742,
0.07886712998151779,
0.0027334694750607014,
-0.013906323350965977,
-0.05518043413758278,
-0.02669220231473446,
-0.06017102301120758,
0.16288729012012482,
-0.05026296526193619,
-0.07155444473028183,
-0.07010800391435623,
-0.011938422918319702,
0.11995658278465271,
-0.08122025430202484,
0.002723535755649209,
0.018663983792066574,
0.007460875902324915,
-0.022662170231342316,
-0.08789297193288803,
-0.02247605472803116,
-0.07395140081644058,
-0.13966138660907745,
-0.030585002154111862,
0.12679047882556915,
0.08369562774896622,
0.04042356088757515,
0.014426308684051037,
-0.003530040616169572,
0.060065858066082,
-0.09458934515714645,
0.008261390030384064,
0.08432744443416595,
0.044695209711790085,
0.07445260882377625,
-0.026883967220783234,
-0.025278404355049133,
-0.07281241565942764,
-0.045128025114536285,
0.08782704919576645,
0.10615625977516174,
-0.03109603561460972,
0.14986838400363922,
0.09087485074996948,
-0.1161184087395668,
-0.1035776436328888,
-0.07745342701673508,
-0.0028003582265228033,
0.007602691184729338,
0.002474613720551133,
-0.1455521136522293,
0.031594593077898026,
0.08722709864377975,
-0.007219802122563124,
0.0923260748386383,
-0.23610171675682068,
-0.09166282415390015,
0.00629328703507781,
0.0035594559740275145,
0.00946000125259161,
-0.11468973755836487,
-0.05536409839987755,
-0.002479350892826915,
0.029683800414204597,
0.05028700456023216,
-0.07885961979627609,
0.08882016688585281,
-0.017559608444571495,
0.04290017858147621,
0.020278088748455048,
-0.019682418555021286,
0.11435864865779877,
-0.004285165574401617,
0.03961384668946266,
-0.11602018773555756,
0.03833451122045517,
0.13304826617240906,
-0.012840948067605495,
0.10912258177995682,
-0.0006756003713235259,
0.022831223905086517,
-0.04666649177670479,
-0.00747389392927289,
-0.06857670843601227,
0.09713689982891083,
-0.045606691390275955,
-0.0029293273109942675,
-0.04975002631545067,
0.07241204380989075,
0.04263995587825775,
0.021722188219428062,
-0.08453627675771713,
0.028745390474796295,
-0.008267249912023544,
0.042223650962114334,
0.10441283136606216,
0.09438449889421463,
-0.0021717974450439215,
-0.028692465275526047,
-0.026612717658281326,
0.05048689246177673,
0.002063911175355315,
-0.00929208192974329,
0.0703963115811348,
-0.05814685299992561,
0.04271276667714119,
-0.041098710149526596,
-0.1734580397605896,
0.08910562843084335,
0.10075540095567703,
-0.1681966632604599,
-0.15647144615650177,
-0.01367423590272665,
-0.053361255675554276,
0.012878982350230217,
0.012896555475890636,
0.1876790076494217,
-0.07717420905828476,
-0.004629233386367559,
-0.040381740778684616,
0.045921340584754944,
-0.029101531952619553,
0.10061971843242645,
0.0030988401267677546,
-0.05179116874933243,
-0.05103156715631485,
0.17272542417049408,
0.10955268889665604,
-0.09319635480642319,
0.0693112164735794,
0.08977370709180832,
-0.07279165089130402,
-0.05037909373641014,
-0.07395056635141373,
0.04596002772450447,
-0.004370514769107103,
-0.09435407817363739,
0.018970048055052757,
-0.01083583664149046,
0.0016251696506515145,
0.028675006702542305,
-0.04728570953011513,
0.09385442733764648,
-0.01554157119244337,
-0.005448519717901945,
-0.05190605670213699,
0.07682081311941147,
0.02581213042140007,
-0.040380898863077164,
-0.058050017803907394,
0.080438032746315,
-0.0281855296343565,
0.01770651899278164,
-0.0028838159050792456,
-0.02269795723259449,
0.03659169003367424,
-0.06328243017196655,
-0.06799428910017014,
-0.039342403411865234,
-0.09290029108524323,
-0.024428637698292732,
0.02541215531527996,
0.026280051097273827,
-0.027498217299580574,
0.004421016667038202,
-0.03908459469676018,
-0.05450413003563881,
-0.016862595453858376,
0.11105116456747055,
-0.1382325142621994,
-0.025561751797795296,
0.011662949807941914,
-0.030563363805413246,
0.1627308875322342,
0.1110268160700798,
-0.0034206847194582224,
0.02667544037103653,
-0.09411972016096115,
0.026420412585139275,
-0.04535485804080963,
-0.01003266591578722,
0.025709494948387146,
-0.11524511873722076,
-0.0046484824270009995,
0.02728891931474209,
-0.028095252811908722,
0.024213772267103195,
0.07164660096168518,
-0.09151152521371841,
-0.0017344546504318714,
0.018443956971168518,
0.030277902260422707,
-0.07285333424806595,
0.023441776633262634,
0.03627615049481392,
0.020822513848543167,
0.037128932774066925,
-0.07873071730136871,
0.03723469749093056,
-0.0778086856007576,
0.0287235826253891,
-0.01405472680926323,
-0.03391890600323677,
-0.052385661751031876,
-0.0009450927027501166,
0.058157604187726974,
-0.017941882833838463,
0.12014306336641312,
-0.010408326052129269,
-0.017937911674380302,
-0.0010118263307958841,
-0.007704016286879778,
-0.1350010633468628,
-0.016664814203977585,
0.004213819745928049,
-0.027065400034189224,
-0.047748636454343796,
-0.059365641325712204,
-0.024890350177884102,
-0.04750810191035271,
-0.12577468156814575,
0.14634418487548828,
0.09353527426719666,
0.16455601155757904,
0.0727042704820633,
0.048525623977184296,
-0.10721708834171295,
-0.05430877208709717,
-0.023943422362208366,
-0.06683040410280228,
0.11756404489278793,
-0.010246281512081623,
-0.05632228031754494,
0.11330055445432663,
-0.17594598233699799,
0.03696008771657944,
0.007613878231495619,
-0.03134869411587715,
0.007888148538768291,
-0.15841396152973175,
-0.0058875922113657,
0.02388858050107956,
0.000022260292098508216,
-0.09539702534675598,
0.06049202382564545,
0.031221646815538406,
0.0389825776219368,
-0.056112464517354965,
0.14895661175251007,
-0.15212969481945038,
-0.07409609109163284,
0.099180206656456,
0.024755489081144333,
0.03248688578605652,
-0.03280915319919586,
-0.013651203364133835,
-0.05717502161860466,
0.05709223076701164,
0.04848300293087959,
0.08064047247171402,
0.0937192365527153,
0.06809840351343155,
-0.030855899676680565,
-0.08127755671739578,
-0.00002734843837970402,
-0.004499050788581371,
0.08394734561443329,
0.11192229390144348,
0.04220190271735191,
-0.03479284420609474,
-0.013964075595140457,
0.11290726065635681,
0.009181332774460316,
0.027527550235390663,
-0.11738524585962296,
-0.008986388333141804,
0.0008747957181185484,
-0.06760933995246887,
0.0222151055932045,
-0.08111969381570816,
-0.014138785190880299,
0.11368504911661148,
0.2736422121524811,
0.028104837983846664,
0.024028722196817398,
0.004503738600760698,
0.004106106702238321,
-0.01103960070759058,
0.08884742110967636,
-0.007579491473734379,
0.1576550304889679,
-0.015827812254428864,
0.09895965456962585,
-0.03655637428164482,
0.046071432530879974,
-0.04461710527539253,
0.15429915487766266,
-0.07608012109994888,
-0.032283391803503036,
0.019721198827028275,
0.10784640163183212,
-0.04130050539970398,
-0.3676508367061615,
0.03977806866168976,
-0.06381343305110931,
-0.09434738755226135,
0.05114297568798065,
0.007649312261492014,
0.07953915745019913,
0.08149868994951248,
0.015771109610795975,
-0.019963758066296577,
0.19304555654525757,
0.03431978076696396,
-0.12409725785255432,
-0.06718230247497559,
0.09032618254423141,
-0.08755236119031906,
0.23022405803203583,
0.013987202197313309,
0.03363407030701637,
0.05290868878364563,
-0.0006575409206561744,
-0.12936967611312866,
0.00054706702940166,
0.03367698937654495,
-0.003998149186372757,
0.0725562572479248,
0.1573520451784134,
0.026736002415418625,
0.05173693224787712,
0.06523805856704712,
0.06485286355018616,
0.06752040237188339,
0.023067360743880272,
0.06299470365047455,
-0.05393412709236145,
0.08426860719919205,
-0.07216336578130722,
0.1644097864627838,
0.19176870584487915,
-0.016197988763451576,
0.037987325340509415,
-0.021385017782449722,
-0.026171673089265823,
0.05595998466014862,
0.006309266202151775,
-0.0040273359045386314,
-0.132248193025589,
0.026983456686139107,
-0.11088615655899048,
0.08413052558898926,
-0.11358178406953812,
-0.06274149566888809,
0.00879693403840065,
-0.0485035702586174,
-0.012131379917263985,
0.07530120015144348,
0.046554166823625565,
0.00978891272097826,
-0.02115517482161522,
0.08181551843881607,
-0.01596236415207386,
0.02059772238135338,
-0.11086062341928482,
-0.03507247194647789
] |
null | null |
transformers
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've also trained an ELECTRA (uncased) model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELEC**TR**A base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
tokenizer = AutoTokenizer.from_pretrained("electra-base-turkish-mc4-uncased-discriminator")
model = AutoModel.from_pretrained("electra-base-turkish-mc4-uncased-discriminator")
```
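Because this checkpoint is the ELECTRA discriminator rather than the generator, a replaced-token-detection probe may be more illustrative than plain feature extraction. The sketch below is our own assumption-laden example (the Turkish sentence is arbitrary and lowercased for the uncased model), not part of the original card:

```python
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

# Hedged sketch, not from the original card: the Turkish sentence is an
# arbitrary example, lowercased because this checkpoint is uncased.
model_name = "dbmdz/electra-base-turkish-mc4-uncased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

inputs = tokenizer("bu sözcük türkçe kökenlidir", return_tensors="pt")
with torch.no_grad():
    logits = discriminator(**inputs).logits  # shape: (1, sequence_length)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, score in zip(tokens, logits[0].tolist()):
    # Higher scores mean "this token looks replaced" to the discriminator.
    print(f"{token}\t{score:+.2f}")
```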
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"]}
| null |
dbmdz/electra-base-turkish-mc4-uncased-discriminator
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"tr",
"dataset:allenai/c4",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
 model on the recently released Turkish part of the
multilingual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with /Transformers:
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
|
[
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us \n",
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
46,
131,
94,
73,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #dataset-allenai/c4 #license-mit #endpoints_compatible #region-us \n# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.03293319046497345,
0.07106632739305496,
-0.004130590707063675,
0.023138951510190964,
0.1271773725748062,
0.05517469346523285,
0.14898528158664703,
0.09769503772258759,
0.016270536929368973,
0.019930463284254074,
0.02780170738697052,
-0.005476463586091995,
0.08854171633720398,
0.04035564139485359,
0.049201980233192444,
-0.2093619853258133,
-0.017106635496020317,
-0.13134974241256714,
-0.008934813551604748,
0.07202456891536713,
0.12022290378808975,
-0.04158783704042435,
0.10740168392658234,
0.014318432658910751,
-0.03566710278391838,
0.010087837465107441,
-0.03063923306763172,
-0.05301561951637268,
0.09556317329406738,
0.06646262109279633,
0.07380267232656479,
-0.06271584331989288,
-0.03068780153989792,
-0.12768884003162384,
0.03645946830511093,
0.07103326171636581,
-0.015672467648983,
0.06485986709594727,
0.11264852434396744,
-0.022441036999225616,
0.18115781247615814,
-0.15046323835849762,
-0.02897004410624504,
-0.0030280177015811205,
-0.11447978764772415,
-0.045954279601573944,
-0.1951715350151062,
0.12713685631752014,
0.07449124753475189,
0.06000107526779175,
-0.0037383735179901123,
0.058939408510923386,
0.015534935519099236,
0.061790402978658676,
0.057083770632743835,
-0.3007607161998749,
-0.06069737300276756,
-0.043701305985450745,
0.020279159769415855,
0.04543645307421684,
-0.040661443024873734,
0.0717773586511612,
-0.007547637913376093,
-0.018498891964554787,
-0.039972249418497086,
-0.04152538627386093,
-0.04339320585131645,
-0.07190912961959839,
-0.02199571020901203,
-0.042330510914325714,
0.22514981031417847,
-0.0015801803674548864,
-0.07548419386148453,
-0.08676248788833618,
0.023541832342743874,
0.01738622412085533,
0.041223011910915375,
0.00019170751329511404,
-0.017436888068914413,
-0.00967797078192234,
0.07808654755353928,
-0.12328717857599258,
-0.12549826502799988,
0.03550424054265022,
-0.046391990035772324,
0.19161458313465118,
0.049548178911209106,
0.051228687167167664,
-0.03871053457260132,
0.0952000692486763,
-0.022169990465044975,
-0.034265629947185516,
-0.0011562337167561054,
-0.03369108587503433,
-0.1124371737241745,
-0.02681979350745678,
-0.03397634997963905,
-0.14065496623516083,
-0.03320126608014107,
0.10290203243494034,
-0.05466040223836899,
0.010517831891775131,
0.03941405564546585,
0.04002959281206131,
0.01105538196861744,
0.02547425590455532,
-0.08481858670711517,
-0.03996460139751434,
0.02791702374815941,
-0.11415498703718185,
0.0285338144749403,
-0.02172430232167244,
-0.0021883926820009947,
0.0004986350541003048,
-0.02466398850083351,
0.108972929418087,
0.013503423891961575,
0.04090505838394165,
-0.01769919879734516,
-0.04289443418383598,
0.17296119034290314,
-0.0855867862701416,
0.007667990401387215,
0.01764031872153282,
-0.07725175470113754,
-0.013839606195688248,
0.0419183224439621,
-0.022986536845564842,
-0.06824705749750137,
0.09620741754770279,
-0.006934765726327896,
-0.019627608358860016,
-0.03453444689512253,
-0.09349680691957474,
0.07699575275182724,
-0.07560090720653534,
-0.021231070160865784,
-0.12882623076438904,
-0.12345469743013382,
-0.03826049715280533,
0.038606591522693634,
-0.04435471072793007,
0.03115951269865036,
0.04264999181032181,
-0.03749099001288414,
0.011931456625461578,
0.0007038204348646104,
0.05558011680841446,
0.0048991478979587555,
0.004235077649354935,
-0.13723354041576385,
0.04818850755691528,
-0.10092511028051376,
0.002862478606402874,
-0.046018727123737335,
0.00518543878570199,
-0.0778985321521759,
0.06895867735147476,
-0.01375697273761034,
-0.007741579320281744,
-0.10875396430492401,
0.00859652366489172,
-0.08335354924201965,
-0.02594267763197422,
0.051654212176799774,
0.07071886211633682,
-0.17575590312480927,
-0.019271697849035263,
0.062114160507917404,
-0.06787928193807602,
-0.04992141202092171,
0.12730172276496887,
-0.02259266935288906,
0.07886146008968353,
0.08877597004175186,
0.10926328599452972,
0.06466710567474365,
-0.03305455669760704,
-0.16698181629180908,
-0.10572248697280884,
-0.08571027964353561,
0.08863982558250427,
0.045372143387794495,
-0.02115967497229576,
0.0496785007417202,
0.010352694429457188,
-0.05704870447516441,
0.025497330352663994,
-0.04006165266036987,
-0.047509703785181046,
0.04424150288105011,
-0.04916089028120041,
-0.01023475918918848,
-0.0584036260843277,
0.037242498248815536,
-0.06415095180273056,
-0.059117984026670456,
-0.06519526988267899,
0.07777832448482513,
-0.024079250171780586,
0.030827512964606285,
-0.06353681534528732,
0.06314008682966232,
-0.0681895911693573,
0.029160575941205025,
-0.12338997423648834,
-0.07868946343660355,
0.0269914697855711,
-0.06753942370414734,
0.05073806643486023,
-0.05796467140316963,
0.03085082769393921,
0.040383268147706985,
-0.047380466014146805,
0.018601274117827415,
0.030698325484991074,
-0.019779417663812637,
-0.08582442253828049,
-0.14734986424446106,
-0.012399064376950264,
-0.0016048187389969826,
0.09245018661022186,
-0.05663543567061424,
0.013199128210544586,
0.04279617220163345,
0.12358582019805908,
0.03356703370809555,
-0.05832930654287338,
0.029987499117851257,
0.04325360059738159,
0.0026495042257010937,
-0.044011861085891724,
-0.0031690960749983788,
0.014246398583054543,
-0.07155037671327591,
0.12841250002384186,
-0.1786627173423767,
-0.12838472425937653,
0.05040642246603966,
0.051491133868694305,
-0.08720869570970535,
0.07788513600826263,
-0.031017573550343513,
-0.05035251006484032,
-0.02415970340371132,
-0.02581038326025009,
0.20643916726112366,
0.03171824291348457,
0.0812673419713974,
-0.046431805938482285,
-0.06887523084878922,
0.013550124131143093,
-0.04622353985905647,
-0.03700127452611923,
0.07690128684043884,
-0.04127395898103714,
-0.2032829076051712,
0.038394030183553696,
0.07968078553676605,
0.05820814520120621,
0.20167727768421173,
0.07259225100278854,
-0.1017264872789383,
-0.07472734898328781,
0.03144942224025726,
0.06080314889550209,
0.11519022285938263,
-0.001309890765696764,
0.01108812727034092,
0.045620519667863846,
-0.0037934693973511457,
0.04377096891403198,
-0.027986174449324608,
0.03574380651116371,
0.02083813026547432,
-0.04955343157052994,
-0.004270631819963455,
0.05758432671427727,
-0.014308660291135311,
0.09383885562419891,
-0.017694413661956787,
0.0762513279914856,
0.014023914933204651,
-0.027615966275334358,
-0.08421946316957474,
0.10727353394031525,
-0.09915988147258759,
-0.306474506855011,
-0.1357433795928955,
0.08166120201349258,
-0.02445797622203827,
-0.020026929676532745,
0.03360477089881897,
-0.06612107157707214,
-0.024505063891410828,
-0.050650276243686676,
-0.0006658406928181648,
0.03803643211722374,
-0.036545269191265106,
-0.04827331006526947,
-0.01525422465056181,
0.03564213961362839,
-0.08941090106964111,
-0.002027026144787669,
-0.003566228784620762,
-0.0550856739282608,
0.04643482714891434,
0.07876129448413849,
0.10093554109334946,
0.002001106506213546,
-0.09676387906074524,
-0.024426285177469254,
0.012966633774340153,
0.0973767414689064,
-0.12872348725795746,
0.09800054132938385,
0.03744429722428322,
-0.07462428510189056,
0.0340757742524147,
0.053633805364370346,
0.02345803752541542,
-0.02660537138581276,
0.015279159881174564,
0.06735638529062271,
-0.025424985215067863,
-0.29470404982566833,
-0.07454836368560791,
-0.05671961233019829,
-0.05836588516831398,
-0.034093715250492096,
0.06651943922042847,
0.02624383196234703,
0.014376738108694553,
-0.09264969080686569,
0.00949620921164751,
-0.005436284467577934,
0.047575924545526505,
0.024179665371775627,
0.012722698971629143,
0.009101789444684982,
-0.10072878003120422,
0.018271872773766518,
0.144343763589859,
0.06289752572774887,
0.21420377492904663,
0.020167959854006767,
0.2001572698354721,
0.07963988184928894,
0.04253010451793671,
0.013389656320214272,
0.04129672050476074,
-0.0028962199576199055,
0.033695634454488754,
-0.010136366821825504,
-0.06443781405687332,
-0.00415562279522419,
0.001806539949029684,
0.016246343031525612,
-0.022553453221917152,
-0.049650486558675766,
0.055261239409446716,
0.13021938502788544,
0.23863984644412994,
-0.021485760807991028,
-0.13290448486804962,
-0.09642685949802399,
-0.0197908915579319,
-0.05625739321112633,
-0.03220381587743759,
-0.10144125670194626,
0.16990377008914948,
-0.1360923945903778,
0.028152113780379295,
-0.043628472834825516,
0.05738157778978348,
-0.16766798496246338,
-0.0034722855780273676,
0.028383105993270874,
0.02523082308471203,
-0.03285903111100197,
0.05906395614147186,
-0.08609689027070999,
0.11862989515066147,
0.04573827609419823,
0.13199716806411743,
-0.08944188803434372,
-0.002328911330550909,
0.09018349647521973,
-0.08384988456964493,
0.10303113609552383,
0.034576304256916046,
-0.09713711589574814,
-0.06423278152942657,
-0.1893511563539505,
0.03074292466044426,
0.07828856259584427,
-0.09355515986680984,
0.06682558357715607,
0.01598743535578251,
0.02167452871799469,
-0.04571734368801117,
0.02555096708238125,
-0.2014046013355255,
-0.1775386780500412,
0.0725632756948471,
0.015402761287987232,
-0.011932277120649815,
-0.032728780061006546,
-0.00993017852306366,
-0.029230166226625443,
0.17699217796325684,
-0.08408737927675247,
-0.07736953347921371,
-0.06944224238395691,
-0.013570049777626991,
0.09590218961238861,
-0.07516211271286011,
-0.033570099622011185,
0.04596938565373421,
0.039781879633665085,
-0.04589453712105751,
-0.07131168246269226,
-0.019788675010204315,
-0.08631917089223862,
-0.12754522264003754,
-0.047044750303030014,
0.08303781598806381,
0.08896107226610184,
0.057141322642564774,
0.03434424474835396,
-0.02137916535139084,
0.032644372433423996,
-0.09540754556655884,
0.004489688202738762,
0.09866709262132645,
0.01655646413564682,
0.08548124134540558,
-0.035366371273994446,
-0.004724475089460611,
-0.0937175378203392,
-0.04986945912241936,
0.09863238781690598,
0.15679696202278137,
-0.030474089086055756,
0.14193211495876312,
0.09773001819849014,
-0.12159043550491333,
-0.10164398699998856,
-0.04785425588488579,
-0.001001487486064434,
0.05455969646573067,
-0.01190574187785387,
-0.12678784132003784,
0.0043437182903289795,
0.14512737095355988,
-0.0009540435275994241,
0.08131218701601028,
-0.257982075214386,
-0.09441360831260681,
0.05056308954954147,
-0.009460082277655602,
0.0495336577296257,
-0.10821796208620071,
-0.046090513467788696,
-0.032453302294015884,
0.026120755821466446,
0.08981233835220337,
-0.10366487503051758,
0.10067734867334366,
-0.007083424832671881,
-0.016759097576141357,
0.020659945905208588,
-0.013558079488575459,
0.16924719512462616,
0.008402577601373196,
0.0551212802529335,
-0.07171392440795898,
0.01564275100827217,
0.12299196422100067,
-0.02985420450568199,
0.11445162445306778,
-0.020911652594804764,
0.015313102863729,
-0.1217791736125946,
-0.012275861576199532,
-0.04880951717495918,
0.08875033259391785,
-0.049739815294742584,
-0.004194991197437048,
-0.029412679374217987,
0.08850014954805374,
0.0341937392950058,
0.05178283900022507,
-0.07827574014663696,
0.0010011408012360334,
-0.03435363993048668,
0.04891977086663246,
0.16402871906757355,
0.09194636344909668,
-0.017052598297595978,
0.00024361739633604884,
-0.026187950745224953,
0.09709062427282333,
-0.020453980192542076,
-0.0028060944750905037,
0.08858420699834824,
-0.04723655432462692,
0.08392375707626343,
-0.02160695381462574,
-0.15384092926979065,
0.08338914066553116,
0.11045359820127487,
-0.16314788162708282,
-0.11390382051467896,
-0.004303290043026209,
-0.031906869262456894,
-0.010102583095431328,
-0.037572819739580154,
0.15836073458194733,
-0.06461017578840256,
0.005363856442272663,
-0.007910016924142838,
0.0445299856364727,
-0.015944013372063637,
0.1245616152882576,
0.004601506981998682,
-0.027272896841168404,
-0.05378059670329094,
0.17829318344593048,
0.10093140602111816,
-0.0954565703868866,
0.06484001874923706,
0.086858831346035,
-0.07085597515106201,
-0.05232805013656616,
-0.09724606573581696,
0.06156414747238159,
-0.03650512546300888,
-0.10435251146554947,
0.004281509201973677,
-0.014182474464178085,
-0.007494165562093258,
-0.0413794219493866,
-0.039788633584976196,
0.08426372706890106,
-0.023995164781808853,
-0.03627252206206322,
-0.07669473439455032,
0.07434897869825363,
0.08674865961074829,
-0.027033019810914993,
-0.04727640375494957,
0.0706951916217804,
-0.029137104749679565,
-0.029341211542487144,
0.007165147457271814,
-0.007557602133601904,
-0.027840295806527138,
-0.059763453900814056,
-0.10829175263643265,
-0.012833945453166962,
-0.08126923441886902,
-0.032625459134578705,
0.03216804936528206,
0.046190645545721054,
-0.015589774586260319,
0.001162924338132143,
-0.035292766988277435,
-0.06580466777086258,
0.0021587947849184275,
0.09436573833227158,
-0.10329800844192505,
-0.03033996745944023,
0.03178452327847481,
-0.034396253526210785,
0.1345827281475067,
0.09655255079269409,
0.0009233764139935374,
-0.015904124826192856,
-0.11447266489267349,
0.019560569897294044,
-0.02785353735089302,
-0.014571587555110455,
0.018908992409706116,
-0.14661966264247894,
-0.003173099597916007,
0.007795544806867838,
-0.02821233868598938,
0.027316158637404442,
0.09462743997573853,
-0.08072647452354431,
0.028711630031466484,
-0.002700298558920622,
0.05089584365487099,
-0.07815881073474884,
0.05234351381659508,
0.0637487843632698,
0.04607026278972626,
0.07753565162420273,
-0.0758371576666832,
0.02312508597970009,
-0.08335348218679428,
0.0471949502825737,
-0.011611496098339558,
-0.046613939106464386,
-0.02179783582687378,
0.014581762254238129,
0.060719091445207596,
-0.03339476138353348,
0.10374724119901657,
-0.019835706800222397,
-0.02125347964465618,
-0.0036278211046010256,
-0.014255593530833721,
-0.15820251405239105,
-0.009498185478150845,
0.04210692644119263,
-0.0153072914108634,
-0.052800584584474564,
-0.08787397295236588,
-0.02453235723078251,
-0.022019430994987488,
-0.07798969000577927,
0.17756402492523193,
0.11949271708726883,
0.1311160922050476,
0.08477091789245605,
0.030465759336948395,
-0.09890712797641754,
-0.06128724664449692,
0.031192563474178314,
-0.05897276848554611,
0.09404595196247101,
-0.0044059487991034985,
-0.07221362739801407,
0.1768793761730194,
-0.17468787729740143,
0.04805989935994148,
0.008338811807334423,
-0.05829343572258949,
-0.015009218826889992,
-0.18505710363388062,
-0.00022670587350148708,
0.05200081691145897,
-0.007791971787810326,
-0.12041175365447998,
0.05155103653669357,
0.03476327285170555,
0.0264468714594841,
-0.0593235157430172,
0.1313440352678299,
-0.13786070048809052,
-0.05889680236577988,
0.10141023993492126,
0.008967127650976181,
0.05073457583785057,
-0.05038464441895485,
-0.015399386174976826,
-0.03630167618393898,
0.05826704949140549,
0.05462019145488739,
0.06959649920463562,
0.11617812514305115,
0.036442115902900696,
0.006879974156618118,
-0.09925778210163116,
-0.01218934915959835,
0.006094945594668388,
0.09041277319192886,
0.16468550264835358,
0.06779683381319046,
-0.0232691690325737,
-0.013208352029323578,
0.10005338490009308,
0.008535034954547882,
0.042834263294935226,
-0.1286400705575943,
-0.02621421590447426,
-0.0075004673562943935,
-0.03959721326828003,
0.015551619231700897,
-0.10307536274194717,
0.0062057580798864365,
0.10340889543294907,
0.25994935631752014,
0.011675175279378891,
0.024102866649627686,
-0.016522474586963654,
0.005346545483916998,
-0.01945267617702484,
0.0912410095334053,
-0.01477766688913107,
0.13929463922977448,
-0.04602951556444168,
0.10717780143022537,
-0.013364186510443687,
0.027930961921811104,
-0.05138964205980301,
0.15201695263385773,
-0.037848684936761856,
-0.05298558995127678,
0.010323786176741123,
0.09439889341592789,
-0.069633349776268,
-0.32670557498931885,
0.029459230601787567,
-0.05795026198029518,
-0.10051506757736206,
0.02597663179039955,
-0.03291874751448631,
0.06689461320638657,
0.09126761555671692,
-0.004188420716673136,
-0.02121065743267536,
0.18019552528858185,
0.028943482786417007,
-0.09726393967866898,
-0.054768916219472885,
0.05869774892926216,
-0.07365337759256363,
0.20560874044895172,
0.025379568338394165,
0.05933743715286255,
0.06092888116836548,
0.00019457770395092666,
-0.12788960337638855,
-0.0239690113812685,
-0.0025912784039974213,
0.022482899948954582,
0.07608076184988022,
0.15761101245880127,
0.02345639280974865,
0.0410294346511364,
0.09666071832180023,
0.03150089457631111,
0.0596126988530159,
0.030806194990873337,
0.0740731805562973,
-0.04280459135770798,
0.09419060498476028,
-0.04828814044594765,
0.14439664781093597,
0.19954028725624084,
-0.019683217629790306,
0.030837666243314743,
0.0018766430439427495,
-0.017847398295998573,
0.07378064095973969,
0.015815958380699158,
-0.01021654810756445,
-0.11774946749210358,
0.03056260757148266,
-0.07222684472799301,
0.06897427141666412,
-0.12008699774742126,
-0.0802747830748558,
-0.014578069560229778,
-0.05156612768769264,
0.0035345503129065037,
0.06959007680416107,
0.07651406526565552,
0.044989246875047684,
-0.011700024828314781,
0.038817401975393295,
-0.011043518781661987,
0.03745545446872711,
-0.06925206631422043,
-0.046478573232889175
] |
null | null |
transformers
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="https://raw.githubusercontent.com/stefan-it/turkish-bert/master/merve_logo.png">
</p>
[](https://zenodo.org/badge/latestdoi/237817454)
We present community-driven BERT, DistilBERT, ELECTRA and ConvBERT models for Turkish 🎉
Some datasets used for pretraining and evaluation are contributed from the
awesome Turkish NLP community, as well as the decision for the BERT model name: BERTurk.
Logo is provided by [Merve Noyan](https://twitter.com/mervenoyann).
# Stats
We've also trained an ELECTRA (uncased) model on the recently released Turkish part of the
[multilingual C4 (mC4) corpus](https://github.com/allenai/allennlp/discussions/5265) from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
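To make this filtering step concrete, here is a minimal sketch of a broken-encoding filter. The replacement-character heuristic is an assumption for illustration; it is not necessarily the exact filter used for the mC4 preprocessing:
```python
# Minimal sketch (assumed heuristic): treat documents containing the Unicode
# replacement character U+FFFD as having a broken encoding and drop them.
def has_broken_encoding(text: str) -> bool:
    return "\ufffd" in text

def filter_corpus(documents):
    for document in documents:
        if not has_broken_encoding(document):
            yield document

corpus = ["Merhaba dünya!", "Bozuk \ufffd metin"]
print(list(filter_corpus(corpus)))  # ['Merhaba dünya!']
```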
# mC4 ELECTRA
In addition to the ELEC**TR**A base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the [DBMDZ](https://github.com/dbmdz) Hugging Face [model hub page](https://huggingface.co/dbmdz)
using their model name.
Example usage with 🤗/Transformers:
```python
tokenizer = AutoTokenizer.from_pretrained("electra-base-turkish-mc4-uncased-generator")
model = AutoModel.from_pretrained("electra-base-turkish-mc4-uncased-generator")
```
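Since this checkpoint is the generator half of the model, it can also be used with the fill-mask pipeline; a small sketch (the Turkish prompt is purely illustrative):
```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="dbmdz/electra-base-turkish-mc4-uncased-generator",
)

# The model is uncased, so the prompt is lowercased; it is only an example.
for prediction in fill_mask("türkiye'nin başkenti [MASK] şehridir."):
    print(prediction["token_str"], round(prediction["score"], 4))
```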
# Citation
You can use the following BibTeX entry for citation:
```bibtex
@software{stefan_schweter_2020_3770924,
author = {Stefan Schweter},
title = {BERTurk - BERT models for Turkish},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {1.0.0},
doi = {10.5281/zenodo.3770924},
url = {https://doi.org/10.5281/zenodo.3770924}
}
```
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank [Merve Noyan](https://twitter.com/mervenoyann) for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
|
{"language": "tr", "license": "mit", "datasets": ["allenai/c4"]}
|
fill-mask
|
dbmdz/electra-base-turkish-mc4-uncased-generator
|
[
"transformers",
"pytorch",
"tf",
"electra",
"fill-mask",
"tr",
"dataset:allenai/c4",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# 🇹🇷 Turkish ELECTRA model
<p align="center">
<img alt="Logo provided by Merve Noyan" title="Awesome logo from Merve Noyan" src="URL
</p>
 model on the recently released Turkish part of the
multilingual C4 (mC4) corpus from the AI2 team.
After filtering documents with a broken encoding, the training corpus has a size of 242GB resulting
in 31,240,963,926 tokens.
We used the original 32k vocab (instead of creating a new one).
# mC4 ELECTRA
In addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a
sequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.
# Model usage
All trained models can be used from the DBMDZ Hugging Face model hub page
using their model name.
Example usage with Transformers:
You can use the following BibTeX entry for citation:
# Acknowledgments
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
We would like to thank Merve Noyan for the
awesome logo!
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
|
[
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
"TAGS\n#transformers #pytorch #tf #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).",
"# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.",
"# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:",
"# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
56,
131,
94,
73,
49,
90
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #fill-mask #tr #dataset-allenai/c4 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# 🇹🇷 Turkish ELECTRA model\n\n<p align=\"center\">\n <img alt=\"Logo provided by Merve Noyan\" title=\"Awesome logo from Merve Noyan\" src=\"URL\n</p>\n\n model on the recently released Turkish part of the\nmultiligual C4 (mC4) corpus from the AI2 team.\n\nAfter filtering documents with a broken encoding, the training corpus has a size of 242GB resulting\nin 31,240,963,926 tokens.\n\nWe used the original 32k vocab (instead of creating a new one).# mC4 ELECTRA\n\nIn addition to the ELECTRA base cased model, we also trained an ELECTRA uncased model on the Turkish part of the mC4 corpus. We use a\nsequence length of 512 over the full training time and train the model for 1M steps on a v3-32 TPU.# Model usage\n\nAll trained models can be used from the DBMDZ Hugging Face model hub page\nusing their model name.\n\nExample usage with /Transformers:\n\n\n\nYou can use the following BibTeX entry for citation:# Acknowledgments\n\nThanks to Kemal Oflazer for providing us\nadditional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing\nus the Turkish NER dataset for evaluation.\n\nWe would like to thank Merve Noyan for the\nawesome logo!\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️"
] |
[
-0.01828174479305744,
0.10579814016819,
-0.004710542969405651,
0.01722315140068531,
0.10750789940357208,
0.049586743116378784,
0.16120977699756622,
0.07512667775154114,
-0.013799688778817654,
0.025543466210365295,
0.03815235197544098,
-0.02078256756067276,
0.09544363617897034,
0.05739801377058029,
0.0988643690943718,
-0.21731355786323547,
-0.018823523074388504,
-0.13799604773521423,
-0.01014603953808546,
0.07048498094081879,
0.09587903320789337,
-0.043860215693712234,
0.11449094861745834,
0.006571238394826651,
-0.03401445224881172,
0.013608364388346672,
-0.06559861451387405,
-0.07441751658916473,
0.10084047168493271,
0.03908783942461014,
0.04672970622777939,
-0.05586644262075424,
-0.011401725001633167,
-0.12740349769592285,
0.032438866794109344,
0.04884980991482735,
-0.02593505196273327,
0.05045600235462189,
0.12509995698928833,
-0.04266384243965149,
0.20374800264835358,
-0.12928549945354462,
-0.016955368220806122,
-0.0035363442730158567,
-0.10042822360992432,
-0.021398400887846947,
-0.1906602531671524,
0.11095742881298065,
0.05822262540459633,
0.07284954190254211,
0.017115389928221703,
0.07321874797344208,
0.004424689803272486,
0.06237156316637993,
0.08147625625133514,
-0.298124760389328,
-0.06220528110861778,
-0.012887179851531982,
0.01976638473570347,
0.03546395152807236,
-0.02486332878470421,
0.05987844243645668,
-0.033521782606840134,
0.0005281902267597616,
-0.01639907993376255,
-0.03421609476208687,
-0.05207694321870804,
-0.06013370305299759,
-0.03502751886844635,
-0.04168879985809326,
0.2534773051738739,
-0.006332347635179758,
-0.06259237229824066,
-0.06222594156861305,
0.025110015645623207,
0.05419310927391052,
0.010168365202844143,
-0.014303278177976608,
0.0040276977233588696,
-0.025325415655970573,
0.07294399291276932,
-0.1438305675983429,
-0.12534582614898682,
0.044557031244039536,
-0.06432836502790451,
0.14969712495803833,
0.05618647485971451,
0.040276847779750824,
-0.003501239465549588,
0.09211435168981552,
-0.03943081572651863,
-0.028754526749253273,
-0.000462776399217546,
-0.013501528650522232,
-0.09810607880353928,
-0.03186419978737831,
-0.03981275111436844,
-0.18680024147033691,
-0.009626012295484543,
0.050201863050460815,
0.020112764090299606,
0.0237673819065094,
-0.00913441926240921,
0.03131673112511635,
-0.0020555444061756134,
0.05225357785820961,
-0.09055984020233154,
-0.09266189485788345,
0.036803897470235825,
-0.12934961915016174,
0.031034965068101883,
-0.01611124724149704,
-0.02254519797861576,
0.007463318761438131,
-0.011859501712024212,
0.09221331775188446,
0.03197665512561798,
0.07485607266426086,
0.016909262165427208,
-0.028648879379034042,
0.19586697220802307,
-0.09849905222654343,
0.0019640279933810234,
0.002845470793545246,
-0.0657302513718605,
-0.028673846274614334,
0.05394301190972328,
-0.009355184622108936,
-0.08212978392839432,
0.06586044281721115,
-0.023559631779789925,
-0.0020297174341976643,
-0.02715878002345562,
-0.06278141587972641,
0.09042181819677353,
-0.03308119252324104,
-0.04323335364460945,
-0.12650908529758453,
-0.11259639263153076,
-0.029528098180890083,
0.033448390662670135,
-0.05466146394610405,
0.03199475258588791,
0.03193940967321396,
-0.03652403876185417,
0.01924997568130493,
0.0010673906654119492,
0.07377168536186218,
0.004196772351861,
0.009564398787915707,
-0.13308089971542358,
0.025437653064727783,
-0.08429299294948578,
0.007634843699634075,
-0.03499423712491989,
0.004797061439603567,
-0.14613406360149384,
0.07004453241825104,
-0.008465451188385487,
-0.008819065056741238,
-0.13280121982097626,
-0.005568434949964285,
-0.06277070939540863,
-0.0465795174241066,
0.031587932258844376,
0.08080539107322693,
-0.13595478236675262,
-0.0037038661539554596,
0.06412513554096222,
-0.05549851059913635,
-0.05798428878188133,
0.09542524069547653,
-0.006191771011799574,
0.05531759560108185,
0.08617810904979706,
0.08709704875946045,
0.07397201657295227,
-0.0625973716378212,
-0.1316223293542862,
-0.10250444710254669,
-0.08682958781719208,
0.051131412386894226,
0.04866908863186836,
-0.025110358372330666,
0.048482295125722885,
0.027974484488368034,
-0.03219594806432724,
0.025616426020860672,
-0.02273663505911827,
-0.028737496584653854,
0.04475652426481247,
-0.04542314633727074,
-0.027146115899086,
-0.0705312192440033,
0.03851742669939995,
-0.04206794127821922,
-0.04374534636735916,
-0.0860951840877533,
0.06547501683235168,
-0.016442937776446342,
0.04376418516039848,
-0.04760149493813515,
0.07414520531892776,
-0.10658539086580276,
0.018735729157924652,
-0.10592348873615265,
-0.14150208234786987,
0.04962382838129997,
-0.0643557608127594,
0.07954991608858109,
-0.0343327671289444,
0.03626631200313568,
0.05434097722172737,
-0.04346896708011627,
0.03975307568907738,
0.024130946025252342,
-0.014340532943606377,
-0.06496621668338776,
-0.13620169460773468,
-0.02560615725815296,
-0.0023456562776118517,
0.07939895987510681,
0.013812039978802204,
0.007728469092398882,
0.04100792482495308,
0.12811630964279175,
0.01917596161365509,
-0.06144789978861809,
0.031268928200006485,
0.022065404802560806,
0.032825253903865814,
-0.043769653886556625,
-0.013577987439930439,
0.009529214352369308,
-0.07063640654087067,
0.17692098021507263,
-0.181466206908226,
-0.08996748179197311,
0.05589630827307701,
0.07072161883115768,
-0.07523717731237411,
0.031543683260679245,
-0.025196673348546028,
-0.07479751855134964,
0.025820588693022728,
-0.01963970996439457,
0.2124243825674057,
0.04900957643985748,
0.08177313208580017,
-0.05018201097846031,
-0.06958264112472534,
0.02375996857881546,
-0.019209587946534157,
-0.0488726831972599,
0.041104163974523544,
0.008425927720963955,
-0.1774466633796692,
0.029304800555109978,
0.08821281045675278,
0.09696535021066666,
0.16564257442951202,
0.06026121973991394,
-0.10094688087701797,
-0.09478723257780075,
0.01447996124625206,
0.04396958276629448,
0.10160847008228302,
-0.02186886966228485,
0.012223416939377785,
0.04365169629454613,
0.019196514040231705,
0.05312007665634155,
-0.009011808782815933,
0.04413599520921707,
0.00324277114123106,
-0.04428498074412346,
0.013468830846250057,
0.05348188057541847,
0.004530010744929314,
0.08750653266906738,
-0.003963177558034658,
0.08307000994682312,
0.009616577066481113,
-0.021537069231271744,
-0.08111891895532608,
0.09883999079465866,
-0.11659222096204758,
-0.2601067125797272,
-0.1400206834077835,
0.032839737832546234,
-0.05183522030711174,
-0.020034782588481903,
0.012660231441259384,
-0.008668544702231884,
-0.011897490359842777,
-0.07445599883794785,
0.04064110293984413,
0.026430021971464157,
-0.040635302662849426,
-0.09083694964647293,
-0.023259367793798447,
0.0265403650701046,
-0.09972532838582993,
0.005651835352182388,
0.008371487259864807,
-0.10617929697036743,
0.022373786196112633,
0.09418812394142151,
0.08350241184234619,
-0.01644773595035076,
-0.08371249586343765,
-0.033399567008018494,
-0.004727184306830168,
0.11313153058290482,
-0.11343655735254288,
0.12425276637077332,
0.039708852767944336,
-0.051817189902067184,
0.038223106414079666,
0.05231534317135811,
0.03118165396153927,
-0.03125307708978653,
0.010645532049238682,
0.03462854400277138,
-0.02003573440015316,
-0.28579288721084595,
-0.09763580560684204,
-0.04990551248192787,
-0.048737186938524246,
-0.03234541043639183,
0.06909279525279999,
0.012823964469134808,
0.02248409576714039,
-0.08180022239685059,
0.024942802265286446,
-0.013231661170721054,
0.04457860440015793,
0.0653076246380806,
-0.001506732776761055,
-0.00804927572607994,
-0.1086248904466629,
0.010447628796100616,
0.12288624793291092,
0.08457489311695099,
0.21108926832675934,
0.0015177562600001693,
0.1424987018108368,
0.0728546679019928,
0.07495901733636856,
-0.00030857918318361044,
0.06275760382413864,
0.011888572946190834,
0.044013455510139465,
-0.019157391041517258,
-0.07527022808790207,
-0.0021629466209560633,
0.0108646797016263,
0.09417486935853958,
-0.04890484735369682,
-0.02419455535709858,
-0.002049979753792286,
0.10593967139720917,
0.24069039523601532,
-0.03483384847640991,
-0.1352531909942627,
-0.06506822258234024,
-0.034818731248378754,
-0.07679512351751328,
-0.03082038275897503,
-0.07601939141750336,
0.14918015897274017,
-0.16180972754955292,
0.058105453848838806,
-0.046080343425273895,
0.033261027187108994,
-0.14668481051921844,
-0.02340845763683319,
0.061876825988292694,
0.019063811749219894,
-0.025322025641798973,
0.06830926239490509,
-0.07023873925209045,
0.11969079822301865,
0.047574855387210846,
0.07805655151605606,
-0.10968825966119766,
0.008245538920164108,
0.059900932013988495,
-0.056735146790742874,
0.09065667539834976,
0.03022737056016922,
-0.10131482779979706,
-0.02973492629826069,
-0.22006860375404358,
0.05086072161793709,
0.09880947321653366,
-0.1072838231921196,
0.06790176779031754,
0.02702820487320423,
0.003608101513236761,
-0.05954930558800697,
0.04425104707479477,
-0.17904387414455414,
-0.20584212243556976,
0.08979488909244537,
-0.001628123689442873,
-0.0025067972019314766,
-0.05404385179281235,
-0.01784868724644184,
-0.0523962564766407,
0.15547196567058563,
-0.04808689281344414,
-0.06495184451341629,
-0.0727463960647583,
-0.025484813377261162,
0.13158637285232544,
-0.06912451237440109,
-0.006859795190393925,
0.01605631597340107,
0.019501864910125732,
-0.034100938588380814,
-0.06709467619657516,
-0.020955735817551613,
-0.07105492800474167,
-0.13025055825710297,
-0.04420657828450203,
0.12677574157714844,
0.07979470491409302,
0.04298083484172821,
0.019432328641414642,
-0.014166904613375664,
0.05794746056199074,
-0.09633587300777435,
-0.007668275386095047,
0.07348940521478653,
0.06910756230354309,
0.08541324734687805,
-0.029607979580760002,
-0.009815232828259468,
-0.07660931348800659,
-0.04208890721201897,
0.10126987099647522,
0.12222375720739365,
-0.02748015895485878,
0.15785905718803406,
0.078752301633358,
-0.12433306872844696,
-0.11642751842737198,
-0.057839177548885345,
0.010912111960351467,
0.008955302648246288,
0.008074565790593624,
-0.1479760706424713,
0.039580587297677994,
0.09643983095884323,
-0.007263637147843838,
0.06643941253423691,
-0.23657357692718506,
-0.08548332750797272,
0.013724558986723423,
-0.003084574593231082,
0.014516017399728298,
-0.10568996518850327,
-0.05036763846874237,
0.0030441132839769125,
0.015708936378359795,
0.054548170417547226,
-0.06297618895769119,
0.09486909210681915,
-0.015079076401889324,
0.042110979557037354,
0.025129498913884163,
-0.03063204512000084,
0.14217659831047058,
0.022855712100863457,
0.04741762951016426,
-0.09364542365074158,
0.02975444495677948,
0.12863826751708984,
-0.011799153871834278,
0.11814582347869873,
0.005207433365285397,
0.013758116401731968,
-0.07232354581356049,
-0.004910436924546957,
-0.05644399672746658,
0.09574370086193085,
-0.05264953151345253,
-0.00208973023109138,
-0.04879719391465187,
0.0701868087053299,
0.04015733301639557,
0.024935897439718246,
-0.0844581201672554,
0.013858175836503506,
-0.03496602550148964,
0.049247074872255325,
0.12978078424930573,
0.09126200526952744,
-0.005519379395991564,
-0.02312561497092247,
-0.029980450868606567,
0.04844938591122627,
-0.005450785625725985,
0.00039315535104833543,
0.08200016617774963,
-0.07765091955661774,
0.050115879625082016,
-0.031204188242554665,
-0.17885345220565796,
0.07117434591054916,
0.10916852951049805,
-0.1535032093524933,
-0.13100089132785797,
-0.011916747316718102,
-0.03161846846342087,
-0.008663749322295189,
-0.013525471091270447,
0.17748302221298218,
-0.07752908021211624,
-0.012265444733202457,
-0.02455245889723301,
0.04237263649702072,
-0.02479069121181965,
0.09032578766345978,
-0.0041380273178219795,
-0.03702719509601593,
-0.05510126054286957,
0.18380233645439148,
0.11943918466567993,
-0.11202069371938705,
0.059042781591415405,
0.08233160525560379,
-0.05845855548977852,
-0.055219776928424835,
-0.06438207626342773,
0.07090740650892258,
0.009465020149946213,
-0.09340397268533707,
0.04134407266974449,
-0.013027476146817207,
0.007924218662083149,
0.0045192609541118145,
-0.05119061842560768,
0.09159448742866516,
-0.013973204419016838,
-0.018874095752835274,
-0.06871183961629868,
0.07502106577157974,
0.038591381162405014,
-0.01974363811314106,
-0.03939739614725113,
0.06994710117578506,
-0.029544362798333168,
0.006046201102435589,
0.0030430315528064966,
-0.025121347978711128,
0.008385829627513885,
-0.0663321241736412,
-0.056537214666604996,
-0.03946126997470856,
-0.09835845977067947,
-0.023522118106484413,
0.038338132202625275,
0.02418571338057518,
-0.015615959651768208,
0.0030710778664797544,
-0.03839600086212158,
-0.0555923730134964,
-0.02440691366791725,
0.11490890383720398,
-0.11271502822637558,
-0.02430053986608982,
0.04406847432255745,
-0.040597572922706604,
0.15760201215744019,
0.09408188611268997,
-0.017133112996816635,
0.01668316312134266,
-0.11482701450586319,
0.037695832550525665,
-0.048077505081892014,
0.00047075795009732246,
0.00873745046555996,
-0.12632043659687042,
-0.003597493516281247,
0.01836090348660946,
-0.021760664880275726,
0.01856524869799614,
0.09269122034311295,
-0.08819079399108887,
0.007730351760983467,
0.009415834210813046,
0.020576342940330505,
-0.07521452754735947,
0.023469150066375732,
0.046285416930913925,
0.012528473511338234,
0.04537292569875717,
-0.06807814538478851,
0.04688620939850807,
-0.08380991220474243,
0.03709288686513901,
-0.020735876634716988,
-0.035442572087049484,
-0.0235124584287405,
-0.015397387556731701,
0.06642790883779526,
-0.01470750942826271,
0.13309769332408905,
-0.012884333729743958,
-0.017485223710536957,
-0.011165306903421879,
0.0013370487140491605,
-0.1353222131729126,
-0.004396339878439903,
-0.0015535530401393771,
-0.027771295979619026,
-0.049207352101802826,
-0.060105759650468826,
-0.014222313649952412,
-0.05031442269682884,
-0.09364507347345352,
0.13270004093647003,
0.09443776309490204,
0.1568952351808548,
0.075596384704113,
0.058678802102804184,
-0.0978991836309433,
-0.04224538430571556,
0.014020746573805809,
-0.03971407189965248,
0.0984463095664978,
-0.02183486521244049,
-0.05250108242034912,
0.11864519119262695,
-0.17563797533512115,
0.04811803996562958,
0.012634520418941975,
-0.03678459674119949,
0.004871230572462082,
-0.18018822371959686,
-0.002743061864748597,
0.02333810366690159,
-0.0024861376732587814,
-0.10301683098077774,
0.05278581753373146,
0.03547988459467888,
0.03164611756801605,
-0.05448572710156441,
0.1540757268667221,
-0.15473006665706635,
-0.06057038530707359,
0.08884837478399277,
0.02268744260072708,
0.0590192936360836,
-0.013997586444020271,
-0.014791068620979786,
-0.05650139972567558,
0.058597926050424576,
0.05315575748682022,
0.0775735005736351,
0.11604949831962585,
0.05253757908940315,
-0.03552958369255066,
-0.09136147797107697,
-0.0033898616675287485,
-0.020254608243703842,
0.06326406449079514,
0.1440650075674057,
0.04723931476473808,
-0.033704888075590134,
-0.01594291441142559,
0.11862397938966751,
0.010232134722173214,
0.01287317369133234,
-0.12310398370027542,
-0.009244184009730816,
0.008684404194355011,
-0.05465560033917427,
0.014132924377918243,
-0.08758991211652756,
-0.010027238167822361,
0.11277652531862259,
0.2803686559200287,
0.010031082667410374,
0.023950865492224693,
0.01797908917069435,
0.0026130916085094213,
-0.00649324432015419,
0.09974179416894913,
-0.013914954848587513,
0.14090228080749512,
-0.032672036439180374,
0.10836071521043777,
-0.030289068818092346,
0.041556332260370255,
-0.03866441920399666,
0.1611565500497818,
-0.07829064130783081,
-0.047859638929367065,
0.022350722923874855,
0.10669504106044769,
-0.049482159316539764,
-0.3590035140514374,
0.018229525536298752,
-0.050792478024959564,
-0.10242161899805069,
0.05325142294168472,
-0.0051038130186498165,
0.07539817690849304,
0.07807483524084091,
0.020756592974066734,
-0.021321935579180717,
0.19196335971355438,
0.03335922956466675,
-0.11795509606599808,
-0.05280524864792824,
0.0778505802154541,
-0.08774265646934509,
0.23841148614883423,
0.026744050905108452,
0.0588219128549099,
0.051958344876766205,
0.006401566788554192,
-0.14349955320358276,
-0.006135769188404083,
0.023070290684700012,
0.013693932443857193,
0.07228363305330276,
0.1578417271375656,
0.03590330481529236,
0.05224395915865898,
0.06994055211544037,
0.050330255180597305,
0.06927933543920517,
0.016833242028951645,
0.06908555328845978,
-0.03963656350970268,
0.08002171665430069,
-0.05614983290433884,
0.15188176929950714,
0.1757061928510666,
-0.004084988031536341,
0.0325000025331974,
-0.01832829788327217,
-0.03306709602475166,
0.0479164719581604,
0.026895418763160706,
-0.007585923653095961,
-0.12239593267440796,
0.020306482911109924,
-0.12870697677135468,
0.07831498980522156,
-0.10023286938667297,
-0.07591988146305084,
0.007460168097168207,
-0.047425277531147,
-0.010665045119822025,
0.08617029339075089,
0.04813821613788605,
0.008796749636530876,
-0.02185787819325924,
0.05226416885852814,
-0.017094682902097702,
0.03675347939133644,
-0.11327040195465088,
-0.038312409073114395
] |
null | null |
transformers
|
# 🤗 + 📚 dbmdz Turkish ELECTRA model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ELECTRA small model for Turkish 🎉
# Turkish ELECTRA model
We release a small ELEC**TR**A model for Turkish that was trained on the same data as *BERTurk*.
> ELECTRA is a new method for self-supervised language representation learning. It can be used to
> pre-train transformer networks using relatively little compute. ELECTRA models are trained to
> distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to
> the discriminator of a GAN.
More details about ELECTRA can be found in the [ICLR paper](https://openreview.net/forum?id=r1xMH1BtvB)
or in the [official ELECTRA repository](https://github.com/google-research/electra) on GitHub.
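To make the discriminator objective concrete, the following sketch scores each token of a sentence as "real" or "fake". It assumes a recent Transformers release; the sentence with a manually swapped-in token is only an illustration:
```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

model_name = "dbmdz/electra-small-turkish-cased-discriminator"
discriminator = ElectraForPreTraining.from_pretrained(model_name)
tokenizer = ElectraTokenizerFast.from_pretrained(model_name)

# "İstanbul" is swapped in by hand to mimic a generator replacement.
sentence = "Ankara Türkiye'nin başkenti İstanbul."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits

# Positive logits mean the discriminator flags a token as replaced ("fake").
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, score in zip(tokens, logits[0]):
    print(f"{token}\t{'fake' if score > 0 else 'real'}")
```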
## Stats
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-8 for 1M steps.
## Model weights
[Transformers](https://github.com/huggingface/transformers)
compatible weights for both PyTorch and TensorFlow are available.
| Model | Downloads
| ------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/electra-small-turkish-cased-discriminator` | [`config.json`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/vocab.txt)
## Usage
With Transformers >= 2.8 our ELECTRA small cased model can be loaded like:
```python
from transformers import AutoModelWithLMHead, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")
model = AutoModelWithLMHead.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")
```
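Note that `AutoModelWithLMHead` is deprecated in more recent Transformers releases, and a discriminator checkpoint carries no language-modeling head anyway. For plain feature extraction, the following sketch (an alternative, not the officially documented snippet) avoids both issues:
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")
model = AutoModel.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")

# Contextual token embeddings, e.g. as features for PoS tagging or NER.
outputs = model(**tokenizer("Merhaba dünya!", return_tensors="pt"))
print(outputs.last_hidden_state.shape)
```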
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert/electra).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "tr", "license": "mit"}
| null |
dbmdz/electra-small-turkish-cased-discriminator
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"tr",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us
|
dbmdz Turkish ELECTRA model
=============================
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ELECTRA small model for Turkish
Turkish ELECTRA model
=====================
We release a small ELECTRA model for Turkish that was trained on the same data as *BERTurk*.
>
> ELECTRA is a new method for self-supervised language representation learning. It can be used to
> pre-train transformer networks using relatively little compute. ELECTRA models are trained to
> distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to
> the discriminator of a GAN.
>
>
>
More details about ELECTRA can be found in the ICLR paper
or in the official ELECTRA repository on GitHub.
Stats
-----
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish OSCAR corpus,
a recent Wikipedia dump, various OPUS corpora and a
special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we could train a cased model
on a TPU v3-8 for 1M steps.
Model weights
-------------
Transformers
compatible weights for both PyTorch and TensorFlow are available.
Usage
-----
With Transformers >= 2.8 our ELECTRA small cased model can be loaded like:
Results
-------
For results on PoS tagging or NER tasks, please refer to
this repository.
Huggingface model hub
=====================
All models are available on the Huggingface model hub.
Contact (Bugs, Feedback, Contribution and more)
===============================================
For questions about our ELECTRA models just open an issue
here
Acknowledgments
===============
Thanks to Kemal Oflazer for providing us
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us \n"
] |
[
37
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #tr #license-mit #endpoints_compatible #region-us \n"
] |
[
-0.05861099809408188,
0.005883581005036831,
-0.007683257572352886,
0.01050401572138071,
0.1385493129491806,
0.032392919063568115,
0.054023899137973785,
0.08081942051649094,
0.035687435418367386,
-0.04102092236280441,
0.15276306867599487,
0.232399120926857,
-0.0239882729947567,
0.028588522225618362,
-0.061818547546863556,
-0.27025726437568665,
0.041802648454904556,
0.09078606963157654,
-0.06467315554618835,
0.12250131368637085,
0.09992603957653046,
-0.0769590437412262,
0.055778585374355316,
-0.0011740676127374172,
-0.12458879500627518,
0.007188597694039345,
0.011086559854447842,
-0.09724237024784088,
0.17676666378974915,
0.07856621593236923,
0.1313212811946869,
0.06332097202539444,
-0.025700319558382034,
-0.1541755199432373,
0.03964284434914589,
-0.011196048930287361,
-0.1172606572508812,
0.03243924304842949,
0.0014171745860949159,
-0.04069536551833153,
0.166208878159523,
0.10752841830253601,
0.016911009326577187,
0.07205304503440857,
-0.20554456114768982,
-0.1624647080898285,
-0.0745515525341034,
0.08227041363716125,
0.05714043229818344,
0.0768786296248436,
0.024753326550126076,
0.20003430545330048,
-0.14801184833049774,
0.0695260763168335,
0.08220566809177399,
-0.3500400185585022,
-0.006324237212538719,
0.06154241785407066,
0.10042374581098557,
0.0054606045596301556,
-0.03618277236819267,
0.02890080399811268,
0.05691437050700188,
0.02767961286008358,
0.010218902491033077,
-0.06416044384241104,
-0.019355522468686104,
0.06984327733516693,
-0.12443090230226517,
-0.08097978681325912,
0.2509234547615051,
-0.011438771151006222,
0.009712930768728256,
0.047727297991514206,
-0.059253040701150894,
-0.05274515599012375,
0.022144461050629616,
-0.028856048360466957,
-0.022323323413729668,
0.09137516468763351,
0.0335056334733963,
-0.02145870216190815,
-0.14971984922885895,
0.021736452355980873,
-0.23197415471076965,
0.1910589188337326,
0.03386012464761734,
0.05724876746535301,
-0.16956987977027893,
0.0964091569185257,
0.025797298178076744,
-0.04511306807398796,
-0.020063506439328194,
-0.08362678438425064,
0.035964783281087875,
-0.048469338566064835,
-0.03162158653140068,
0.0625,
0.041315145790576935,
0.22322091460227966,
0.03946147486567497,
-0.002135366201400757,
0.01262836903333664,
0.1152714267373085,
-0.031889159232378006,
0.04206826165318489,
0.01616571843624115,
0.07046030461788177,
0.012387840077280998,
-0.1317649632692337,
-0.008984189480543137,
0.012339488603174686,
-0.10951486229896545,
-0.06378389894962311,
-0.03154749423265457,
0.12982891499996185,
-0.003205276792868972,
0.03561646491289139,
-0.07834482938051224,
0.02764315716922283,
0.05200176686048508,
-0.012534755282104015,
-0.04485943913459778,
-0.020614488050341606,
0.02160545252263546,
0.08195780962705612,
0.028419047594070435,
0.012121075764298439,
-0.020690428093075752,
0.0985795333981514,
-0.09141883254051208,
-0.04166746884584427,
-0.0019621718674898148,
-0.021910952404141426,
0.07833003997802734,
-0.14589586853981018,
0.09448972344398499,
-0.17583277821540833,
-0.12421081215143204,
0.03839629516005516,
0.06442876905202866,
0.0142550989985466,
-0.015422634780406952,
0.03343493491411209,
-0.04222394526004791,
-0.03841765224933624,
-0.05294029787182808,
-0.05860840901732445,
-0.0686926618218422,
0.12931139767169952,
-0.03727142512798309,
0.007449334487318993,
-0.11858849227428436,
0.04384096711874008,
-0.08292753249406815,
-0.000030275525205070153,
-0.03650026023387909,
-0.03608031943440437,
-0.012376311235129833,
0.18659330904483795,
-0.023658154532313347,
-0.09258444607257843,
-0.12711112201213837,
0.04868162423372269,
-0.03746766597032547,
0.13997432589530945,
-0.06437350809574127,
-0.08124920725822449,
0.2185097187757492,
-0.12629152834415436,
-0.20666983723640442,
0.06788549572229385,
-0.004485560581088066,
0.09793362766504288,
0.07633427530527115,
0.13724525272846222,
0.06975381076335907,
-0.17459318041801453,
0.09122815728187561,
0.09996297955513,
-0.17746597528457642,
-0.1806361973285675,
0.05717217177152634,
-0.02587689459323883,
-0.08864537626504898,
0.0005505099543370306,
-0.004762877244502306,
0.12460057437419891,
-0.08444449305534363,
-0.04104410111904144,
-0.012544686906039715,
-0.002469615312293172,
0.04392916336655617,
0.06086611747741699,
0.03924363851547241,
-0.1011456549167633,
-0.05136562138795853,
0.02984556555747986,
0.016124237328767776,
0.058323610574007034,
0.029178159311413765,
-0.11423160135746002,
0.042107030749320984,
0.04165387153625488,
-0.03973345085978508,
-0.14576828479766846,
-0.058662496507167816,
-0.05485336482524872,
0.07964149862527847,
0.030870774760842323,
0.24981987476348877,
0.04672081768512726,
-0.04008972644805908,
-0.009696691296994686,
-0.021264798939228058,
0.08329746872186661,
0.026441285386681557,
-0.011117080226540565,
-0.08014033734798431,
0.0249272882938385,
-0.04416198283433914,
-0.02650575526058674,
-0.10898125171661377,
0.03524579852819443,
0.11520867049694061,
0.10302460193634033,
-0.013859816826879978,
0.07446960359811783,
-0.05287126079201698,
0.03337080404162407,
-0.025858500972390175,
-0.006216938141733408,
0.10796436667442322,
0.05370244011282921,
-0.05870413780212402,
0.11610135436058044,
-0.09346634894609451,
0.33908188343048096,
0.1807519942522049,
-0.19452568888664246,
-0.05104917287826538,
-0.008406221866607666,
-0.07843230664730072,
0.013892152346670628,
0.06361021846532822,
0.0032344337087124586,
0.07565552741289139,
-0.0021452864166349173,
0.12087003141641617,
-0.035743772983551025,
-0.06681826710700989,
0.018154511228203773,
-0.03493683040142059,
-0.035763468593358994,
0.06294766813516617,
0.1556628793478012,
-0.24024388194084167,
0.1468108594417572,
0.2272484004497528,
0.0749354213476181,
0.1280985325574875,
-0.08369775861501694,
-0.0014803250087425113,
0.0005325355450622737,
0.03187645226716995,
-0.011886157095432281,
0.07212597131729126,
-0.1976832151412964,
0.0001717980339890346,
0.0461178794503212,
-0.005102223716676235,
0.04561777785420418,
-0.18541298806667328,
-0.11017479747533798,
0.030533863231539726,
0.00235375901684165,
-0.0685095340013504,
0.137012779712677,
-0.028816062957048416,
0.08894412964582443,
-0.010273819789290428,
-0.10678882151842117,
0.12516643106937408,
0.021041443571448326,
-0.09212040156126022,
0.1564621478319168,
-0.10842899233102798,
-0.20250974595546722,
-0.10983860492706299,
-0.028664320707321167,
0.10637716948986053,
0.0024925731122493744,
0.12420479953289032,
-0.03002515621483326,
-0.039359550923109055,
0.06774091720581055,
-0.002766458783298731,
-0.12545546889305115,
0.011840625666081905,
-0.04191145673394203,
0.06808041781187057,
-0.0808519646525383,
-0.11032489687204361,
-0.07526286691427231,
-0.032326363027095795,
-0.06532617658376694,
0.10300689935684204,
-0.10158174484968185,
0.05282466113567352,
0.11051923781633377,
0.002821343019604683,
0.055243536829948425,
-0.07603069394826889,
0.18844035267829895,
-0.08936873078346252,
0.01774509996175766,
0.17959026992321014,
-0.005757506471127272,
0.07511201500892639,
0.13402609527111053,
0.05142059549689293,
-0.07230348885059357,
0.007274297531694174,
-0.0674576386809349,
-0.10292960703372955,
-0.26747000217437744,
-0.0851212814450264,
-0.12939418852329254,
0.01998973824083805,
0.00364684802480042,
0.07993164658546448,
0.13616091012954712,
0.08434935659170151,
0.004557931795716286,
-0.07797807455062866,
-0.005770571529865265,
0.06387504935264587,
0.2793932259082794,
-0.015556451864540577,
0.07204017788171768,
-0.1167387068271637,
-0.06947706639766693,
0.09143878519535065,
0.014418491162359715,
0.18382707238197327,
0.1289607435464859,
0.005192040931433439,
0.1303403377532959,
0.15892836451530457,
0.1049061268568039,
0.09420283138751984,
0.016589021310210228,
-0.051918528974056244,
-0.03040500171482563,
0.008910728618502617,
-0.018939509987831116,
0.03289582207798958,
0.049791377037763596,
-0.11020305752754211,
-0.054079752415418625,
-0.17222437262535095,
0.02111734449863434,
0.11667342483997345,
0.02563261054456234,
-0.1887086033821106,
-0.02259455807507038,
0.03071954846382141,
-0.019485916942358017,
-0.04911273717880249,
0.09637565910816193,
-0.026065826416015625,
-0.12888571619987488,
0.07847193628549576,
-0.04871496930718422,
0.07929779589176178,
0.06563542783260345,
0.06417816132307053,
0.03808039054274559,
-0.1156422421336174,
0.025719434022903442,
0.09404852241277695,
-0.31257253885269165,
0.3077980577945709,
-0.006809825077652931,
0.011994853615760803,
-0.04334330931305885,
-0.03944801539182663,
0.0064111389219760895,
0.20608985424041748,
0.13563716411590576,
0.010539514012634754,
-0.0728854089975357,
-0.08863142132759094,
0.0865129604935646,
0.02651088498532772,
0.07468948513269424,
-0.01873064413666725,
-0.04570149630308151,
-0.04483257979154587,
0.02062719315290451,
0.02277568355202675,
0.033870160579681396,
-0.031164279207587242,
-0.10378522425889969,
0.03912532702088356,
-0.007347439881414175,
0.0782884731888771,
-0.04456476494669914,
-0.02841477282345295,
-0.05493967607617378,
0.07182090729475021,
-0.15645691752433777,
-0.051118407398462296,
-0.10272152721881866,
-0.13893668353557587,
0.051439106464385986,
-0.08829248696565628,
0.08523569256067276,
-0.04295189678668976,
-0.07763536274433136,
-0.07065507024526596,
-0.14803582429885864,
0.1382049173116684,
-0.11951083689928055,
-0.021533850580453873,
-0.04775160551071167,
0.18714779615402222,
-0.03373394161462784,
0.03114781528711319,
0.011781677603721619,
0.0095354113727808,
-0.05576273426413536,
-0.08631312102079391,
0.0009413790539838374,
-0.09073225408792496,
0.07072651386260986,
-0.047334909439086914,
-0.0674246996641159,
0.05711057037115097,
0.030822236090898514,
-0.05806637555360794,
0.19977843761444092,
0.23948143422603607,
-0.053435128182172775,
0.12802354991436005,
0.14840000867843628,
-0.05945441871881485,
-0.23797965049743652,
-0.04731312766671181,
-0.15530382096767426,
-0.043426766991615295,
0.03283531218767166,
-0.15546724200248718,
0.03742826357483864,
0.08683471381664276,
-0.03302839770913124,
0.1288791298866272,
-0.27221187949180603,
-0.06400095671415329,
0.15461160242557526,
0.002583677414804697,
0.3781931698322296,
-0.13022807240486145,
-0.045894134789705276,
0.055980514734983444,
-0.23525001108646393,
0.13002179563045502,
-0.000507331860717386,
0.0567743182182312,
-0.011408084072172642,
-0.010513070039451122,
-0.0020655517000705004,
-0.059702955186367035,
0.11817490309476852,
0.027159979566931725,
0.04377518594264984,
-0.1096094399690628,
-0.0783454179763794,
0.09588336199522018,
0.024232594296336174,
0.01918364129960537,
0.014331398531794548,
0.023586994037032127,
-0.16334541141986847,
-0.025969577953219414,
-0.10034101456403732,
0.09853698313236237,
0.001341068185865879,
-0.09256350994110107,
-0.04338794946670532,
0.026196593418717384,
-0.013371402397751808,
-0.04627860337495804,
0.1874774992465973,
-0.013359828852117062,
0.18923340737819672,
0.01906132698059082,
0.15405185520648956,
-0.16067762672901154,
-0.06776446104049683,
-0.08994469791650772,
-0.05440984293818474,
0.048213131725788116,
-0.06260605901479721,
0.004598570521920919,
0.15415364503860474,
0.0006949116359464824,
0.06505808979272842,
0.09880312532186508,
-0.00664450041949749,
0.001606155768968165,
0.13660547137260437,
-0.15568186342716217,
-0.10053468495607376,
-0.047706425189971924,
0.032070375978946686,
0.09687922894954681,
0.1050226166844368,
0.0850791484117508,
-0.00791708193719387,
0.002020999789237976,
0.0019751053769141436,
-0.04119418188929558,
-0.08169881254434586,
0.01022866927087307,
0.09833689779043198,
0.021932465955615044,
-0.08829081803560257,
0.020862668752670288,
-0.006525723729282618,
-0.1852513998746872,
-0.0324665866792202,
0.09789416939020157,
-0.12203185260295868,
-0.11600426584482193,
-0.058960165828466415,
0.03837605565786362,
-0.2484338879585266,
-0.05497286841273308,
-0.020266149193048477,
-0.12955501675605774,
0.09360747039318085,
0.28902438282966614,
0.06786639243364334,
0.11616130918264389,
-0.021566400304436684,
-0.0013668235624209046,
-0.0022938260808587074,
-0.042854130268096924,
-0.038682691752910614,
0.00426688976585865,
-0.12265821546316147,
0.08366742730140686,
0.0037358489353209734,
0.13106955587863922,
-0.06905385106801987,
-0.07619559019804001,
-0.13937772810459137,
0.07819639891386032,
-0.07285302132368088,
-0.06436524540185928,
-0.10806272178888321,
-0.03112115152180195,
0.02448553591966629,
-0.11287640035152435,
-0.046837836503982544,
-0.0169993843883276,
-0.1260463148355484,
0.081724151968956,
0.06479965895414352,
0.05749804899096489,
-0.059798769652843475,
-0.03691520169377327,
0.09532593190670013,
-0.01607115939259529,
0.10467593371868134,
0.06305039674043655,
-0.05756572261452675,
0.10770197957754135,
-0.09382795542478561,
-0.05528214946389198,
0.09551241248846054,
-0.0042087542824447155,
0.032034654170274734,
0.03954492136836052,
0.03175737336277962,
0.06422111392021179,
-0.018816864117980003,
0.0591857023537159,
-0.0912519097328186,
-0.1251654475927353,
0.02031831257045269,
0.023685423657298088,
-0.1325400471687317,
-0.04516003653407097,
-0.07999031990766525,
0.11155621707439423,
-0.012353334575891495,
0.14478126168251038,
-0.008559519425034523,
0.03160591423511505,
-0.06785569339990616,
-0.0016739077400416136,
-0.0049416315741837025,
-0.10955435037612915,
-0.010171468369662762,
-0.08227506279945374,
-0.028599973767995834,
-0.005865692161023617,
0.20030300319194794,
-0.022435178980231285,
-0.0506771020591259,
0.08601896464824677,
0.04089942201972008,
-0.021788785234093666,
-0.01613939180970192,
0.248198002576828,
0.05444559082388878,
-0.01808932237327099,
-0.1440548449754715,
0.05017741024494171,
-0.04315269738435745,
-0.1890053153038025,
0.15995106101036072,
0.11648929119110107,
-0.022368496283888817,
0.048295799642801285,
0.029820607975125313,
0.0093945087864995,
-0.08485109359025955,
-0.18416666984558105,
0.06984227150678635,
0.013523516245186329,
0.003167259506881237,
0.10881166905164719,
0.20763033628463745,
-0.07750587910413742,
0.016894595697522163,
-0.042349591851234436,
-0.0036280525382608175,
-0.1642192155122757,
-0.13105911016464233,
-0.022578762844204903,
-0.0727090984582901,
0.051097311079502106,
-0.030553782358765602,
0.01841375231742859,
0.15214644372463226,
0.06584577262401581,
-0.051947738975286484,
0.021414224058389664,
0.02917594648897648,
-0.0526249073445797,
0.005096277222037315,
-0.0004534249019343406,
0.050447601824998856,
-0.10685541480779648,
-0.019434280693531036,
-0.09905914217233658,
-0.0876966342329979,
-0.06408388167619705,
-0.00012773348134942353,
-0.07047653943300247,
-0.013428273610770702,
-0.12114056944847107,
-0.05371568351984024,
-0.03273484483361244,
0.08361051231622696,
0.00412009097635746,
0.086480051279068,
-0.006343976128846407,
0.0252322219312191,
0.058051832020282745,
0.16941778361797333,
-0.0205397829413414,
-0.1145453006029129,
-0.033476945012807846,
0.14142508804798126,
0.08873463422060013,
0.0797295942902565,
0.029802607372403145,
0.01780109293758869,
-0.0022786473855376244,
0.26137542724609375,
0.23033958673477173,
-0.015446661040186882,
0.05509280040860176,
0.027154341340065002,
0.028465047478675842,
0.1023523285984993,
0.12468189001083374,
0.08850894123315811,
0.25036174058914185,
-0.13312506675720215,
-0.04953673854470253,
-0.06366101652383804,
0.0467824712395668,
-0.06575153768062592,
0.049131881445646286,
0.013948683626949787,
-0.06580666452646255,
-0.010579888708889484,
0.1309019923210144,
-0.1290314793586731,
0.03941289708018303,
0.06998421251773834,
-0.12955467402935028,
-0.03608769550919533,
-0.045644134283065796,
0.11567727476358414,
0.031160173937678337,
0.08868777006864548,
-0.05869605019688606,
-0.10865940898656845,
0.04806296527385712,
0.05674128979444504,
-0.2633470296859741,
-0.08255340158939362,
0.14011400938034058,
0.08798375725746155,
0.03361082077026367,
-0.04807840660214424,
0.06403250992298126,
0.07459012418985367,
0.06269991397857666,
-0.04021207243204117,
0.07417741417884827,
0.059080976992845535,
-0.028697647154331207,
-0.06297820061445236,
-0.11308249086141586,
0.024638449773192406,
-0.0658857524394989,
0.02725204825401306,
-0.1673862338066101,
0.0688876137137413,
-0.017041780054569244,
-0.05683276057243347,
-0.04214475676417351,
0.08546668291091919,
-0.04455987364053726,
0.08360309153795242,
0.029924502596259117,
0.00001848252759373281,
-0.04375547543168068,
-0.0745522752404213,
-0.025419410318136215,
0.10643206536769867,
-0.10300762206315994,
-0.05566250905394554,
-0.02315659262239933,
-0.04442897066473961,
0.0038731112144887447,
-0.010689720511436462,
-0.10320927202701569,
-0.05836177617311478,
-0.07618073374032974,
0.016821207478642464,
-0.1454089730978012,
0.05880630388855934,
0.05806706100702286,
0.041929736733436584,
0.006459210533648729,
-0.009131353348493576,
0.0034616088960319757,
0.043670497834682465,
-0.1203746348619461,
-0.047705162316560745
] |
null | null |
flair
|
# Triple E - Effective Ensembling of Embeddings and Language Models for NER of Historical German
Based on [our paper](http://ceur-ws.org/Vol-2696/paper_173.pdf) we release a new baseline model for the German
[CLEF-HIPE shared task](https://impresso.github.io/CLEF-HIPE-2020/).
In contrast to the models used in the paper, we manually sentence-segmented the data, normalized hyphenations, and
trained a NER model using the German Europeana BERT model.
Additionally, we perform experiments with different context sizes. This approach is described in
more detail in [this paper](https://arxiv.org/abs/2011.06993).
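As a rough illustration of what a context size of *n* means in these experiments, each sentence is extended with up to *n* tokens of neighboring document context on each side. The sketch below is schematic; the exact construction is described in the paper linked above:
```python
def with_document_context(sentences, index, size=16):
    """Extend sentence `index` with up to `size` tokens of left/right
    document context. Schematic only; the exact construction used in
    the experiments is described in the paper referenced above."""
    left = sentences[index - 1].split()[-size:] if index > 0 else []
    right = sentences[index + 1].split()[:size] if index + 1 < len(sentences) else []
    return " ".join(left + sentences[index].split() + right)

document = [
    "Erste Meldung aus Zürich .",
    "Herr Oberst Brunner reist ab .",
    "Weitere Nachrichten folgen .",
]
print(with_document_context(document, 1, size=4))
```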
# Results
The results with different context sizes can be seen in the following table:
| Model | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Avg.
| -------------------------- | --------------- | --------------- | --------------- | ------------------- | --------------- | ---------------
| German Europeana BERT | (81.45) / 76.92 | (**81.53**) / 77.03 | (80.49) / 77.83 | (80.88) / 77.19 | (81.39) / 77.00 | (81.15 ± 0.45) / 77.19 ± 0.34
| German Europeana BERT (16) | (**82.56**) / 77.38 | (81.19) / 77.76 | (80.99) / 76.34 | (81.27) / 77.70 | (81.28) / 77.22 | (81.46 ± 0.63) / 77.28 ± 0.57
| German Europeana BERT (32) | (**82.04**) / 78.50 | (81.14) / 76.56 | (81.81) / 78.28 | (81.50) / 76.90 | (81.64) / 77.94 | (81.63 ± 0.34) / 77.64 ± 0.86
| German Europeana BERT (64) | (81.21) / 78.39 | (81.27) / 75.98 | (**81.88**) / 78.40 | (81.66) / 77.35 | (81.29) / 76.70 | (81.46 ± 0.29) / 77.36 ± 1.06
| German Europeana BERT (80) | (82.13) / 77.77 | (81.31) / 76.81 | (82.09) / 78.69 | (**82.30**) / 76.79 | (80.65) / 77.10 | (81.70 ± 0.70) / 77.43 ± 0.81
For model upload, we chose the model with the best development score: 82.56 with a context length of 16.
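The uploaded model can be used directly with Flair; a minimal usage sketch, assuming the hub id resolves in
`SequenceTagger.load`:
```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the uploaded tagger from the Hugging Face model hub
tagger = SequenceTagger.load("dbmdz/flair-clef-hipe-german-base")

sentence = Sentence("Herr Oberst Brunner ist nämlich Hauptagent für den Kanton Zürich.")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)
```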
## Comparisons
The following figure shows the results with different context sizes (on the development dataset):

We perform "Almost Stochastic Order" tests as proposed in the
["Deep Dominance - How to Properly Compare Deep Neural Models"](https://www.aclweb.org/anthology/P19-1266/) paper.
The heatmap figure is heavily inspired by the ["CharacterBERT"](https://arxiv.org/abs/2010.10392) paper.

|
{"language": "de", "license": "mit", "tags": ["flair", "token-classification", "sequence-tagger-model"], "widget": [{"text": "Herr Oberst Brunner ist n\u00e4mlich Hauptagent f\u00fcr den Kanton Z\u00fcrich."}]}
|
token-classification
|
dbmdz/flair-clef-hipe-german-base
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"de",
"arxiv:2011.06993",
"arxiv:2010.10392",
"license:mit",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2011.06993",
"2010.10392"
] |
[
"de"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #de #arxiv-2011.06993 #arxiv-2010.10392 #license-mit #region-us
|
Triple E - Effective Ensembling of Embeddings and Language Models for NER of Historical German
==============================================================================================
Based on our paper we release a new baseline model for the German
CLEF-HIPE shared task.
In contrast to the models used in the paper, we manually sentence-segmented the data, normalized hyphenations, and
trained a NER model using the German Europeana BERT model.
Additionally, we perform experiments with different context sizes. This approach is described in
more detail in this paper.
Results
=======
The results with different context sizes can be seen in the following table:
For model upload, we chose the model with the best development score: 82.56 with a context length of 16.
Comparisons
-----------
The following figure shows the results with different context sizes (on the development dataset):
!German CLEF-HIPE Development Results
We perform "Almost Stochastic Order" tests as proposed in the
"Deep Dominance - How to Properly Compare Deep Neural Models" paper.
The heatmap figure is heavily inspired by the "CharacterBERT" paper.
!Almost Stochastic Order Tests on Development set
|
[] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #arxiv-2011.06993 #arxiv-2010.10392 #license-mit #region-us \n"
] |
[
52
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #arxiv-2011.06993 #arxiv-2010.10392 #license-mit #region-us \n"
] |
[
-0.0563078448176384,
0.10515210777521133,
-0.009043658152222633,
0.0810207948088646,
0.04912245273590088,
0.06490633636713028,
0.10326271504163742,
0.08188295364379883,
0.16491946578025818,
0.017774133011698723,
0.13605833053588867,
0.20170719921588898,
0.03309363126754761,
0.07978846877813339,
-0.05460166186094284,
-0.26622435450553894,
0.04030761495232582,
0.023489270359277725,
0.04334169626235962,
0.12417106330394745,
0.0763479471206665,
-0.06543824821710587,
0.03331881761550903,
0.029392298310995102,
-0.007063522003591061,
-0.007447634357959032,
0.012592851184308529,
-0.07775148004293442,
0.1459033191204071,
-0.04222699627280235,
0.0947839617729187,
0.07497310638427734,
0.04028656706213951,
-0.10752652585506439,
0.031145477667450905,
-0.0495687760412693,
-0.08122733980417252,
0.08989090472459793,
0.11230583488941193,
-0.045645732432603836,
0.06822811812162399,
0.09589008241891861,
-0.0026557689998298883,
0.04927695542573929,
-0.19943439960479736,
-0.2373453676700592,
-0.0974649041891098,
0.0741356611251831,
0.08281064033508301,
0.02919297106564045,
0.027973106130957603,
0.13455916941165924,
-0.12952359020709991,
0.019959809258580208,
0.1795593500137329,
-0.35756951570510864,
0.02080991491675377,
0.20194530487060547,
0.05615416541695595,
-0.01484467089176178,
-0.06452538818120956,
0.0256443340331316,
0.06848721206188202,
-0.01141964178532362,
-0.003981675487011671,
-0.053712327033281326,
-0.036270756274461746,
0.06437339633703232,
-0.1185358539223671,
-0.0393860824406147,
0.24574092030525208,
0.030290206894278526,
0.004859799984842539,
0.015738392248749733,
-0.04142433777451515,
-0.06043517217040062,
0.013372587971389294,
0.029423845931887627,
0.035083699971437454,
0.05978718400001526,
0.21953809261322021,
0.08395783603191376,
-0.11052253097295761,
-0.039689693599939346,
-0.16180966794490814,
0.11847461014986038,
-0.002894460689276457,
0.099925696849823,
-0.19419607520103455,
0.04248134046792984,
-0.06382578611373901,
-0.06332281976938248,
0.02492425963282585,
-0.08391205221414566,
0.024421395733952522,
0.01635671779513359,
0.01473448146134615,
0.10046521574258804,
0.06338836252689362,
0.18905414640903473,
0.025411337614059448,
0.06544749438762665,
0.04598573222756386,
0.1281977742910385,
0.04707234352827072,
0.060485582798719406,
0.004592422861605883,
-0.004667214117944241,
-0.009016966447234154,
-0.027363715693354607,
0.015399462543427944,
-0.03386346250772476,
-0.11683350056409836,
-0.022653890773653984,
-0.06114480644464493,
0.10674946755170822,
-0.00596802169457078,
-0.0015709905419498682,
-0.08622577041387558,
-0.028100889176130295,
0.1538192331790924,
-0.03322616592049599,
-0.02317747473716736,
0.05363895371556282,
-0.06068387255072594,
0.009237493388354778,
-0.08050700277090073,
0.016151096671819687,
0.03895017132163048,
0.15466177463531494,
-0.11610545217990875,
0.005581023637205362,
0.01334958802908659,
-0.0703236386179924,
0.04423007369041443,
-0.1733447164297104,
0.019369108602404594,
-0.071658194065094,
-0.10491596162319183,
0.028883785009384155,
0.00254876003600657,
0.003119254717603326,
-0.00226723519153893,
0.03696971386671066,
0.005458826664835215,
-0.020531529560685158,
-0.04692864790558815,
-0.004472560714930296,
-0.09411925822496414,
0.08565674722194672,
-0.08293064683675766,
0.022285040467977524,
-0.1359805166721344,
0.019934888929128647,
-0.12794490158557892,
0.042239878326654434,
-0.07720553874969482,
-0.09758423268795013,
-0.09172037988901138,
0.10464650392532349,
-0.07365375757217407,
-0.08129529654979706,
-0.015407942235469818,
-0.024014310911297798,
-0.0473259761929512,
0.09894687682390213,
-0.15816089510917664,
-0.041708704084157944,
0.03520679101347923,
-0.12380713224411011,
-0.07721911370754242,
0.0015851844800636172,
-0.001342426985502243,
0.007307961583137512,
0.0256964098662138,
0.28757721185684204,
-0.0331684872508049,
-0.060707300901412964,
0.041110485792160034,
0.07968772202730179,
-0.15407836437225342,
-0.155499666929245,
0.13933265209197998,
-0.08048385381698608,
-0.09549237787723541,
0.015562272630631924,
-0.08162026107311249,
-0.014523684047162533,
-0.027661873027682304,
-0.06276488304138184,
0.06398998200893402,
0.0016303977463394403,
0.062272217124700546,
0.042031172662973404,
0.026569722220301628,
-0.0470578595995903,
0.01569506712257862,
-0.051997337490320206,
0.05606437474489212,
0.12127242982387543,
-0.029192278161644936,
-0.05623656138777733,
0.12820686399936676,
0.0544917955994606,
-0.03779589757323265,
-0.14145484566688538,
-0.0420696921646595,
0.03740730509161949,
-0.08936314284801483,
0.008438188582658768,
0.06378795206546783,
0.03325154259800911,
-0.022937370464205742,
0.02667814865708351,
-0.009327098727226257,
0.0995437502861023,
0.024893905967473984,
0.0016090747667476535,
-0.12895962595939636,
-0.0432400107383728,
-0.048409488052129745,
0.017046025022864342,
-0.07553740590810776,
0.009919079020619392,
0.06263519823551178,
0.004748582374304533,
-0.049997348338365555,
0.05629706382751465,
0.033722471445798874,
-0.018549038097262383,
-0.0006828015320934355,
-0.008054668083786964,
0.18611475825309753,
-0.026463976129889488,
-0.013590608723461628,
0.048627156764268875,
-0.06697104871273041,
0.15774492919445038,
0.142355278134346,
-0.11462283134460449,
-0.014737273566424847,
-0.09393324702978134,
-0.05129178613424301,
0.01477438397705555,
0.05095857009291649,
0.06072616949677467,
0.047530658543109894,
0.008461052551865578,
0.06822508573532104,
-0.07140616327524185,
-0.03886280208826065,
-0.000516024767421186,
0.006965554319322109,
-0.03365654498338699,
0.1308116465806961,
0.19182969629764557,
-0.21295079588890076,
0.10204043239355087,
0.21998389065265656,
0.092186339199543,
0.04678763821721077,
-0.04727010801434517,
0.009818422608077526,
-0.03388402238488197,
-0.024618657305836678,
-0.04369238391518593,
0.15352246165275574,
-0.09674780070781708,
-0.002031984506174922,
0.08152575045824051,
0.02146572433412075,
0.021137161180377007,
-0.16247078776359558,
-0.10999467223882675,
-0.05323997884988785,
-0.0036491400096565485,
-0.221043661236763,
0.07444189488887787,
-0.005348741542547941,
0.12202578037977219,
-0.00005029354724683799,
-0.17769084870815277,
0.054637763649225235,
-0.014529628679156303,
-0.051494769752025604,
0.1768159568309784,
-0.07480794936418533,
-0.15218348801136017,
-0.09561456739902496,
-0.10138940811157227,
-0.007161407731473446,
-0.003940205555409193,
0.03987216204404831,
-0.061861686408519745,
-0.03409691900014877,
0.037894826382398605,
0.0013953695306554437,
-0.19211269915103912,
-0.015918342396616936,
-0.05992778018116951,
0.029690686613321304,
-0.05278991535305977,
-0.1147688552737236,
-0.05858182907104492,
-0.11417146772146225,
0.06780138611793518,
0.10573441535234451,
-0.031636662781238556,
0.044618990272283554,
0.18121139705181122,
0.011155764572322369,
0.05011181905865669,
-0.058606717735528946,
0.16083534061908722,
-0.018986690789461136,
0.004360448569059372,
0.1730497181415558,
-0.04211637005209923,
0.06484804302453995,
0.17956675589084625,
0.10661450773477554,
-0.08621733635663986,
-0.06172779202461243,
-0.019596535712480545,
-0.07922996580600739,
-0.26320168375968933,
-0.056535474956035614,
-0.0818139985203743,
0.1216408982872963,
0.06763172149658203,
0.08138127624988556,
0.08012347668409348,
0.04452105984091759,
0.04601528123021126,
-0.08175313472747803,
-0.01254289597272873,
0.042404718697071075,
0.2438734769821167,
-0.0002041884872596711,
0.059861719608306885,
-0.061126142740249634,
-0.022299032658338547,
0.08012102544307709,
0.054957080632448196,
0.14375898241996765,
0.15603387355804443,
0.054717421531677246,
0.11233653128147125,
0.1165047138929367,
0.12870723009109497,
0.08379057794809341,
0.0003084199852310121,
-0.0018766217399388552,
-0.07185497134923935,
-0.031614623963832855,
0.07490842044353485,
-0.00020039135415572673,
-0.016287226229906082,
-0.09268022328615189,
-0.016934340819716454,
-0.1568736582994461,
0.013942689634859562,
-0.0007530525326728821,
0.13685445487499237,
-0.17635361850261688,
0.0470581017434597,
0.040112581104040146,
0.06296009570360184,
-0.049263570457696915,
0.11299041658639908,
-0.14817366003990173,
-0.08878903090953827,
0.0852174237370491,
-0.018512984737753868,
0.10421459376811981,
0.000038706431951140985,
0.058487627655267715,
0.04391293600201607,
-0.044276442378759384,
0.005715749226510525,
0.0999278873205185,
-0.12496389448642731,
0.3058242201805115,
0.003969553392380476,
-0.08473200350999832,
0.0018687498522922397,
-0.0515616200864315,
0.05362587794661522,
0.08719082921743393,
0.17519019544124603,
0.02623424492776394,
-0.110385000705719,
-0.14807793498039246,
-0.06125480681657791,
-0.013386680744588375,
0.04066105931997299,
-0.05610939487814903,
-0.04766750708222389,
-0.00952575821429491,
0.03003796562552452,
-0.00858771987259388,
0.007008693180978298,
-0.017383072525262833,
-0.015194679610431194,
0.06739837676286697,
-0.009380018338561058,
-0.031054969877004623,
0.0032760449685156345,
-0.061117492616176605,
-0.11122164875268936,
0.03714461997151375,
-0.0994136705994606,
-0.05657174438238144,
-0.08641140908002853,
-0.12600822746753693,
0.11287498474121094,
-0.07056880742311478,
-0.012599499896168709,
-0.07833297550678253,
-0.08083439618349075,
-0.06153715029358864,
-0.14294035732746124,
0.11775898933410645,
-0.047040242701768875,
-0.043135177344083786,
-0.07804103195667267,
0.1773461401462555,
-0.057099003344774246,
0.04713589698076248,
-0.000717479211743921,
0.0683787539601326,
-0.06027870997786522,
-0.13132283091545105,
0.1197068840265274,
-0.1358577460050583,
0.05640440806746483,
-0.008734929375350475,
-0.014885135926306248,
-0.018267275765538216,
-0.03886779397726059,
0.012642467394471169,
0.14194844663143158,
0.3565099239349365,
-0.07684326171875,
0.17764367163181305,
0.19977261126041412,
-0.06465264409780502,
-0.17193184792995453,
-0.0922488123178482,
-0.1520288735628128,
-0.08243269473314285,
0.05219709500670433,
-0.1635904610157013,
0.08901501446962357,
0.127082958817482,
-0.097600556910038,
0.20874600112438202,
-0.2869577407836914,
-0.050843048840761185,
0.2057722806930542,
-0.020633842796087265,
0.29970845580101013,
-0.10498477518558502,
-0.11059895902872086,
0.0434577502310276,
-0.23734204471111298,
0.13215024769306183,
0.10984809696674347,
0.048747919499874115,
-0.09057794511318207,
-0.048318710178136826,
-0.007957296445965767,
-0.03205892816185951,
0.18204988539218903,
0.08913397789001465,
0.09791199117898941,
-0.051727138459682465,
-0.2547423839569092,
0.15347030758857727,
0.016478221863508224,
0.03758176043629646,
0.019183512777090073,
0.011290901340544224,
-0.20033061504364014,
-0.007011170033365488,
-0.07604056596755981,
0.08505403995513916,
-0.009057200513780117,
-0.08098659664392471,
-0.09879573434591293,
0.028038563206791878,
-0.08413060754537582,
-0.035771485418081284,
0.186213880777359,
0.018779000267386436,
0.08693445473909378,
0.03628528118133545,
0.030057741329073906,
-0.06110873445868492,
-0.14352409541606903,
-0.0833681970834732,
-0.04961613938212395,
0.07002007216215134,
-0.09829910099506378,
-0.048330407589673996,
0.17804639041423798,
0.04729348421096802,
0.051345061510801315,
0.09919936209917068,
-0.0020905754063278437,
0.001646137679927051,
0.11557446420192719,
-0.15620313584804535,
0.0024526219349354506,
0.05623030290007591,
-0.033378440886735916,
0.10282574594020844,
0.04442232847213745,
0.025764642283320427,
0.03444993495941162,
-0.027652058750391006,
0.04929839447140694,
0.023249369114637375,
-0.014578210189938545,
0.0467975027859211,
0.04581412300467491,
0.02757432498037815,
-0.11982230097055435,
0.12201384454965591,
0.08592043817043304,
0.008455195464193821,
-0.11105328053236008,
0.11124873906373978,
-0.10735438764095306,
-0.0563528910279274,
-0.12732478976249695,
0.051384855061769485,
-0.15990373492240906,
-0.035938020795583725,
-0.06880786269903183,
-0.13482818007469177,
0.02636641263961792,
0.12662045657634735,
0.08617027848958969,
0.06909815967082977,
-0.03741433843970299,
-0.06327398866415024,
0.05427595227956772,
-0.022870760411024094,
-0.06391079723834991,
0.005382999312132597,
-0.12874233722686768,
0.004311658907681704,
0.03848477080464363,
0.05572628602385521,
-0.04456741735339165,
-0.07544893771409988,
-0.13236089050769806,
0.03664584830403328,
-0.03977947682142258,
0.037367481738328934,
-0.0683712586760521,
0.00665292190387845,
-0.02686341293156147,
-0.029342463240027428,
-0.07002869993448257,
-0.027402030304074287,
-0.10961712896823883,
0.046882398426532745,
0.03944599628448486,
0.10703493654727936,
-0.08279093354940414,
-0.046005092561244965,
0.06139355152845383,
0.01940738968551159,
0.09613644331693649,
0.06938613951206207,
0.020296823233366013,
0.06055670231580734,
-0.07455645501613617,
-0.06900201737880707,
0.09000851958990097,
0.012739804573357105,
-0.004984190221875906,
-0.058745767921209335,
-0.010271136648952961,
0.010825373232364655,
-0.09091121703386307,
0.05025602504611015,
-0.07882160693407059,
-0.11752153933048248,
-0.08823872357606888,
-0.024006178602576256,
-0.1397048979997635,
0.0062029710970819,
-0.1407189667224884,
0.17052237689495087,
0.029391659423708916,
0.10914456844329834,
0.06391694396734238,
0.033501412719488144,
-0.12456657737493515,
-0.015495543368160725,
-0.041077110916376114,
-0.14834022521972656,
-0.17757388949394226,
-0.018492039293050766,
-0.007546104025095701,
-0.030978139489889145,
0.23498345911502838,
0.04795447364449501,
-0.14633749425411224,
0.022499481216073036,
0.20379188656806946,
-0.05125042051076889,
-0.0014806005638092756,
0.179372176527977,
0.0533006489276886,
-0.03294584900140762,
-0.038632214069366455,
0.06442372500896454,
-0.06579188257455826,
-0.03145845606923103,
0.21160607039928436,
0.09975289553403854,
0.13410362601280212,
-0.027668938040733337,
0.08580487221479416,
-0.06562159955501556,
-0.12120801955461502,
0.03262929990887642,
0.10581761598587036,
0.03127109631896019,
0.0645056739449501,
0.11539504677057266,
0.16233134269714355,
-0.035217564553022385,
0.06760273873806,
-0.0724213644862175,
0.005048729479312897,
-0.13851062953472137,
-0.12794573605060577,
0.012474359013140202,
-0.030433189123868942,
0.017016449943184853,
-0.02847040630877018,
-0.004052812699228525,
0.22189131379127502,
0.042603105306625366,
-0.012114294804632664,
-0.09424228221178055,
-0.09167801588773727,
-0.06023568660020828,
0.030774224549531937,
-0.0012885996839031577,
-0.012131338007748127,
-0.08190707862377167,
-0.0016055690357461572,
0.008930725045502186,
-0.13193680346012115,
-0.04062245413661003,
-0.04743748530745506,
0.010550186969339848,
-0.032251935452222824,
-0.09842567145824432,
-0.08209823071956635,
-0.016724437475204468,
0.05503595247864723,
0.014710110612213612,
0.12319723516702652,
0.02045901119709015,
0.056793469935655594,
0.05322214588522911,
0.10784197598695755,
0.051314711570739746,
-0.017587432637810707,
0.05102500319480896,
0.23758451640605927,
-0.03513779118657112,
0.04427115246653557,
0.031742360442876816,
0.001792851253412664,
0.01533605344593525,
0.16220739483833313,
0.23394645750522614,
-0.03954160958528519,
-0.03155648335814476,
0.027600349858403206,
0.005177213344722986,
0.11734408885240555,
0.08411877602338791,
0.02868669666349888,
0.27021610736846924,
-0.07857699692249298,
-0.013272741809487343,
-0.11380614340305328,
0.04319504275918007,
-0.028738563880324364,
0.07379627227783203,
0.11217040568590164,
-0.06696873903274536,
-0.08717475831508636,
0.13576774299144745,
-0.1091509684920311,
0.06939645111560822,
0.06550497561693192,
-0.12873488664627075,
-0.08939999341964722,
-0.02322537824511528,
0.0889277532696724,
0.0699010118842125,
0.07550239562988281,
-0.10142473131418228,
-0.07409577816724777,
-0.03864089027047157,
0.036604929715394974,
-0.2596588432788849,
-0.14291474223136902,
0.09211225062608719,
0.04618335887789726,
0.14304278790950775,
-0.03404632583260536,
0.15231001377105713,
0.051755499094724655,
0.03565370291471481,
-0.07828564941883087,
0.046043310314416885,
0.041402239352464676,
0.008987818844616413,
-0.032545723021030426,
-0.13195361196994781,
0.04466710239648819,
-0.09383513033390045,
0.05656823888421059,
-0.04376760870218277,
0.034144818782806396,
0.09041785448789597,
-0.11197429150342941,
-0.02267174795269966,
0.06828861683607101,
-0.05162971466779709,
0.0330134816467762,
0.04934067651629448,
0.0010419541504234076,
-0.10297499597072601,
-0.053618740290403366,
-0.034668970853090286,
0.08327276259660721,
-0.18861854076385498,
-0.06307519972324371,
0.060928571969270706,
0.003795573953539133,
0.005920052528381348,
-0.026190323755145073,
-0.03905174136161804,
-0.03842621296644211,
-0.12523777782917023,
0.018141448497772217,
-0.09188736975193024,
0.0070117139257490635,
0.016129424795508385,
0.018544292077422142,
0.008254759944975376,
-0.029355401173233986,
0.0329211950302124,
0.03299769386649132,
-0.08168184757232666,
-0.08217831701040268
] |
null | null |
flair
|
# Flair NER model trained on GermEval14 dataset
This model was trained on the official [GermEval14](https://sites.google.com/site/germeval2014ner/data)
dataset using the [Flair](https://github.com/flairNLP/flair) framework.
It uses a fine-tuned German DistilBERT model from [here](https://huggingface.co/distilbert-base-german-cased).
# Results
| Dataset \ Run | Run 1 | Run 2 | Run 3† | Run 4 | Run 5 | Avg.
| ------------- | ----- | ----- | --------- | ----- | ----- | ----
| Development | 87.05 | 86.52 | **87.34** | 86.85 | 86.46 | 86.84
| Test | 85.43 | 85.88 | 85.72 | 85.47 | 85.62 | 85.62
† denotes that this model is selected for upload.
# Flair Fine-Tuning
We used the following script to fine-tune the model on the GermEval14 dataset:
```python
from argparse import ArgumentParser

import torch, flair

# dataset, model and embedding imports
from flair.datasets import GERMEVAL_14
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

if __name__ == "__main__":

    # All arguments that can be passed
    parser = ArgumentParser()
    parser.add_argument("-s", "--seeds", nargs='+', type=int, default=[42])  # pass list of seeds for experiments
    parser.add_argument("-c", "--cuda", type=int, default=0, help="CUDA device")  # which cuda device to use
    parser.add_argument("-m", "--model", type=str, help="Model name (such as Hugging Face model hub name)")

    # Parse experimental arguments
    args = parser.parse_args()

    # use cuda device as passed
    flair.device = f'cuda:{args.cuda}'

    # for each passed seed, do one experimental run
    for seed in args.seeds:
        flair.set_seed(seed)

        # model
        hf_model = args.model

        # initialize embeddings
        embeddings = TransformerWordEmbeddings(
            model=hf_model,
            layers="-1",
            subtoken_pooling="first",
            fine_tune=True,
            use_context=False,
            respect_document_boundaries=False,
        )

        # select dataset
        corpus = GERMEVAL_14()

        # make the dictionary of tags to predict
        tag_dictionary = corpus.make_tag_dictionary('ner')

        # init bare-bones sequence tagger (no reprojection, LSTM or CRF)
        tagger: SequenceTagger = SequenceTagger(
            hidden_size=256,
            embeddings=embeddings,
            tag_dictionary=tag_dictionary,
            tag_type='ner',
            use_crf=False,
            use_rnn=False,
            reproject_embeddings=False,
        )

        # init the model trainer
        trainer = ModelTrainer(tagger, corpus, optimizer=torch.optim.AdamW)

        # make string for output folder
        output_folder = f"flert-ner-{hf_model}-{seed}"

        # train with fine-tuning parameters (AdamW, 10 epochs, small LR, one-cycle schedule)
        from torch.optim.lr_scheduler import OneCycleLR

        trainer.train(
            output_folder,
            learning_rate=5.0e-5,
            mini_batch_size=16,
            mini_batch_chunk_size=1,
            max_epochs=10,
            scheduler=OneCycleLR,
            embeddings_storage_mode='none',
            weight_decay=0.,
            train_with_dev=False,
        )
```
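A hypothetical invocation (the script filename is an assumption) runs five seeds on one GPU:
`python train.py --seeds 42 43 44 45 46 --cuda 0 --model distilbert-base-german-cased`.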
|
{"language": "de", "license": "mit", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["germeval_14"], "widget": [{"text": "Hugging Face ist eine franz\u00f6sische Firma mit Sitz in New York."}]}
|
token-classification
|
stefan-it/flair-distilbert-ner-germeval14
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"de",
"dataset:germeval_14",
"license:mit",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #de #dataset-germeval_14 #license-mit #region-us
|
Flair NER model trained on GermEval14 dataset
=============================================
This model was trained on the official GermEval14
dataset using the Flair framework.
It uses a fine-tuned German DistilBERT model from here.
Results
=======
† denotes that this model is selected for upload.
Flair Fine-Tuning
=================
We used the following script to fine-tune the model on the GermEval14 dataset:
|
[] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #dataset-germeval_14 #license-mit #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #dataset-germeval_14 #license-mit #region-us \n"
] |
[
-0.07507196068763733,
0.11758723109960556,
-0.008122032508254051,
0.08862777799367905,
0.07400777190923691,
0.07692361623048782,
0.1076003909111023,
0.07553325593471527,
0.1764901876449585,
-0.01635214313864708,
0.11591225862503052,
0.1243429109454155,
0.01850699633359909,
0.09618467837572098,
-0.043607067316770554,
-0.30852219462394714,
0.04781952500343323,
-0.0008668669615872204,
0.027306580916047096,
0.11037064343690872,
0.07907847315073013,
-0.06282747536897659,
0.06081829592585564,
0.0065088351257145405,
-0.027454743161797523,
0.0066041662357747555,
0.007323476020246744,
-0.05593441054224968,
0.14237619936466217,
-0.058104753494262695,
0.07060164958238602,
0.02574647031724453,
0.046686023473739624,
-0.11027169972658157,
0.03369855508208275,
-0.03570932522416115,
-0.09046529978513718,
0.08619876950979233,
0.0845351442694664,
-0.06612168997526169,
0.07188724726438522,
0.04330364987254143,
-0.011741328053176403,
0.04600786045193672,
-0.19426293671131134,
-0.21513068675994873,
-0.09561798721551895,
0.057740408927202225,
0.045401524752378464,
0.025311311706900597,
0.014048145152628422,
0.07847321033477783,
-0.17167997360229492,
0.01620909571647644,
0.16779734194278717,
-0.3126699924468994,
0.014657366089522839,
0.2328237146139145,
0.06048833206295967,
0.016001371666789055,
-0.10237651318311691,
0.05169491097331047,
0.04731956124305725,
-0.004486424382776022,
-0.005391246173530817,
-0.04444347694516182,
-0.020426858216524124,
0.09706757217645645,
-0.09590508788824081,
-0.048246532678604126,
0.2689959704875946,
0.013884666375815868,
0.030708471313118935,
0.015893040224909782,
-0.04927601292729378,
-0.0224363561719656,
0.00290597858838737,
0.049154531210660934,
0.018449287861585617,
0.07401107251644135,
0.1748785525560379,
0.034978751093149185,
-0.09000413864850998,
-0.048837918788194656,
-0.17488200962543488,
0.12628008425235748,
0.004328469280153513,
0.09370901435613632,
-0.19107259809970856,
0.05114637315273285,
-0.08784867078065872,
-0.04989420250058174,
0.001779286190867424,
-0.09318055957555771,
-0.030261123552918434,
-0.000021616928279399872,
-0.007358173374086618,
0.1480056494474411,
0.11211258172988892,
0.21372370421886444,
-0.035528719425201416,
0.02556704916059971,
0.07341313362121582,
0.11063367128372192,
0.08809337764978409,
0.02308829128742218,
-0.045489877462387085,
-0.024468103423714638,
-0.00377841480076313,
-0.10587344318628311,
-0.006282556802034378,
-0.04849206283688545,
-0.098926842212677,
-0.019203895702958107,
-0.06975904852151871,
0.08731534332036972,
-0.03164837881922722,
0.013470166362822056,
-0.08849948644638062,
-0.03597615286707878,
0.0924663320183754,
-0.04851803556084633,
0.02960566245019436,
0.06524822115898132,
-0.052853088825941086,
0.008062890730798244,
-0.09909343719482422,
-0.0022328693885356188,
0.03272653743624687,
0.12108878046274185,
-0.12062331289052963,
0.003064037300646305,
0.0030892633367329836,
-0.0665937140583992,
0.04680343344807625,
-0.16888539493083954,
0.026572683826088905,
-0.03655974939465523,
-0.1839330792427063,
0.03974629193544388,
0.03979524224996567,
-0.014334629289805889,
0.024171948432922363,
0.0278775617480278,
-0.002121533500030637,
-0.012086843140423298,
-0.03292601928114891,
-0.04070817679166794,
-0.10391039401292801,
0.07214247435331345,
-0.1163875088095665,
0.021655231714248657,
-0.14412318170070648,
0.030996540561318398,
-0.1322912871837616,
0.038911186158657074,
-0.07091957330703735,
-0.04403049871325493,
-0.10033199936151505,
0.10053323954343796,
-0.05756550654768944,
-0.08249463886022568,
0.004653891082853079,
-0.019696980714797974,
-0.06280099600553513,
0.10435236245393753,
-0.15428699553012848,
-0.06593813747167587,
0.07654120773077011,
-0.12192050367593765,
-0.07042036205530167,
0.035446759313344955,
-0.02152455784380436,
0.003571102162823081,
0.03759203851222992,
0.2934022843837738,
0.019020671024918556,
-0.03919116035103798,
0.04389205947518349,
0.1171603873372078,
-0.1524016559123993,
-0.13474230468273163,
0.13470689952373505,
-0.05561574175953865,
-0.10717139393091202,
0.030410056933760643,
-0.08031919598579407,
0.005198584869503975,
-0.052779119461774826,
-0.07590574771165848,
0.04749791696667671,
0.019697202369570732,
0.09893545508384705,
0.046068351715803146,
0.04522999748587608,
-0.04512603208422661,
0.03654072806239128,
-0.003285465994849801,
0.05815718695521355,
0.08437911421060562,
-0.03359741345047951,
-0.04219524562358856,
0.11655092239379883,
0.07137144356966019,
-0.018120061606168747,
-0.1332479864358902,
-0.09733989089727402,
0.004930638242512941,
-0.06424109637737274,
-0.026404596865177155,
0.07738950103521347,
0.048378098756074905,
-0.01588561199605465,
0.011581800878047943,
0.025824418291449547,
0.07964759320020676,
0.03123699128627777,
0.01298994105309248,
-0.16462524235248566,
-0.037201810628175735,
-0.05924801900982857,
0.04737461730837822,
-0.07426830381155014,
0.0042983065359294415,
0.10051214694976807,
0.026320388540625572,
-0.054948318749666214,
0.03730173036456108,
0.01855890452861786,
-0.024079861119389534,
-0.008516919799149036,
-0.026252299547195435,
0.15632812678813934,
-0.027502253651618958,
-0.012654268182814121,
0.02880481444299221,
-0.015702329576015472,
0.178277388215065,
0.13867168128490448,
-0.0781170204281807,
0.008038618601858616,
-0.10861054062843323,
-0.05946208909153938,
0.013779397122561932,
0.0015534918056800961,
0.03237926959991455,
0.008873548358678818,
-0.005358332302421331,
0.06137150898575783,
-0.05191963538527489,
-0.054303672164678574,
0.017107641324400902,
0.004920761566609144,
-0.03233487904071808,
0.12413207441568375,
0.1780604124069214,
-0.21204574406147003,
0.15256541967391968,
0.1534471958875656,
0.0932781919836998,
0.08746924251317978,
-0.068412184715271,
0.027791902422904968,
-0.02828889526426792,
-0.0389757864177227,
-0.05946442484855652,
0.19114457070827484,
-0.17698006331920624,
-0.007871237583458424,
0.08098921924829483,
0.023962872102856636,
0.02892681397497654,
-0.13615833222866058,
-0.11458023637533188,
-0.06007007136940956,
-0.023000888526439667,
-0.20649218559265137,
0.08378186076879501,
-0.010427347384393215,
0.08963040262460709,
-0.02037670463323593,
-0.18777577579021454,
0.08157563954591751,
-0.014081597328186035,
-0.04488542675971985,
0.1787485033273697,
-0.0937943086028099,
-0.12089788168668747,
-0.061586085706949234,
-0.14834454655647278,
-0.04107089340686798,
-0.006430478300899267,
0.04616312310099602,
-0.0658549889922142,
-0.01680484414100647,
0.04887337610125542,
0.014715391211211681,
-0.1744026094675064,
0.0010938309133052826,
-0.08806291967630386,
0.010429133661091328,
-0.07805543392896652,
-0.11781015247106552,
-0.04628048464655876,
-0.10090979188680649,
0.06620129197835922,
0.1206948384642601,
-0.051381781697273254,
0.05104319378733635,
0.1692671775817871,
0.013110383413732052,
0.06671204417943954,
-0.06432870030403137,
0.21834810078144073,
-0.02462623082101345,
0.007929927669465542,
0.1002979651093483,
-0.0014569908380508423,
0.056046683341264725,
0.16519200801849365,
0.12200471758842468,
-0.08625523000955582,
-0.07401234656572342,
0.0022390279918909073,
-0.09040083736181259,
-0.257323682308197,
-0.10196904093027115,
-0.09823057800531387,
0.11580527573823929,
0.06841035932302475,
0.08048108965158463,
0.06509571522474289,
0.037096381187438965,
0.04952801764011383,
-0.06366536766290665,
0.004380345344543457,
0.01529346127063036,
0.20252861082553864,
0.005364769604057074,
0.02976849488914013,
-0.03946441784501076,
-0.04220152273774147,
0.10593321919441223,
0.062314778566360474,
0.1706233024597168,
0.14900653064250946,
0.08387952297925949,
0.10667800903320312,
0.10809691995382309,
0.09638884663581848,
0.09059041738510132,
-0.00874179508537054,
0.0049047647044062614,
-0.046953048557043076,
-0.030357936397194862,
0.10805106908082962,
-0.013414260931313038,
-0.025302348658442497,
-0.11308006197214127,
-0.018411042168736458,
-0.1583324819803238,
0.07092302292585373,
0.007891801185905933,
0.12466464191675186,
-0.1709207147359848,
0.025585105642676353,
0.04515567794442177,
0.0679258331656456,
-0.042924489825963974,
0.10044356435537338,
-0.07594283670186996,
-0.08623374253511429,
0.08372196555137634,
-0.02627643384039402,
0.11858689039945602,
0.025134870782494545,
0.05467770993709564,
0.06491319090127945,
-0.008137091062963009,
0.003639375790953636,
0.11557841300964355,
-0.10034144669771194,
0.3043111562728882,
-0.005371627863496542,
-0.05904040113091469,
-0.019802095368504524,
-0.0535343773663044,
0.06416500359773636,
0.14878040552139282,
0.1967487335205078,
0.06644881516695023,
-0.08428218215703964,
-0.12057080864906311,
-0.07155945897102356,
0.0012556050205603242,
0.037498589605093,
-0.08301421254873276,
-0.06825145334005356,
0.00037347772740758955,
0.026718564331531525,
-0.007998376153409481,
0.001818408607505262,
-0.04158744961023331,
-0.044560324400663376,
0.038315996527671814,
0.00754062132909894,
-0.01989569328725338,
0.00961230881512165,
-0.07495323568582535,
-0.14007796347141266,
0.0963437631726265,
-0.08873911947011948,
-0.05938233807682991,
-0.11011052131652832,
-0.056903187185525894,
0.10317809134721756,
-0.05315546318888664,
-0.007505394052714109,
-0.062362611293792725,
-0.043411463499069214,
-0.07457835972309113,
-0.1656867265701294,
0.1216273307800293,
-0.05616779625415802,
-0.013987693935632706,
-0.07415725290775299,
0.1619279831647873,
-0.012139148078858852,
0.03408844769001007,
0.012842881493270397,
0.05661371350288391,
-0.06989417225122452,
-0.16708724200725555,
0.10930993407964706,
-0.07516325265169144,
0.0469701923429966,
0.024468258023262024,
-0.002801511436700821,
0.01061274018138647,
0.025315633043646812,
0.02641671895980835,
0.15286476910114288,
0.26703205704689026,
-0.09117969870567322,
0.1876484900712967,
0.19014029204845428,
-0.07133356481790543,
-0.2177620530128479,
-0.0874597504734993,
-0.17175394296646118,
-0.06718602031469345,
0.03694188594818115,
-0.20842550694942474,
0.12980858981609344,
0.13131894171237946,
-0.08731415122747421,
0.1382862776517868,
-0.2989412248134613,
-0.038987692445516586,
0.19053132832050323,
-0.031459707766771317,
0.35172176361083984,
-0.06712016463279724,
-0.11658331006765366,
0.022103844210505486,
-0.17502830922603607,
0.15264780819416046,
-0.010481387376785278,
0.07482290267944336,
-0.07617851346731186,
-0.027618825435638428,
-0.022201424464583397,
-0.038718331605196,
0.19309628009796143,
0.14715643227100372,
0.09438732266426086,
-0.028030822053551674,
-0.1922364979982376,
0.2374708652496338,
0.016621405258774757,
0.052359700202941895,
-0.000986547558568418,
0.027203774079680443,
-0.20003674924373627,
-0.009360763244330883,
-0.07980724424123764,
0.11344573646783829,
-0.002103053033351898,
-0.08981701731681824,
-0.0931774154305458,
0.05607790872454643,
-0.07861080020666122,
-0.030646832659840584,
0.17481477558612823,
0.05645538866519928,
0.04625024273991585,
-0.03970824182033539,
0.004881287459284067,
-0.06257820129394531,
-0.14896520972251892,
-0.08603023737668991,
-0.03476868197321892,
0.07722082734107971,
-0.10015197843313217,
-0.039378393441438675,
0.1754692792892456,
0.05169420316815376,
0.0642777755856514,
0.11251673102378845,
-0.0195628572255373,
-0.003178395563736558,
0.12380754947662354,
-0.17616789042949677,
-0.08700460195541382,
0.03213897719979286,
-0.10133301466703415,
0.09552442282438278,
0.08500385284423828,
0.041600070893764496,
0.04187287762761116,
-0.028742970898747444,
0.04217633232474327,
0.034168943762779236,
-0.04334311559796333,
0.05905783176422119,
0.04886410012841225,
0.028534313663840294,
-0.16771550476551056,
0.12924696505069733,
0.0836494192481041,
0.01304288674145937,
-0.09359397739171982,
0.12597094476222992,
-0.13599659502506256,
-0.082666777074337,
-0.09503815323114395,
0.12419593334197998,
-0.13727034628391266,
-0.03610743209719658,
-0.05030402913689613,
-0.13328216969966888,
0.036736950278282166,
0.09187354892492294,
0.10168690234422684,
0.10463345050811768,
-0.04191374406218529,
-0.07119254022836685,
0.05332634970545769,
-0.00824920553714037,
-0.043435122817754745,
0.009370097890496254,
-0.12093460559844971,
-0.05445778742432594,
0.016035571694374084,
0.08298888057470322,
-0.0528891421854496,
-0.09770279377698898,
-0.15176665782928467,
0.025379247963428497,
-0.06507737189531326,
0.004062949679791927,
-0.08493759483098984,
0.015459083020687103,
-0.021720191463828087,
-0.032188717275857925,
-0.03735516220331192,
-0.02551780641078949,
-0.10150863975286484,
0.04611130431294441,
0.047671083360910416,
0.12454056739807129,
-0.12764930725097656,
-0.028259821236133575,
0.06610022485256195,
0.006715618539601564,
0.14778123795986176,
0.09936577081680298,
0.005112472455948591,
0.04078288376331329,
-0.13920405507087708,
-0.09706759452819824,
0.07542199641466141,
0.004961037077009678,
0.05017814412713051,
-0.05819320306181908,
-0.004045476671308279,
0.011557920835912228,
-0.06744765490293503,
0.06865799427032471,
-0.09737231582403183,
-0.11836109310388565,
-0.04135293513536453,
-0.04910288378596306,
-0.1307821422815323,
0.002949042245745659,
-0.1260383576154709,
0.17610634863376617,
0.03057613968849182,
0.09656456857919693,
0.06027364730834961,
0.04464717581868172,
-0.11682229489088058,
-0.02665499784052372,
-0.02885359339416027,
-0.16097219288349152,
-0.15235023200511932,
0.0037672340404242277,
-0.004502039402723312,
-0.022793730720877647,
0.3060609996318817,
0.11246690899133682,
-0.13563700020313263,
0.002787657780572772,
0.18344788253307343,
-0.038079068064689636,
0.013775628991425037,
0.2267565280199051,
0.05630342662334442,
-0.03750615194439888,
-0.03366662561893463,
0.0674322247505188,
-0.04302293062210083,
0.046973783522844315,
0.19037438929080963,
0.07432720810174942,
0.13296523690223694,
-0.04824068769812584,
0.08240962773561478,
-0.0676080659031868,
-0.14667144417762756,
0.017584331333637238,
0.10307874530553818,
0.03869342431426048,
0.04739908501505852,
0.023115552961826324,
0.13382486999034882,
-0.0820433497428894,
0.08954856544733047,
-0.045209724456071854,
-0.02681814134120941,
-0.13201768696308136,
-0.16892217099666595,
0.0014560874551534653,
-0.07781472057104111,
0.023647205904126167,
-0.06272593885660172,
-0.028448058292269707,
0.19749338924884796,
0.05660315230488777,
-0.006981587503105402,
-0.06871715933084488,
-0.048355091363191605,
-0.019745776429772377,
0.02409420721232891,
-0.012545380741357803,
-0.014623564667999744,
-0.04784896969795227,
-0.0041677821427583694,
-0.006547580007463694,
-0.11369558423757553,
-0.053013112396001816,
-0.023409703746438026,
0.03237812966108322,
-0.0004608525487128645,
-0.13459397852420807,
-0.07387957721948624,
-0.03218545764684677,
0.03243844211101532,
-0.030482830479741096,
0.0850018858909607,
0.0332457460463047,
0.03216824680566788,
0.06268618255853653,
0.1339084953069687,
0.042782559990882874,
-0.0284784734249115,
0.014931726269423962,
0.25066253542900085,
-0.007465755101293325,
0.04912785068154335,
0.0345013402402401,
-0.01297119539231062,
0.006664158310741186,
0.14810453355312347,
0.28793832659721375,
-0.03275721147656441,
-0.019451798871159554,
0.013199321925640106,
0.0016893272986635566,
0.13842801749706268,
0.07541850209236145,
-0.0007605875725857913,
0.24718905985355377,
-0.06845761090517044,
0.0005042478442192078,
-0.08773811906576157,
-0.00762454466894269,
-0.050139207392930984,
0.07820605486631393,
0.1277545541524887,
-0.07400961965322495,
-0.09663709253072739,
0.160323366522789,
-0.13705669343471527,
0.06590007990598679,
0.08805936574935913,
-0.10177526623010635,
-0.07768405228853226,
-0.03297245875000954,
0.0692705437541008,
0.08511054515838623,
0.0734398141503334,
-0.0890481248497963,
-0.06313257664442062,
-0.01959403231739998,
0.04318942129611969,
-0.30463361740112305,
-0.14896444976329803,
0.09929323196411133,
0.055659741163253784,
0.1582024246454239,
-0.044911716133356094,
0.12246806174516678,
0.05024483799934387,
0.015163161791861057,
-0.09329783171415329,
0.004570929799228907,
0.01982228271663189,
0.0390465073287487,
-0.056181441992521286,
-0.1056678295135498,
0.019074566662311554,
-0.13505147397518158,
0.03868918493390083,
-0.032062992453575134,
0.0029869640711694956,
0.1265045553445816,
-0.0827450379729271,
-0.04674487188458443,
0.052502334117889404,
-0.036560215055942535,
0.039209894835948944,
0.03925575688481331,
-0.007311148103326559,
-0.057326238602399826,
-0.06777717918157578,
-0.002018159022554755,
0.06439412385225296,
-0.19364924728870392,
-0.06927835941314697,
0.0374191515147686,
-0.02983606792986393,
0.04926043376326561,
-0.01547570526599884,
-0.014132735319435596,
-0.032101791352033615,
-0.13090170919895172,
0.06092877313494682,
-0.08835800737142563,
0.02445063553750515,
0.057056620717048645,
0.017265183851122856,
0.006401633378118277,
0.010813336819410324,
0.02309471182525158,
0.009714759886264801,
-0.07387479394674301,
-0.08860820531845093
] |
null | null |
flair
|
# Towards Robust Named Entity Recognition for Historic German
Based on [our paper](https://www.aclweb.org/anthology/W19-4312/)
we release a new model trained on the LFT dataset.
**Note:** We use BPEmbeddings instead of the combination of
Wikipedia, Common Crawl and character embeddings (as used in the paper)
to save space and training/inference time.
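In Flair, these correspond to `BytePairEmbeddings`; a minimal sketch, assuming a recent Flair version:
```python
from flair.embeddings import BytePairEmbeddings

# subword-level BPEmb embeddings for German; much smaller on disk than the
# Wikipedia/Common Crawl word embeddings used in the paper
embeddings = BytePairEmbeddings("de")
```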
# Results
| Dataset \ Run | Run 1 | Run 2 | Run 3† | Avg.
| ------------- | ----- | ----- | --------- | ------------
| Development | 76.32 | 76.13 | **76.36** | 76.27
| Test | 77.07 | 77.35 | 77.20 | 77.21
Paper reported an averaged F1-score of 77.51.
† denotes that this model is selected for upload.
|
{"language": "de", "license": "mit", "tags": ["flair", "token-classification", "sequence-tagger-model"], "inference": false}
|
token-classification
|
dbmdz/flair-historic-ner-lft
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"de",
"license:mit",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us
|
Towards Robust Named Entity Recognition for Historic German
===========================================================
Based on our paper
we release a new model trained on the LFT dataset.
Note: We use BPEmbeddings instead of the combination of
Wikipedia, Common Crawl and character embeddings (as used in the paper)
to save space and training/inference time.
Results
=======
Paper reported an averaged F1-score of 77.51.
† denotes that this model is selected for upload.
|
[] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us \n"
] |
[
35
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us \n"
] |
[
-0.0036795686464756727,
0.05010911822319031,
-0.010403805412352085,
0.06107344850897789,
0.05939234048128128,
0.07261454313993454,
0.10142005234956741,
0.0797845795750618,
0.17229503393173218,
-0.024312956258654594,
0.10704395174980164,
0.17865599691867828,
0.01765342429280281,
0.06073921173810959,
-0.03420824185013771,
-0.32650086283683777,
0.05298984795808792,
0.013769924640655518,
0.029345689341425896,
0.11331672966480255,
0.06993436068296432,
-0.05328290909528732,
0.02288002148270607,
0.0017123306170105934,
0.005512170027941465,
0.00757942907512188,
-0.007896251045167446,
-0.05721316114068031,
0.1469748318195343,
-0.05381958186626434,
0.0991503894329071,
0.05045221000909805,
0.012923717498779297,
-0.1480889916419983,
0.03344063460826874,
-0.03641324117779732,
-0.08739111572504044,
0.05873055383563042,
0.07930290699005127,
-0.06890209019184113,
0.07691458612680435,
0.09864741563796997,
0.0177229642868042,
0.04016691818833351,
-0.18131491541862488,
-0.22380755841732025,
-0.04779231175780296,
0.038658030331134796,
0.08536475896835327,
0.00992334820330143,
0.017739566043019295,
0.11124055832624435,
-0.19180889427661896,
0.012658520601689816,
0.12625296413898468,
-0.31715327501296997,
0.020568275824189186,
0.21664577722549438,
0.08090180903673172,
0.02903403341770172,
-0.07353939116001129,
0.037838246673345566,
0.03664468973875046,
-0.002376921707764268,
-0.005583807360380888,
-0.056536074727773666,
0.024679942056536674,
0.07113107293844223,
-0.1129387840628624,
-0.047891564667224884,
0.21706707775592804,
-0.005586276762187481,
0.007542468141764402,
0.00021280997316353023,
-0.04410085454583168,
-0.03561628237366676,
0.006996201351284981,
0.018893331289291382,
0.026453660801053047,
0.09595686197280884,
0.22304244339466095,
0.05325881764292717,
-0.09286445379257202,
-0.018692972138524055,
-0.16786742210388184,
0.13276533782482147,
0.022887367755174637,
0.08867405354976654,
-0.21296489238739014,
0.05824918672442436,
-0.05029694736003876,
-0.05498380586504936,
0.01917002536356449,
-0.09604288637638092,
0.013509915210306644,
-0.001424651825800538,
0.00993371568620205,
0.13193216919898987,
0.09578004479408264,
0.21630573272705078,
-0.0021979892626404762,
0.04795631393790245,
0.019148776307702065,
0.13255608081817627,
0.0745205506682396,
0.046663980931043625,
0.029958805069327354,
0.006904567126184702,
-0.01724216900765896,
-0.07777528464794159,
-0.004900467582046986,
-0.05436798185110092,
-0.12411218136548996,
0.010398139245808125,
-0.06021125987172127,
0.1232977882027626,
-0.03214048221707344,
-0.002269420772790909,
-0.09230733662843704,
-0.021944722160696983,
0.12036838382482529,
-0.03033202700316906,
0.008219812996685505,
0.06410527974367142,
-0.030509458854794502,
0.048066556453704834,
-0.11796943098306656,
0.0047250185161828995,
0.047673407942056656,
0.17148645222187042,
-0.09547101706266403,
-0.0007739316206425428,
-0.01064376626163721,
-0.0470799021422863,
0.038134489208459854,
-0.1843862682580948,
0.03150441125035286,
-0.057977017015218735,
-0.12605956196784973,
0.04792715236544609,
0.018945951014757156,
-0.00710764154791832,
0.01593581773340702,
0.02938639186322689,
-0.005324516445398331,
-0.028284555301070213,
-0.03819366544485092,
-0.04302984103560448,
-0.11092261970043182,
0.09416510909795761,
-0.11704668402671814,
0.020703710615634918,
-0.14346981048583984,
0.033641111105680466,
-0.13898149132728577,
0.05632141977548599,
-0.09767574816942215,
-0.0739138275384903,
-0.0776921734213829,
0.11767656356096268,
-0.03231373056769371,
-0.0743226706981659,
-0.026852088049054146,
0.0004645414010155946,
-0.10811407119035721,
0.09280183166265488,
-0.17202001810073853,
-0.07187608629465103,
0.08483423292636871,
-0.1304878294467926,
-0.050750214606523514,
0.03846375271677971,
-0.02079722471535206,
0.026535173878073692,
0.022625723853707314,
0.29800644516944885,
0.00742744468152523,
-0.030505266040563583,
0.10381605476140976,
0.12622444331645966,
-0.16496531665325165,
-0.10108356177806854,
0.11526753008365631,
-0.06658309698104858,
-0.103905089199543,
0.009260003454983234,
-0.0720583125948906,
-0.001965390518307686,
-0.03269887715578079,
-0.061457064002752304,
0.06268298625946045,
0.030208878219127655,
0.0813501626253128,
0.041894249618053436,
0.03556666150689125,
-0.06639204174280167,
0.013283057138323784,
-0.006693736184388399,
0.04296623915433884,
0.11138533055782318,
-0.026196850463747978,
-0.04445121809840202,
0.1086672991514206,
0.11881166696548462,
-0.018252797424793243,
-0.1286124289035797,
-0.05431460589170456,
0.03974827006459236,
-0.08762367814779282,
-0.010513626039028168,
0.10000091791152954,
0.018244680017232895,
-0.02233949303627014,
0.02615661546587944,
0.0050929696299135685,
0.0851011872291565,
0.026585854589939117,
0.017073780298233032,
-0.1512882113456726,
-0.02896532043814659,
-0.05954836681485176,
0.01797553524374962,
-0.04362252727150917,
0.017765136435627937,
0.10996916145086288,
0.020832346752285957,
-0.055643681436777115,
0.07351699471473694,
0.003926136065274477,
0.011158289387822151,
-0.008280202746391296,
-0.010489435866475105,
0.16994020342826843,
-0.03916032239794731,
-0.035828787833452225,
0.03799161687493324,
-0.0701022744178772,
0.17896035313606262,
0.16635803878307343,
-0.1426251232624054,
0.007319127209484577,
-0.0794878825545311,
-0.04546229913830757,
0.006974914111196995,
0.02861889824271202,
0.025743108242750168,
0.016611097380518913,
0.00523415906354785,
0.05906730890274048,
-0.05944051966071129,
-0.06761599332094193,
-0.025882834568619728,
0.009907633066177368,
-0.03902173787355423,
0.11303810030221939,
0.2031945139169693,
-0.1922261118888855,
0.13693252205848694,
0.24531981348991394,
0.10991805791854858,
0.09930943697690964,
-0.06574816256761551,
0.031343452632427216,
-0.015271336771547794,
-0.017790643498301506,
-0.04848736524581909,
0.15063203871250153,
-0.16825072467327118,
-0.01010062638670206,
0.06482603400945663,
0.029531579464673996,
0.027973799034953117,
-0.15668350458145142,
-0.11607147008180618,
-0.05322127416729927,
-0.0104880565777421,
-0.16826190054416656,
0.07375656068325043,
0.002784262178465724,
0.09141486138105392,
-0.008756049908697605,
-0.23250682651996613,
0.0875498428940773,
-0.019295869395136833,
-0.04080163687467575,
0.16177962720394135,
-0.0992288738489151,
-0.18158076703548431,
-0.06881281733512878,
-0.131332665681839,
-0.01005924679338932,
0.001334892469458282,
0.051255736500024796,
-0.05533933639526367,
-0.00713599706068635,
0.05342251807451248,
0.03913242369890213,
-0.19943717122077942,
-0.0034415649715811014,
-0.13523346185684204,
0.041492678225040436,
-0.09641127288341522,
-0.10199771821498871,
-0.06379405409097672,
-0.09291189908981323,
0.05330314859747887,
0.12266889214515686,
-0.06683124601840973,
0.03302762657403946,
0.2132207602262497,
0.013591530732810497,
0.05918286368250847,
-0.06991030275821686,
0.17126351594924927,
-0.05045023187994957,
0.013983078300952911,
0.1286635547876358,
-0.0466664619743824,
0.06754075735807419,
0.19277320802211761,
0.12070497125387192,
-0.08122798800468445,
-0.05708666890859604,
-0.007653567939996719,
-0.10690761357545853,
-0.2146100252866745,
-0.06507720053195953,
-0.08722499758005142,
0.13574254512786865,
0.03337181359529495,
0.09345605969429016,
0.12320474535226822,
0.020204123109579086,
0.0411502830684185,
-0.09517576545476913,
0.01933283917605877,
0.027311038225889206,
0.20170235633850098,
-0.005388038232922554,
0.058613188564777374,
-0.04565560445189476,
-0.08552214503288269,
0.0908823013305664,
0.07743706554174423,
0.11656532436609268,
0.19862917065620422,
0.053588006645441055,
0.12438996881246567,
0.10824277997016907,
0.13466353714466095,
0.07370276004076004,
-0.013827832415699959,
0.011369159445166588,
-0.08045495301485062,
-0.029626278206706047,
0.10192039608955383,
-0.004867017734795809,
0.015864919871091843,
-0.11098379641771317,
-0.012033196166157722,
-0.17618417739868164,
0.06664480268955231,
-0.001966160489246249,
0.1102212518453598,
-0.11598584800958633,
0.08438820391893387,
0.08916114270687103,
0.04994698241353035,
-0.030762825161218643,
0.1352028101682663,
-0.12322115153074265,
-0.10961329936981201,
0.07363547384738922,
-0.0062927925027906895,
0.12603656947612762,
-0.006365159992128611,
0.08633650839328766,
0.07182896137237549,
-0.06381495296955109,
0.017407074570655823,
0.09895285218954086,
-0.14659011363983154,
0.26897886395454407,
-0.0074640545062720776,
-0.08321927487850189,
-0.017435843124985695,
-0.052751343697309494,
0.06731681525707245,
0.15378428995609283,
0.16185340285301208,
0.029759621247649193,
-0.11788131296634674,
-0.14221660792827606,
-0.040674734860658646,
-0.015295414254069328,
0.06407811492681503,
-0.08919704705476761,
-0.06150687113404274,
-0.006607279647141695,
0.034307386726140976,
-0.01486398745328188,
0.039214130491018295,
0.0015992262633517385,
-0.03012157417833805,
0.03784244880080223,
-0.004423942882567644,
-0.0411578007042408,
-0.0019464505603536963,
-0.04418714717030525,
-0.08333547413349152,
0.052429504692554474,
-0.11937408894300461,
-0.06855843961238861,
-0.08141232281923294,
-0.14850389957427979,
0.0738101527094841,
-0.04058445245027542,
-0.024611368775367737,
-0.06917362660169601,
-0.06504375487565994,
-0.08225684612989426,
-0.16462266445159912,
0.1164412647485733,
-0.04526699334383011,
-0.0210331529378891,
-0.08543267846107483,
0.18398316204547882,
-0.04306674003601074,
0.04656622186303139,
0.008691149763762951,
0.03512502461671829,
-0.05660879239439964,
-0.1558499038219452,
0.10662390291690826,
-0.14014577865600586,
0.00013400193711277097,
0.016824694350361824,
-0.02267802320420742,
0.017502199858427048,
-0.014346824027597904,
0.024409247562289238,
0.12417184561491013,
0.31233131885528564,
-0.06533604115247726,
0.19915887713432312,
0.2001846581697464,
-0.08622944355010986,
-0.20087391138076782,
-0.1034541130065918,
-0.20755963027477264,
-0.08407760411500931,
0.07124072313308716,
-0.1910136342048645,
0.09833526611328125,
0.12187595665454865,
-0.0865609273314476,
0.1881432831287384,
-0.3257783353328705,
-0.036756739020347595,
0.22217071056365967,
-0.04089142754673958,
0.3348767161369324,
-0.09589944779872894,
-0.12226905673742294,
0.017323045060038567,
-0.1698630154132843,
0.13410584628582,
0.027201933786273003,
0.056731317192316055,
-0.07691000401973724,
-0.04208185896277428,
-0.015417151153087616,
-0.019854700192809105,
0.19237354397773743,
0.1174442321062088,
0.11285637319087982,
-0.028665585443377495,
-0.21574340760707855,
0.1761549413204193,
0.029874125495553017,
0.016679922118782997,
-0.009964099153876305,
-0.013039539568126202,
-0.18472325801849365,
-0.011474545113742352,
-0.07501538842916489,
0.10022124648094177,
-0.028503762558102608,
-0.09104496240615845,
-0.08778208494186401,
0.045846883207559586,
-0.09717472642660141,
-0.02691113017499447,
0.17822608351707458,
0.025056468322873116,
0.07653960585594177,
0.00591096468269825,
0.01890859380364418,
-0.06244397908449173,
-0.1418461948633194,
-0.08846503496170044,
-0.04109882190823555,
0.0696473941206932,
-0.053858354687690735,
-0.041791751980781555,
0.1584724336862564,
0.010743051767349243,
0.060949087142944336,
0.12375758588314056,
-0.01724504865705967,
-0.01714155077934265,
0.1253010481595993,
-0.1556495875120163,
-0.07221465557813644,
0.04381691291928291,
-0.07208605110645294,
0.14523746073246002,
0.0514751598238945,
0.026169318705797195,
0.04994739219546318,
-0.02262304723262787,
0.055595893412828445,
0.022620221599936485,
-0.044250309467315674,
0.0071805668994784355,
0.033645305782556534,
0.036671921610832214,
-0.12059330195188522,
0.11771204322576523,
0.0871766060590744,
0.006102545186877251,
-0.11705238372087479,
0.13412947952747345,
-0.11719775944948196,
-0.07562237232923508,
-0.11656839400529861,
0.08048252016305923,
-0.1329060047864914,
-0.03580007329583168,
-0.043621234595775604,
-0.17843352258205414,
0.02238728106021881,
0.10349691659212112,
0.1060691699385643,
0.10811378806829453,
-0.046915967017412186,
-0.042228516191244125,
0.08550536632537842,
-0.008273978717625141,
-0.08565962314605713,
0.011485891416668892,
-0.10923688858747482,
-0.013910462148487568,
0.03739195689558983,
0.07765750586986542,
-0.06503203511238098,
-0.10703685879707336,
-0.16836871206760406,
0.0544358529150486,
-0.0816957950592041,
-0.0015215035527944565,
-0.06045306473970413,
-0.009204811416566372,
0.002095235511660576,
-0.018437517806887627,
-0.0416855551302433,
-0.053975142538547516,
-0.10746767371892929,
0.05269133672118187,
0.04729697108268738,
0.12118972837924957,
-0.08729428797960281,
-0.0456467941403389,
0.08838783204555511,
-0.008587098680436611,
0.10828958451747894,
0.07694322615861893,
-0.0026638624258339405,
0.059296950697898865,
-0.08325094729661942,
-0.068731389939785,
0.09084760397672653,
0.016093982383608818,
0.009772025980055332,
-0.015629781410098076,
-0.010931387543678284,
0.000011894947419932578,
-0.09015235304832458,
0.07115966826677322,
-0.11614164710044861,
-0.12733983993530273,
-0.04328680783510208,
-0.0220204908400774,
-0.1698911041021347,
0.023078566417098045,
-0.14851880073547363,
0.1785881519317627,
0.013227500952780247,
0.10587017238140106,
0.07131370902061462,
0.06202162057161331,
-0.08830033242702484,
-0.022750290110707283,
-0.024586858227849007,
-0.16061411798000336,
-0.13504177331924438,
-0.021375875920057297,
-0.028895054012537003,
-0.0354781411588192,
0.30922871828079224,
0.10629543662071228,
-0.12130111455917358,
0.02594573423266411,
0.17406021058559418,
-0.006043537985533476,
0.03153233230113983,
0.19344103336334229,
0.06551244854927063,
-0.03700175881385803,
-0.0678122490644455,
0.03874752297997475,
-0.051812294870615005,
-0.029817752540111542,
0.2059686928987503,
0.1053335964679718,
0.12699741125106812,
-0.026238752529025078,
0.10145348310470581,
-0.009651245549321175,
-0.15121592581272125,
-0.01719384640455246,
0.1578594297170639,
0.030199846252799034,
0.05577855184674263,
0.07097223401069641,
0.14920030534267426,
-0.05634397268295288,
0.10126321762800217,
-0.04757196083664894,
-0.004203788936138153,
-0.16045717895030975,
-0.13405078649520874,
0.020963745191693306,
-0.040528565645217896,
0.005983204580843449,
-0.02691793441772461,
-0.04958268254995346,
0.21184559166431427,
0.05169002711772919,
-0.026291893795132637,
-0.07448139786720276,
-0.08465810865163803,
-0.021794628351926804,
0.041356198489665985,
-0.020142516121268272,
-0.004095018841326237,
-0.0788370743393898,
-0.005758965853601694,
-0.04073655977845192,
-0.11225265264511108,
-0.06677135825157166,
-0.034977320581674576,
-0.012204691767692566,
-0.02738565020263195,
-0.15646283328533173,
-0.08652833849191666,
-0.003091116901487112,
0.027868252247571945,
-0.039598722010850906,
0.10146434605121613,
0.025519179180264473,
0.02206254191696644,
0.05836808308959007,
0.09181483089923859,
0.04684610292315483,
-0.04060780256986618,
0.06478752940893173,
0.2582286298274994,
-0.009557262063026428,
0.07765278220176697,
0.042161185294389725,
0.005273135844618082,
-0.02625374309718609,
0.1489316076040268,
0.2898900806903839,
-0.05963734909892082,
-0.0185280479490757,
0.03333631530404091,
0.005383186973631382,
0.14680255949497223,
0.10117700695991516,
0.028202742338180542,
0.2901718318462372,
-0.07815781980752945,
0.017861399799585342,
-0.09638893604278564,
0.0002629955706652254,
-0.06460151076316833,
0.06343577802181244,
0.1488620489835739,
-0.06497363746166229,
-0.09615558385848999,
0.14596602320671082,
-0.171260803937912,
0.10455329716205597,
0.09362941235303879,
-0.10375013202428818,
-0.05950307846069336,
-0.03182476758956909,
0.08595506101846695,
0.03777002915740013,
0.06911823153495789,
-0.1000286340713501,
-0.09708678722381592,
-0.04145868495106697,
0.040636904537677765,
-0.27576974034309387,
-0.12659412622451782,
0.10116948187351227,
0.07747091352939606,
0.11253780126571655,
-0.037277981638908386,
0.10603490471839905,
0.041821155697107315,
0.03736141696572304,
-0.03996327519416809,
0.03360713645815849,
0.041497111320495605,
0.042229749262332916,
-0.0806586742401123,
-0.12281133979558945,
0.03892228752374649,
-0.13436947762966156,
0.056495487689971924,
-0.05499741807579994,
0.02155071496963501,
0.06788431107997894,
-0.12884031236171722,
-0.03141976520419121,
0.06163177266716957,
-0.05840855464339256,
0.0010979801882058382,
0.04908638820052147,
0.01156782079488039,
-0.07445146143436432,
-0.0516262911260128,
-0.038174428045749664,
0.06461025774478912,
-0.20242957770824432,
-0.07833843678236008,
0.09385866671800613,
-0.033096201717853546,
0.04026290774345398,
-0.031582754105329514,
-0.033175889402627945,
-0.04875008389353752,
-0.11755713075399399,
0.07104776799678802,
-0.1180994063615799,
0.014102814719080925,
0.03192251920700073,
0.029810314998030663,
-0.004771379288285971,
-0.04727673903107643,
0.027761859819293022,
0.013418665155768394,
-0.06831370294094086,
-0.07663145661354065
] |
null | null |
flair
|
# Towards Robust Named Entity Recognition for Historic German
Based on [our paper](https://www.aclweb.org/anthology/W19-4312/)
we release a new model trained on the ONB dataset.
**Note:** We use BPEmbeddings instead of the combination of
Wikipedia, Common Crawl and character embeddings (as used in the paper),
to save space and training/inference time.
# Results
| Dataset \ Run | Run 1 | Run 2 | Run 3 | Avg.
| ------------- | ----- | ----- | --------- | ------------
| Development | 86.69 | 86.13 | **87.18** | 86.67
| Test | 85.27 | 86.05 | 85.75† | 85.69
The paper reported an averaged F1-score of 85.31.
† denotes that this model is selected for upload.
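The card does not ship usage code; the following is a minimal sketch for loading the tagger with Flair, assuming the model id resolves via Flair's Hugging Face hub loading. The input sentence is taken from the widget example of this card:
```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load the historic German NER tagger from the Hugging Face model hub
tagger = SequenceTagger.load("dbmdz/flair-historic-ner-onb")

# Example sentence, taken from the widget configuration of this card
sentence = Sentence("April Martin Ansclm, K. Gefangen-Auffehers Georg Sausgruber.")

# Predict NER tags and print the sentence with annotated entity spans
tagger.predict(sentence)
print(sentence.to_tagged_string())
```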
|
{"language": "de", "license": "mit", "tags": ["flair", "token-classification", "sequence-tagger-model"], "widget": [{"text": "April Martin Ansclm, K. Gefangen-Auffehers Georg Sausgruber."}]}
|
token-classification
|
dbmdz/flair-historic-ner-onb
|
[
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"de",
"license:mit",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us
|
Towards Robust Named Entity Recognition for Historic German
===========================================================
Based on our paper
we release a new model trained on the ONB dataset.
Note: We use BPEmbeddings instead of the combination of
Wikipedia, Common Crawl and character embeddings (as used in the paper),
to save space and training/inference time.
Results
=======
The paper reported an averaged F1-score of 85.31.
† denotes that this model is selected for upload.
|
[] |
[
"TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us \n"
] |
[
35
] |
[
"passage: TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #de #license-mit #region-us \n"
] |
[
-0.0036795686464756727,
0.05010911822319031,
-0.010403805412352085,
0.06107344850897789,
0.05939234048128128,
0.07261454313993454,
0.10142005234956741,
0.0797845795750618,
0.17229503393173218,
-0.024312956258654594,
0.10704395174980164,
0.17865599691867828,
0.01765342429280281,
0.06073921173810959,
-0.03420824185013771,
-0.32650086283683777,
0.05298984795808792,
0.013769924640655518,
0.029345689341425896,
0.11331672966480255,
0.06993436068296432,
-0.05328290909528732,
0.02288002148270607,
0.0017123306170105934,
0.005512170027941465,
0.00757942907512188,
-0.007896251045167446,
-0.05721316114068031,
0.1469748318195343,
-0.05381958186626434,
0.0991503894329071,
0.05045221000909805,
0.012923717498779297,
-0.1480889916419983,
0.03344063460826874,
-0.03641324117779732,
-0.08739111572504044,
0.05873055383563042,
0.07930290699005127,
-0.06890209019184113,
0.07691458612680435,
0.09864741563796997,
0.0177229642868042,
0.04016691818833351,
-0.18131491541862488,
-0.22380755841732025,
-0.04779231175780296,
0.038658030331134796,
0.08536475896835327,
0.00992334820330143,
0.017739566043019295,
0.11124055832624435,
-0.19180889427661896,
0.012658520601689816,
0.12625296413898468,
-0.31715327501296997,
0.020568275824189186,
0.21664577722549438,
0.08090180903673172,
0.02903403341770172,
-0.07353939116001129,
0.037838246673345566,
0.03664468973875046,
-0.002376921707764268,
-0.005583807360380888,
-0.056536074727773666,
0.024679942056536674,
0.07113107293844223,
-0.1129387840628624,
-0.047891564667224884,
0.21706707775592804,
-0.005586276762187481,
0.007542468141764402,
0.00021280997316353023,
-0.04410085454583168,
-0.03561628237366676,
0.006996201351284981,
0.018893331289291382,
0.026453660801053047,
0.09595686197280884,
0.22304244339466095,
0.05325881764292717,
-0.09286445379257202,
-0.018692972138524055,
-0.16786742210388184,
0.13276533782482147,
0.022887367755174637,
0.08867405354976654,
-0.21296489238739014,
0.05824918672442436,
-0.05029694736003876,
-0.05498380586504936,
0.01917002536356449,
-0.09604288637638092,
0.013509915210306644,
-0.001424651825800538,
0.00993371568620205,
0.13193216919898987,
0.09578004479408264,
0.21630573272705078,
-0.0021979892626404762,
0.04795631393790245,
0.019148776307702065,
0.13255608081817627,
0.0745205506682396,
0.046663980931043625,
0.029958805069327354,
0.006904567126184702,
-0.01724216900765896,
-0.07777528464794159,
-0.004900467582046986,
-0.05436798185110092,
-0.12411218136548996,
0.010398139245808125,
-0.06021125987172127,
0.1232977882027626,
-0.03214048221707344,
-0.002269420772790909,
-0.09230733662843704,
-0.021944722160696983,
0.12036838382482529,
-0.03033202700316906,
0.008219812996685505,
0.06410527974367142,
-0.030509458854794502,
0.048066556453704834,
-0.11796943098306656,
0.0047250185161828995,
0.047673407942056656,
0.17148645222187042,
-0.09547101706266403,
-0.0007739316206425428,
-0.01064376626163721,
-0.0470799021422863,
0.038134489208459854,
-0.1843862682580948,
0.03150441125035286,
-0.057977017015218735,
-0.12605956196784973,
0.04792715236544609,
0.018945951014757156,
-0.00710764154791832,
0.01593581773340702,
0.02938639186322689,
-0.005324516445398331,
-0.028284555301070213,
-0.03819366544485092,
-0.04302984103560448,
-0.11092261970043182,
0.09416510909795761,
-0.11704668402671814,
0.020703710615634918,
-0.14346981048583984,
0.033641111105680466,
-0.13898149132728577,
0.05632141977548599,
-0.09767574816942215,
-0.0739138275384903,
-0.0776921734213829,
0.11767656356096268,
-0.03231373056769371,
-0.0743226706981659,
-0.026852088049054146,
0.0004645414010155946,
-0.10811407119035721,
0.09280183166265488,
-0.17202001810073853,
-0.07187608629465103,
0.08483423292636871,
-0.1304878294467926,
-0.050750214606523514,
0.03846375271677971,
-0.02079722471535206,
0.026535173878073692,
0.022625723853707314,
0.29800644516944885,
0.00742744468152523,
-0.030505266040563583,
0.10381605476140976,
0.12622444331645966,
-0.16496531665325165,
-0.10108356177806854,
0.11526753008365631,
-0.06658309698104858,
-0.103905089199543,
0.009260003454983234,
-0.0720583125948906,
-0.001965390518307686,
-0.03269887715578079,
-0.061457064002752304,
0.06268298625946045,
0.030208878219127655,
0.0813501626253128,
0.041894249618053436,
0.03556666150689125,
-0.06639204174280167,
0.013283057138323784,
-0.006693736184388399,
0.04296623915433884,
0.11138533055782318,
-0.026196850463747978,
-0.04445121809840202,
0.1086672991514206,
0.11881166696548462,
-0.018252797424793243,
-0.1286124289035797,
-0.05431460589170456,
0.03974827006459236,
-0.08762367814779282,
-0.010513626039028168,
0.10000091791152954,
0.018244680017232895,
-0.02233949303627014,
0.02615661546587944,
0.0050929696299135685,
0.0851011872291565,
0.026585854589939117,
0.017073780298233032,
-0.1512882113456726,
-0.02896532043814659,
-0.05954836681485176,
0.01797553524374962,
-0.04362252727150917,
0.017765136435627937,
0.10996916145086288,
0.020832346752285957,
-0.055643681436777115,
0.07351699471473694,
0.003926136065274477,
0.011158289387822151,
-0.008280202746391296,
-0.010489435866475105,
0.16994020342826843,
-0.03916032239794731,
-0.035828787833452225,
0.03799161687493324,
-0.0701022744178772,
0.17896035313606262,
0.16635803878307343,
-0.1426251232624054,
0.007319127209484577,
-0.0794878825545311,
-0.04546229913830757,
0.006974914111196995,
0.02861889824271202,
0.025743108242750168,
0.016611097380518913,
0.00523415906354785,
0.05906730890274048,
-0.05944051966071129,
-0.06761599332094193,
-0.025882834568619728,
0.009907633066177368,
-0.03902173787355423,
0.11303810030221939,
0.2031945139169693,
-0.1922261118888855,
0.13693252205848694,
0.24531981348991394,
0.10991805791854858,
0.09930943697690964,
-0.06574816256761551,
0.031343452632427216,
-0.015271336771547794,
-0.017790643498301506,
-0.04848736524581909,
0.15063203871250153,
-0.16825072467327118,
-0.01010062638670206,
0.06482603400945663,
0.029531579464673996,
0.027973799034953117,
-0.15668350458145142,
-0.11607147008180618,
-0.05322127416729927,
-0.0104880565777421,
-0.16826190054416656,
0.07375656068325043,
0.002784262178465724,
0.09141486138105392,
-0.008756049908697605,
-0.23250682651996613,
0.0875498428940773,
-0.019295869395136833,
-0.04080163687467575,
0.16177962720394135,
-0.0992288738489151,
-0.18158076703548431,
-0.06881281733512878,
-0.131332665681839,
-0.01005924679338932,
0.001334892469458282,
0.051255736500024796,
-0.05533933639526367,
-0.00713599706068635,
0.05342251807451248,
0.03913242369890213,
-0.19943717122077942,
-0.0034415649715811014,
-0.13523346185684204,
0.041492678225040436,
-0.09641127288341522,
-0.10199771821498871,
-0.06379405409097672,
-0.09291189908981323,
0.05330314859747887,
0.12266889214515686,
-0.06683124601840973,
0.03302762657403946,
0.2132207602262497,
0.013591530732810497,
0.05918286368250847,
-0.06991030275821686,
0.17126351594924927,
-0.05045023187994957,
0.013983078300952911,
0.1286635547876358,
-0.0466664619743824,
0.06754075735807419,
0.19277320802211761,
0.12070497125387192,
-0.08122798800468445,
-0.05708666890859604,
-0.007653567939996719,
-0.10690761357545853,
-0.2146100252866745,
-0.06507720053195953,
-0.08722499758005142,
0.13574254512786865,
0.03337181359529495,
0.09345605969429016,
0.12320474535226822,
0.020204123109579086,
0.0411502830684185,
-0.09517576545476913,
0.01933283917605877,
0.027311038225889206,
0.20170235633850098,
-0.005388038232922554,
0.058613188564777374,
-0.04565560445189476,
-0.08552214503288269,
0.0908823013305664,
0.07743706554174423,
0.11656532436609268,
0.19862917065620422,
0.053588006645441055,
0.12438996881246567,
0.10824277997016907,
0.13466353714466095,
0.07370276004076004,
-0.013827832415699959,
0.011369159445166588,
-0.08045495301485062,
-0.029626278206706047,
0.10192039608955383,
-0.004867017734795809,
0.015864919871091843,
-0.11098379641771317,
-0.012033196166157722,
-0.17618417739868164,
0.06664480268955231,
-0.001966160489246249,
0.1102212518453598,
-0.11598584800958633,
0.08438820391893387,
0.08916114270687103,
0.04994698241353035,
-0.030762825161218643,
0.1352028101682663,
-0.12322115153074265,
-0.10961329936981201,
0.07363547384738922,
-0.0062927925027906895,
0.12603656947612762,
-0.006365159992128611,
0.08633650839328766,
0.07182896137237549,
-0.06381495296955109,
0.017407074570655823,
0.09895285218954086,
-0.14659011363983154,
0.26897886395454407,
-0.0074640545062720776,
-0.08321927487850189,
-0.017435843124985695,
-0.052751343697309494,
0.06731681525707245,
0.15378428995609283,
0.16185340285301208,
0.029759621247649193,
-0.11788131296634674,
-0.14221660792827606,
-0.040674734860658646,
-0.015295414254069328,
0.06407811492681503,
-0.08919704705476761,
-0.06150687113404274,
-0.006607279647141695,
0.034307386726140976,
-0.01486398745328188,
0.039214130491018295,
0.0015992262633517385,
-0.03012157417833805,
0.03784244880080223,
-0.004423942882567644,
-0.0411578007042408,
-0.0019464505603536963,
-0.04418714717030525,
-0.08333547413349152,
0.052429504692554474,
-0.11937408894300461,
-0.06855843961238861,
-0.08141232281923294,
-0.14850389957427979,
0.0738101527094841,
-0.04058445245027542,
-0.024611368775367737,
-0.06917362660169601,
-0.06504375487565994,
-0.08225684612989426,
-0.16462266445159912,
0.1164412647485733,
-0.04526699334383011,
-0.0210331529378891,
-0.08543267846107483,
0.18398316204547882,
-0.04306674003601074,
0.04656622186303139,
0.008691149763762951,
0.03512502461671829,
-0.05660879239439964,
-0.1558499038219452,
0.10662390291690826,
-0.14014577865600586,
0.00013400193711277097,
0.016824694350361824,
-0.02267802320420742,
0.017502199858427048,
-0.014346824027597904,
0.024409247562289238,
0.12417184561491013,
0.31233131885528564,
-0.06533604115247726,
0.19915887713432312,
0.2001846581697464,
-0.08622944355010986,
-0.20087391138076782,
-0.1034541130065918,
-0.20755963027477264,
-0.08407760411500931,
0.07124072313308716,
-0.1910136342048645,
0.09833526611328125,
0.12187595665454865,
-0.0865609273314476,
0.1881432831287384,
-0.3257783353328705,
-0.036756739020347595,
0.22217071056365967,
-0.04089142754673958,
0.3348767161369324,
-0.09589944779872894,
-0.12226905673742294,
0.017323045060038567,
-0.1698630154132843,
0.13410584628582,
0.027201933786273003,
0.056731317192316055,
-0.07691000401973724,
-0.04208185896277428,
-0.015417151153087616,
-0.019854700192809105,
0.19237354397773743,
0.1174442321062088,
0.11285637319087982,
-0.028665585443377495,
-0.21574340760707855,
0.1761549413204193,
0.029874125495553017,
0.016679922118782997,
-0.009964099153876305,
-0.013039539568126202,
-0.18472325801849365,
-0.011474545113742352,
-0.07501538842916489,
0.10022124648094177,
-0.028503762558102608,
-0.09104496240615845,
-0.08778208494186401,
0.045846883207559586,
-0.09717472642660141,
-0.02691113017499447,
0.17822608351707458,
0.025056468322873116,
0.07653960585594177,
0.00591096468269825,
0.01890859380364418,
-0.06244397908449173,
-0.1418461948633194,
-0.08846503496170044,
-0.04109882190823555,
0.0696473941206932,
-0.053858354687690735,
-0.041791751980781555,
0.1584724336862564,
0.010743051767349243,
0.060949087142944336,
0.12375758588314056,
-0.01724504865705967,
-0.01714155077934265,
0.1253010481595993,
-0.1556495875120163,
-0.07221465557813644,
0.04381691291928291,
-0.07208605110645294,
0.14523746073246002,
0.0514751598238945,
0.026169318705797195,
0.04994739219546318,
-0.02262304723262787,
0.055595893412828445,
0.022620221599936485,
-0.044250309467315674,
0.0071805668994784355,
0.033645305782556534,
0.036671921610832214,
-0.12059330195188522,
0.11771204322576523,
0.0871766060590744,
0.006102545186877251,
-0.11705238372087479,
0.13412947952747345,
-0.11719775944948196,
-0.07562237232923508,
-0.11656839400529861,
0.08048252016305923,
-0.1329060047864914,
-0.03580007329583168,
-0.043621234595775604,
-0.17843352258205414,
0.02238728106021881,
0.10349691659212112,
0.1060691699385643,
0.10811378806829453,
-0.046915967017412186,
-0.042228516191244125,
0.08550536632537842,
-0.008273978717625141,
-0.08565962314605713,
0.011485891416668892,
-0.10923688858747482,
-0.013910462148487568,
0.03739195689558983,
0.07765750586986542,
-0.06503203511238098,
-0.10703685879707336,
-0.16836871206760406,
0.0544358529150486,
-0.0816957950592041,
-0.0015215035527944565,
-0.06045306473970413,
-0.009204811416566372,
0.002095235511660576,
-0.018437517806887627,
-0.0416855551302433,
-0.053975142538547516,
-0.10746767371892929,
0.05269133672118187,
0.04729697108268738,
0.12118972837924957,
-0.08729428797960281,
-0.0456467941403389,
0.08838783204555511,
-0.008587098680436611,
0.10828958451747894,
0.07694322615861893,
-0.0026638624258339405,
0.059296950697898865,
-0.08325094729661942,
-0.068731389939785,
0.09084760397672653,
0.016093982383608818,
0.009772025980055332,
-0.015629781410098076,
-0.010931387543678284,
0.000011894947419932578,
-0.09015235304832458,
0.07115966826677322,
-0.11614164710044861,
-0.12733983993530273,
-0.04328680783510208,
-0.0220204908400774,
-0.1698911041021347,
0.023078566417098045,
-0.14851880073547363,
0.1785881519317627,
0.013227500952780247,
0.10587017238140106,
0.07131370902061462,
0.06202162057161331,
-0.08830033242702484,
-0.022750290110707283,
-0.024586858227849007,
-0.16061411798000336,
-0.13504177331924438,
-0.021375875920057297,
-0.028895054012537003,
-0.0354781411588192,
0.30922871828079224,
0.10629543662071228,
-0.12130111455917358,
0.02594573423266411,
0.17406021058559418,
-0.006043537985533476,
0.03153233230113983,
0.19344103336334229,
0.06551244854927063,
-0.03700175881385803,
-0.0678122490644455,
0.03874752297997475,
-0.051812294870615005,
-0.029817752540111542,
0.2059686928987503,
0.1053335964679718,
0.12699741125106812,
-0.026238752529025078,
0.10145348310470581,
-0.009651245549321175,
-0.15121592581272125,
-0.01719384640455246,
0.1578594297170639,
0.030199846252799034,
0.05577855184674263,
0.07097223401069641,
0.14920030534267426,
-0.05634397268295288,
0.10126321762800217,
-0.04757196083664894,
-0.004203788936138153,
-0.16045717895030975,
-0.13405078649520874,
0.020963745191693306,
-0.040528565645217896,
0.005983204580843449,
-0.02691793441772461,
-0.04958268254995346,
0.21184559166431427,
0.05169002711772919,
-0.026291893795132637,
-0.07448139786720276,
-0.08465810865163803,
-0.021794628351926804,
0.041356198489665985,
-0.020142516121268272,
-0.004095018841326237,
-0.0788370743393898,
-0.005758965853601694,
-0.04073655977845192,
-0.11225265264511108,
-0.06677135825157166,
-0.034977320581674576,
-0.012204691767692566,
-0.02738565020263195,
-0.15646283328533173,
-0.08652833849191666,
-0.003091116901487112,
0.027868252247571945,
-0.039598722010850906,
0.10146434605121613,
0.025519179180264473,
0.02206254191696644,
0.05836808308959007,
0.09181483089923859,
0.04684610292315483,
-0.04060780256986618,
0.06478752940893173,
0.2582286298274994,
-0.009557262063026428,
0.07765278220176697,
0.042161185294389725,
0.005273135844618082,
-0.02625374309718609,
0.1489316076040268,
0.2898900806903839,
-0.05963734909892082,
-0.0185280479490757,
0.03333631530404091,
0.005383186973631382,
0.14680255949497223,
0.10117700695991516,
0.028202742338180542,
0.2901718318462372,
-0.07815781980752945,
0.017861399799585342,
-0.09638893604278564,
0.0002629955706652254,
-0.06460151076316833,
0.06343577802181244,
0.1488620489835739,
-0.06497363746166229,
-0.09615558385848999,
0.14596602320671082,
-0.171260803937912,
0.10455329716205597,
0.09362941235303879,
-0.10375013202428818,
-0.05950307846069336,
-0.03182476758956909,
0.08595506101846695,
0.03777002915740013,
0.06911823153495789,
-0.1000286340713501,
-0.09708678722381592,
-0.04145868495106697,
0.040636904537677765,
-0.27576974034309387,
-0.12659412622451782,
0.10116948187351227,
0.07747091352939606,
0.11253780126571655,
-0.037277981638908386,
0.10603490471839905,
0.041821155697107315,
0.03736141696572304,
-0.03996327519416809,
0.03360713645815849,
0.041497111320495605,
0.042229749262332916,
-0.0806586742401123,
-0.12281133979558945,
0.03892228752374649,
-0.13436947762966156,
0.056495487689971924,
-0.05499741807579994,
0.02155071496963501,
0.06788431107997894,
-0.12884031236171722,
-0.03141976520419121,
0.06163177266716957,
-0.05840855464339256,
0.0010979801882058382,
0.04908638820052147,
0.01156782079488039,
-0.07445146143436432,
-0.0516262911260128,
-0.038174428045749664,
0.06461025774478912,
-0.20242957770824432,
-0.07833843678236008,
0.09385866671800613,
-0.033096201717853546,
0.04026290774345398,
-0.031582754105329514,
-0.033175889402627945,
-0.04875008389353752,
-0.11755713075399399,
0.07104776799678802,
-0.1180994063615799,
0.014102814719080925,
0.03192251920700073,
0.029810314998030663,
-0.004771379288285971,
-0.04727673903107643,
0.027761859819293022,
0.013418665155768394,
-0.06831370294094086,
-0.07663145661354065
] |
null | null |
transformers
|
# German GPT-2 model
In this repository we release (yet another) GPT-2 model that was trained on various texts for German.
The model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or "dangerous" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model 😉
**Note**: The model was initially released under an anonymous alias (`anonymous-german-nlp/german-gpt2`) so we now "de-anonymize" it.
More details about GPT-2 can be found in the great [Hugging Face](https://huggingface.co/transformers/model_doc/gpt2.html) documentation.
## German GPT-2 fine-tuned on Faust I and II
We fine-tuned our German GPT-2 model on "Faust I and II" by Johann Wolfgang Goethe. These texts can be obtained from the [Deutsches Textarchiv (DTA)](http://www.deutschestextarchiv.de/book/show/goethe_faust01_1808). We use the "normalized" version of both texts (to avoid out-of-vocabulary problems with e.g. "ſ").
Fine-tuning was done for 100 epochs, using a batch size of 4 with half precision on an RTX 3090. Total training time was around 12 minutes (it is really fast!).
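The exact fine-tuning script is not part of this card; a minimal sketch with the Transformers `Trainer` API and the legacy `TextDataset` helper, matching the hyperparameters above, could look like the following. The file name `faust.txt` is a placeholder for the normalized DTA export, and the block size is an assumption:
```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("dbmdz/german-gpt2")
model = AutoModelForCausalLM.from_pretrained("dbmdz/german-gpt2")

# "faust.txt" is a placeholder for the normalized DTA plain-text export
train_dataset = TextDataset(tokenizer=tokenizer, file_path="faust.txt",
                            block_size=128)  # block size is an assumption
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="german-gpt2-faust",
    num_train_epochs=100,           # 100 epochs, as described above
    per_device_train_batch_size=4,  # batch size of 4
    fp16=True,                      # half precision on the RTX 3090
)

Trainer(model=model, args=training_args, data_collator=data_collator,
        train_dataset=train_dataset).train()
```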
We also open source this fine-tuned model. Text can be generated with:
```python
from transformers import pipeline
pipe = pipeline('text-generation', model="dbmdz/german-gpt2-faust",
tokenizer="dbmdz/german-gpt2-faust")
text = pipe("Schon um die Liebe", max_length=100)[0]["generated_text"]
print(text)
```
and could output:
```
Schon um die Liebe bitte ich, Herr! Wer mag sich die dreifach Ermächtigen?
Sei mir ein Held!
Und daß die Stunde kommt spreche ich nicht aus.
Faust (schaudernd).
Den schönen Boten finde' ich verwirrend;
```
# License
All models are licensed under [MIT](LICENSE).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/stefan-it/german-gpt/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "de", "license": "mit", "widget": [{"text": "Schon um die Liebe"}]}
|
text-generation
|
dbmdz/german-gpt2-faust
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"gpt2",
"text-generation",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #jax #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# German GPT-2 model
In this repository we release (yet another) GPT-2 model that was trained on various texts for German.
The model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or "dangerous" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model
Note: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now "de-anonymize" it.
More details about GPT-2 can be found in the great Hugging Face documentation.
## German GPT-2 fine-tuned on Faust I and II
We fine-tuned our German GPT-2 model on "Faust I and II" by Johann Wolfgang Goethe. These texts can be obtained from the Deutsches Textarchiv (DTA). We use the "normalized" version of both texts (to avoid out-of-vocabulary problems with e.g. "ſ").
Fine-tuning was done for 100 epochs, using a batch size of 4 with half precision on an RTX 3090. Total training time was around 12 minutes (it is really fast!).
We also open source this fine-tuned model. Text can be generated with:
and could output:
# License
All models are licensed under MIT.
# Huggingface model hub
All models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.",
"## German GPT-2 fine-tuned on Faust I and II\n\nWe fine-tuned our German GPT-2 model on \"Faust I and II\" from Johann Wolfgang Goethe. These texts can be obtained from Deutsches Textarchiv (DTA). We use the \"normalized\" version of both texts (to avoid out-of-vocabulary problems with e.g. \"ſ\")\n\nFine-Tuning was done for 100 epochs, using a batch size of 4 with half precision on a RTX 3090. Total time was around 12 minutes (it is really fast!).\n\nWe also open source this fine-tuned model. Text can be generated with:\n\n\n\nand could output:",
"# License\n\nAll models are licensed under MIT.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our BERT models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.",
"## German GPT-2 fine-tuned on Faust I and II\n\nWe fine-tuned our German GPT-2 model on \"Faust I and II\" from Johann Wolfgang Goethe. These texts can be obtained from Deutsches Textarchiv (DTA). We use the \"normalized\" version of both texts (to avoid out-of-vocabulary problems with e.g. \"ſ\")\n\nFine-Tuning was done for 100 epochs, using a batch size of 4 with half precision on a RTX 3090. Total time was around 12 minutes (it is really fast!).\n\nWe also open source this fine-tuned model. Text can be generated with:\n\n\n\nand could output:",
"# License\n\nAll models are licensed under MIT.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our BERT models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
62,
151,
151,
10,
18,
25,
70
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.## German GPT-2 fine-tuned on Faust I and II\n\nWe fine-tuned our German GPT-2 model on \"Faust I and II\" from Johann Wolfgang Goethe. These texts can be obtained from Deutsches Textarchiv (DTA). We use the \"normalized\" version of both texts (to avoid out-of-vocabulary problems with e.g. \"ſ\")\n\nFine-Tuning was done for 100 epochs, using a batch size of 4 with half precision on a RTX 3090. Total time was around 12 minutes (it is really fast!).\n\nWe also open source this fine-tuned model. Text can be generated with:\n\n\n\nand could output:# License\n\nAll models are licensed under MIT.# Huggingface model hub\n\nAll models are available on the Huggingface model hub.# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our BERT models just open an issue\nhere# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
-0.05566433444619179,
0.011063790880143642,
-0.0035921966191381216,
0.07672064751386642,
0.021110694855451584,
0.0397060289978981,
0.10354600101709366,
0.08814748376607895,
0.07159557193517685,
0.07984639704227448,
-0.01656252145767212,
-0.095020592212677,
0.053759343922138214,
0.09675396233797073,
0.023407084867358208,
-0.22137433290481567,
0.009732736274600029,
-0.12604013085365295,
0.06135256588459015,
0.05149266868829727,
0.1570271998643875,
0.009297910146415234,
0.06545548141002655,
0.03071845881640911,
-0.0733797624707222,
0.0018226430984213948,
-0.018614135682582855,
0.03902234509587288,
0.09375312924385071,
0.05656364932656288,
0.04269098863005638,
-0.024357376620173454,
0.02049153670668602,
-0.1149541363120079,
0.015263884328305721,
0.039375390857458115,
-0.01232073549181223,
0.03729655593633652,
0.08803737163543701,
0.0000694348054821603,
0.16666600108146667,
-0.0840146616101265,
-0.008444959297776222,
0.05687547102570534,
-0.07509898394346237,
-0.040800634771585464,
-0.18644168972969055,
0.12847015261650085,
0.05986728146672249,
0.004453487228602171,
-0.026181669905781746,
0.074839286506176,
-0.030354468151926994,
0.002577144419774413,
0.07110418379306793,
-0.24433308839797974,
-0.03670419752597809,
0.0022097076289355755,
0.020032841712236404,
0.014849632047116756,
-0.05672565475106239,
0.045941129326820374,
0.021572664380073547,
0.015202773734927177,
-0.032106511294841766,
-0.010517283342778683,
0.027111945673823357,
-0.09889856725931168,
-0.11261732131242752,
-0.010765266604721546,
0.07681006193161011,
0.019959399476647377,
-0.14427144825458527,
-0.23854604363441467,
-0.041387367993593216,
0.12777982652187347,
0.06302320957183838,
-0.01370022539049387,
-0.023920651525259018,
0.029569370672106743,
0.10366258025169373,
-0.15844593942165375,
-0.11966179311275482,
-0.030478298664093018,
-0.032411251217126846,
0.23850244283676147,
0.042831458151340485,
0.054674070328474045,
0.08596740663051605,
0.09660908579826355,
-0.2430345118045807,
-0.0670681893825531,
-0.05816873535513878,
-0.08259124308824539,
-0.09655459970235825,
-0.025081692263484,
-0.049700427800416946,
-0.11499908566474915,
0.02001103013753891,
0.2587433159351349,
-0.04385506361722946,
-0.04542859271168709,
0.006452161353081465,
0.006141544319689274,
0.08881105482578278,
0.0965985506772995,
-0.010732250288128853,
-0.06311972439289093,
0.08109494298696518,
-0.06282220780849457,
0.13136127591133118,
0.010213756002485752,
-0.035651206970214844,
-0.08235344290733337,
-0.03428114950656891,
0.05423326417803764,
0.05525720492005348,
0.020580517128109932,
-0.06402818858623505,
-0.04300906881690025,
0.26472771167755127,
-0.09935285896062851,
0.045745570212602615,
-0.002331194467842579,
-0.01276992168277502,
-0.023361103609204292,
-0.022709233686327934,
0.027051692828536034,
-0.06966035813093185,
0.06381580233573914,
-0.014005976729094982,
-0.022688740864396095,
-0.060253579169511795,
-0.08068899810314178,
0.04900854080915451,
-0.037770826369524,
-0.05658212676644325,
-0.11925671249628067,
-0.09855898469686508,
-0.009873266331851482,
0.035403672605752945,
-0.05343107134103775,
-0.021484563127160072,
0.054853323847055435,
-0.00878877192735672,
-0.010209252126514912,
-0.002348387148231268,
-0.03322993591427803,
-0.0005786937545053661,
-0.021897051483392715,
-0.05412643030285835,
0.021850451827049255,
-0.06518649309873581,
-0.02336066961288452,
-0.07955733686685562,
-0.015126878395676613,
-0.200530007481575,
0.10277785360813141,
-0.07530064880847931,
-0.06489585340023041,
-0.037360649555921555,
0.028290532529354095,
0.02255398966372013,
0.0038845434319227934,
0.02873801998794079,
0.13654573261737823,
-0.1179027259349823,
-0.025184793397784233,
0.17737749218940735,
-0.18376044929027557,
-0.028583265841007233,
0.2322806417942047,
0.009542896412312984,
-0.045870669186115265,
0.04337240383028984,
0.16603823006153107,
0.06331998109817505,
-0.22122721374034882,
-0.07115796953439713,
0.0015784085262566805,
-0.055486928671598434,
0.1351272463798523,
0.09025734663009644,
-0.07515093684196472,
0.02485842816531658,
0.017768871039152145,
-0.12069778144359589,
0.004455589223653078,
-0.00770625239238143,
-0.013816030696034431,
-0.029865726828575134,
-0.05936003103852272,
0.07141266018152237,
-0.04026668146252632,
-0.03323084115982056,
-0.09312635660171509,
-0.146280437707901,
-0.08799143880605698,
0.12093333154916763,
-0.0004708245978690684,
0.04140062257647514,
-0.027725525200366974,
0.0754345953464508,
-0.0035979668609797955,
-0.02725001983344555,
-0.07086475938558578,
-0.16566863656044006,
0.12063581496477127,
-0.1936269998550415,
0.09205660969018936,
-0.04646286368370056,
0.03835026174783707,
0.1391313076019287,
-0.07769054174423218,
-0.0016147395363077521,
-0.06207336485385895,
-0.04160051792860031,
-0.023266596719622612,
-0.07846970856189728,
-0.0746297612786293,
-0.061058636754751205,
0.20054084062576294,
-0.0712062269449234,
0.0023375560995191336,
0.0465739481151104,
0.13688628375530243,
0.013311606831848621,
-0.07530169934034348,
-0.023138845339417458,
-0.028609788045287132,
-0.01540283765643835,
-0.04464662820100784,
-0.01886012963950634,
0.029549138620495796,
-0.06698288768529892,
0.1293744146823883,
-0.08473681658506393,
-0.12611602246761322,
0.057913102209568024,
0.10526315867900848,
-0.08893946558237076,
-0.015412081964313984,
-0.07855740934610367,
-0.0176937784999609,
-0.09070199728012085,
-0.10604088753461838,
0.16535530984401703,
0.009575904347002506,
0.04493403062224388,
-0.059471603482961655,
-0.030141985043883324,
-0.025415530428290367,
-0.06029660254716873,
-0.05059877410531044,
0.017071828246116638,
-0.09149862080812454,
-0.07026080042123795,
0.050809018313884735,
0.0023431351874023676,
-0.02920415624976158,
0.16549700498580933,
0.031603291630744934,
-0.09575895965099335,
-0.03396225348114967,
0.052222445607185364,
0.023116067051887512,
0.08252361416816711,
0.005528165493160486,
-0.012364182621240616,
0.05420905724167824,
0.003943854942917824,
0.06279051303863525,
-0.07059314101934433,
0.05547144636511803,
-0.006734485737979412,
-0.0357031412422657,
0.07279718667268753,
0.06281722337007523,
-0.03793613612651825,
0.07784212380647659,
-0.005305000115185976,
0.08701718598604202,
0.008358336053788662,
-0.04088528826832771,
-0.11069628596305847,
0.07100295275449753,
-0.04894443228840828,
-0.23877617716789246,
-0.18957993388175964,
0.0935865268111229,
-0.12798812985420227,
0.04551399126648903,
0.04380626976490021,
-0.007605170831084251,
-0.08523658663034439,
-0.09201367944478989,
0.10157511383295059,
0.08594036847352982,
-0.053638529032468796,
-0.07687317579984665,
-0.04109838232398033,
0.04355364665389061,
-0.10647198557853699,
-0.02603260986506939,
-0.0057128253392875195,
-0.14200595021247864,
0.03339379280805588,
0.027912992984056473,
0.07811740785837173,
0.010416832752525806,
-0.041006769984960556,
-0.027372438460588455,
-0.007487428840249777,
0.12710870802402496,
-0.09054864943027496,
0.12476404756307602,
0.17992527782917023,
-0.05445953458547592,
0.07922158390283585,
0.05592558532953262,
0.0048797763884067535,
-0.012292970903217793,
0.036058537662029266,
0.04148446023464203,
-0.03657181188464165,
-0.17313534021377563,
-0.1324901133775711,
-0.04078119248151779,
-0.02264772355556488,
-0.02336995117366314,
0.033603958785533905,
0.06598187237977982,
0.03026255965232849,
-0.11120590567588806,
-0.03490901738405228,
0.057347994297742844,
0.09794840216636658,
0.029113320633769035,
0.008743837475776672,
0.02906441129744053,
-0.033972013741731644,
-0.014363228343427181,
0.1968895047903061,
-0.004040290601551533,
0.07289348542690277,
-0.04327049106359482,
0.11994477361440659,
0.012472626753151417,
0.0977373868227005,
0.012882166542112827,
0.02608945034444332,
-0.005619303788989782,
0.023554936051368713,
-0.03653494641184807,
-0.06529868394136429,
-0.026299674063920975,
0.09525702148675919,
0.06848668307065964,
-0.042764291167259216,
-0.04411817342042923,
-0.014452127739787102,
0.09826204180717468,
0.16853363811969757,
0.08052182197570801,
-0.05960959568619728,
-0.1010342538356781,
0.03423786163330078,
-0.06955590099096298,
-0.05374494567513466,
-0.021722687408328056,
0.14347000420093536,
-0.14563444256782532,
0.037550028413534164,
0.007580805104225874,
0.038933172821998596,
-0.05726105719804764,
-0.02281440608203411,
0.008093000389635563,
0.12179911136627197,
-0.007007521111518145,
0.06882111728191376,
-0.12681573629379272,
0.025512637570500374,
0.005999384913593531,
0.10729999840259552,
-0.07823571562767029,
0.054114751517772675,
0.0713508129119873,
-0.023124143481254578,
0.14452166855335236,
0.02929261513054371,
-0.0022579964715987444,
0.08357485383749008,
-0.13446950912475586,
0.0005625548074021935,
0.11722752451896667,
-0.1165979653596878,
0.02374175749719143,
-0.04269566759467125,
0.008543351665139198,
-0.04998128116130829,
0.12381451576948166,
-0.13314446806907654,
-0.1019914373755455,
0.06308013200759888,
-0.06592896580696106,
-0.006701938807964325,
-0.08209241926670074,
-0.003976307809352875,
-0.06181267276406288,
0.2340753823518753,
-0.011145446449518204,
-0.09537588059902191,
-0.1105262041091919,
-0.08793527632951736,
0.1079634353518486,
-0.06972618401050568,
0.022888336330652237,
0.011063081212341785,
0.14011703431606293,
-0.09129787981510162,
-0.07381129264831543,
-0.0027718949131667614,
-0.12117840349674225,
-0.1812746673822403,
-0.010268843732774258,
0.165320485830307,
0.06027742102742195,
0.0371539480984211,
0.04505288228392601,
0.028674742206931114,
-0.0008448502630926669,
-0.11356548964977264,
0.03340144455432892,
0.09503293037414551,
0.04716780036687851,
0.12276320904493332,
-0.041177812963724136,
-0.13325658440589905,
-0.032245807349681854,
-0.03451082855463028,
0.027821935713291168,
0.291538804769516,
-0.042680852115154266,
0.1392432451248169,
0.1758292317390442,
-0.055912088602781296,
-0.23636290431022644,
-0.06641791015863419,
0.0732022225856781,
0.02938981167972088,
-0.03083866648375988,
-0.0852062925696373,
0.052629146724939346,
0.13489478826522827,
-0.051369406282901764,
0.06023978441953659,
-0.09190131723880768,
-0.11980373412370682,
0.04142804816365242,
-0.03395634517073631,
0.014685975387692451,
-0.09200744330883026,
-0.06938712298870087,
-0.04380791261792183,
-0.1103002205491066,
0.1036725789308548,
-0.071989506483078,
0.06289856880903244,
0.02393328584730625,
0.03964642435312271,
0.06009387597441673,
-0.04523153975605965,
0.1330280750989914,
0.02192782796919346,
0.02168182097375393,
-0.07239971309900284,
-0.0004195520014036447,
0.03149205818772316,
-0.03369579464197159,
0.17155468463897705,
-0.022314276546239853,
0.03845196217298508,
-0.08369215577840805,
-0.029095947742462158,
-0.06711763143539429,
0.14349700510501862,
-0.050691161304712296,
-0.10206620395183563,
-0.06722796708345413,
0.0921609178185463,
0.018840445205569267,
0.020511042326688766,
-0.07636407762765884,
-0.01229259092360735,
-0.012504165060818195,
0.1413978934288025,
0.07633932679891586,
0.05288701876997948,
-0.015009609051048756,
-0.020464029163122177,
-0.05006517469882965,
0.05406150221824646,
0.05700733885169029,
0.06787710636854172,
0.05405741557478905,
0.0005305716185830534,
0.10540353506803513,
-0.03596604987978935,
-0.133570596575737,
0.016162056475877762,
0.044125910848379135,
-0.1815120130777359,
-0.10484496504068375,
-0.061958491802215576,
-0.07074972987174988,
0.030507002025842667,
0.03781743347644806,
0.17541013658046722,
-0.03674428537487984,
-0.06358762830495834,
-0.028997626155614853,
0.08219818025827408,
-0.028290407732129097,
0.022115852683782578,
0.03743613511323929,
-0.014974312856793404,
-0.03257237374782562,
0.13135282695293427,
0.026207001879811287,
-0.07526549696922302,
0.0008090580813586712,
0.07480937987565994,
-0.07375992089509964,
-0.040194641798734665,
-0.09390100836753845,
-0.04192351549863815,
-0.0908203274011612,
-0.044878315180540085,
0.03153453394770622,
-0.0405234768986702,
-0.05714540556073189,
-0.006602210458368063,
0.05315659940242767,
0.02750711888074875,
0.012925258837640285,
0.06596516817808151,
-0.07126756757497787,
0.05150562524795532,
0.07063128799200058,
0.0458148792386055,
-0.011438252404332161,
0.07231584936380386,
-0.059074319899082184,
-0.02783026359975338,
-0.026886463165283203,
0.01154174841940403,
-0.06808295100927353,
-0.09736554324626923,
-0.1298293173313141,
0.020415974780917168,
-0.02667202427983284,
-0.01376558467745781,
0.0034867364447563887,
-0.05396707355976105,
0.05736329033970833,
0.0444926843047142,
-0.014520195312798023,
-0.056508868932724,
-0.06138132885098457,
0.03308557718992233,
-0.13882605731487274,
-0.033692944794893265,
0.08076027780771255,
-0.07683633267879486,
0.1584634929895401,
0.10349735617637634,
0.03387411683797836,
-0.017977619543671608,
-0.09683720767498016,
-0.020370615646243095,
-0.05379066616296768,
0.028150759637355804,
0.015170375816524029,
-0.13443247973918915,
-0.006791326683014631,
0.009722410701215267,
-0.03746825084090233,
-0.011833908036351204,
0.11128481477499008,
-0.054647911339998245,
0.0858294889330864,
0.054788123816251755,
-0.10595087707042694,
-0.07518161833286285,
0.011225013993680477,
0.07712094485759735,
0.06898711621761322,
0.06985626369714737,
-0.08205144107341766,
0.027266060933470726,
-0.07956112176179886,
0.023857755586504936,
0.04645886644721031,
0.015454118140041828,
-0.05143808200955391,
0.03165556862950325,
0.02322312816977501,
-0.0039686597883701324,
0.17135818302631378,
0.04054248705506325,
-0.027015531435608864,
0.028295719996094704,
0.036633532494306564,
0.036311980336904526,
-0.006998060736805201,
0.029328590258955956,
-0.08451595157384872,
-0.0035281083546578884,
-0.118531234562397,
-0.05310380458831787,
-0.028500642627477646,
-0.15132299065589905,
0.1323433816432953,
0.06727714836597443,
0.01871170662343502,
0.045134078711271286,
0.06686864048242569,
-0.05917467549443245,
-0.1560845822095871,
-0.12397760152816772,
0.010067669674754143,
0.016307450830936432,
-0.06354255229234695,
0.08998317271471024,
0.11500676721334457,
-0.20412085950374603,
0.07987059652805328,
0.015682248398661613,
-0.0486493855714798,
-0.05595190078020096,
-0.16283944249153137,
-0.03752691298723221,
0.016711832955479622,
0.024872930720448494,
-0.08932386338710785,
0.07187867164611816,
0.05181403458118439,
0.01790466532111168,
-0.041185665875673294,
0.12804237008094788,
-0.18347769975662231,
-0.047525860369205475,
0.07048580050468445,
0.019114483147859573,
0.05877397954463959,
0.11827939003705978,
-0.03189133480191231,
-0.014912083745002747,
0.1086934357881546,
0.09310878813266754,
0.05487174913287163,
0.0833226665854454,
-0.026406075805425644,
-0.012160240672528744,
-0.05837463587522507,
-0.028612488880753517,
-0.025235092267394066,
0.029645901173353195,
0.13188207149505615,
0.03818197920918465,
-0.007613368798047304,
0.0033286872785538435,
0.14868281781673431,
-0.06188153102993965,
-0.08858627825975418,
-0.14370064437389374,
0.15121829509735107,
-0.0662686824798584,
0.0398932583630085,
0.02980712801218033,
-0.1133778914809227,
0.021670101210474968,
0.09279418736696243,
0.20864656567573547,
0.004058341030031443,
0.013072865083813667,
0.01665634848177433,
-0.008983399718999863,
0.03014654852449894,
0.09100989252328873,
-0.035629600286483765,
0.27419912815093994,
-0.007199838291853666,
0.16566546261310577,
0.020050637423992157,
-0.01602029986679554,
-0.07829871028661728,
0.23321369290351868,
-0.11661706119775772,
0.003240505000576377,
-0.04754228517413139,
0.04647083580493927,
0.052931372076272964,
-0.26940348744392395,
0.012855003587901592,
-0.0049420190043747425,
-0.08519595116376877,
0.05637392774224281,
-0.006185445934534073,
0.011568396352231503,
0.12399560213088989,
0.005240186583250761,
0.004925074987113476,
0.17403976619243622,
-0.0052720122039318085,
-0.034343477338552475,
0.027306323871016502,
0.07057102769613266,
-0.07545676082372665,
0.27201324701309204,
0.05750850960612297,
0.14389663934707642,
0.048874445259571075,
-0.04073266312479973,
-0.12830011546611786,
0.022548167034983635,
0.0038520668167620897,
-0.15444229543209076,
0.007262088358402252,
0.1880045086145401,
-0.029041586443781853,
0.011198572814464569,
0.048979565501213074,
-0.0891822874546051,
0.023374268785119057,
0.1176750585436821,
-0.03642282634973526,
-0.0931660458445549,
0.1069122850894928,
-0.12383512407541275,
0.14457614719867706,
0.14174886047840118,
-0.012602255679666996,
0.015940580517053604,
-0.06102316454052925,
0.006465862039476633,
0.06796368211507797,
0.11129644513130188,
0.04782557114958763,
-0.11401328444480896,
0.0336499959230423,
0.03912840038537979,
0.03994440287351608,
-0.17684678733348846,
-0.05528105050325394,
-0.03913953900337219,
0.02359289675951004,
-0.026950789615511894,
0.06152825057506561,
0.06493192166090012,
-0.0013887155801057816,
0.01412943098694086,
-0.04046589881181717,
0.005478945095092058,
0.020111102610826492,
-0.04282504692673683,
-0.038136694580316544
] |
null | null |
transformers
|
# German GPT-2 model
In this repository we release (yet another) GPT-2 model that was trained on various texts for German.
The model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or "dangerous" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model 😉
**Note**: The model was initially released under an anonymous alias (`anonymous-german-nlp/german-gpt2`) so we now "de-anonymize" it.
More details about GPT-2 can be found in the great [Hugging Face](https://huggingface.co/transformers/model_doc/gpt2.html) documentation.
# Changelog
16.08.2021: Public release of a re-trained version of our German GPT-2 model with better results.
15.11.2020: Initial release. Please use the tag `v1.0` for [this older version](https://huggingface.co/dbmdz/german-gpt2/tree/v1.0).
# Training corpora
We use pretty much the same corpora as used for training the DBMDZ BERT model, which can be found in [this repository](https://github.com/dbmdz/berts).
Thanks to the awesome Hugging Face team, it is possible to create byte-level BPE vocabularies with their [Tokenizers](https://github.com/huggingface/tokenizers) library.
Using this library, we created a 50K byte-level BPE vocab based on the training corpora.
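The vocab creation itself is not shown on this card; a minimal sketch with the Tokenizers library could look like the following. The corpus file names are hypothetical, and `min_frequency` is an assumed setting (only the 50K vocab size comes from this card):
```python
import os

from tokenizers import ByteLevelBPETokenizer

# Hypothetical corpus file names; the real training corpora are described above
corpus_files = ["german_corpus_part1.txt", "german_corpus_part2.txt"]

tokenizer = ByteLevelBPETokenizer()
tokenizer.train(files=corpus_files, vocab_size=50_000, min_frequency=2)

# Writes vocab.json and merges.txt, ready to be used for GPT-2 training
os.makedirs("german-gpt2-vocab", exist_ok=True)
tokenizer.save_model("german-gpt2-vocab")
```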
After creating the vocab, we could train the GPT-2 for German on a v3-8 TPU over the complete training corpus for 20 epochs. All hyperparameters
can be found in the official JAX/FLAX language-modeling documentation [here](https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md)
in the Transformers repository.
# Using the model
The model itself can be used in this way:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
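# Note: AutoModelWithLMHead is deprecated in recent Transformers releases;
# AutoModelForCausalLM is the drop-in replacement for GPT-2 models.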
tokenizer = AutoTokenizer.from_pretrained("dbmdz/german-gpt2")
model = AutoModelWithLMHead.from_pretrained("dbmdz/german-gpt2")
```
However, text generation is a bit more interesting, so here's an example that shows how to use the great Transformers *Pipelines* for generating text:
```python
from transformers import pipeline
pipe = pipeline('text-generation', model="dbmdz/german-gpt2",
tokenizer="dbmdz/german-gpt2")
text = pipe("Der Sinn des Lebens ist es", max_length=100)[0]["generated_text"]
print(text)
```
This could output this beautiful text:
```
Der Sinn des Lebens ist es, im Geist zu verweilen, aber nicht in der Welt zu sein, sondern ganz im Geist zu leben.
Die Menschen beginnen, sich nicht nach der Natur und nach der Welt zu richten, sondern nach der Seele,'
```
# License
All models are licensed under [MIT](LICENSE).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/stefan-it/german-gpt/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "de", "license": "mit", "widget": [{"text": "Heute ist sehr sch\u00f6nes Wetter in"}]}
|
text-generation
|
dbmdz/german-gpt2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"onnx",
"safetensors",
"gpt2",
"text-generation",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #jax #onnx #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# German GPT-2 model
In this repository we release (yet another) GPT-2 model that was trained on various texts for German.
The model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or "dangerous" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model
Note: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now "de-anonymize" it.
More details about GPT-2 can be found in the great Hugging Face documentation.
# Changelog
16.08.2021: Public release of a re-trained version of our German GPT-2 model with better results.
15.11.2020: Initial release. Please use the tag 'v1.0' for this older version.
# Training corpora
We use pretty much the same corpora as used for training the DBMDZ BERT model, which can be found in this repository.
Thanks to the awesome Hugging Face team, it is possible to create byte-level BPE vocabularies with their Tokenizers library.
Using this library, we created a 50K byte-level BPE vocab based on the training corpora.
After creating the vocab, we could train the GPT-2 for German on a v3-8 TPU over the complete training corpus for 20 epochs. All hyperparameters
can be found in the official JAX/FLAX language-modeling documentation here
in the Transformers repository.
# Using the model
The model itself can be used in this way:
However, text generation is a bit more interesting, so here's an example that shows how to use the great Transformers *Pipelines* for generating text:
This could output this beautiful text:
# License
All models are licensed under MIT.
# Huggingface model hub
All models are available on the Huggingface model hub.
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
here
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ️
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.",
"# Changelog\n\n16.08.2021: Public release of re-trained version of our German GPT-2 model with better results.\n\n15.11.2020: Initial release. Please use the tag 'v1.0' for this older version.",
"# Training corpora\n\nWe use pretty much the same corpora as used for training the DBMDZ BERT model, that can be found in this repository.\n\nThanks to the awesome Hugging Face team, it is possible to create byte-level BPE with their awesome Tokenizers library.\n\nWith the previously mentioned awesome Tokenizers library we created a 50K byte-level BPE vocab based on the training corpora.\n\nAfter creating the vocab, we could train the GPT-2 for German on a v3-8 TPU over the complete training corpus for 20 epochs. All hyperparameters\ncan be found in the official JAX/FLAX documentation here\nfrom Transformers.",
"# Using the model\n\nThe model itself can be used in this way:\n\n\n\nHowever, text generation is a bit more interesting, so here's an example that shows how to use the great Transformers *Pipelines* for generating text:\n\n\n\nThis could output this beautiful text:",
"# License\n\nAll models are licensed under MIT.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our BERT models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #onnx #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.",
"# Changelog\n\n16.08.2021: Public release of re-trained version of our German GPT-2 model with better results.\n\n15.11.2020: Initial release. Please use the tag 'v1.0' for this older version.",
"# Training corpora\n\nWe use pretty much the same corpora as used for training the DBMDZ BERT model, that can be found in this repository.\n\nThanks to the awesome Hugging Face team, it is possible to create byte-level BPE with their awesome Tokenizers library.\n\nWith the previously mentioned awesome Tokenizers library we created a 50K byte-level BPE vocab based on the training corpora.\n\nAfter creating the vocab, we could train the GPT-2 for German on a v3-8 TPU over the complete training corpus for 20 epochs. All hyperparameters\ncan be found in the official JAX/FLAX documentation here\nfrom Transformers.",
"# Using the model\n\nThe model itself can be used in this way:\n\n\n\nHowever, text generation is a bit more interesting, so here's an example that shows how to use the great Transformers *Pipelines* for generating text:\n\n\n\nThis could output this beautiful text:",
"# License\n\nAll models are licensed under MIT.",
"# Huggingface model hub\n\nAll models are available on the Huggingface model hub.",
"# Contact (Bugs, Feedback, Contribution and more)\n\nFor questions about our BERT models just open an issue\nhere",
"# Acknowledgments\n\nResearch supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).\nThanks for providing access to the TFRC ️\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
73,
151,
48,
148,
58,
10,
18,
25,
70
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #onnx #safetensors #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# German GPT-2 model\n\nIn this repository we release (yet another) GPT-2 model, that was trained on various texts for German.\n\nThe model is meant to be an entry point for fine-tuning on other texts, and it is definitely not as good or \"dangerous\" as the English GPT-3 model. We do not plan extensive PR or staged releases for this model \n\nNote: The model was initially released under an anonymous alias ('anonymous-german-nlp/german-gpt2') so we now \"de-anonymize\" it.\n\nMore details about GPT-2 can be found in the great Hugging Face documentation.# Changelog\n\n16.08.2021: Public release of re-trained version of our German GPT-2 model with better results.\n\n15.11.2020: Initial release. Please use the tag 'v1.0' for this older version.# Training corpora\n\nWe use pretty much the same corpora as used for training the DBMDZ BERT model, that can be found in this repository.\n\nThanks to the awesome Hugging Face team, it is possible to create byte-level BPE with their awesome Tokenizers library.\n\nWith the previously mentioned awesome Tokenizers library we created a 50K byte-level BPE vocab based on the training corpora.\n\nAfter creating the vocab, we could train the GPT-2 for German on a v3-8 TPU over the complete training corpus for 20 epochs. All hyperparameters\ncan be found in the official JAX/FLAX documentation here\nfrom Transformers.# Using the model\n\nThe model itself can be used in this way:\n\n\n\nHowever, text generation is a bit more interesting, so here's an example that shows how to use the great Transformers *Pipelines* for generating text:\n\n\n\nThis could output this beautiful text:# License\n\nAll models are licensed under MIT.# Huggingface model hub\n\nAll models are available on the Huggingface model hub."
] |
[
-0.043667420744895935,
0.12372907996177673,
-0.004200898110866547,
0.012330539524555206,
0.09966511279344559,
0.03860275447368622,
0.12992742657661438,
0.11646299064159393,
-0.06449031084775925,
0.0472186803817749,
-0.023527078330516815,
-0.015477786771953106,
0.1065448671579361,
0.1132730096578598,
0.1186385378241539,
-0.22349144518375397,
-0.003572952700778842,
-0.06152170151472092,
0.022876765578985214,
0.04577621817588806,
0.12512578070163727,
-0.048176415264606476,
0.10981540381908417,
0.05333803594112396,
-0.09269239008426666,
0.010176989249885082,
-0.03402770310640335,
-0.013453303836286068,
0.08876034617424011,
0.07197928428649902,
0.04171696677803993,
-0.046676866710186005,
0.02973799966275692,
-0.11865543574094772,
0.010502944700419903,
0.09704314917325974,
-0.014856492169201374,
0.06770182400941849,
0.0973970890045166,
-0.02146122045814991,
0.15654829144477844,
-0.10303099453449249,
-0.015841711312532425,
0.08253248780965805,
-0.09206089377403259,
-0.12998579442501068,
-0.14923560619354248,
0.1010579839348793,
0.04081864282488823,
0.01969807595014572,
-0.0008402683888562024,
0.026424527168273926,
-0.013510284945368767,
0.0026298081502318382,
0.13443726301193237,
-0.17086303234100342,
-0.028868375346064568,
0.04300575330853462,
0.04546074941754341,
0.049477528780698776,
-0.10444876551628113,
0.06553090363740921,
0.01904965192079544,
0.02193492278456688,
0.07206226140260696,
0.004463044460862875,
0.11764219403266907,
-0.03453409671783447,
-0.06433967500925064,
-0.003249764908105135,
0.06804999709129333,
-0.02563626505434513,
-0.1228061243891716,
-0.20606602728366852,
0.016155708581209183,
0.03873796761035919,
0.032035063952207565,
-0.017551239579916,
0.014211696572601795,
0.00006433339876821265,
0.019801031798124313,
-0.15148194134235382,
-0.10789655894041061,
-0.02183423936367035,
0.047863416373729706,
0.1134059950709343,
-0.0040529887191951275,
0.03156454861164093,
0.009565260261297226,
0.09078960865736008,
-0.13320429623126984,
-0.026244986802339554,
-0.06362131237983704,
-0.08424703776836395,
-0.048711251467466354,
-0.021064843982458115,
-0.04991992190480232,
-0.06329155713319778,
-0.01650313101708889,
0.1321057230234146,
-0.052280887961387634,
-0.015195132233202457,
-0.03396468982100487,
0.0007630821783095598,
0.06786531955003738,
0.17538146674633026,
-0.05631689354777336,
-0.05549246072769165,
0.05127571523189545,
-0.05571504309773445,
0.033256158232688904,
0.020805299282073975,
-0.043436985462903976,
-0.030751297250390053,
0.01915665715932846,
0.055354975163936615,
-0.01418711431324482,
0.06417707353830338,
-0.020725183188915253,
-0.04868491739034653,
0.2593889832496643,
-0.08310515433549881,
0.05272803083062172,
0.014128490351140499,
-0.02350880578160286,
-0.004035221878439188,
0.06200375035405159,
-0.05076184123754501,
-0.0528249628841877,
0.06477224081754684,
-0.04920712485909462,
-0.035495560616254807,
-0.08462009578943253,
-0.09030922502279282,
-0.004301487002521753,
-0.1087135300040245,
-0.03855302929878235,
-0.07552644610404968,
-0.11922674626111984,
-0.01908552087843418,
0.005539315287023783,
-0.012248196639120579,
0.02100951597094536,
0.002897110069170594,
-0.053028278052806854,
-0.034603580832481384,
0.02980729751288891,
-0.036006756126880646,
-0.0013810504460707307,
0.006833640858530998,
-0.07359537482261658,
0.009305664338171482,
-0.04675024747848511,
0.02550656907260418,
-0.0573585070669651,
-0.009798931889235973,
-0.20666277408599854,
0.08677089959383011,
-0.06570851802825928,
0.005003483034670353,
-0.05185500159859657,
-0.02371205948293209,
0.022642578929662704,
0.028350651264190674,
0.026011481881141663,
0.07139113545417786,
-0.10946808010339737,
-0.06963641941547394,
0.18042360246181488,
-0.10180162638425827,
-0.02715366892516613,
0.10127389430999756,
-0.013029579073190689,
0.04744994267821312,
0.09104648232460022,
0.06640178710222244,
0.11806519329547882,
-0.15398892760276794,
-0.03895973786711693,
-0.056317780166864395,
-0.06752385199069977,
0.0626777857542038,
0.023286942392587662,
-0.04204494133591652,
0.04264288395643234,
0.05178491771221161,
-0.09698639065027237,
-0.034813545644283295,
0.014954347163438797,
-0.023219317197799683,
0.017902163788676262,
-0.046296365559101105,
-0.014296002686023712,
-0.033590167760849,
-0.05741259828209877,
-0.04716290906071663,
-0.14736564457416534,
0.046483512967824936,
0.0870128870010376,
-0.02106732688844204,
0.017035681754350662,
-0.01994732953608036,
0.024992985650897026,
-0.003627458121627569,
-0.010007958859205246,
-0.055935051292181015,
-0.0981883630156517,
0.06406121701002121,
-0.151697039604187,
0.11000191420316696,
0.012406228110194206,
0.04039089009165764,
0.13188406825065613,
-0.05869514122605324,
-0.0016015091678127646,
-0.0925792008638382,
-0.019729597494006157,
-0.01604364812374115,
-0.07789576053619385,
-0.06894544512033463,
-0.03972052037715912,
0.14707820117473602,
-0.037526845932006836,
0.02332990989089012,
0.011600853875279427,
0.06856667250394821,
0.030328914523124695,
-0.06500226259231567,
0.012901480309665203,
-0.010375123471021652,
0.0018403030699118972,
-0.047776974737644196,
0.015804149210453033,
0.04878490790724754,
-0.05925972759723663,
0.1020931825041771,
-0.13720326125621796,
-0.15775035321712494,
0.06934121996164322,
0.11815622448921204,
-0.1145961731672287,
-0.0624750554561615,
-0.05902465805411339,
-0.002899523824453354,
-0.027897382155060768,
-0.09518317133188248,
0.18929614126682281,
-0.015124044381082058,
0.06271203607320786,
-0.07637499272823334,
-0.05739105865359306,
-0.008972445502877235,
0.006889942567795515,
-0.016168925911188126,
-0.0052395970560610294,
-0.006643917877227068,
-0.11012561619281769,
0.09164010733366013,
-0.006339380983263254,
0.04874960333108902,
0.20161157846450806,
0.06146201491355896,
-0.08289454132318497,
-0.029594488441944122,
-0.011719965375959873,
0.04602453485131264,
0.0946042537689209,
0.000854838581290096,
-0.016984300687909126,
0.02877088449895382,
-0.0011680572060868144,
0.05968170240521431,
-0.06639040261507034,
0.051697831600904465,
0.026564447209239006,
-0.028322044759988785,
0.03487803786993027,
-0.0005350023857317865,
-0.022275805473327637,
0.05601915717124939,
0.03581548109650612,
0.11309002339839935,
-0.012563920579850674,
-0.013158430345356464,
-0.07547993212938309,
0.10415517538785934,
-0.11215519160032272,
-0.2303195595741272,
-0.20333360135555267,
0.05919199436903,
-0.10498124361038208,
0.0348728746175766,
0.043053675442934036,
-0.03523554280400276,
-0.059638701379299164,
-0.03047044761478901,
0.10844635963439941,
0.06155968829989433,
-0.07483765482902527,
-0.07022614032030106,
-0.0009223752422258258,
0.030121371150016785,
-0.17044219374656677,
-0.026434116065502167,
0.0366896390914917,
-0.1729808747768402,
0.005106612574309111,
0.025190090760588646,
0.040419816970825195,
-0.011723978444933891,
-0.04074052348732948,
-0.031599342823028564,
-0.023948729038238525,
0.12004472315311432,
-0.09963563084602356,
0.11667941510677338,
0.06332361698150635,
-0.004885462112724781,
0.0802616998553276,
-0.028100304305553436,
0.014287848025560379,
-0.030985509976744652,
0.005518959369510412,
0.05895843356847763,
-0.03433743491768837,
-0.17310410737991333,
-0.07211688160896301,
-0.031117046251893044,
0.05147094652056694,
0.0008251593681052327,
0.043362487107515335,
0.14211608469486237,
0.02860197238624096,
-0.09211242944002151,
0.004350860137492418,
0.06530690938234329,
0.08924276381731033,
-0.1076958030462265,
0.014568785205483437,
0.020520783960819244,
-0.044872529804706573,
-0.004689284134656191,
0.12325376272201538,
0.04152699559926987,
0.07517610490322113,
-0.029963631182909012,
0.13231632113456726,
0.04921215400099754,
0.02307193912565708,
0.009439807385206223,
0.08803586661815643,
-0.04179250821471214,
0.027010761201381683,
-0.0465872585773468,
-0.0921037420630455,
-0.0289296954870224,
0.10360316932201385,
0.029703326523303986,
-0.022174464538693428,
-0.0272684246301651,
-0.10481921583414078,
0.06293001025915146,
0.19558513164520264,
0.01551654189825058,
-0.15649767220020294,
-0.10315775126218796,
0.005387596786022186,
0.033230897039175034,
-0.06877875328063965,
-0.04736994579434395,
0.04787846654653549,
-0.1408752053976059,
0.08394616097211838,
-0.0079211900010705,
0.05555384233593941,
-0.02387378178536892,
0.01730320416390896,
0.047019653022289276,
0.13420864939689636,
-0.012772316113114357,
0.06885892897844315,
-0.11556626856327057,
-0.026836995035409927,
0.028954682871699333,
0.12620338797569275,
-0.08324402570724487,
0.038324397057294846,
0.0658797100186348,
-0.014036357402801514,
0.1186317577958107,
0.017763668671250343,
0.0012474514078348875,
0.03598006069660187,
-0.0903315395116806,
0.007826164364814758,
0.09234540909528732,
-0.12278199195861816,
0.059726424515247345,
-0.024629225954413414,
-0.00819272268563509,
-0.029802663251757622,
0.06232248991727829,
-0.11690280586481094,
-0.13550852239131927,
0.04800696298480034,
-0.09837839752435684,
0.04420774057507515,
-0.06442996859550476,
0.010297209024429321,
-0.049317244440317154,
0.2834377586841583,
-0.027229543775320053,
-0.08959636092185974,
-0.10029768943786621,
-0.059441860765218735,
0.07342104613780975,
-0.047174856066703796,
0.04168150573968887,
-0.008591809310019016,
0.17354723811149597,
-0.0436282753944397,
-0.09972408413887024,
-0.0056083593517541885,
-0.06478817760944366,
-0.14784599840641022,
-0.0011867162538692355,
0.13729055225849152,
0.07760221511125565,
0.032906174659729004,
0.03611365705728531,
0.0415431447327137,
0.017690053209662437,
-0.09472758322954178,
0.07352571934461594,
0.20152929425239563,
0.034201644361019135,
0.06649720668792725,
-0.1177491694688797,
-0.10946058481931686,
-0.07927830517292023,
0.01636703684926033,
0.042726561427116394,
0.24216380715370178,
-0.05127335339784622,
0.14601804316043854,
0.2095748335123062,
-0.1037381961941719,
-0.2465355098247528,
-0.013123901560902596,
0.030291549861431122,
0.057025179266929626,
-0.03939104825258255,
-0.20554527640342712,
0.0712004005908966,
0.09329358488321304,
-0.008039177395403385,
0.08429232984781265,
-0.1480913758277893,
-0.11749755591154099,
-0.0141843780875206,
-0.002674564253538847,
-0.06063519045710564,
-0.12010463327169418,
-0.047224532812833786,
-0.06434430927038193,
-0.03798557445406914,
0.1450783908367157,
-0.07125984877347946,
0.10540605336427689,
-0.02032581716775894,
0.03137025982141495,
0.04989377409219742,
-0.03402620181441307,
0.08863379061222076,
-0.04952326789498329,
0.050685834139585495,
-0.08046244829893112,
0.054071635007858276,
0.13096152245998383,
-0.04564113914966583,
0.13786464929580688,
-0.03672708943486214,
0.035166166722774506,
-0.05330994725227356,
-0.01035898458212614,
-0.107984259724617,
0.1272977888584137,
-0.05445566400885582,
-0.08126038312911987,
-0.06948024034500122,
0.08968699723482132,
0.01795225776731968,
-0.020310234278440475,
-0.03854471445083618,
-0.01999822072684765,
0.09684458374977112,
0.10055806487798691,
0.09505133330821991,
0.04432636499404907,
-0.034741975367069244,
-0.022229086607694626,
-0.04825136065483093,
0.06464516371488571,
-0.0053718313574790955,
0.037001919001340866,
0.06388702243566513,
0.01931808516383171,
0.09476777166128159,
0.005584049504250288,
-0.1805424690246582,
-0.00820985808968544,
0.08773142844438553,
-0.1618260145187378,
-0.13925640285015106,
-0.0569603256881237,
0.043082959949970245,
-0.018498383462429047,
0.08093893527984619,
0.14473628997802734,
-0.061234038323163986,
-0.07126828283071518,
-0.024016639217734337,
0.07316870242357254,
-0.07157330214977264,
0.026095813140273094,
0.021311217918992043,
-0.02533072791993618,
-0.042475927621126175,
0.14192324876785278,
0.02817060612142086,
0.0020744851790368557,
0.026583902537822723,
0.08797887712717056,
-0.06141414865851402,
-0.042418453842401505,
0.002708104904741049,
0.0774654746055603,
-0.06462319940328598,
-0.02557094767689705,
0.03983256593346596,
-0.06041247025132179,
-0.03976694121956825,
0.021411143243312836,
-0.009508637711405754,
0.05174534022808075,
0.010341818444430828,
0.06798610091209412,
-0.11575368046760559,
0.04486745223402977,
-0.03675628453493118,
0.029457293450832367,
-0.013907328248023987,
0.1360316276550293,
-0.014637783169746399,
-0.023591719567775726,
-0.013727664947509766,
0.006756164599210024,
-0.0712658166885376,
-0.0848793312907219,
-0.14550374448299408,
-0.04314649850130081,
-0.034670744091272354,
-0.026078559458255768,
-0.020535023882985115,
0.022803861647844315,
0.0530192069709301,
0.02737465687096119,
-0.03193262219429016,
-0.07061391323804855,
-0.0682205781340599,
0.05836552008986473,
-0.15743175148963928,
-0.001038314774632454,
0.04676917567849159,
-0.07967235893011093,
0.12121371179819107,
0.049332235008478165,
0.010867172852158546,
0.007619056850671768,
-0.049679048359394073,
-0.005390059668570757,
-0.023009836673736572,
-0.03488202020525932,
0.025583023205399513,
-0.1316206455230713,
-0.009330932050943375,
-0.013176198117434978,
-0.05700433626770973,
-0.03536687418818474,
0.07766364514827728,
-0.062248218804597855,
0.04787011817097664,
0.07272783666849136,
-0.02037743479013443,
-0.07901792228221893,
0.03605058044195175,
0.08395140618085861,
0.03715519979596138,
0.04371534287929535,
-0.04558812454342842,
0.017465751618146896,
-0.09769178181886673,
-0.012879558838903904,
-0.007543161977082491,
-0.03326213359832764,
-0.03374279662966728,
0.0036663985811173916,
0.029387617483735085,
0.01239384338259697,
0.11654916405677795,
-0.005275166127830744,
-0.040993936359882355,
0.016476521268486977,
-0.004925168585032225,
-0.06876751780509949,
0.03400567173957825,
0.05724781006574631,
-0.06345228105783463,
-0.01848476566374302,
-0.06718498468399048,
0.014580996707081795,
-0.06417747586965561,
-0.08387403935194016,
0.1288590133190155,
0.06545546650886536,
0.04960523918271065,
0.022322343662381172,
0.08618108928203583,
-0.05199478939175606,
-0.10934511572122574,
-0.050895072519779205,
0.013638215139508247,
0.036610472947359085,
-0.09038416296243668,
0.08103694766759872,
0.11969838291406631,
-0.16440023481845856,
0.1377773880958557,
-0.001470517716370523,
-0.05023472383618355,
-0.033267270773649216,
-0.11635758727788925,
-0.019839324057102203,
0.0452362485229969,
0.003075420157983899,
-0.09094788879156113,
0.05169803649187088,
-0.006255842745304108,
0.0023556172382086515,
-0.07891799509525299,
0.12869180738925934,
-0.09821228682994843,
-0.08353297412395477,
0.08329075574874878,
0.03617524728178978,
0.026621324941515923,
0.03144954890012741,
0.03812897205352783,
0.0339612178504467,
0.1469329446554184,
0.08149793744087219,
0.05823749676346779,
0.07239824533462524,
0.017251115292310715,
-0.02699725143611431,
-0.05156583711504936,
-0.006214170251041651,
-0.040828101336956024,
-0.01488606445491314,
0.05907995253801346,
-0.0041649602353572845,
-0.039847806096076965,
0.011440003290772438,
0.1336643397808075,
-0.03976045176386833,
-0.0503821037709713,
-0.0953444167971611,
0.23683710396289825,
-0.006438615266233683,
0.02450726181268692,
0.02048284187912941,
-0.09453476220369339,
0.031863417476415634,
0.12790751457214355,
0.22780294716358185,
0.03280423581600189,
0.01081054750829935,
-0.007581715472042561,
-0.0070341904647648335,
-0.006386362016201019,
0.11378562450408936,
-0.016391223296523094,
0.1748783141374588,
-0.0013814029516652226,
0.23145979642868042,
0.008973595686256886,
0.0026004461105912924,
-0.03410967439413071,
0.11986102908849716,
-0.09909563511610031,
-0.01834964007139206,
-0.007900473661720753,
0.029302557930350304,
-0.007459588814526796,
-0.30222031474113464,
-0.015432942658662796,
-0.0330064557492733,
-0.09715742617845535,
0.00917057041078806,
-0.021299323067069054,
0.01947583630681038,
0.13300824165344238,
0.0351710170507431,
0.03481444716453552,
0.21986567974090576,
-0.018848666921257973,
-0.04375622421503067,
-0.016459565609693527,
0.03208751976490021,
-0.1183270663022995,
0.2207726389169693,
0.01239868625998497,
-0.04250290244817734,
0.05423561483621597,
-0.04571761563420296,
-0.13312219083309174,
-0.00903210137039423,
-0.018274322152137756,
-0.07173697650432587,
0.0030708331614732742,
0.14866410195827484,
-0.031141020357608795,
-0.024719690904021263,
0.018833326175808907,
-0.05056512728333473,
0.04721808433532715,
0.07535035163164139,
-0.03176162391901016,
-0.037273287773132324,
0.09996772557497025,
-0.092257060110569,
0.15495166182518005,
0.16932885348796844,
-0.030928319320082664,
0.0015741153620183468,
-0.08572905510663986,
0.024992452934384346,
0.06820295751094818,
0.0346924252808094,
-0.003232360817492008,
-0.11180128157138824,
-0.0149929104372859,
0.06214600428938866,
0.07377135008573532,
-0.20981211960315704,
-0.023889711126685143,
0.026845816522836685,
-0.002098789205774665,
-0.029919518157839775,
0.07562407106161118,
0.015590465627610683,
0.02532508783042431,
-0.014433559961616993,
-0.013528069481253624,
-0.022584103047847748,
0.04367055371403694,
-0.05439997836947441,
-0.05594708397984505
] |
null | null |
transformers
|
# T5 Base Model for Named Entity Recognition (NER, CoNLL-2003)
In this repository, we open source a T5 Base model that was fine-tuned on the official CoNLL-2003 NER dataset.
We use the great [TANL library](https://github.com/amazon-research/tanl) from Amazon for fine-tuning the model.
The exact fine-tuning approach is presented in the paper "TANL: Structured Prediction as Translation between Augmented Natural Languages"
by Giovanni Paolini, Ben Athiwaratkun, Jason Krone, Jie Ma, Alessandro Achille, Rishita Anubhai, Cicero Nogueira dos Santos, Bing Xiang and Stefano Soatto.
# Fine-Tuning
We use the same hyper-parameter settings as in the official implementation, with one minor change: instead of using 8 V100 GPUs, we train the model
on a single V100 GPU and use gradient accumulation. The slightly modified configuration file (`config.ini`) then looks like:
```ini
[conll03]
datasets = conll03
model_name_or_path = t5-base
num_train_epochs = 10
max_seq_length = 256
max_seq_length_eval = 512
per_device_train_batch_size = 4
per_device_eval_batch_size = 4
do_train = True
do_eval = True
do_predict = True
gradient_accumulation_steps = 8
```
With `per_device_train_batch_size = 4` and `gradient_accumulation_steps = 8`, the effective batch size is 4 × 8 = 32 examples, preserving that of the original 8-GPU setup. It took around 2 hours to fine-tune the model on the 14,041 training sentences of the CoNLL-2003 dataset.
# Evaluation
On the development set, the following evaluation results could be achieved:
```json
{
"entity_precision": 0.9536446086664427,
"entity_recall": 0.9555705149781218,
"entity_f1": 0.9546065904505716,
"entity_precision_no_type": 0.9773261672824992,
"entity_recall_no_type": 0.9792998990238977,
"entity_f1_no_type": 0.9783120376597176
}
```
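As a quick plausibility check, `entity_f1` is simply the harmonic mean of `entity_precision` and `entity_recall`, which a few lines of Python can reproduce:

```python
# entity_f1 is the harmonic mean of entity_precision and entity_recall
p = 0.9536446086664427
r = 0.9555705149781218
f1 = 2 * p * r / (p + r)
print(round(f1, 6))  # 0.954607, matching the reported entity_f1
```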
The evaluation results on the test set look like:
```json
{
"entity_precision": 0.912182296231376,
"entity_recall": 0.9213881019830028,
"entity_f1": 0.9167620893155995,
"entity_precision_no_type": 0.953900087642419,
"entity_recall_no_type": 0.9635269121813032,
"entity_f1_no_type": 0.9586893332158901
}
```
To summarize: this model achieves an F1-score of 95.46% on the development set and 91.68% on the test set. The paper reports an F1-score of 91.7%.
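For a quick test, the model can be loaded via the Transformers `text2text-generation` pipeline. This is a minimal sketch (not part of the original TANL training code); note that, following the TANL approach, the model generates an augmented version of the input sentence with entities annotated inline, rather than returning entity spans directly:

```python
from transformers import pipeline

# Load the fine-tuned model as a text2text-generation pipeline
nlp = pipeline("text2text-generation", model="dbmdz/t5-base-conll03-english")

sentence = "My name is Clara Clever and I live in Berkeley , California ."
print(nlp(sentence)[0]["generated_text"])
```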
# License
The model is licensed under [MIT](https://choosealicense.com/licenses/mit/).
# Acknowledgments
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
|
{"language": "en", "license": "mit", "datasets": ["conll2003"], "widget": [{"text": "My name is Clara Clever and I live in Berkeley , California ."}]}
|
text2text-generation
|
dbmdz/t5-base-conll03-english
|
[
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"en",
"dataset:conll2003",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #safetensors #t5 #text2text-generation #en #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# T5 Base Model for Named Entity Recognition (NER, CoNLL-2003)
In this repository, we open source a T5 Base model that was fine-tuned on the official CoNLL-2003 NER dataset.
We use the great TANL library from Amazon for fine-tuning the model.
The exact fine-tuning approach is presented in the paper "TANL: Structured Prediction as Translation between Augmented Natural Languages"
by Giovanni Paolini, Ben Athiwaratkun, Jason Krone, Jie Ma, Alessandro Achille, Rishita Anubhai, Cicero Nogueira dos Santos, Bing Xiang and Stefano Soatto.
# Fine-Tuning
We use the same hyper-parameter settings as in the official implementation, with one minor change: instead of using 8 V100 GPUs, we train the model
on a single V100 GPU and use gradient accumulation. The slightly modified configuration file ('URL') then looks like:
It took around 2 hours to fine-tune the model on the 14,041 training sentences of the CoNLL-2003 dataset.
# Evaluation
On the development set, the following evaluation results could be achieved:
The evaluation results on the test set look like:
To summarize: this model achieves an F1-score of 95.46% on the development set and 91.68% on the test set. The paper reports an F1-score of 91.7%.
# License
The model is licensed under MIT.
# Acknowledgments
Thanks to the generous support from the Hugging Face team,
it is possible to download both cased and uncased models from their S3 storage
|
[
"# T5 Base Model for Named Entity Recognition (NER, CoNLL-2003)\n\nIn this repository, we open source a T5 Base model, that was fine-tuned on the official CoNLL-2003 NER dataset.\n\nWe use the great TANL library from Amazon for fine-tuning the model.\n\nThe exact approach of fine-tuning is presented in the \"TANL: Structured Prediction as Translation between Augmented Natural Languages\"\npaper from Giovanni Paolini, Ben Athiwaratkun, Jason Krone, Jie Ma, Alessandro Achille, Rishita Anubhai, Cicero Nogueira dos Santos, Bing Xiang and Stefano Soatto.",
"# Fine-Tuning\n\nWe use the same hyper-parameter settings as used in the official implementation with one minor change. Instead of using 8 V100 GPUs, we train the model\non one V100 GPU and used gradient accumulation. The slighly modified configuration file ('URL') then looks like:\n\n\n\nIt took around 2 hours to fine-tune that model on the 14,041 training sentences of CoNLL-2003 dataset.",
"# Evaluation\n\nOn the development set, the following evaluation results could be achieved:\n\n\n\nThe evaluation results on the test set looks like:\n\n\n\nTo summarize: On the development set, 95.46% F1-Score and 91.68% on test set were achieved with this model. The paper reported a F1-Score of 91.7%.",
"# License\n\nThe models is licensed under MIT.",
"# Acknowledgments\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
"TAGS\n#transformers #pytorch #safetensors #t5 #text2text-generation #en #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# T5 Base Model for Named Entity Recognition (NER, CoNLL-2003)\n\nIn this repository, we open source a T5 Base model, that was fine-tuned on the official CoNLL-2003 NER dataset.\n\nWe use the great TANL library from Amazon for fine-tuning the model.\n\nThe exact approach of fine-tuning is presented in the \"TANL: Structured Prediction as Translation between Augmented Natural Languages\"\npaper from Giovanni Paolini, Ben Athiwaratkun, Jason Krone, Jie Ma, Alessandro Achille, Rishita Anubhai, Cicero Nogueira dos Santos, Bing Xiang and Stefano Soatto.",
"# Fine-Tuning\n\nWe use the same hyper-parameter settings as used in the official implementation with one minor change. Instead of using 8 V100 GPUs, we train the model\non one V100 GPU and used gradient accumulation. The slighly modified configuration file ('URL') then looks like:\n\n\n\nIt took around 2 hours to fine-tune that model on the 14,041 training sentences of CoNLL-2003 dataset.",
"# Evaluation\n\nOn the development set, the following evaluation results could be achieved:\n\n\n\nThe evaluation results on the test set looks like:\n\n\n\nTo summarize: On the development set, 95.46% F1-Score and 91.68% on test set were achieved with this model. The paper reported a F1-Score of 91.7%.",
"# License\n\nThe models is licensed under MIT.",
"# Acknowledgments\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
67,
151,
97,
72,
10,
37
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #t5 #text2text-generation #en #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# T5 Base Model for Named Entity Recognition (NER, CoNLL-2003)\n\nIn this repository, we open source a T5 Base model, that was fine-tuned on the official CoNLL-2003 NER dataset.\n\nWe use the great TANL library from Amazon for fine-tuning the model.\n\nThe exact approach of fine-tuning is presented in the \"TANL: Structured Prediction as Translation between Augmented Natural Languages\"\npaper from Giovanni Paolini, Ben Athiwaratkun, Jason Krone, Jie Ma, Alessandro Achille, Rishita Anubhai, Cicero Nogueira dos Santos, Bing Xiang and Stefano Soatto.# Fine-Tuning\n\nWe use the same hyper-parameter settings as used in the official implementation with one minor change. Instead of using 8 V100 GPUs, we train the model\non one V100 GPU and used gradient accumulation. The slighly modified configuration file ('URL') then looks like:\n\n\n\nIt took around 2 hours to fine-tune that model on the 14,041 training sentences of CoNLL-2003 dataset.# Evaluation\n\nOn the development set, the following evaluation results could be achieved:\n\n\n\nThe evaluation results on the test set looks like:\n\n\n\nTo summarize: On the development set, 95.46% F1-Score and 91.68% on test set were achieved with this model. The paper reported a F1-Score of 91.7%.# License\n\nThe models is licensed under MIT.# Acknowledgments\n\nThanks to the generous support from the Hugging Face team,\nit is possible to download both cased and uncased models from their S3 storage"
] |
[
-0.09601990878582001,
0.05542057380080223,
-0.002564877038821578,
0.06370643526315689,
0.13014665246009827,
0.0204789936542511,
0.13002672791481018,
0.07585495710372925,
-0.09913312643766403,
0.07273858785629272,
-0.030530795454978943,
0.0076041314750909805,
0.08740589767694473,
0.15241491794586182,
0.010196707211434841,
-0.24507221579551697,
0.04484914243221283,
-0.08035673946142197,
-0.06328627467155457,
0.08430948853492737,
0.10788388550281525,
-0.06958066672086716,
0.08305227011442184,
0.014828494749963284,
-0.10289252549409866,
-0.015076268464326859,
0.03247183933854103,
-0.030846403911709785,
0.10739539563655853,
0.08686107397079468,
0.07486055791378021,
0.05393103137612343,
0.12756897509098053,
-0.12998636066913605,
0.018774444237351418,
0.07696384936571121,
0.05350673943758011,
0.049567412585020065,
0.10812263190746307,
0.1134471595287323,
0.20782729983329773,
-0.026944221928715706,
0.0033393732737749815,
0.06287899613380432,
-0.09921252727508545,
-0.012249468825757504,
-0.17086216807365417,
-0.008060211315751076,
0.07254516333341599,
0.04635694995522499,
-0.0042270817793905735,
0.07427913695573807,
-0.0887296125292778,
0.03364328667521477,
0.0697302296757698,
-0.23657064139842987,
-0.06684626638889313,
0.10168334096670151,
0.010778364725410938,
0.045540519058704376,
-0.07253416627645493,
0.020682403817772865,
0.04991244897246361,
-0.02514072135090828,
0.04199735075235367,
-0.014520383439958096,
-0.015565123409032822,
-0.006332778371870518,
-0.10978421568870544,
-0.035682063549757004,
0.1079414039850235,
0.028646504506468773,
-0.09899292886257172,
-0.11762377619743347,
-0.10881640017032623,
0.020242229104042053,
-0.011578630656003952,
-0.04051949828863144,
0.025118397548794746,
0.03774144500494003,
0.10647723823785782,
-0.08714808523654938,
-0.13063262403011322,
-0.0630139708518982,
-0.013078682124614716,
0.14967580139636993,
0.046282775700092316,
0.032423730939626694,
-0.019306255504488945,
0.10259942710399628,
-0.07768523693084717,
-0.022714514285326004,
-0.016429247334599495,
-0.05191564932465553,
-0.1112164705991745,
0.017121287062764168,
-0.041822973638772964,
-0.07928015291690826,
0.004002361558377743,
0.19038693606853485,
-0.020820878446102142,
-0.0014049876481294632,
0.06159641221165657,
0.007280003745108843,
-0.01820693351328373,
0.1835528463125229,
-0.00849834829568863,
-0.10422667115926743,
0.062145933508872986,
0.03176809102296829,
0.0605708509683609,
-0.032431602478027344,
-0.0011391593143343925,
-0.06845603883266449,
-0.06095891445875168,
0.006741694640368223,
0.021708037704229355,
0.06161392480134964,
-0.02014213800430298,
-0.028992200270295143,
0.22788961231708527,
-0.11792203038930893,
0.017664315178990364,
0.005742836277931929,
-0.09672965854406357,
0.015597571618855,
0.008710683323442936,
-0.01912197284400463,
-0.05774657428264618,
0.0952349305152893,
-0.055113501846790314,
-0.04302092641592026,
-0.1241331398487091,
-0.07525854557752609,
0.008890275843441486,
-0.0730588361620903,
-0.04724343121051788,
-0.06719252467155457,
-0.17983464896678925,
-0.07541099190711975,
0.02879492938518524,
-0.028257396072149277,
-0.008392288349568844,
-0.011349226348102093,
-0.014067033305764198,
-0.010679034516215324,
-0.012182800099253654,
-0.012112503871321678,
-0.050615355372428894,
0.03863165155053139,
0.03327645733952522,
0.03545190393924713,
-0.04615597054362297,
0.021931065246462822,
-0.11375083774328232,
-0.01739349588751793,
-0.1369909644126892,
0.023022247478365898,
-0.018076203763484955,
0.006454447750002146,
-0.01458063255995512,
-0.034272700548172,
-0.11484362185001373,
0.04527891427278519,
0.062381960451602936,
0.14715136587619781,
-0.15093356370925903,
-0.031621675938367844,
0.12993194162845612,
-0.12767954170703888,
0.020252488553524017,
0.1589045524597168,
0.0348176546394825,
0.019544608891010284,
0.07184009253978729,
0.11728854477405548,
0.09401369839906693,
-0.10742294043302536,
0.0048203859478235245,
-0.013455410487949848,
-0.06815872341394424,
0.012001996859908104,
0.024911759421229362,
0.011052246205508709,
0.08030227571725845,
0.0799054205417633,
-0.06083368882536888,
0.006408669054508209,
-0.03697307035326958,
-0.033341556787490845,
-0.033838286995887756,
-0.03353838622570038,
-0.05328132212162018,
0.013613350689411163,
-0.047396641224622726,
-0.05736628919839859,
-0.08209040760993958,
-0.08712880313396454,
0.15449050068855286,
-0.05932841822504997,
0.04257189854979515,
-0.07958520948886871,
0.06490780413150787,
0.06591203808784485,
-0.00950735155493021,
-0.10924876481294632,
-0.11721305549144745,
0.054482247680425644,
0.03033587522804737,
0.03787883371114731,
0.07575105130672455,
0.01839776150882244,
0.07440292090177536,
-0.032231405377388,
-0.012922147288918495,
-0.06364064663648605,
-0.0616048164665699,
-0.03870678320527077,
-0.06752568483352661,
-0.056672319769859314,
-0.03706159070134163,
0.16298840939998627,
-0.14827163517475128,
0.04792865738272667,
-0.004242041148245335,
0.04733412712812424,
-0.033128488808870316,
-0.055632464587688446,
-0.03320157900452614,
-0.028769787400960922,
-0.04082081466913223,
-0.07846193015575409,
0.05388045683503151,
0.05512067675590515,
0.04497801512479782,
0.06514038145542145,
-0.17859646677970886,
-0.045157257467508316,
0.09512827545404434,
0.10484080761671066,
-0.016517410054802895,
-0.0523114874958992,
-0.0495733916759491,
-0.027441686019301414,
-0.10280386358499527,
0.03559449315071106,
0.07983841001987457,
0.021702121943235397,
0.09284123033285141,
-0.08860661834478378,
0.057140056043863297,
-0.014053752645850182,
-0.03355567157268524,
0.0032965827267616987,
0.03310798108577728,
0.0489974245429039,
-0.025381775572896004,
0.030843837186694145,
-0.049076952040195465,
0.05562325194478035,
0.12335553765296936,
0.013101556338369846,
-0.10537143796682358,
0.02112039551138878,
0.04812513291835785,
-0.026226963847875595,
0.05982411280274391,
0.04597241431474686,
0.002167452359572053,
0.02240537665784359,
0.021305054426193237,
0.06928011029958725,
-0.18423421680927277,
0.02492337115108967,
0.01022670790553093,
-0.04990053549408913,
0.010036309249699116,
0.01856805570423603,
-0.040567390620708466,
0.07791291177272797,
0.001827206346206367,
-0.02768227830529213,
-0.03717609867453575,
-0.02246915176510811,
-0.04348375275731087,
0.10073551535606384,
-0.053477056324481964,
-0.23853357136249542,
-0.18169249594211578,
0.07754146307706833,
-0.10899896174669266,
0.023394078016281128,
0.0035528310108929873,
-0.07154226303100586,
-0.10746806114912033,
-0.04326267167925835,
0.058150965720415115,
0.09034553170204163,
-0.013872592709958553,
-0.03963271155953407,
0.027664272114634514,
0.004705997183918953,
-0.13081000745296478,
-0.021299075335264206,
-0.04985974729061127,
-0.14278990030288696,
0.029505794867873192,
-0.018306201323866844,
0.06980551034212112,
0.07484537363052368,
-0.049609262496232986,
-0.014815798960626125,
0.03834696114063263,
0.1359182447195053,
-0.03661494329571724,
0.11686071008443832,
0.2786250412464142,
0.07888772338628769,
0.079771488904953,
0.08248069882392883,
-0.005045684985816479,
-0.08209787309169769,
0.07949891686439514,
0.058895327150821686,
-0.058297380805015564,
-0.2157181203365326,
-0.09715504199266434,
-0.06571085751056671,
-0.004612154792994261,
0.013109162449836731,
0.04377259686589241,
-0.035528868436813354,
0.042733196169137955,
-0.11392644047737122,
-0.03322330489754677,
0.11849316209554672,
0.039701543748378754,
0.036679670214653015,
0.019422920420765877,
0.07378602772951126,
-0.06666345149278641,
0.00289625721052289,
0.1399756819009781,
0.009974827989935875,
0.12321531027555466,
-0.06228415668010712,
0.06014741212129593,
0.006449806038290262,
0.14505693316459656,
0.02042846754193306,
0.06444447487592697,
-0.011790615506470203,
-0.014082835055887699,
-0.05515170097351074,
-0.03795682266354561,
-0.11367107927799225,
0.07098082453012466,
-0.010909623466432095,
-0.07108194380998611,
-0.06392448395490646,
0.10577831417322159,
0.02500951662659645,
0.19633719325065613,
0.042939066886901855,
-0.210288867354393,
-0.08946803212165833,
-0.006173667963594198,
0.0065510147251188755,
-0.08314445614814758,
0.04735638201236725,
0.1425447165966034,
-0.06851602345705032,
0.035080697387456894,
-0.04140155389904976,
0.09828958660364151,
-0.0067546172067523,
0.008228196762502193,
-0.00908375158905983,
0.15434877574443817,
0.007983000949025154,
0.06128832697868347,
-0.0979258194565773,
0.09593740105628967,
-0.007367897778749466,
0.12689463794231415,
0.003879861207678914,
0.02898813784122467,
0.07956186681985855,
0.15216360986232758,
0.06815239787101746,
0.002881202846765518,
-0.09874498844146729,
-0.08234883844852448,
-0.10500163584947586,
0.0029367452953010798,
0.08295921981334686,
0.028648102656006813,
0.07829822599887848,
-0.1083257868885994,
0.008545475080609322,
-0.007236715871840715,
0.0683002918958664,
-0.09551642835140228,
-0.08845704793930054,
0.04700537770986557,
0.007757748011499643,
-0.006832911632955074,
-0.09239180386066437,
-0.010950835421681404,
-0.08331869542598724,
0.26635992527008057,
0.11902093142271042,
-0.02158404514193535,
-0.15577614307403564,
0.02706967107951641,
0.11569683998823166,
-0.061922717839479446,
0.03737449273467064,
0.008955113589763641,
0.11834052950143814,
0.00961278472095728,
-0.07435072958469391,
0.0022451255936175585,
-0.0669785663485527,
-0.147308349609375,
-0.0006262485985644162,
0.0601775124669075,
0.026841983199119568,
0.01247702818363905,
0.023645887151360512,
0.003236762247979641,
-0.02767200954258442,
-0.0817006304860115,
-0.04036320745944977,
0.12483889609575272,
0.03918546438217163,
0.005128119606524706,
-0.09752294421195984,
-0.05790651962161064,
-0.04008003696799278,
-0.050905101001262665,
-0.03241315484046936,
0.18568970263004303,
-0.02845292165875435,
0.08439440280199051,
0.1593085378408432,
-0.05524609982967377,
-0.2607647180557251,
-0.004864221904426813,
-0.005187216214835644,
0.09472867846488953,
0.023640399798750877,
-0.11618126183748245,
0.06801152229309082,
0.10422489047050476,
-0.02304733358323574,
-0.0538175143301487,
-0.1706426590681076,
-0.10620040446519852,
0.017614377662539482,
0.009389901533722878,
0.11319619417190552,
-0.06176157295703888,
0.015475227497518063,
-0.04741869121789932,
-0.08704228699207306,
0.049958132207393646,
-0.035609375685453415,
0.10626332461833954,
-0.006610268261283636,
-0.016699908301234245,
0.053700268268585205,
-0.026954613626003265,
0.12377343326807022,
-0.04385283589363098,
0.09413152933120728,
-0.028884705156087875,
0.039856914430856705,
0.047039128839969635,
-0.046600934118032455,
0.11986035108566284,
0.04151040315628052,
0.0800732746720314,
-0.04004855453968048,
-0.05722370743751526,
-0.04068249091506004,
0.07372453063726425,
-0.01688043586909771,
-0.03304733335971832,
-0.07595792412757874,
0.054126523435115814,
0.04321544989943504,
0.03320733457803726,
-0.021066103130578995,
-0.029540833085775375,
-0.055118657648563385,
0.06618450582027435,
0.03943035751581192,
-0.05783166363835335,
-0.011468133889138699,
-0.036874569952487946,
0.055948663502931595,
0.037683792412281036,
-0.06151839345693588,
0.030423752963542938,
0.0892021432518959,
-0.017539607360959053,
0.04838095232844353,
0.01809186115860939,
-0.049157340079545975,
-0.03622566908597946,
0.06863535940647125,
-0.08181586861610413,
-0.10080469399690628,
-0.028053810819983482,
-0.10184334218502045,
-0.025131860747933388,
0.03190137818455696,
0.11533532291650772,
-0.04321737214922905,
-0.01685723476111889,
-0.02411317080259323,
0.08927391469478607,
-0.03063749335706234,
0.1341768056154251,
0.009774190373718739,
-0.024496041238307953,
-0.06924142688512802,
0.1467578113079071,
0.028063125908374786,
-0.11830391734838486,
-0.07015933841466904,
0.09784454107284546,
-0.14531458914279938,
-0.04067079722881317,
0.009564012289047241,
-0.08449453115463257,
-0.07559224218130112,
-0.0981670394539833,
-0.0961528941988945,
-0.04125000163912773,
-0.02144012786448002,
-0.06368022412061691,
0.03811468556523323,
0.06772617250680923,
-0.04653023183345795,
0.026419688016176224,
-0.09854887425899506,
0.014686630107462406,
0.12884558737277985,
0.03507319465279579,
-0.08310561627149582,
0.1460212767124176,
0.004382922779768705,
-0.012607308104634285,
-0.03099517710506916,
-0.0011439889203757048,
-0.04506634920835495,
-0.0065820179879665375,
-0.02756532095372677,
-0.003313223598524928,
-0.02325538545846939,
-0.0021517924033105373,
-0.017708785831928253,
-0.0631868839263916,
-0.032998379319906235,
0.024245060980319977,
-0.060285404324531555,
-0.040253106504678726,
-0.07737535983324051,
0.042432524263858795,
-0.0833192691206932,
-0.006626717746257782,
0.03962545841932297,
-0.1029733270406723,
0.07685351371765137,
0.055705711245536804,
-0.016743406653404236,
0.0347660593688488,
-0.11485135555267334,
-0.0396404042840004,
0.01312242541462183,
0.07707848399877548,
0.0007427092641592026,
-0.035392262041568756,
0.026667583733797073,
0.044480737298727036,
-0.010535209439694881,
-0.05396440625190735,
0.1001855656504631,
-0.09443963319063187,
-0.028732385486364365,
-0.0979912281036377,
-0.11622912436723709,
-0.05312253534793854,
-0.0019416071008890867,
0.07933580130338669,
0.10415259003639221,
0.11151637136936188,
-0.04589003697037697,
-0.02346671186387539,
-0.15348878502845764,
0.0024801583494991064,
-0.01039348915219307,
-0.06479434669017792,
-0.10382388532161713,
-0.09181219339370728,
0.0703304335474968,
0.03582153469324112,
0.06000792607665062,
0.07546142488718033,
-0.02070002444088459,
0.011758508160710335,
0.004701269790530205,
-0.039478085935115814,
-0.006390930153429508,
0.09492523968219757,
0.039071232080459595,
0.028190866112709045,
0.07123884558677673,
0.016503917053341866,
-0.007274904288351536,
0.030247919261455536,
0.1647798717021942,
0.07631560415029526,
0.11201595515012741,
0.14905761182308197,
-0.050985172390937805,
-0.0891355648636818,
-0.11225354671478271,
0.04623984917998314,
-0.15254364907741547,
0.004711525049060583,
-0.07713687419891357,
0.07888936251401901,
0.16293738782405853,
-0.14759603142738342,
0.09870574623346329,
-0.04403117299079895,
-0.07301206886768341,
-0.14346899092197418,
-0.11133425682783127,
-0.052357420325279236,
-0.06962395459413528,
0.0064580366015434265,
-0.08410705626010895,
0.07040746510028839,
0.06057185307145119,
-0.00005262736158329062,
-0.051743846386671066,
0.09813231974840164,
-0.1270168423652649,
-0.06490226835012436,
0.021715356037020683,
0.021924683824181557,
0.04552945867180824,
0.05232176184654236,
-0.08383888006210327,
0.003994128201156855,
0.06038115173578262,
0.09368287771940231,
0.01607239991426468,
0.09755225479602814,
0.06296439468860626,
0.022013502195477486,
-0.01654517464339733,
-0.020493125542998314,
-0.061566323041915894,
0.07431314140558243,
0.07186400890350342,
0.03570989519357681,
-0.012319506146013737,
0.020207278430461884,
0.23787087202072144,
-0.06080232933163643,
-0.0924014076590538,
-0.11562024801969528,
0.19881080090999603,
0.04580477252602577,
0.035288095474243164,
0.016260702162981033,
-0.09642146527767181,
-0.0408591702580452,
0.20227719843387604,
0.13949106633663177,
-0.027040671557188034,
-0.036533474922180176,
-0.0070183007046580315,
-0.00911693274974823,
-0.08476289361715317,
0.15833258628845215,
0.01424060482531786,
0.1787332445383072,
0.02080780640244484,
0.03533497080206871,
-0.02143688127398491,
-0.02400897443294525,
-0.03733004257082939,
0.1265430897474289,
-0.09266385436058044,
0.011230196803808212,
-0.06860969215631485,
0.03344114497303963,
0.11348383873701096,
-0.24091105163097382,
0.0611523799598217,
-0.01701616495847702,
-0.07354322820901871,
0.01932082325220108,
0.012044657953083515,
-0.00295678386464715,
0.11962131410837173,
-0.042970072478055954,
0.0542793869972229,
0.1389329433441162,
-0.02026805467903614,
-0.05920722708106041,
-0.029414888471364975,
0.03948146477341652,
-0.0697687491774559,
0.25686582922935486,
0.011461347341537476,
0.04915162920951843,
0.05338583514094353,
0.006920576561242342,
-0.12678736448287964,
0.050808824598789215,
-0.004618836101144552,
-0.1044514998793602,
0.034064169973134995,
0.024145888164639473,
-0.04966169595718384,
0.01464227307587862,
0.030559377744793892,
-0.09828019887208939,
0.048924755305051804,
0.06419876217842102,
-0.021532660350203514,
-0.07556911557912827,
0.08607259392738342,
-0.12150447815656662,
0.15754103660583496,
0.1329486072063446,
0.009108513593673706,
0.011292397044599056,
-0.07823500782251358,
0.05370810255408287,
0.03409980982542038,
0.17423178255558014,
0.05516853556036949,
-0.17961803078651428,
-0.02133982628583908,
-0.04731613025069237,
0.01261240802705288,
-0.1324373483657837,
-0.027159838005900383,
-0.04002491757273674,
-0.05496782064437866,
-0.06688755750656128,
0.1394147276878357,
0.003116070292890072,
-0.014275050722062588,
-0.04100297763943672,
-0.085035540163517,
-0.005732178688049316,
0.07750298082828522,
-0.12152936309576035,
-0.1054839938879013
] |
null | null |
transformers
|
Masked Language Model trained on the articles and talks of Noam Chomsky.
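A minimal usage sketch via the Transformers `fill-mask` pipeline (the prompt below is purely illustrative and not taken from the model card):

```python
from transformers import pipeline

# RoBERTa-based masked language model; the mask token is <mask>
fill_mask = pipeline("fill-mask", model="dbragdon/noam-masked-lm")

# Hypothetical example prompt
for prediction in fill_mask("Language is a <mask> of the human mind."):
    print(prediction["token_str"], prediction["score"])
```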
|
{}
|
fill-mask
|
dbragdon/noam-masked-lm
|
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
Masked Language Model trained on the articles and talks of Noam Chomsky.
|
[] |
[
"TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
37
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.05978045240044594,
0.0027343870606273413,
-0.008724397048354149,
0.02515793778002262,
0.13307689130306244,
0.027639828622341156,
0.09509950131177902,
0.08148215711116791,
0.05693569406867027,
-0.005708751268684864,
0.15464650094509125,
0.21959826350212097,
-0.03345884382724762,
0.17867937684059143,
-0.058117255568504333,
-0.2713718116283417,
0.06318216025829315,
0.048634059727191925,
-0.07361909002065659,
0.11866326630115509,
0.0721997618675232,
-0.07688385248184204,
0.06776180863380432,
-0.018220216035842896,
-0.12242016196250916,
0.042348217219114304,
0.05527849495410919,
-0.1122080385684967,
0.12108562141656876,
0.021961210295557976,
0.2060822993516922,
0.013724464923143387,
-0.06524214893579483,
-0.0831608921289444,
0.05297759547829628,
-0.0005626529455184937,
-0.07771056890487671,
0.039234988391399384,
0.006228282582014799,
-0.09797824174165726,
0.0029437756165862083,
0.05069584399461746,
0.031919922679662704,
0.043076999485492706,
-0.14973030984401703,
-0.12170203030109406,
-0.021631255745887756,
0.03224768117070198,
0.04861219599843025,
0.06754541397094727,
0.019578367471694946,
0.20521271228790283,
-0.1258353292942047,
0.10418124496936798,
0.15854887664318085,
-0.29196202754974365,
-0.011583889834582806,
0.07336939871311188,
0.07550900429487228,
-0.05153534933924675,
-0.025016577914357185,
0.061107337474823,
0.0111276526004076,
0.02207767404615879,
0.03159128874540329,
-0.08054082095623016,
-0.06527844071388245,
0.004459382500499487,
-0.07510198652744293,
-0.05628051981329918,
0.14868685603141785,
-0.051077596843242645,
0.04103284329175949,
0.010015198029577732,
-0.12913082540035248,
-0.036706481128931046,
-0.022282211109995842,
0.0016800274606794119,
-0.03736185282468796,
0.03933337330818176,
-0.04058491811156273,
-0.013890751637518406,
-0.10593511909246445,
0.020706869661808014,
-0.23111006617546082,
0.27582958340644836,
0.02612913027405739,
0.07012132555246353,
-0.1858518421649933,
0.043225426226854324,
-0.029383337125182152,
-0.12287542223930359,
0.04163218289613724,
-0.09743601828813553,
0.012437986209988594,
0.0019431845284998417,
-0.06488244980573654,
-0.03970800340175629,
0.08601388335227966,
0.2266247421503067,
0.08983870595693588,
0.03475968539714813,
0.03882957249879837,
0.09770892560482025,
0.014865895733237267,
0.08014364540576935,
0.017019858583807945,
-0.040938593447208405,
0.06849705427885056,
-0.12720969319343567,
0.04322979971766472,
-0.059905603528022766,
-0.12469810247421265,
-0.052705299109220505,
0.004645936656743288,
0.08657316863536835,
0.0478980652987957,
0.04835785925388336,
-0.08745969086885452,
0.0031580787617713213,
0.07787331938743591,
-0.07194057106971741,
-0.002319851191714406,
-0.029221290722489357,
0.05152527242898941,
0.10475372523069382,
0.021636445075273514,
-0.010703074745833874,
-0.016043487936258316,
0.11903663724660873,
-0.07282981276512146,
-0.03583741933107376,
-0.05615774169564247,
-0.055342018604278564,
0.03769473731517792,
-0.14162738621234894,
0.04631955549120903,
-0.1937199980020523,
-0.14164285361766815,
0.05045240744948387,
0.058013904839754105,
-0.004311853088438511,
-0.032746389508247375,
0.032482124865055084,
-0.005240604747086763,
0.01717841997742653,
-0.04425594210624695,
-0.03813016042113304,
-0.03695604205131531,
0.10319358855485916,
0.015862006694078445,
0.1258252114057541,
-0.10736379027366638,
0.04201505705714226,
-0.08612877130508423,
0.012246696278452873,
-0.16673244535923004,
-0.037367139011621475,
-0.02627558261156082,
0.16236986219882965,
0.002689799526706338,
-0.04094931110739708,
-0.11292707920074463,
0.03153347223997116,
-0.005968436133116484,
0.16826218366622925,
-0.050691187381744385,
-0.1258428990840912,
0.23801551759243011,
-0.10858677327632904,
-0.137539803981781,
0.0768466368317604,
-0.0008964669541455805,
0.00503029627725482,
0.04865459352731705,
0.09409047663211823,
0.054460909217596054,
-0.1282840520143509,
0.09256289899349213,
0.09043212980031967,
-0.15783078968524933,
-0.13727834820747375,
0.027883639559149742,
-0.005475207231938839,
-0.10261815786361694,
0.044968098402023315,
0.09180816262960434,
0.11191844940185547,
-0.07093978673219681,
-0.0524664968252182,
-0.019335782155394554,
-0.04510034993290901,
0.12913495302200317,
0.03954192250967026,
0.09601214528083801,
-0.07954450696706772,
-0.026266923174262047,
-0.06726016104221344,
0.00639816839247942,
0.07083439826965332,
0.03640305995941162,
-0.08596820384263992,
0.1376219093799591,
-0.05640283599495888,
0.00475625554099679,
-0.18434511125087738,
-0.10832203924655914,
-0.019048554822802544,
0.06233084574341774,
-0.024080509319901466,
0.11038170754909515,
0.11115530878305435,
-0.04524664208292961,
-0.012584532611072063,
-0.016787465661764145,
0.09273066371679306,
0.02054671198129654,
-0.019367150962352753,
-0.09770754724740982,
0.024277135729789734,
-0.08850374072790146,
0.00890185683965683,
0.020691927522420883,
0.0034111374989151955,
-0.004997740965336561,
0.14283043146133423,
-0.00161478400696069,
0.03947531431913376,
-0.04557863250374794,
0.03357338905334473,
-0.04402673617005348,
0.007829888723790646,
0.08130250126123428,
0.00620792992413044,
-0.052449412643909454,
0.15417484939098358,
-0.1333128660917282,
0.3405405282974243,
0.18508437275886536,
-0.2874508798122406,
-0.03881249949336052,
0.04533010721206665,
-0.018499813973903656,
-0.0015848495531827211,
0.04968307539820671,
0.007090166676789522,
0.027636928483843803,
0.00756347319111228,
0.14252737164497375,
-0.008502244018018246,
-0.015343432314693928,
0.03761560097336769,
-0.0861063003540039,
-0.03436943516135216,
0.034367144107818604,
0.09854230284690857,
-0.11538184434175491,
0.17259535193443298,
0.22679723799228668,
-0.014759926125407219,
0.13486558198928833,
0.019756602123379707,
0.0008263704366981983,
0.004766570404171944,
-0.04244794696569443,
-0.010318059474229813,
0.04416158050298691,
-0.1703941971063614,
-0.037341129034757614,
0.07009443640708923,
-0.04086581617593765,
0.054106131196022034,
-0.1080540344119072,
-0.04539399966597557,
0.023229053243994713,
0.05735735595226288,
-0.057464465498924255,
0.14314846694469452,
0.02993035688996315,
0.07184498757123947,
0.001863340032286942,
-0.0830451101064682,
0.10455971211194992,
0.011808671057224274,
-0.027575377374887466,
0.15615016222000122,
-0.11499504745006561,
-0.3407493829727173,
-0.14060579240322113,
-0.1856870949268341,
0.012841945514082909,
0.04907330498099327,
0.07068860530853271,
-0.0886731967329979,
-0.06028576195240021,
0.1009501963853836,
-0.002118155127391219,
-0.030502429232001305,
0.06733111292123795,
-0.059993140399456024,
0.033099252730607986,
-0.03546803444623947,
-0.05871487408876419,
-0.06814772635698318,
-0.030771153047680855,
-0.026918867602944374,
0.15232053399085999,
-0.08588875830173492,
0.09874926507472992,
0.12222868949174881,
0.012513059191405773,
0.06416334956884384,
0.003602869575843215,
0.16369159519672394,
-0.08151564747095108,
-0.004985830280929804,
0.19588902592658997,
-0.039671871811151505,
0.09784439206123352,
0.16251394152641296,
0.01572851650416851,
-0.04611923173069954,
0.007210151292383671,
-0.054344769567251205,
-0.1310671716928482,
-0.16383560001850128,
-0.10668035596609116,
-0.13408245146274567,
-0.021317366510629654,
0.045856814831495285,
0.050257451832294464,
0.15384143590927124,
0.10692618787288666,
0.039593882858753204,
-0.022173523902893066,
-0.07040359079837799,
0.060743995010852814,
0.16259261965751648,
-0.02038007788360119,
0.13690350949764252,
-0.05275079607963562,
-0.15010952949523926,
0.060834385454654694,
0.017343392595648766,
0.13702066242694855,
0.10000376403331757,
-0.018073182553052902,
0.04605214297771454,
0.15819214284420013,
0.157943993806839,
0.15094222128391266,
0.040314868092536926,
-0.057792484760284424,
-0.00135100819170475,
-0.00355874327942729,
-0.054626744240522385,
0.024207331240177155,
0.13539178669452667,
-0.10293795168399811,
-0.0397566519677639,
-0.122000552713871,
0.05419766530394554,
0.11576513946056366,
0.0632915124297142,
-0.2221747636795044,
0.010753236711025238,
0.06315211206674576,
0.00856467429548502,
-0.06559337675571442,
0.03360356017947197,
-0.0504627525806427,
-0.1452513188123703,
0.0742940679192543,
-0.05110727250576019,
0.09041508287191391,
0.04023086279630661,
0.06344828754663467,
-0.05134069547057152,
-0.05041798576712608,
0.03384535759687424,
0.066301628947258,
-0.24279962480068207,
0.28508660197257996,
-0.015547695569694042,
-0.0414053238928318,
-0.0798855721950531,
-0.0072272163815796375,
0.05470259487628937,
0.10367409884929657,
0.11631010472774506,
0.026701275259256363,
-0.05831453949213028,
-0.12985455989837646,
-0.010049611330032349,
0.025459015741944313,
0.10010068118572235,
-0.023013504222035408,
-0.008190451189875603,
-0.026562752202153206,
-0.05088624730706215,
-0.014673394151031971,
0.07104095071554184,
0.008848524652421474,
-0.12703938782215118,
0.07680127024650574,
0.049460720270872116,
-0.018087532371282578,
-0.008617108687758446,
-0.053773753345012665,
-0.09175468981266022,
0.19304224848747253,
-0.01958429254591465,
-0.05116080120205879,
-0.11072391271591187,
-0.10805923491716385,
0.10193557292222977,
-0.11249701678752899,
0.12616074085235596,
-0.10047021508216858,
0.009487134404480457,
-0.09380611777305603,
-0.1732918620109558,
0.14263565838336945,
-0.12662816047668457,
-0.005548976361751556,
-0.07589493691921234,
0.14169755578041077,
-0.06785442680120468,
0.024742592126131058,
0.0028587186243385077,
0.04321930930018425,
-0.11557991802692413,
-0.053762901574373245,
0.029133325442671776,
-0.0661218911409378,
0.03599350154399872,
0.054347891360521317,
-0.04755578562617302,
-0.035690948367118835,
0.016383660957217216,
0.022910388186573982,
0.2239362597465515,
0.23655645549297333,
-0.060722775757312775,
0.14300964772701263,
0.16329653561115265,
-0.02534516341984272,
-0.334926038980484,
-0.11384004354476929,
-0.13816533982753754,
0.000659266603179276,
0.005350308958441019,
-0.1263294517993927,
0.09073201566934586,
-0.018374400213360786,
-0.05229932442307472,
0.11701809614896774,
-0.15716011822223663,
-0.09006257355213165,
0.23355786502361298,
0.007530366536229849,
0.5038071870803833,
-0.09967079758644104,
-0.05945106968283653,
-0.052533090114593506,
-0.14235186576843262,
0.02291753888130188,
0.0022186338901519775,
0.09462001919746399,
-0.029315819963812828,
0.07914337515830994,
0.03301545977592468,
-0.09031011164188385,
0.09982404112815857,
-0.03833628445863724,
0.016298605129122734,
-0.11617961525917053,
-0.08525510132312775,
0.10806074738502502,
-0.0136068444699049,
-0.016709906980395317,
0.02158220298588276,
0.01456777099519968,
-0.04083799198269844,
-0.022307492792606354,
-0.10379772633314133,
0.10585320740938187,
0.03499794006347656,
-0.05941828712821007,
0.022518588230013847,
-0.009889909066259861,
-0.012628845870494843,
-0.004628289956599474,
0.1912514567375183,
-0.008090890944004059,
0.17767208814620972,
0.06429888308048248,
0.021506020799279213,
-0.12491537630558014,
-0.06406809389591217,
-0.04999396950006485,
-0.08511979877948761,
0.07266417145729065,
-0.009923536330461502,
0.04696602374315262,
0.10173138976097107,
-0.012479927390813828,
0.029755644500255585,
0.1106644868850708,
0.010039771907031536,
-0.014950153417885303,
0.16745342314243317,
-0.2137589156627655,
0.04184075817465782,
-0.015450791455805302,
-0.018000587821006775,
0.06724268943071365,
0.059818465262651443,
0.09292822331190109,
0.042055394500494,
-0.040214166045188904,
-0.010924269445240498,
-0.007243160158395767,
-0.06570540368556976,
0.04728737846016884,
0.07078813016414642,
0.0481405109167099,
-0.1261121779680252,
0.010777958668768406,
-0.019267624244093895,
-0.18276818096637726,
-0.0193245317786932,
0.08627685904502869,
-0.11631006002426147,
-0.10884512960910797,
0.009783122688531876,
0.08114659041166306,
-0.12021104246377945,
-0.030119983479380608,
-0.08283551782369614,
-0.11298330873250961,
0.05262024328112602,
0.22306393086910248,
0.1127481460571289,
0.06906166672706604,
-0.01038964930921793,
-0.013437945395708084,
-0.019051581621170044,
-0.020863153040409088,
0.04014519229531288,
0.036231301724910736,
-0.08725123107433319,
0.00853132363408804,
-0.009417156688869,
0.15634506940841675,
-0.10960228741168976,
-0.05792605131864548,
-0.15895655751228333,
0.04835722595453262,
-0.07342072576284409,
-0.09718167781829834,
-0.09943155944347382,
-0.07611285150051117,
0.009972813539206982,
-0.07039758563041687,
-0.05352642014622688,
-0.033090610057115555,
-0.1172407865524292,
0.028002966195344925,
0.02894745022058487,
-0.025185875594615936,
-0.06389441341161728,
-0.04643470048904419,
0.1366305649280548,
-0.04931804537773132,
0.07242259383201599,
0.1486574411392212,
-0.07077633589506149,
0.07762903720140457,
-0.12045170366764069,
-0.12859106063842773,
0.09266626089811325,
0.011787940748035908,
0.0829651728272438,
0.04641987010836601,
0.028651097789406776,
0.05126902461051941,
0.04517265781760216,
0.04532682150602341,
0.05619427561759949,
-0.11496394872665405,
0.07823242247104645,
0.008991614915430546,
-0.1918167769908905,
-0.027399636805057526,
-0.09008971601724625,
0.08355460315942764,
0.0007113930769264698,
0.12100622057914734,
-0.03608888015151024,
0.11319220811128616,
-0.03684284910559654,
0.015498606488108635,
-0.03130309283733368,
-0.1580456793308258,
-0.0009196364553645253,
-0.04583831876516342,
0.005965739022940397,
-0.008378063328564167,
0.23622959852218628,
-0.019403686746954918,
0.020302124321460724,
0.03706345707178116,
0.08312346041202545,
0.011911443434655666,
0.0034686587750911713,
0.14071707427501678,
0.09389068931341171,
-0.05033082515001297,
-0.07029570639133453,
0.09465329349040985,
0.022331232205033302,
-0.06154777854681015,
0.12758882343769073,
0.07004008442163467,
0.07805091887712479,
0.09229323267936707,
0.00525510823354125,
0.048385001718997955,
-0.10149682313203812,
-0.22425489127635956,
-0.04245093837380409,
0.04338166117668152,
0.030511466786265373,
-0.009528924711048603,
0.16239036619663239,
-0.007807712536305189,
0.05438727140426636,
-0.0286843404173851,
-0.0146601852029562,
-0.1913124918937683,
-0.12455741316080093,
-0.08852022141218185,
-0.06035058572888374,
0.034696102142333984,
-0.019905485212802887,
-0.019849425181746483,
0.10410076379776001,
0.03071051649749279,
-0.029069487005472183,
0.1391187459230423,
0.007366697769612074,
-0.012996077537536621,
0.016209116205573082,
-0.007277622353285551,
0.015048467554152012,
0.04374290257692337,
-0.018845049664378166,
-0.17080430686473846,
-0.003531701397150755,
-0.05279584601521492,
0.0019921238999813795,
-0.08741256594657898,
0.027429690584540367,
-0.09686073660850525,
-0.12442773580551147,
-0.06279069930315018,
0.03428082540631294,
-0.036186520010232925,
0.08635444939136505,
-0.008224183693528175,
0.04980190843343735,
0.003494719509035349,
0.13008199632167816,
-0.06635979562997818,
-0.10865864157676697,
-0.0572807714343071,
0.15417969226837158,
0.044320207089185715,
0.07846927642822266,
-0.024377785623073578,
0.025428062304854393,
-0.09749093651771545,
0.3293962776660919,
0.31430360674858093,
-0.0364663191139698,
0.07512470334768295,
0.049478501081466675,
0.03143766522407532,
0.07872943580150604,
0.10979169607162476,
0.07679310441017151,
0.2867131531238556,
-0.09095823019742966,
-0.0540350005030632,
-0.04679121449589729,
-0.03458467498421669,
-0.12083368748426437,
0.018574830144643784,
0.026663949713110924,
-0.03865300863981247,
-0.05562908574938774,
0.08287046104669571,
-0.17973420023918152,
0.13521145284175873,
0.08000985532999039,
-0.2064894735813141,
-0.05751892551779747,
-0.022878373041749,
0.15401794016361237,
0.030826324597001076,
0.11301977932453156,
-0.03987570479512215,
-0.09233494848012924,
0.0507376492023468,
0.02869175747036934,
-0.20688870549201965,
-0.07890072464942932,
0.10709986090660095,
0.004827416036278009,
0.06665920466184616,
-0.034168489277362823,
0.019203342497348785,
0.09574570506811142,
0.06589552015066147,
-0.023569602519273758,
0.026999808847904205,
0.021599261090159416,
-0.10421290993690491,
-0.05063549056649208,
0.027946757152676582,
-0.005161251872777939,
-0.1348150372505188,
0.025101054459810257,
-0.1394609659910202,
0.04613953083753586,
-0.0913073867559433,
-0.008668651804327965,
-0.005664225202053785,
0.0692635253071785,
-0.0486815981566906,
0.048449039459228516,
0.06823991239070892,
0.01269973162561655,
-0.031037641689181328,
-0.048129696398973465,
-0.011327249929308891,
0.06746246665716171,
-0.10943937301635742,
-0.1738860160112381,
-0.08208338916301727,
-0.07020771503448486,
0.0458214171230793,
-0.008811558596789837,
-0.1559021770954132,
-0.0473502054810524,
-0.11499189585447311,
0.015934964641928673,
-0.14897628128528595,
0.045110028237104416,
0.051641616970300674,
0.04488077014684677,
0.021195126697421074,
-0.024013997986912727,
0.037815019488334656,
0.04779626056551933,
-0.15743644535541534,
-0.09625527262687683
] |
null | null |
transformers
|
A GPT-2 language model fine-tuned on the articles and speeches of Noam Chomsky.
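A minimal, hypothetical usage sketch (the prompt and generation settings are illustrative assumptions, not documented by the model author):

```python
from transformers import pipeline

# Load the checkpoint named by this card; the task follows from its gpt2 architecture.
generator = pipeline("text-generation", model="dbragdon/noamlm")

# Illustrative prompt; max_length and num_return_sequences are assumed values.
outputs = generator("The role of the media is", max_length=50, num_return_sequences=1)
print(outputs[0]["generated_text"])
```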
|
{}
|
text-generation
|
dbragdon/noamlm
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
A GPT-2 language model fine-tuned on the articles and speeches of Noam Chomsky.
|
[] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
47
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.027653997763991356,
0.02414041943848133,
-0.0068230400793254375,
0.010564634576439857,
0.18164798617362976,
0.033704131841659546,
0.08821956068277359,
0.13570955395698547,
-0.0068973456509411335,
-0.013526750728487968,
0.1547490805387497,
0.20799952745437622,
-0.0026462990790605545,
0.0791444480419159,
-0.0664469450712204,
-0.2753458023071289,
0.05913490429520607,
0.0680282786488533,
-0.007687992881983519,
0.12075648456811905,
0.07187031954526901,
-0.0549883171916008,
0.0886516347527504,
-0.02030559629201889,
-0.17324471473693848,
0.01953965798020363,
0.04816993698477745,
-0.12518654763698578,
0.1176358312368393,
0.05111858248710632,
0.09795232862234116,
0.008365745656192303,
-0.06405694782733917,
-0.13635118305683136,
0.022147029638290405,
0.03033585101366043,
-0.058860234916210175,
0.0636059120297432,
0.1087222546339035,
-0.09939044713973999,
0.09311723709106445,
0.08541663736104965,
-0.0255570225417614,
0.05364618077874184,
-0.15825888514518738,
-0.06378549337387085,
-0.02499648556113243,
0.007804732769727707,
0.06256697326898575,
0.10073644667863846,
-0.017566369846463203,
0.10258800536394119,
-0.0975269079208374,
0.10333853214979172,
0.1500675231218338,
-0.3112771809101105,
0.009987793862819672,
0.09499151259660721,
0.04119991883635521,
0.03931105509400368,
-0.02533094584941864,
0.05045793950557709,
0.025268254801630974,
0.027277586981654167,
0.007437177933752537,
-0.0750175341963768,
-0.1137726753950119,
0.049895867705345154,
-0.09199702739715576,
-0.07458660751581192,
0.22324641048908234,
-0.07399588078260422,
0.060080595314502716,
-0.025852523744106293,
-0.11121725291013718,
-0.05274823680520058,
-0.013890148140490055,
0.018784796819090843,
-0.06587869673967361,
0.08765926212072372,
0.024050135165452957,
-0.06755640357732773,
-0.1323474794626236,
-0.04128742218017578,
-0.18628640472888947,
0.17943057417869568,
0.015332846902310848,
0.05883103236556053,
-0.1924149990081787,
0.11635245382785797,
-0.004000017885118723,
-0.08559784293174744,
0.024640021845698357,
-0.09488005936145782,
0.03717249631881714,
-0.005796557758003473,
-0.06343648582696915,
-0.07624655961990356,
0.078512042760849,
0.13449318706989288,
-0.0038929670117795467,
0.031459223479032516,
-0.03913462534546852,
0.08946967869997025,
0.023094916716217995,
0.11019261926412582,
-0.01329297386109829,
-0.00601809611544013,
0.043852973729372025,
-0.14449132978916168,
-0.008341594599187374,
-0.06913956254720688,
-0.1527271568775177,
-0.05108632892370224,
0.05306483805179596,
0.08953460305929184,
0.008545879274606705,
0.09067165106534958,
-0.04840036481618881,
-0.026439275592565536,
0.06191498041152954,
-0.07166212797164917,
-0.0057375445030629635,
0.0005479406099766493,
0.020326290279626846,
0.12346802651882172,
-0.006863993126899004,
0.01816580630838871,
-0.1344953328371048,
0.07597071677446365,
-0.0810447409749031,
0.0016609809827059507,
-0.037295255810022354,
-0.051307324320077896,
0.016753138974308968,
-0.09774310886859894,
0.014272624626755714,
-0.15190516412258148,
-0.18175770342350006,
0.015764877200126648,
0.0044948384165763855,
-0.03198384866118431,
-0.035312067717313766,
-0.03263629972934723,
-0.023609675467014313,
0.04306609928607941,
-0.06790579855442047,
0.009302832186222076,
-0.05678845942020416,
0.10395034402608871,
-0.032171644270420074,
0.06649759411811829,
-0.10738259553909302,
0.0829162523150444,
-0.12368609756231308,
-0.004673504736274481,
-0.09571383893489838,
0.07571588456630707,
-0.0049130916595458984,
0.11728651076555252,
-0.028541911393404007,
-0.03454771637916565,
-0.07556727528572083,
0.04999465495347977,
-0.02550712786614895,
0.18951213359832764,
-0.060080599039793015,
-0.12557648122310638,
0.2583121061325073,
-0.07503679394721985,
-0.1294521689414978,
0.09354755282402039,
0.013357079587876797,
0.03000263124704361,
0.08708256483078003,
0.17770351469516754,
0.03385210409760475,
0.011724604293704033,
0.08526027947664261,
0.1101398766040802,
-0.11245359480381012,
-0.0934135690331459,
0.01582467369735241,
-0.04410967230796814,
-0.14348545670509338,
0.0551721565425396,
0.06396481394767761,
0.08126390725374222,
-0.04889657348394394,
-0.02648499235510826,
-0.04211905598640442,
0.005280596204102039,
0.08378548920154572,
0.011136471293866634,
0.12981148064136505,
-0.04937934875488281,
-0.03142275661230087,
-0.018193937838077545,
-0.012411710806190968,
-0.03191297501325607,
0.03591127321124077,
-0.019667068496346474,
0.13700194656848907,
-0.048340748995542526,
0.053371917456388474,
-0.18971459567546844,
-0.07922437787055969,
0.0010099048959091306,
0.123023621737957,
-0.014106693677604198,
0.08013445883989334,
0.05753817409276962,
-0.018720267340540886,
-0.004700321704149246,
-0.01032867468893528,
0.1544346958398819,
-0.021616755053400993,
-0.06661882251501083,
-0.04162381589412689,
0.0662311464548111,
-0.05831345543265343,
-0.0033040468115359545,
-0.05776660889387131,
0.013589667156338692,
0.05048443749547005,
0.10443682968616486,
-0.0023575187660753727,
0.03253777325153351,
-0.02123248018324375,
0.018250472843647003,
-0.07885172218084335,
-0.0028943256475031376,
0.09839999675750732,
-0.003195167751982808,
-0.06114937365055084,
0.191707044839859,
-0.16508106887340546,
0.2123199850320816,
0.18989497423171997,
-0.2840019166469574,
0.008855658583343029,
-0.07930868119001389,
-0.03107025846838951,
0.019292673096060753,
0.04051336646080017,
-0.035391807556152344,
0.12321244925260544,
0.0030509934294968843,
0.1893225461244583,
-0.05120055004954338,
-0.054668959230184555,
-0.0003608512051869184,
-0.05736381933093071,
0.0013126746052876115,
0.06707432866096497,
0.11558198183774948,
-0.12564630806446075,
0.1973772495985031,
0.17830142378807068,
0.02446782775223255,
0.16028088331222534,
0.003589105326682329,
-0.02908729389309883,
0.07800903916358948,
0.001039333757944405,
-0.03403163328766823,
-0.08341804146766663,
-0.19453173875808716,
-0.01920945756137371,
0.08615871518850327,
0.05208343267440796,
0.11178864538669586,
-0.1340440809726715,
-0.039688125252723694,
-0.016580121591687202,
-0.013963420875370502,
0.004052120726555586,
0.08927994221448898,
0.05621529743075371,
0.11766386777162552,
-0.008479462936520576,
0.004914911463856697,
0.11690844595432281,
0.024292193353176117,
-0.0974007099866867,
0.20369629561901093,
-0.12859489023685455,
-0.35919657349586487,
-0.17192909121513367,
-0.16941924393177032,
-0.046767693012952805,
0.06603047996759415,
0.10566895455121994,
-0.11921820044517517,
-0.03283723443746567,
0.01984371617436409,
0.10511579364538193,
-0.0874844342470169,
0.025252653285861015,
-0.07854585349559784,
0.039858005940914154,
-0.08228866755962372,
-0.07852846384048462,
-0.058627899736166,
-0.02397638000547886,
-0.06844961643218994,
0.15293799340724945,
-0.10580270737409592,
0.04606963321566582,
0.19703397154808044,
0.035209350287914276,
0.05708123743534088,
-0.03352535888552666,
0.19375872611999512,
-0.09711813181638718,
-0.014181635342538357,
0.20692157745361328,
-0.04432303458452225,
0.08276087045669556,
0.10658510029315948,
-0.0009211950236931443,
-0.0905555859208107,
0.023672347888350487,
-0.03327333554625511,
-0.09995128959417343,
-0.2413795441389084,
-0.12423769384622574,
-0.12672755122184753,
0.07157120853662491,
0.06113129481673241,
0.06719478219747543,
0.1604551076889038,
0.09354656934738159,
-0.019843624904751778,
0.04505275562405586,
-0.0036725422833114862,
0.07906411588191986,
0.20365294814109802,
-0.0204415675252676,
0.13615357875823975,
-0.050657231360673904,
-0.13334059715270996,
0.09257177263498306,
0.06900633871555328,
0.15225820243358612,
0.054498545825481415,
0.05270633473992348,
0.006767008453607559,
0.06716175377368927,
0.1454283893108368,
0.13071000576019287,
0.014545821584761143,
-0.016409022733569145,
-0.021825823932886124,
-0.011036834679543972,
-0.05876464396715164,
0.04085689038038254,
0.02777833305299282,
-0.1610528975725174,
-0.05520197004079819,
-0.12001585215330124,
0.08774644136428833,
0.09219257533550262,
0.06569026410579681,
-0.2342914491891861,
0.007060535252094269,
0.08197256177663803,
-0.028898365795612335,
-0.1258426308631897,
0.08190665394067764,
-0.021697908639907837,
-0.14926569163799286,
0.0494246669113636,
-0.061497997492551804,
0.12161173671483994,
-0.07084709405899048,
0.08109014481306076,
-0.03937468305230141,
-0.062106676399707794,
0.020281726494431496,
0.1271398812532425,
-0.29730626940727234,
0.20356124639511108,
-0.001819691271521151,
-0.05869410187005997,
-0.11437822878360748,
0.01959572173655033,
0.01367559190839529,
0.11016108095645905,
0.10386832803487778,
0.005328167695552111,
-0.0475030355155468,
-0.12364684045314789,
-0.022924374788999557,
0.024910306558012962,
0.12441114336252213,
-0.05739542469382286,
-0.008891535922884941,
-0.044362228363752365,
-0.0058176638558506966,
-0.028876133263111115,
-0.053936153650283813,
0.025268638506531715,
-0.16888569295406342,
0.08389513194561005,
0.017658868804574013,
0.09978678822517395,
0.01261826977133751,
-0.013697084039449692,
-0.09944134950637817,
0.23519866168498993,
-0.07718266546726227,
-0.11035529524087906,
-0.1205357164144516,
-0.04611735790967941,
0.0686027929186821,
-0.0741099938750267,
0.0634869635105133,
-0.08208895474672318,
0.024847982451319695,
-0.047674816101789474,
-0.21411024034023285,
0.1248590424656868,
-0.09078147262334824,
-0.047217957675457,
-0.038028888404369354,
0.1873915195465088,
-0.07860055565834045,
0.003835690440610051,
0.01727161929011345,
0.03052649088203907,
-0.11501652747392654,
-0.10535892844200134,
0.02131424844264984,
-0.005508285015821457,
0.06073078140616417,
0.04357268661260605,
-0.06716573983430862,
0.01641303487122059,
-0.022389056161046028,
-0.006917606573551893,
0.32454678416252136,
0.14079391956329346,
-0.04770330339670181,
0.17363035678863525,
0.11376409232616425,
-0.08209476619958878,
-0.31482723355293274,
-0.08535979688167572,
-0.09984239190816879,
-0.03735451400279999,
-0.06232178583741188,
-0.21656104922294617,
0.09480288624763489,
0.04200942441821098,
-0.015409117564558983,
0.1568077802658081,
-0.24411429464817047,
-0.0795927420258522,
0.15950311720371246,
-0.007333407178521156,
0.3560895025730133,
-0.12491796165704727,
-0.11301901936531067,
-0.05532994866371155,
-0.1397564709186554,
0.15002089738845825,
-0.009417316876351833,
0.11106741428375244,
-0.03287123143672943,
0.10856477171182632,
0.048215944319963455,
-0.05544896051287651,
0.09160676598548889,
0.026295991614460945,
-0.003711326979100704,
-0.10597866773605347,
-0.01747799478471279,
0.043585844337940216,
0.006319248117506504,
0.031217962503433228,
-0.03127649053931236,
0.033463045954704285,
-0.12691029906272888,
-0.04727448150515556,
-0.08006873726844788,
0.05846472829580307,
0.052333541214466095,
-0.0737200528383255,
-0.0010956452460959554,
-0.06611854583024979,
-0.016030769795179367,
0.003143493551760912,
0.19045160710811615,
-0.03460016846656799,
0.14779594540596008,
0.0818052664399147,
0.09073434770107269,
-0.1361592561006546,
-0.0061243316158652306,
-0.06888517737388611,
-0.057741593569517136,
0.08706554025411606,
-0.10988334566354752,
0.06429524719715118,
0.11854783445596695,
-0.04650293290615082,
0.07134203612804413,
0.11840200424194336,
0.015247469767928123,
-0.0033181030303239822,
0.13015136122703552,
-0.2568117082118988,
0.019211336970329285,
-0.0754370167851448,
-0.03775216266512871,
0.08088402450084686,
0.07995659112930298,
0.16486960649490356,
0.036187540739774704,
-0.042049095034599304,
-0.003924929536879063,
0.009187355637550354,
-0.039663419127464294,
0.08243577927350998,
0.012240500189363956,
0.023174172267317772,
-0.15248477458953857,
0.071900375187397,
0.015580810606479645,
-0.12336304783821106,
0.011253113858401775,
0.1477922946214676,
-0.13801799714565277,
-0.11707340180873871,
-0.03374985232949257,
0.08742405474185944,
-0.14541642367839813,
-0.0241269338876009,
-0.04783749580383301,
-0.12825986742973328,
0.09339214116334915,
0.11613135039806366,
0.07497538626194,
0.10595441609621048,
-0.0529337078332901,
-0.02668607421219349,
-0.03682107478380203,
-0.022537073120474815,
-0.0017330512637272477,
0.032638516277074814,
-0.08304216712713242,
0.0579586885869503,
-0.020800847560167313,
0.14298540353775024,
-0.08964299410581589,
-0.07169508188962936,
-0.1581236720085144,
0.03564200550317764,
-0.12593989074230194,
-0.07035141438245773,
-0.08840593695640564,
-0.05227470397949219,
-0.007837125100195408,
-0.01494099572300911,
-0.0388214997947216,
-0.04472146928310394,
-0.12364204227924347,
0.01879296824336052,
-0.05806630104780197,
0.02100815810263157,
-0.07383234053850174,
0.00039667764212936163,
0.08932872861623764,
-0.0410015694797039,
0.13851116597652435,
0.13557660579681396,
-0.08107975125312805,
0.11907198280096054,
-0.13537484407424927,
-0.0908876284956932,
0.1157127171754837,
0.013428857550024986,
0.03907458856701851,
0.06849293410778046,
0.037317484617233276,
0.06514574587345123,
0.016511039808392525,
0.05237346887588501,
0.006972990930080414,
-0.1299850195646286,
0.03433857858181,
-0.042786743491888046,
-0.1481933295726776,
-0.05744143947958946,
-0.05092177540063858,
0.039562974125146866,
0.02438235841691494,
0.10801149904727936,
-0.03665049374103546,
0.11085481196641922,
-0.058541763573884964,
0.01499281544238329,
0.004919432103633881,
-0.18287403881549835,
-0.044654008001089096,
-0.07792776077985764,
0.02775009535253048,
0.022204352542757988,
0.2720205783843994,
0.0410233810544014,
0.020275471732020378,
0.017097288742661476,
0.11327627301216125,
0.057128578424453735,
0.015525308437645435,
0.214890718460083,
0.11996994912624359,
-0.06049320101737976,
-0.10806480050086975,
0.0858595222234726,
0.02164783701300621,
0.007426374591886997,
0.14070266485214233,
0.008503482677042484,
-0.015597577206790447,
0.0887407436966896,
-0.03357330709695816,
0.0031263602431863546,
-0.11658911406993866,
-0.13779941201210022,
-0.028487415984272957,
0.0629650130867958,
-0.0040870243683457375,
0.0956285297870636,
0.13609373569488525,
-0.026881180703639984,
0.03953414782881737,
-0.007877747528254986,
-0.054916199296712875,
-0.1785028725862503,
-0.15742821991443634,
-0.0790708139538765,
-0.13561099767684937,
0.014744875021278858,
-0.10368648171424866,
0.04369770362973213,
0.09560346603393555,
0.055915698409080505,
-0.05440305173397064,
0.10839936882257462,
0.060064028948545456,
-0.1045473963022232,
0.056569941341876984,
-0.032912541180849075,
0.06427399069070816,
-0.001812951872125268,
-0.02503552846610546,
-0.09098561853170395,
0.0020124134607613087,
0.0017788249533623457,
0.0514003150165081,
-0.05152478814125061,
0.024474015459418297,
-0.15132632851600647,
-0.09570280462503433,
-0.04949872940778732,
0.07316448539495468,
-0.06007300689816475,
0.1162300780415535,
-0.001420395914465189,
-0.017011309042572975,
0.03990921378135681,
0.2064858227968216,
-0.07188161462545395,
-0.04990030825138092,
-0.047407180070877075,
0.22449158132076263,
0.04847963526844978,
0.10619479417800903,
-0.013415440917015076,
-0.00436578830704093,
-0.07670432329177856,
0.36612021923065186,
0.2802904546260834,
-0.06149837002158165,
0.012722660787403584,
0.03524370491504669,
0.030115660279989243,
0.13885097205638885,
0.1454230099916458,
0.09396251291036606,
0.27579233050346375,
-0.08266803622245789,
-0.052018675953149796,
-0.015770163387060165,
-0.020211221650242805,
-0.09714096784591675,
0.11003416776657104,
0.04697350785136223,
-0.06982195377349854,
-0.044631510972976685,
0.09750646352767944,
-0.24107815325260162,
0.1615772694349289,
-0.07760030031204224,
-0.15214353799819946,
-0.06177033111453056,
0.012448563240468502,
0.10150322318077087,
0.00011545186134753749,
0.08784360438585281,
-0.009687529876828194,
-0.10291683673858643,
0.05749227851629257,
0.02730483002960682,
-0.23568211495876312,
-0.007146455347537994,
0.053680915385484695,
-0.04540037736296654,
0.013332240283489227,
-0.01917567476630211,
0.04910791665315628,
0.06717875599861145,
0.055140718817710876,
-0.0426395982503891,
0.03817736729979515,
-0.010196289978921413,
-0.05020907521247864,
0.029649224132299423,
0.044778332114219666,
0.017814766615629196,
-0.13065220415592194,
0.05277646332979202,
-0.13968263566493988,
0.041911475360393524,
-0.029653942212462425,
-0.027413733303546906,
-0.004670299123972654,
-0.019546283408999443,
-0.06313455104827881,
0.057941507548093796,
0.08424945920705795,
0.001472705160267651,
-0.007915833964943886,
-0.08050897717475891,
-0.011023934930562973,
-0.012819311581552029,
-0.08308050036430359,
-0.10086389631032944,
-0.1384236365556717,
-0.10634621232748032,
0.12701933085918427,
-0.017066750675439835,
-0.19125573337078094,
0.01284839678555727,
-0.09708964824676514,
0.060041818767786026,
-0.1797112077474594,
0.0843181237578392,
0.06071038171648979,
0.01623542606830597,
-0.004114143084734678,
-0.029135411605238914,
0.039420004934072495,
0.08210206776857376,
-0.10779064148664474,
-0.09044761955738068
] |
null | null |
transformers
|
# distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the wikiann dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2781
- Precision: 0.8121
- Recall: 0.8302
- F1: 0.8210
- Accuracy: 0.9204
## Model description
More information needed
## Intended uses & limitations
More information needed
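Until the author fills this in, a hedged sketch of the most likely intended use, token-level NER (the example sentence is illustrative; the entity labels come from the checkpoint's wikiann label map):

```python
from transformers import pipeline

# Token-classification pipeline; aggregation_strategy="simple" merges word
# pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="dbsamu/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",
)
print(ner("Noam Chomsky taught linguistics at MIT in Cambridge."))
```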
## Training and evaluation data
The model was fine-tuned and evaluated on the wikiann dataset (English split, per the model index in this card's metadata).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
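A hedged reconstruction of these settings as Transformers `TrainingArguments`; the `output_dir` is a placeholder, and the Adam betas/epsilon and linear schedule listed above are also the library defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-ner",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,          # matches betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```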
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3504 | 1.0 | 1250 | 0.2922 | 0.7930 | 0.8075 | 0.8002 | 0.9115 |
| 0.2353 | 2.0 | 2500 | 0.2711 | 0.8127 | 0.8264 | 0.8195 | 0.9196 |
| 0.1745 | 3.0 | 3750 | 0.2781 | 0.8121 | 0.8302 | 0.8210 | 0.9204 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wikiann"], "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-ner", "results": [{"task": {"type": "token-classification", "name": "Token Classification"}, "dataset": {"name": "wikiann", "type": "wikiann", "args": "en"}, "metrics": [{"type": "precision", "value": 0.8120642485217545, "name": "Precision"}, {"type": "recall", "value": 0.830235495804385, "name": "Recall"}, {"type": "f1", "value": 0.8210493441599, "name": "F1"}, {"type": "accuracy", "value": 0.9203828724683252, "name": "Accuracy"}]}]}]}
|
token-classification
|
dbsamu/distilbert-base-uncased-finetuned-ner
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"dataset:wikiann",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-ner
=====================================
This model is a fine-tuned version of distilbert-base-uncased on the wikiann dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2781
* Precision: 0.8121
* Recall: 0.8302
* F1: 0.8210
* Accuracy: 0.9204
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10021273791790009,
0.09889699518680573,
-0.0024442344438284636,
0.12742078304290771,
0.1545073240995407,
0.03436994180083275,
0.1262977570295334,
0.12072683870792389,
-0.08027587831020355,
0.018636653199791908,
0.12295075505971909,
0.1664653718471527,
0.01799948327243328,
0.10110809653997421,
-0.0455438569188118,
-0.264363557100296,
-0.005418132990598679,
0.052532222121953964,
-0.05825560912489891,
0.12993666529655457,
0.09777101874351501,
-0.1322602778673172,
0.09025218337774277,
0.022639553993940353,
-0.19616296887397766,
-0.0010819041635841131,
0.005694587714970112,
-0.06233207881450653,
0.1493155062198639,
0.020434090867638588,
0.13363632559776306,
-0.007245011627674103,
0.09222419559955597,
-0.1851639300584793,
0.008520981296896935,
0.052077047526836395,
0.015175905078649521,
0.09507796913385391,
0.039419230073690414,
0.01223190687596798,
0.10863757133483887,
-0.06046471372246742,
0.061451178044080734,
0.014326538890600204,
-0.11394215375185013,
-0.2289452850818634,
-0.08534269779920578,
0.050962015986442566,
0.07589686661958694,
0.09223506599664688,
0.0013893675059080124,
0.13365060091018677,
-0.09669391065835953,
0.08661072701215744,
0.22083018720149994,
-0.28310278058052063,
-0.06662994623184204,
0.03930016979575157,
0.006962729152292013,
0.03326801210641861,
-0.1074468120932579,
-0.03922092542052269,
0.04884175956249237,
0.05022473633289337,
0.14059291779994965,
-0.026983674615621567,
-0.11519155651330948,
0.012423689477145672,
-0.14185906946659088,
-0.03299953415989876,
0.16812379658222198,
0.05156374350190163,
-0.035151246935129166,
-0.04468538984656334,
-0.05729677528142929,
-0.14934125542640686,
-0.028022918850183487,
-0.02184808999300003,
0.04431603103876114,
-0.035940736532211304,
-0.05162994936108589,
0.0020704178605228662,
-0.10310222208499908,
-0.0663159042596817,
-0.07806530594825745,
0.1450185328722,
0.044295065104961395,
0.015826761722564697,
-0.029393469914793968,
0.11513444781303406,
-0.0036216136068105698,
-0.11969558149576187,
0.020664386451244354,
0.023044755682349205,
0.00470326142385602,
-0.04848393797874451,
-0.046965356916189194,
-0.054392438381910324,
0.007305920589715242,
0.13554222881793976,
-0.05034656077623367,
0.033040620386600494,
0.05286343768239021,
0.04699656739830971,
-0.08316771686077118,
0.18558518588542938,
-0.056553713977336884,
-0.022531451657414436,
0.004156526643782854,
0.04986223205924034,
0.013645729050040245,
-0.0038068178109824657,
-0.1186685860157013,
0.002921396167948842,
0.09483414888381958,
0.004681875463575125,
-0.07143803685903549,
0.07350843399763107,
-0.0557267852127552,
-0.03175670653581619,
0.018605591729283333,
-0.08748897165060043,
0.028645498678088188,
-0.005243327934294939,
-0.08539880067110062,
0.0003143867361359298,
0.027869274839758873,
0.0190989151597023,
-0.008457373827695847,
0.12578094005584717,
-0.09729944914579391,
0.020375028252601624,
-0.09288850426673889,
-0.10449477285146713,
0.02165263146162033,
-0.09980018436908722,
0.036123570054769516,
-0.09533251076936722,
-0.17771218717098236,
-0.0026293275877833366,
0.0666949450969696,
-0.018038270995020866,
-0.06098955124616623,
-0.044661995023489,
-0.06702682375907898,
0.010301830247044563,
-0.01144956611096859,
0.1144149973988533,
-0.06289084255695343,
0.0944623127579689,
0.019702022895216942,
0.06626442819833755,
-0.04189186170697212,
0.05518528074026108,
-0.10386637598276138,
0.021363120526075363,
-0.16782698035240173,
0.0197664313018322,
-0.053499817848205566,
0.06118439510464668,
-0.08842169493436813,
-0.10514841973781586,
0.010417873039841652,
-0.008829390630126,
0.06623949855566025,
0.08217284083366394,
-0.1651521623134613,
-0.07423052936792374,
0.1547769457101822,
-0.06926063448190689,
-0.12318578362464905,
0.12674212455749512,
-0.06273576617240906,
0.04942590370774269,
0.05861669406294823,
0.15489153563976288,
0.0745532363653183,
-0.09095113724470139,
-0.00016298098489642143,
0.005976896733045578,
0.04766598716378212,
-0.0550781711935997,
0.06997286528348923,
0.012790086679160595,
0.023332256823778152,
0.02935643494129181,
-0.02520279958844185,
0.06349233537912369,
-0.0907105803489685,
-0.09603181481361389,
-0.04073900729417801,
-0.09734218567609787,
0.05234687775373459,
0.06947536021471024,
0.06867719441652298,
-0.0899696871638298,
-0.08164576441049576,
0.07079742103815079,
0.0955742597579956,
-0.05167866125702858,
0.02102765254676342,
-0.06687957793474197,
0.07141935080289841,
-0.04635835438966751,
-0.028550755232572556,
-0.16569453477859497,
-0.031876809895038605,
0.011706809513270855,
-0.007412417326122522,
0.011534619145095348,
0.03298173472285271,
0.06524714827537537,
0.07048698514699936,
-0.050664521753787994,
-0.018675390630960464,
-0.044487114995718,
0.005433144047856331,
-0.12348724156618118,
-0.1967018097639084,
-0.04405403882265091,
-0.018267782405018806,
0.13407208025455475,
-0.20388823747634888,
0.036075472831726074,
-0.02100341208279133,
0.08941088616847992,
0.01901472546160221,
-0.00716630881652236,
-0.040760863572359085,
0.07336936891078949,
-0.04672021046280861,
-0.0507984422147274,
0.06354415416717529,
0.00039284955710172653,
-0.08288612216711044,
-0.05278094857931137,
-0.07843320071697235,
0.17043626308441162,
0.1297323852777481,
-0.1107465997338295,
-0.0772329717874527,
-0.010259142145514488,
-0.06457358598709106,
-0.04180372133851051,
-0.055056698620319366,
0.02673032321035862,
0.17157407104969025,
-0.007416121196001768,
0.1400229036808014,
-0.07244233042001724,
-0.04353317990899086,
0.024297453463077545,
-0.036610301584005356,
0.01791476458311081,
0.11634936183691025,
0.1451096087694168,
-0.08721090108156204,
0.15050433576107025,
0.15512493252754211,
-0.097478486597538,
0.13495919108390808,
-0.03940490633249283,
-0.07211241871118546,
-0.02208852767944336,
-0.033302851021289825,
-0.009761848486959934,
0.11782044917345047,
-0.15297831594944,
0.0019307546317577362,
0.030884575098752975,
0.02208307757973671,
0.012544156983494759,
-0.220361590385437,
-0.04335027560591698,
0.03716398775577545,
-0.03595962002873421,
-0.00019107460684608668,
-0.008822332136332989,
0.008913759142160416,
0.10121918469667435,
0.0024107082281261683,
-0.10068154335021973,
0.03967965394258499,
0.009670569561421871,
-0.07091444730758667,
0.20643733441829681,
-0.07824815809726715,
-0.14748771488666534,
-0.11927986890077591,
-0.07693837583065033,
-0.05154476687312126,
-0.0014571622014045715,
0.055531859397888184,
-0.0855116993188858,
-0.031408920884132385,
-0.0627036765217781,
0.020383084192872047,
-0.0004005719965789467,
0.028907734900712967,
0.009715091437101364,
-0.0022544334642589092,
0.061256442219018936,
-0.11145321279764175,
-0.011673986911773682,
-0.05276130512356758,
-0.05118940770626068,
0.039670124650001526,
0.03820746764540672,
0.10837967693805695,
0.149861142039299,
-0.018954575061798096,
0.008016706444323063,
-0.023648878559470177,
0.2385396659374237,
-0.06466422975063324,
-0.010291785933077335,
0.13899382948875427,
-0.016901757568120956,
0.05042032524943352,
0.1068698912858963,
0.07958216220140457,
-0.08079178631305695,
-0.0016542708035558462,
0.041466549038887024,
-0.03215993195772171,
-0.22155091166496277,
-0.04306183010339737,
-0.0482310988008976,
-0.001501175225712359,
0.09741653501987457,
0.023851223289966583,
0.04188695177435875,
0.08011559396982193,
0.03494797274470329,
0.08768578618764877,
-0.051582686603069305,
0.0549522340297699,
0.11713846772909164,
0.039940137416124344,
0.11892132461071014,
-0.03764929622411728,
-0.0635695606470108,
0.03781316801905632,
0.014459993690252304,
0.22548991441726685,
0.011429807171225548,
0.128609761595726,
0.06457354128360748,
0.18400056660175323,
-0.010831804014742374,
0.08021679520606995,
-0.014004729688167572,
-0.0388002023100853,
-0.016126723960042,
-0.03641872480511665,
-0.03203410655260086,
0.02787012793123722,
-0.05832480639219284,
0.07314787805080414,
-0.10833635926246643,
0.014813877642154694,
0.05257721245288849,
0.26403459906578064,
0.03590860590338707,
-0.32393786311149597,
-0.10254204273223877,
-0.004510374274104834,
-0.04347313567996025,
-0.020120713859796524,
0.031879980117082596,
0.09275196492671967,
-0.09752743691205978,
0.021341465413570404,
-0.06990557163953781,
0.08897676318883896,
-0.06348223984241486,
0.03773721680045128,
0.08126727491617203,
0.0966520830988884,
0.009150179103016853,
0.088407963514328,
-0.2751307785511017,
0.27227288484573364,
0.0012679819483309984,
0.06546018272638321,
-0.08116358518600464,
0.004778312519192696,
0.03670406714081764,
0.07222224771976471,
0.07155763357877731,
-0.008508313447237015,
-0.015989024192094803,
-0.19432613253593445,
-0.05747883394360542,
0.02058582380414009,
0.06322093307971954,
-0.038261644542217255,
0.08873887360095978,
-0.0272202268242836,
0.007472056429833174,
0.07318450510501862,
0.009813059121370316,
-0.05304449424147606,
-0.09315326064825058,
-0.009877676144242287,
0.035677529871463776,
-0.04951143637299538,
-0.06411010026931763,
-0.11436779797077179,
-0.12740124762058258,
0.14234425127506256,
-0.021315502002835274,
-0.03517376258969307,
-0.1057642251253128,
0.07317111641168594,
0.07844430953264236,
-0.08537329733371735,
0.04681271314620972,
0.00029352845740504563,
0.07303070276975632,
0.023652123287320137,
-0.05946240946650505,
0.10344616323709488,
-0.07742477208375931,
-0.16437026858329773,
-0.07020553946495056,
0.09452877193689346,
0.04157790541648865,
0.060765113681554794,
-0.008143596351146698,
0.006040602922439575,
-0.03779532387852669,
-0.08574913442134857,
0.02656407095491886,
0.0020398010965436697,
0.07390879094600677,
0.007869496941566467,
-0.05437856167554855,
0.03089446946978569,
-0.0588218979537487,
-0.03332607448101044,
0.18219223618507385,
0.22211365401744843,
-0.09868012368679047,
0.014058866538107395,
0.03773181885480881,
-0.06561044603586197,
-0.18183432519435883,
0.0338936448097229,
0.054223958402872086,
0.00533680897206068,
0.036440636962652206,
-0.1849042922258377,
0.14959605038166046,
0.11737634241580963,
-0.016374986618757248,
0.11842817068099976,
-0.3188531696796417,
-0.11811244487762451,
0.1312803477048874,
0.1442520022392273,
0.09910902380943298,
-0.12381010502576828,
-0.012260101735591888,
-0.012140653096139431,
-0.140932098031044,
0.11941486597061157,
-0.07889427244663239,
0.11626852303743362,
-0.0369877927005291,
0.09661763161420822,
0.004911630880087614,
-0.060760822147130966,
0.11691730469465256,
0.022421324625611305,
0.09861847758293152,
-0.05691075697541237,
-0.03848780319094658,
0.03177347779273987,
-0.038597166538238525,
0.03275531157851219,
-0.07983207702636719,
0.025923149660229683,
-0.10207388550043106,
-0.026086334139108658,
-0.07489300519227982,
0.050375599414110184,
-0.0397096686065197,
-0.07795969396829605,
-0.04272206872701645,
0.03374503552913666,
0.04735662415623665,
-0.010442417114973068,
0.1306701898574829,
0.0435580275952816,
0.13626302778720856,
0.09206993877887726,
0.06825253367424011,
-0.06917993724346161,
-0.08735966682434082,
-0.027515361085534096,
-0.015555569902062416,
0.06664726138114929,
-0.14233987033367157,
0.020423661917448044,
0.13973906636238098,
0.02502116747200489,
0.13913854956626892,
0.07773105055093765,
-0.03486587107181549,
0.00495559349656105,
0.053735099732875824,
-0.15960800647735596,
-0.07496216148138046,
-0.00701875239610672,
-0.05338927358388901,
-0.1111375018954277,
0.05761820077896118,
0.09297259151935577,
-0.08168987184762955,
-0.007150862831622362,
-0.005202229600399733,
0.008680767379701138,
-0.055786266922950745,
0.1848522126674652,
0.06804508715867996,
0.04963282123208046,
-0.09221421927213669,
0.078740693628788,
0.053466781973838806,
-0.06783737242221832,
0.002069707727059722,
0.051822301000356674,
-0.08238379657268524,
-0.044306132942438126,
0.04168517515063286,
0.1661863774061203,
-0.06069563329219818,
-0.05315132066607475,
-0.13344721496105194,
-0.11371056735515594,
0.07395399361848831,
0.14605063199996948,
0.11126945167779922,
0.01506771333515644,
-0.06028079241514206,
0.009044883772730827,
-0.11774054914712906,
0.0973363146185875,
0.037623047828674316,
0.07079041004180908,
-0.15701773762702942,
0.13262054324150085,
0.012670605443418026,
0.048884108662605286,
-0.015056679025292397,
0.028063081204891205,
-0.09889652580022812,
0.010758022777736187,
-0.11294011026620865,
-0.02883855253458023,
-0.03916775435209274,
0.014331023208796978,
-0.0021456601098179817,
-0.062480099499225616,
-0.05277702957391739,
0.016223948448896408,
-0.11017115414142609,
-0.016122926026582718,
0.04496817663311958,
0.06585877388715744,
-0.11007951200008392,
-0.03741435706615448,
0.03309858217835426,
-0.06101668253540993,
0.07368050515651703,
0.0423881933093071,
0.02193746529519558,
0.05113358050584793,
-0.1259763240814209,
0.0235892366617918,
0.06446073949337006,
0.023656876757740974,
0.07357119768857956,
-0.10113248974084854,
-0.0089529138058424,
-0.010991853661835194,
0.04400203377008438,
0.013977083377540112,
0.06130311265587807,
-0.14108780026435852,
-0.011827338486909866,
-0.009157008491456509,
-0.08320708572864532,
-0.0638306587934494,
0.02046378329396248,
0.10418565571308136,
0.0008602235466241837,
0.1981753706932068,
-0.06179562211036682,
0.04756154865026474,
-0.2117757350206375,
0.0058935824781656265,
-0.007358625065535307,
-0.1045740619301796,
-0.10869049280881882,
-0.05579320341348648,
0.04906429722905159,
-0.06423872709274292,
0.14459766447544098,
0.02549799159169197,
0.015691598877310753,
0.023024559020996094,
-0.028128977864980698,
-0.0009669044520705938,
0.014084478840231895,
0.19421252608299255,
0.03133104741573334,
-0.034706518054008484,
0.048376020044088364,
0.039587344974279404,
0.09914796054363251,
0.1156797930598259,
0.17915073037147522,
0.1453060507774353,
-0.011013209819793701,
0.08869179338216782,
0.04585707560181618,
-0.058317314833402634,
-0.17232580482959747,
0.03999433293938637,
-0.034425828605890274,
0.10516515374183655,
-0.013864361681044102,
0.20830076932907104,
0.07593419402837753,
-0.1605183482170105,
0.03970066085457802,
-0.0478888563811779,
-0.08061495423316956,
-0.10640477389097214,
-0.0678434893488884,
-0.07778529822826385,
-0.1299070566892624,
0.00812329351902008,
-0.11562979966402054,
0.004162000957876444,
0.1188858300447464,
0.012480183504521847,
-0.03066159226000309,
0.1602284461259842,
0.027572108432650566,
0.03297096863389015,
0.0538334995508194,
0.01388093363493681,
-0.043510813266038895,
-0.1199287623167038,
-0.06667816638946533,
-0.023303676396608353,
-0.01572546176612377,
0.029776958748698235,
-0.07003532350063324,
-0.05228195711970329,
0.03179251402616501,
-0.005513018928468227,
-0.09409403800964355,
0.007705267518758774,
0.0023221527226269245,
0.05392222851514816,
0.026670651510357857,
0.004293274600058794,
0.03162844106554985,
-0.014026918448507786,
0.1934897005558014,
-0.07162468135356903,
-0.060977280139923096,
-0.11026060581207275,
0.23047643899917603,
0.030201241374015808,
-0.01729167439043522,
0.03972993791103363,
-0.06315582245588303,
0.0023735666181892157,
0.24543973803520203,
0.19169862568378448,
-0.07690301537513733,
-0.010370731353759766,
0.01225160900503397,
-0.014002978801727295,
-0.04018905758857727,
0.09604176878929138,
0.1327652782201767,
0.038848526775836945,
-0.08952470123767853,
-0.041873347014188766,
-0.06585586071014404,
-0.014554943889379501,
-0.03048805147409439,
0.059723690152168274,
0.04884425550699234,
0.008947574533522129,
-0.049752190709114075,
0.046491507440805435,
-0.06603217124938965,
-0.10241126269102097,
0.0743507593870163,
-0.1983649581670761,
-0.1667737513780594,
-0.015553150326013565,
0.09856711328029633,
0.003782016457989812,
0.06296101212501526,
-0.0311396736651659,
-0.004491877742111683,
0.0878095030784607,
-0.015985891222953796,
-0.09395674616098404,
-0.08062379062175751,
0.09759212285280228,
-0.0761115550994873,
0.2252918928861618,
-0.04480304569005966,
0.06796333193778992,
0.12324943393468857,
0.06411156803369522,
-0.07427681982517242,
0.053889110684394836,
0.05125822126865387,
-0.057178568094968796,
0.012041298672556877,
0.07655947655439377,
-0.023636721074581146,
0.08545955270528793,
0.04009299725294113,
-0.13833512365818024,
0.016543110832571983,
-0.038205087184906006,
-0.05695093050599098,
-0.04435383901000023,
-0.023819858208298683,
-0.05373081937432289,
0.1368897706270218,
0.21941928565502167,
-0.031124161556363106,
-0.01576300524175167,
-0.07528430223464966,
0.023462258279323578,
0.061257507652044296,
-0.001670380006544292,
-0.06575644761323929,
-0.21412335336208344,
0.016494860872626305,
0.03719676285982132,
-0.018718702718615532,
-0.2149316370487213,
-0.0969139114022255,
0.013160652481019497,
-0.08247966319322586,
-0.09007509797811508,
0.06030523404479027,
0.08858702331781387,
0.05954839661717415,
-0.05737782269716263,
-0.03400585427880287,
-0.0812983289361,
0.13774538040161133,
-0.14573417603969574,
-0.08957794308662415
] |
null | null |
transformers
|
# electra-small-discriminator-finetuned-ner
This model is a fine-tuned version of [google/electra-small-discriminator](https://huggingface.co/google/electra-small-discriminator) on the wikiann dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3685
- Precision: 0.7331
- Recall: 0.7543
- F1: 0.7435
- Accuracy: 0.8883
## Model description
More information needed
## Intended uses & limitations
More information needed
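As with the sibling DistilBERT card, details are pending; a minimal sketch of raw token-classification inference with this checkpoint (the sentence is illustrative, and the label names are assumed to ship in the checkpoint's `id2label` config):

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "dbsamu/electra-small-discriminator-finetuned-ner"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

inputs = tokenizer("Ada Lovelace was born in London.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each token's highest-scoring class id to its label name.
predicted = logits.argmax(dim=-1)[0]
print([model.config.id2label[int(i)] for i in predicted])
```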
## Training and evaluation data
The model was fine-tuned and evaluated on the wikiann dataset (English split, per the model index in this card's metadata).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.5465 | 1.0 | 1250 | 0.4158 | 0.6932 | 0.7201 | 0.7064 | 0.8735 |
| 0.4037 | 2.0 | 2500 | 0.3817 | 0.7191 | 0.7470 | 0.7328 | 0.8828 |
| 0.3606 | 3.0 | 3750 | 0.3685 | 0.7331 | 0.7543 | 0.7435 | 0.8883 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wikiann"], "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "electra-small-discriminator-finetuned-ner", "results": [{"task": {"type": "token-classification", "name": "Token Classification"}, "dataset": {"name": "wikiann", "type": "wikiann", "args": "en"}, "metrics": [{"type": "precision", "value": 0.7330965535385425, "name": "Precision"}, {"type": "recall", "value": 0.7542632861138681, "name": "Recall"}, {"type": "f1", "value": 0.7435293071244329, "name": "F1"}, {"type": "accuracy", "value": 0.8883011190233978, "name": "Accuracy"}]}]}]}
|
token-classification
|
dbsamu/electra-small-discriminator-finetuned-ner
|
[
"transformers",
"pytorch",
"tensorboard",
"electra",
"token-classification",
"generated_from_trainer",
"dataset:wikiann",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #electra #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
electra-small-discriminator-finetuned-ner
=========================================
This model is a fine-tuned version of google/electra-small-discriminator on the wikiann dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3685
* Precision: 0.7331
* Recall: 0.7543
* F1: 0.7435
* Accuracy: 0.8883
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #electra #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #electra #token-classification #generated_from_trainer #dataset-wikiann #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10451984405517578,
0.11384540796279907,
-0.0023680480662733316,
0.11953621357679367,
0.1660291999578476,
0.03446183353662491,
0.11575037986040115,
0.12087906897068024,
-0.07609503716230392,
0.024236565455794334,
0.12468454241752625,
0.17267951369285583,
0.02592291496694088,
0.10900934785604477,
-0.04164125397801399,
-0.26443102955818176,
0.002568468451499939,
0.051956064999103546,
-0.07011216878890991,
0.12980780005455017,
0.09687090665102005,
-0.1335635483264923,
0.0890362337231636,
0.02408432960510254,
-0.18589696288108826,
0.0001646383316256106,
0.0027322149835526943,
-0.0673932209610939,
0.15481525659561157,
0.02998553216457367,
0.13432452082633972,
-0.00569977005943656,
0.0925118625164032,
-0.1944618970155716,
0.013207489624619484,
0.05421669781208038,
0.00713304802775383,
0.0992627814412117,
0.03182057663798332,
0.019098568707704544,
0.12159993499517441,
-0.052386995404958725,
0.06203595921397209,
0.012979323975741863,
-0.11299838870763779,
-0.221456840634346,
-0.0822552815079689,
0.04765782505273819,
0.07562302052974701,
0.08862791210412979,
0.0008088705944828689,
0.14583534002304077,
-0.08963824063539505,
0.08573800325393677,
0.2212953418493271,
-0.27953505516052246,
-0.06404559314250946,
0.03433167561888695,
0.019305748865008354,
0.037505391985177994,
-0.1023237407207489,
-0.033115454018116,
0.04923247918486595,
0.050626613199710846,
0.14187540113925934,
-0.022298483178019524,
-0.1064370721578598,
0.018387675285339355,
-0.14459967613220215,
-0.04055076837539673,
0.17091193795204163,
0.0553269162774086,
-0.03117903135716915,
-0.044463250786066055,
-0.05971977114677429,
-0.16620436310768127,
-0.025212746113538742,
-0.024046877399086952,
0.04275177791714668,
-0.033589333295822144,
-0.0435648076236248,
0.0036028383765369654,
-0.10581155121326447,
-0.06474365293979645,
-0.07628881931304932,
0.13955657184123993,
0.048719629645347595,
0.019771629944443703,
-0.024421608075499535,
0.11047584563493729,
0.0033917564433068037,
-0.1121235340833664,
0.020097726956009865,
0.023151464760303497,
0.0032913500908762217,
-0.046602264046669006,
-0.04524962231516838,
-0.0536615252494812,
0.018239106982946396,
0.13333839178085327,
-0.03951408341526985,
0.03274831920862198,
0.06101919338107109,
0.04777480289340019,
-0.07467534393072128,
0.17495770752429962,
-0.06274917721748352,
-0.016475584357976913,
0.009733611717820168,
0.05068854242563248,
0.011957475915551186,
-0.005038802977651358,
-0.1176265999674797,
… (remaining values of a 768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# BETO: Spanish BERT
BETO is a [BERT model](https://github.com/google-research/bert) trained on a [big Spanish corpus](https://github.com/josecannete/spanish-corpora). BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with [Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) and other (not BERT-based) models.
## Download
| Model | TensorFlow checkpoint | PyTorch checkpoint | Vocab & config |
|-|:--------:|:-----:|:----:|
|BETO uncased|[tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/pytorch_weights.tar.gz) | [vocab](./config/uncased_2M/vocab.txt), [config](./config/uncased_2M/config.json) |
|BETO cased| [tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/pytorch_weights.tar.gz) | [vocab](./config/cased_2M/vocab.txt), [config](./config/cased_2M/config.json) |
All models use a vocabulary of about 31k BPE subwords constructed using SentencePiece and were trained for 2M steps.
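To see the subword vocabulary in action, here is a minimal sketch that loads the cased tokenizer through the standard Transformers interface and splits a Spanish sentence into subword pieces (the example sentence and any pieces it produces are illustrative, not taken from the original release):
```python
from transformers import AutoTokenizer

# Load the tokenizer that ships with the cased BETO checkpoint
tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")

# Frequent words stay whole; rare words are split into pieces from the
# ~31k-subword vocabulary (continuation pieces carry a "##" prefix).
print(tokenizer.tokenize("El otorrinolaringólogo trabaja en Santiago."))
```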
## Benchmarks
The following table shows some BETO results on the Spanish version of each task.
We compare BETO (cased and uncased) with the Best Multilingual BERT results that
we found in the literature (as of October 2019).
The table also shows some alternative methods for the same tasks (not necessarily BERT-based methods).
References for all methods can be found [here](#references).
|Task | BETO-cased | BETO-uncased | Best Multilingual BERT | Other results |
|-------|--------------:|--------------:|--------------------------:|-------------------------------:|
|[POS](https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-1827) | **98.97** | 98.44 | 97.10 [2] | 98.91 [6], 96.71 [3] |
|[NER-C](https://www.kaggle.com/nltkdata/conll-corpora) | [**88.43**](https://github.com/gchaperon/beto-benchmarks/blob/master/conll2002/dev_results_beto-cased_conll2002.txt) | 82.67 | 87.38 [2] | 87.18 [3] |
|[MLDoc](https://github.com/facebookresearch/MLDoc) | [95.60](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-cased_mldoc.txt) | [**96.12**](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-uncased_mldoc.txt) | 95.70 [2] | 88.75 [4] |
|[PAWS-X](https://github.com/google-research-datasets/paws/tree/master/pawsx) | 89.05 | 89.55 | 90.70 [8] | - |
|[XNLI](https://github.com/facebookresearch/XNLI) | **82.01** | 80.15 | 78.50 [2] | 80.80 [5], 77.80 [1], 73.15 [4]|
## Example of use
For further details on how to use BETO you can visit the [🤗Huggingface Transformers library](https://github.com/huggingface/transformers), starting with the [Quickstart section](https://huggingface.co/transformers/quickstart.html).
BETO models can be accessed simply as [`'dccuchile/bert-base-spanish-wwm-cased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) and [`'dccuchile/bert-base-spanish-wwm-uncased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) by using the Transformers library.
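As a quick illustration, here is a minimal sketch of loading the cased model and running a masked-word prediction with the standard `AutoTokenizer`/`AutoModelForMaskedLM` classes and the `fill-mask` pipeline (the example sentence is our own):
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# Download the cased BETO checkpoint and its tokenizer from the model hub
tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")
model = AutoModelForMaskedLM.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")

# Predict the most likely fillers for the masked token
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("Todos los caminos llevan a [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```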
An example of how to download and use the models on this page can be found in [this colab notebook](https://colab.research.google.com/drive/1pYOYsCU59GBOwztkWCw5PTsqBiJbRy4S?usp=sharing).
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers 😉)
## Acknowledgments
We thank [Adereso](https://www.adere.so/) for kindly providing support for training BETO-uncased, and the [Millennium Institute for Foundational Research on Data](https://imfd.cl/en/)
that provided support for training BETO-cased. Also thanks to Google for helping us with the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc) program.
## Citation
[Spanish Pre-Trained BERT Model and Evaluation Data](https://users.dcc.uchile.cl/~jperez/papers/pml4dc2020.pdf)
To cite this resource in a publication, please use the following:
```
@inproceedings{CaneteCFP2020,
title={Spanish Pre-Trained BERT Model and Evaluation Data},
author={Cañete, José and Chaperon, Gabriel and Fuentes, Rodrigo and Ho, Jou-Hui and Kang, Hojin and Pérez, Jorge},
booktitle={PML4DC at ICLR 2020},
year={2020}
}
```
## License Disclaimer
The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs.
## References
* [1] [Original Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md)
* [2] [Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"](https://arxiv.org/pdf/1904.09077.pdf)
* [3] [Multilingual BERT on "How Multilingual is Multilingual BERT?"](https://arxiv.org/pdf/1906.01502.pdf)
* [4] [LASER](https://arxiv.org/abs/1812.10464)
* [5] [XLM (MLM+TLM)](https://arxiv.org/pdf/1901.07291.pdf)
* [6] [UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"](https://arxiv.org/pdf/1904.02099.pdf)
* [7] [Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"](https://arxiv.org/pdf/1906.01569.pdf)
* [8] [Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"](https://arxiv.org/abs/1908.11828)
|
{"language": ["es"], "tags": ["masked-lm"]}
|
fill-mask
|
dccuchile/bert-base-spanish-wwm-cased
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"masked-lm",
"es",
"arxiv:1904.09077",
"arxiv:1906.01502",
"arxiv:1812.10464",
"arxiv:1901.07291",
"arxiv:1904.02099",
"arxiv:1906.01569",
"arxiv:1908.11828",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1904.09077",
"1906.01502",
"1812.10464",
"1901.07291",
"1904.02099",
"1906.01569",
"1908.11828"
] |
[
"es"
] |
TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
BETO: Spanish BERT
==================
BETO is a BERT model trained on a big Spanish corpus. BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with Multilingual BERT and other (not BERT-based) models.
Download
--------
All models use a vocabulary of about 31k BPE subwords constructed using SentencePiece and were trained for 2M steps.
Benchmarks
----------
The following table shows some BETO results on the Spanish version of each task.
We compare BETO (cased and uncased) with the Best Multilingual BERT results that
we found in the literature (as of October 2019).
The table also shows some alternative methods for the same tasks (not necessarily BERT-based methods).
References for all methods can be found here.
Example of use
--------------
For further details on how to use BETO you can visit the Huggingface Transformers library, starting with the Quickstart section.
BETO models can be accessed simply as 'dccuchile/bert-base-spanish-wwm-cased' and 'dccuchile/bert-base-spanish-wwm-uncased' by using the Transformers library.
An example of how to download and use the models on this page can be found in this colab notebook.
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers.)
Acknowledgments
---------------
We thank Adereso for kindly providing support for training BETO-uncased, and the Millennium Institute for Foundational Research on Data
that provided support for training BETO-cased. Also thanks to Google for helping us with the TensorFlow Research Cloud program.
Spanish Pre-Trained BERT Model and Evaluation Data
To cite this resource in a publication, please use the following:
License Disclaimer
------------------
The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs.
References
----------
* [1] Original Multilingual BERT
* [2] Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"
* [3] Multilingual BERT on "How Multilingual is Multilingual BERT?"
* [4] LASER
* [5] XLM (MLM+TLM)
* [6] UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"
* [7] Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"
* [8] Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
112
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[ … (768-dimensional embedding vector omitted) … ] |
null | null |
transformers
|
# BETO: Spanish BERT
BETO is a [BERT model](https://github.com/google-research/bert) trained on a [big Spanish corpus](https://github.com/josecannete/spanish-corpora). BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with [Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) and other (not BERT-based) models.
## Download
| Model | TensorFlow checkpoint | PyTorch checkpoint | Vocab & config |
|-|:--------:|:-----:|:----:|
|BETO uncased|[tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/pytorch_weights.tar.gz) | [vocab](./config/uncased_2M/vocab.txt), [config](./config/uncased_2M/config.json) |
|BETO cased| [tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/pytorch_weights.tar.gz) | [vocab](./config/cased_2M/vocab.txt), [config](./config/cased_2M/config.json) |
All models use a vocabulary of about 31k BPE subwords constructed using SentencePiece and were trained for 2M steps.
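As a quick sanity check on the vocabulary, the following minimal sketch loads the uncased tokenizer and inspects its size (the exact number depends on the released files; ~31k is the figure reported above):
```python
from transformers import AutoTokenizer

# Load the tokenizer bundled with the uncased BETO checkpoint
tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-uncased")

# Should be on the order of the ~31k subwords mentioned above
print("vocab size:", tokenizer.vocab_size)

# Uncased: text is lowercased before subword splitting
print(tokenizer.tokenize("Las Universidades de Chile"))
```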
## Benchmarks
The following table shows some BETO results on the Spanish version of each task.
We compare BETO (cased and uncased) with the Best Multilingual BERT results that
we found in the literature (as of October 2019).
The table also shows some alternative methods for the same tasks (not necessarily BERT-based methods).
References for all methods can be found [here](#references).
|Task | BETO-cased | BETO-uncased | Best Multilingual BERT | Other results |
|-------|--------------:|--------------:|--------------------------:|-------------------------------:|
|[POS](https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-1827) | **98.97** | 98.44 | 97.10 [2] | 98.91 [6], 96.71 [3] |
|[NER-C](https://www.kaggle.com/nltkdata/conll-corpora) | [**88.43**](https://github.com/gchaperon/beto-benchmarks/blob/master/conll2002/dev_results_beto-cased_conll2002.txt) | 82.67 | 87.38 [2] | 87.18 [3] |
|[MLDoc](https://github.com/facebookresearch/MLDoc) | [95.60](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-cased_mldoc.txt) | [**96.12**](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-uncased_mldoc.txt) | 95.70 [2] | 88.75 [4] |
|[PAWS-X](https://github.com/google-research-datasets/paws/tree/master/pawsx) | 89.05 | 89.55 | 90.70 [8] | - |
|[XNLI](https://github.com/facebookresearch/XNLI) | **82.01** | 80.15 | 78.50 [2] | 80.80 [5], 77.80 [1], 73.15 [4]|
## Example of use
For further details on how to use BETO you can visit the [🤗Huggingface Transformers library](https://github.com/huggingface/transformers), starting with the [Quickstart section](https://huggingface.co/transformers/quickstart.html).
BETO models can be accessed simply as [`'dccuchile/bert-base-spanish-wwm-cased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) and [`'dccuchile/bert-base-spanish-wwm-uncased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) by using the Transformers library.
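As a quick illustration, here is a minimal sketch that runs the uncased checkpoint through the `fill-mask` pipeline, passing the model id directly (the example sentence is our own):
```python
from transformers import pipeline

# The pipeline downloads the uncased BETO checkpoint and its tokenizer
fill_mask = pipeline("fill-mask", model="dccuchile/bert-base-spanish-wwm-uncased")

# Top predictions for the masked token in a Spanish sentence
for prediction in fill_mask("me gusta leer [MASK] en la biblioteca."):
    print(prediction["token_str"], round(prediction["score"], 3))
```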
An example of how to download and use the models on this page can be found in [this colab notebook](https://colab.research.google.com/drive/1pYOYsCU59GBOwztkWCw5PTsqBiJbRy4S?usp=sharing).
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers 😉)
## Acknowledgments
We thank [Adereso](https://www.adere.so/) for kindly providing support for training BETO-uncased, and the [Millennium Institute for Foundational Research on Data](https://imfd.cl/en/)
that provided support for training BETO-cased. Also thanks to Google for helping us with the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc) program.
## Citation
[Spanish Pre-Trained BERT Model and Evaluation Data](https://users.dcc.uchile.cl/~jperez/papers/pml4dc2020.pdf)
To cite this resource in a publication, please use the following:
```
@inproceedings{CaneteCFP2020,
title={Spanish Pre-Trained BERT Model and Evaluation Data},
author={Cañete, José and Chaperon, Gabriel and Fuentes, Rodrigo and Ho, Jou-Hui and Kang, Hojin and Pérez, Jorge},
booktitle={PML4DC at ICLR 2020},
year={2020}
}
```
## License Disclaimer
The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs.
## References
* [1] [Original Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md)
* [2] [Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"](https://arxiv.org/pdf/1904.09077.pdf)
* [3] [Multilingual BERT on "How Multilingual is Multilingual BERT?"](https://arxiv.org/pdf/1906.01502.pdf)
* [4] [LASER](https://arxiv.org/abs/1812.10464)
* [5] [XLM (MLM+TLM)](https://arxiv.org/pdf/1901.07291.pdf)
* [6] [UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"](https://arxiv.org/pdf/1904.02099.pdf)
* [7] [Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"](https://arxiv.org/pdf/1906.01569.pdf)
* [8] [Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"](https://arxiv.org/abs/1908.11828)
|
{"language": ["es"], "tags": ["masked-lm"]}
|
fill-mask
|
dccuchile/bert-base-spanish-wwm-uncased
|
[
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"masked-lm",
"es",
"arxiv:1904.09077",
"arxiv:1906.01502",
"arxiv:1812.10464",
"arxiv:1901.07291",
"arxiv:1904.02099",
"arxiv:1906.01569",
"arxiv:1908.11828",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1904.09077",
"1906.01502",
"1812.10464",
"1901.07291",
"1904.02099",
"1906.01569",
"1908.11828"
] |
[
"es"
] |
TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
BETO: Spanish BERT
==================
BETO is a BERT model trained on a big Spanish corpus. BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with Multilingual BERT and other (not BERT-based) models.
Download
--------
All models use a vocabulary of about 31k BPE subwords constructed using SentencePiece and were trained for 2M steps.
Benchmarks
----------
The following table shows some BETO results on the Spanish version of each task.
We compare BETO (cased and uncased) with the Best Multilingual BERT results that
we found in the literature (as of October 2019).
The table also shows some alternative methods for the same tasks (not necessarily BERT-based methods).
References for all methods can be found here.
Example of use
--------------
For further details on how to use BETO you can visit the Huggingface Transformers library, starting with the Quickstart section.
BETO models can be accessed simply as 'dccuchile/bert-base-spanish-wwm-cased' and 'dccuchile/bert-base-spanish-wwm-uncased' by using the Transformers library.
An example of how to download and use the models on this page can be found in this colab notebook.
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers.)
Acknowledgments
---------------
We thank Adereso for kindly providing support for training BETO-uncased, and the Millennium Institute for Foundational Research on Data
that provided support for training BETO-cased. Also thanks to Google for helping us with the TensorFlow Research Cloud program.
Spanish Pre-Trained BERT Model and Evaluation Data
To cite this resource in a publication, please use the following:
License Disclaimer
------------------
The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs.
References
----------
* [1] Original Multilingual BERT
* [2] Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"
* [3] Multilingual BERT on "How Multilingual is Multilingual BERT?"
* [4] LASER
* [5] XLM (MLM+TLM)
* [6] UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"
* [7] Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"
* [8] Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
112
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #masked-lm #es #arxiv-1904.09077 #arxiv-1906.01502 #arxiv-1812.10464 #arxiv-1901.07291 #arxiv-1904.02099 #arxiv-1906.01569 #arxiv-1908.11828 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] |
[
… (values of a 768-dimensional embedding vector omitted) …
0.018084736540913582,
0.07156524062156677,
0.14941629767417908,
0.09493794292211533,
0.15900486707687378,
0.06337503343820572,
0.04949294030666351,
-0.06887125223875046,
-0.07187860459089279,
-0.14588597416877747,
-0.04298205301165581,
0.04158030450344086,
-0.004200899042189121,
0.062021467834711075,
0.19703128933906555,
0.006447294261306524,
0.06438492983579636,
-0.07178395241498947,
0.002107169944792986,
-0.13926440477371216,
-0.08861929178237915,
-0.028241218999028206,
-0.035174816846847534,
-0.020917708054184914,
-0.04065854474902153,
0.051272276788949966,
0.0476234070956707,
0.02808297425508499,
0.012623959220945835,
0.10220830142498016,
0.022050224244594574,
-0.07008835673332214,
0.02032545395195484,
0.03840262070298195,
0.02253645285964012,
-0.05999724939465523,
0.019009150564670563,
-0.02563866600394249,
-0.061275750398635864,
-0.05038198083639145,
-0.01750919781625271,
-0.0507647767663002,
-0.0027553816325962543,
-0.03672603517770767,
-0.11431879550218582,
-0.05205933749675751,
0.02081097848713398,
0.0020446586422622204,
0.09802315384149551,
0.01740480586886406,
0.07260674238204956,
-0.007721601985394955,
0.16231995820999146,
-0.06192592903971672,
-0.005356399342417717,
0.006289863493293524,
0.2047567218542099,
-0.023484181612730026,
0.05830555781722069,
-0.04548345506191254,
0.005695577710866928,
-0.04913974553346634,
0.1967952400445938,
0.2968980669975281,
-0.08259399980306625,
0.07066705077886581,
0.03844944015145302,
0.03447898477315903,
0.04283467307686806,
0.05087996646761894,
0.09234032034873962,
0.2202320396900177,
-0.1027471125125885,
-0.0033104303292930126,
-0.10090050846338272,
0.03559855371713638,
-0.02644142135977745,
0.023119712248444557,
0.06528598815202713,
-0.032191548496484756,
-0.04592393711209297,
0.052972279489040375,
-0.08249510079622269,
0.0010646700393408537,
0.0630367174744606,
-0.2937219440937042,
-0.06900615990161896,
-0.019847322255373,
0.11500700563192368,
0.03086903877556324,
0.11305054277181625,
-0.034925125539302826,
-0.050195224583148956,
-0.027882901951670647,
0.027025535702705383,
-0.14821484684944153,
-0.11353529989719391,
0.0947364792227745,
-0.02706257626414299,
0.13843326270580292,
-0.05521116405725479,
0.06433174759149551,
0.1118536964058876,
0.04547524452209473,
-0.07040004432201385,
0.028966329991817474,
0.054019998759031296,
-0.06476250290870667,
-0.09176972508430481,
0.009138362482190132,
0.019884787499904633,
-0.029836688190698624,
0.06475906074047089,
-0.0879134014248848,
0.041965365409851074,
0.030087174847722054,
-0.03340911865234375,
0.02589462697505951,
0.06718237698078156,
-0.027889233082532883,
0.0715578943490982,
0.11757685989141464,
-0.00850208941847086,
-0.06809531897306442,
-0.038139913231134415,
-0.07972521334886551,
0.09817636758089066,
-0.04410087317228317,
-0.12588010728359222,
-0.038403745740652084,
-0.002524776617065072,
0.0036709231790155172,
0.022423051297664642,
-0.18239817023277283,
-0.06490959972143173,
-0.062358830124139786,
-0.004827936179935932,
-0.060926683247089386,
0.011973650194704533,
0.06061045825481415,
0.03092498704791069,
0.01133742742240429,
-0.09528668224811554,
0.07328860461711884,
0.10089890658855438,
-0.14924980700016022,
-0.05849994346499443
] |
null | null | null |
https://teespring.com/dashboard/listings/113925135/edit
|
{}
| null |
ddddd/EDCLasVegas
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
URL
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
[
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null |
sentence-transformers
|
# ddobokki/electra-small-nli-sts
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 256-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('ddobokki/electra-small-nli-sts')
embeddings = model.encode(sentences)
print(embeddings)
```
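As a quick sketch of the semantic-search use case mentioned above (the query and corpus strings here are illustrative, not from the model's training data), the encoded vectors can be scored with the library's `util.cos_sim` helper:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('ddobokki/electra-small-nli-sts')

corpus = ["This is an example sentence", "Each sentence is converted"]
query = "An example sentence"

# Encode to tensors so util.cos_sim can operate on them directly.
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Shape [1, len(corpus)]: cosine similarity of the query to each corpus entry.
scores = util.cos_sim(query_embedding, corpus_embeddings)
print(scores)
```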
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Mean Pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('ddobokki/electra-small-nli-sts')
model = AutoModel.from_pretrained('ddobokki/electra-small-nli-sts')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=ddobokki/electra-small-nli-sts)
## Training
The model was trained with the parameters:
**DataLoader**:
`sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader` of length 9039 with parameters:
```
{'batch_size': 64}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 903,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 904,
"weight_decay": 0.01
}
```
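For orientation, here is a minimal sketch of how a run with these components could be wired together through the sentence-transformers API. The two training pairs are placeholders (the card does not ship the actual NLI training data), and the batch size is shrunk to fit the toy example; the real run used 64:
```python
from sentence_transformers import SentenceTransformer, InputExample, losses, datasets

model = SentenceTransformer('ddobokki/electra-small-nli-sts')

# Placeholder (anchor, positive) pairs; MultipleNegativesRankingLoss treats the
# other positives in the batch as in-batch negatives.
train_examples = [
    InputExample(texts=["A man is eating food.", "A man is eating a meal."]),
    InputExample(texts=["A woman is playing violin.", "A woman plays an instrument."]),
]

# NoDuplicatesDataLoader guarantees no duplicate texts within a single batch.
train_dataloader = datasets.NoDuplicatesDataLoader(train_examples, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=904,
    weight_decay=0.01,
    max_grad_norm=1,
)
```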
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: ElectraModel
(1): Pooling({'word_embedding_dimension': 256, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
{"tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers", "ko"], "pipeline_tag": "sentence-similarity"}
|
sentence-similarity
|
ddobokki/electra-small-nli-sts
|
[
"sentence-transformers",
"pytorch",
"electra",
"feature-extraction",
"sentence-similarity",
"transformers",
"ko",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#sentence-transformers #pytorch #electra #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us
|
# ddobokki/electra-small-nli-sts
This is a sentence-transformers model: It maps sentences & paragraphs to a 256-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have sentence-transformers installed:
Then you can use the model like this:
## Usage (HuggingFace Transformers)
Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL
## Training
The model was trained with the parameters:
DataLoader:
'sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader' of length 9039 with parameters:
Loss:
'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:
Parameters of the fit()-Method:
## Full Model Architecture
## Citing & Authors
|
[
"# ddobokki/electra-small-nli-sts\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 256 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader' of length 9039 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] |
[
"TAGS\n#sentence-transformers #pytorch #electra #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us \n",
"# ddobokki/electra-small-nli-sts\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 256 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader' of length 9039 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] |
[
45,
59,
38,
64,
29,
102,
5,
6
] |
[
"passage: TAGS\n#sentence-transformers #pytorch #electra #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us \n# ddobokki/electra-small-nli-sts\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 256 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader' of length 9039 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss' with parameters:\n \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors"
] |
[
-0.03962735831737518,
0.1506088227033615,
-0.006408609449863434,
0.03888203576207161,
0.12753532826900482,
0.03395552560687065,
0.13362735509872437,
0.0776664987206459,
-0.01969245634973049,
0.09331703931093216,
0.018522275611758232,
0.10673970729112625,
0.006745724473148584,
0.05112452059984207,
0.037999410182237625,
-0.26961585879325867,
0.02554219588637352,
-0.04831480234861374,
0.03953806310892105,
0.058338895440101624,
0.111011803150177,
-0.07680173963308334,
0.05313337221741676,
0.017670651897788048,
-0.038450323045253754,
0.009769195690751076,
-0.039233818650245667,
-0.04970689117908478,
0.08694861829280853,
0.04107626900076866,
0.0264817513525486,
-0.003529503708705306,
0.014100106433033943,
-0.19112496078014374,
0.010888325981795788,
0.07367824763059616,
-0.007692819926887751,
0.05780632421374321,
0.02233779989182949,
-0.022977635264396667,
0.19150355458259583,
-0.08533995598554611,
0.06766145676374435,
0.03871124982833862,
-0.1075962632894516,
-0.09854792058467865,
-0.05577152967453003,
-0.005472410935908556,
0.13279785215854645,
0.10489047318696976,
-0.04810555279254913,
0.13565440475940704,
-0.04879254102706909,
0.07994632422924042,
0.07298950850963593,
-0.26967653632164,
-0.02367100678384304,
0.026935027912259102,
0.04348889738321304,
0.00010995908087352291,
-0.09897083789110184,
-0.021717799827456474,
-0.02827281504869461,
0.037376146763563156,
0.051920756697654724,
-0.044439561665058136,
0.05671684816479683,
-0.037274569272994995,
-0.09146449714899063,
0.007661647628992796,
0.21987561881542206,
0.03329797089099884,
-0.03219511732459068,
-0.17492109537124634,
-0.07254141569137573,
0.05035538226366043,
-0.04191209748387337,
-0.04652191698551178,
0.023471225053071976,
0.04422653093934059,
-0.027053125202655792,
-0.08114078640937805,
-0.09784717857837677,
-0.007258032914251089,
-0.06446702778339386,
0.027127614244818687,
-0.007410416845232248,
-0.05172213912010193,
-0.01091833133250475,
0.07808951288461685,
-0.09013934433460236,
-0.11011120676994324,
-0.023581350222229958,
-0.020326752215623856,
-0.10958024859428406,
-0.040631093084812164,
-0.060258008539676666,
-0.13016588985919952,
0.032388415187597275,
0.17552664875984192,
0.09351538866758347,
0.00912915263324976,
-0.0024169194512069225,
0.04971165210008621,
0.047017838805913925,
0.1338500678539276,
-0.05231945216655731,
-0.09753130376338959,
-0.0284233707934618,
0.0011639847652986646,
0.02894362062215805,
-0.018378067761659622,
-0.026264917105436325,
-0.011577163822948933,
0.05050777271389961,
0.05322902277112007,
0.06758968532085419,
0.04782313480973244,
-0.04699180647730827,
-0.04592323303222656,
0.07769312709569931,
-0.12238264083862305,
0.014571469277143478,
0.009627126157283783,
-0.05945565178990364,
0.05460837110877037,
0.08144304156303406,
-0.01720719411969185,
-0.08133604377508163,
0.043742649257183075,
-0.09870772063732147,
-0.017299387603998184,
-0.0535077266395092,
-0.12381608784198761,
-0.0004628680180758238,
0.013234894722700119,
-0.02669890597462654,
-0.10218998789787292,
-0.17706219851970673,
-0.08058969676494598,
0.031913697719573975,
-0.03707708790898323,
-0.020083794370293617,
-0.1182013750076294,
0.005610624328255653,
0.010308663360774517,
-0.009420042857527733,
-0.06632307171821594,
-0.00025832984829321504,
0.01815154403448105,
-0.04405257850885391,
0.043818917125463486,
0.029534149914979935,
0.04955286905169487,
-0.11129230260848999,
0.022663520649075508,
-0.14553505182266235,
0.1576719880104065,
-0.016227835789322853,
0.07724151760339737,
-0.14774470031261444,
0.0034300389233976603,
-0.019724441692233086,
0.054010339081287384,
0.02386370487511158,
0.1406981498003006,
-0.21158060431480408,
-0.06368208676576614,
0.10665063560009003,
-0.06626281142234802,
-0.08729083091020584,
0.09452524781227112,
-0.03729463741183281,
0.15468016266822815,
0.12436258047819138,
0.10259702801704407,
0.10915607959032059,
-0.030633609741926193,
-0.035419825464487076,
0.010682969354093075,
-0.04391908273100853,
0.07633521407842636,
0.034844834357500076,
-0.07772570103406906,
0.11953051388263702,
0.002489939332008362,
-0.048394735902547836,
0.011509578675031662,
0.006020260043442249,
-0.05994303897023201,
0.016832727938890457,
-0.04612041637301445,
0.050810255110263824,
-0.03166896849870682,
0.03110784851014614,
0.022098533809185028,
-0.09841828048229218,
0.1137121170759201,
0.09449104964733124,
-0.06862175464630127,
0.018974104896187782,
-0.07729866355657578,
0.007513664197176695,
-0.018298283219337463,
0.019482169300317764,
-0.2088036984205246,
-0.11027137190103531,
0.01940181851387024,
-0.03256573900580406,
0.07718505710363388,
0.05400930345058441,
0.0539570115506649,
0.02889142744243145,
-0.010557740926742554,
-0.011113874614238739,
0.03802315518260002,
-0.008401085622608662,
-0.08515842258930206,
-0.10505998134613037,
0.000014320765330921859,
-0.0375647246837616,
0.05473032966256142,
-0.13286229968070984,
0.01857808791100979,
0.006327766925096512,
0.060323316603899,
0.042319849133491516,
-0.03894621506333351,
-0.007305168081074953,
-0.025340840220451355,
-0.012978127226233482,
-0.03417883813381195,
0.03667106851935387,
0.034849781543016434,
-0.14352655410766602,
0.049961138516664505,
-0.21087060868740082,
-0.11013251543045044,
0.06576623022556305,
0.020937930792570114,
-0.08087463676929474,
-0.006373317446559668,
-0.008466719649732113,
-0.01490102894604206,
-0.027006739750504494,
-0.04812848940491676,
0.154516339302063,
0.0934748724102974,
0.10986221581697464,
-0.019067633897066116,
-0.020411726087331772,
-0.04053244739770889,
-0.034876029938459396,
-0.047813963145017624,
0.08691154420375824,
-0.03133340924978256,
-0.1285620927810669,
0.050005875527858734,
0.06616857647895813,
-0.0842236801981926,
0.10789211839437485,
-0.004421573597937822,
-0.0668792650103569,
-0.07829416543245316,
0.03411132097244263,
0.03909299522638321,
0.0025142503436654806,
-0.08822237700223923,
0.019944384694099426,
0.058893777430057526,
0.01834631897509098,
0.019754856824874878,
-0.04051687568426132,
0.053434740751981735,
0.0378798209130764,
0.026682278141379356,
0.09000378847122192,
0.029451826587319374,
-0.014766693115234375,
0.055839356034994125,
0.013330097310245037,
0.04125232249498367,
-0.031614046543836594,
-0.038789741694927216,
-0.11055231094360352,
0.17062848806381226,
-0.1343730241060257,
-0.18717634677886963,
-0.14274214208126068,
0.016418755054473877,
-0.058593425899744034,
0.01685662567615509,
0.07068048417568207,
-0.0397358164191246,
-0.06633441895246506,
-0.06278777867555618,
0.03049320913851261,
0.05383021757006645,
-0.058091312646865845,
0.02414592355489731,
0.03430045768618584,
0.021051866933703423,
-0.11559223383665085,
-0.011643249541521072,
-0.0008374509052373469,
-0.06191040948033333,
-0.0032666129991412163,
0.01146321278065443,
0.04623241350054741,
0.10521532595157623,
0.05921522527933121,
-0.018187666311860085,
0.010525147430598736,
0.1995430886745453,
-0.08193470537662506,
0.049086932092905045,
0.18543922901153564,
-0.0019623953849077225,
0.0600886307656765,
0.08422628790140152,
0.019280757755041122,
-0.0631987527012825,
0.05131009593605995,
0.05024031549692154,
-0.02222074568271637,
-0.13881303369998932,
-0.10752400755882263,
-0.07606184482574463,
-0.004412913229316473,
0.13089127838611603,
0.04767047241330147,
0.0029921585228294134,
0.06280508637428284,
-0.012077578343451023,
0.027583029121160507,
0.06558586657047272,
0.12492796033620834,
0.13409021496772766,
-0.01722876913845539,
0.09455094486474991,
-0.05010188743472099,
-0.06315451115369797,
0.05572296306490898,
-0.006906709633767605,
0.13284249603748322,
0.005753975361585617,
0.14517371356487274,
0.06232412904500961,
-0.02671508863568306,
-0.02935611642897129,
0.07738018780946732,
-0.0222797691822052,
0.02043749764561653,
-0.022671489045023918,
-0.08770454674959183,
-0.045658040791749954,
0.08476747572422028,
0.08323685079813004,
-0.03976999968290329,
-0.033408403396606445,
0.06846822053194046,
0.10622143000364304,
0.14938770234584808,
0.08244698494672775,
-0.2419794201850891,
-0.07190028578042984,
0.022015556693077087,
-0.06824829429388046,
-0.0634644404053688,
-0.017677748575806618,
0.027235515415668488,
-0.126426100730896,
0.04320375621318817,
-0.01271619088947773,
0.07601874321699142,
-0.12057946622371674,
0.026593703776597977,
-0.06649330258369446,
0.042725950479507446,
0.0005098285037092865,
0.07206184417009354,
-0.23487338423728943,
0.05894911289215088,
0.020297423005104065,
0.08206643164157867,
-0.04954368993639946,
0.03633479028940201,
0.06955857574939728,
-0.027633758261799812,
0.159723699092865,
-0.019221704453229904,
-0.0308337714523077,
0.01587485522031784,
-0.07040901482105255,
-0.007462841924279928,
0.028816038742661476,
-0.10987798869609833,
0.08351816236972809,
-0.029866905882954597,
-0.013625255785882473,
-0.0071752676740288734,
0.016796225681900978,
-0.06155268847942352,
-0.1671600043773651,
0.027197662740945816,
0.004414073657244444,
0.008891422301530838,
-0.01501049567013979,
-0.007676123175770044,
0.029212506487965584,
0.2070668786764145,
-0.0980672761797905,
-0.06431741267442703,
-0.11282504349946976,
0.009018965065479279,
0.10288780182600021,
-0.10156790167093277,
0.02243030071258545,
0.0008810595609247684,
0.13489298522472382,
-0.05211710184812546,
-0.05885177478194237,
0.06491450220346451,
-0.06919287890195847,
-0.06033950671553612,
-0.037426963448524475,
0.06019284203648567,
0.07481160759925842,
0.028970515355467796,
0.0335918553173542,
0.06450432538986206,
-0.04279796779155731,
-0.09265930205583572,
-0.07271420210599899,
0.12506787478923798,
0.0027189631946384907,
0.077931247651577,
-0.10947296023368835,
-0.06153241544961929,
-0.08567876368761063,
0.0337255522608757,
0.20580080151557922,
0.23579338192939758,
-0.05641666427254677,
0.060670241713523865,
0.16241906583309174,
-0.08964281529188156,
-0.19916534423828125,
-0.0757494792342186,
0.030528228729963303,
0.04104447364807129,
0.08344189822673798,
-0.15991215407848358,
0.08463197946548462,
0.0813165232539177,
-0.009464254602789879,
-0.04576074704527855,
-0.24738000333309174,
-0.14199836552143097,
0.16254305839538574,
0.02172398380935192,
-0.0000045129704631108325,
-0.09405013918876648,
-0.04953749477863312,
-0.0931343212723732,
-0.03270368278026581,
0.08917031437158585,
-0.010968584567308426,
0.09828484058380127,
0.03360462561249733,
0.025158023461699486,
0.056068453937768936,
-0.007922869175672531,
0.12640000879764557,
0.08884672820568085,
0.029950685799121857,
-0.050703778862953186,
-0.008566056378185749,
0.10348054766654968,
-0.09236511588096619,
0.15473976731300354,
-0.027371598407626152,
0.03672277554869652,
-0.1302233785390854,
-0.03394857794046402,
-0.04191533103585243,
0.03687625378370285,
-0.04286467283964157,
-0.05012882128357887,
-0.016935398802161217,
0.060077596455812454,
0.10391143709421158,
0.004925466608256102,
0.007907887920737267,
-0.04159172251820564,
0.0381779707968235,
0.15341068804264069,
0.10307924449443817,
0.11458397656679153,
-0.16516093909740448,
0.019992128014564514,
0.0012737695360556245,
0.06156216934323311,
-0.0896538645029068,
0.06381116062402725,
0.07121284306049347,
0.005936420988291502,
0.13194383680820465,
0.012377649545669556,
-0.09073034673929214,
0.0021774619817733765,
0.05497823283076286,
-0.07847905158996582,
-0.17021329700946808,
-0.014684971421957016,
-0.019277557730674744,
-0.12035184353590012,
-0.046870190650224686,
0.1561134159564972,
-0.014477599412202835,
0.009476782754063606,
0.03395926207304001,
0.04550692439079285,
-0.035338182002305984,
0.13114261627197266,
-0.010069685988128185,
0.03946685418486595,
-0.05185394734144211,
0.10546233505010605,
0.08501636981964111,
-0.07510516047477722,
0.05474216490983963,
0.08944723755121231,
-0.07988821715116501,
-0.08652950078248978,
-0.08812984824180603,
0.10130549967288971,
-0.09382376074790955,
0.02307688258588314,
-0.061253197491168976,
-0.05218648537993431,
0.013146686367690563,
-0.024265944957733154,
0.03912624716758728,
0.06437918543815613,
-0.10700836032629013,
-0.017885897308588028,
-0.10528586059808731,
0.07545045763254166,
0.09994127601385117,
0.018499240279197693,
-0.02930687554180622,
0.0799754187464714,
-0.0212445929646492,
0.00750233419239521,
-0.010020083747804165,
-0.05436614528298378,
-0.07323910295963287,
-0.010515003465116024,
-0.030132824555039406,
-0.02799692191183567,
-0.13560611009597778,
-0.00796523317694664,
0.028887443244457245,
0.06512825191020966,
-0.01351648848503828,
-0.02015186846256256,
-0.056192636489868164,
-0.07777035981416702,
-0.04032665491104126,
0.10693450272083282,
-0.1319819688796997,
0.0025262869894504547,
0.025300385430455208,
-0.10041419416666031,
0.09197168797254562,
0.019566837698221207,
-0.011331334710121155,
0.024971552193164825,
-0.01065745297819376,
-0.02806818298995495,
0.017501210793852806,
0.03392374515533447,
0.04606784135103226,
-0.10626913607120514,
0.009568086825311184,
-0.04132533073425293,
0.03208470344543457,
0.0003817307879216969,
0.05326228588819504,
-0.11268521845340729,
0.026088081300258636,
-0.019460076466202736,
-0.003293291199952364,
-0.1064826175570488,
0.03006196767091751,
0.022177962586283684,
0.045421574264764786,
0.1770658940076828,
-0.04601770639419556,
0.0702708438038826,
-0.12069980055093765,
-0.0021750687155872583,
0.01638862118124962,
-0.03752763941884041,
0.11341414600610733,
-0.10141897946596146,
0.06074533611536026,
-0.033118896186351776,
0.06617799401283264,
-0.04721002280712128,
0.022622615098953247,
0.04823518544435501,
0.00816038902848959,
-0.0312960147857666,
-0.00005803194653708488,
0.0897388681769371,
0.04523332044482231,
-0.0035992765333503485,
-0.05767305940389633,
0.03847361356019974,
0.015955142676830292,
-0.008244008757174015,
0.06997096538543701,
0.0802873894572258,
0.013737032189965248,
0.09220107644796371,
0.04981226474046707,
-0.015710167586803436,
-0.08668167144060135,
0.04305179789662361,
-0.04142313078045845,
0.08559358865022659,
-0.018394343554973602,
0.02173982560634613,
0.14825116097927094,
-0.1563023328781128,
0.10532752424478531,
0.01255872379988432,
-0.05849682167172432,
-0.07511968165636063,
-0.1628255546092987,
-0.05605514720082283,
-0.034396279603242874,
-0.003990639466792345,
-0.12160954624414444,
-0.017403123900294304,
0.008744610473513603,
0.0014751318376511335,
-0.026940152049064636,
0.11859849095344543,
-0.11998369544744492,
-0.09722350537776947,
0.07981565594673157,
-0.027303770184516907,
0.02993534877896309,
0.01732541248202324,
0.025324057787656784,
-0.0014562247088178992,
0.08864305913448334,
0.06398062407970428,
0.049244459718465805,
0.04418964311480522,
0.041978318244218826,
-0.06445247679948807,
-0.06873660534620285,
-0.002837020205333829,
0.005850446876138449,
-0.03777437284588814,
0.08884010463953018,
0.040498096495866776,
-0.0801001563668251,
-0.012316573411226273,
0.20579755306243896,
-0.09015196561813354,
-0.1007457748055458,
-0.18477436900138855,
0.18076512217521667,
0.05964184179902077,
0.009040175005793571,
-0.025604834780097008,
-0.10092198103666306,
-0.007984532043337822,
0.10621821135282516,
0.1757422685623169,
-0.06916125118732452,
0.017752381041646004,
0.03545556962490082,
0.008323539048433304,
-0.0004191068874206394,
0.031859710812568665,
0.04846520721912384,
0.20157340168952942,
-0.03907165676355362,
0.136169895529747,
-0.021690858528017998,
-0.045081283897161484,
-0.06916430592536926,
0.10473884642124176,
0.028348714113235474,
0.022479813545942307,
-0.008160504512488842,
0.11610747128725052,
-0.050877608358860016,
-0.1097237691283226,
-0.030718006193637848,
-0.0704454779624939,
-0.13639359176158905,
-0.025003526359796524,
-0.00798013899475336,
0.03073771670460701,
0.0972609892487526,
0.016531620174646378,
-0.029131809249520302,
0.13332732021808624,
-0.017812605947256088,
-0.05530038848519325,
-0.033809203654527664,
0.02788819745182991,
0.0064564975909888744,
0.14975197613239288,
-0.008896544575691223,
-0.025256237015128136,
0.11883018165826797,
0.008381453342735767,
-0.06358613073825836,
0.09241867810487747,
0.025107556954026222,
-0.03565312922000885,
0.14495593309402466,
0.08405593782663345,
-0.0205059964209795,
0.09092970192432404,
0.08693736046552658,
-0.16513171792030334,
0.04965488240122795,
-0.01031582709401846,
-0.0017623441526666284,
-0.07271850109100342,
0.06891769170761108,
-0.08035943657159805,
0.13609665632247925,
0.16576217114925385,
-0.02640608511865139,
-0.0014378284104168415,
-0.00444048410281539,
0.0234639011323452,
0.024663876742124557,
0.05196418985724449,
-0.05576648563146591,
-0.10589791089296341,
0.010107944719493389,
-0.013536837883293629,
0.027340222150087357,
-0.27936798334121704,
-0.10866977274417877,
0.009388777427375317,
-0.033155135810375214,
-0.022900599986314774,
0.11079466342926025,
0.047117218375205994,
-0.0005057092639617622,
-0.04088238254189491,
-0.15323032438755035,
0.006646911613643169,
0.10368285328149796,
-0.12176967412233353,
-0.09749771654605865
] |
null | null |
sentence-transformers
|
# ddobokki/klue-roberta-small-nli-sts
This is a Korean Sentence Transformer model.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
You can use this model with the [sentence-transformers](https://www.SBERT.net) library:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["흐르는 강물을 거꾸로 거슬러 오르는", "세월이 가면 가슴이 터질 듯한"]
model = SentenceTransformer('ddobokki/klue-roberta-small-nli-sts')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
If you only use the transformers library:
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Mean Pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ["흐르는 강물을 거꾸로 거슬러 오르는", "세월이 가면 가슴이 터질 듯한"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('ddobokki/klue-roberta-small-nli-sts')
model = AutoModel.from_pretrained('ddobokki/klue-roberta-small-nli-sts')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
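Continuing from the snippet above (a minimal sketch; what counts as "similar enough" is application-dependent), the similarity between the two example sentences can be read off directly:
```python
import torch.nn.functional as F

# sentence_embeddings comes from the mean-pooling snippet above.
score = F.cosine_similarity(sentence_embeddings[0], sentence_embeddings[1], dim=0)
print(f"cosine similarity: {score.item():.4f}")
```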
## Performance
- Semantic Textual Similarity test set results <br>
| Model | Cosine Pearson | Cosine Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman | Dot Pearson | Dot Spearman |
|------------------------|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
| KoSRoBERTa<sup>small</sup> | 84.27 | 84.17 | 83.33 | 83.65 | 83.34 | 83.65 | 82.10 | 81.38 |
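A minimal sketch of how such STS figures can be reproduced with the library's evaluator; the sentence pairs and gold scores below are made-up placeholders, not the actual Korean STS test set:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer('ddobokki/klue-roberta-small-nli-sts')

# Placeholder pairs with gold similarity scores (typically normalized to [0, 1]).
sentences1 = ["강물이 흐른다", "시간이 지나간다", "아이가 공을 찬다"]
sentences2 = ["물이 흐르고 있다", "세월이 간다", "고양이가 잠을 잔다"]
gold_scores = [0.9, 0.8, 0.1]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, gold_scores)
print(evaluator(model))  # Spearman correlation of predicted vs. gold similarities
```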
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
{"tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers", "ko"], "pipeline_tag": "sentence-similarity"}
|
sentence-similarity
|
ddobokki/klue-roberta-small-nli-sts
|
[
"sentence-transformers",
"pytorch",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"ko",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us
|
ddobokki/klue-roberta-small-nli-sts
===================================
This is a Korean Sentence Transformer model.
Usage (Sentence-Transformers)
-----------------------------
You can use this model with the sentence-transformers library.
Then you can use the model like this:
Usage (HuggingFace Transformers)
--------------------------------
If you only use the transformers library:
Performance
-----------
* Semantic Textual Similarity test set results
Full Model Architecture
-----------------------
Citing & Authors
----------------
|
[] |
[
"TAGS\n#sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us \n"
] |
[
45
] |
[
"passage: TAGS\n#sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #ko #endpoints_compatible #region-us \n"
] |
[
0.035826098173856735,
-0.0763310045003891,
-0.009300023317337036,
-0.049764905124902725,
0.09856557846069336,
0.03974660485982895,
0.006291947793215513,
0.11501798778772354,
-0.019506683573126793,
-0.009314009919762611,
0.09453267604112625,
0.23079422116279602,
-0.00386620475910604,
-0.11088021099567413,
-0.05512509122490883,
-0.34508705139160156,
0.1072995513677597,
0.08020742982625961,
-0.022947225719690323,
0.11269968748092651,
0.09632512181997299,
-0.03590013459324837,
0.02490263059735298,
0.03143874555826187,
-0.07706277072429657,
0.055131763219833374,
0.03307224065065384,
-0.09207922965288162,
0.12485579401254654,
0.05635557323694229,
0.10075642913579941,
0.015994878485798836,
-0.1446937918663025,
-0.27148962020874023,
0.02497849613428116,
-0.03752823546528816,
-0.040199633687734604,
-0.02873001992702484,
-0.018198898062109947,
-0.17654113471508026,
0.06161047890782356,
-0.005569641478359699,
0.013306970708072186,
0.005235206801444292,
-0.16022223234176636,
-0.03212627023458481,
0.001924038864672184,
0.03130943328142166,
0.08905687928199768,
0.1079874113202095,
-0.052412282675504684,
0.06043456494808197,
-0.18315818905830383,
0.08157791197299957,
0.209028422832489,
-0.25453877449035645,
-0.029911484569311142,
0.06903225928544998,
0.13292784988880157,
-0.01472847443073988,
-0.04496757686138153,
0.05342356860637665,
-0.010002253577113152,
0.02039439044892788,
-0.10014612972736359,
-0.12205712497234344,
-0.06586583703756332,
0.10752958059310913,
-0.10047953575849533,
-0.04753667488694191,
0.138255313038826,
-0.021666156128048897,
0.06757792085409164,
-0.052111148834228516,
-0.04634564369916916,
-0.021003959700465202,
-0.06057129055261612,
0.0008544213487766683,
-0.04697098955512047,
0.08371403068304062,
-0.04415389522910118,
0.001563839497976005,
-0.13157778978347778,
0.02082492597401142,
-0.22679947316646576,
0.25013628602027893,
0.036954205483198166,
0.006240524351596832,
-0.17478784918785095,
0.02307712286710739,
-0.1337898224592209,
-0.0901821032166481,
0.017285218462347984,
-0.0796138122677803,
0.00008879738743416965,
0.02551971934735775,
-0.12234649807214737,
-0.08018194139003754,
0.11546904593706131,
0.10900704562664032,
0.030096514150500298,
-0.0015739690279588103,
-0.0032374339643865824,
0.09798823297023773,
0.03386284038424492,
0.18973349034786224,
-0.013848339207470417,
-0.036947160959243774,
-0.0779348686337471,
-0.09480272233486176,
-0.03858114406466484,
-0.013893692754209042,
-0.11708879470825195,
-0.030895322561264038,
0.043602053076028824,
0.08175648748874664,
0.0263347364962101,
0.09032531827688217,
-0.050202831625938416,
0.02076643332839012,
-0.062023937702178955,
-0.03586690127849579,
-0.043338142335414886,
-0.02466999553143978,
0.054043274372816086,
0.17807316780090332,
-0.08171842247247696,
0.04561876133084297,
-0.06792239844799042,
0.07907022535800934,
-0.06740489602088928,
0.02440127730369568,
0.015938930213451385,
-0.04384000599384308,
0.025402553379535675,
-0.11538029462099075,
0.07653304189443588,
-0.156798854470253,
-0.018007982522249222,
0.007419546600431204,
0.004590651486068964,
-0.028431527316570282,
-0.014240198768675327,
-0.07379105687141418,
-0.018357494845986366,
0.018931888043880463,
-0.04211274906992912,
-0.08537686616182327,
-0.05932413786649704,
0.10320453345775604,
-0.04636231064796448,
0.053707659244537354,
-0.02744685299694538,
0.06732109934091568,
-0.1882294863462448,
0.009989531710743904,
-0.07184132188558578,
0.014258054085075855,
-0.010241055861115456,
0.13514429330825806,
-0.03390982374548912,
-0.03391425684094429,
-0.11340866982936859,
0.07918377965688705,
-0.09878348559141159,
0.18672490119934082,
-0.06910879909992218,
-0.12568679451942444,
0.24025821685791016,
-0.04781851917505264,
-0.10714702308177948,
0.09969136118888855,
0.020433181896805763,
0.0367404967546463,
0.09476666897535324,
0.25212615728378296,
0.04887009784579277,
0.011517352424561977,
0.12405557930469513,
0.12336321175098419,
-0.08660099655389786,
0.09804510325193405,
0.04447297006845474,
-0.059555280953645706,
-0.0027533341199159622,
0.04240691289305687,
0.04362891986966133,
0.0917869284749031,
-0.04918111115694046,
-0.055462077260017395,
0.001150800264440477,
0.017009928822517395,
0.08995772153139114,
-0.002578966086730361,
0.03988305851817131,
-0.05661372095346451,
-0.025949375703930855,
0.005878726486116648,
0.03103838674724102,
-0.02603958174586296,
0.07233605533838272,
-0.04558243975043297,
0.09327703714370728,
0.0007714714156463742,
0.020513221621513367,
-0.20671144127845764,
-0.02400333806872368,
-0.03282039612531662,
0.11098892986774445,
-0.007217187434434891,
0.11613469570875168,
0.05699852481484413,
-0.11612249165773392,
-0.02138916403055191,
0.025060517713427544,
0.11599276214838028,
0.059278540313243866,
-0.01662055402994156,
-0.06524498760700226,
0.08439034223556519,
-0.03718758746981621,
0.002700856188312173,
0.0368320532143116,
-0.025864658877253532,
0.1147088035941124,
0.10009337216615677,
-0.019036101177334785,
0.0631963461637497,
-0.025543855503201485,
0.037618447095155716,
0.002303103916347027,
0.058451950550079346,
0.11279758810997009,
-0.029633989557623863,
-0.09886566549539566,
0.28252795338630676,
-0.13251109421253204,
0.046684566885232925,
0.16802188754081726,
-0.2601410746574402,
0.019392944872379303,
-0.10432589799165726,
-0.06020740047097206,
0.024807840585708618,
0.08118598163127899,
-0.0692838504910469,
0.18426857888698578,
0.008663886226713657,
0.1071438193321228,
-0.04414590448141098,
-0.03844170272350311,
-0.02320694923400879,
-0.0515589565038681,
-0.06232058256864548,
0.12167950719594955,
-0.024162957444787025,
-0.2171097993850708,
0.17674365639686584,
0.20531930029392242,
0.027733175083994865,
0.16055066883563995,
-0.057318802922964096,
-0.029075145721435547,
0.006448022555559874,
0.03165404498577118,
-0.05323658138513565,
-0.018255848437547684,
-0.24515579640865326,
-0.045474324375391006,
0.05998630449175835,
0.03824831172823906,
0.09208980947732925,
-0.11619512736797333,
-0.0792422667145729,
0.014377840794622898,
-0.0004458700423128903,
0.052984874695539474,
0.11668973416090012,
0.08735978603363037,
0.07443233579397202,
0.031137505546212196,
-0.06822729110717773,
0.06783405691385269,
0.021412577480077744,
-0.03398612141609192,
0.1854967176914215,
-0.14152520895004272,
-0.2735963761806488,
-0.09755025804042816,
-0.11266928911209106,
-0.012069757096469402,
-0.010007170028984547,
0.12075302749872208,
-0.09573090821504593,
0.03253461793065071,
0.04860873147845268,
0.13675174117088318,
-0.12715481221675873,
-0.018643543124198914,
-0.15106122195720673,
0.05431486666202545,
-0.1563814878463745,
-0.05967124179005623,
-0.0857357382774353,
-0.05568475276231766,
-0.031569793820381165,
0.07988058775663376,
-0.17433740198612213,
0.059362590312957764,
0.1573149859905243,
0.08700577169656754,
0.04443354904651642,
-0.06666392832994461,
0.15887831151485443,
-0.10574391484260559,
-0.04380780830979347,
0.1468825489282608,
-0.05037741735577583,
0.06601928174495697,
0.13770194351673126,
-0.004864836111664772,
-0.010243281722068787,
-0.013402067124843597,
0.024670543149113655,
-0.04386961832642555,
-0.14337365329265594,
-0.09967966377735138,
-0.11759687960147858,
0.09327370673418045,
-0.013266228139400482,
0.02484988607466221,
0.15929238498210907,
0.0452042855322361,
0.004753998480737209,
-0.11199060082435608,
0.04985866695642471,
0.059678204357624054,
0.2684195339679718,
-0.018610110506415367,
0.11376363039016724,
0.016792532056570053,
-0.17361140251159668,
0.005541704595088959,
0.02310524694621563,
0.11523634195327759,
0.10312438756227493,
0.05024039372801781,
0.09586051106452942,
0.05341926962137222,
0.09392783045768738,
0.02872527576982975,
-0.03603753447532654,
-0.04512489214539528,
-0.025501728057861328,
-0.054297562688589096,
0.00955099519342184,
0.07304613292217255,
0.1757330596446991,
-0.08882306516170502,
-0.059204623103141785,
-0.17559754848480225,
0.14410924911499023,
0.0906558632850647,
0.1122974157333374,
-0.06444232910871506,
0.022354228422045708,
0.07135145366191864,
-0.04533003270626068,
-0.03315745294094086,
0.13277696073055267,
-0.0849887952208519,
-0.1385928839445114,
0.026024609804153442,
-0.046400852501392365,
0.1054871454834938,
0.013575227931141853,
0.09049016237258911,
-0.10113349556922913,
-0.14769719541072845,
0.03883757069706917,
0.08746541291475296,
-0.2361702173948288,
0.2714803218841553,
0.006214501801878214,
-0.09270264208316803,
-0.07121293246746063,
-0.024276072159409523,
0.021874267607927322,
0.08878818899393082,
0.16931317746639252,
0.0038428236730396748,
-0.08960657566785812,
-0.09903179109096527,
0.0287073515355587,
0.013408525846898556,
0.1816948652267456,
-0.0984315574169159,
0.00403326191008091,
-0.0977044478058815,
-0.0031051235273480415,
-0.02739102579653263,
0.16851431131362915,
0.09561438113451004,
-0.15281285345554352,
0.0731407105922699,
-0.019008424133062363,
0.06679842621088028,
-0.00955794658511877,
-0.004057140555232763,
-0.05596170574426651,
0.15643325448036194,
-0.07529567927122116,
-0.04981328919529915,
-0.0729396864771843,
-0.08803956210613251,
0.11539391428232193,
-0.09433743357658386,
-0.004255697596818209,
-0.07712208479642868,
-0.009027487598359585,
-0.11656512320041656,
-0.15593846142292023,
0.09967903047800064,
-0.05066116899251938,
0.01815028116106987,
-0.012015572749078274,
0.23878900706768036,
-0.0688825249671936,
0.07712579518556595,
0.0388517901301384,
0.06337195634841919,
-0.09442950040102005,
-0.03690734878182411,
-0.05722300708293915,
-0.02914511226117611,
0.08702109009027481,
0.0502268522977829,
-0.0975748673081398,
0.09880286455154419,
-0.13329090178012848,
-0.022837378084659576,
0.28783923387527466,
0.15273676812648773,
-0.02622365765273571,
0.1128678098320961,
0.13966010510921478,
-0.03714420273900032,
-0.2748062014579773,
-0.1202283427119255,
-0.12751281261444092,
-0.021922677755355835,
-0.04394394904375076,
-0.03440001606941223,
0.0690276101231575,
0.025004928931593895,
0.005865192972123623,
-0.04691724479198456,
-0.3073001801967621,
-0.07409369945526123,
0.17853450775146484,
-0.03936358168721199,
0.40490472316741943,
-0.13970567286014557,
-0.09089714288711548,
0.0223428662866354,
-0.22645163536071777,
0.12501618266105652,
0.03126692399382591,
0.11743185669183731,
0.019937390461564064,
0.05486976355314255,
0.03501033037900925,
-0.02133573405444622,
0.13359206914901733,
0.02655123360455036,
-0.0004685211752075702,
-0.047305043786764145,
-0.2156296968460083,
0.009438563138246536,
0.010304948315024376,
-0.02281660959124565,
-0.032160643488168716,
0.017620515078306198,
-0.09927131980657578,
-0.021338671445846558,
-0.16395574808120728,
0.026709062978625298,
0.0053971922025084496,
-0.07593213021755219,
-0.07281232625246048,
0.039243970066308975,
0.043868113309144974,
-0.021489545702934265,
0.26509708166122437,
-0.13053427636623383,
0.1259988695383072,
-0.03651080280542374,
0.0323801226913929,
-0.09206841886043549,
-0.14618247747421265,
-0.07095886766910553,
-0.059728316962718964,
0.07537879049777985,
-0.026823634281754494,
0.04103582352399826,
0.11326402425765991,
-0.03231533616781235,
0.07790139317512512,
0.07936875522136688,
-0.005226818844676018,
-0.04588475450873375,
0.0962875708937645,
-0.16499215364456177,
-0.02122233808040619,
-0.10406796634197235,
-0.0956055223941803,
0.1194843053817749,
-0.04809056594967842,
0.1460336446762085,
0.0181707926094532,
-0.024520978331565857,
0.006579800974577665,
-0.011424008756875992,
-0.0855787917971611,
0.02155153639614582,
0.009214718826115131,
0.04575017839670181,
-0.08572769165039062,
0.024142464622855186,
-0.010752566158771515,
-0.2882552146911621,
-0.0012119569582864642,
0.08487354218959808,
-0.09432821720838547,
-0.08238435536623001,
-0.11623811721801758,
0.009114691987633705,
-0.13621823489665985,
-0.007169606629759073,
-0.06981559842824936,
-0.15810084342956543,
0.032186511904001236,
0.25723588466644287,
0.10264923423528671,
0.15256145596504211,
-0.031225627288222313,
-0.022965287789702415,
0.01400532852858305,
-0.003969286102801561,
0.009732961654663086,
-0.015897803008556366,
0.008848640136420727,
0.050898127257823944,
-0.03434831649065018,
0.16736817359924316,
-0.06263083964586258,
-0.08439118415117264,
-0.13822372257709503,
0.0514325350522995,
-0.06637340784072876,
-0.09899263083934784,
-0.15458792448043823,
-0.07970600575208664,
0.03042464330792427,
-0.042014770209789276,
-0.030424674972891808,
-0.0746428370475769,
-0.09173917770385742,
0.04816797003149986,
-0.05520368739962578,
0.044076986610889435,
-0.010080210864543915,
-0.032098665833473206,
0.1267867535352707,
-0.05472159385681152,
0.04102937504649162,
0.15049508213996887,
-0.10151080042123795,
0.12574830651283264,
-0.10812962055206299,
-0.12185471504926682,
0.07596602290868759,
-0.03500821813941002,
0.03986725956201553,
-0.032285176217556,
0.005888007581233978,
0.03130693361163139,
0.036606788635253906,
0.06676942110061646,
-0.07216028869152069,
-0.12945140898227692,
-0.052411068230867386,
0.02598368562757969,
-0.2084943950176239,
-0.05091913789510727,
-0.12577107548713684,
0.07888704538345337,
0.0048538194969296455,
0.11298421025276184,
-0.02949550934135914,
0.11720160394906998,
0.004125087056308985,
0.00959740299731493,
-0.025167981162667274,
-0.16988645493984222,
0.09961123019456863,
-0.11395120620727539,
-0.022480743005871773,
0.023336919024586678,
0.20824307203292847,
-0.027428947389125824,
0.030437014997005463,
0.053003109991550446,
0.0226878821849823,
0.11863302439451218,
0.0712931752204895,
0.19967462122440338,
0.15215077996253967,
-0.07836075127124786,
-0.1758120059967041,
0.0878298431634903,
-0.0011795790633186698,
0.009866749867796898,
0.05653669312596321,
0.148367241024971,
0.1462714970111847,
0.15792728960514069,
0.0628686472773552,
0.0980304405093193,
0.024509146809577942,
-0.18678414821624756,
0.006410109344869852,
0.04031673073768616,
0.010020877234637737,
0.14922550320625305,
0.24752895534038544,
-0.08377958834171295,
0.09419412165880203,
-0.02534640021622181,
-0.051900915801525116,
-0.11257666349411011,
-0.11616881936788559,
-0.06125380098819733,
-0.09032462537288666,
0.011865576729178429,
-0.09343273937702179,
0.008887620642781258,
0.1258699595928192,
0.04558541253209114,
-0.021322276443243027,
0.14456620812416077,
0.04178725928068161,
-0.14203613996505737,
0.12529513239860535,
-0.039081502705812454,
0.08649998903274536,
0.048796337097883224,
-0.024641459807753563,
-0.046834494918584824,
-0.09328693151473999,
-0.013906539417803288,
0.04187200963497162,
-0.028364187106490135,
-0.055918119847774506,
-0.1957029104232788,
-0.11932884156703949,
0.004260986112058163,
0.03116520494222641,
-0.03593433275818825,
0.06172795593738556,
0.031166214495897293,
0.010373755358159542,
0.05440879985690117,
0.22801576554775238,
-0.04756331071257591,
-0.1340302675962448,
-0.06636593490839005,
0.16073454916477203,
0.12266319990158081,
0.1327652633190155,
-0.0010584105039015412,
-0.008799531497061253,
-0.08095414936542511,
0.319225549697876,
0.19889336824417114,
-0.013133322820067406,
0.03421248123049736,
0.1108265146613121,
0.05914297327399254,
0.13311834633350372,
0.008360616862773895,
0.07300984114408493,
0.2768004834651947,
-0.08316659927368164,
-0.047121576964855194,
-0.0583137571811676,
-0.06709028780460358,
-0.07082080096006393,
0.10681749880313873,
0.1336706280708313,
-0.07220824062824249,
-0.039144158363342285,
0.14156483113765717,
-0.1652989536523819,
0.21567599475383759,
-0.02301160991191864,
-0.25033891201019287,
-0.07576094567775726,
-0.05249238386750221,
0.19141343235969543,
0.04008989781141281,
0.14679944515228271,
0.01283137034624815,
-0.13461461663246155,
-0.028546787798404694,
0.06252328306436539,
-0.18445345759391785,
-0.038019273430109024,
0.08774560689926147,
0.06339459866285324,
-0.07492269575595856,
-0.01844828762114048,
0.01497286930680275,
0.10729987919330597,
0.08568146079778671,
0.07845696806907654,
0.06838250905275345,
0.07716351747512817,
-0.07825427502393723,
-0.019880352541804314,
0.028853489086031914,
0.026694657281041145,
-0.10875584930181503,
0.10843207687139511,
-0.19294406473636627,
0.11703410744667053,
-0.020933430641889572,
-0.053019504994153976,
-0.005896410904824734,
0.05109706521034241,
-0.07744787633419037,
0.007053370587527752,
0.1161494180560112,
0.01564241759479046,
-0.023831509053707123,
-0.05842337757349014,
-0.04307195171713829,
0.03230154886841774,
-0.045999377965927124,
-0.1304822862148285,
-0.043500564992427826,
-0.09196572750806808,
0.09654270112514496,
-0.04536852240562439,
-0.15618515014648438,
-0.07053335011005402,
0.02287476323544979,
0.07099166512489319,
-0.0951571837067604,
0.102142833173275,
0.015573783777654171,
0.006057238206267357,
-0.0024191064294427633,
-0.2046748548746109,
0.10403042286634445,
0.06374262273311615,
-0.13987934589385986,
-0.09236166626214981
] |
null | null |
transformers
|
## EXAMPLE
```python
import requests
import torch
from PIL import Image
from transformers import (
    VisionEncoderDecoderModel,
    ViTFeatureExtractor,
    PreTrainedTokenizerFast,
)

# device setting
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# load feature extractor and tokenizer
encoder_model_name_or_path = "ddobokki/vision-encoder-decoder-vit-gpt2-coco-ko"
feature_extractor = ViTFeatureExtractor.from_pretrained(encoder_model_name_or_path)
tokenizer = PreTrainedTokenizerFast.from_pretrained(encoder_model_name_or_path)

# load model
model = VisionEncoderDecoderModel.from_pretrained(encoder_model_name_or_path)
model.to(device)

# inference on a sample COCO image
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
with Image.open(requests.get(url, stream=True).raw) as img:
    pixel_values = feature_extractor(images=img, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values.to(device), num_beams=5)
generated_text = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
print(generated_text)
# >>> ['고양이 두마리가 담요 위에 누워 있다.']  ("Two cats are lying on a blanket.")
```
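Here beam search (`num_beams=5`) is used to pick the caption; larger beam sizes typically improve caption fluency at the cost of slower generation, while `num_beams=1` falls back to greedy decoding.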
|
{}
| null |
ddobokki/vision-encoder-decoder-vit-gpt2-coco-ko
|
[
"transformers",
"pytorch",
"vision-encoder-decoder",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #vision-encoder-decoder #endpoints_compatible #region-us
|
## EXAMPLE
|
[
"## EXAMPLE"
] |
[
"TAGS\n#transformers #pytorch #vision-encoder-decoder #endpoints_compatible #region-us \n",
"## EXAMPLE"
] |
[
29,
4
] |
[
"passage: TAGS\n#transformers #pytorch #vision-encoder-decoder #endpoints_compatible #region-us \n## EXAMPLE"
] |
[
-0.09413618594408035,
0.026519285514950752,
-0.008303516544401646,
-0.005268195178359747,
0.16738729178905487,
0.00534309446811676,
0.0154751380905509,
0.07172167301177979,
0.05881743133068085,
-0.014179777354001999,
0.12222111970186234,
0.23014453053474426,
-0.013403967022895813,
0.05510817468166351,
-0.059005480259656906,
-0.3101367950439453,
0.07419715076684952,
0.13603955507278442,
-0.028277356177568436,
0.09313874691724777,
0.0454527772963047,
-0.10549679398536682,
0.08998413383960724,
-0.04419413208961487,
-0.14605949819087982,
0.017287375405430794,
-0.017024295404553413,
-0.08622749149799347,
0.10395906120538712,
0.0019203730626031756,
0.1416446715593338,
-0.032942235469818115,
-0.08250884711742401,
-0.16240669786930084,
0.005327850580215454,
0.043401993811130524,
-0.03983524814248085,
0.01201142743229866,
0.08863013982772827,
-0.03953951597213745,
0.012517774477601051,
0.0024631633423268795,
0.00349845876917243,
0.022363465279340744,
-0.12750622630119324,
-0.17390985786914825,
0.046547722071409225,
0.015596571378409863,
0.049441367387771606,
0.10910848528146744,
0.03059401363134384,
0.11728470027446747,
-0.11178348958492279,
0.14072172343730927,
0.1371140033006668,
-0.15028655529022217,
0.0020172102376818657,
0.09272397309541702,
0.03600826859474182,
0.02767828293144703,
-0.053780246526002884,
0.030456826090812683,
-0.009911962784826756,
0.057569585740566254,
-0.015040558762848377,
-0.06793107092380524,
-0.1214013621211052,
0.02934832125902176,
-0.09526584297418594,
-0.12128999829292297,
0.16985119879245758,
-0.03570733964443207,
0.0752764493227005,
-0.04596012085676193,
-0.15267950296401978,
-0.10112345218658447,
-0.04168800637125969,
0.055130306631326675,
-0.023033158853650093,
0.0675988644361496,
-0.024021916091442108,
0.01510451640933752,
-0.11086314171552658,
0.0484023354947567,
-0.17824041843414307,
0.17659643292427063,
0.025838090106844902,
0.10671615600585938,
-0.2058081179857254,
0.08725501596927643,
0.09177571535110474,
-0.12212584167718887,
0.0035786668304353952,
-0.11899501085281372,
0.04654369503259659,
0.0636989176273346,
-0.0581948384642601,
0.04827914386987686,
0.04994112253189087,
0.04694462567567825,
0.033625528216362,
0.05269210413098335,
0.0030207927338778973,
0.1301438957452774,
-0.03369741514325142,
0.149967223405838,
-0.007783256471157074,
0.05508864298462868,
0.0007919797790236771,
-0.1467132568359375,
0.013486473821103573,
-0.04390382766723633,
-0.10525929927825928,
-0.048727426677942276,
0.021804803982377052,
0.07858311384916306,
-0.019085031002759933,
0.05776052176952362,
-0.08193588256835938,
-0.026178712025284767,
0.004337201360613108,
-0.05239037051796913,
-0.011000615544617176,
-0.00565624563023448,
0.06100555509328842,
0.1352711021900177,
0.022412652149796486,
-0.04226488620042801,
-0.10366957634687424,
0.031180722638964653,
-0.048469603061676025,
0.039950598031282425,
-0.07637059688568115,
-0.013082553632557392,
0.031145034357905388,
-0.13375408947467804,
0.07033892720937729,
-0.1269386261701584,
-0.08747393637895584,
0.04757910966873169,
0.05371452495455742,
0.03685830906033516,
0.04372946172952652,
0.014019413851201534,
-0.08699338138103485,
-0.03112128935754299,
-0.06476198136806488,
-0.09759299457073212,
-0.04856320470571518,
0.11875586956739426,
-0.014829635620117188,
0.06241196021437645,
-0.14472918212413788,
0.05314861983060837,
-0.13045503199100494,
0.026241416111588478,
-0.1286039650440216,
0.09400679916143417,
-0.02330673299729824,
0.13768038153648376,
0.04094647243618965,
-0.09653858095407486,
-0.07483865320682526,
0.05153907090425491,
-0.04302647337317467,
0.13485832512378693,
-0.11146232485771179,
-0.07874365895986557,
0.1737508475780487,
-0.13348978757858276,
-0.16020426154136658,
0.0735856369137764,
0.01760454662144184,
-0.09119494259357452,
0.04794672876596451,
0.16453449428081512,
0.1097763180732727,
-0.090149886906147,
0.05981456860899925,
0.15521302819252014,
-0.1648448258638382,
-0.130606546998024,
0.008843455463647842,
0.029058288782835007,
0.015786482021212578,
0.06516233086585999,
-0.05193373188376427,
0.11008961498737335,
-0.08619744330644608,
-0.0483279787003994,
-0.03441041707992554,
0.026748593896627426,
0.006832875311374664,
0.11560342460870743,
0.03704036399722099,
-0.05007974058389664,
0.004434380680322647,
-0.0029911885503679514,
0.008790280669927597,
-0.06497781723737717,
0.054498981684446335,
-0.07356883585453033,
0.056010935455560684,
-0.07240498811006546,
-0.0036551300436258316,
-0.21447820961475372,
-0.06048200652003288,
-0.008152604103088379,
0.035256627947092056,
-0.10953878611326218,
0.13964886963367462,
0.06137065589427948,
-0.016668999567627907,
0.03328012675046921,
-0.07066573202610016,
0.08431194722652435,
0.002742751268669963,
-0.008598021231591702,
-0.05295316129922867,
-0.011446588672697544,
-0.04907041788101196,
-0.09162018448114395,
-0.0187559612095356,
-0.006304329261183739,
0.17702309787273407,
0.15403297543525696,
0.016527770087122917,
0.008569682948291302,
-0.043884266167879105,
0.060430508106946945,
-0.03219548985362053,
-0.035247884690761566,
0.10368519276380539,
-0.0270182266831398,
-0.021029852330684662,
0.17360900342464447,
-0.16254885494709015,
0.32293376326560974,
0.1948169767856598,
-0.3973396122455597,
0.06389317661523819,
0.03317420929670334,
-0.029432468116283417,
0.04388587549328804,
0.04565855860710144,
-0.00676384475082159,
0.1498715579509735,
0.016112785786390305,
0.14800569415092468,
-0.01498433668166399,
-0.007621497847139835,
0.03153850510716438,
-0.02010682411491871,
-0.042115718126297,
0.060217320919036865,
0.03491370379924774,
-0.13966281712055206,
0.19266630709171295,
0.1776408553123474,
0.06688503921031952,
0.1356295347213745,
-0.07188431173563004,
-0.017061779275536537,
0.07845427840948105,
0.019599908962845802,
-0.06408145278692245,
0.08631236851215363,
-0.3038991093635559,
-0.05071357265114784,
0.05649161338806152,
-0.008710673078894615,
0.08576574921607971,
-0.15839844942092896,
-0.01964537613093853,
0.0080827372148633,
-0.001629314967431128,
0.018558334559202194,
0.05871933326125145,
0.06355629861354828,
0.04233323037624359,
-0.036514151841402054,
-0.09041879326105118,
0.13476261496543884,
-0.021294159814715385,
-0.046080268919467926,
0.2014567106962204,
-0.10207188874483109,
-0.41787415742874146,
-0.1507294625043869,
-0.08991549909114838,
0.010819164104759693,
0.0323825404047966,
0.07614690065383911,
-0.05562322959303856,
-0.05709199607372284,
0.03789650648832321,
0.008358404040336609,
-0.07142031192779541,
0.017643073573708534,
-0.029105763882398605,
0.04273195192217827,
-0.045829594135284424,
-0.0909031555056572,
-0.05042475461959839,
-0.05268629267811775,
0.008252776227891445,
0.09755069017410278,
-0.10627271980047226,
0.09887132793664932,
0.14121170341968536,
0.02291758731007576,
0.09947393834590912,
0.002033788478001952,
0.18379762768745422,
-0.11821621656417847,
-0.08075551688671112,
0.18122220039367676,
-0.06566175073385239,
0.07559121400117874,
0.14943936467170715,
0.026190076023340225,
-0.08374059945344925,
-0.04265294224023819,
-0.06820579618215561,
-0.12441045045852661,
-0.06011893227696419,
-0.1356322020292282,
-0.17340366542339325,
0.01730257458984852,
0.019118625670671463,
0.06724271178245544,
0.03622835874557495,
0.08419351279735565,
0.008696776814758778,
-0.07472915947437286,
-0.020539183169603348,
0.0903083011507988,
0.25767090916633606,
-0.06935020536184311,
0.07676003128290176,
-0.0874757394194603,
-0.13214938342571259,
0.04022934287786484,
0.06061318889260292,
0.200735405087471,
-0.0008224869379773736,
-0.013510948978364468,
0.06193852424621582,
0.09820674359798431,
0.14233914017677307,
0.09150900691747665,
0.03365887328982353,
-0.013394988141953945,
-0.002101785968989134,
-0.024155324324965477,
-0.1047622412443161,
0.0008747559622861445,
0.04648710414767265,
-0.11678609251976013,
-0.03675142303109169,
-0.13668274879455566,
0.08412923663854599,
0.1167697012424469,
0.1024102121591568,
-0.2869931161403656,
0.01880384050309658,
0.05035426467657089,
0.014906602911651134,
-0.1008307933807373,
0.10176200419664383,
0.08467665314674377,
-0.10902569442987442,
0.06257995218038559,
-0.08404084295034409,
0.09657175838947296,
-0.10682173073291779,
0.08308564871549606,
-0.015601612627506256,
-0.12899038195610046,
0.09014731645584106,
0.06617208570241928,
-0.20168383419513702,
0.2251412719488144,
-0.035503894090652466,
-0.034255750477313995,
-0.07860638946294785,
0.007151803467422724,
0.0062406957149505615,
0.2007538378238678,
0.12123194336891174,
0.030624322593212128,
0.06807248294353485,
-0.1876341700553894,
0.0910872295498848,
0.034319888800382614,
0.15382400155067444,
0.026700258255004883,
-0.06190522760152817,
-0.011677471920847893,
-0.026614055037498474,
-0.03698597848415375,
-0.06281723827123642,
0.10283210128545761,
-0.15007701516151428,
0.04294919595122337,
0.020999498665332794,
0.10873857885599136,
-0.01716873049736023,
0.011159161105751991,
-0.028195664286613464,
0.09618103504180908,
-0.11651340872049332,
-0.0751831904053688,
-0.08032311499118805,
-0.08477379381656647,
0.10704927146434784,
-0.07446642965078354,
0.1304703950881958,
-0.07224751263856888,
0.017612963914871216,
-0.03877276927232742,
-0.2066565304994583,
0.09255778044462204,
-0.11641256511211395,
0.027626894414424896,
-0.020221108570694923,
0.14266496896743774,
-0.09719564765691757,
-0.07235727459192276,
0.021539295092225075,
0.025664910674095154,
-0.11510846763849258,
-0.09202058613300323,
-0.016194932162761688,
0.027803698554635048,
0.06947552412748337,
0.12085215002298355,
0.030709434300661087,
-0.015756016597151756,
0.033169861882925034,
0.0636456310749054,
0.17844852805137634,
0.044971711933612823,
-0.04078591987490654,
0.07417961955070496,
0.141060933470726,
-0.042606815695762634,
-0.31478163599967957,
-0.0738372951745987,
-0.09684175252914429,
-0.04779732599854469,
-0.03763184696435928,
-0.08281289041042328,
0.15712463855743408,
-0.007835284806787968,
-0.015183700248599052,
0.1391306072473526,
-0.2820338308811188,
-0.019666224718093872,
0.15658850967884064,
0.08560565859079361,
0.297528475522995,
-0.12047850340604782,
-0.034271400421857834,
0.018953556194901466,
-0.27001696825027466,
0.08706775307655334,
0.11631111055612564,
0.09059751033782959,
-0.01453329250216484,
0.02069365605711937,
0.052758850157260895,
-0.0961531326174736,
0.09885966777801514,
-0.010722914710640907,
0.05436912924051285,
-0.09816402941942215,
-0.08693277835845947,
0.1451476514339447,
-0.00870745163410902,
-0.011355125345289707,
0.08480870723724365,
0.07855421304702759,
-0.130363330245018,
-0.02194400317966938,
-0.11028876155614853,
0.015409613028168678,
0.0559915266931057,
-0.04018453136086464,
-0.02946838177740574,
-0.0271286778151989,
0.024780696257948875,
0.03775235265493393,
0.2760048508644104,
-0.03621315583586693,
0.020313294604420662,
0.04906928166747093,
0.07542386651039124,
-0.1948898583650589,
-0.21402788162231445,
-0.09701727330684662,
-0.023790933191776276,
0.112234927713871,
-0.04664251580834389,
0.07405547797679901,
0.13765232264995575,
-0.010748549364507198,
0.014271135441958904,
0.1458597183227539,
0.04828324913978577,
0.014563686214387417,
0.12392890453338623,
-0.15575848519802094,
-0.045169487595558167,
-0.05892184004187584,
-0.019651511684060097,
0.14454525709152222,
0.1483093798160553,
0.10029084980487823,
0.038131359964609146,
0.015936769545078278,
-0.0209367498755455,
-0.00038364529609680176,
-0.036090876907110214,
-0.0033256374299526215,
0.00774105079472065,
0.05786208063364029,
-0.15883252024650574,
0.04820490628480911,
-0.06652653962373734,
-0.289985328912735,
-0.04309576749801636,
0.11426462978124619,
-0.15263895690441132,
-0.09743273258209229,
-0.03742888569831848,
0.09720304608345032,
-0.15295915305614471,
-0.04762931540608406,
-0.043487489223480225,
-0.1437879055738449,
0.08077866584062576,
0.22419877350330353,
0.10820925235748291,
0.11123627424240112,
-0.05161958932876587,
-0.0015178794274106622,
-0.01657899096608162,
-0.031072204932570457,
-0.04876449704170227,
0.024929480627179146,
-0.16214637458324432,
-0.02034073695540428,
0.008819544687867165,
0.1550699919462204,
-0.08844079822301865,
-0.09201399981975555,
-0.11644361168146133,
0.10528811812400818,
-0.10236329585313797,
-0.022461093962192535,
-0.10213981568813324,
-0.022341445088386536,
-0.010672993026673794,
-0.05850397050380707,
-0.029452107846736908,
0.01667960360646248,
-0.11354338377714157,
0.01872231811285019,
0.019005926325917244,
-0.016298651695251465,
-0.0412013903260231,
-0.03090491145849228,
0.06306187808513641,
-0.0757753849029541,
0.08536041527986526,
0.10830817371606827,
-0.12692546844482422,
0.044082533568143845,
-0.1240963265299797,
-0.18385839462280273,
0.18776963651180267,
0.0171456728130579,
0.02458261325955391,
0.14861555397510529,
0.04154522716999054,
0.09321815520524979,
0.017033034935593605,
0.018227018415927887,
0.0841323509812355,
-0.1121973842382431,
0.015575550496578217,
-0.015309839509427547,
-0.11243773996829987,
-0.05783329904079437,
-0.0038280447479337454,
0.09952297806739807,
0.07595262676477432,
0.0862204059958458,
-0.010099882259964943,
0.11432244628667831,
-0.035203609615564346,
-0.026758024469017982,
0.006221088580787182,
-0.201973095536232,
0.05915556848049164,
-0.050789788365364075,
0.02170655131340027,
-0.036302197724580765,
0.25308236479759216,
0.012264806777238846,
0.03872215002775192,
0.01612835004925728,
0.09442714601755142,
0.017974335700273514,
0.050498105585575104,
0.29986006021499634,
0.051828037947416306,
-0.023367691785097122,
-0.029642710462212563,
0.09504140913486481,
0.03837135061621666,
0.07543783634901047,
0.018140606582164764,
0.11337340623140335,
-0.029866095632314682,
0.10711567103862762,
0.023366373032331467,
0.04844539240002632,
-0.1491352617740631,
-0.21765638887882233,
0.0006506781210191548,
0.10723524540662766,
-0.012309471145272255,
-0.004179036244750023,
0.15696245431900024,
0.0008912108605727553,
0.0871649757027626,
0.038049519062042236,
-0.041055381298065186,
-0.07718394696712494,
-0.10546944290399551,
-0.04107240214943886,
-0.11372864991426468,
0.04349144548177719,
-0.07205770909786224,
-0.0016829150263220072,
0.25593042373657227,
0.040387656539678574,
-0.05382285639643669,
0.178946852684021,
0.09227501600980759,
-0.06916692852973938,
0.06963198632001877,
0.017758401110768318,
0.03474884480237961,
0.042118728160858154,
-0.012340308167040348,
-0.074363112449646,
-0.03678547218441963,
-0.042999256402254105,
0.0015677858609706163,
-0.03150946646928787,
0.0005263706552796066,
-0.1299961507320404,
-0.07861986011266708,
-0.033989351242780685,
0.05382619425654411,
-0.08251725882291794,
0.11144647002220154,
-0.021420814096927643,
-0.019997596740722656,
0.00987011194229126,
0.17316333949565887,
-0.10812146961688995,
-0.11743326485157013,
-0.013073031790554523,
0.19119247794151306,
0.04098738357424736,
0.09761364758014679,
-0.007980543188750744,
-0.014473475515842438,
-0.08198695629835129,
0.2919662296772003,
0.20644192397594452,
-0.03182894363999367,
0.019053751602768898,
0.06071029603481293,
0.03180522099137306,
0.0733226090669632,
0.1521148979663849,
0.09864680469036102,
0.39288854598999023,
-0.08220025151968002,
-0.03968771547079086,
-0.013070929795503616,
-0.023958327248692513,
-0.14647527039051056,
-0.022119933739304543,
0.04553713649511337,
-0.08901973813772202,
-0.08734887838363647,
0.09889431297779083,
-0.1936352401971817,
0.19920799136161804,
0.15839482843875885,
-0.20614592730998993,
-0.013298287987709045,
-0.06092449277639389,
0.19345629215240479,
-0.02326984517276287,
0.04877806082367897,
-0.0435260646045208,
-0.07534291595220566,
0.05829824134707451,
0.04444972425699234,
-0.261331170797348,
-0.03645925968885422,
0.005285841412842274,
-0.15344640612602234,
0.036099616438150406,
-0.03966689854860306,
0.044705938547849655,
0.05248996987938881,
0.07634158432483673,
-0.021020200103521347,
0.03976055607199669,
-0.02111056260764599,
-0.09309851378202438,
-0.05646933615207672,
0.03763481229543686,
-0.00733903469517827,
-0.08525633066892624,
0.014791121706366539,
-0.18063314259052277,
0.0294183362275362,
-0.010130182839930058,
0.006736579816788435,
0.00733579508960247,
-0.07414209097623825,
-0.040450211614370346,
0.021723324432969093,
0.005856798961758614,
0.03395590931177139,
-0.03335118293762207,
-0.06719224154949188,
-0.023755494505167007,
0.03475351631641388,
-0.1476643979549408,
-0.08079054206609726,
-0.031999439001083374,
-0.09226354211568832,
0.06530242413282394,
-0.04522259905934334,
-0.005373047664761543,
-0.04804299399256706,
-0.0030416250228881836,
0.017384111881256104,
-0.09760105609893799,
0.020340532064437866,
0.03365173190832138,
0.015213373117148876,
0.003085461910814047,
-0.012425921857357025,
0.06193091347813606,
0.07167977094650269,
-0.0742446631193161,
-0.09509546309709549
] |
null | null |
speechbrain
|
# Conformer for KsponSpeech (with Transformer LM)
This repository provides all the necessary tools to perform automatic speech
recognition with an end-to-end system pretrained on KsponSpeech (Korean) within
SpeechBrain. For a better experience, we encourage you to learn more about
[SpeechBrain](https://speechbrain.github.io).
The performance of the model is the following:
| Release | eval clean CER | eval other CER | GPUs |
| :------: | :------------: | :------------: | :---------: |
| 01-23-23 | 7.33% | 7.99% | 6xA100 80GB |
## Pipeline description
This ASR system is composed of 3 different but linked blocks:
- Tokenizer (unigram) that transforms words into subword units; it is trained on
the training transcriptions of KsponSpeech.
- Neural language model (Transformer LM) trained on the training transcriptions of KsponSpeech.
- Acoustic model made of a Conformer encoder and a joint decoder with CTC +
transformer. Hence, the decoding also incorporates the CTC probabilities (see the sketch below).
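As a rough illustration only (the real decoding weights live in the recipe's `hparams` file, and `ctc_weight` below is an assumed value, not the one used for this model), the joint decoder scores each beam-search hypothesis by blending the attention and CTC log-probabilities:

```python
# Illustrative sketch of joint CTC/attention scoring, not the actual SpeechBrain code.
ctc_weight = 0.3  # assumed value; the real weight is set in the recipe's hparams file

def joint_score(attn_logprob: float, ctc_logprob: float) -> float:
    """Blend the two decoder scores for one hypothesis (higher is better)."""
    return (1.0 - ctc_weight) * attn_logprob + ctc_weight * ctc_logprob
```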
## Install SpeechBrain
First of all, please install SpeechBrain with the following command:
```bash
pip install git+https://github.com/speechbrain/speechbrain.git
```
Please note that we encourage you to read our tutorials and learn more about
[SpeechBrain](https://speechbrain.github.io).
### Transcribing your own audio files (in Korean)
```python
from speechbrain.pretrained import EncoderDecoderASR

asr_model = EncoderDecoderASR.from_hparams(
    source="ddwkim/asr-conformer-transformerlm-ksponspeech",
    savedir="pretrained_models/asr-conformer-transformerlm-ksponspeech",
    run_opts={"device": "cuda"},
)

# transcribe the sample recording shipped with the model repository
print(asr_model.transcribe_file("ddwkim/asr-conformer-transformerlm-ksponspeech/record_0_16k.wav"))
```
### Inference on GPU
To perform inference on the GPU, add `run_opts={"device":"cuda"}` when calling the `from_hparams` method.
## Parallel Inference on a Batch
Please [see this Colab notebook](https://colab.research.google.com/drive/1finp9pfmGRzWHCAPNkqAH2yGH6k_BbPA?usp=sharing) for an example of running the pretrained model on a batch; a minimal sketch follows below.
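If the notebook is unavailable, batched transcription looks roughly like this sketch (the file paths are hypothetical; `transcribe_batch` expects zero-padded waveforms plus lengths relative to the longest item in the batch):

```python
import torch
from speechbrain.pretrained import EncoderDecoderASR

asr_model = EncoderDecoderASR.from_hparams(
    source="ddwkim/asr-conformer-transformerlm-ksponspeech",
    savedir="pretrained_models/asr-conformer-transformerlm-ksponspeech",
    run_opts={"device": "cuda"},
)

# Load each recording (hypothetical paths) and zero-pad to a common length.
wavs = [asr_model.load_audio(p) for p in ["utt1.wav", "utt2.wav"]]
batch = torch.nn.utils.rnn.pad_sequence(wavs, batch_first=True)

# Relative lengths in [0, 1], as expected by SpeechBrain.
wav_lens = torch.tensor([w.shape[0] for w in wavs], dtype=torch.float)
wav_lens = wav_lens / wav_lens.max()

predicted_words, predicted_tokens = asr_model.transcribe_batch(batch, wav_lens)
print(predicted_words)
```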
### Training
The model was trained with SpeechBrain (Commit hash: '4b3bf60').
To train it from scratch, follow these steps:
1. Clone SpeechBrain:
```bash
git clone https://github.com/speechbrain/speechbrain/
```
2. Install it:
```bash
cd speechbrain
pip install -r requirements.txt
pip install .
```
3. Run Training:
```bash
cd recipes/KsponSpeech/ASR/transformer
python train.py hparams/conformer_medium.yaml --data_folder=your_data_folder
```
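Here `--data_folder` should point to the directory containing your prepared KsponSpeech data; the rest of the training configuration is defined in `hparams/conformer_medium.yaml`.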
You can find our training results (models, logs, etc.) in the subdirectories.
### Limitations
The SpeechBrain team does not provide any warranty on the performance achieved by this model when used on other datasets.
# **About SpeechBrain**
- Website: https://speechbrain.github.io/
- Code: https://github.com/speechbrain/speechbrain/
- HuggingFace: https://huggingface.co/speechbrain/
# **Citing SpeechBrain**
Please cite SpeechBrain if you use it for your research or business.
```bibtex
@misc{speechbrain,
title={{SpeechBrain}: A General-Purpose Speech Toolkit},
author={Mirco Ravanelli and Titouan Parcollet and Peter Plantinga and Aku Rouhe and Samuele Cornell and Loren Lugosch and Cem Subakan and Nauman Dawalatabad and Abdelwahab Heba and Jianyuan Zhong and Ju-Chieh Chou and Sung-Lin Yeh and Szu-Wei Fu and Chien-Feng Liao and Elena Rastorgueva and François Grondin and William Aris and Hwidong Na and Yan Gao and Renato De Mori and Yoshua Bengio},
year={2021},
eprint={2106.04624},
archivePrefix={arXiv},
primaryClass={eess.AS},
note={arXiv:2106.04624}
}
```
# Citing the model
```bibtex
@misc{returnzero,
title = {ReturnZero Conformer Korean ASR model},
author = {Dongwon Kim and Dongwoo Kim and Jeongkyu Roh},
year = {2021},
howpublished = {\url{https://huggingface.co/ddwkim/asr-conformer-transformerlm-ksponspeech}},
}
```
# Citing KsponSpeech dataset
```bibtex
@Article{app10196936,
AUTHOR = {Bang, Jeong-Uk and Yun, Seung and Kim, Seung-Hi and Choi, Mu-Yeol and Lee, Min-Kyu and Kim, Yeo-Jeong and Kim, Dong-Hyun and Park, Jun and Lee, Young-Jik and Kim, Sang-Hun},
TITLE = {KsponSpeech: Korean Spontaneous Speech Corpus for Automatic Speech Recognition},
JOURNAL = {Applied Sciences},
VOLUME = {10},
YEAR = {2020},
NUMBER = {19},
ARTICLE-NUMBER = {6936},
URL = {https://www.mdpi.com/2076-3417/10/19/6936},
ISSN = {2076-3417},
DOI = {10.3390/app10196936}
}
```
|
{"language": "kr", "license": "apache-2.0", "tags": ["ASR", "CTC", "Attention", "Conformer", "pytorch", "speechbrain"], "datasets": ["ksponspeech"], "metrics": ["wer", "cer"]}
| null |
ddwkim/asr-conformer-transformerlm-ksponspeech
|
[
"speechbrain",
"ASR",
"CTC",
"Attention",
"Conformer",
"pytorch",
"kr",
"dataset:ksponspeech",
"arxiv:2106.04624",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.04624"
] |
[
"kr"
] |
TAGS
#speechbrain #ASR #CTC #Attention #Conformer #pytorch #kr #dataset-ksponspeech #arxiv-2106.04624 #license-apache-2.0 #region-us
|
Conformer for KsponSpeech (with Transformer LM)
===============================================
This repository provides all the necessary tools to perform automatic speech
recognition from an end-to-end system pretrained on KsponSpeech (Kr) within
SpeechBrain. For a better experience, we encourage you to learn more about
SpeechBrain.
The performance of the model is the following:
Pipeline description
--------------------
This ASR system is composed of 3 different but linked blocks:
* Tokenizer (unigram) that transforms words into subword units; it is trained on
the training transcriptions of KsponSpeech.
* Neural language model (Transformer LM) trained on the training transcriptions of KsponSpeech.
* Acoustic model made of a Conformer encoder and a joint decoder with CTC +
transformer. Hence, the decoding also incorporates the CTC probabilities.
Install SpeechBrain
-------------------
First of all, please install SpeechBrain with the following command:
Please notice that we encourage you to read our tutorials and learn more about
SpeechBrain.
### Transcribing your own audio files (in Korean)
### Inference on GPU
To perform inference on the GPU, add 'run\_opts={"device":"cuda"}' when calling the 'from\_hparams' method.
Parallel Inference on a Batch
-----------------------------
Please, see this Colab notebook on using the pretrained model
### Training
The model was trained with SpeechBrain (Commit hash: '4b3bf60').
To train it from scratch follow these steps:
1. Clone SpeechBrain:
2. Install it:
3. Run Training:
You can find our training results (models, logs, etc) at the subdirectories.
### Limitations
The SpeechBrain team does not provide any warranty on the performance achieved by this model when used on other datasets.
About SpeechBrain
=================
* Website: URL
* Code: URL
* HuggingFace: URL
Citing SpeechBrain
==================
Please, cite SpeechBrain if you use it for your research or business.
Citing the model
================
Citing KsponSpeech dataset
==========================
|
[
"### Transcribing your own audio files (in Korean)",
"### Inference on GPU\n\n\nTo perform inference on the GPU, add 'run\\_opts={\"device\":\"cuda\"}' when calling the 'from\\_hparams' method.\n\n\nParallel Inference on a Batch\n-----------------------------\n\n\nPlease, see this Colab notebook on using the pretrained model",
"### Training\n\n\nThe model was trained with SpeechBrain (Commit hash: '4b3bf60').\nTo train it from scratch follow these steps:\n\n\n1. Clone SpeechBrain:\n2. Install it:\n3. Run Training:\n\n\nYou can find our training results (models, logs, etc) at the subdirectories.",
"### Limitations\n\n\nThe SpeechBrain team does not provide any warranty on the performance achieved by this model when used on other datasets.\n\n\nAbout SpeechBrain\n=================\n\n\n* Website: URL\n* Code: URL\n* HuggingFace: URL\n\n\nCiting SpeechBrain\n==================\n\n\nPlease, cite SpeechBrain if you use it for your research or business.\n\n\nCiting the model\n================\n\n\nCiting KsponSpeech dataset\n=========================="
] |
[
"TAGS\n#speechbrain #ASR #CTC #Attention #Conformer #pytorch #kr #dataset-ksponspeech #arxiv-2106.04624 #license-apache-2.0 #region-us \n",
"### Transcribing your own audio files (in Korean)",
"### Inference on GPU\n\n\nTo perform inference on the GPU, add 'run\\_opts={\"device\":\"cuda\"}' when calling the 'from\\_hparams' method.\n\n\nParallel Inference on a Batch\n-----------------------------\n\n\nPlease, see this Colab notebook on using the pretrained model",
"### Training\n\n\nThe model was trained with SpeechBrain (Commit hash: '4b3bf60').\nTo train it from scratch follow these steps:\n\n\n1. Clone SpeechBrain:\n2. Install it:\n3. Run Training:\n\n\nYou can find our training results (models, logs, etc) at the subdirectories.",
"### Limitations\n\n\nThe SpeechBrain team does not provide any warranty on the performance achieved by this model when used on other datasets.\n\n\nAbout SpeechBrain\n=================\n\n\n* Website: URL\n* Code: URL\n* HuggingFace: URL\n\n\nCiting SpeechBrain\n==================\n\n\nPlease, cite SpeechBrain if you use it for your research or business.\n\n\nCiting the model\n================\n\n\nCiting KsponSpeech dataset\n=========================="
] |
[
54,
13,
73,
72,
97
] |
[
"passage: TAGS\n#speechbrain #ASR #CTC #Attention #Conformer #pytorch #kr #dataset-ksponspeech #arxiv-2106.04624 #license-apache-2.0 #region-us \n### Transcribing your own audio files (in Korean)### Inference on GPU\n\n\nTo perform inference on the GPU, add 'run\\_opts={\"device\":\"cuda\"}' when calling the 'from\\_hparams' method.\n\n\nParallel Inference on a Batch\n-----------------------------\n\n\nPlease, see this Colab notebook on using the pretrained model### Training\n\n\nThe model was trained with SpeechBrain (Commit hash: '4b3bf60').\nTo train it from scratch follow these steps:\n\n\n1. Clone SpeechBrain:\n2. Install it:\n3. Run Training:\n\n\nYou can find our training results (models, logs, etc) at the subdirectories.### Limitations\n\n\nThe SpeechBrain team does not provide any warranty on the performance achieved by this model when used on other datasets.\n\n\nAbout SpeechBrain\n=================\n\n\n* Website: URL\n* Code: URL\n* HuggingFace: URL\n\n\nCiting SpeechBrain\n==================\n\n\nPlease, cite SpeechBrain if you use it for your research or business.\n\n\nCiting the model\n================\n\n\nCiting KsponSpeech dataset\n=========================="
] |
[
-0.07414738088846207,
0.15280790627002716,
-0.0017196539556607604,
0.059015631675720215,
0.08847378939390182,
-0.007998161017894745,
0.15108945965766907,
0.08710727095603943,
0.05269164964556694,
0.08727515488862991,
0.03711847588419914,
0.03803999349474907,
0.09550152719020844,
0.09800262749195099,
0.055450569838285446,
-0.21279612183570862,
0.01448032446205616,
-0.061626631766557693,
0.14551256597042084,
0.054317522794008255,
0.0652003362774849,
-0.04936686530709267,
0.07387744635343552,
0.005662593990564346,
-0.07738921046257019,
-0.012203168123960495,
0.05635754391551018,
-0.062286652624607086,
0.07731273770332336,
0.03976908698678017,
0.014227799139916897,
-0.008463009260594845,
0.054499126970767975,
-0.23728522658348083,
0.022224728018045425,
0.013058085925877094,
0.021046459674835205,
0.057872869074344635,
0.04423755407333374,
-0.019529292359948158,
0.13720634579658508,
0.05316390469670296,
0.025326121598482132,
0.10715703666210175,
-0.11816813796758652,
-0.022221747785806656,
-0.12305165827274323,
0.16106736660003662,
0.07763314992189407,
0.07738987356424332,
-0.04983139783143997,
0.12276279181241989,
-0.04219624400138855,
0.08604840934276581,
0.11520892381668091,
-0.2418479174375534,
0.005569968838244677,
-0.0034015243873000145,
0.052163220942020416,
0.0012558922171592712,
-0.07845357805490494,
-0.016567757353186607,
0.017075147479772568,
-0.021804073825478554,
-0.0020111906342208385,
-0.06547082215547562,
-0.07328849285840988,
-0.06315351277589798,
-0.10394027829170227,
-0.08064928650856018,
0.17567680776119232,
0.03440798446536064,
-0.08874648809432983,
-0.08636355400085449,
0.018333865329623222,
-0.02338976040482521,
0.016247287392616272,
0.0577632300555706,
0.015620794147253036,
-0.03831126168370247,
0.028996353968977928,
-0.0888281762599945,
-0.062313344329595566,
-0.10424140095710754,
0.026031438261270523,
0.0669802725315094,
0.046322379261255264,
0.01437605544924736,
-0.0006387810572050512,
0.091893769800663,
-0.1610112190246582,
-0.07825876772403717,
-0.032449182122945786,
-0.02928200364112854,
-0.07894697040319443,
-0.02806304581463337,
0.052654266357421875,
-0.013781285844743252,
0.0353388637304306,
0.17495214939117432,
-0.009604818187654018,
0.004736708477139473,
-0.0030542751774191856,
0.008122861385345459,
0.025069933384656906,
0.08684690296649933,
-0.09936165809631348,
0.012341594323515892,
0.04707583039999008,
0.06688257306814194,
0.012640004977583885,
-0.00031060268520377576,
-0.030377935618162155,
0.05600086972117424,
-0.028999844565987587,
0.07420188188552856,
0.08039895445108414,
0.03168310225009918,
-0.06727943569421768,
0.010580004192888737,
0.21663740277290344,
-0.1322634518146515,
-0.01221984252333641,
0.04954028129577637,
-0.04563398286700249,
-0.09450214356184006,
0.08956911414861679,
0.04939495027065277,
-0.0903247594833374,
-0.01644020900130272,
-0.0684170350432396,
0.009385185316205025,
-0.05233640968799591,
-0.04731219634413719,
0.03126639500260353,
-0.0076314909383654594,
-0.02688758075237274,
-0.09445536136627197,
-0.09763125330209732,
-0.052893079817295074,
-0.033165402710437775,
-0.05502361059188843,
-0.06718631088733673,
-0.04780217260122299,
-0.07090786844491959,
0.00971323624253273,
-0.06224328279495239,
-0.004380406811833382,
-0.03163246810436249,
0.026651820167899132,
0.05840654671192169,
0.051488179713487625,
-0.019699886441230774,
0.03778250142931938,
-0.0929432064294815,
0.012908055447041988,
-0.16226300597190857,
0.11408441513776779,
-0.06825719028711319,
-0.0008396202465519309,
-0.09891188144683838,
0.007229475770145655,
-0.06936346739530563,
0.0018895528046414256,
0.057351455092430115,
0.1253526359796524,
-0.12417628616094589,
-0.04530155286192894,
0.18322153389453888,
-0.0693843886256218,
-0.08291056007146835,
0.13740624487400055,
-0.017691563814878464,
0.09868941456079483,
0.09795321524143219,
0.2250472754240036,
0.01666978932917118,
-0.17567642033100128,
-0.14282529056072235,
-0.07977104187011719,
-0.057463448494672775,
0.025036891922354698,
0.047742486000061035,
-0.12485938519239426,
0.11890941858291626,
0.01008237898349762,
-0.0527384988963604,
-0.004952348303049803,
-0.005084728356450796,
-0.05063590779900551,
0.005990067031234503,
-0.054354969412088394,
-0.028105145320296288,
-0.0004113751056138426,
-0.02180197648704052,
0.021412579342722893,
-0.09020597487688065,
0.06103944033384323,
0.0580301471054554,
-0.0663711279630661,
0.039148569107055664,
-0.05036426708102226,
0.03971010819077492,
-0.04776623100042343,
0.0034365253522992134,
-0.12545587122440338,
-0.010776741430163383,
0.07617730647325516,
-0.15641704201698303,
0.15790405869483948,
-0.08076192438602448,
-0.008029035292565823,
0.03521261364221573,
-0.04993799701333046,
0.010464025661349297,
-0.01758022978901863,
0.017720390111207962,
-0.05445951595902443,
-0.13280557096004486,
-0.0025715644005686045,
-0.013181940652430058,
0.1401798576116562,
-0.1531221568584442,
0.011003724299371243,
0.02954556792974472,
0.09349527955055237,
-0.010184110142290592,
-0.06375046074390411,
0.0788179337978363,
0.02401784062385559,
0.030849019065499306,
-0.02353380061686039,
0.009136345237493515,
0.023267410695552826,
-0.03028441034257412,
0.15724627673625946,
-0.23894426226615906,
-0.2515770494937897,
0.07449378073215485,
0.043267134577035904,
-0.07467231899499893,
0.04171823710203171,
-0.026031872257590294,
-0.07340288162231445,
-0.03741352632641792,
-0.08079846948385239,
0.2496611773967743,
0.018686654046177864,
0.11606860905885696,
-0.08149871975183487,
-0.046685848385095596,
-0.000410262233344838,
-0.0927165150642395,
0.030959805473685265,
0.08357204496860504,
0.024129116907715797,
-0.035009678453207016,
0.06902372092008591,
-0.11873503774404526,
-0.0533086359500885,
0.2137148231267929,
0.03956012800335884,
-0.09695213288068771,
-0.07969808578491211,
0.0011383331147953868,
0.028207827359437943,
0.1593932956457138,
-0.04295586794614792,
-0.00029941604589112103,
0.05819553881883621,
-0.02853943035006523,
0.028663739562034607,
-0.11635594815015793,
0.05706602707505226,
-0.03277137875556946,
-0.08023896813392639,
-0.12135715037584305,
0.020987479016184807,
-0.0579579696059227,
0.06479937583208084,
-0.00737211387604475,
0.062225669622421265,
-0.014200421050190926,
-0.0493989959359169,
-0.14228686690330505,
0.1041734516620636,
-0.05209724232554436,
-0.1495712697505951,
-0.1855563521385193,
0.06424602121114731,
-0.03910040855407715,
0.003785497508943081,
0.019637588411569595,
-0.01820486970245838,
-0.05164232477545738,
-0.09841212630271912,
0.0019987323321402073,
0.05167563632130623,
-0.10732235759496689,
-0.14324446022510529,
-0.009085210971534252,
0.02481207810342312,
-0.08059538900852203,
-0.02053765207529068,
-0.022169038653373718,
-0.04446197673678398,
-0.01040963176637888,
0.05553516373038292,
0.034605901688337326,
0.11672960221767426,
0.033920928835868835,
-0.021973969414830208,
-0.03866340219974518,
0.09377209097146988,
-0.11240165680646896,
0.061669282615184784,
0.074043408036232,
-0.029568275436758995,
0.02699464000761509,
0.1768096387386322,
0.00605520186945796,
-0.024449404329061508,
0.03580888733267784,
0.031429413706064224,
-0.05663686618208885,
-0.26977649331092834,
-0.018189752474427223,
-0.08389978110790253,
0.046262674033641815,
0.04638724401593208,
0.030222244560718536,
0.10144033282995224,
0.00449194572865963,
-0.037793487310409546,
0.0043539018370211124,
0.04437142610549927,
0.0557481087744236,
0.09963847696781158,
0.024583306163549423,
0.04584190994501114,
-0.08492391556501389,
0.04397210851311684,
0.03932680934667587,
0.10651067644357681,
0.15311987698078156,
0.0062781572341918945,
0.2678406238555908,
0.09715604782104492,
0.10069367289543152,
0.0004807791265193373,
0.03464938700199127,
0.007000912446528673,
0.04672064632177353,
0.04684750735759735,
-0.10193660855293274,
-0.05902452766895294,
0.07272058725357056,
0.13578476011753082,
-0.012979156337678432,
0.002686350140720606,
-0.027569491416215897,
0.011341983452439308,
0.25044015049934387,
0.07288962602615356,
-0.19076097011566162,
-0.08162203431129456,
0.0030931751243770123,
-0.0634278804063797,
-0.02591274306178093,
-0.01826925203204155,
0.1063513234257698,
-0.15168114006519318,
0.010687703266739845,
0.025160986930131912,
0.06742585450410843,
-0.12681026756763458,
-0.05901152640581131,
-0.02723977155983448,
0.08017987757921219,
-0.014100157655775547,
0.04358891025185585,
-0.18655113875865936,
0.17303591966629028,
0.021286480128765106,
0.1402507722377777,
0.006310885772109032,
0.03034469485282898,
0.024026626721024513,
-0.083624467253685,
0.15872885286808014,
0.00231460714712739,
0.05680616945028305,
-0.09721048176288605,
-0.15737536549568176,
-0.004274711944162846,
0.010235954076051712,
-0.009060853160917759,
0.06749209761619568,
-0.009436904452741146,
-0.03213074430823326,
-0.019284896552562714,
-0.13644658029079437,
-0.24645139276981354,
-0.08419930934906006,
0.0559363029897213,
0.03525681421160698,
0.020567402243614197,
-0.004054406192153692,
-0.03953763470053673,
-0.04985924810171127,
0.04945310205221176,
-0.16630154848098755,
-0.08214355260133743,
-0.057952314615249634,
-0.048976436257362366,
0.13385511934757233,
-0.07730599492788315,
0.04699455201625824,
-0.045652806758880615,
0.037060827016830444,
-0.009103421121835709,
0.011541800573468208,
0.0431068018078804,
-0.062312982976436615,
-0.15415921807289124,
-0.004961463622748852,
0.17841306328773499,
0.07131799310445786,
0.07549220323562622,
0.006925199646502733,
0.08848381042480469,
-0.028266770765185356,
-0.10194273293018341,
0.003167648334056139,
0.05760859698057175,
0.010229901410639286,
0.09210821986198425,
-0.08025937527418137,
-0.10063617676496506,
-0.1511380672454834,
-0.05096723139286041,
0.11554906517267227,
0.27754753828048706,
-0.06324341148138046,
0.12161793559789658,
0.18178817629814148,
-0.1233142763376236,
-0.1986212432384491,
-0.09021776169538498,
0.04342950880527496,
0.030028533190488815,
-0.0106181176379323,
-0.0813775435090065,
0.003716400358825922,
0.024317322298884392,
-0.035635825246572495,
-0.01896822080016136,
-0.15704286098480225,
-0.1818067729473114,
0.12152120471000671,
0.0049297995865345,
-0.19572487473487854,
-0.10683231055736542,
-0.08674105256795883,
-0.03863899037241936,
-0.04338590428233147,
0.10052774846553802,
-0.05491349473595619,
0.1127234399318695,
0.031059136614203453,
0.03833715245127678,
-0.007057921960949898,
-0.06629252433776855,
0.07858847081661224,
0.09615816920995712,
-0.034026410430669785,
-0.031921613961458206,
-0.03873894736170769,
0.1097320094704628,
-0.04985640197992325,
0.20330969989299774,
0.05163828283548355,
0.01919350028038025,
-0.11155290901660919,
-0.037224214524030685,
-0.11101647466421127,
0.07753586024045944,
-0.041809674352407455,
-0.01032203994691372,
-0.004471560474485159,
0.01923562027513981,
0.01482713408768177,
0.02214079722762108,
-0.11182179301977158,
-0.15637758374214172,
-0.01615094766020775,
0.24972288310527802,
0.20682069659233093,
0.1004805713891983,
-0.05815460532903671,
0.01536642201244831,
-0.056842003017663956,
0.023393718525767326,
-0.10181114077568054,
0.02051052823662758,
0.06415829807519913,
0.04762404039502144,
0.13583862781524658,
-0.01974429003894329,
-0.16534709930419922,
0.02645610086619854,
0.038808636367321014,
-0.09897951036691666,
-0.11046111583709717,
0.051012326031923294,
-0.05744941532611847,
-0.09844149649143219,
-0.05020150914788246,
0.11068476736545563,
-0.11465323716402054,
-0.02210383489727974,
0.014266091398894787,
0.06929247826337814,
-0.10308384895324707,
0.1474679410457611,
0.11800235509872437,
0.05408347398042679,
-0.07713630795478821,
0.1261284351348877,
0.07273389399051666,
-0.02769998461008072,
0.05748999863862991,
0.06322337687015533,
-0.064380943775177,
-0.07266797870397568,
-0.17347480356693268,
-0.006721404381096363,
-0.017756499350070953,
-0.06964372843503952,
-0.06782921403646469,
-0.006869182921946049,
-0.02873743139207363,
0.08307109773159027,
-0.03228229284286499,
0.06200256943702698,
-0.017028234899044037,
0.016429267823696136,
-0.12324441224336624,
0.12371689826250076,
0.011572212912142277,
0.016856012865900993,
-0.02402668446302414,
0.1075458899140358,
0.0693640485405922,
0.02564176730811596,
-0.0019852230325341225,
-0.035469863563776016,
-0.10674059391021729,
0.022582141682505608,
0.10082599520683289,
-0.0004903408116661012,
-0.05814468488097191,
-0.008485826663672924,
-0.010708644054830074,
-0.0423554927110672,
-0.009490355849266052,
0.04638851061463356,
-0.06481370329856873,
-0.05625012889504433,
-0.005092567764222622,
0.14160262048244476,
-0.15987803041934967,
0.010467500425875187,
0.08239829540252686,
-0.03377256914973259,
0.09090148657560349,
0.07214638590812683,
-0.052953265607357025,
0.0356888584792614,
-0.09282209724187851,
0.037066731601953506,
-0.044125329703092575,
0.020359326153993607,
0.006032276898622513,
-0.14245985448360443,
-0.01996568962931633,
0.006664770655333996,
0.05582038685679436,
-0.006374339573085308,
0.08189365267753601,
-0.102360300719738,
-0.008148963563144207,
0.01921730674803257,
-0.01768471859395504,
-0.06169156730175018,
0.05722861737012863,
-0.014337907545268536,
0.05297519266605377,
0.08221817761659622,
-0.08385435491800308,
0.022512590512633324,
-0.11924150586128235,
0.011372939683496952,
-0.00704263336956501,
-0.015498876571655273,
-0.02422262355685234,
-0.015035241842269897,
0.09433896094560623,
0.01711154356598854,
0.1615716814994812,
-0.02536694146692753,
-0.02557387948036194,
0.05127665400505066,
0.024965783581137657,
-0.01914464496076107,
0.049988869577646255,
0.058389510959386826,
0.011013527400791645,
-0.03440900519490242,
-0.0042371610179543495,
0.007559641730040312,
-0.08495937287807465,
-0.08505039662122726,
0.1209908127784729,
0.1167270764708519,
0.08641024678945541,
0.033212024718523026,
0.16251610219478607,
-0.06785642355680466,
0.019291704520583153,
0.006335393059998751,
-0.09753863513469696,
0.049508582800626755,
-0.09960708022117615,
0.1569109410047531,
0.15811066329479218,
-0.12920217216014862,
0.0966954454779625,
-0.0065589225850999355,
-0.1290908306837082,
-0.09106895327568054,
-0.14878727495670319,
-0.009642117656767368,
0.0011532834032550454,
0.004712438676506281,
-0.07380806654691696,
0.04899239540100098,
0.1658417284488678,
-0.007787739858031273,
-0.013645299710333347,
0.1865534633398056,
-0.046575840562582016,
-0.05566825717687607,
0.03515424579381943,
-0.02076895721256733,
0.004518761765211821,
-0.06628426909446716,
0.04783295840024948,
0.04903632029891014,
0.052306149154901505,
0.08787131309509277,
0.051668792963027954,
0.00609760032966733,
0.06038544699549675,
-0.01242715585976839,
-0.06609956175088882,
0.017224527895450592,
0.05298428237438202,
0.042468927800655365,
0.07341164350509644,
0.09123387187719345,
-0.028339263051748276,
0.027689887210726738,
0.11717112362384796,
-0.040161971002817154,
-0.08009164780378342,
-0.1619074046611786,
0.1591324359178543,
-0.026301393285393715,
-0.029570886865258217,
-0.06290293484926224,
-0.10506193339824677,
0.03816349431872368,
0.16662777960300446,
0.182109534740448,
-0.0231783427298069,
-0.01532540749758482,
-0.053966276347637177,
0.016110042110085487,
0.004689437337219715,
0.060847122222185135,
0.0041062673553824425,
0.17891459167003632,
-0.02852489799261093,
0.10443516075611115,
0.0021546429488807917,
-0.057364389300346375,
-0.021640673279762268,
0.06905277818441391,
-0.041819214820861816,
-0.016485875472426414,
-0.06806117296218872,
0.09021306037902832,
-0.09109589457511902,
-0.1828155219554901,
-0.1279779076576233,
-0.028718480840325356,
-0.06516097486019135,
0.006769177038222551,
-0.03558241203427315,
0.03946002200245857,
0.014300907962024212,
-0.00025053639546968043,
0.003554695751518011,
0.1833966225385666,
0.014080441556870937,
-0.08361349254846573,
-0.030452949926257133,
0.029448017477989197,
-0.07558471709489822,
0.18781954050064087,
-0.028431251645088196,
0.15150333940982819,
0.07199335843324661,
-0.0006108670495450497,
-0.08246329426765442,
0.08742373436689377,
0.014271561056375504,
-0.1325850933790207,
0.036733124405145645,
0.1776653677225113,
0.02582302689552307,
0.0227859728038311,
0.0007121347589418292,
0.014483489096164703,
0.09348075836896896,
-0.10105692595243454,
0.010873347520828247,
-0.1274668574333191,
0.0711996778845787,
-0.07897192239761353,
0.14594902098178864,
0.1334947943687439,
-0.0874703899025917,
-0.004586334805935621,
-0.07286681979894638,
-0.029714442789554596,
0.021316898986697197,
0.02684762142598629,
-0.028906235471367836,
-0.17495176196098328,
0.09557054936885834,
-0.08343013375997543,
0.004742140416055918,
-0.2590891420841217,
-0.026642892509698868,
-0.01944037154316902,
-0.03408879041671753,
-0.017725905403494835,
0.07162503898143768,
0.019034462049603462,
0.057981833815574646,
-0.03770102187991142,
-0.15996401011943817,
0.0009376084781251848,
0.11150690913200378,
-0.14201520383358002,
-0.11650846153497696
] |
null | null |
transformers
|
# DialoGPT Trained on the Speech of a Game Character
Chat with the model:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("dead69/GPT-small-yoda")
model = AutoModelForCausalLM.from_pretrained("dead69/GPT-small-yoda")

# Let's chat for 10 lines
for step in range(10):
    # encode the new user input, add the eos_token and return a tensor in PyTorch
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 200 tokens
    chat_history_ids = model.generate(
        bot_input_ids, max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=3,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8
    )

    # pretty print the last output tokens from the bot
    print("Master YODA: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
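The sampling settings here (`do_sample=True` with `top_k=100`, `top_p=0.7`, `temperature=0.8`) trade determinism for variety: each run can yield a different reply. Lowering the temperature, or setting `do_sample=False`, makes the bot's answers more repeatable at the cost of sounding more formulaic.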
|
{"license": "mit", "tags": ["conversational"], "thumbnail": "https://huggingface.co/front/thumbnails/dialogpt.png"}
|
text-generation
|
dead69/GPT-small-yoda
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT Trained on the Speech of a Game Character
Chat with the model:
|
[
"# DialoGPT Trained on the Speech of a Game Character\n\n\nChat with the model:"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT Trained on the Speech of a Game Character\n\n\nChat with the model:"
] |
[
56,
21
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT Trained on the Speech of a Game Character\n\n\nChat with the model:"
] |
[
0.009732226841151714,
0.1090635359287262,
-0.003257261822000146,
0.07003235816955566,
0.17689915001392365,
-0.02063610590994358,
0.13917487859725952,
0.1123373806476593,
0.002713523805141449,
-0.06448715925216675,
0.09866461157798767,
0.14212432503700256,
0.04005192965269089,
0.12336406111717224,
0.041943974792957306,
-0.26614341139793396,
0.09978538006544113,
0.04029017314314842,
0.0874929204583168,
0.10743609815835953,
0.0607687383890152,
-0.015219511464238167,
0.017018770799040794,
0.07011882960796356,
-0.09817659854888916,
-0.04296039789915085,
0.04304113984107971,
-0.08380921930074692,
0.13785578310489655,
0.02359333075582981,
-0.0023329290561378,
-0.04420030117034912,
-0.0485457181930542,
-0.2080802172422409,
0.03436710685491562,
-0.015515578910708427,
-0.015053607523441315,
-0.007924850098788738,
-0.030508847907185555,
0.07368574291467667,
0.18702690303325653,
0.18967899680137634,
0.06655141711235046,
0.13878493010997772,
-0.13381366431713104,
-0.13860692083835602,
0.04837137088179588,
0.024324491620063782,
0.10074323415756226,
0.12354842573404312,
-0.049705930054187775,
0.11518062651157379,
-0.06433799117803574,
0.06861026585102081,
0.009817837737500668,
-0.3763779103755951,
-0.054711442440748215,
0.13959984481334686,
0.10047478973865509,
0.08601148426532745,
-0.07090231776237488,
0.04751312732696533,
-0.03412756323814392,
0.02080298587679863,
-0.07170505821704865,
-0.04981404170393944,
-0.0018893767846748233,
0.04182884842157364,
-0.07516653835773468,
0.0022776152472943068,
0.18897317349910736,
-0.015353905968368053,
0.044806916266679764,
-0.12552893161773682,
0.020878933370113373,
-0.06695704162120819,
-0.058174844831228256,
-0.08066436648368835,
-0.07508084177970886,
0.07847185432910919,
-0.07810685783624649,
-0.11496318131685257,
-0.12610886991024017,
-0.0897989422082901,
-0.08908999711275101,
0.07302089035511017,
0.0018542405450716615,
-0.0020269376691430807,
-0.24107344448566437,
0.09430991113185883,
-0.08320058137178421,
-0.05063294619321823,
-0.11516337841749191,
-0.08409091830253601,
-0.0655074268579483,
0.028576835989952087,
-0.010098880156874657,
0.048253580927848816,
0.09412882477045059,
0.10093094408512115,
0.00049379508709535,
0.06361455470323563,
-0.02959447167813778,
0.03849238157272339,
0.009701062925159931,
0.07923301309347153,
0.05757572501897812,
-0.03804773837327957,
0.009310380555689335,
-0.034745607525110245,
0.01975955255329609,
-0.08970649540424347,
-0.21451111137866974,
0.02506605163216591,
-0.008341983892023563,
0.048241425305604935,
0.00022256429656408727,
0.14882595837116241,
-0.03204035013914108,
0.02279883250594139,
0.0780804380774498,
0.02493993192911148,
-0.047638170421123505,
0.072098508477211,
-0.04285295680165291,
0.036669131368398666,
-0.02161715179681778,
0.04205470159649849,
-0.05953969433903694,
-0.12681220471858978,
-0.05170775577425957,
-0.046580489724874496,
-0.055383700877428055,
-0.009517397731542587,
0.06053202226758003,
0.059146150946617126,
-0.022917894646525383,
-0.15390673279762268,
-0.07614874839782715,
0.020392732694745064,
0.0008431582245975733,
-0.038706809282302856,
-0.1122683435678482,
-0.17854715883731842,
-0.05368488281965256,
0.014698745682835579,
-0.03715892136096954,
-0.08381600677967072,
-0.021396365016698837,
0.10223107039928436,
-0.020084993913769722,
0.11874691396951675,
-0.14013051986694336,
0.05478790029883385,
-0.0806136354804039,
-0.04570094868540764,
-0.10962865501642227,
0.11195316910743713,
-0.010342906229197979,
0.00514627480879426,
-0.019929764792323112,
-0.03573136404156685,
-0.02650223858654499,
0.0493740439414978,
-0.06102929264307022,
0.15434125065803528,
-0.09318576753139496,
-0.08867054432630539,
0.27999213337898254,
-0.05592920631170273,
-0.16256633400917053,
0.12226780503988266,
-0.04132164269685745,
0.09694086015224457,
0.14480257034301758,
0.28127509355545044,
0.06179070845246315,
-0.05149165540933609,
0.10971710830926895,
0.10240133106708527,
-0.12835398316383362,
0.0031661000102758408,
0.09389542043209076,
-0.01106268260627985,
0.014763908460736275,
0.03657392039895058,
-0.051411569118499756,
0.13825160264968872,
-0.023142430931329727,
-0.050550173968076706,
0.050414036959409714,
-0.059407662600278854,
0.0900852158665657,
0.006062963046133518,
0.0631483718752861,
-0.06592875719070435,
-0.06427934765815735,
0.005554373376071453,
0.07515453547239304,
-0.050983622670173645,
0.03857582435011864,
-0.17143011093139648,
0.061078961938619614,
0.09931206703186035,
0.07647509127855301,
-0.07796482741832733,
-0.07487524300813675,
-0.054926831275224686,
0.09843660145998001,
0.09984497725963593,
0.08130986988544464,
0.07447236031293869,
0.017387358471751213,
0.022495701909065247,
0.011386374942958355,
0.04975572228431702,
-0.007515161298215389,
0.013606669381260872,
-0.09148680418729782,
0.02730247750878334,
-0.07228349149227142,
0.14519327878952026,
-0.01683369278907776,
0.009338384494185448,
0.01543915830552578,
0.004947887733578682,
-0.05601705610752106,
-0.019023312255740166,
-0.030412329360842705,
-0.01299333106726408,
-0.04709216207265854,
-0.03145255148410797,
0.11520750820636749,
0.025963015854358673,
-0.09380990266799927,
0.20445865392684937,
-0.19890187680721283,
0.10559303313493729,
0.19009733200073242,
-0.154107004404068,
0.007313440553843975,
-0.06618386507034302,
0.0049462695606052876,
0.02087593264877796,
0.08720619231462479,
-0.027011185884475708,
0.3102799952030182,
-0.009741055779159069,
0.17781414091587067,
-0.03960423171520233,
0.014334414154291153,
-0.030698643997311592,
-0.06973354518413544,
0.0013828561641275883,
0.058040689677000046,
0.07584251463413239,
-0.11908088624477386,
0.1614442765712738,
-0.022875621914863586,
0.09007228165864944,
0.3444589078426361,
0.026605287566781044,
0.03973323479294777,
0.018522653728723526,
0.022531898692250252,
-0.08210758864879608,
-0.01670890301465988,
-0.3919760584831238,
-0.035912707448005676,
0.04264446347951889,
-0.02379128895699978,
0.06316038966178894,
-0.1358751654624939,
-0.0916772410273552,
0.012326748110353947,
-0.06843847781419754,
0.003130714176222682,
0.15522907674312592,
-0.03425723686814308,
0.13431142270565033,
0.023702368140220642,
-0.11502069234848022,
0.1250326931476593,
-0.0459650456905365,
-0.1144697517156601,
0.12303368002176285,
-0.1801944077014923,
-0.34730568528175354,
-0.08619517087936401,
-0.14852504432201385,
-0.03603687509894371,
0.08421032130718231,
0.11175474524497986,
-0.19766242802143097,
0.03109399974346161,
0.02967493236064911,
0.03291543200612068,
-0.1375143826007843,
-0.0659099891781807,
0.025135505944490433,
0.009640495292842388,
-0.15375551581382751,
-0.11770378798246384,
-0.030503908172249794,
0.016803059726953506,
-0.1623755842447281,
0.04249800741672516,
-0.11577960848808289,
0.062184445559978485,
0.28215712308883667,
0.023604456335306168,
0.09348347783088684,
-0.0913688912987709,
0.16937696933746338,
-0.082569420337677,
-0.0016327467747032642,
0.17150303721427917,
-0.0015204532537609339,
0.0537681058049202,
0.04804391413927078,
0.011199528351426125,
0.004471261985599995,
0.04313616827130318,
-0.11233984678983688,
-0.0977591797709465,
-0.12336454540491104,
-0.09389127045869827,
-0.08201759308576584,
0.14671075344085693,
-0.008599094115197659,
0.036250993609428406,
0.1798761934041977,
0.012182475998997688,
0.0154234254732728,
-0.11086660623550415,
0.1301499456167221,
0.08069191873073578,
0.23359140753746033,
-0.13722658157348633,
0.10324843227863312,
-0.02336236648261547,
-0.10982438176870346,
0.05351119115948677,
0.04009413719177246,
0.032565586268901825,
0.0937168300151825,
0.15101611614227295,
0.03303799778223038,
0.04833964258432388,
0.08759067207574844,
0.010147538036108017,
0.0604369081556797,
-0.032613471150398254,
-0.05300915613770485,
-0.03318830579519272,
-0.10209117829799652,
0.07738927751779556,
0.11444491893053055,
-0.12206607311964035,
-0.03533153980970383,
0.043389081954956055,
0.04088664799928665,
0.008165303617715836,
-0.008996184915304184,
-0.13700751960277557,
-0.04923845827579498,
0.060307443141937256,
-0.03726707026362419,
-0.11916781961917877,
0.14744630455970764,
0.0964001938700676,
-0.17360663414001465,
-0.006178570911288261,
0.037041075527668,
0.056211236864328384,
0.0009039161377586424,
0.08473149687051773,
-0.011331981047987938,
-0.06180591881275177,
0.00837950874119997,
0.12541529536247253,
-0.3716540038585663,
0.12055426836013794,
-0.04244256019592285,
0.017719779163599014,
-0.11657553911209106,
-0.07168182730674744,
0.023077335208654404,
0.01750241592526436,
0.11454426497220993,
-0.015908174216747284,
0.04323412477970123,
-0.07191929966211319,
0.02869338169693947,
0.016658442094922066,
0.05262605473399162,
0.028109321370720863,
-0.04115907847881317,
-0.04408738762140274,
-0.001688259420916438,
-0.014762898907065392,
-0.08574589341878891,
0.030949456617236137,
-0.15873154997825623,
0.014941960573196411,
0.16598832607269287,
0.12456029653549194,
0.0037308295723050833,
0.01153463963419199,
-0.09362353384494781,
0.15578004717826843,
0.05592092499136925,
-0.06586059182882309,
-0.05356747284531593,
-0.061332862824201584,
-0.06941520422697067,
-0.059560250490903854,
0.07117567211389542,
-0.0989036113023758,
0.09187392890453339,
-0.10878368467092514,
-0.09640087187290192,
0.06102878600358963,
-0.06701605767011642,
-0.0364513136446476,
-0.0008902611443772912,
0.16491834819316864,
0.08971190452575684,
0.06571296602487564,
0.0483974926173687,
-0.043445341289043427,
-0.04030143469572067,
-0.0786077156662941,
0.0030797221697866917,
0.005270074587315321,
-0.0760469064116478,
-0.09244616329669952,
0.07068820297718048,
-0.014805749990046024,
-0.10430186241865158,
-0.029310500249266624,
0.3250448703765869,
0.15175165235996246,
-0.0997655913233757,
0.19285012781620026,
0.10220705717802048,
-0.0024865572340786457,
-0.2360541969537735,
-0.13638468086719513,
-0.07258287817239761,
-0.0779586061835289,
-0.058602456003427505,
-0.20346632599830627,
0.0011840732768177986,
-0.11180689185857773,
-0.016770541667938232,
0.10471096634864807,
-0.25928932428359985,
-0.05595974996685982,
0.1642850637435913,
0.04262874647974968,
0.3028857707977295,
-0.09725796431303024,
-0.020865513011813164,
-0.0023722187615931034,
-0.17214852571487427,
0.15466731786727905,
-0.045818131417036057,
0.11905550956726074,
0.02058274857699871,
0.1603131890296936,
0.036724355071783066,
-0.023068545386195183,
0.09533601999282837,
0.021320387721061707,
-0.04559219628572464,
-0.09216831624507904,
-0.0801888257265091,
0.023145902901887894,
0.02016288787126541,
0.05595055967569351,
-0.04910645633935928,
-0.01967611350119114,
-0.11760241538286209,
-0.025476055219769478,
-0.08839684724807739,
-0.02930080145597458,
0.03424420952796936,
-0.0672454982995987,
-0.05921278893947601,
0.02110511064529419,
-0.010780755430459976,
0.0786864385008812,
0.07824725657701492,
-0.11530135571956635,
0.1436038613319397,
0.09096162766218185,
0.13871650397777557,
-0.05992148071527481,
-0.08719475567340851,
-0.0515592023730278,
-0.07033077627420425,
0.05390770733356476,
-0.056074630469083786,
-0.0048818509094417095,
0.0859401524066925,
0.004290462471544743,
0.11212864518165588,
0.10724523663520813,
-0.0631772130727768,
0.10195596516132355,
0.06953597068786621,
-0.1691010594367981,
-0.12549687922000885,
-0.035332612693309784,
0.070180743932724,
0.1444357931613922,
0.05587809905409813,
0.11182794719934464,
-0.028517302125692368,
-0.005516437813639641,
0.007863610051572323,
0.026030143722891808,
-0.087118960916996,
-0.00133410282433033,
-0.012313606217503548,
0.025937188416719437,
-0.14568229019641876,
0.0517420768737793,
-0.025072062388062477,
-0.04946032166481018,
0.08547169715166092,
0.0923989936709404,
-0.0901881530880928,
-0.1278858333826065,
-0.15910394489765167,
0.0656786635518074,
-0.05407712608575821,
-0.10072606801986694,
0.0029448294080793858,
-0.13396286964416504,
0.06969700008630753,
0.05455876514315605,
0.05589628964662552,
0.037541043013334274,
-0.09740247577428818,
0.0050087254494428635,
-0.03947978466749191,
0.018140368163585663,
-0.04468110576272011,
-0.11013957858085632,
-0.05144919455051422,
0.1198156401515007,
0.038527585566043854,
0.10044952481985092,
-0.0526970811188221,
-0.18498273193836212,
-0.15210939943790436,
0.10959712415933609,
-0.020994218066334724,
-0.09390807151794434,
-0.12211974710226059,
-0.028867026790976524,
0.016443507745862007,
-0.02106235735118389,
-0.019905023276805878,
-0.017604144290089607,
-0.12679843604564667,
0.00980783998966217,
0.015363061800599098,
0.004143439698964357,
-0.10561557859182358,
0.03898557648062706,
0.04276803880929947,
-0.017009619623422623,
0.12899106740951538,
0.11589502543210983,
-0.10938884317874908,
0.11237442493438721,
-0.19754908978939056,
-0.03785653039813042,
0.08059505373239517,
0.015825489535927773,
0.010509494692087173,
0.08654145896434784,
-0.03635459020733833,
0.06778416037559509,
0.0439416728913784,
0.06226365268230438,
0.07736263424158096,
-0.07545747607946396,
0.03525087982416153,
0.006021382287144661,
-0.07197891920804977,
-0.04901007190346718,
-0.015334461815655231,
0.08032875508069992,
0.006350177340209484,
0.11106127500534058,
-0.057312872260808945,
0.08923743665218353,
-0.08592955023050308,
0.03861093521118164,
0.02600264549255371,
-0.08262268453836441,
-0.003123396774753928,
-0.09814920276403427,
-0.0016882102936506271,
-0.03232724592089653,
0.1530434638261795,
-0.00711660273373127,
-0.0163098331540823,
0.00026045629056170583,
-0.01675027422606945,
0.07864920794963837,
-0.003924954682588577,
0.2235395908355713,
0.06608468294143677,
-0.03634857386350632,
0.015596460551023483,
0.12438875436782837,
0.017397161573171616,
-0.02984325960278511,
0.10738407075405121,
-0.07945356518030167,
-0.04172036051750183,
0.11021213978528976,
0.02262214571237564,
0.08228679746389389,
-0.09252794831991196,
-0.08001039177179337,
-0.016513768583536148,
-0.025281332433223724,
-0.10998251289129257,
0.23873433470726013,
0.14173026382923126,
-0.07369469106197357,
-0.0007924571982584894,
-0.03380321338772774,
-0.04140211641788483,
-0.13950005173683167,
-0.10185728967189789,
-0.04291627183556557,
-0.2249901443719864,
0.05295521020889282,
-0.09163644909858704,
0.1050851121544838,
0.001417199382558465,
0.09197948127985,
-0.09090036898851395,
0.07878780364990234,
0.003133925376459956,
-0.054430413991212845,
0.07533761113882065,
-0.07032016664743423,
0.05570925772190094,
-0.060560621321201324,
0.010300622321665287,
0.00744410278275609,
0.018467145040631294,
0.08007117360830307,
0.086578868329525,
-0.07785260677337646,
0.018868837505578995,
-0.07609478384256363,
-0.04321739077568054,
-0.03855445608496666,
0.057928163558244705,
-0.011732250452041626,
0.18683451414108276,
0.04179460182785988,
-0.06722693890333176,
0.02423451654613018,
0.10228268057107925,
-0.022654306143522263,
-0.1452276110649109,
-0.04574721306562424,
0.2401515394449234,
0.004012640565633774,
0.035243939608335495,
-0.05249262601137161,
-0.008935549296438694,
-0.10177504271268845,
0.33395975828170776,
0.2503587603569031,
-0.08913406729698181,
-0.005109346471726894,
-0.07687126100063324,
0.05248977988958359,
0.03497859463095665,
0.17670606076717377,
0.09443799406290054,
0.20260730385780334,
-0.11161954700946808,
-0.06636134535074234,
-0.05853443220257759,
-0.038910526782274246,
-0.002415341092273593,
-0.026408931240439415,
0.08194659650325775,
-0.017626063898205757,
-0.055943284183740616,
0.041461262851953506,
-0.31190744042396545,
0.06477247923612595,
-0.16391044855117798,
-0.0550195537507534,
-0.009087536484003067,
0.032163355499506,
0.13015015423297882,
0.02142256684601307,
0.10383019596338272,
0.013761045411229134,
-0.0011158508714288473,
0.07665325701236725,
-0.0030302403029054403,
-0.22686775028705597,
-0.0016016518929973245,
0.11277347058057785,
-0.12140903621912003,
0.06566590815782547,
-0.05914003401994705,
0.09173310548067093,
0.06850182265043259,
0.07016461342573166,
0.028988279402256012,
0.10007860511541367,
-0.0035135014913976192,
0.008811873383820057,
-0.04261016100645065,
0.05055229738354683,
0.03519212454557419,
0.024918830022215843,
0.035427406430244446,
-0.15958371758460999,
0.06052498146891594,
0.013385255821049213,
-0.05478792265057564,
-0.08590459078550339,
0.07850773632526398,
-0.06207914650440216,
0.08779007941484451,
0.04255130514502525,
-0.032835882157087326,
-0.017830003052949905,
0.003785452339798212,
-0.0061009605415165424,
-0.03876736760139465,
-0.031856242567300797,
-0.08151036500930786,
-0.1528659462928772,
-0.09379234910011292,
-0.06467481702566147,
-0.012830141000449657,
-0.21235863864421844,
0.023304147645831108,
-0.12553736567497253,
0.015404599718749523,
-0.04585060477256775,
0.06867270171642303,
0.07347501069307327,
0.0458528958261013,
0.005003978963941336,
0.07535979896783829,
0.04680829122662544,
0.15586480498313904,
-0.12600930035114288,
-0.07071501016616821
] |
null | null |
transformers
|
Pretraining Dataset: [AAAC01](https://huggingface.co/datasets/debatelab/aaac)
Demo: [DeepA2 Demo](https://huggingface.co/spaces/debatelab/deepa2-demo)
Paper: [DeepA2: A Modular Framework for Deep Argument Analysis with Pretrained Neural Text2Text Language Models](https://arxiv.org/abs/2110.01509)
Authors: *Gregor Betz, Kyle Richardson*
## Abstract
In this paper, we present and implement a multi-dimensional, modular framework for performing deep argument analysis (DeepA2) using current pre-trained language models (PTLMs). ArgumentAnalyst -- a T5 model (Raffel et al. 2020) set up and trained within DeepA2 -- reconstructs argumentative texts, which advance an informal argumentation, as valid arguments: It inserts, e.g., missing premises and conclusions, formalizes inferences, and coherently links the logical reconstruction to the source text. We create a synthetic corpus for deep argument analysis, and evaluate ArgumentAnalyst on this new dataset as well as on existing data, specifically EntailmentBank (Dalvi et al. 2021). Our empirical findings vindicate the overall framework and highlight the advantages of a modular design, in particular its ability to emulate established heuristics (such as hermeneutic cycles), to explore the model's uncertainty, to cope with the plurality of correct solutions (underdetermination), and to exploit higher-order evidence.
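## Usage

A minimal sketch of querying the model with the standard `transformers` seq2seq API. The mode prefixes (`argdown_reconstruction`, `reason_statements`, `premises_formalized`) and the example sentence come from this card's widget examples; everything else here (variable names, the 80-token generation limit mirroring the widget's inference settings) is illustrative, not an official recipe.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("DebateLabKIT/argument-analyst")
model = AutoModelForSeq2SeqLM.from_pretrained("DebateLabKIT/argument-analyst")

# Prompt format from the widget examples: "<mode>: argument_source: <text>"
source = ("argdown_reconstruction: argument_source: "
          "If Peter likes fish, Peter has been to New York. "
          "So, Peter has been to New York.")

inputs = tokenizer(source, return_tensors="pt")
outputs = model.generate(**inputs, max_length=80)  # matches the card's inference max_length
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```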
|
{"language": ["en"], "license": "cc-by-sa-4.0", "datasets": ["debatelab/aaac"], "widget": [{"text": "reason_statements: argument_source: If Peter likes fish, Peter has been to New York. So, Peter has been to New York.", "example_title": "Premise identification"}, {"text": "argdown_reconstruction: argument_source: If Peter likes fish, Peter has been to New York. So, Peter has been to New York.", "example_title": "Argdown reconstruction"}, {"text": "premises_formalized: reason_statements: If Peter likes fish, Peter has been to New York. (ref: (1))", "example_title": "Formalization"}], "inference": {"parameters": {"max_length": 80}}}
|
text2text-generation
|
DebateLabKIT/argument-analyst
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"en",
"dataset:debatelab/aaac",
"arxiv:2110.01509",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2110.01509"
] |
[
"en"
] |
TAGS
#transformers #pytorch #t5 #text2text-generation #en #dataset-debatelab/aaac #arxiv-2110.01509 #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
Pretraining Dataset: AAAC01
Demo: DeepA2 Demo
Paper: DeepA2: A Modular Framework for Deep Argument Analysis with Pretrained Neural Text2Text Language Models
Authors: *Gregor Betz, Kyle Richardson*
## Abstract
In this paper, we present and implement a multi-dimensional, modular framework for performing deep argument analysis (DeepA2) using current pre-trained language models (PTLMs). ArgumentAnalyst -- a T5 model (Raffel et al. 2020) set up and trained within DeepA2 -- reconstructs argumentative texts, which advance an informal argumentation, as valid arguments: It inserts, e.g., missing premises and conclusions, formalizes inferences, and coherently links the logical reconstruction to the source text. We create a synthetic corpus for deep argument analysis, and evaluate ArgumentAnalyst on this new dataset as well as on existing data, specifically EntailmentBank (Dalvi et al. 2021). Our empirical findings vindicate the overall framework and highlight the advantages of a modular design, in particular its ability to emulate established heuristics (such as hermeneutic cycles), to explore the model's uncertainty, to cope with the plurality of correct solutions (underdetermination), and to exploit higher-order evidence.
|
[
"## Abstract\n\nIn this paper, we present and implement a multi-dimensional, modular framework for performing deep argument analysis (DeepA2) using current pre-trained language models (PTLMs). ArgumentAnalyst -- a T5 model (Raffel et al. 2020) set up and trained within DeepA2 -- reconstructs argumentative texts, which advance an informal argumentation, as valid arguments: It inserts, e.g., missing premises and conclusions, formalizes inferences, and coherently links the logical reconstruction to the source text. We create a synthetic corpus for deep argument analysis, and evaluate ArgumentAnalyst on this new dataset as well as on existing data, specifically EntailmentBank (Dalvi et al. 2021). Our empirical findings vindicate the overall framework and highlight the advantages of a modular design, in particular its ability to emulate established heuristics (such as hermeneutic cycles), to explore the model's uncertainty, to cope with the plurality of correct solutions (underdetermination), and to exploit higher-order evidence."
] |
[
"TAGS\n#transformers #pytorch #t5 #text2text-generation #en #dataset-debatelab/aaac #arxiv-2110.01509 #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"## Abstract\n\nIn this paper, we present and implement a multi-dimensional, modular framework for performing deep argument analysis (DeepA2) using current pre-trained language models (PTLMs). ArgumentAnalyst -- a T5 model (Raffel et al. 2020) set up and trained within DeepA2 -- reconstructs argumentative texts, which advance an informal argumentation, as valid arguments: It inserts, e.g., missing premises and conclusions, formalizes inferences, and coherently links the logical reconstruction to the source text. We create a synthetic corpus for deep argument analysis, and evaluate ArgumentAnalyst on this new dataset as well as on existing data, specifically EntailmentBank (Dalvi et al. 2021). Our empirical findings vindicate the overall framework and highlight the advantages of a modular design, in particular its ability to emulate established heuristics (such as hermeneutic cycles), to explore the model's uncertainty, to cope with the plurality of correct solutions (underdetermination), and to exploit higher-order evidence."
] |
[
81,
247
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #en #dataset-debatelab/aaac #arxiv-2110.01509 #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## Abstract\n\nIn this paper, we present and implement a multi-dimensional, modular framework for performing deep argument analysis (DeepA2) using current pre-trained language models (PTLMs). ArgumentAnalyst -- a T5 model (Raffel et al. 2020) set up and trained within DeepA2 -- reconstructs argumentative texts, which advance an informal argumentation, as valid arguments: It inserts, e.g., missing premises and conclusions, formalizes inferences, and coherently links the logical reconstruction to the source text. We create a synthetic corpus for deep argument analysis, and evaluate ArgumentAnalyst on this new dataset as well as on existing data, specifically EntailmentBank (Dalvi et al. 2021). Our empirical findings vindicate the overall framework and highlight the advantages of a modular design, in particular its ability to emulate established heuristics (such as hermeneutic cycles), to explore the model's uncertainty, to cope with the plurality of correct solutions (underdetermination), and to exploit higher-order evidence."
] |
[
-0.04753926396369934,
0.09691774100065231,
-0.002031982410699129,
-0.006198197137564421,
0.049967244267463684,
-0.010706598870456219,
0.12794439494609833,
0.06973883509635925,
-0.04284925386309624,
0.026145417243242264,
0.021457016468048096,
0.02332216501235962,
0.006694411858916283,
0.025380387902259827,
0.007148906588554382,
-0.23970778286457062,
-0.0031233509071171284,
0.0004488280101213604,
-0.01990319788455963,
0.07806519418954849,
0.09047961235046387,
-0.09504105895757675,
0.08108823746442795,
0.03705767169594765,
0.03217662498354912,
-0.0012084931368008256,
-0.11193419992923737,
-0.01802683062851429,
0.09912075102329254,
0.016662824898958206,
0.05041692405939102,
0.006950157228857279,
-0.0249155405908823,
-0.11297804117202759,
0.036307938396930695,
0.03782665729522705,
0.01113517489284277,
0.09814779460430145,
0.0050191013142466545,
0.011750095523893833,
0.1510961502790451,
-0.035121068358421326,
0.06350792944431305,
-0.006213662680238485,
-0.11062081903219223,
-0.10268480330705643,
-0.0679561048746109,
0.020240353420376778,
0.11281785368919373,
0.006625466980040073,
-0.004567841067910194,
0.1000392884016037,
-0.0382067896425724,
0.01848638989031315,
0.0441599078476429,
-0.22656266391277313,
-0.05261467024683952,
0.027442194521427155,
-0.005092557519674301,
0.09733881801366806,
-0.022227099165320396,
-0.018420526757836342,
-0.004296462517231703,
0.034915972501039505,
0.032533273100852966,
-0.03118475340306759,
-0.10449676215648651,
0.04087021201848984,
-0.13185295462608337,
-0.004599886015057564,
0.23190626502037048,
-0.028943294659256935,
-0.015470747835934162,
-0.07573425769805908,
-0.03843260183930397,
0.0368163138628006,
-0.007990854792296886,
-0.12850643694400787,
0.014545354060828686,
-0.03008691966533661,
0.19673772156238556,
-0.09422723948955536,
-0.1076437458395958,
0.01735745556652546,
-0.07791931927204132,
0.11551550030708313,
0.0417654812335968,
0.03117506206035614,
-0.021480632945895195,
0.05557484179735184,
-0.007549778558313847,
-0.08425966650247574,
-0.0458473302423954,
-0.09187167882919312,
-0.021258750930428505,
-0.09296833723783493,
-0.057130031287670135,
-0.10216047614812851,
0.004283570218831301,
0.14550241827964783,
0.07276490330696106,
0.02557986043393612,
0.043490346521139145,
0.026509936898946762,
0.027834786102175713,
0.04001558944582939,
-0.08870331943035126,
-0.00016930914716795087,
0.07715103775262833,
0.06303688138723373,
-0.012406773865222931,
-0.04605432227253914,
-0.12117733806371689,
-0.016374496743083,
0.01942719891667366,
0.020525148138403893,
-0.013256249949336052,
0.07696011662483215,
-0.0015584385255351663,
-0.08146250247955322,
-0.013823535293340683,
-0.133352592587471,
-0.025471046566963196,
-0.02663133293390274,
-0.057816386222839355,
-0.028859948739409447,
0.032772768288850784,
0.018014680594205856,
-0.07879698276519775,
0.07746421545743942,
-0.09416501969099045,
-0.024993259459733963,
-0.09990552067756653,
-0.10915473848581314,
0.007697977125644684,
-0.06430601328611374,
-0.007141253910958767,
-0.09772459417581558,
-0.2582821547985077,
0.0022070342674851418,
0.06335608661174774,
-0.06672074645757675,
-0.002450365573167801,
-0.05745561420917511,
0.021041980013251305,
-0.014326578006148338,
-0.009776017628610134,
-0.06661677360534668,
-0.021763747557997704,
-0.02430783212184906,
0.014909670688211918,
0.02433794178068638,
-0.09507936239242554,
0.06003148853778839,
-0.114879310131073,
0.01924605295062065,
-0.06992007791996002,
0.04789622500538826,
-0.02798665314912796,
0.013559892773628235,
-0.10321135818958282,
-0.03336559981107712,
-0.0830969512462616,
0.036724820733070374,
0.027622444555163383,
0.11678218841552734,
-0.1990126222372055,
-0.03237259387969971,
-0.020117128267884254,
-0.12609197199344635,
-0.18652017414569855,
0.06148238480091095,
-0.056344710290431976,
0.227760449051857,
0.07430290430784225,
0.07899449020624161,
0.1351969689130783,
0.058538615703582764,
0.04170987010002136,
-0.052175313234329224,
0.031170904636383057,
0.08711808174848557,
0.08885885030031204,
-0.02535252459347248,
0.002013956196606159,
-0.01820543222129345,
-0.058215975761413574,
-0.08469242602586746,
-0.022870434448122978,
-0.12125276029109955,
0.06967272609472275,
0.04080957546830177,
0.07936559617519379,
-0.10492227971553802,
0.019209247082471848,
0.018855955451726913,
-0.05532662943005562,
0.04999275133013725,
0.056742921471595764,
-0.056472860276699066,
-0.021486390382051468,
-0.14603881537914276,
0.10343918204307556,
-0.04068014398217201,
0.022634781897068024,
-0.14557161927223206,
-0.07113891839981079,
-0.0010222134878858924,
-0.000448614708147943,
0.09397665411233902,
0.2258535623550415,
0.057035550475120544,
0.02431887947022915,
0.05889566242694855,
0.035358965396881104,
-0.08240936696529388,
0.024634812027215958,
0.002965247957035899,
-0.13734424114227295,
-0.017060356214642525,
-0.06316324323415756,
0.09681369364261627,
-0.09237335622310638,
-0.024470176547765732,
0.020026933401823044,
0.003667873330414295,
0.06783032417297363,
0.0004160589014645666,
-0.01956830732524395,
0.03241945058107376,
0.004535681568086147,
-0.0038823639042675495,
0.05971195176243782,
-0.0011430325685068965,
-0.06942509114742279,
0.05297784507274628,
-0.20758460462093353,
0.07355279475450516,
0.006963121704757214,
-0.06642234325408936,
-0.018394317477941513,
-0.11849988251924515,
0.013244599103927612,
-0.056385260075330734,
0.09136926382780075,
-0.027481645345687866,
0.1629621982574463,
0.053685933351516724,
0.0662100687623024,
-0.08929219096899033,
-0.016478950157761574,
-0.04091724753379822,
-0.06827616691589355,
0.049140121787786484,
-0.003274718765169382,
0.026435935869812965,
-0.08808593451976776,
0.06095676124095917,
0.1303723305463791,
-0.11273016780614853,
0.14791400730609894,
0.043157998472452164,
-0.08385882526636124,
-0.012498653493821621,
0.00292781600728631,
-0.048498522490262985,
0.04701206088066101,
-0.1565457433462143,
0.009207982569932938,
0.04090331494808197,
-0.02186904102563858,
0.07952430844306946,
-0.10174259543418884,
0.040295280516147614,
0.040173161774873734,
0.03519943356513977,
0.036958299577236176,
0.02881035767495632,
0.028051158413290977,
0.09061945974826813,
0.02035851776599884,
0.0035238992422819138,
0.009662306867539883,
-0.06440970301628113,
-0.11147579550743103,
0.1932692676782608,
-0.1190820261836052,
-0.25768032670021057,
-0.11869648844003677,
-0.05519285053014755,
-0.022847216576337814,
0.002318359911441803,
0.01862543635070324,
-0.005761844106018543,
-0.08695367723703384,
-0.0651240423321724,
0.11324097961187363,
0.028409285470843315,
-0.07520958036184311,
0.02932841144502163,
-0.0015561808831989765,
-0.033989693969488144,
-0.10093976557254791,
0.004648603033274412,
-0.04131399840116501,
-0.08337141573429108,
0.046311065554618835,
0.04053080826997757,
0.0287212785333395,
0.10799316316843033,
-0.03533168137073517,
-0.008402925916016102,
-0.0035588666796684265,
0.16695921123027802,
0.005936785601079464,
0.027865374460816383,
0.1878148913383484,
-0.10878732800483704,
0.056384120136499405,
0.13332846760749817,
0.04290735721588135,
-0.03919592872262001,
-0.014437285251915455,
0.018708229064941406,
-0.0015764771960675716,
-0.1590675413608551,
-0.14924155175685883,
-0.09112493693828583,
-0.03200053423643112,
0.05495679751038551,
0.0013827024959027767,
0.04634985327720642,
0.08811026811599731,
-0.023024775087833405,
0.01774701662361622,
0.0023510840255767107,
0.0934080109000206,
0.2064005583524704,
0.0038234195671975613,
0.049942534416913986,
-0.01855519227683544,
-0.08568960428237915,
0.04336864873766899,
0.16268225014209747,
0.2127564251422882,
0.015180228278040886,
0.02588031440973282,
0.09975378215312958,
-0.0022460336331278086,
0.03763829171657562,
0.08993043005466461,
-0.04095226153731346,
-0.008059768937528133,
-0.08511064946651459,
-0.13281764090061188,
-0.11312128603458405,
0.11767423897981644,
-0.041329462081193924,
-0.04302813857793808,
0.04545550048351288,
0.1405964195728302,
0.03671431541442871,
0.14730776846408844,
0.048135679215192795,
-0.27298590540885925,
-0.015735674649477005,
0.0007868654211051762,
-0.0031899320892989635,
-0.07174096256494522,
-0.007171484176069498,
0.06362409144639969,
-0.08766285330057144,
0.0053174979984760284,
-0.0431763231754303,
0.07813750952482224,
-0.056630514562129974,
0.0579616017639637,
-0.0006353941280394793,
0.08631163835525513,
0.004897150211036205,
0.12853994965553284,
-0.21006406843662262,
0.16587349772453308,
0.020033329725265503,
-0.008537322282791138,
-0.10430706292390823,
-0.013377425260841846,
0.02711499109864235,
-0.006091806571930647,
0.07832206040620804,
0.01866881735622883,
-0.020763888955116272,
0.004234026186168194,
-0.02691936492919922,
0.07112111151218414,
0.028149349614977837,
0.0024328611325472593,
0.10051886737346649,
-0.03973366692662239,
0.07716510444879532,
-0.034928545355796814,
0.05552611127495766,
-0.04085210710763931,
-0.17430859804153442,
0.09869524091482162,
-0.07168839871883392,
-0.08000373095273972,
-0.0371062308549881,
-0.05640159174799919,
0.09872297197580338,
0.11208795011043549,
-0.10440406203269958,
-0.08355934172868729,
-0.03136845678091049,
0.01912681944668293,
0.08735889941453934,
-0.08187222480773926,
0.055355254560709,
0.025507980957627296,
0.07553573697805405,
-0.026942620053887367,
-0.12888795137405396,
0.08166686445474625,
-0.08400627225637436,
0.02457018569111824,
-0.021732967346906662,
0.06147199496626854,
0.019295889884233475,
-0.016188563778996468,
0.059791140258312225,
0.0022229060996323824,
-0.10117034614086151,
-0.08183038979768753,
-0.0560392364859581,
0.07163984328508377,
0.03591328486800194,
0.11932160705327988,
-0.1528436243534088,
0.022083140909671783,
-0.02991444244980812,
0.03267329931259155,
0.13820651173591614,
0.08858440816402435,
-0.040205277502536774,
0.08265414088964462,
0.1208500936627388,
-0.09567004442214966,
-0.2689511775970459,
-0.0048515936359763145,
0.05105144903063774,
-0.0035110912285745144,
0.12143009901046753,
-0.24084851145744324,
0.06839264184236526,
0.06200799718499184,
0.010176154784858227,
-0.007916913367807865,
-0.0824209451675415,
-0.06643889099359512,
0.17529912292957306,
-0.002585581736639142,
0.16072405874729156,
-0.07273277640342712,
-0.027233881875872612,
-0.017215769737958908,
-0.06033112481236458,
0.10439146310091019,
-0.024153273552656174,
0.03169503062963486,
-0.011163841001689434,
0.0885748416185379,
0.07020499557256699,
-0.004840393550693989,
0.11114497482776642,
0.038570892065763474,
0.0751546248793602,
-0.06361162662506104,
0.07437533885240555,
0.07466351985931396,
-0.012937551364302635,
0.1295444816350937,
-0.015497271902859211,
0.01825767755508423,
-0.14183709025382996,
0.019694270566105843,
0.007759569678455591,
0.020748518407344818,
-0.007933460175991058,
-0.062274739146232605,
-0.00942873302847147,
0.06362579762935638,
0.03775913268327713,
0.0016117221675813198,
0.08670376241207123,
-0.03140822425484657,
0.12162945419549942,
0.04939640685915947,
0.1766716092824936,
-0.08572234213352203,
-0.003939489368349314,
0.015560293570160866,
0.04882637411355972,
0.12533439695835114,
-0.07714127749204636,
0.057658616453409195,
0.09982267767190933,
-0.027763601392507553,
0.12418505549430847,
0.029241811484098434,
-0.09596043080091476,
-0.032019585371017456,
0.12425833195447922,
-0.1932736337184906,
-0.11683588474988937,
-0.04503799229860306,
0.10914571583271027,
-0.07637714594602585,
0.033490315079689026,
0.1412965953350067,
-0.052117303013801575,
-0.019066238775849342,
0.021749131381511688,
0.0774327963590622,
-0.022728832438588142,
0.10071323812007904,
-0.05240846425294876,
0.058931782841682434,
-0.031958408653736115,
0.038552943617105484,
0.051558803766965866,
-0.08192658424377441,
0.06305133551359177,
0.04953926056623459,
-0.058126505464315414,
-0.066743865609169,
-0.11811242252588272,
0.08640184253454208,
-0.13506601750850677,
-0.041813675314188004,
-0.053702160716056824,
-0.05861480161547661,
-0.034529589116573334,
0.10457509756088257,
0.055400434881448746,
0.04475044831633568,
-0.037749409675598145,
-0.05766012519598007,
0.043533265590667725,
0.08745990693569183,
0.002694705966860056,
0.013176276348531246,
-0.06128764897584915,
0.03275710344314575,
-0.028942028060555458,
0.03371774032711983,
-0.04030998423695564,
-0.06624913215637207,
-0.0772801861166954,
-0.01400755438953638,
-0.13833090662956238,
0.026040909811854362,
-0.07934995740652084,
0.009203589521348476,
0.0365530364215374,
0.09180117398500443,
-0.009756600484251976,
-0.011936976574361324,
-0.0812021940946579,
0.05887932330369949,
-0.001023094286210835,
0.053038712590932846,
-0.10642512142658234,
0.039034850895404816,
0.015494531951844692,
-0.06524921953678131,
0.09430357813835144,
-0.04196758568286896,
-0.03996017202734947,
0.039731841534376144,
-0.05952722579240799,
-0.0332338884472847,
0.03663427382707596,
0.08135461807250977,
-0.0009029421489685774,
-0.14436939358711243,
-0.009882546029984951,
0.05134771391749382,
0.01970471814274788,
-0.030139366164803505,
0.09719354659318924,
-0.07284951210021973,
0.08644150197505951,
-0.014290573075413704,
-0.042650770395994186,
-0.06800384074449539,
0.005492962896823883,
-0.03509954735636711,
0.07583774626255035,
0.1555550992488861,
-0.010628190822899342,
-0.020171696320176125,
-0.07898510992527008,
-0.018949387595057487,
0.019949985668063164,
0.031848110258579254,
-0.08598291873931885,
-0.10055209696292877,
0.03671322762966156,
-0.03825929015874863,
0.21726961433887482,
0.036453939974308014,
-0.08113664388656616,
0.025133419781923294,
0.08151467144489288,
-0.040169596672058105,
0.023187430575489998,
0.019341833889484406,
0.018820539116859436,
-0.02065122500061989,
-0.11505527049303055,
-0.016471533104777336,
-0.06823582202196121,
-0.04027373716235161,
0.1602657437324524,
0.07890143245458603,
0.10681741684675217,
0.011200688779354095,
0.12605981528759003,
0.026116957888007164,
-0.0021089462097734213,
0.0753495916724205,
0.16730792820453644,
0.042900051921606064,
-0.03702521696686745,
0.04216187819838524,
0.21872232854366302,
-0.07220076769590378,
0.1090979054570198,
-0.07111459225416183,
-0.013685361482203007,
-0.13519640266895294,
-0.22547294199466705,
-0.08056747168302536,
-0.1043286845088005,
-0.03880062699317932,
-0.165924534201622,
-0.020873503759503365,
0.13048723340034485,
0.02137686312198639,
-0.08899842202663422,
0.08325152099132538,
-0.12797194719314575,
-0.11503911018371582,
-0.009870757348835468,
0.000640803191345185,
0.11797396838665009,
0.06421154737472534,
0.015058797784149647,
0.034039489924907684,
0.12809748947620392,
0.031850218772888184,
-0.0019001202890649438,
0.052736345678567886,
0.04849077761173248,
-0.11306222528219223,
-0.07256823033094406,
-0.020550604909658432,
-0.04137500748038292,
-0.03230362758040428,
0.078521229326725,
0.009077239781618118,
-0.090579092502594,
0.019846787676215172,
0.20955488085746765,
-0.024252885952591896,
-0.030404018238186836,
-0.1448919028043747,
0.1300981640815735,
0.019947772845625877,
0.08213945478200912,
0.0709058940410614,
-0.04620475322008133,
0.0016809874214231968,
0.022914323955774307,
0.21814608573913574,
-0.1620052456855774,
-0.004850222263485193,
-0.011345288716256618,
0.039545152336359024,
-0.06919077038764954,
0.10259189456701279,
0.008227738551795483,
0.1652698814868927,
0.012108789756894112,
0.002655551303178072,
-0.052013099193573,
0.02164090983569622,
-0.05306901037693024,
-0.012261179275810719,
0.03145013377070427,
-0.0008769721025601029,
-0.04585682228207588,
0.0840231329202652,
0.010216646827757359,
-0.1292802095413208,
-0.13195951282978058,
-0.11207057535648346,
-0.12953948974609375,
-0.06393775343894958,
0.05332051217556,
-0.010269766673445702,
0.057616669684648514,
-0.022730378434062004,
-0.09844086319208145,
0.07562548667192459,
0.04500312730669975,
-0.030317673459649086,
-0.10538181662559509,
0.18457193672657013,
0.043212275952100754,
0.0700850784778595,
0.028903648257255554,
0.033995598554611206,
0.05397837236523628,
0.017002318054437637,
-0.05986928194761276,
0.025105010718107224,
0.011986403726041317,
0.008432870730757713,
0.04429216682910919,
0.047769419848918915,
0.01417104434221983,
0.07296355813741684,
0.12720538675785065,
-0.11284890025854111,
-0.025893153622746468,
0.02746787667274475,
-0.05260923132300377,
0.008534652180969715,
0.06560973823070526,
-0.09371192753314972,
0.14329643547534943,
0.1429693102836609,
-0.031574930995702744,
0.026409460231661797,
-0.05198950320482254,
0.005401000380516052,
-0.010262463241815567,
0.10137854516506195,
-0.04939087852835655,
-0.03530623018741608,
-0.01759803108870983,
-0.15728223323822021,
0.037029147148132324,
-0.26590922474861145,
-0.02600754238665104,
-0.015530072152614594,
0.022165903821587563,
-0.02800454944372177,
0.10431300848722458,
0.12877634167671204,
0.00017254862177651376,
-0.06991267204284668,
-0.0846497043967247,
0.01340812910348177,
0.04966292902827263,
-0.09881801158189774,
-0.03922435641288757
] |
null | null |
transformers
|
# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)
Large version of the trained model (`SYL01-2020-10-24-72K/gpt2-large-train03-72K`) presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* [blog entry](https://debatelab.github.io/journal/critical-thinking-language-models.html)
* [GitHub repo](https://github.com/debatelab/aacorpus)
* [paper](https://arxiv.org/pdf/2009.07185)
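
Since CRiPT is a plain GPT-2 checkpoint, it loads with the standard `transformers` causal-LM API. A minimal sketch (the prompt below is an illustrative example, not taken from the paper):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("DebateLabKIT/cript-large")
model = AutoModelForCausalLM.from_pretrained("DebateLabKIT/cript-large")

prompt = "If all philosophers are mortal and Socrates is a philosopher, then"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```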
|
{"language": "en", "tags": ["gpt2"]}
|
text-generation
|
DebateLabKIT/cript-large
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"arxiv:2009.07185",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2009.07185"
] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)
Large version of the trained model ('SYL01-2020-10-24-72K/gpt2-large-train03-72K') presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* blog entry
* GitHub repo
* paper
|
[
"# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)\nLarge version of the trained model ('SYL01-2020-10-24-72K/gpt2-large-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)\nLarge version of the trained model ('SYL01-2020-10-24-72K/gpt2-large-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
60,
95
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# CRiPT Model Large (Critical Thinking Intermediarily Pretrained Transformer)\nLarge version of the trained model ('SYL01-2020-10-24-72K/gpt2-large-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
-0.07603124529123306,
-0.03028331696987152,
-0.0023902736138552427,
0.012499697506427765,
0.04729164019227028,
-0.012183267623186111,
0.1089683249592781,
0.06236698850989342,
-0.03614889830350876,
-0.026395762339234352,
0.15814489126205444,
0.09993023425340652,
0.0019751719664782286,
0.09614358842372894,
-0.011889266781508923,
-0.31373172998428345,
0.029181834310293198,
0.044353265315294266,
-0.031126508489251137,
0.12885329127311707,
0.1398012340068817,
-0.09941831976175308,
0.052080024033784866,
0.0521065816283226,
-0.13102546334266663,
-0.03300139680504799,
-0.028980830684304237,
-0.0649949312210083,
0.12003500014543533,
0.026719288900494576,
0.05903097242116928,
0.06678996235132217,
-0.009269798174500465,
-0.05053528770804405,
0.05418135225772858,
0.006696393713355064,
-0.014912049286067486,
0.07527042180299759,
0.06843678653240204,
-0.02970476634800434,
0.14145946502685547,
0.028597405180335045,
-0.03410350903868675,
0.040097638964653015,
-0.11845805495977402,
-0.11901786178350449,
-0.04054024815559387,
0.17699559032917023,
0.02209664136171341,
0.08364549279212952,
-0.023060642182826996,
0.1058841347694397,
-0.05706199258565903,
0.09211656451225281,
0.054318204522132874,
-0.2506988048553467,
-0.04598117992281914,
0.14738190174102783,
0.04150548577308655,
0.06622903794050217,
-0.028392404317855835,
0.037384260445833206,
0.05097988247871399,
0.020048348233103752,
-0.02024644799530506,
-0.0706060603260994,
-0.1657221019268036,
0.010419885627925396,
-0.14122578501701355,
-0.011730865575373173,
0.2681788206100464,
-0.01647370494902134,
-0.01648569293320179,
-0.04137913137674332,
-0.09811972081661224,
-0.085562564432621,
-0.000017576059690327384,
-0.10609912127256393,
-0.019907305017113686,
0.04105718806385994,
-0.031770579516887665,
-0.10372386127710342,
-0.11191971600055695,
-0.012540400959551334,
-0.19430270791053772,
0.22597987949848175,
-0.002452345797792077,
0.0260012187063694,
-0.13978703320026398,
0.10856053978204727,
-0.08081594854593277,
-0.08672269433736801,
-0.013039328157901764,
-0.10821035504341125,
0.04605051502585411,
-0.025567255914211273,
-0.04772819206118584,
-0.05895531177520752,
-0.006264090538024902,
0.08259578049182892,
0.06434395909309387,
-0.03799864277243614,
0.03928418457508087,
0.034322258085012436,
0.057657528668642044,
0.12397614866495132,
-0.19320514798164368,
0.008993338793516159,
0.02558053471148014,
-0.019857579842209816,
-0.04241834953427315,
-0.026075486093759537,
-0.15299831330776215,
-0.0717901885509491,
0.07159961014986038,
0.016821764409542084,
0.012787509709596634,
0.10654346644878387,
-0.021339770406484604,
-0.07117068022489548,
-0.0568535290658474,
0.01691165752708912,
-0.028157392516732216,
-0.032735586166381836,
-0.058306653052568436,
0.19310346245765686,
-0.022722583264112473,
0.05028194934129715,
-0.058163080364465714,
0.028489328920841217,
-0.08522005379199982,
-0.03568807989358902,
-0.061981748789548874,
-0.09416358917951584,
0.02980218455195427,
-0.052075210958719254,
0.05041416361927986,
-0.11215262860059738,
-0.14181405305862427,
-0.01748187094926834,
-0.007741947658360004,
-0.04122591391205788,
-0.04949239268898964,
-0.0608273409307003,
-0.07521763443946838,
-0.016862785443663597,
-0.0338718444108963,
0.05761391296982765,
-0.04500892758369446,
0.04187718778848648,
-0.009646148420870304,
0.06792903691530228,
-0.12944115698337555,
0.055665045976638794,
-0.09794803708791733,
-0.011707895435392857,
-0.011954482644796371,
0.0795745700597763,
0.023126062005758286,
0.03421938046813011,
-0.07608724385499954,
-0.06555505096912384,
-0.12535619735717773,
0.05698961764574051,
-0.007915772497653961,
0.16061177849769592,
-0.1554945856332779,
-0.07559896260499954,
0.17764824628829956,
-0.04372454807162285,
-0.14465226233005524,
0.17443034052848816,
-0.02987021394073963,
0.20881801843643188,
0.08632279932498932,
0.12264806032180786,
0.03519913926720619,
0.001621220144443214,
-0.0047483788803219795,
0.0733073502779007,
-0.06854937970638275,
-0.025953862816095352,
0.055114470422267914,
0.06500759720802307,
-0.09501899778842926,
0.030512195080518723,
0.005970090162009001,
-0.007525278255343437,
-0.07340598851442337,
-0.008273151703178883,
0.007476354483515024,
0.007378174923360348,
0.06733524054288864,
-0.01392749696969986,
0.11323346197605133,
-0.07129030674695969,
-0.024729304015636444,
0.0017746574012562633,
0.04687802121043205,
0.027815714478492737,
-0.0019192778272554278,
-0.103455550968647,
0.006439339369535446,
-0.06669123470783234,
0.05148715525865555,
-0.15428975224494934,
-0.0006018213462084532,
-0.03867543116211891,
0.10458442568778992,
0.11789781600236893,
0.24186807870864868,
0.04120219871401787,
-0.08979564905166626,
0.0024262985680252314,
0.017747679725289345,
0.0718725398182869,
-0.01305957231670618,
-0.09745495021343231,
-0.10552115738391876,
0.05952544882893562,
-0.04826023802161217,
0.13820844888687134,
-0.09459702670574188,
0.0201578252017498,
0.08005523681640625,
0.07272844761610031,
-0.03159552067518234,
0.050844524055719376,
0.027059243991971016,
0.005084263160824776,
-0.05724172666668892,
0.04947013780474663,
0.04442214220762253,
-0.021364353597164154,
-0.10556745529174805,
0.15811701118946075,
-0.07080938667058945,
0.205255925655365,
0.15441083908081055,
-0.16033831238746643,
-0.0037152140866965055,
-0.023032892495393753,
-0.01705305092036724,
-0.008152658119797707,
0.05551828816533089,
-0.01440186146646738,
0.22684486210346222,
-0.02313772775232792,
0.11211096495389938,
-0.08655279129743576,
-0.02117062173783779,
-0.0026602307334542274,
-0.05601842701435089,
0.0037591352593153715,
0.12979669868946075,
0.03007529489696026,
-0.20896196365356445,
0.13283342123031616,
0.14864906668663025,
-0.05342748016119003,
0.1654592901468277,
0.028856433928012848,
-0.02170545421540737,
-0.007907663471996784,
-0.053920067846775055,
-0.05689559876918793,
-0.0034443328622728586,
-0.20542794466018677,
-0.027480458840727806,
0.08917108178138733,
0.027810990810394287,
0.0802585631608963,
-0.13255228102207184,
-0.01301946584135294,
0.01984487660229206,
0.018284466117620468,
-0.002865627408027649,
0.12775006890296936,
0.01193778682500124,
0.09670905023813248,
0.013831386342644691,
-0.05456860363483429,
0.08285143971443176,
0.018132677301764488,
-0.08192063122987747,
0.15634965896606445,
-0.05846381559967995,
-0.2957075238227844,
-0.11380990594625473,
-0.06283750385046005,
-0.07909902185201645,
0.003667999291792512,
0.011657400988042355,
-0.08555566519498825,
-0.033702973276376724,
-0.031413476914167404,
0.11800669133663177,
-0.15295153856277466,
0.014554903842508793,
-0.019508933648467064,
0.020440636202692986,
-0.07411065697669983,
-0.09029816836118698,
-0.007157401647418737,
-0.07471416890621185,
-0.0563504733145237,
0.05084122344851494,
-0.09751623868942261,
0.07289350032806396,
0.20229358971118927,
0.0030222833156585693,
0.046178609132766724,
-0.016694895923137665,
0.15064914524555206,
-0.12324635684490204,
0.014592047780752182,
0.2294333130121231,
-0.008473381400108337,
0.028127498924732208,
0.16623587906360626,
0.001338272588327527,
-0.030450815334916115,
0.025605352595448494,
-0.04939337074756622,
-0.05464008077979088,
-0.23393787443637848,
-0.08301294595003128,
-0.10362186282873154,
0.09743784368038177,
0.0526294931769371,
0.022409208118915558,
0.1127764880657196,
0.12290477007627487,
-0.044721972197294235,
0.02747299335896969,
-0.051565930247306824,
0.0945393517613411,
0.2641095519065857,
-0.07218454033136368,
0.11788085103034973,
-0.030676178634166718,
-0.11695937067270279,
0.0888727530837059,
0.02208413928747177,
0.016952447593212128,
0.03724126145243645,
0.13505251705646515,
0.01827841065824032,
0.07389161735773087,
0.09899286925792694,
0.09368577599525452,
-0.02248561941087246,
-0.06156541779637337,
-0.055641647428274155,
-0.026440458372235298,
-0.03747735545039177,
0.07886528968811035,
0.015983080491423607,
-0.036576613783836365,
-0.020441805943846703,
0.013890807516872883,
0.07803647965192795,
0.09996947646141052,
0.11770500242710114,
-0.2366463541984558,
-0.09336619824171066,
0.020884539932012558,
-0.042183469980955124,
-0.09638368338346481,
0.03975343704223633,
0.03953643888235092,
-0.0893767848610878,
0.005244836676865816,
-0.015312295407056808,
0.0982394888997078,
-0.10431957989931107,
0.06926323473453522,
-0.1320882886648178,
0.053817592561244965,
0.002273901365697384,
0.11102709919214249,
-0.20064473152160645,
0.21165038645267487,
-0.005192125681787729,
-0.044380225241184235,
-0.1232948899269104,
-0.05198947712779045,
0.014812695793807507,
0.1279306262731552,
0.053147830069065094,
-0.021866075694561005,
0.006200897041708231,
0.054710131138563156,
0.003268450265750289,
0.02808155119419098,
0.09713300317525864,
-0.07865633070468903,
0.05649212747812271,
-0.036716438829898834,
0.022699125111103058,
0.019629335030913353,
-0.024789191782474518,
-0.01671483740210533,
-0.09488409757614136,
0.09088152647018433,
0.035521987825632095,
0.09058434516191483,
-0.03170781582593918,
-0.09268883615732193,
-0.08609650284051895,
0.21950416266918182,
-0.030778275802731514,
-0.035053279250860214,
-0.08719637989997864,
0.10658206790685654,
0.07888752222061157,
-0.07504451274871826,
-0.013799982145428658,
-0.07044664770364761,
0.04156003147363663,
-0.04522904381155968,
-0.0661909282207489,
0.13738960027694702,
-0.11333272606134415,
-0.1294579952955246,
-0.018746791407465935,
0.16798681020736694,
-0.040015459060668945,
0.03009503148496151,
0.0014582129660993814,
-0.006444799248129129,
-0.1227732002735138,
-0.10446308553218842,
-0.004848443437367678,
-0.009665721096098423,
-0.03486604616045952,
0.049363117665052414,
0.05010101944208145,
0.004336975514888763,
-0.014924850314855576,
-0.007543349638581276,
0.22964991629123688,
0.12118261307477951,
-0.05124639719724655,
0.06289705634117126,
0.17393097281455994,
0.001436398597434163,
-0.32246389985084534,
-0.002358059398829937,
-0.06151572987437248,
-0.00004050566712976433,
-0.06810709834098816,
-0.15862567722797394,
0.11026329547166824,
0.04574712738394737,
-0.017084570601582527,
0.038877177983522415,
-0.27310675382614136,
-0.11008642613887787,
0.13194674253463745,
-0.036867909133434296,
0.4093899130821228,
-0.06541343033313751,
-0.014817931689321995,
-0.049046650528907776,
-0.06813120096921921,
0.07206171005964279,
-0.09643544256687164,
0.09106428176164627,
-0.014712131582200527,
0.14061501622200012,
0.04128055274486542,
-0.0148992408066988,
0.08592264354228973,
-0.03819854557514191,
-0.011574958451092243,
-0.08275660872459412,
-0.07748937606811523,
0.06182461231946945,
0.009013106115162373,
0.06522911041975021,
-0.015036966651678085,
0.05230202525854111,
-0.025333961471915245,
-0.08568605035543442,
-0.0596100352704525,
0.05220954865217209,
0.008977899327874184,
-0.09462965279817581,
-0.08917796611785889,
0.024458207190036774,
-0.05802305042743683,
-0.0193491093814373,
0.11411375552415848,
-0.06015635281801224,
0.02152738906443119,
0.09820351749658585,
0.24157801270484924,
-0.15134093165397644,
0.03519720956683159,
0.018305983394384384,
-0.04735372215509415,
0.08513356745243073,
-0.13807158172130585,
-0.006604955065995455,
0.13450147211551666,
0.03533608838915825,
0.09426087886095047,
0.08938498795032501,
-0.05826672539114952,
0.0013849983224645257,
0.03569412603974342,
-0.16426897048950195,
-0.2062777578830719,
-0.0513288639485836,
0.07336132228374481,
-0.073883555829525,
0.12634821236133575,
0.18580415844917297,
-0.07024992257356644,
-0.019229622557759285,
-0.004996004514396191,
0.0009150971891358495,
-0.02235640585422516,
0.03342010825872421,
0.05477525666356087,
0.033034779131412506,
-0.10591194033622742,
-0.0024245306849479675,
0.03822053223848343,
-0.08510981500148773,
-0.01594599336385727,
0.09763763844966888,
-0.10958746820688248,
-0.0970495194196701,
-0.10406430810689926,
0.010440150275826454,
-0.07706423103809357,
-0.02732657827436924,
-0.013255109079182148,
-0.1520024538040161,
0.05601448193192482,
0.1335359513759613,
0.08607639372348785,
0.020330283790826797,
-0.1128169521689415,
-0.031058911234140396,
-0.0579192154109478,
0.05411325767636299,
0.04413539916276932,
0.004867773037403822,
-0.09524277597665787,
0.05207766592502594,
-0.029243966564536095,
0.10270491987466812,
-0.0729379653930664,
-0.0337899848818779,
-0.09650164842605591,
0.03094748966395855,
-0.14989346265792847,
-0.09978906065225601,
-0.10056401789188385,
-0.03668120875954628,
0.0018170183757320046,
-0.06742629408836365,
-0.06928703188896179,
-0.004107742570340633,
-0.11942926049232483,
0.05482009798288345,
-0.009186072275042534,
0.06462413817644119,
-0.009936661459505558,
0.04427536949515343,
0.04186013713479042,
0.002872234210371971,
0.14135125279426575,
0.08894949406385422,
-0.05297296121716499,
0.08549395948648453,
-0.1291663497686386,
-0.01814674399793148,
0.07172421365976334,
0.04023505002260208,
0.047805409878492355,
-0.004445761442184448,
0.00415800279006362,
0.04744470492005348,
0.06514327228069305,
0.042989082634449005,
-0.006639887113124132,
-0.059814393520355225,
0.045106109231710434,
0.0003002237353939563,
-0.05572277680039406,
-0.06853897124528885,
0.013134884648025036,
0.007580136880278587,
0.08466560393571854,
0.13711336255073547,
-0.034026481211185455,
0.04678875580430031,
-0.07932168990373611,
0.06995569169521332,
-0.03932734578847885,
-0.1306246668100357,
-0.009235573932528496,
-0.11693209409713745,
0.06262961775064468,
-0.01586815156042576,
0.23695535957813263,
-0.0074532669968903065,
-0.08585801720619202,
0.020003560930490494,
0.03951742500066757,
0.02559317648410797,
0.00874719861894846,
0.13366851210594177,
0.10514429956674576,
-0.01662987284362316,
-0.07552529871463776,
0.08579768985509872,
0.0647672787308693,
0.12342210859060287,
0.1562976837158203,
0.043072786182165146,
0.009751122444868088,
0.11086731404066086,
0.01618501916527748,
0.02539539709687233,
-0.030943630263209343,
-0.04329167306423187,
-0.048657920211553574,
0.025837676599621773,
-0.07064282149076462,
0.12227068841457367,
0.16960673034191132,
0.037100061774253845,
-0.010586571879684925,
-0.011550343595445156,
-0.03075406514108181,
-0.106480173766613,
-0.16601480543613434,
-0.1173834428191185,
-0.11775057762861252,
-0.028375037014484406,
-0.13420049846172333,
-0.026293443515896797,
0.06532430648803711,
0.12124446779489517,
-0.07975484430789948,
0.08647116273641586,
0.01617239974439144,
-0.12215577065944672,
0.10607800632715225,
-0.03112724982202053,
0.051539015024900436,
-0.07240454107522964,
-0.015539604239165783,
-0.08762381970882416,
0.043436456471681595,
0.02045041136443615,
0.0023363379295915365,
-0.03025435283780098,
-0.0641164630651474,
-0.06508074700832367,
-0.08930210769176483,
-0.08378463983535767,
0.08764006942510605,
0.0707060918211937,
0.11082908511161804,
0.03337802737951279,
-0.052977148443460464,
0.003375160275027156,
0.13241426646709442,
-0.04394752159714699,
-0.08863143622875214,
-0.07642024010419846,
0.1419180929660797,
-0.05486924946308136,
0.0538007989525795,
-0.018338670954108238,
-0.02536068856716156,
-0.034262482076883316,
0.27851203083992004,
0.32980453968048096,
-0.13780392706394196,
-0.01086305733770132,
0.02365221083164215,
0.02802691049873829,
0.04852961748838425,
0.024542462080717087,
0.08586164563894272,
0.2307008057832718,
-0.041188664734363556,
0.022799957543611526,
-0.00869232788681984,
-0.019322991371154785,
0.03177288919687271,
0.07226838171482086,
0.043783560395240784,
-0.03753378987312317,
-0.06335991621017456,
0.10933243483304977,
-0.18788737058639526,
-0.05157316103577614,
-0.07381331920623779,
-0.10545484721660614,
-0.09707614779472351,
0.010857764631509781,
0.028788331896066666,
0.03252939134836197,
0.09696771949529648,
-0.06229059398174286,
-0.061234451830387115,
-0.027011223137378693,
0.03583790361881256,
-0.12341473996639252,
0.03147447481751442,
0.1561158150434494,
0.0030296125914901495,
0.040392886847257614,
-0.02998274937272072,
0.09420932829380035,
0.13773155212402344,
-0.015850555151700974,
-0.09653182327747345,
0.00004287560659577139,
0.023479172959923744,
-0.03446592763066292,
0.010536916553974152,
0.032029807567596436,
0.05060148611664772,
-0.04480923339724541,
0.13465894758701324,
-0.24274109303951263,
0.015028130263090134,
-0.053711410611867905,
-0.04599187150597572,
-0.10957460105419159,
0.05715722218155861,
-0.0793704092502594,
0.10805145651102066,
0.19367140531539917,
-0.060796160250902176,
-0.05587698519229889,
-0.035819754004478455,
-0.005411997437477112,
-0.01110776700079441,
0.027187857776880264,
-0.02276994287967682,
-0.1250007003545761,
-0.06612206995487213,
0.06344366073608398,
-0.004071300849318504,
-0.20015862584114075,
-0.0025234012864530087,
-0.06060950458049774,
0.02133246324956417,
-0.017385542392730713,
0.043617699295282364,
0.09454627335071564,
0.016786575317382812,
0.021767573431134224,
0.0075239334255456924,
-0.019690508022904396,
0.037260107696056366,
-0.11686406284570694,
-0.07373166084289551
] |
null | null |
transformers
|
# CRiPT Model Medium (Critical Thinking Intermediarily Pretrained Transformer)
Medium version of the trained model (`SYL01-2020-10-24-72K/gpt2-medium-train03-72K`) presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* [blog entry](https://debatelab.github.io/journal/critical-thinking-language-models.html)
* [GitHub repo](https://github.com/debatelab/aacorpus)
* [paper](https://arxiv.org/pdf/2009.07185)
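
The card does not include a usage snippet; since this is a GPT-2 checkpoint, a minimal generation sketch with the Transformers library might look like the following, assuming the hub id `DebateLabKIT/cript-medium` and a purely illustrative prompt:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the medium CRiPT checkpoint (a GPT-2 model) from the hub.
tokenizer = AutoTokenizer.from_pretrained("DebateLabKIT/cript-medium")
model = AutoModelForCausalLM.from_pretrained("DebateLabKIT/cript-medium")

# Greedy continuation of a simple syllogistic prompt.
prompt = "All philosophers are mortal. Socrates is a philosopher. Therefore,"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```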
|
{"language": "en", "tags": ["gpt2"]}
|
text-generation
|
DebateLabKIT/cript-medium
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"arxiv:2009.07185",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2009.07185"
] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# CRiPT Model Medium (Critical Thinking Intermediarily Pretrained Transformer)
Medium version of the trained model ('SYL01-2020-10-24-72K/gpt2-medium-train03-72K') presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* blog entry
* GitHub repo
* paper
|
[
"# CRiPT Model Medium (Critical Thinking Intermediarily Pretrained Transformer)\nMedium version of the trained model ('SYL01-2020-10-24-72K/gpt2-medium-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# CRiPT Model Medium (Critical Thinking Intermediarily Pretrained Transformer)\nMedium version of the trained model ('SYL01-2020-10-24-72K/gpt2-medium-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
60,
95
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# CRiPT Model Medium (Critical Thinking Intermediarily Pretrained Transformer)\nMedium version of the trained model ('SYL01-2020-10-24-72K/gpt2-medium-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n * blog entry\n * GitHub repo\n * paper"
] |
[
-0.06711281836032867,
-0.029232176020741463,
-0.0028422533068805933,
-0.0057940129190683365,
0.06078780069947243,
-0.01273230742663145,
0.11520853638648987,
0.05540012940764427,
-0.05625410005450249,
-0.04872602969408035,
0.14577616751194,
0.12109329551458359,
0.005232458934187889,
0.08001832664012909,
-0.024897098541259766,
-0.31770163774490356,
0.027425669133663177,
0.04261474683880806,
0.019372060894966125,
0.1492837369441986,
0.15030063688755035,
-0.08990494161844254,
0.03629334643483162,
0.05755128338932991,
-0.1249324381351471,
-0.0350177027285099,
-0.016606003046035767,
-0.06039438769221306,
0.12107923626899719,
0.025880970060825348,
0.0761832445859909,
0.08917172998189926,
-0.014475117437541485,
-0.09068875759840012,
0.05719649791717529,
0.018990514799952507,
-0.02768533304333687,
0.06130649894475937,
0.058851417154073715,
-0.07322458922863007,
0.1598287671804428,
0.05926531180739403,
-0.02284730225801468,
0.047163378447294235,
-0.14824149012565613,
-0.08736617118120193,
-0.048139020800590515,
0.16535550355911255,
0.04538966715335846,
0.08809540420770645,
-0.03133537620306015,
0.12045934796333313,
-0.03879302367568016,
0.10741251707077026,
0.05512874945998192,
-0.25933265686035156,
-0.05856403708457947,
0.1491430103778839,
0.05976230651140213,
0.08727696537971497,
-0.038183774799108505,
0.052308812737464905,
0.02948145568370819,
0.030135534703731537,
-0.03359808400273323,
-0.07421194761991501,
-0.1442245990037918,
0.009615053422749043,
-0.14568378031253815,
-0.007792665623128414,
0.27637356519699097,
-0.032826874405145645,
-0.02609439194202423,
-0.025971077382564545,
-0.08892165869474411,
-0.08356546610593796,
0.005266656633466482,
-0.1155458390712738,
-0.037699438631534576,
0.0579678937792778,
-0.017928387969732285,
-0.10646792501211166,
-0.11498009413480759,
-0.008941426873207092,
-0.1679793745279312,
0.25947701930999756,
-0.009647347033023834,
0.02351396530866623,
-0.11034385114908218,
0.13340836763381958,
-0.036633484065532684,
-0.09431634098291397,
-0.007578694727271795,
-0.11347139626741409,
0.05100470408797264,
-0.025856800377368927,
-0.05602966248989105,
-0.12842516601085663,
-0.02334030345082283,
0.094053715467453,
0.04909650608897209,
-0.010600839741528034,
0.03214804455637932,
0.033433884382247925,
0.05803078040480614,
0.09275969862937927,
-0.1481359302997589,
0.014649292454123497,
0.015865383669734,
-0.02246362343430519,
-0.03579678758978844,
-0.038979608565568924,
-0.13661114871501923,
-0.04922911524772644,
0.0712294727563858,
0.018521632999181747,
0.024280056357383728,
0.11203431338071823,
-0.03461071848869324,
-0.07869482040405273,
-0.06569860130548477,
0.007181515451520681,
-0.04027377441525459,
-0.0515778623521328,
-0.04440348595380783,
0.19417181611061096,
-0.0055986372753977776,
0.0708574578166008,
-0.06933237612247467,
0.031864359974861145,
-0.07298922538757324,
-0.0259366724640131,
-0.06646419316530228,
-0.09653017669916153,
0.02531234547495842,
-0.08009840548038483,
0.03431813791394234,
-0.10838615149259567,
-0.15882422029972076,
-0.03497079759836197,
0.02949926070868969,
-0.033021364361047745,
-0.041031938046216965,
-0.07457900047302246,
-0.08469109237194061,
0.010127687826752663,
-0.034172821789979935,
0.04172210395336151,
-0.04476441815495491,
0.041798919439315796,
-0.016026178374886513,
0.05003773793578148,
-0.12163657695055008,
0.05584680661559105,
-0.07428330183029175,
-0.008346370421350002,
-0.03133947402238846,
0.07707425206899643,
0.0033045762684196234,
0.03806905820965767,
-0.07758114486932755,
-0.045830514281988144,
-0.10938646644353867,
0.055745646357536316,
-0.010662592947483063,
0.18329080939292908,
-0.14747419953346252,
-0.0663282498717308,
0.1576516181230545,
-0.06410350650548935,
-0.11539895832538605,
0.15198974311351776,
-0.03336016461253166,
0.20163945853710175,
0.09642039239406586,
0.1163715124130249,
0.04000026732683182,
-0.03962836042046547,
0.009754911996424198,
0.06845898181200027,
-0.055431269109249115,
0.006789852865040302,
0.04947838559746742,
0.07655362039804459,
-0.09923478215932846,
0.03685447946190834,
-0.006290836725383997,
-0.0064128367230296135,
-0.07987887412309647,
-0.012538090348243713,
0.01741492748260498,
0.02251017838716507,
0.09924153238534927,
-0.0016170684248209,
0.09825276583433151,
-0.07034389674663544,
-0.04222552105784416,
0.013413864187896252,
0.029571499675512314,
0.026282237842679024,
0.00603922363370657,
-0.10832293331623077,
-0.02179390750825405,
-0.0494115985929966,
0.05031503364443779,
-0.1636778563261032,
-0.011375936679542065,
-0.05653700605034828,
0.13478776812553406,
0.11991123855113983,
0.23132410645484924,
0.035591285675764084,
-0.08762725442647934,
-0.013604125007987022,
0.0042639742605388165,
0.06781386584043503,
0.0016907352255657315,
-0.09162937849760056,
-0.13566026091575623,
0.07013300806283951,
-0.057442136108875275,
0.13716192543506622,
-0.08403848856687546,
0.03329095244407654,
0.08837217837572098,
0.11104727536439896,
-0.03228813782334328,
0.040374886244535446,
0.024960769340395927,
0.011728941462934017,
-0.04489895701408386,
0.049047909677028656,
0.06103988364338875,
-0.003866247832775116,
-0.10620062053203583,
0.18175393342971802,
-0.06548076868057251,
0.19349759817123413,
0.15086548030376434,
-0.20786398649215698,
-0.0044651576317846775,
-0.039567481726408005,
-0.0076490286737680435,
-0.012693806551396847,
0.06388745456933975,
-0.005331041757017374,
0.2456146776676178,
-0.010287093929946423,
0.1060318648815155,
-0.08316683769226074,
-0.03306737169623375,
-0.00859853345900774,
-0.06542830914258957,
-0.01771901734173298,
0.12811559438705444,
0.006139142904430628,
-0.20518910884857178,
0.12847350537776947,
0.14521226286888123,
-0.02598733641207218,
0.16437454521656036,
0.03130306303501129,
0.00901719368994236,
-0.010922443121671677,
-0.0510384775698185,
-0.06559896469116211,
-0.0007837694138288498,
-0.21150155365467072,
-0.041097261011600494,
0.0678814947605133,
0.030177338048815727,
0.08838160336017609,
-0.14875774085521698,
-0.017556237056851387,
0.018843775615096092,
0.013812697492539883,
0.01447801198810339,
0.136469304561615,
-0.013760866597294807,
0.09001918137073517,
0.011549881659448147,
-0.02766856923699379,
0.07942592352628708,
0.011361476965248585,
-0.09935367852449417,
0.15175938606262207,
-0.10523560643196106,
-0.31369268894195557,
-0.10351423174142838,
-0.0852832943201065,
-0.0805647000670433,
0.002727421233430505,
0.02349996007978916,
-0.08962912857532501,
-0.03882114961743355,
-0.0218683872371912,
0.09425130486488342,
-0.17465242743492126,
0.006337163504213095,
-0.01369544118642807,
0.007229536771774292,
-0.0668892040848732,
-0.08508090674877167,
-0.013880366459488869,
-0.0640735924243927,
-0.05467037484049797,
0.06278755515813828,
-0.11309273540973663,
0.07660580426454544,
0.19853779673576355,
0.017629673704504967,
0.054619014263153076,
-0.022105839103460312,
0.19062551856040955,
-0.12223803997039795,
0.028994260355830193,
0.2225148230791092,
-0.012091170996427536,
0.04170640558004379,
0.17734426259994507,
-0.006860526278614998,
-0.02656867355108261,
0.026520801708102226,
-0.04672408103942871,
-0.055910591036081314,
-0.24651053547859192,
-0.08697421848773956,
-0.09646425396203995,
0.049102626740932465,
0.017095353454351425,
0.025568878278136253,
0.12582242488861084,
0.10437219589948654,
-0.05634754151105881,
0.0152336610481143,
-0.01112675666809082,
0.10378731042146683,
0.2831652760505676,
-0.08411378413438797,
0.13356763124465942,
-0.018399860709905624,
-0.08925553411245346,
0.07693894952535629,
0.03914059326052666,
0.01761496812105179,
0.07333750277757645,
0.10166920721530914,
0.0489436499774456,
0.053416091948747635,
0.10682369768619537,
0.06508061289787292,
-0.00547598022967577,
-0.050747133791446686,
-0.0490780770778656,
-0.029236873611807823,
-0.0546787790954113,
0.07716182619333267,
-0.015256802551448345,
-0.0562468059360981,
-0.014041687361896038,
0.005687274504452944,
0.09315221011638641,
0.09240739792585373,
0.086088165640831,
-0.19889047741889954,
-0.09908709675073624,
0.02533043548464775,
-0.04579918459057808,
-0.10463177412748337,
0.03224322944879532,
0.008987943641841412,
-0.09223411977291107,
0.02541770040988922,
-0.03761793673038483,
0.11079497635364532,
-0.09787100553512573,
0.06000688672065735,
-0.12443450093269348,
0.025501074269413948,
-0.005688662175089121,
0.09700740873813629,
-0.22198385000228882,
0.23855359852313995,
0.00036526494659483433,
-0.02729022689163685,
-0.09930544346570969,
-0.06042265146970749,
0.003538688877597451,
0.17474082112312317,
0.058620717376470566,
-0.012325608171522617,
-0.01819424331188202,
0.05208719149231911,
0.03229818120598793,
0.030142076313495636,
0.12529653310775757,
-0.05363652855157852,
0.053755875676870346,
-0.03069177456200123,
0.030688198283314705,
0.02236228436231613,
-0.01629534922540188,
-0.0689050629734993,
-0.10460442304611206,
0.10852240025997162,
0.03335164859890938,
0.08206344395875931,
-0.040145497769117355,
-0.09288331866264343,
-0.07151129841804504,
0.21199925243854523,
-0.07305199652910233,
-0.04411445930600166,
-0.10899709165096283,
0.08542125672101974,
0.04798712953925133,
-0.07099103927612305,
-0.01627480424940586,
-0.07852818816900253,
0.03941002115607262,
-0.054147541522979736,
-0.07423566281795502,
0.12095338851213455,
-0.11807311326265335,
-0.1175868883728981,
-0.03696915879845619,
0.17819228768348694,
-0.0014346300158649683,
0.023206887766718864,
0.0009522509062662721,
0.005444012116640806,
-0.11552978307008743,
-0.10754048824310303,
0.007351058069616556,
-0.0035558680538088083,
-0.023700885474681854,
0.016202682629227638,
0.04551146924495697,
-0.005714569706469774,
-0.03507460281252861,
-0.037541426718235016,
0.20037822425365448,
0.11123114824295044,
-0.04759440943598747,
0.08562018722295761,
0.19325724244117737,
-0.031451206654310226,
-0.3102516233921051,
-0.0038106755819171667,
-0.048344653099775314,
-0.00899056252092123,
-0.08633590489625931,
-0.1669086217880249,
0.08753837645053864,
0.013107547536492348,
-0.01696464605629444,
0.02492322027683258,
-0.25920918583869934,
-0.11312134563922882,
0.13282112777233124,
-0.004290768876671791,
0.3983354866504669,
-0.07490820437669754,
-0.03245769068598747,
-0.048153284937143326,
-0.12064351886510849,
0.06588603556156158,
-0.09937047213315964,
0.08423938602209091,
-0.011654810048639774,
0.1314312219619751,
0.029892701655626297,
-0.012637909501791,
0.08758123964071274,
-0.029296601191163063,
-0.01671026647090912,
-0.07713621109724045,
-0.08462613821029663,
0.04350513964891434,
0.013549858704209328,
0.034604769200086594,
-0.026282375678420067,
0.06020599976181984,
-0.038526296615600586,
-0.08127461373806,
-0.051286254078149796,
0.04012994468212128,
-0.0036539663560688496,
-0.10017620027065277,
-0.07839596271514893,
0.029527222737669945,
-0.04265039041638374,
-0.01956683211028576,
0.10915612429380417,
-0.09215934574604034,
0.08114761859178543,
0.12501214444637299,
0.2297518253326416,
-0.1735011637210846,
0.06056372821331024,
0.0012978487648069859,
-0.05160418525338173,
0.07904001325368881,
-0.1351429969072342,
-0.020203666761517525,
0.13808409869670868,
0.02270105853676796,
0.11857184767723083,
0.08343584090471268,
-0.0690641850233078,
-0.011079437099397182,
0.032636791467666626,
-0.19248969852924347,
-0.16443006694316864,
-0.07033956050872803,
0.09434714913368225,
-0.044043585658073425,
0.12822148203849792,
0.20769959688186646,
-0.07135847955942154,
-0.014274420216679573,
-0.005961509421467781,
0.0037974321749061346,
-0.020410850644111633,
0.023319412022829056,
0.05064202472567558,
0.03485477715730667,
-0.08658794313669205,
-0.005038073752075434,
0.033516865223646164,
-0.08899084478616714,
-0.026345042511820793,
0.10367133468389511,
-0.12284324318170547,
-0.10767082870006561,
-0.11204644292593002,
0.003171426011249423,
-0.08486192673444748,
-0.020461903885006905,
-0.009960952214896679,
-0.14390714466571808,
0.05691942200064659,
0.1720888465642929,
0.07468074560165405,
0.017036622390151024,
-0.11668193340301514,
-0.02939441427588463,
-0.019191613420844078,
0.04898621141910553,
0.05079104006290436,
-0.015376482158899307,
-0.08693690598011017,
0.0666547566652298,
0.00029934829217381775,
0.08413136005401611,
-0.07279234379529953,
-0.04621488228440285,
-0.10549002885818481,
0.048130057752132416,
-0.11704157292842865,
-0.0935479924082756,
-0.10516660660505295,
-0.040299467742443085,
0.009589413180947304,
-0.08176688104867935,
-0.07561225444078445,
0.003755844896659255,
-0.11587129533290863,
0.06108970195055008,
-0.0026558239478617907,
0.05319574847817421,
-0.008918299339711666,
0.03883720934391022,
0.04654877632856369,
0.00011078554234700277,
0.11048182845115662,
0.09421445429325104,
-0.05831462889909744,
0.10712987184524536,
-0.16633160412311554,
0.006963993888348341,
0.06751490384340286,
0.031188545748591423,
0.02653857320547104,
-0.00020174297969788313,
-0.014212382026016712,
0.058924220502376556,
0.06982828676700592,
0.05214724689722061,
0.011751511134207249,
-0.05701560527086258,
0.0315852053463459,
0.01284363865852356,
-0.07214276492595673,
-0.07961719483137131,
0.02439083717763424,
0.046246487647295,
0.08673110604286194,
0.1508645862340927,
-0.05372849479317665,
0.05497199669480324,
-0.06938184797763824,
0.07602193206548691,
-0.014834536239504814,
-0.12883982062339783,
-0.01048737671226263,
-0.12970517575740814,
0.04977969825267792,
-0.031456269323825836,
0.22937707602977753,
0.004003164358437061,
-0.0687856674194336,
0.03746913745999336,
0.034312717616558075,
-0.03435233607888222,
0.008417192846536636,
0.09073027968406677,
0.11060978472232819,
-0.004669281654059887,
-0.08161155879497528,
0.07625574618577957,
0.0322316475212574,
0.054894380271434784,
0.15145091712474823,
0.037746503949165344,
0.030196163803339005,
0.11819876730442047,
-0.00248259911313653,
0.038945022970438004,
-0.03800780698657036,
-0.05774892121553421,
-0.04939783737063408,
0.04070143401622772,
-0.07436098903417587,
0.161336749792099,
0.1764369010925293,
0.009779566898941994,
-0.004036430269479752,
-0.01382817979902029,
-0.0403732992708683,
-0.12198114395141602,
-0.20312893390655518,
-0.10094437748193741,
-0.12170752882957458,
-0.030690189450979233,
-0.12691277265548706,
-0.0037356060929596424,
0.02259536273777485,
0.11519703269004822,
-0.06294500082731247,
0.07012666761875153,
0.02933157980442047,
-0.12865912914276123,
0.11241382360458374,
-0.0240852702409029,
0.06692638248205185,
-0.09950094670057297,
0.00505502475425601,
-0.07405008375644684,
0.04407161846756935,
0.0019117261981591582,
0.0011229426600039005,
-0.039915647357702255,
-0.07224644720554352,
-0.06460252404212952,
-0.09899214655160904,
-0.0701194703578949,
0.10013068467378616,
0.06441511958837509,
0.1357521265745163,
0.029690876603126526,
-0.04143702611327171,
0.008608385920524597,
0.15053711831569672,
-0.026406746357679367,
-0.0889919325709343,
-0.08374430239200592,
0.1355525702238083,
-0.05090514197945595,
0.049382418394088745,
-0.008534599095582962,
-0.020638981834053993,
-0.02086418680846691,
0.2684653401374817,
0.3256314694881439,
-0.12425132095813751,
-0.007571731694042683,
0.019420452415943146,
0.03487220034003258,
0.08204185962677002,
0.016044529154896736,
0.07757943123579025,
0.22102659940719604,
-0.0417451336979866,
0.025340113788843155,
-0.03366594389081001,
-0.036607980728149414,
0.07131527364253998,
0.06592385470867157,
0.04276266321539879,
-0.05083955079317093,
-0.07307419925928116,
0.12998181581497192,
-0.17923839390277863,
-0.050834428519010544,
-0.06593536585569382,
-0.10177560895681381,
-0.07092003524303436,
0.0047014020383358,
0.03670545667409897,
0.018755819648504257,
0.07671386003494263,
-0.045571133494377136,
-0.06815695017576218,
-0.004902181681245565,
0.0457875058054924,
-0.10215328633785248,
0.020507484674453735,
0.16977936029434204,
0.003549573477357626,
0.025379624217748642,
-0.025699004530906677,
0.09025806933641434,
0.12899473309516907,
0.0027916417457163334,
-0.0739327147603035,
0.016198061406612396,
0.025438914075493813,
-0.06143443286418915,
0.007318821270018816,
0.04370298236608505,
0.06017765402793884,
-0.027681497856974602,
0.11622945964336395,
-0.2309940755367279,
0.04355703666806221,
-0.049035780131816864,
-0.0702827200293541,
-0.08306658267974854,
0.08757796883583069,
-0.09198035299777985,
0.10866326838731766,
0.18165183067321777,
-0.04060167819261551,
-0.04805130138993263,
-0.03702192008495331,
-0.01747857592999935,
-0.030031876638531685,
0.02452978678047657,
-0.012017587199807167,
-0.1280127763748169,
-0.058002401143312454,
0.08758507668972015,
-0.0016203722916543484,
-0.1970573365688324,
-0.01311541348695755,
-0.06519002467393875,
0.01879953220486641,
-0.024212799966335297,
0.056500039994716644,
0.0879427120089531,
0.006180136930197477,
0.01915914937853813,
0.00566734978929162,
0.0001764095068210736,
0.04443074390292168,
-0.10816487669944763,
-0.05472104623913765
] |
null | null |
transformers
|
# CRiPT Model (Critical Thinking Intermediarily Pretrained Transformer)
Small version of the trained model (`SYL01-2020-10-24-72K/gpt2-small-train03-72K`) presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* [blog entry](https://debatelab.github.io/journal/critical-thinking-language-models.html)
* [GitHub repo](https://github.com/debatelab/aacorpus)
* [paper](https://arxiv.org/pdf/2009.07185)
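
As with the medium variant, a minimal sketch for running this checkpoint via the `pipeline` API, assuming the hub id `DebateLabKIT/cript` and an illustrative prompt:

```python
from transformers import pipeline

# Text-generation pipeline over the small CRiPT (GPT-2) checkpoint.
generator = pipeline("text-generation", model="DebateLabKIT/cript")

result = generator(
    "Every critical thinker questions assumptions. Ada is a critical thinker. Therefore,",
    max_new_tokens=40,
    do_sample=False,  # greedy decoding for a deterministic continuation
)
print(result[0]["generated_text"])
```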
|
{"language": "en", "tags": ["gpt2"]}
|
text-generation
|
DebateLabKIT/cript
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"arxiv:2009.07185",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2009.07185"
] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# CRiPT Model (Critical Thinking Intermediarily Pretrained Transformer)
Small version of the trained model ('SYL01-2020-10-24-72K/gpt2-small-train03-72K') presented in the paper "Critical Thinking for Language Models" (Betz, Voigt and Richardson 2020). See also:
* blog entry
* GitHub repo
* paper
|
[
"# CRiPT Model (Critical Thinking Intermediarily Pretrained Transformer)\n\nSmall version of the trained model ('SYL01-2020-10-24-72K/gpt2-small-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n\n * blog entry\n * GitHub repo\n * paper"
] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# CRiPT Model (Critical Thinking Intermediarily Pretrained Transformer)\n\nSmall version of the trained model ('SYL01-2020-10-24-72K/gpt2-small-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n\n * blog entry\n * GitHub repo\n * paper"
] |
[
60,
94
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #en #arxiv-2009.07185 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# CRiPT Model (Critical Thinking Intermediarily Pretrained Transformer)\n\nSmall version of the trained model ('SYL01-2020-10-24-72K/gpt2-small-train03-72K') presented in the paper \"Critical Thinking for Language Models\" (Betz, Voigt and Richardson 2020). See also:\n\n * blog entry\n * GitHub repo\n * paper"
] |
[
-0.05782853439450264,
-0.024860363453626633,
-0.0023736292496323586,
0.0015833426732569933,
0.05232420563697815,
-0.023393305018544197,
0.12017467617988586,
0.08395708352327347,
-0.05568394437432289,
-0.036223333328962326,
0.1514887809753418,
0.11987452208995819,
0.005461951717734337,
0.09001326560974121,
-0.027128078043460846,
-0.3087475001811981,
0.040049728006124496,
0.05779827758669853,
0.014775318093597889,
0.12884347140789032,
0.15307976305484772,
-0.08961529284715652,
0.05004049092531204,
0.06058324873447418,
-0.12331555783748627,
-0.03033771738409996,
-0.009683639742434025,
-0.07447387278079987,
0.10734684020280838,
0.0330023355782032,
0.07067809998989105,
0.07331179827451706,
-0.011059006676077843,
-0.06168179586529732,
0.05388282984495163,
0.016319096088409424,
-0.013040252961218357,
0.06942656636238098,
0.05448026955127716,
-0.04890697821974754,
0.1565966010093689,
0.037764906883239746,
-0.034404218196868896,
0.036283690482378006,
-0.13150952756404877,
-0.10524994879961014,
-0.036933332681655884,
0.16420716047286987,
0.039195142686367035,
0.09126828610897064,
-0.030558858066797256,
0.14323922991752625,
-0.06863241642713547,
0.09076205641031265,
0.07446277886629105,
-0.23252402245998383,
-0.05252915993332863,
0.13659515976905823,
0.036158449947834015,
0.049112506210803986,
-0.03231343626976013,
0.04670995846390724,
0.039772119373083115,
0.025063857436180115,
-0.04488460347056389,
-0.06190364062786102,
-0.16259898245334625,
0.007150237914174795,
-0.13744321465492249,
-0.027923759073019028,
0.273001104593277,
-0.027267174795269966,
-0.024351490661501884,
-0.036181919276714325,
-0.07575676590204239,
-0.09237119555473328,
-0.023300746455788612,
-0.12326057255268097,
-0.029645536094903946,
0.04887453466653824,
-0.026882940903306007,
-0.09385444223880768,
-0.1248030960559845,
-0.012106525711715221,
-0.16174712777137756,
0.25067102909088135,
-0.0028136526234447956,
0.014705246314406395,
-0.14594759047031403,
0.10471752285957336,
-0.05248735100030899,
-0.07720719277858734,
-0.018249254673719406,
-0.11323249340057373,
0.09365811944007874,
-0.025300223380327225,
-0.06301633268594742,
-0.08432513475418091,
-0.007289008237421513,
0.10211018472909927,
0.0592971034348011,
-0.03074478730559349,
0.05024375021457672,
0.03069298528134823,
0.05149735510349274,
0.1007407009601593,
-0.1493215262889862,
0.027751119807362556,
0.03797917440533638,
-0.025983668863773346,
-0.03065757267177105,
-0.03402923792600632,
-0.14834992587566376,
-0.056638747453689575,
0.09051833301782608,
0.025151988491415977,
0.03513257950544357,
0.11659859120845795,
-0.027910597622394562,
-0.07235337048768997,
-0.05888593569397926,
0.009591721929609776,
-0.03874898701906204,
-0.03627119958400726,
-0.04815090820193291,
0.1826316863298416,
-0.012794965878129005,
0.06176679953932762,
-0.08805660903453827,
0.01053373608738184,
-0.07495688647031784,
-0.034820202738046646,
-0.05483477562665939,
-0.09891442209482193,
0.022962132468819618,
-0.05532247945666313,
0.05361824855208397,
-0.12080969661474228,
-0.13568682968616486,
-0.027525531128048897,
0.0030180513858795166,
-0.03791310638189316,
-0.053084712475538254,
-0.08154730498790741,
-0.09215032309293747,
-0.00009235815377905965,
-0.02453482337296009,
0.028699928894639015,
-0.049301210790872574,
0.03948669135570526,
-0.012910387478768826,
0.06198294088244438,
-0.1242770329117775,
0.058897458016872406,
-0.09692563116550446,
-0.023554665967822075,
-0.017562279477715492,
0.08338106423616409,
0.005200005602091551,
0.02583887241780758,
-0.08216938376426697,
-0.05022290349006653,
-0.12703964114189148,
0.06518600881099701,
-0.008554259315133095,
0.18175725638866425,
-0.14102967083454132,
-0.0819321796298027,
0.15733258426189423,
-0.06672252714633942,
-0.11712825298309326,
0.1539495438337326,
-0.026907287538051605,
0.23640067875385284,
0.09635750949382782,
0.1453048586845398,
0.0590025968849659,
-0.006712723523378372,
-0.0036109364591538906,
0.08409483730792999,
-0.07045592367649078,
-0.01306852512061596,
0.03869000822305679,
0.07272758334875107,
-0.09375136345624924,
0.033742260187864304,
-0.018611593171954155,
-0.013345531187951565,
-0.07305430620908737,
-0.014345016330480576,
0.01298533845692873,
0.011584639549255371,
0.08323502540588379,
0.006786589976400137,
0.12086132913827896,
-0.06646046042442322,
-0.03663965314626694,
0.018162190914154053,
0.0442357137799263,
0.01344653032720089,
-0.0007510647410526872,
-0.12573714554309845,
-0.001997167943045497,
-0.050081536173820496,
0.04808824136853218,
-0.151728093624115,
-0.0003392829967197031,
-0.04363372549414635,
0.1046103686094284,
0.11239294707775116,
0.2313094288110733,
0.04243218153715134,
-0.105067677795887,
0.003208698006346822,
0.02436981350183487,
0.06484592705965042,
0.004885461181402206,
-0.0985068678855896,
-0.11945751309394836,
0.06013481691479683,
-0.04915539175271988,
0.14536501467227936,
-0.0842168927192688,
0.01322441641241312,
0.08382435142993927,
0.09603613615036011,
-0.03255787864327431,
0.04790312424302101,
0.025954466313123703,
0.0022116240579634905,
-0.04663584381341934,
0.03710360825061798,
0.06898979842662811,
-0.006829560734331608,
-0.07506884634494781,
0.16141845285892487,
-0.05543479323387146,
0.15360035002231598,
0.15904739499092102,
-0.18308645486831665,
-0.009268548339605331,
-0.00955888256430626,
-0.01514096837490797,
-0.006075368728488684,
0.06303109228610992,
-0.008615408092737198,
0.23206406831741333,
-0.03079383820295334,
0.1026352122426033,
-0.0941811129450798,
-0.0357569120824337,
-0.005305700469762087,
-0.0699850544333458,
-0.0033350661396980286,
0.12986847758293152,
0.0282623078674078,
-0.20554062724113464,
0.13376949727535248,
0.11951152235269547,
-0.02983885258436203,
0.15412896871566772,
0.02105831913650036,
-0.01079526636749506,
0.003173662116751075,
-0.044104743748903275,
-0.04440142214298248,
-0.014053991995751858,
-0.20643116533756256,
-0.03678230196237564,
0.08404642343521118,
0.020657949149608612,
0.09303770214319229,
-0.14367903769016266,
-0.007919436320662498,
0.022371139377355576,
0.005490225274115801,
0.016956396400928497,
0.12121754884719849,
-0.007796646561473608,
0.09790072590112686,
0.030120495706796646,
-0.04341897368431091,
0.08441265672445297,
0.022540900856256485,
-0.09611213952302933,
0.1495014727115631,
-0.07431697100400925,
-0.3201310932636261,
-0.10988949984312057,
-0.04538331553339958,
-0.057672567665576935,
0.022706909105181694,
0.024810686707496643,
-0.09540276229381561,
-0.043366316705942154,
-0.029960596933960915,
0.09084036201238632,
-0.1590307056903839,
0.001609036815352738,
-0.009673291817307472,
0.02984057180583477,
-0.07541926205158234,
-0.09600304067134857,
-0.010973562486469746,
-0.06644003093242645,
-0.03758066147565842,
0.06060978025197983,
-0.12077080458402634,
0.04834003001451492,
0.19862358272075653,
0.0035641526337713003,
0.05010320991277695,
-0.03125425800681114,
0.18432235717773438,
-0.1143839880824089,
0.027383895590901375,
0.2205427885055542,
-0.01698952168226242,
0.021637341007590294,
0.15534372627735138,
-0.016437256708741188,
-0.037969257682561874,
0.029099950566887856,
-0.042444858700037,
-0.05190950632095337,
-0.22946080565452576,
-0.08103639632463455,
-0.08500799536705017,
0.08488190919160843,
0.05064227432012558,
0.030083388090133667,
0.14200018346309662,
0.12016208469867706,
-0.0593189038336277,
0.03561775013804436,
-0.04121711850166321,
0.10162505507469177,
0.281681627035141,
-0.06926197558641434,
0.12471413612365723,
-0.029875794425606728,
-0.09915444254875183,
0.07965686917304993,
-0.005074233748018742,
-0.003469490446150303,
0.051183972507715225,
0.11572302877902985,
0.024768246337771416,
0.07192635536193848,
0.1216704323887825,
0.08122102171182632,
-0.0065331789664924145,
-0.05939692631363869,
-0.05222586914896965,
-0.029743831604719162,
-0.07150591164827347,
0.07760310173034668,
-0.008199747651815414,
-0.0675298348069191,
-0.02337236888706684,
0.020139606669545174,
0.06668464839458466,
0.10998323559761047,
0.12182681262493134,
-0.21159029006958008,
-0.12160325050354004,
0.015856245532631874,
-0.04464856907725334,
-0.09869123995304108,
0.03819825500249863,
0.012650874443352222,
-0.08921957015991211,
-0.020516715943813324,
-0.015606810338795185,
0.09904686361551285,
-0.11537787318229675,
0.06830310821533203,
-0.13976067304611206,
0.0365428701043129,
0.011155014857649803,
0.10549420863389969,
-0.20614783465862274,
0.20430724322795868,
-0.00549765769392252,
-0.009702181443572044,
-0.12061583995819092,
-0.04902264103293419,
0.006756107788532972,
0.11932902783155441,
0.06463358551263809,
-0.011870787478983402,
-0.036176323890686035,
0.03629499301314354,
0.014487043023109436,
0.034872617572546005,
0.11702839285135269,
-0.06590510904788971,
0.05960611253976822,
-0.055153001099824905,
0.03148144856095314,
0.029903272166848183,
-0.04966511204838753,
-0.054992176592350006,
-0.0990573912858963,
0.11020948737859726,
0.03691168874502182,
0.09345021098852158,
-0.037630993872880936,
-0.08968190103769302,
-0.06965363770723343,
0.21892349421977997,
-0.04777847230434418,
-0.05025678873062134,
-0.09269393980503082,
0.10474218428134918,
0.04935857281088829,
-0.06906580924987793,
0.004766787402331829,
-0.06833980977535248,
0.04425979405641556,
-0.03532329574227333,
-0.08500946313142776,
0.11758002638816833,
-0.11812694370746613,
-0.12711356580257416,
-0.006256970576941967,
0.1606726199388504,
-0.01892050914466381,
0.02533217892050743,
0.024243948981165886,
0.000753644504584372,
-0.10635457932949066,
-0.09407525509595871,
-0.004189555067569017,
-0.006472491193562746,
-0.0445193387567997,
0.024377651512622833,
0.042212486267089844,
-0.0008838395588099957,
-0.03548223152756691,
-0.025548478588461876,
0.22558006644248962,
0.1327475756406784,
-0.05047911778092384,
0.0730041116476059,
0.17878921329975128,
-0.017447195947170258,
-0.34043243527412415,
-0.011043533682823181,
-0.051686644554138184,
-0.019323408603668213,
-0.08425017446279526,
-0.16251911222934723,
0.1020793542265892,
0.015130356885492802,
-0.012679832987487316,
0.02029346488416195,
-0.2673932611942291,
-0.10307245701551437,
0.15178246796131134,
-0.004200609400868416,
0.40193769335746765,
-0.0697084292769432,
-0.0284687802195549,
-0.04546840116381645,
-0.10471030324697495,
0.08527688682079315,
-0.07059424370527267,
0.09258086234331131,
-0.011915639974176884,
0.11989589035511017,
0.040444593876600266,
-0.0018794700736179948,
0.0901251807808876,
-0.0468670018017292,
-0.02199021354317665,
-0.08463029563426971,
-0.09497512876987457,
0.051136523485183716,
0.017914799973368645,
0.03856034204363823,
0.008197027258574963,
0.05489743500947952,
-0.03226739913225174,
-0.08347965031862259,
-0.0574025884270668,
0.05813485011458397,
0.00510278670117259,
-0.10413338989019394,
-0.10228171199560165,
0.026249198243021965,
-0.04690873622894287,
-0.011398402974009514,
0.11379656940698624,
-0.08650747686624527,
0.048960842192173004,
0.1015273705124855,
0.23490287363529205,
-0.14820356667041779,
0.07174992561340332,
0.018204623833298683,
-0.05319211632013321,
0.07680989801883698,
-0.13235823810100555,
-0.011973882094025612,
0.12003776431083679,
0.03133587911725044,
0.08909790962934494,
0.08336590230464935,
-0.06216401606798172,
-0.009393821470439434,
0.03158937767148018,
-0.17006364464759827,
-0.1813632696866989,
-0.0602392852306366,
0.06637832522392273,
-0.05210914462804794,
0.11467307060956955,
0.19992733001708984,
-0.08076272159814835,
-0.03023495338857174,
-0.011895861476659775,
0.003298978554084897,
-0.02320491336286068,
0.02168295346200466,
0.05990075320005417,
0.027223236858844757,
-0.09705722332000732,
0.007641489617526531,
0.03704162314534187,
-0.09796566516160965,
0.005036442540585995,
0.08610005676746368,
-0.11628463864326477,
-0.11296883970499039,
-0.08323957026004791,
0.013536477461457253,
-0.08392602205276489,
-0.03240165859460831,
-0.020496953278779984,
-0.14452962577342987,
0.05752529576420784,
0.13798888027668,
0.07319434732198715,
0.02936731092631817,
-0.10558374971151352,
-0.014217707328498363,
-0.05790562555193901,
0.03622083738446236,
0.054470714181661606,
-0.012098409235477448,
-0.08986092358827591,
0.06373318284749985,
-0.026449158787727356,
0.08161751925945282,
-0.0729978084564209,
-0.04134903475642204,
-0.11842261999845505,
0.03660392016172409,
-0.12921811640262604,
-0.06531407684087753,
-0.10757673531770706,
-0.04277561604976654,
0.019349275156855583,
-0.06263923645019531,
-0.08107692003250122,
0.006684033200144768,
-0.123395174741745,
0.05285608768463135,
-0.006654746364802122,
0.06034497916698456,
0.00651339627802372,
0.053423840552568436,
0.04888363555073738,
-0.0023829438723623753,
0.11886449158191681,
0.09897958487272263,
-0.05299687758088112,
0.10685768723487854,
-0.14587809145450592,
-0.009581266902387142,
0.06560447812080383,
0.02439148910343647,
0.036846090108156204,
-0.009206732735037804,
-0.002814308274537325,
0.05576163902878761,
0.059184081852436066,
0.04620768129825592,
0.03838980570435524,
-0.05255525931715965,
0.04041589796543121,
0.010029155761003494,
-0.05492642521858215,
-0.07737896591424942,
0.023129988461732864,
0.034699130803346634,
0.07036340981721878,
0.1389273703098297,
-0.05157214775681496,
0.0409243181347847,
-0.05870316922664642,
0.0750170424580574,
-0.03222048655152321,
-0.1336972415447235,
0.005414552055299282,
-0.12436076998710632,
0.05601031705737114,
-0.010289371944963932,
0.2056119441986084,
0.0065157609060406685,
-0.05858859047293663,
0.028091100975871086,
0.006729903165251017,
-0.001082530478015542,
0.015934685245156288,
0.14124555885791779,
0.1202453076839447,
-0.016636012122035027,
-0.08384603261947632,
0.08041980117559433,
0.049003299325704575,
0.08852545917034149,
0.15883038938045502,
0.05538923665881157,
0.0005462925764732063,
0.13240858912467957,
-0.013988030143082142,
0.03177224099636078,
-0.032334908843040466,
-0.025026651099324226,
-0.037814170122146606,
0.034429922699928284,
-0.08075210452079773,
0.12021816521883011,
0.17720948159694672,
0.028462639078497887,
-0.0005238106823526323,
-0.035417500883340836,
-0.03754425793886185,
-0.12264716625213623,
-0.20022843778133392,
-0.11450734734535217,
-0.12988358736038208,
-0.03412967547774315,
-0.12811386585235596,
-0.022244583815336227,
0.029900992289185524,
0.11594229191541672,
-0.08201184123754501,
0.0798402950167656,
0.0007944955723360181,
-0.13157886266708374,
0.11264374852180481,
-0.04184314236044884,
0.05040299892425537,
-0.10619155317544937,
-0.015709148719906807,
-0.0757753849029541,
0.05880460888147354,
0.014271390624344349,
0.002148489700630307,
-0.029343532398343086,
-0.06091087684035301,
-0.06333630532026291,
-0.09884270280599594,
-0.06828439980745316,
0.09065747261047363,
0.04944436252117157,
0.12036309391260147,
0.02598174288868904,
-0.04827818647027016,
0.01696208491921425,
0.15325389802455902,
-0.033438511192798615,
-0.11862486600875854,
-0.08523434400558472,
0.15511251986026764,
-0.05918213725090027,
0.058802343904972076,
-0.006580725312232971,
-0.03200089558959007,
-0.027352582663297653,
0.306559294462204,
0.30959826707839966,
-0.09876801073551178,
-0.001534450682811439,
0.024182379245758057,
0.028535248711705208,
0.04238288104534149,
0.02926776371896267,
0.06896524876356125,
0.20302769541740417,
-0.04813409596681595,
0.012011400423943996,
-0.013938428834080696,
-0.025843683630228043,
0.02380210906267166,
0.06944843381643295,
0.05538983270525932,
-0.04113050177693367,
-0.06956728547811508,
0.107387974858284,
-0.1887979805469513,
-0.055713631212711334,
-0.06773156672716141,
-0.09037074446678162,
-0.07349228858947754,
0.008116096258163452,
0.018308430910110474,
0.028111234307289124,
0.09836132824420929,
-0.04859412834048271,
-0.06818961352109909,
-0.019801033660769463,
0.04201919212937355,
-0.11201363801956177,
0.03840591013431549,
0.16402947902679443,
-0.016391070559620857,
0.03708619996905327,
-0.031965769827365875,
0.08624567836523056,
0.13498511910438538,
0.002140624215826392,
-0.10069963335990906,
0.01265661045908928,
0.025397708639502525,
-0.039026759564876556,
0.0261228047311306,
0.054554130882024765,
0.04518020153045654,
-0.03154129162430763,
0.11573896557092667,
-0.24779412150382996,
0.033206090331077576,
-0.03480168431997299,
-0.04271195828914642,
-0.0902337059378624,
0.05814545601606369,
-0.0941852331161499,
0.10876567661762238,
0.16878379881381989,
-0.04635137692093849,
-0.05943785607814789,
-0.03868262469768524,
0.0038292347453534603,
-0.015812905505299568,
0.012176599353551865,
-0.02041550539433956,
-0.12927281856536865,
-0.061433278024196625,
0.07347938418388367,
0.007433713413774967,
-0.18936491012573242,
0.010709105059504509,
-0.06643333286046982,
0.013575023971498013,
-0.024705469608306885,
0.05091670900583267,
0.09992575645446777,
0.009177191182971,
0.017089836299419403,
0.009844532236456871,
-0.010763305239379406,
0.05437313765287399,
-0.09848157316446304,
-0.0702299252152443
] |
null | null |
transformers
|
This model has been trained to classify text from different domains. It is currently trained on a relatively small amount of data and identifies text from three domains: "sports", "healthcare" and "financial". Label_0 represents "financial", Label_1 represents "healthcare" and Label_2 represents "sports". For now it has been trained on these three domains only; I plan to train it on more domains and more data soon, which should further improve its accuracy.
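
A minimal inference sketch with the Transformers `pipeline` API, assuming the hub id `debjyoti007/new_doc_classifier` and that the checkpoint exposes the default label strings `LABEL_0`/`LABEL_1`/`LABEL_2` (the usual names for a fine-tuned DistilBERT classifier); the example sentence is purely illustrative:

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="debjyoti007/new_doc_classifier")

# Map the default label ids to the domains described above.
label_map = {"LABEL_0": "financial", "LABEL_1": "healthcare", "LABEL_2": "sports"}

pred = classifier("The central bank raised interest rates by 50 basis points.")[0]
print(label_map.get(pred["label"], pred["label"]), round(pred["score"], 3))
```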
|
{}
|
text-classification
|
debjyoti007/new_doc_classifier
|
[
"transformers",
"pytorch",
"distilbert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #text-classification #autotrain_compatible #endpoints_compatible #region-us
|
This model has been trained to classify text from different domains. It is currently trained on a relatively small amount of data and identifies text from three domains: "sports", "healthcare" and "financial". Label_0 represents "financial", Label_1 represents "healthcare" and Label_2 represents "sports". For now it has been trained on these three domains only; I plan to train it on more domains and more data soon, which should further improve its accuracy.
|
[] |
[
"TAGS\n#transformers #pytorch #distilbert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
38
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.03533577919006348,
0.06443645805120468,
-0.007824759930372238,
0.02963758073747158,
0.21134145557880402,
0.0368538573384285,
0.06359195709228516,
0.10786357522010803,
0.047846585512161255,
-0.029699334874749184,
0.09624463319778442,
0.2456933856010437,
-0.04527274891734123,
0.11506538093090057,
-0.1315092295408249,
-0.2995516061782837,
0.0646483302116394,
0.06820031255483627,
0.01968790777027607,
0.11027561128139496,
0.08955937623977661,
-0.08577843010425568,
0.06416945904493332,
-0.03987749293446541,
-0.13028311729431152,
0.036934368312358856,
0.037670549005270004,
-0.12557227909564972,
0.08850666880607605,
0.03936105594038963,
0.16363440454006195,
0.029493317008018494,
-0.0571451373398304,
-0.13760130107402802,
0.03542056307196617,
0.003107793163508177,
-0.08173839002847672,
0.035451244562864304,
0.07971785217523575,
-0.13606007397174835,
0.03269175812602043,
0.01657985709607601,
0.028779901564121246,
0.05034712329506874,
-0.13549968600273132,
-0.06766978651285172,
-0.009825913235545158,
0.02846479043364525,
0.08123840391635895,
0.06563035398721695,
-0.00027321543893776834,
0.11571130156517029,
-0.14468228816986084,
0.13729768991470337,
0.08681581169366837,
-0.26667332649230957,
-0.01513616368174553,
0.09300960600376129,
0.014211298897862434,
0.03189397603273392,
-0.05005642771720886,
0.03387840837240219,
0.021587392315268517,
0.012041964568197727,
-0.005505601409822702,
-0.06911619752645493,
-0.12172640115022659,
0.01909228041768074,
-0.0760328620672226,
-0.039914727210998535,
0.2024218738079071,
-0.06752687692642212,
0.06574457883834839,
-0.03853347897529602,
-0.09920144081115723,
-0.04725521057844162,
-0.028420861810445786,
0.03284634277224541,
-0.05052020400762558,
0.06803859770298004,
0.04873250797390938,
0.02093963511288166,
-0.10541380196809769,
0.027895580977201462,
-0.2198127955198288,
0.21804359555244446,
0.00917235016822815,
0.04113364964723587,
-0.17035873234272003,
0.06059039384126663,
0.043774571269750595,
-0.10760118812322617,
0.049048252403736115,
-0.10497406870126724,
0.019541887566447258,
-0.04680290073156357,
-0.07833123207092285,
-0.044003088027238846,
0.0761561468243599,
0.15131190419197083,
0.024525625631213188,
0.0676354393362999,
-0.023907558992505074,
0.08125972747802734,
0.03615585342049599,
0.12704050540924072,
0.04965166375041008,
-0.030767392367124557,
0.03752761334180832,
-0.13245059549808502,
-0.00002132852932845708,
-0.07070981711149216,
-0.1520344465970993,
-0.028104213997721672,
0.058518148958683014,
0.07771685719490051,
0.007545619271695614,
0.09117837250232697,
-0.07305282354354858,
-0.03670652583241463,
0.09205243736505508,
-0.09038619697093964,
0.022389709949493408,
0.0189626757055521,
0.024910688400268555,
0.11437109857797623,
-0.01640472002327442,
-0.004441923461854458,
-0.08554866164922714,
0.15481221675872803,
-0.05412428826093674,
0.01906411163508892,
-0.027951309457421303,
-0.07562480866909027,
0.023844171315431595,
-0.16517141461372375,
0.024268588051199913,
-0.16968505084514618,
-0.12177367508411407,
0.0011497566010802984,
0.01497613824903965,
0.0003558929602149874,
-0.029599502682685852,
-0.034584347158670425,
0.0028823118191212416,
0.05339471623301506,
-0.05009040981531143,
-0.08925710618495941,
-0.0734119787812233,
0.09545788168907166,
-0.03665677830576897,
0.07958490401506424,
-0.12844105064868927,
0.0784672200679779,
-0.0987219363451004,
-0.0187049712985754,
-0.14024826884269714,
0.05743253231048584,
-0.04765705391764641,
0.18340644240379333,
0.01636499911546707,
-0.05442013591527939,
-0.05629796162247658,
0.05081459879875183,
-0.06792773306369781,
0.17081454396247864,
-0.10482346266508102,
-0.11688733100891113,
0.18975088000297546,
-0.09539731591939926,
-0.11199936270713806,
0.08214274048805237,
-0.012322766706347466,
-0.002544441493228078,
0.10592521727085114,
0.18774141371250153,
0.11772145330905914,
0.015394842252135277,
0.071439228951931,
0.1266816407442093,
-0.09738999605178833,
-0.10514426231384277,
-0.016195401549339294,
-0.010998358018696308,
-0.11682542413473129,
0.06311710923910141,
0.08283041417598724,
0.0693083181977272,
-0.04381299018859863,
-0.038738906383514404,
-0.015374792739748955,
-0.0029897931963205338,
0.14953550696372986,
0.06494788080453873,
0.11409911513328552,
-0.07472079247236252,
0.010434641502797604,
0.010832404717803001,
-0.008651630952954292,
0.016917014494538307,
0.02875317819416523,
-0.061046965420246124,
0.11194391548633575,
0.03876045346260071,
0.02736404910683632,
-0.24566538631916046,
-0.06682449579238892,
-0.011323003098368645,
0.1456235647201538,
-0.02446315996348858,
0.10121438652276993,
0.045561324805021286,
-0.0504569448530674,
-0.010978372767567635,
-0.029581138864159584,
0.17828664183616638,
0.022655870765447617,
-0.06422974169254303,
-0.0612877793610096,
0.0651540756225586,
-0.07150227576494217,
0.012235969305038452,
-0.07036937773227692,
0.020627280697226524,
0.08606486022472382,
0.12204300612211227,
0.010734139941632748,
0.06475073099136353,
-0.02579765021800995,
0.07209211587905884,
-0.07104320824146271,
0.019227510318160057,
0.11117701232433319,
-0.010595849715173244,
-0.07011682540178299,
0.13524381816387177,
-0.1373681277036667,
0.2673107087612152,
0.19483336806297302,
-0.2967563271522522,
0.0005786092369817197,
-0.04439404606819153,
-0.007282515522092581,
0.030610160902142525,
0.030042126774787903,
0.014859852381050587,
0.08437592536211014,
0.0014727829257026315,
0.20341786742210388,
-0.021047484129667282,
-0.03919289633631706,
-0.018922755494713783,
-0.04877391830086708,
-0.03148360177874565,
0.08788784593343735,
0.06451795995235443,
-0.192406564950943,
0.19050060212612152,
0.21731194853782654,
0.010114802047610283,
0.16024211049079895,
-0.010486523620784283,
0.043989237397909164,
0.09252246469259262,
-0.03757351264357567,
-0.024272754788398743,
-0.08932791650295258,
-0.1848243772983551,
-0.03918878361582756,
0.07472185045480728,
0.03010893426835537,
0.06895712018013,
-0.10219920426607132,
-0.027038687840104103,
0.0004840063920710236,
0.021132981404662132,
-0.01947878859937191,
0.08704918622970581,
0.08203180879354477,
0.1052171140909195,
-0.017219819128513336,
-0.07267280668020248,
0.11330383270978928,
-0.0011106154415756464,
-0.07149384170770645,
0.18412140011787415,
-0.15954560041427612,
-0.36233094334602356,
-0.1530739665031433,
-0.20456592738628387,
-0.02883506752550602,
0.06615062057971954,
0.10685895383358002,
-0.12165717035531998,
-0.048558108508586884,
0.0375000461935997,
-0.013693227432668209,
-0.04041895270347595,
0.03981194645166397,
-0.05303730443120003,
0.07329315692186356,
-0.05222955346107483,
-0.06364883482456207,
-0.06660815328359604,
-0.03131863474845886,
-0.004695216193795204,
0.16393853724002838,
-0.12483653426170349,
0.06658802926540375,
0.1819998174905777,
0.0010995424818247557,
0.06644674390554428,
-0.032483141869306564,
0.1697184294462204,
-0.08651559799909592,
-0.02343188226222992,
0.1893177032470703,
-0.07345744967460632,
0.07808925211429596,
0.15666639804840088,
0.020104380324482918,
-0.0712679922580719,
0.0352557972073555,
-0.035343270748853683,
-0.08934015780687332,
-0.2058166265487671,
-0.1703205555677414,
-0.12546730041503906,
0.05237005278468132,
0.0663270428776741,
0.07582127302885056,
0.12632738053798676,
0.06528977304697037,
0.00627241050824523,
0.010700550861656666,
0.006936580874025822,
0.07483439892530441,
0.24698598682880402,
-0.0010819705203175545,
0.14767786860466003,
-0.057353224605321884,
-0.13245494663715363,
0.08233633637428284,
0.000922833161894232,
0.1185675784945488,
0.08539658784866333,
0.017674902454018593,
0.005295653361827135,
0.05462205410003662,
0.164198637008667,
0.1299368292093277,
0.04298880323767662,
-0.013622048310935497,
-0.01172587089240551,
0.0032578855752944946,
-0.0797785148024559,
0.006457295268774033,
0.07906489074230194,
-0.14195358753204346,
-0.08270972222089767,
-0.11039547622203827,
0.10006770491600037,
0.08380265533924103,
0.042938295751810074,
-0.2052999883890152,
0.005745685659348965,
0.09206069260835648,
-0.027502331882715225,
-0.09957162290811539,
0.06463603675365448,
-0.04812092334032059,
-0.13455109298229218,
0.10769277811050415,
-0.029609164223074913,
0.13354617357254028,
-0.0870715081691742,
0.08272852748632431,
-0.0378170944750309,
-0.11202792823314667,
0.03467349335551262,
0.10786684602499008,
-0.27751585841178894,
0.2031957507133484,
0.007435075007379055,
-0.06144534796476364,
-0.07824365049600601,
-0.015199865214526653,
0.039944443851709366,
0.22591036558151245,
0.06934285908937454,
0.004277070518583059,
-0.05739999935030937,
-0.1865520477294922,
-0.009981787763535976,
-0.008337096311151981,
0.12231403589248657,
-0.03427664935588837,
-0.01814279891550541,
-0.036011241376399994,
-0.030255382880568504,
-0.03578435257077217,
-0.06897740066051483,
0.02666986919939518,
-0.17997102439403534,
0.056329283863306046,
0.034454237669706345,
0.05416429787874222,
0.01469043642282486,
-0.04343695193529129,
-0.11887014657258987,
0.19838201999664307,
-0.10767136514186859,
-0.09184177964925766,
-0.11828504502773285,
-0.07852382957935333,
0.02535579912364483,
-0.08476060628890991,
0.06807194650173187,
-0.08172672241926193,
0.018900277093052864,
-0.06600436568260193,
-0.20524995028972626,
0.11596046388149261,
-0.10182060301303864,
-0.03258875012397766,
-0.058350928127765656,
0.1526644378900528,
-0.07479622215032578,
0.010474151000380516,
0.03318091109395027,
0.02239469438791275,
-0.08559903502464294,
-0.08446884155273438,
-0.018381169065833092,
0.03129338473081589,
0.06142119690775871,
0.08739607781171799,
-0.09792511910200119,
-0.07674866914749146,
-0.03134889155626297,
0.02817792072892189,
0.2929084002971649,
0.1401015967130661,
-0.06586769968271255,
0.1629326492547989,
0.10387758165597916,
-0.06942285597324371,
-0.3373493552207947,
-0.09150945395231247,
-0.09645266830921173,
-0.03972399979829788,
-0.042589932680130005,
-0.16358928382396698,
0.13413257896900177,
-0.004249863792210817,
-0.010055972263216972,
0.08473600447177887,
-0.16361457109451294,
-0.08480892330408096,
0.19654500484466553,
-0.0355062410235405,
0.36373743414878845,
-0.09189414978027344,
-0.09806639701128006,
-0.07035496085882187,
-0.1232207641005516,
0.12262474000453949,
0.007738110609352589,
0.08150525391101837,
-0.02050303854048252,
0.04451111704111099,
0.04815887659788132,
-0.03690929710865021,
0.10097026824951172,
0.036669690161943436,
0.025901002809405327,
-0.11938466131687164,
-0.09219347685575485,
0.023168733343482018,
-0.019243339076638222,
-0.007111898623406887,
-0.01547485776245594,
0.01685570739209652,
-0.17164339125156403,
-0.04131095111370087,
-0.07032524049282074,
0.05912882834672928,
0.04161927476525307,
-0.029813537374138832,
0.012351144105196,
-0.020498499274253845,
-0.000361355283530429,
0.006620287895202637,
0.251852810382843,
-0.03737054020166397,
0.1604781597852707,
0.08527542650699615,
0.141584113240242,
-0.15723979473114014,
0.01194052491337061,
-0.07652142643928528,
-0.05061504244804382,
0.06191904842853546,
-0.06635212153196335,
0.07575498521327972,
0.13591395318508148,
-0.05730273202061653,
0.07247055321931839,
0.11612356454133987,
0.07706465572118759,
-0.034392137080430984,
0.16330119967460632,
-0.2292891889810562,
0.04589579999446869,
-0.050483379513025284,
-0.033954232931137085,
0.06465915590524673,
0.0655360221862793,
0.1258573830127716,
0.06694923341274261,
-0.04017629101872444,
0.005630772560834885,
0.00028037314768880606,
0.005372054409235716,
0.07443340867757797,
0.04748379439115524,
0.04316747188568115,
-0.14709694683551788,
0.05031560733914375,
0.05119774490594864,
-0.15819577872753143,
-0.022534551098942757,
0.1376158893108368,
-0.1704932600259781,
-0.1271103173494339,
-0.021827740594744682,
0.12368015199899673,
-0.09311434626579285,
-0.046253565698862076,
-0.07048245519399643,
-0.13402129709720612,
0.07112511247396469,
0.18836617469787598,
0.12805050611495972,
0.09663103520870209,
-0.06118634715676308,
-0.04969988390803337,
0.0036050756461918354,
-0.004089095629751682,
0.017009761184453964,
0.03120747022330761,
-0.12284451723098755,
0.046005018055438995,
-0.02090919390320778,
0.15390309691429138,
-0.09199176728725433,
-0.07624588906764984,
-0.1582917422056198,
0.04238278418779373,
-0.09195777773857117,
-0.023019742220640182,
-0.09330286085605621,
-0.01648246869444847,
0.0030273916199803352,
-0.030272169038653374,
-0.026145517826080322,
-0.06213071197271347,
-0.11623096466064453,
0.04011767357587814,
-0.028817979618906975,
0.04146858677268028,
-0.06920336186885834,
-0.04603973776102066,
0.09102679789066315,
-0.03833403438329697,
0.10358903557062149,
0.10654495656490326,
-0.0914529487490654,
0.0934479758143425,
-0.14121071994304657,
-0.1319282501935959,
0.1433861404657364,
0.030263781547546387,
0.07207431644201279,
0.07694290578365326,
0.03595962002873421,
0.07349478453397751,
0.004535248037427664,
0.06631990522146225,
0.06761990487575531,
-0.12337382882833481,
0.061452679336071014,
-0.046973392367362976,
-0.17189696431159973,
-0.05778007209300995,
-0.04047338292002678,
0.10660306364297867,
0.010234192945063114,
0.1720496565103531,
-0.05692226439714432,
0.1017514169216156,
-0.03180769085884094,
0.0038062711246311665,
-0.01604292169213295,
-0.20698778331279755,
-0.06364472210407257,
-0.08077114075422287,
0.026275143027305603,
0.005083381198346615,
0.23303534090518951,
0.061751753091812134,
0.033835094422101974,
0.04869496077299118,
0.09752455353736877,
-0.0014774927403777838,
0.023545393720269203,
0.17794077098369598,
0.10133370757102966,
-0.05567975342273712,
-0.05575546622276306,
0.05616139620542526,
0.029215605929493904,
0.006353367585688829,
0.14132826030254364,
0.07252193242311478,
-0.041009921580553055,
0.07551323622465134,
-0.03376345708966255,
0.04427867755293846,
-0.1321653574705124,
-0.16054923832416534,
-0.05143791809678078,
0.07023841142654419,
0.01740087941288948,
0.03448288515210152,
0.07088012248277664,
-0.028410857543349266,
0.05220868065953255,
-0.033101536333560944,
-0.05869230628013611,
-0.18244294822216034,
-0.09428907185792923,
-0.09423913061618805,
-0.09753676503896713,
0.0058974651619791985,
-0.07943454384803772,
-0.01026046834886074,
0.06547573953866959,
0.037508975714445114,
-0.05198773369193077,
0.07752657681703568,
0.003285798244178295,
-0.05593571066856384,
0.08687124401330948,
-0.045962750911712646,
0.02649652026593685,
0.00841206219047308,
-0.029707664623856544,
-0.1380927860736847,
-0.013390704058110714,
-0.04401649907231331,
0.040850814431905746,
-0.058590736240148544,
0.007230483461171389,
-0.1483704298734665,
-0.12039808928966522,
-0.019934508949518204,
0.0580129399895668,
-0.06074916571378708,
0.14175079762935638,
0.015395265072584152,
0.00611070916056633,
0.047287240624427795,
0.17810532450675964,
-0.0544942207634449,
-0.06539076566696167,
-0.04489162564277649,
0.24079638719558716,
0.09303659200668335,
0.10803006589412689,
0.0026883413083851337,
-0.013426939956843853,
-0.07931891828775406,
0.28847232460975647,
0.27526742219924927,
-0.04996299743652344,
0.054827310144901276,
0.007495596073567867,
0.03283945098519325,
0.15242771804332733,
0.1401364952325821,
0.09061526507139206,
0.24117816984653473,
-0.0521743968129158,
-0.05017128586769104,
-0.026741530746221542,
-0.03419290855526924,
-0.13402216136455536,
0.0581725612282753,
0.05382576957345009,
-0.0488528348505497,
-0.06285785138607025,
0.10921014845371246,
-0.21934591233730316,
0.16537490487098694,
0.019078493118286133,
-0.20565392076969147,
-0.06819386035203934,
-0.03284084051847458,
0.1365688294172287,
-0.0016830840613692999,
0.07499389350414276,
-0.00323955318890512,
-0.11883772164583206,
0.042848069220781326,
0.01306091621518135,
-0.20812170207500458,
-0.0041817850433290005,
0.06021128222346306,
-0.05781300365924835,
-0.0120098190382123,
-0.02640264853835106,
0.03757385164499283,
0.06560133397579193,
0.07958315312862396,
-0.0117155397310853,
0.04959989711642265,
-0.012948726303875446,
-0.030828366056084633,
0.029231732711195946,
0.02946082502603531,
0.0038178605027496815,
-0.09871038049459457,
0.06783884763717651,
-0.16667571663856506,
0.0549757145345211,
-0.05383889377117157,
-0.05352160334587097,
-0.019258368760347366,
0.04339629411697388,
-0.05456918105483055,
0.04438189044594765,
0.10450860112905502,
0.011940731666982174,
-0.025312455371022224,
-0.04523419588804245,
-0.04262804985046387,
-0.012295196764171124,
-0.1369558572769165,
-0.14967197179794312,
-0.09997987747192383,
-0.08965370059013367,
0.09313849359750748,
0.0034958450123667717,
-0.12975360453128815,
-0.006513827480375767,
-0.11122267693281174,
0.05365913361310959,
-0.16868756711483002,
0.09322161972522736,
0.0323028489947319,
0.015595607459545135,
-0.011563225649297237,
-0.040581803768873215,
0.04532773047685623,
0.07905946671962738,
-0.1267605572938919,
-0.08728102594614029
] |
null | null |
transformers
|
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 38639804
- CO2 Emissions (in grams): 11.98841452241473
## Validation Metrics
- Loss: 0.421400249004364
- Accuracy: 0.86783988957902
- Macro F1: 0.8669477050676501
- Micro F1: 0.86783988957902
- Weighted F1: 0.86694770506765
- Macro Precision: 0.867606300132228
- Micro Precision: 0.86783988957902
- Weighted Precision: 0.8676063001322278
- Macro Recall: 0.86783988957902
- Micro Recall: 0.86783988957902
- Weighted Recall: 0.86783988957902
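For reference, metrics like these can be recomputed with scikit-learn given model predictions and gold labels. A minimal sketch follows; the `y_true`/`y_pred` arrays are hypothetical placeholders, not the actual AutoNLP validation split:
```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical labels and predictions; substitute the real validation data
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 1, 0]

print("Accuracy:", accuracy_score(y_true, y_pred))
for avg in ("macro", "micro", "weighted"):
    print(f"{avg} F1:", f1_score(y_true, y_pred, average=avg))
    print(f"{avg} Precision:", precision_score(y_true, y_pred, average=avg))
    print(f"{avg} Recall:", recall_score(y_true, y_pred, average=avg))
```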
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/dee4hf/autonlp-shajBERT-38639804
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("dee4hf/autonlp-shajBERT-38639804", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("dee4hf/autonlp-shajBERT-38639804", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
```
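The `outputs` object holds raw logits rather than labels. A minimal sketch for turning them into a predicted class, assuming the `id2label` mapping stored in the model config (the label names come from the AutoNLP training data):
```python
import torch

# Convert logits to class probabilities and pick the most likely class
probs = torch.softmax(outputs.logits, dim=-1)
pred_id = int(probs.argmax(dim=-1))

print(model.config.id2label[pred_id], float(probs[0, pred_id]))
```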
|
{"language": "unk", "tags": "autonlp", "datasets": ["dee4hf/autonlp-data-shajBERT"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}], "co2_eq_emissions": 11.98841452241473}
|
text-classification
|
dee4hf/autonlp-shajBERT-38639804
|
[
"transformers",
"pytorch",
"albert",
"text-classification",
"autonlp",
"unk",
"dataset:dee4hf/autonlp-data-shajBERT",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"unk"
] |
TAGS
#transformers #pytorch #albert #text-classification #autonlp #unk #dataset-dee4hf/autonlp-data-shajBERT #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 38639804
- CO2 Emissions (in grams): 11.98841452241473
## Validation Metrics
- Loss: 0.421400249004364
- Accuracy: 0.86783988957902
- Macro F1: 0.8669477050676501
- Micro F1: 0.86783988957902
- Weighted F1: 0.86694770506765
- Macro Precision: 0.867606300132228
- Micro Precision: 0.86783988957902
- Weighted Precision: 0.8676063001322278
- Macro Recall: 0.86783988957902
- Micro Recall: 0.86783988957902
- Weighted Recall: 0.86783988957902
## Usage
You can use cURL to access this model:
Or Python API:
|
[
"# Model Trained Using AutoNLP\n\n- Problem type: Multi-class Classification\n- Model ID: 38639804\n- CO2 Emissions (in grams): 11.98841452241473",
"## Validation Metrics\n\n- Loss: 0.421400249004364\n- Accuracy: 0.86783988957902\n- Macro F1: 0.8669477050676501\n- Micro F1: 0.86783988957902\n- Weighted F1: 0.86694770506765\n- Macro Precision: 0.867606300132228\n- Micro Precision: 0.86783988957902\n- Weighted Precision: 0.8676063001322278\n- Macro Recall: 0.86783988957902\n- Micro Recall: 0.86783988957902\n- Weighted Recall: 0.86783988957902",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
"TAGS\n#transformers #pytorch #albert #text-classification #autonlp #unk #dataset-dee4hf/autonlp-data-shajBERT #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Multi-class Classification\n- Model ID: 38639804\n- CO2 Emissions (in grams): 11.98841452241473",
"## Validation Metrics\n\n- Loss: 0.421400249004364\n- Accuracy: 0.86783988957902\n- Macro F1: 0.8669477050676501\n- Micro F1: 0.86783988957902\n- Weighted F1: 0.86694770506765\n- Macro Precision: 0.867606300132228\n- Micro Precision: 0.86783988957902\n- Weighted Precision: 0.8676063001322278\n- Macro Recall: 0.86783988957902\n- Micro Recall: 0.86783988957902\n- Weighted Recall: 0.86783988957902",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
72,
43,
144,
17
] |
[
"passage: TAGS\n#transformers #pytorch #albert #text-classification #autonlp #unk #dataset-dee4hf/autonlp-data-shajBERT #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Multi-class Classification\n- Model ID: 38639804\n- CO2 Emissions (in grams): 11.98841452241473## Validation Metrics\n\n- Loss: 0.421400249004364\n- Accuracy: 0.86783988957902\n- Macro F1: 0.8669477050676501\n- Micro F1: 0.86783988957902\n- Weighted F1: 0.86694770506765\n- Macro Precision: 0.867606300132228\n- Micro Precision: 0.86783988957902\n- Weighted Precision: 0.8676063001322278\n- Macro Recall: 0.86783988957902\n- Micro Recall: 0.86783988957902\n- Weighted Recall: 0.86783988957902## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
-0.10120843350887299,
0.1818603128194809,
-0.0027736362535506487,
0.0718473345041275,
0.0878625437617302,
0.027374951168894768,
0.060514889657497406,
0.11837082356214523,
0.04757266119122505,
0.1645110696554184,
0.09649275243282318,
0.15215319395065308,
0.0865340456366539,
0.1732112318277359,
-0.091119185090065,
-0.09598708897829056,
0.03865351900458336,
-0.009574783965945244,
0.06939224153757095,
0.06710946559906006,
0.056293513625860214,
-0.08140508830547333,
0.12142772227525711,
0.0027393533382564783,
-0.07319241762161255,
0.03151528909802437,
0.08895984292030334,
-0.0802101194858551,
0.05298807844519615,
0.1072622686624527,
0.12214503437280655,
0.0010218985844403505,
0.08671177178621292,
-0.15258942544460297,
-0.021519601345062256,
0.05103686451911926,
-0.025403980165719986,
0.08038178086280823,
0.1614203155040741,
0.01635577529668808,
-0.003967129159718752,
-0.10321978479623795,
0.1052822694182396,
0.09072259068489075,
-0.08094105124473572,
-0.0726819783449173,
-0.11159740388393402,
0.07037197798490524,
0.10071655362844467,
0.07953780889511108,
-0.001207218156196177,
0.19948294758796692,
-0.03012559749186039,
0.07919113337993622,
0.06256528943777084,
-0.24612301588058472,
-0.03327876701951027,
0.15267156064510345,
-0.018508700653910637,
0.0004627812886610627,
-0.008899306878447533,
0.00809916015714407,
0.058595310896635056,
-0.00022321178403217345,
0.01115924771875143,
-0.056605271995067596,
-0.037152208387851715,
-0.025357022881507874,
-0.12457139045000076,
-0.06431204080581665,
0.15916894376277924,
0.04414854943752289,
-0.09614826738834381,
-0.07291408628225327,
-0.07527840882539749,
-0.11339550465345383,
-0.03482713922858238,
-0.055213022977113724,
0.010807077400386333,
-0.03232472389936447,
-0.048643000423908234,
0.08884850144386292,
-0.04122665524482727,
-0.059422701597213745,
-0.13855938613414764,
0.0007938467897474766,
0.021491022780537605,
0.06358715146780014,
0.03628792613744736,
0.023227397352457047,
-0.05794685333967209,
-0.053506284952163696,
-0.01649901457130909,
0.01729007251560688,
-0.12234360724687576,
-0.06871639937162399,
0.0052411495707929134,
0.10953893512487411,
0.04014262929558754,
0.20196610689163208,
0.005718049127608538,
0.0947972908616066,
0.054213546216487885,
-0.02487817406654358,
-0.046106722205877304,
0.13251528143882751,
-0.11779970675706863,
-0.11601327359676361,
0.03085864521563053,
-0.05178258940577507,
0.03776838257908821,
-0.041862744837999344,
-0.044669970870018005,
-0.06933985650539398,
0.04777510464191437,
0.04426657781004906,
0.035323381423950195,
0.0012887242482975125,
-0.07623543590307236,
-0.04776839539408684,
0.05791269242763519,
-0.09469972550868988,
0.04560847580432892,
0.0055391499772667885,
-0.13008445501327515,
0.09479053318500519,
0.05123909190297127,
0.020389672368764877,
-0.11405111849308014,
0.005889138672500849,
-0.12319760769605637,
-0.020404541864991188,
-0.07659827917814255,
-0.13084688782691956,
0.06735678017139435,
0.02969798631966114,
-0.018829822540283203,
-0.13418138027191162,
-0.162746861577034,
-0.06501136720180511,
-0.008177660405635834,
-0.0877968966960907,
-0.043279193341732025,
0.013911119662225246,
-0.014769506640732288,
0.05165008082985878,
-0.0007443997310474515,
0.020363448187708855,
-0.03333786502480507,
0.015116512775421143,
0.0486065149307251,
0.07524292171001434,
-0.023650063201785088,
0.018821513280272484,
-0.03505125641822815,
0.028716111555695534,
-0.10280944406986237,
0.035114653408527374,
-0.09622354805469513,
0.01699131354689598,
-0.1803678423166275,
-0.04057784378528595,
0.1116313636302948,
-0.051642533391714096,
0.06979897618293762,
0.08169683068990707,
-0.07990271598100662,
0.01968885213136673,
0.08765895664691925,
-0.04838370904326439,
-0.09702171385288239,
0.0945797935128212,
0.013862079940736294,
-0.0027370976749807596,
0.034637659788131714,
0.06541307270526886,
0.1409660279750824,
-0.15699416399002075,
-0.09240458905696869,
0.017120884731411934,
0.03473540022969246,
-0.040465086698532104,
0.08272597193717957,
-0.03310537710785866,
-0.13625015318393707,
0.0011441264068707824,
0.08783372491598129,
-0.01808873564004898,
-0.027435526251792908,
-0.06253522634506226,
-0.027912888675928116,
-0.036624033004045486,
-0.012482947669923306,
-0.027103710919618607,
-0.0002668831730261445,
-0.04174058884382248,
-0.05902700126171112,
-0.00987046118825674,
0.16189660131931305,
-0.011402441188693047,
-0.04236454889178276,
-0.16927984356880188,
0.05397028103470802,
-0.1123490184545517,
-0.04771227389574051,
-0.17092248797416687,
-0.07140841335058212,
0.031148681417107582,
-0.16128723323345184,
0.010822082869708538,
-0.017382727935910225,
0.06846310198307037,
0.04862479120492935,
0.06215202435851097,
0.027251677587628365,
0.0830368921160698,
-0.009518701583147049,
-0.07931168377399445,
-0.041852112859487534,
-0.019121160730719566,
0.017411960288882256,
0.24917638301849365,
-0.15298590064048767,
0.009697363711893559,
0.03563406318426132,
0.05261312052607536,
-0.004760107956826687,
-0.04650941863656044,
-0.04817880690097809,
0.06386327743530273,
0.013152061961591244,
-0.0569516159594059,
0.02872222289443016,
-0.026484549045562744,
-0.04835313931107521,
-0.03240258991718292,
-0.2835789918899536,
0.17841103672981262,
0.11686276644468307,
0.01633993722498417,
-0.08979471027851105,
-0.029071392491459846,
0.04082098975777626,
-0.05420374497771263,
-0.028402118012309074,
-0.0037687167059630156,
0.11266175657510757,
0.02895803563296795,
0.0874740406870842,
-0.05969395115971565,
-0.02401086501777172,
0.042920373380184174,
-0.05032650753855705,
-0.023008190095424652,
0.17378102242946625,
0.07653121650218964,
-0.10406049340963364,
0.05964554473757744,
0.005425272509455681,
-0.08614534884691238,
0.012391471303999424,
0.04488211125135422,
-0.057809073477983475,
-0.08670054376125336,
-0.005233778152614832,
0.05362671986222267,
0.04572908580303192,
0.035380419343709946,
0.09374237805604935,
0.06510605663061142,
-0.017608828842639923,
0.015583125874400139,
-0.11559489369392395,
0.02166150137782097,
0.025773130357265472,
-0.034090083092451096,
-0.04535844922065735,
0.001150554046034813,
0.02425585873425007,
0.10647714138031006,
0.013762671500444412,
-0.025358065962791443,
-0.009005810134112835,
-0.016406724229454994,
-0.11884384602308273,
0.22033047676086426,
-0.11978422850370407,
-0.16939182579517365,
-0.17064468562602997,
-0.185524582862854,
-0.05932631343603134,
-0.03945402055978775,
-0.005268251057714224,
-0.048023566603660583,
-0.14036345481872559,
-0.06549487262964249,
-0.07549391686916351,
-0.028477387502789497,
-0.07373793423175812,
0.01333545334637165,
-0.02582649514079094,
0.10189872235059738,
-0.12307936698198318,
-0.035123009234666824,
0.028903882950544357,
-0.1313258707523346,
0.061224713921546936,
-0.006192693952471018,
0.10466568917036057,
0.18213148415088654,
-0.02438301034271717,
0.013691102154552937,
0.010663308203220367,
0.23558983206748962,
0.014613951556384563,
0.0016661247937008739,
0.21321091055870056,
0.08633354306221008,
0.08004184067249298,
0.12996040284633636,
0.051915403455495834,
-0.06332724541425705,
-0.0026291091926395893,
0.047050971537828445,
-0.009454667568206787,
-0.20254181325435638,
-0.18425032496452332,
0.014339842833578587,
0.026889657601714134,
0.1421622335910797,
0.021447373554110527,
0.08590592443943024,
0.10178244113922119,
0.004995244089514017,
0.09166105091571808,
-0.041868485510349274,
0.08008365333080292,
0.1650075912475586,
0.03877739980816841,
0.12172233313322067,
-0.06725417077541351,
0.022321930155158043,
0.10441168397665024,
0.024147000163793564,
0.07572353631258011,
0.08538661897182465,
0.1084807813167572,
-0.033958420157432556,
0.12086270749568939,
0.030711285769939423,
0.08974254876375198,
0.0727282389998436,
-0.01797889545559883,
0.040285661816596985,
-0.07390247285366058,
-0.0932237058877945,
0.014580875635147095,
0.024383988231420517,
0.034708742052316666,
-0.09565585106611252,
0.02069091983139515,
-0.01378297433257103,
0.053640637546777725,
0.09260772913694382,
-0.4368636906147003,
-0.04560796916484833,
0.02812187559902668,
-0.03196030855178833,
-0.11469262093305588,
-0.020624516531825066,
-0.012140089645981789,
-0.15140774846076965,
0.02998036891222,
-0.0014493034686893225,
0.11987759172916412,
-0.060144491493701935,
-0.03426051884889603,
-0.053130388259887695,
0.07455616444349289,
-0.009676697663962841,
0.07008881121873856,
-0.13349953293800354,
0.15155453979969025,
0.05389298498630524,
0.06025485321879387,
-0.06982556730508804,
0.015978367999196053,
0.021867843344807625,
-0.016239013522863388,
0.12326179444789886,
0.015869827941060066,
-0.13972514867782593,
-0.3735339045524597,
-0.1492094248533249,
0.027381164953112602,
-0.0054497066885232925,
0.01893169805407524,
0.0928296446800232,
-0.03977039083838463,
0.00044268369674682617,
-0.02294745296239853,
-0.016033785417675972,
-0.1238521933555603,
-0.06778955459594727,
0.032616641372442245,
0.10737466812133789,
-0.03857443481683731,
-0.04813686013221741,
-0.011078591458499432,
-0.021176107227802277,
0.10782372206449509,
-0.11954652518033981,
-0.05196750536561012,
-0.14183804392814636,
-0.02724814973771572,
0.1195610836148262,
-0.12292039394378662,
0.053631577640771866,
-0.016749074682593346,
0.09468507021665573,
-0.0014586752513423562,
-0.10469651222229004,
0.08095414936542511,
-0.06282718479633331,
-0.056112874299287796,
-0.0013670504558831453,
0.013695012778043747,
0.014077577739953995,
0.06823697686195374,
0.06857334822416306,
0.01863902434706688,
-0.059600282460451126,
-0.13528738915920258,
-0.015157301910221577,
0.05785265564918518,
0.1048203706741333,
0.07407375425100327,
0.004858409054577351,
-0.08936402201652527,
-0.046584367752075195,
0.06128161773085594,
0.10922247171401978,
0.29709431529045105,
-0.06784692406654358,
-0.02266879752278328,
0.10922743380069733,
-0.0231077391654253,
-0.20130610466003418,
-0.0385325625538826,
0.013846511952579021,
0.006339952349662781,
-0.06533733010292053,
-0.07232261449098587,
0.13614727556705475,
0.19401679933071136,
-0.049027882516384125,
-0.03919479250907898,
-0.24227085709571838,
-0.114839568734169,
0.18322764337062836,
0.09194938838481903,
0.04282698407769203,
-0.15524519979953766,
-0.05336916446685791,
-0.13663384318351746,
-0.17602410912513733,
0.12252875417470932,
-0.0718638226389885,
0.05567806959152222,
-0.032438550144433975,
0.10979552567005157,
0.02663533017039299,
-0.059265367686748505,
0.18789684772491455,
-0.0024200277402997017,
0.015108276158571243,
-0.04452188313007355,
-0.004413919523358345,
-0.01575043611228466,
-0.09256277233362198,
0.10552795231342316,
0.03792977333068848,
0.06119604408740997,
-0.21301080286502838,
-0.0038357521407306194,
0.014382869936525822,
0.0757659301161766,
-0.05311160907149315,
-0.035203512758016586,
-0.029861023649573326,
0.02849908545613289,
-0.0033781409729272127,
-0.03922217711806297,
-0.07359112054109573,
-0.03686891868710518,
0.07179908454418182,
0.21633683145046234,
0.07314252108335495,
-0.018105093389749527,
-0.08124815672636032,
0.05236079916357994,
-0.061160147190093994,
0.03966912627220154,
-0.09217085689306259,
0.06802672892808914,
0.12475419789552689,
0.021704137325286865,
0.0899394229054451,
0.01783386804163456,
-0.061754029244184494,
-0.0029736871365457773,
0.02790199965238571,
-0.11110394448041916,
0.041219525039196014,
0.023061834275722504,
0.08704584836959839,
-0.09064238518476486,
-0.061042629182338715,
0.11986839026212692,
0.01187712699174881,
-0.026825185865163803,
0.01904134638607502,
0.003867111634463072,
-0.004179686773568392,
0.2583937644958496,
0.022317128255963326,
0.09102410078048706,
-0.0982075110077858,
0.06646142899990082,
0.1063452810049057,
-0.12786507606506348,
0.017224954441189766,
0.05606532096862793,
-0.08762034773826599,
-0.05011717602610588,
-0.006344838999211788,
0.08584992587566376,
-0.15261034667491913,
-0.07459373027086258,
0.006417771801352501,
-0.08632303774356842,
0.06051213666796684,
0.1977546364068985,
0.08486253768205643,
0.0009720536763779819,
0.016934631392359734,
-0.08519653230905533,
-0.10966113954782486,
0.037686511874198914,
0.09115604311227798,
0.01901816762983799,
-0.09823364019393921,
0.147237166762352,
-0.007395233027637005,
-0.004940250422805548,
-0.006454460322856903,
-0.008288437500596046,
-0.1964161992073059,
-0.03227061778306961,
-0.12980827689170837,
0.08298797160387039,
-0.049112413078546524,
0.03379357233643532,
0.004611137788742781,
0.009560275822877884,
-0.07109136134386063,
0.02051215060055256,
-0.05846814066171646,
-0.07019519805908203,
0.018300488591194153,
0.059777095913887024,
-0.1017572358250618,
-0.03704414144158363,
0.10090330988168716,
-0.03275445103645325,
0.020828556269407272,
0.07528571039438248,
0.06613769382238388,
0.0023566700983792543,
-0.046872302889823914,
-0.008060446940362453,
0.05000922083854675,
0.053936917334795,
0.08804656565189362,
-0.19857697188854218,
0.05939888209104538,
0.0000995284499367699,
0.0037993716541677713,
0.045326147228479385,
0.09030831605195999,
-0.10367808490991592,
0.00039684720104560256,
-0.11838872730731964,
-0.06904883682727814,
-0.13457229733467102,
0.014108005911111832,
0.13673007488250732,
0.04684286192059517,
0.047041937708854675,
-0.07106819748878479,
0.04486503824591637,
-0.1777367889881134,
-0.013040895573794842,
-0.03315730020403862,
-0.008426935411989689,
0.029423410072922707,
-0.0077910879626870155,
0.0941719338297844,
-0.014429353177547455,
0.11407435685396194,
-0.04085151478648186,
0.04935672506690025,
0.021448927000164986,
0.04971880465745926,
-0.016714682802557945,
-0.04270181804895401,
0.1706577092409134,
0.10287311673164368,
0.010387400165200233,
0.09742675721645355,
0.06446250528097153,
0.051980260759592056,
-0.026085633784532547,
0.015364725142717361,
0.05265451595187187,
-0.0958859920501709,
0.0716325119137764,
0.022342797368764877,
-0.14128319919109344,
-0.010271988809108734,
0.07132383435964584,
-0.10789848119020462,
0.018629752099514008,
-0.031050898134708405,
0.03395587205886841,
0.13478024303913116,
-0.12408744543790817,
-0.016131095588207245,
-0.004599505104124546,
-0.06377854943275452,
-0.22061550617218018,
-0.07568144798278809,
-0.12920339405536652,
-0.03127296268939972,
-0.03326655924320221,
-0.10558286309242249,
0.013841111212968826,
0.1529759019613266,
0.013335596770048141,
0.032541438937187195,
0.06964453309774399,
-0.2192668616771698,
-0.014543235301971436,
-0.07429040968418121,
0.004810913000255823,
-0.0012467054184526205,
-0.023990103974938393,
-0.033086709678173065,
0.03471773490309715,
0.007151452358812094,
0.10330738127231598,
0.02442251145839691,
0.036276061087846756,
0.09245005995035172,
-0.017175383865833282,
-0.08349967747926712,
-0.05024426057934761,
0.014646963216364384,
0.014376454055309296,
0.1302267163991928,
0.028967782855033875,
0.022039981558918953,
-0.026192279532551765,
0.18880337476730347,
-0.10201960802078247,
0.03267978876829147,
-0.11252887547016144,
0.2642022669315338,
-0.0017578599508851767,
0.06203576177358627,
0.033895477652549744,
-0.020463282242417336,
-0.011282045394182205,
0.1894136369228363,
0.07582160085439682,
-0.0085012037307024,
-0.026684820652008057,
0.01466331072151661,
-0.013017214834690094,
-0.043538376688957214,
0.08939222246408463,
0.04277186468243599,
0.17854954302310944,
-0.0712360069155693,
0.04882243648171425,
0.03493758663535118,
-0.015712738037109375,
-0.10852085798978806,
0.047782182693481445,
0.0031187382992357016,
-0.0018907785415649414,
-0.006316635757684708,
0.08879775553941727,
-0.03846856579184532,
0.047554753720760345,
0.09010612964630127,
-0.04065253958106041,
-0.12171690165996552,
0.04801349714398384,
-0.05540959909558296,
-0.06307092308998108,
0.09452539682388306,
-0.062926284968853,
-0.009857245720922947,
0.061993308365345,
0.0022489377297461033,
-0.19223752617835999,
-0.07864765077829361,
0.0018988957162946463,
0.1460725963115692,
0.29639914631843567,
0.04349618777632713,
0.14053407311439514,
0.18154743313789368,
-0.011871041730046272,
-0.15773285925388336,
0.10738787055015564,
0.027921905741095543,
-0.1601412147283554,
0.08587478846311569,
0.05604833737015724,
-0.040138695389032364,
0.15435664355754852,
0.06967342644929886,
-0.14931057393550873,
-0.008001272566616535,
-0.012788231484591961,
0.09762732684612274,
-0.06358463317155838,
-0.005627590697258711,
-0.08936479687690735,
0.11755591630935669,
0.12135758996009827,
-0.04917318373918533,
0.002597111277282238,
-0.025143064558506012,
0.06545082479715347,
-0.00733879953622818,
-0.0024506011977791786,
-0.0031551464926451445,
-0.0920393094420433,
0.08705979585647583,
-0.21712498366832733,
0.02446869947016239,
-0.26909273862838745,
-0.037592750042676926,
-0.01706569455564022,
-0.05619123578071594,
-0.04727526754140854,
0.11800076067447662,
0.017965272068977356,
-0.004658299032598734,
-0.06010995805263519,
-0.1620914787054062,
0.00257194135338068,
0.1500435173511505,
-0.10530393570661545,
-0.13690844178199768
] |
null | null | null |
Trying to create my first BERT model.
|
{}
| null |
dee4hf/deeBERT
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
Trying to create my first BERT model.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
[
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null |
transformers
|
## Model description
T5 model trained for grammar correction. It corrects grammatical mistakes in input sentences.
### Dataset Description
The T5-base model has been trained on the C4_200M dataset.
### Model in Action 🚀
```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = 'deep-learning-analytics/GrammarCorrector'
torch_device = 'cuda' if torch.cuda.is_available() else 'cpu'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name).to(torch_device)

def correct_grammar(input_text, num_return_sequences, num_beams=10):
    # Beam search requires num_beams >= num_return_sequences
    batch = tokenizer([input_text], truncation=True, padding='max_length', max_length=64, return_tensors="pt").to(torch_device)
    translated = model.generate(**batch, max_length=64, num_beams=num_beams, num_return_sequences=num_return_sequences, temperature=1.5)
    tgt_text = tokenizer.batch_decode(translated, skip_special_tokens=True)
    return tgt_text
```
### Example Usage
```python
text = 'He are moving here.'
print(correct_grammar(text, num_return_sequences=2))
# ['He is moving here.', 'He is moving here now.']
```
Another example:
```python
text = 'Cat drinked milk'
print(correct_grammar(text, num_return_sequences=2))
# ['Cat drank milk.', 'Cat drink milk.']
```
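Since `correct_grammar` processes one sentence per call, correcting several sentences is just a loop over the function defined above. A minimal sketch (the sentence list is illustrative):
```python
sentences = [
    'He are moving here.',
    'Cat drinked milk',
    'She no went to the market yesterday.',
]

# Keep only the top-scoring beam for each sentence
for s in sentences:
    best = correct_grammar(s, num_return_sequences=1)[0]
    print(f'{s!r} -> {best!r}')
```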
Model Developed by [Priya-Dwivedi](https://www.linkedin.com/in/priyanka-dwivedi-6864362)
|
{}
|
text2text-generation
|
deep-learning-analytics/GrammarCorrector
|
[
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tf #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
## Model description
T5 model trained for Grammar Correction. This model corrects grammatical mistakes in input sentences
### Dataset Description
The T5-base model has been trained on C4_200M dataset.
### Model in Action
### Example Usage
Another example
Model Developed by Priya-Dwivedi
|
[
"## Model description\nT5 model trained for Grammar Correction. This model corrects grammatical mistakes in input sentences",
"### Dataset Description\nThe T5-base model has been trained on C4_200M dataset.",
"### Model in Action",
"### Example Usage\n\n\nAnother example\n\n\nModel Developed by Priya-Dwivedi"
] |
[
"TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"## Model description\nT5 model trained for Grammar Correction. This model corrects grammatical mistakes in input sentences",
"### Dataset Description\nThe T5-base model has been trained on C4_200M dataset.",
"### Model in Action",
"### Example Usage\n\n\nAnother example\n\n\nModel Developed by Priya-Dwivedi"
] |
[
55,
27,
23,
5,
18
] |
[
"passage: TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n## Model description\nT5 model trained for Grammar Correction. This model corrects grammatical mistakes in input sentences### Dataset Description\nThe T5-base model has been trained on C4_200M dataset.### Model in Action### Example Usage\n\n\nAnother example\n\n\nModel Developed by Priya-Dwivedi"
] |
[
-0.03358723595738411,
-0.057024288922548294,
-0.0007630208856426179,
0.04234803467988968,
0.13225926458835602,
0.008269775658845901,
0.06743551790714264,
0.08685694634914398,
-0.08996573090553284,
-0.052959997206926346,
0.12210143357515335,
0.09162948280572891,
0.013468880206346512,
0.07441484928131104,
-0.02841966785490513,
-0.30493128299713135,
0.06434714794158936,
-0.0005809462163597345,
-0.023819781839847565,
0.15444821119308472,
0.16060464084148407,
-0.05794020742177963,
0.0825514942407608,
0.02862095646560192,
-0.1818891167640686,
0.04838879033923149,
0.0633467435836792,
-0.11452792584896088,
0.16045211255550385,
0.0514364056289196,
0.08450613170862198,
0.05117590352892876,
0.06541822105646133,
-0.06299766898155212,
0.026675378903746605,
0.00969347171485424,
-0.051935598254203796,
0.04729482904076576,
0.03704021871089935,
0.0025814264081418514,
0.24792475998401642,
-0.050464581698179245,
0.03692631050944328,
0.033354271203279495,
-0.10357853025197983,
-0.005382757633924484,
0.03916135057806969,
0.1302533596754074,
0.07408428192138672,
0.1407446563243866,
-0.0777268186211586,
0.14026127755641937,
-0.13936161994934082,
0.10152578353881836,
0.06213417276740074,
-0.27694836258888245,
-0.03055398538708687,
0.14401420950889587,
0.04108016937971115,
0.08402568101882935,
0.02492264285683632,
0.07032723724842072,
0.0590728223323822,
0.02919023670256138,
0.10561709105968475,
-0.04232224076986313,
-0.051402490586042404,
0.07776419073343277,
-0.1439526528120041,
-0.02871696464717388,
0.3287774920463562,
-0.014393948018550873,
0.0021149960812181234,
-0.10627099871635437,
-0.12012311071157455,
-0.019302574917674065,
0.02892225794494152,
-0.1526484489440918,
-0.03929370269179344,
0.04584731534123421,
0.09196151793003082,
-0.06319595873355865,
-0.11367558687925339,
-0.04321586713194847,
-0.09699836373329163,
0.0494612492620945,
0.03209930285811424,
-0.013684891164302826,
-0.18222825229167938,
0.0942234918475151,
-0.10169602185487747,
-0.06060122326016426,
0.03897498548030853,
-0.12274480611085892,
-0.06173070892691612,
-0.026360642164945602,
-0.1385698765516281,
-0.1102169081568718,
0.013507289811968803,
-0.014508139342069626,
0.038519486784935,
-0.03005891852080822,
0.02197716198861599,
-0.0010986900888383389,
0.0759691596031189,
0.09444449096918106,
-0.17888444662094116,
0.03051329217851162,
0.023447174578905106,
-0.02552015148103237,
-0.03110603243112564,
-0.01412755623459816,
-0.11545616388320923,
-0.10953198373317719,
0.15513481199741364,
0.05331408604979515,
-0.04792216047644615,
0.11738374829292297,
-0.0440208874642849,
-0.0664714053273201,
0.010657913982868195,
-0.0665731132030487,
-0.08287837356328964,
-0.028340311720967293,
-0.041827984154224396,
0.06621448695659637,
0.05043863505125046,
0.030305951833724976,
-0.12779830396175385,
-0.05163715034723282,
-0.12728050351142883,
-0.091628797352314,
-0.0038204272277653217,
-0.12347420305013657,
-0.01856410503387451,
-0.0934605598449707,
0.04105350002646446,
-0.17041723430156708,
-0.13514798879623413,
0.02216854877769947,
-0.017597388476133347,
-0.06183477118611336,
-0.11685554683208466,
-0.07128472626209259,
-0.05586433410644531,
0.01258055493235588,
-0.030270904302597046,
0.03745030239224434,
-0.01118597574532032,
0.06686964631080627,
-0.051189396530389786,
0.0684744119644165,
-0.10596529394388199,
0.06229114532470703,
-0.10181010514497757,
-0.042306434363126755,
-0.033867549151182175,
0.0785096287727356,
0.07991530001163483,
-0.03433111310005188,
-0.08506917208433151,
-0.090459443628788,
-0.05222618579864502,
0.0474996343255043,
-0.0002806390111800283,
0.19093427062034607,
-0.14513012766838074,
-0.04685824364423752,
0.12992438673973083,
-0.04587783291935921,
-0.11464599519968033,
0.08684047311544418,
-0.021370381116867065,
0.17092527449131012,
0.10044441372156143,
0.15550360083580017,
0.011596092022955418,
-0.07276920974254608,
0.08356933295726776,
0.03322933241724968,
-0.0933147594332695,
0.03857875615358353,
0.05565729737281799,
0.0064355116337537766,
-0.11977840214967728,
0.01456928439438343,
-0.024525124579668045,
0.015656856819987297,
-0.05648518726229668,
-0.06173726171255112,
-0.051482539623975754,
-0.04564819484949112,
0.1111108809709549,
-0.0007804965716786683,
0.08117663860321045,
-0.0387725830078125,
-0.09186989814043045,
0.07987067103385925,
0.06854619830846786,
-0.03446575254201889,
0.022767378017306328,
-0.09697656333446503,
0.0671934261918068,
-0.008320841006934643,
0.02373732440173626,
-0.20932988822460175,
-0.04483148455619812,
-0.03284327685832977,
0.20906132459640503,
0.03364313021302223,
0.15979206562042236,
0.025224260985851288,
-0.035463251173496246,
-0.051692936569452286,
0.01879993826150894,
0.024694036692380905,
0.07092877477407455,
-0.05217380449175835,
-0.12404099106788635,
0.005794634111225605,
-0.028447028249502182,
0.14144471287727356,
-0.1392493098974228,
0.009239854291081429,
0.0525486133992672,
0.08827478438615799,
-0.012030849233269691,
0.05428585782647133,
0.003121295478194952,
-0.006509173661470413,
-0.037656307220458984,
-0.010449220426380634,
0.09499620646238327,
-0.01702367328107357,
-0.06538861244916916,
0.1349751502275467,
-0.17924925684928894,
0.043327346444129944,
0.14636214077472687,
-0.14612965285778046,
-0.08016103506088257,
0.029711799696087837,
-0.026582932099699974,
0.006379066500812769,
-0.011144566349685192,
-0.0770721584558487,
0.06358623504638672,
-0.07590962201356888,
0.11649378389120102,
-0.10671186447143555,
-0.01810780167579651,
0.04940996319055557,
-0.011294945143163204,
-0.020790211856365204,
0.08832939714193344,
-0.00022952903236728162,
-0.22490032017230988,
0.06865639984607697,
0.16524383425712585,
0.0542021244764328,
0.16405193507671356,
0.01557878963649273,
-0.06783639639616013,
-0.01935495249927044,
-0.032358456403017044,
-0.052020344883203506,
0.0064929756335914135,
-0.20573848485946655,
-0.03895043581724167,
0.05062824487686157,
0.008948209695518017,
0.05523963272571564,
-0.07087159156799316,
-0.029984863474965096,
0.0823880136013031,
-0.004593743942677975,
-0.03958554565906525,
0.13675571978092194,
0.002952280454337597,
0.1422792226076126,
0.016030950471758842,
-0.08939823508262634,
0.05930912494659424,
0.011326915584504604,
-0.17607557773590088,
0.18518993258476257,
-0.10300263017416,
-0.294143408536911,
-0.05423716455698013,
-0.00694007845595479,
-0.010894476436078548,
0.04247685521841049,
0.06924480199813843,
-0.11083807796239853,
-0.025994563475251198,
-0.052830200642347336,
0.0560484454035759,
-0.0793454498052597,
0.021094610914587975,
-0.1038980484008789,
-0.00005063434946350753,
-0.01598329097032547,
-0.10365501046180725,
-0.04356693476438522,
-0.01146105770021677,
-0.06612861901521683,
0.02960604801774025,
-0.17134912312030792,
0.051790155470371246,
0.2364407181739807,
-0.037629276514053345,
0.05979679897427559,
-0.04158519580960274,
0.26110079884529114,
-0.05093865841627121,
0.06526676565408707,
0.18538667261600494,
-0.03439195081591606,
-0.027718685567378998,
0.1596527397632599,
-0.041395194828510284,
-0.026455719023942947,
0.06789641082286835,
-0.001534234150312841,
-0.06208331137895584,
-0.19592203199863434,
-0.1358194351196289,
-0.08675333112478256,
-0.01852652058005333,
0.04199301078915596,
0.028207050636410713,
0.20483094453811646,
0.06407660245895386,
0.026785943657159805,
0.018489833921194077,
0.06007452681660652,
0.055530939251184464,
0.2514745593070984,
-0.04408952221274376,
0.1328192800283432,
-0.020607421174645424,
-0.10115329921245575,
0.08097869157791138,
-0.09429877251386642,
0.1318284124135971,
0.005871147383004427,
0.03429147228598595,
0.043947041034698486,
0.0943688154220581,
0.06157596781849861,
0.09033389389514923,
0.018203917890787125,
-0.01774168200790882,
-0.03910341113805771,
-0.08005093783140182,
-0.013499761000275612,
0.0791112631559372,
0.04165438190102577,
-0.06509354710578918,
-0.150229811668396,
-0.009682975709438324,
0.05316116660833359,
0.11424621194601059,
0.14326179027557373,
-0.2547415792942047,
-0.02690584398806095,
0.022990921512246132,
-0.0452633872628212,
-0.0936712846159935,
0.09754517674446106,
0.040288764983415604,
-0.13305872678756714,
-0.022598572075366974,
0.003571284469217062,
0.08817635476589203,
0.04851142317056656,
0.07192768901586533,
-0.04981297254562378,
-0.07207104563713074,
-0.025151196867227554,
0.10631244629621506,
-0.2654199004173279,
0.253396600484848,
0.001716673607006669,
-0.05363353341817856,
-0.11195322126150131,
-0.032624080777168274,
0.010933861136436462,
0.17879782617092133,
0.191984161734581,
0.018166281282901764,
0.060860056430101395,
-0.03362351283431053,
0.009973779320716858,
0.0360359251499176,
0.06109307333827019,
-0.03486232832074165,
0.06183664873242378,
-0.09553702920675278,
0.014476228505373001,
0.0789707750082016,
0.1775510311126709,
-0.18535062670707703,
-0.08001965284347534,
0.012457428500056267,
0.026881631463766098,
0.030442791059613228,
-0.03460121899843216,
-0.0825803205370903,
-0.04410345107316971,
0.10538969933986664,
0.08980134129524231,
-0.0993305966258049,
-0.12037433683872223,
0.05039544776082039,
0.07625534385442734,
-0.05877844616770744,
0.05807076022028923,
-0.00912890862673521,
0.03380393236875534,
-0.006319875828921795,
-0.15486089885234833,
0.1338493376970291,
-0.046104904264211655,
-0.04403474181890488,
0.00004180555333732627,
0.14499466121196747,
0.027313249185681343,
0.0027266032993793488,
0.05998445302248001,
0.026618873700499535,
-0.042893484234809875,
-0.03989134728908539,
-0.04735367372632027,
-0.021098464727401733,
0.06871530413627625,
0.040308352559804916,
-0.07355786114931107,
-0.1151159256696701,
-0.03187054023146629,
-0.027451401576399803,
0.22174714505672455,
0.029167402535676956,
-0.06612784415483475,
0.0874224379658699,
0.07844172418117523,
-0.06920522451400757,
-0.18963374197483063,
-0.05363750457763672,
-0.03797805309295654,
0.06981930881738663,
0.07414974272251129,
-0.0587523877620697,
0.012508407235145569,
-0.008728208020329475,
0.023346897214651108,
-0.15731608867645264,
-0.27199244499206543,
-0.13573063910007477,
0.20135387778282166,
0.05444207787513733,
0.2633294463157654,
-0.03614146634936333,
-0.00950732920318842,
0.0032610439229756594,
-0.1429954320192337,
0.17262601852416992,
-0.18084478378295898,
0.06190229952335358,
0.010141189210116863,
0.028334781527519226,
0.07494772225618362,
0.015249233692884445,
0.04533158987760544,
0.008906197734177113,
-0.028427258133888245,
-0.09027557075023651,
-0.1362084299325943,
0.11766796559095383,
0.008076596073806286,
0.0802997350692749,
0.04086919128894806,
0.09573014080524445,
-0.1260806918144226,
-0.09098298102617264,
-0.11964719742536545,
-0.00024751151795499027,
-0.05420977994799614,
-0.1148272380232811,
-0.03188704326748848,
0.04982040077447891,
0.04821913689374924,
-0.08067132532596588,
0.025108810514211655,
-0.1260356605052948,
0.09665030241012573,
0.1582687497138977,
0.17781272530555725,
0.017662864178419113,
0.04681920260190964,
0.00043472001561895013,
-0.05909247696399689,
0.06783969700336456,
-0.17928200960159302,
0.03973885998129845,
0.09244096279144287,
-0.01847938820719719,
0.10863058269023895,
0.07222581654787064,
-0.020426947623491287,
-0.0602477602660656,
0.09192726761102676,
-0.19134952127933502,
-0.10753250122070312,
-0.1454673856496811,
-0.13083596527576447,
0.021407485008239746,
0.0013271226780489087,
0.16793473064899445,
-0.10988322645425797,
-0.03380420804023743,
0.007473164703696966,
0.0028720067348331213,
-0.08054544031620026,
0.06875472515821457,
0.08942066878080368,
0.03578535467386246,
-0.06690187007188797,
0.009432516060769558,
0.05648871883749962,
-0.0795355886220932,
0.05916818231344223,
0.09784507751464844,
-0.15981590747833252,
-0.08148239552974701,
-0.007777099497616291,
0.010447980836033821,
-0.16672611236572266,
-0.057056888937950134,
-0.05080932378768921,
-0.14957109093666077,
0.05540889501571655,
0.10966778546571732,
0.09417906403541565,
0.08261524885892868,
-0.10074105858802795,
-0.06306201964616776,
-0.062460675835609436,
0.10288739204406738,
0.05260913074016571,
-0.04221927374601364,
-0.13836854696273804,
0.10849729925394058,
-0.03161805123090744,
0.11164608597755432,
-0.10265061259269714,
-0.05302801728248596,
-0.10418745875358582,
0.019142724573612213,
-0.22857996821403503,
-0.07684385776519775,
-0.05485110357403755,
-0.04283798858523369,
-0.0203232504427433,
-0.013374220579862595,
-0.011075160466134548,
0.023558828979730606,
-0.05238534137606621,
0.022393841296434402,
-0.0025351981166750193,
0.08089986443519592,
-0.05615935102105141,
-0.007142513524740934,
0.012285386212170124,
-0.023651784285902977,
0.11535990238189697,
0.025722479447722435,
-0.10448291897773743,
0.13017775118350983,
-0.00850857887417078,
0.00742123881354928,
0.028056250885128975,
0.005892714019864798,
0.0771707147359848,
-0.11487159132957458,
0.029001975432038307,
0.05014496669173241,
0.03443627059459686,
0.036079421639442444,
0.03497345373034477,
-0.03720512613654137,
0.003881279844790697,
-0.021859653294086456,
-0.044567011296749115,
-0.0654483437538147,
0.031109025701880455,
-0.002361086430028081,
0.0452711321413517,
0.07781263440847397,
-0.040809933096170425,
0.027865178883075714,
-0.12137328088283539,
0.031061401590704918,
-0.05055694282054901,
-0.04226226359605789,
-0.056793831288814545,
-0.08667050302028656,
0.023948244750499725,
-0.06207318231463432,
0.11722845584154129,
0.020646508783102036,
0.07801368087530136,
0.06464128941297531,
0.056970104575157166,
0.06411115825176239,
0.016318971291184425,
0.28034672141075134,
0.017178554087877274,
-0.0345168299973011,
-0.058778319507837296,
0.03319129720330238,
0.0019800527952611446,
0.14054840803146362,
0.1944372057914734,
-0.0002471651532687247,
-0.005177283193916082,
0.0865662544965744,
0.03569960966706276,
-0.008581554517149925,
-0.05524452030658722,
-0.09000684320926666,
-0.0026891815941780806,
0.06330792605876923,
-0.0445663183927536,
-0.02162715047597885,
0.2457631528377533,
-0.10329212993383408,
0.025845283642411232,
-0.061372097581624985,
-0.10286390036344528,
-0.1551012247800827,
-0.16106507182121277,
-0.08365116268396378,
-0.10255903005599976,
-0.04332895949482918,
-0.1668715626001358,
0.009394952096045017,
0.06117963790893555,
0.0632031038403511,
-0.09911077469587326,
0.11647836863994598,
-0.015325316227972507,
-0.10923903435468674,
0.10736412554979324,
-0.007992236874997616,
0.06955534219741821,
-0.024754084646701813,
0.03323996439576149,
-0.06265611946582794,
0.03485097363591194,
-0.05249099060893059,
-0.018000690266489983,
-0.04326290637254715,
-0.009648657403886318,
-0.06797734647989273,
-0.040049802511930466,
-0.04999396577477455,
0.02958434820175171,
-0.002518007066100836,
0.11609716713428497,
0.025439836084842682,
-0.08162932097911835,
-0.012656006030738354,
0.25264957547187805,
-0.06445537507534027,
-0.1361321210861206,
-0.1422700583934784,
0.24870318174362183,
0.03679446503520012,
0.05486856400966644,
0.07884366065263748,
-0.05920049548149109,
-0.041282057762145996,
0.29169777035713196,
0.3063417673110962,
-0.07886712998151779,
0.027751076966524124,
0.014078998006880283,
0.01554965041577816,
0.04752660542726517,
0.1529511660337448,
0.03414854779839516,
0.19614572823047638,
-0.06601777672767639,
0.08454747498035431,
-0.062073964625597,
-0.01579415239393711,
-0.011372217908501625,
0.14475709199905396,
0.16164831817150116,
-0.10403858125209808,
0.01007834542542696,
0.13973937928676605,
-0.11742039769887924,
0.06367004662752151,
-0.09830565750598907,
-0.07586774975061417,
-0.09257379174232483,
-0.06199193373322487,
-0.06728421896696091,
0.0022561363875865936,
0.10445927083492279,
-0.0610804483294487,
0.006095349323004484,
0.09113107621669769,
0.0771305039525032,
-0.14983782172203064,
-0.03200839087367058,
0.17636050283908844,
0.07418039441108704,
0.030567051842808723,
0.0014142735162749887,
0.1053718850016594,
0.08706925064325333,
0.02822836861014366,
-0.06459800899028778,
0.06686335057020187,
0.00890607014298439,
0.07721679657697678,
0.028996599838137627,
0.009773632511496544,
0.019330870360136032,
-0.07555954903364182,
0.03162269666790962,
-0.20138554275035858,
0.04324735328555107,
-0.00839761458337307,
0.006953307893127203,
-0.1000121459364891,
0.07172771543264389,
-0.08238980174064636,
0.10811548680067062,
0.14793480932712555,
-0.05031171441078186,
0.00040917249862104654,
-0.07428295165300369,
0.015621540136635303,
0.014928069896996021,
0.014904323033988476,
-0.005367249250411987,
-0.08669400215148926,
-0.014011498540639877,
0.006010797340422869,
-0.024439705535769463,
-0.20537646114826202,
-0.03622344508767128,
-0.08096889406442642,
-0.014891432598233223,
-0.10833694040775299,
0.1475931704044342,
0.11434213817119598,
0.016345467418432236,
0.021252961829304695,
0.1286795437335968,
-0.00831554364413023,
0.09398916363716125,
-0.13813482224941254,
-0.1135336384177208
] |
null | null |
transformers
|
# Model name
Closed Book Trivia-QA T5 base
## Model description
This is a T5-base model trained on the No Context TriviaQA data set. The input to the model is a trivia-style question, and the model is tuned to retrieve the answer from its own memory rather than from a supplied passage. The pretrained model used here was trained on the Common Crawl (C4) data set. The model was fine-tuned for 135 epochs with a batch size of 32 and a learning rate of 1e-3. Max_input_length is set to 25 and max_output_length to 10. The model attained an EM score of 17 and a Subset Match score of 24.5.
We have written a blog post that covers the training procedure. Please find it [here](https://medium.com/@priya.dwivedi/build-a-trivia-bot-using-t5-transformer-345ff83205b6).
Test the model on Trivia Questions from the websites below:
https://www.triviaquestionss.com/easy-trivia-questions/
https://laffgaff.com/easy-trivia-questions-and-answers/
## Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("deep-learning-analytics/triviaqa-t5-base")
model = AutoModelWithLMHead.from_pretrained("deep-learning-analytics/triviaqa-t5-base")

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = model.to(device)

text = "Who directed the movie Jaws?"
preprocess_text = text.strip().replace("\n", "")
tokenized_text = tokenizer.encode(preprocess_text, return_tensors="pt").to(device)

# Closed-book generation: the answer comes from the model's parameters;
# no external context is supplied.
outs = model.generate(
    tokenized_text,
    max_length=10,
    num_beams=2,
    early_stopping=True
)
dec = [tokenizer.decode(ids, skip_special_tokens=True) for ids in outs]
print("Predicted Answer: ", dec)
```
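
For reference, the EM and Subset Match numbers above can be reproduced with simple string comparisons. The sketch below is a hedged illustration, not the exact evaluation script used for this model: it assumes SQuAD-style answer normalization and treats Subset Match as either normalized string containing the other.

```python
import re
import string

def normalize(text: str) -> str:
    """Lower-case, drop articles and punctuation, collapse whitespace."""
    text = text.lower()
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    text = text.translate(str.maketrans("", "", string.punctuation))
    return " ".join(text.split())

def exact_match(prediction: str, truth: str) -> bool:
    return normalize(prediction) == normalize(truth)

def subset_match(prediction: str, truth: str) -> bool:
    # Count a hit if either normalized string contains the other
    p, t = normalize(prediction), normalize(truth)
    return t in p or p in t

# Aggregate over (prediction, gold) pairs
pairs = [("Steven Spielberg", "steven spielberg"), ("Spielberg", "Steven Spielberg")]
em = 100 * sum(exact_match(p, t) for p, t in pairs) / len(pairs)
sm = 100 * sum(subset_match(p, t) for p, t in pairs) / len(pairs)
print(f"EM: {em:.1f}  Subset Match: {sm:.1f}")
```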
|
{"language": "eng", "tags": ["triviaqa", "t5-base", "pytorch", "lm-head", "question-answering", "closed-book", "t5", "pipeline:question-answering"], "datasets": ["triviaqa"], "metrics": [{"EM": 17}, {"Subset match": 24.5}], "widget": [{"text": ["Mount Everest is found in which mountain range?", "None"]}]}
|
question-answering
|
deep-learning-analytics/triviaqa-t5-base
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"triviaqa",
"t5-base",
"lm-head",
"question-answering",
"closed-book",
"pipeline:question-answering",
"eng",
"dataset:triviaqa",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"eng"
] |
TAGS
#transformers #pytorch #t5 #text2text-generation #triviaqa #t5-base #lm-head #question-answering #closed-book #pipeline-question-answering #eng #dataset-triviaqa #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model name
Closed Book Trivia-QA T5 base
## Model description
This is a T5-base model trained on No Context Trivia QA data set. The input to the model is a Trivia type question. The model is tuned to search for the answer in its memory to return it. The pretrained model used here was trained on Common Crawl (C4) data set. The model was trained for 135 epochs using a batch size of 32 and learning rate of 1e-3. Max_input_length is set as 25 and max_output_length is 10. Model attained an EM score of 17 and a Subset Match score of 24.5.
We have written a blog post that covers the training procedure. Please find it here.
Test the model on Trivia Questions from the websites below:
URL
URL
## Usage
|
[
"# Model name\nClosed Book Trivia-QA T5 base",
"## Model description\n\nThis is a T5-base model trained on No Context Trivia QA data set. The input to the model is a Trivia type question. The model is tuned to search for the answer in its memory to return it. The pretrained model used here was trained on Common Crawl (C4) data set. The model was trained for 135 epochs using a batch size of 32 and learning rate of 1e-3. Max_input_lngth is set as 25 and max_output_length is 10. Model attained an EM score of 17 and a Subset Match score of 24.5\nWe have written a blog post that covers the training procedure. Please find it here. \n\nTest the model on Trivia Questions from the websites below:\nURL\nURL",
"## Usage"
] |
[
"TAGS\n#transformers #pytorch #t5 #text2text-generation #triviaqa #t5-base #lm-head #question-answering #closed-book #pipeline-question-answering #eng #dataset-triviaqa #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model name\nClosed Book Trivia-QA T5 base",
"## Model description\n\nThis is a T5-base model trained on No Context Trivia QA data set. The input to the model is a Trivia type question. The model is tuned to search for the answer in its memory to return it. The pretrained model used here was trained on Common Crawl (C4) data set. The model was trained for 135 epochs using a batch size of 32 and learning rate of 1e-3. Max_input_lngth is set as 25 and max_output_length is 10. Model attained an EM score of 17 and a Subset Match score of 24.5\nWe have written a blog post that covers the training procedure. Please find it here. \n\nTest the model on Trivia Questions from the websites below:\nURL\nURL",
"## Usage"
] |
[
90,
13,
170,
3
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #triviaqa #t5-base #lm-head #question-answering #closed-book #pipeline-question-answering #eng #dataset-triviaqa #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model name\nClosed Book Trivia-QA T5 base## Model description\n\nThis is a T5-base model trained on No Context Trivia QA data set. The input to the model is a Trivia type question. The model is tuned to search for the answer in its memory to return it. The pretrained model used here was trained on Common Crawl (C4) data set. The model was trained for 135 epochs using a batch size of 32 and learning rate of 1e-3. Max_input_lngth is set as 25 and max_output_length is 10. Model attained an EM score of 17 and a Subset Match score of 24.5\nWe have written a blog post that covers the training procedure. Please find it here. \n\nTest the model on Trivia Questions from the websites below:\nURL\nURL## Usage"
] |
[
-0.028763024136424065,
0.0035707345232367516,
-0.0031528917606920004,
0.07990401983261108,
0.08451330661773682,
0.025724291801452637,
0.07930323481559753,
0.11586112529039383,
-0.01502144429832697,
0.06347339600324631,
0.045811232179403305,
-0.01044554729014635,
0.05091332271695137,
0.0555778369307518,
0.03702797368168831,
-0.20151858031749725,
-0.004717132542282343,
-0.028049468994140625,
0.016690175980329514,
0.13156574964523315,
0.07983823120594025,
-0.07404287159442902,
0.04946776479482651,
-0.02379518188536167,
-0.02251814864575863,
0.020318347960710526,
-0.022091945633292198,
-0.008544351905584335,
0.06100960448384285,
0.047869790345430374,
0.02854214608669281,
-0.029212135821580887,
0.058194730430841446,
-0.203924760222435,
0.03134413808584213,
0.07953909039497375,
-0.027399955317378044,
0.042823806405067444,
0.02819756232202053,
0.046668730676174164,
0.128959059715271,
0.04564936086535454,
0.04269770532846451,
0.0991382747888565,
-0.08053737133741379,
-0.03586018085479736,
-0.09530956298112869,
0.09292614459991455,
0.10871107876300812,
0.1361066997051239,
-0.03475301340222359,
0.12308535724878311,
-0.10383081436157227,
0.07572654634714127,
0.1001700833439827,
-0.2742599546909332,
-0.03949869051575661,
0.11121702194213867,
0.10921183228492737,
0.05146501585841179,
-0.07116781920194626,
-0.02300214394927025,
0.05190245062112808,
0.02715212292969227,
0.03802141919732094,
0.0030023320578038692,
0.05996466055512428,
0.03881486505270004,
-0.12423614412546158,
-0.03916435316205025,
0.13443852961063385,
0.0679459273815155,
-0.03838152065873146,
-0.06612143665552139,
-0.07562030106782913,
0.005440949462354183,
0.026023872196674347,
-0.03266146779060364,
0.05120289325714111,
0.04758605733513832,
-0.023542726412415504,
-0.07540401816368103,
-0.0862002894282341,
-0.08185287564992905,
-0.0646544024348259,
-0.07429911196231842,
0.03768843039870262,
0.044240377843379974,
-0.11996178328990936,
0.07137961685657501,
-0.010961661115288734,
-0.11383523046970367,
-0.035951174795627594,
-0.025444233790040016,
-0.06390313804149628,
-0.01915428228676319,
-0.04334959387779236,
-0.11351009458303452,
0.002666935557499528,
0.13127601146697998,
0.05602381378412247,
0.009376045316457748,
0.016700981184840202,
0.018562283366918564,
0.023629944771528244,
0.14412711560726166,
-0.07771679013967514,
-0.07203483581542969,
0.01888781599700451,
0.04130823165178299,
-0.027796044945716858,
-0.006896659731864929,
-0.037116143852472305,
0.029806295409798622,
0.07128836214542389,
0.04099148511886597,
0.06655620783567429,
0.029079578816890717,
0.012994207441806793,
-0.050994254648685455,
0.06605660915374756,
-0.10475095361471176,
-0.01595800369977951,
0.01462080329656601,
-0.10836520791053772,
0.0758204385638237,
0.020293204113841057,
-0.0289935190230608,
-0.12381768971681595,
-0.00772405369207263,
-0.1100962683558464,
-0.029599584639072418,
-0.055630914866924286,
-0.0863843709230423,
0.04525749385356903,
-0.037129029631614685,
-0.056059397757053375,
-0.06390387564897537,
-0.178081676363945,
-0.09115481376647949,
0.0686350017786026,
-0.05758798122406006,
0.009511656127870083,
-0.048981256783008575,
-0.022868653759360313,
0.02300027757883072,
0.014996043406426907,
-0.0138394208624959,
-0.0403551422059536,
0.08768591284751892,
-0.015927353873848915,
0.0533556193113327,
0.0305437333881855,
0.031025132164359093,
-0.07303975522518158,
-0.00015333473857026547,
-0.08003023266792297,
0.07922670990228653,
0.05342759191989899,
0.07402287423610687,
-0.11951803416013718,
-0.06497473269701004,
-0.061990637332201004,
-0.01015468966215849,
0.063737653195858,
0.12730103731155396,
-0.1529460996389389,
0.01575821451842785,
0.21851983666419983,
-0.042754050344228745,
-0.12214349210262299,
0.09179290384054184,
-0.024346398189663887,
0.019662190228700638,
0.11437390744686127,
0.10394918918609619,
0.11012645810842514,
-0.07429639250040054,
-0.06656290590763092,
0.005666282959282398,
-0.0336737334728241,
-0.007913301698863506,
0.0804409384727478,
-0.04437050223350525,
0.036674223840236664,
0.006783812306821346,
0.033912017941474915,
-0.02277045138180256,
-0.046723004430532455,
-0.03811788186430931,
-0.04246772453188896,
-0.04518131911754608,
-0.020880775526165962,
-0.0055698370561003685,
-0.004921804182231426,
-0.05168136954307556,
-0.1390371024608612,
0.057924479246139526,
0.08301004022359848,
-0.016632748767733574,
0.03873039782047272,
-0.057778846472501755,
0.03417985141277313,
-0.08112860471010208,
0.025712577626109123,
-0.15008199214935303,
-0.05163755267858505,
-0.01303993258625269,
0.0678449422121048,
0.07913666218519211,
0.045939307659864426,
0.03117670863866806,
-0.04520845040678978,
-0.024602221325039864,
-0.016899190843105316,
-0.030655715614557266,
-0.059453077614307404,
-0.14077146351337433,
-0.0888071209192276,
-0.03532933443784714,
0.0147422319278121,
-0.019457144662737846,
-0.11993058025836945,
0.01258213073015213,
-0.059952717274427414,
0.07306649535894394,
-0.023683402687311172,
-0.03725963830947876,
0.016938455402851105,
-0.0726526528596878,
-0.024881616234779358,
-0.03438105434179306,
0.022546784952282906,
0.02110963873565197,
-0.022125639021396637,
0.05039137229323387,
-0.12612584233283997,
-0.06478467583656311,
0.09551656246185303,
0.006950404495000839,
-0.003033093409612775,
0.03469444811344147,
-0.039978619664907455,
-0.0102165462449193,
-0.0367625392973423,
-0.08888689428567886,
0.12695246934890747,
0.034812964498996735,
0.11798228323459625,
-0.13215982913970947,
-0.13206471502780914,
0.004390477668493986,
-0.002645589644089341,
0.018321357667446136,
0.04322710260748863,
0.0382072888314724,
-0.19666548073291779,
0.03454141691327095,
0.06440452486276627,
0.07559391856193542,
0.13887344300746918,
0.009544205851852894,
-0.12189456820487976,
-0.02677331119775772,
0.03144540265202522,
-0.03905778005719185,
0.06237941235303879,
-0.04210358113050461,
0.027465559542179108,
0.009860685095191002,
0.0791921466588974,
0.04079210013151169,
-0.09763310849666595,
0.024460267275571823,
0.05484678968787193,
-0.0427919402718544,
-0.0875454694032669,
0.0005034258938394487,
-0.025208670645952225,
0.08760345727205276,
0.098732128739357,
0.03803775832056999,
-0.030987834557890892,
-0.06934969872236252,
-0.14391975104808807,
0.18598149716854095,
-0.08534310013055801,
-0.17448581755161285,
-0.10937207192182541,
0.04446965083479881,
0.006815125700086355,
-0.020248141139745712,
0.06313612312078476,
-0.13525642454624176,
-0.03814087435603142,
-0.0937703475356102,
-0.01313682273030281,
-0.05355758219957352,
0.00486739631742239,
-0.0303411353379488,
0.027000539004802704,
0.021362056955695152,
-0.15547096729278564,
0.0130483228713274,
-0.06872980296611786,
-0.1049901694059372,
-0.02960851602256298,
-0.05111497640609741,
0.03671780228614807,
0.17310240864753723,
-0.022794457152485847,
0.013401408679783344,
-0.01903468742966652,
0.2396983951330185,
-0.060774412006139755,
0.07547850906848907,
0.17357812821865082,
0.05547839030623436,
0.0749637633562088,
0.07328729331493378,
0.005087960045784712,
-0.05578560009598732,
0.08093664795160294,
0.09164411574602127,
-0.04265562444925308,
-0.31039920449256897,
-0.01789916679263115,
-0.03680586814880371,
-0.05785365775227547,
0.046202652156353,
0.07399492710828781,
-0.004817880690097809,
0.018870243802666664,
-0.08377356827259064,
0.025423649698495865,
-0.005692989099770784,
0.061073269695043564,
0.03449252247810364,
0.009778497740626335,
0.08229069411754608,
-0.06843185424804688,
0.0379481241106987,
0.11502574384212494,
0.017373790964484215,
0.1704697459936142,
-0.02174147218465805,
0.08206382393836975,
0.07028862088918686,
0.058608319610357285,
-0.03736918792128563,
0.088371142745018,
-0.08285044133663177,
0.03787097707390785,
-0.011046387255191803,
-0.05330974608659744,
0.02321488969027996,
0.04034387320280075,
-0.013705052435398102,
0.05752921849489212,
-0.10096283257007599,
-0.011029395274817944,
0.08196671307086945,
0.17833462357521057,
0.08063820749521255,
-0.15643353760242462,
-0.09676951915025711,
0.006986046209931374,
-0.11108556389808655,
-0.04894493892788887,
0.04855036735534668,
0.04814177751541138,
-0.10948184877634048,
0.004510453436523676,
0.0174952931702137,
0.15513378381729126,
0.0035502780228853226,
-0.0009985335636883974,
0.05732789263129234,
0.015975967049598694,
-0.018598349764943123,
0.0776670053601265,
-0.15214017033576965,
0.11158961057662964,
0.03726943954825401,
0.06395412236452103,
-0.036689337342977524,
0.0015639454359188676,
0.018383724614977837,
-0.03555823490023613,
0.08003419637680054,
0.02746928483247757,
0.12116456776857376,
-0.011637628078460693,
-0.10544177889823914,
0.04551421105861664,
0.09928648918867111,
-0.04126860201358795,
0.13487623631954193,
-0.0657418817281723,
0.0146182207390666,
0.004190341103821993,
0.08183803409337997,
-0.09290207177400589,
-0.08161254227161407,
-0.01060230191797018,
-0.010010928846895695,
-0.045753803104162216,
-0.018815778195858,
-0.023643696680665016,
0.030985670164227486,
0.203258216381073,
-0.07991435378789902,
-0.0790448933839798,
-0.111958809196949,
0.030080510303378105,
0.118623286485672,
-0.07571004331111908,
0.006414984352886677,
-0.0129713648930192,
0.045436255633831024,
0.030658867210149765,
-0.12900683283805847,
0.06730247288942337,
-0.0492262989282608,
-0.10553650557994843,
-0.05068313702940941,
0.12793447077274323,
0.03342634066939354,
0.05237703025341034,
0.011466801166534424,
0.004972578026354313,
-0.10270772129297256,
-0.095216765999794,
-0.026440592482686043,
0.04621221125125885,
0.03785461559891701,
0.004315305966883898,
-0.04261655732989311,
0.0783107727766037,
-0.04798131063580513,
0.008901562541723251,
0.05409650877118111,
0.10244745761156082,
-0.02882578782737255,
0.08974986523389816,
0.13083882629871368,
-0.07169263809919357,
-0.17158003151416779,
-0.020011810585856438,
-0.014681339263916016,
0.0012566556688398123,
-0.004144262056797743,
-0.0975361242890358,
0.0761372298002243,
0.019017687067389488,
-0.013917457312345505,
0.057066626846790314,
-0.2632719576358795,
-0.10770856589078903,
0.002467632293701172,
0.016749117523431778,
0.12417764961719513,
-0.114029660820961,
-0.05786633491516113,
-0.025248270481824875,
-0.14366087317466736,
0.0267026349902153,
-0.13271987438201904,
0.093650683760643,
-0.04254898056387901,
-0.028491882607340813,
0.027849026024341583,
-0.03474484756588936,
0.03235781937837601,
0.00721945334225893,
0.03187388554215431,
-0.0798586830496788,
0.024820704013109207,
0.08909982442855835,
-0.08372378349304199,
0.1598966270685196,
-0.09123427420854568,
0.12430418282747269,
-0.18362584710121155,
-0.0536283403635025,
-0.06318451464176178,
-0.07133904099464417,
-0.05025409162044525,
-0.04120773822069168,
-0.08912389725446701,
0.03213288635015488,
0.1364535093307495,
-0.0009972997941076756,
-0.014736567623913288,
-0.027915576472878456,
0.08075176924467087,
0.09390300512313843,
0.14271222054958344,
-0.038778696209192276,
-0.14567658305168152,
0.0031802996527403593,
-0.0026465803384780884,
0.08609496802091599,
-0.23602142930030823,
0.01673976518213749,
0.11922605335712433,
0.0295069869607687,
0.06552578508853912,
0.07660286128520966,
-0.0985167920589447,
-0.0030879175756126642,
0.07205482572317123,
-0.05460621416568756,
-0.13798640668392181,
-0.06391174346208572,
-0.064240962266922,
-0.1639019250869751,
0.017897188663482666,
0.10495197027921677,
-0.02243070863187313,
-0.028197357431054115,
-0.0017763443756848574,
0.03509151190519333,
-0.014673229306936264,
0.13314229249954224,
0.04935642331838608,
0.07067207247018814,
-0.05253181606531143,
0.0968019887804985,
0.07883799821138382,
-0.026648418977856636,
0.07005458325147629,
0.17310172319412231,
-0.07242600619792938,
-0.04545649513602257,
0.07039876282215118,
-0.016378730535507202,
-0.05936312675476074,
0.013947110623121262,
-0.07211791723966599,
-0.02121036872267723,
0.09110378473997116,
0.15381652116775513,
0.015106864273548126,
0.06779423356056213,
0.004490526393055916,
0.008032609708607197,
-0.054917145520448685,
0.09177710115909576,
-0.0062653739005327225,
-0.028145508840680122,
-0.05164486914873123,
0.01270902156829834,
0.013994891196489334,
0.08410412073135376,
-0.030041048303246498,
-0.03703511878848076,
-0.10059761255979538,
0.01764928549528122,
-0.14215168356895447,
-0.032391227781772614,
-0.01794418692588806,
-0.019359085708856583,
-0.039879996329545975,
-0.019586168229579926,
-0.02167532965540886,
0.051759883761405945,
-0.02561059407889843,
-0.07555141299962997,
-0.025763656944036484,
0.02127119153738022,
-0.15813735127449036,
0.014080886729061604,
0.017635785043239594,
-0.028947340324521065,
0.0884992927312851,
-0.0020858042407780886,
-0.039381083101034164,
0.05193573236465454,
-0.07263553142547607,
-0.018151067197322845,
-0.04441928490996361,
0.041085150092840195,
0.005946805234998465,
-0.009107212536036968,
0.0375831313431263,
-0.011808428913354874,
0.006956405472010374,
0.013543262146413326,
-0.033722877502441406,
-0.108083076775074,
-0.03364308550953865,
-0.08029185980558395,
-0.010276134125888348,
-0.05906715616583824,
0.0019783517345786095,
0.003065788187086582,
0.0595439076423645,
0.11081068217754364,
-0.030469650402665138,
0.04865557700395584,
-0.2574160099029541,
-0.005012934561818838,
0.036742761731147766,
0.054179295897483826,
-0.06566519290208817,
-0.018287917599081993,
0.04487413167953491,
-0.06149045750498772,
0.07995619624853134,
0.026662567630410194,
0.05920511484146118,
0.039898429065942764,
0.090681292116642,
-0.022678548470139503,
-0.034848589450120926,
0.13264048099517822,
0.009330972097814083,
-0.04303396865725517,
0.0230824314057827,
0.007527417503297329,
-0.0007545187254436314,
-0.07110516726970673,
0.16503283381462097,
0.0852997675538063,
0.1087975725531578,
0.07971521466970444,
0.02950895205140114,
-0.003539915895089507,
-0.10263691842556,
-0.073955237865448,
-0.012296421453356743,
0.03133302554488182,
-0.013013885356485844,
0.06647532433271408,
0.21950577199459076,
-0.10668939352035522,
0.07320642471313477,
-0.031943179666996,
-0.044398900121450424,
-0.13903126120567322,
-0.15376868844032288,
-0.05738016217947006,
-0.09517732262611389,
0.020996222272515297,
-0.13020575046539307,
0.04479669779539108,
0.056605834513902664,
0.03830147162079811,
-0.02312289923429489,
0.04736912623047829,
-0.0011215230915695429,
-0.06431744992733002,
0.03430460765957832,
0.0176985003054142,
0.019238010048866272,
0.06285685300827026,
0.07855910807847977,
0.03677018731832504,
-0.020826265215873718,
0.05810797959566116,
0.056628789752721786,
-0.004775369539856911,
0.03164839744567871,
-0.02626197598874569,
-0.0832202285528183,
-0.010978702455759048,
0.03272407501935959,
0.037790447473526,
0.1419246941804886,
0.06748591363430023,
-0.04462563246488571,
-0.029184479266405106,
0.22709104418754578,
-0.0670560747385025,
-0.07269800454378128,
-0.11249038577079773,
0.16204063594341278,
0.06141158193349838,
-0.02211851067841053,
-0.002326155314221978,
-0.1361425668001175,
-0.016988446936011314,
0.27916932106018066,
0.1621730476617813,
-0.04310343414545059,
-0.014981100335717201,
0.008502643555402756,
0.004101707134395838,
0.06468497216701508,
0.104460708796978,
0.09373495727777481,
0.3014196753501892,
-0.06672500818967819,
0.03753674775362015,
-0.0049325572326779366,
-0.04626297950744629,
0.01648971624672413,
0.05103488638997078,
-0.0047287121415138245,
0.0304169412702322,
-0.04256726801395416,
0.0854448452591896,
-0.0694214329123497,
-0.04338006675243378,
0.015854306519031525,
-0.04260898381471634,
-0.06775925308465958,
-0.04764881730079651,
0.007850593887269497,
-0.002811734564602375,
0.0856180191040039,
0.010322199203073978,
0.01681806892156601,
0.08970807492733002,
-0.01408176589757204,
-0.16079343855381012,
-0.11718536168336868,
0.07579556107521057,
-0.01267156284302473,
0.16730351746082306,
0.0396440215408802,
0.04502001032233238,
0.09534171968698502,
0.0251801535487175,
-0.10145555436611176,
0.06208818778395653,
0.02970064803957939,
-0.07306063920259476,
0.0326412208378315,
0.0782763659954071,
0.012420488521456718,
0.001812748028896749,
0.07468147575855255,
-0.01703418418765068,
0.002143810736015439,
0.005668801721185446,
0.02268035337328911,
-0.17190483212471008,
0.07579559087753296,
-0.06741315126419067,
0.1683354526758194,
0.16032233834266663,
-0.08100525289773941,
0.0160488560795784,
-0.050915833562612534,
-0.010510697960853577,
-0.027426276355981827,
0.03405505418777466,
0.006017705425620079,
-0.12143182754516602,
0.026167631149291992,
-0.062434833496809006,
-0.009978014044463634,
-0.27942025661468506,
-0.04749125614762306,
0.020169414579868317,
-0.033305246382951736,
-0.04640723764896393,
0.13593682646751404,
0.10054979473352432,
0.03358108922839165,
-0.048445459455251694,
-0.003493774216622114,
-0.029041435569524765,
0.11621271073818207,
-0.1612091362476349,
-0.08679135888814926
] |
null | null |
transformers
|
# Model name
Wikihow T5-small
## Model description
This is a T5-small model trained on the Wikihow All data set. The model was trained for 3 epochs with a batch size of 16 and a learning rate of 3e-4. Max_input_length is set to 512 and max_output_length to 150. The model attained a Rouge1 score of 31.2 and a RougeL score of 24.5.
We have written a blog post that covers the training procedure. Please find it [here](https://medium.com/@priya.dwivedi/fine-tuning-a-t5-transformer-for-any-summarization-task-82334c64c81).
## Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("deep-learning-analytics/wikihow-t5-small")
model = AutoModelWithLMHead.from_pretrained("deep-learning-analytics/wikihow-t5-small")

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = model.to(device)
text = """"
Lack of fluids can lead to dry mouth, which is a leading cause of bad breath. Water
can also dilute any chemicals in your mouth or gut that are causing bad breath., Studies show that
eating 6 ounces of yogurt a day reduces the level of odor-causing compounds in the mouth. In
particular, look for yogurt containing the active bacteria Streptococcus thermophilus or
Lactobacillus bulgaricus., The abrasive nature of fibrous fruits and vegetables helps to clean
teeth, while the vitamins, antioxidants, and acids they contain improve dental health.Foods that can
be particularly helpful include:Apples — Apples contain vitamin C, which is necessary for health
gums, as well as malic acid, which helps to whiten teeth.Carrots — Carrots are rich in vitamin A,
which strengthens tooth enamel.Celery — Chewing celery produces a lot of saliva, which helps to
neutralize bacteria that cause bad breath.Pineapples — Pineapples contain bromelain, an enzyme that
cleans the mouth., These teas have been shown to kill the bacteria that cause bad breath and
plaque., An upset stomach can lead to burping, which contributes to bad breath. Don’t eat foods that
upset your stomach, or if you do, use antacids. If you are lactose intolerant, try lactase tablets.,
They can all cause bad breath. If you do eat them, bring sugar-free gum or a toothbrush and
toothpaste to freshen your mouth afterwards., Diets low in carbohydrates lead to ketosis — a state
in which the body burns primarily fat instead of carbohydrates for energy. This may be good for your
waistline, but it also produces chemicals called ketones, which contribute to bad breath.To stop the
problem, you must change your diet. Or, you can combat the smell in one of these ways:Drink lots of
water to dilute the ketones.Chew sugarless gum or suck on sugarless mints.Chew mint leaves.
"""
preprocess_text = text.strip().replace("\n","")
tokenized_text = tokenizer.encode(preprocess_text, return_tensors="pt").to(device)
# Beam-search decoding with a repetition penalty to discourage repeated phrases
summary_ids = model.generate(
tokenized_text,
max_length=150,
num_beams=2,
repetition_penalty=2.5,
length_penalty=1.0,
early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print("\n\nSummarized text:\n", output)
```
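
To compare generated summaries against references the same way as the Rouge1/RougeL numbers above, the `rouge_score` package can be used. This is a minimal sketch under the assumption that the scores were computed as F-measures with stemming; `output` and `reference` below are illustrative placeholders:

```python
# pip install rouge-score
from rouge_score import rouge_scorer

output = "Drink water and eat yogurt."  # e.g. the summary generated above
reference = "Stay hydrated and eat yogurt, fruit and vegetables to fight bad breath."

scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
scores = scorer.score(reference, output)

print("Rouge1 F1:", scores["rouge1"].fmeasure)
print("RougeL F1:", scores["rougeL"].fmeasure)
```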
|
{"language": "eng", "tags": ["wikihow", "t5-small", "pytorch", "lm-head", "seq2seq", "t5", "pipeline:summarization", "summarization"], "datasets": ["Wikihow"], "metrics": [{"Rouge1": 31.2}, {"RougeL": 24.5}], "widget": [{"text": "Lack of fluids can lead to dry mouth, which is a leading cause of bad breath. Water can also dilute any chemicals in your mouth or gut that are causing bad breath., Studies show that eating 6 ounces of yogurt a day reduces the level of odor-causing compounds in the mouth. In particular, look for yogurt containing the active bacteria Streptococcus thermophilus or Lactobacillus bulgaricus., The abrasive nature of fibrous fruits and vegetables helps to clean teeth, while the vitamins, antioxidants, and acids they contain improve dental health.Foods that can be particularly helpful include:Apples \u2014 Apples contain vitamin C, which is necessary for health gums, as well as malic acid, which helps to whiten teeth.Carrots \u2014 Carrots are rich in vitamin A, which strengthens tooth enamel.Celery \u2014 Chewing celery produces a lot of saliva, which helps to neutralize bacteria that cause bad breath.Pineapples \u2014 Pineapples contain bromelain, an enzyme that cleans the mouth., These teas have been shown to kill the bacteria that cause bad breath and plaque., An upset stomach can lead to burping, which contributes to bad breath. Don\u2019t eat foods that upset your stomach, or if you do, use antacids. If you are lactose intolerant, try lactase tablets., They can all cause bad breath. If you do eat them, bring sugar-free gum or a toothbrush and toothpaste to freshen your mouth afterwards., Diets low in carbohydrates lead to ketosis \u2014 a state in which the body burns primarily fat instead of carbohydrates for energy. This may be good for your waistline, but it also produces chemicals called ketones, which contribute to bad breath.To stop the problem, you must change your diet. Or, you can combat the smell in one of these ways:Drink lots of water to dilute the ketones.Chew sugarless gum or suck on sugarless mints.Chew mint leaves."}, {"text": " Bring 1/2 cup water to the boil.Add the fresh or dried rosemary to the water.Remove from the heat. Set aside for 1/2 an hour to infuse. Added flavour can be released by pressing down on the rosemary leaves with a spoon. Add the pieces to the blender or food processor with the elderflower cordial. Blend or process to a pur\u00e9e.,, Add the lemon or lime juice and stir to combine., Add a cover and place in the freezer.After 2 hours, remove from the freezer and break up with a fork. This helps the ice crystals to form properly.Continue doing this every hour until the granita freezes properly. Scoop the granita into dessert bowls and serve. Garnish with a cucumber curl or a small sprig of rosemary."}]}
|
summarization
|
deep-learning-analytics/wikihow-t5-small
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"wikihow",
"t5-small",
"lm-head",
"seq2seq",
"pipeline:summarization",
"summarization",
"eng",
"dataset:Wikihow",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"eng"
] |
TAGS
#transformers #pytorch #t5 #text2text-generation #wikihow #t5-small #lm-head #seq2seq #pipeline-summarization #summarization #eng #dataset-Wikihow #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# Model name
Wikihow T5-small
## Model description
This is a T5-small model trained on Wikihow All data set. The model was trained for 3 epochs using a batch size of 16 and learning rate of 3e-4. Max_input_length is set as 512 and max_output_length is 150. Model attained a Rouge1 score of 31.2 and RougeL score of 24.5.
We have written a blog post that covers the training procedure. Please find it here.
## Usage
|
[
"# Model name\nWikihow T5-small",
"## Model description\n\nThis is a T5-small model trained on Wikihow All data set. The model was trained for 3 epochs using a batch size of 16 and learning rate of 3e-4. Max_input_lngth is set as 512 and max_output_length is 150. Model attained a Rouge1 score of 31.2 and RougeL score of 24.5. \nWe have written a blog post that covers the training procedure. Please find it here.",
"## Usage"
] |
[
"TAGS\n#transformers #pytorch #t5 #text2text-generation #wikihow #t5-small #lm-head #seq2seq #pipeline-summarization #summarization #eng #dataset-Wikihow #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# Model name\nWikihow T5-small",
"## Model description\n\nThis is a T5-small model trained on Wikihow All data set. The model was trained for 3 epochs using a batch size of 16 and learning rate of 3e-4. Max_input_lngth is set as 512 and max_output_length is 150. Model attained a Rouge1 score of 31.2 and RougeL score of 24.5. \nWe have written a blog post that covers the training procedure. Please find it here.",
"## Usage"
] |
[
90,
9,
104,
3
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #wikihow #t5-small #lm-head #seq2seq #pipeline-summarization #summarization #eng #dataset-Wikihow #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# Model name\nWikihow T5-small## Model description\n\nThis is a T5-small model trained on Wikihow All data set. The model was trained for 3 epochs using a batch size of 16 and learning rate of 3e-4. Max_input_lngth is set as 512 and max_output_length is 150. Model attained a Rouge1 score of 31.2 and RougeL score of 24.5. \nWe have written a blog post that covers the training procedure. Please find it here.## Usage"
] |
[
-0.11818534880876541,
-0.015539742074906826,
-0.0026961578987538815,
0.09079555422067642,
0.12716442346572876,
0.0005050868494436145,
0.0489412285387516,
0.08765889704227448,
-0.03609292581677437,
0.011935802176594734,
0.1276867687702179,
0.027974631637334824,
0.055085524916648865,
0.21784254908561707,
-0.01611713133752346,
-0.2513035833835602,
0.01652424782514572,
0.014648421667516232,
-0.06929069757461548,
0.13656699657440186,
0.10901039838790894,
-0.06700988113880157,
0.09182983636856079,
0.017825130373239517,
-0.1420758217573166,
0.011585598811507225,
0.022153565660119057,
-0.08896637707948685,
0.10121218860149384,
0.055102042853832245,
0.057972513139247894,
0.03990469500422478,
0.0908384770154953,
-0.07189692556858063,
0.020983124151825905,
0.025361744686961174,
0.004203072749078274,
0.06386365741491318,
0.018738310784101486,
0.04248213768005371,
0.17096953094005585,
-0.10305695980787277,
0.017396381124854088,
0.07916072010993958,
-0.0746263936161995,
-0.028657643124461174,
-0.04159277305006981,
0.05250057578086853,
0.09490625560283661,
0.07266733050346375,
-0.006370405200868845,
0.0823897272348404,
-0.12467459589242935,
0.10861106961965561,
0.21215860545635223,
-0.323970228433609,
-0.03331358730792999,
0.12905733287334442,
0.051195334643125534,
-0.05376320332288742,
-0.06248348951339722,
0.0405069924890995,
0.08095858246088028,
0.03299972042441368,
0.17066964507102966,
-0.01888282410800457,
0.061161238700151443,
0.029623009264469147,
-0.1312888115644455,
-0.00255022756755352,
0.306636244058609,
0.04612240567803383,
-0.06518100947141647,
-0.06744197756052017,
-0.072372667491436,
-0.07250572741031647,
-0.005522795021533966,
-0.06126849725842476,
0.0012940651504322886,
-0.03359232842922211,
-0.03582322970032692,
-0.012784305959939957,
-0.08008448779582977,
-0.09282110631465912,
-0.06743213534355164,
0.04355781525373459,
0.029662204906344414,
0.0535171739757061,
-0.10946041345596313,
0.06635630130767822,
-0.1171405240893364,
-0.029742974787950516,
-0.04057995229959488,
-0.0667281523346901,
-0.07347922772169113,
-0.01844828762114048,
-0.05035662651062012,
-0.06286893784999847,
0.0020871018059551716,
0.03654249384999275,
0.01746993139386177,
0.0006817603134550154,
0.18038409948349,
0.023640988394618034,
0.06077853590250015,
0.09005751460790634,
-0.09015105664730072,
-0.09124401211738586,
0.056140221655368805,
-0.007932301610708237,
-0.062032781541347504,
-0.029220392927527428,
-0.041120707988739014,
-0.07430741935968399,
0.0972122773528099,
0.02060602605342865,
-0.0325058251619339,
0.07708026468753815,
0.0028535127639770508,
-0.05763274431228638,
0.0699983611702919,
-0.12026120722293854,
-0.04329845309257507,
-0.00984408613294363,
-0.09367885440587997,
0.17666824162006378,
0.09381616115570068,
-0.06726472079753876,
-0.13107971847057343,
0.04563526064157486,
-0.09803546220064163,
-0.05825353413820267,
-0.04639415070414543,
-0.10386892408132553,
0.009637700393795967,
-0.04556196928024292,
-0.02770616114139557,
-0.13175632059574127,
-0.17922702431678772,
-0.0294609647244215,
0.04127507284283638,
-0.009545492008328438,
-0.09384843707084656,
-0.0541980005800724,
-0.05107145383954048,
0.06025605648756027,
-0.013617417775094509,
0.014472766779363155,
-0.03618844971060753,
0.06778304278850555,
-0.02077944576740265,
0.10946200042963028,
-0.06435611844062805,
0.03274988755583763,
-0.02874014340341091,
0.007748287636786699,
-0.08599256724119186,
0.06503047794103622,
0.046590451151132584,
0.04608475789427757,
-0.12766212224960327,
-0.12859556078910828,
-0.08809389173984528,
0.014923830516636372,
0.05460616946220398,
0.173467218875885,
-0.1543649584054947,
-0.028264207765460014,
0.2386988401412964,
-0.0838124230504036,
-0.10653021931648254,
0.14380481839179993,
0.000571281008888036,
0.05811752751469612,
0.1215422973036766,
0.0944676473736763,
0.08686981350183487,
-0.14376923441886902,
0.012109581381082535,
0.07924798876047134,
-0.030628008767962456,
-0.12965412437915802,
0.08817170560359955,
0.03856131434440613,
-0.09155874699354172,
0.028870895504951477,
0.019754065200686455,
0.05817806348204613,
-0.09230218827724457,
-0.036709755659103394,
-0.04357309639453888,
-0.08370336145162582,
0.044154971837997437,
-0.013635964132845402,
0.05167783424258232,
-0.05388087034225464,
-0.09579648822546005,
0.0698859766125679,
0.1036108136177063,
-0.03152881935238838,
0.018320640549063683,
-0.06624288111925125,
0.03016325645148754,
-0.016376277431845665,
0.029293809086084366,
-0.11949905008077621,
-0.03616470843553543,
-0.03724462538957596,
0.09813270717859268,
0.017585720866918564,
0.11421553790569305,
0.02087526023387909,
0.008342583663761616,
-0.04522135481238365,
0.020600715652108192,
0.000601214065682143,
-0.01898955926299095,
-0.12710422277450562,
-0.08292883634567261,
-0.06182250753045082,
0.0019666433800011873,
-0.01517209317535162,
-0.18290621042251587,
0.045605067163705826,
-0.0070290411822497845,
0.010505173355340958,
0.0006100985337980092,
-0.015799373388290405,
0.03562718629837036,
-0.004248267505317926,
-0.03705395758152008,
-0.028520425781607628,
0.07695890218019485,
0.024451931938529015,
-0.04319862648844719,
0.05635996162891388,
-0.10464662313461304,
0.08204798400402069,
0.10552573949098587,
0.005827912595123053,
-0.09022749960422516,
0.004560773726552725,
-0.017336782068014145,
-0.028561752289533615,
-0.07039958238601685,
-0.0794600248336792,
0.012861819006502628,
-0.00998439360409975,
0.14851675927639008,
-0.08902836591005325,
-0.06729882955551147,
0.027011174708604813,
-0.016122223809361458,
0.04210267961025238,
0.06866929680109024,
0.0757029578089714,
-0.18681436777114868,
0.015132313594222069,
0.0859413594007492,
-0.022334571927785873,
0.16219355165958405,
-0.010308738797903061,
-0.08912015706300735,
-0.00012678702478297055,
0.009257923811674118,
-0.014983799308538437,
0.07534447312355042,
-0.13607576489448547,
-0.058511655777692795,
0.02978825755417347,
0.05116190388798714,
0.006296274717897177,
-0.0891619622707367,
-0.008908967487514019,
0.04583947733044624,
-0.03481723368167877,
-0.062096405774354935,
0.05807003751397133,
-0.04215223342180252,
0.11152056604623795,
0.05840509384870529,
-0.044314686208963394,
0.003448429750278592,
-0.005905460100620985,
-0.1232156902551651,
0.2300037294626236,
-0.021824302151799202,
-0.20562639832496643,
-0.057708803564310074,
0.024965643882751465,
-0.010162273421883583,
-0.033732421696186066,
0.05379044637084007,
-0.09866171330213547,
-0.042037252336740494,
-0.041416235268116,
0.08474139124155045,
-0.060028642416000366,
0.07974398881196976,
0.0033709537237882614,
0.012129895389080048,
0.01527123712003231,
-0.12513743340969086,
-0.008610894903540611,
-0.04740101099014282,
-0.07440071552991867,
0.04631100594997406,
-0.09303554147481918,
0.06554404646158218,
0.18554389476776123,
-0.05888015031814575,
0.050066083669662476,
-0.03041423112154007,
0.2573249340057373,
-0.00392409460619092,
0.08677993714809418,
0.16281606256961823,
0.0786697044968605,
0.03632731735706329,
0.03261561319231987,
0.015623161569237709,
-0.06336182355880737,
0.0742671862244606,
0.04796389862895012,
-0.0915997326374054,
-0.23402757942676544,
-0.06542801111936569,
-0.06913677603006363,
0.06785301119089127,
0.09273576736450195,
0.04629414156079292,
-0.05682124197483063,
0.06298106908798218,
-0.006609220057725906,
0.07414105534553528,
0.012701218016445637,
0.04608006030321121,
0.034719642251729965,
0.012516303919255733,
0.11910378187894821,
-0.06513867527246475,
0.012105693109333515,
0.10291507095098495,
-0.007232345640659332,
0.1886872500181198,
-0.05561944097280502,
0.06894712895154953,
0.05425093695521355,
0.10830385982990265,
0.039222054183483124,
0.21201613545417786,
-0.026396727189421654,
0.0059550972655415535,
-0.01478497963398695,
-0.040912821888923645,
-0.01695794239640236,
0.02893151342868805,
-0.04835190996527672,
0.006389301270246506,
-0.183619424700737,
0.10217814892530441,
0.025949442759156227,
0.23675158619880676,
0.16012656688690186,
-0.26210036873817444,
-0.14984166622161865,
-0.01636444963514805,
-0.07391877472400665,
-0.051482174545526505,
0.09158001095056534,
0.1052745059132576,
-0.12637636065483093,
0.054394856095314026,
-0.00795282144099474,
0.12037819623947144,
0.00968092679977417,
0.015669943764805794,
0.026704953983426094,
0.07465169578790665,
-0.007322449237108231,
0.06783677637577057,
-0.169032022356987,
0.18808041512966156,
-0.004359957296401262,
0.04319879785180092,
-0.05000821873545647,
-0.032049618661403656,
0.02989950031042099,
0.10353221744298935,
0.12185318022966385,
0.049049217253923416,
-0.005504692904651165,
-0.05238755792379379,
-0.08799456804990768,
0.03738722577691078,
0.040916841477155685,
-0.008789890445768833,
0.0841568186879158,
-0.04404878616333008,
-0.001630615210160613,
0.028705304488539696,
0.07538337260484695,
-0.14662234485149384,
-0.06752558052539825,
-0.004848673474043608,
0.03394576907157898,
-0.060211289674043655,
-0.0292265173047781,
-0.08965843170881271,
-0.023538867011666298,
0.2746601998806,
0.031077470630407333,
-0.10187087208032608,
-0.1283256560564041,
0.05655135586857796,
0.048022083938121796,
-0.09609538316726685,
0.05633950233459473,
0.005280741024762392,
0.058001965284347534,
0.00938988197594881,
-0.13062119483947754,
0.11003534495830536,
-0.07325540482997894,
-0.06999407708644867,
0.002663181396201253,
0.09652531892061234,
0.03156159818172455,
0.03601587563753128,
0.027753302827477455,
-0.04036131501197815,
-0.008291961625218391,
-0.11247773468494415,
-0.004904039204120636,
0.049385372549295425,
-0.048557136207818985,
0.0432138666510582,
-0.045108240097761154,
-0.04765694960951805,
-0.004900831263512373,
-0.010991155169904232,
0.17924484610557556,
0.1230585053563118,
-0.0811467096209526,
0.09532910585403442,
0.08699583262205124,
-0.04158699885010719,
-0.27933189272880554,
-0.0431123785674572,
0.0025657708756625652,
0.05920090898871422,
-0.015792887657880783,
-0.11383849382400513,
0.12697717547416687,
0.04755258560180664,
0.000590412993915379,
0.014706781134009361,
-0.2819140553474426,
-0.12590615451335907,
0.05703442171216011,
0.0653379037976265,
0.31710708141326904,
-0.10632522404193878,
-0.002280972432345152,
-0.04493292421102524,
-0.14418654143810272,
0.13374917209148407,
-0.14955544471740723,
0.09608589857816696,
-0.053572021424770355,
0.04530785232782364,
0.028228161856532097,
-0.01654502935707569,
0.06693266332149506,
0.03984814137220383,
-0.00945236999541521,
-0.05770008638501167,
0.02317216619849205,
0.05815873667597771,
-0.10131518542766571,
0.18859533965587616,
-0.10022053867578506,
0.08265187591314316,
-0.15678070485591888,
-0.061874352395534515,
-0.1033310815691948,
0.037101440131664276,
-0.02542155422270298,
-0.06175229698419571,
-0.06867504864931107,
0.03205911070108414,
0.07972031086683273,
-0.009658308699727058,
-0.026754554361104965,
-0.0009308884618803859,
0.04065156728029251,
0.08991549164056778,
0.13838796317577362,
-0.07223024219274521,
-0.0925169289112091,
0.007814321666955948,
-0.018387414515018463,
0.08311353623867035,
-0.28199276328086853,
0.030520690605044365,
0.1075253114104271,
0.06527987867593765,
0.054582808166742325,
0.06567654758691788,
-0.05313453450798988,
-0.01742306724190712,
0.08095332235097885,
-0.1263130158185959,
-0.11844071000814438,
-0.07283472269773483,
-0.05042015016078949,
-0.11307410150766373,
0.0872487723827362,
0.10567401349544525,
-0.08492232859134674,
-0.009858454577624798,
-0.041396308690309525,
0.03138778358697891,
-0.07910602539777756,
0.16269667446613312,
0.05658263713121414,
0.10392674058675766,
-0.07611826062202454,
0.04899515211582184,
0.03449353948235512,
-0.04630006477236748,
0.047661807388067245,
0.16100344061851501,
-0.12981413304805756,
-0.06327134370803833,
0.05054635927081108,
0.00013526136172004044,
-0.08253543078899384,
-0.036242228001356125,
-0.030483797192573547,
-0.12061435729265213,
0.07774080336093903,
0.11059719324111938,
0.052238285541534424,
0.05398641154170036,
-0.007037541829049587,
-0.033511240035295486,
-0.08531168848276138,
0.05716700106859207,
0.11711149662733078,
-0.010618632659316063,
-0.12370578944683075,
0.07906128466129303,
-0.011397142894566059,
0.17286823689937592,
-0.05401028320193291,
-0.02505466900765896,
-0.10765646398067474,
0.0191901084035635,
-0.09602269530296326,
0.010735404677689075,
-0.0801735669374466,
0.00928578432649374,
-0.0450485460460186,
-0.03253045305609703,
-0.034167274832725525,
0.0411093644797802,
-0.06065031886100769,
-0.02205772139132023,
-0.01567034050822258,
0.03223033621907234,
-0.02305861935019493,
-0.0172288678586483,
0.0092971445992589,
-0.07547489553689957,
0.08979646116495132,
0.002451124135404825,
-0.07075624912977219,
0.09153323620557785,
-0.08901119232177734,
0.01063354592770338,
0.0436985045671463,
0.011677133850753307,
0.048862941563129425,
-0.04974448308348656,
0.029874496161937714,
-0.0017772599821910262,
0.0767093077301979,
0.02380782552063465,
-0.001480176579207182,
-0.0951836034655571,
-0.04875707998871803,
-0.07606860250234604,
-0.02311612106859684,
-0.07993625104427338,
0.01623886078596115,
0.0486970953643322,
0.07049057632684708,
0.09834227710962296,
-0.0633709654211998,
0.02465623989701271,
-0.1955736130475998,
0.026260048151016235,
0.00040614689351059496,
-0.02778296172618866,
-0.048297327011823654,
-0.012646577320992947,
0.06832380592823029,
-0.028425779193639755,
0.07684029638767242,
0.01230151392519474,
-0.02486908622086048,
0.02202080748975277,
-0.04320541396737099,
-0.037626538425683975,
-0.018638435751199722,
0.1953403353691101,
-0.003882370423525572,
-0.01775236800312996,
-0.0023820754140615463,
0.05772935226559639,
0.058152154088020325,
0.12961900234222412,
0.2619168758392334,
0.057076483964920044,
-0.024220213294029236,
0.13353604078292847,
-0.05976822227239609,
-0.013313532806932926,
-0.12678417563438416,
-0.03814263641834259,
-0.03881635144352913,
0.07707864791154861,
-0.031262122094631195,
-0.025667259469628334,
0.19820643961429596,
-0.07653451710939407,
0.03505190461874008,
-0.047326840460300446,
-0.10765622556209564,
-0.19080570340156555,
-0.177077978849411,
-0.06848526746034622,
-0.07411970943212509,
-0.02576391026377678,
-0.11355061084032059,
-0.00738224433735013,
0.0645509660243988,
0.07307213544845581,
-0.0045515443198382854,
0.08195757120847702,
-0.03226981312036514,
-0.04875508323311806,
0.10853882879018784,
-0.009365265257656574,
-0.04178335517644882,
0.0032357387244701385,
0.05882078781723976,
-0.00009657991176936775,
0.009591990150511265,
0.017550667747855186,
0.018796369433403015,
-0.03360375761985779,
0.0735640674829483,
-0.015166970901191235,
-0.07283813506364822,
-0.058300141245126724,
0.032993290573358536,
0.04582523927092552,
0.08883044123649597,
0.06084802374243736,
-0.09255175292491913,
-0.028803806751966476,
0.1660994589328766,
-0.06143999472260475,
-0.09939026087522507,
-0.1490326225757599,
0.17848464846611023,
0.03071693144738674,
-0.012428586371243,
0.013761849142611027,
-0.06393861770629883,
-0.02450443059206009,
0.29320427775382996,
0.22554251551628113,
0.03590342774987221,
0.003206014633178711,
-0.06717739254236221,
-0.008250655606389046,
0.03724728152155876,
0.14903855323791504,
0.0923447534441948,
0.23153480887413025,
-0.05665523558855057,
0.028479184955358505,
-0.046593792736530304,
-0.028402063995599747,
-0.05916977301239967,
0.03124498389661312,
0.04443127289414406,
0.020428277552127838,
-0.050449833273887634,
0.07992590963840485,
-0.088362917304039,
-0.06819596886634827,
0.015228514559566975,
-0.0426817424595356,
-0.08442305773496628,
-0.06865157186985016,
-0.08300556987524033,
0.052025165408849716,
0.11257735639810562,
-0.06984636932611465,
0.03873236104846001,
0.084494948387146,
0.006409569177776575,
-0.17758429050445557,
-0.036715567111968994,
0.09614895284175873,
0.04203561693429947,
0.10464949905872345,
0.01973893493413925,
0.06618297100067139,
0.08879014849662781,
0.024671869352459908,
-0.13502444326877594,
0.042956575751304626,
-0.01883716881275177,
0.07190614938735962,
0.03415364772081375,
0.09584654867649078,
-0.03126995638012886,
0.03234679624438286,
-0.011327804066240788,
-0.09563769400119781,
-0.06252102553844452,
0.06378094106912613,
0.031748656183481216,
-0.10962486267089844,
0.07832194119691849,
-0.03871163725852966,
0.13345636427402496,
0.17414136230945587,
-0.07737515866756439,
-0.009839468635618687,
-0.07883180677890778,
0.030446507036685944,
-0.014083167538046837,
-0.11942320317029953,
-0.02167489565908909,
-0.14604239165782928,
0.009388159029185772,
-0.10001831501722336,
0.016985829919576645,
-0.22478565573692322,
-0.018178947269916534,
-0.07034850120544434,
-0.008866542018949986,
-0.09616920351982117,
0.13817167282104492,
0.09222756326198578,
0.018589971587061882,
-0.030189314857125282,
0.045489709824323654,
-0.043908048421144485,
0.12241294980049133,
-0.15794354677200317,
-0.13994531333446503
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-distilled-squad-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased-distilled-squad](https://huggingface.co/distilbert-base-uncased-distilled-squad) on the squad_v2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
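A minimal usage sketch (assuming the checkpoint loads with the standard Transformers question-answering pipeline; the question and context below are illustrative only):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint into an extractive-QA pipeline.
qa = pipeline(
    "question-answering",
    model="deepakvk/distilbert-base-uncased-distilled-squad-finetuned-squad",
)

# Any SQuAD-style (question, context) pair works here.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="The model was fine-tuned on the SQuAD v2 dataset.",
)
print(result["answer"], result["score"])
```

Note that SQuAD v2 includes unanswerable questions, so downstream code should also be prepared for low-confidence answers.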
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 0.1
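As a rough sketch, these settings correspond to the following `TrainingArguments`; the `output_dir` is an assumption, and the Adam betas/epsilon shown are the Trainer defaults made explicit:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-distilled-squad-finetuned-squad",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,      # Trainer default
    adam_beta2=0.999,    # Trainer default
    adam_epsilon=1e-8,   # Trainer default
    lr_scheduler_type="linear",
    num_train_epochs=0.1,
)
```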
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2"], "model-index": [{"name": "distilbert-base-uncased-distilled-squad-finetuned-squad", "results": []}]}
|
question-answering
|
deepakvk/distilbert-base-uncased-distilled-squad-finetuned-squad
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad_v2",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us
|
# distilbert-base-uncased-distilled-squad-finetuned-squad
This model is a fine-tuned version of distilbert-base-uncased-distilled-squad on the squad_v2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 0.1
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# distilbert-base-uncased-distilled-squad-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased-distilled-squad on the squad_v2 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.1",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-distilled-squad-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased-distilled-squad on the squad_v2 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.1",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
55,
60,
6,
12,
8,
3,
90,
35
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n# distilbert-base-uncased-distilled-squad-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased-distilled-squad on the squad_v2 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.1### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.08718035370111465,
0.1466832458972931,
-0.0029437115881592035,
0.08942831307649612,
0.13620734214782715,
0.021201105788350105,
0.08995118737220764,
0.14826007187366486,
-0.10199566185474396,
0.0668671727180481,
0.06417851895093918,
0.053015731275081635,
0.035722628235816956,
0.0876639187335968,
-0.03952275961637497,
-0.21190433204174042,
0.010022684931755066,
0.005146585870534182,
-0.04610195755958557,
0.09750150889158249,
0.10840488970279694,
-0.09875989705324173,
0.06690289080142975,
-0.013205095194280148,
-0.14835119247436523,
0.03508134186267853,
-0.03117152862250805,
-0.023280898109078407,
0.08868658542633057,
-0.003047322854399681,
0.08460153639316559,
0.015326944179832935,
0.12344998121261597,
-0.22870974242687225,
-0.004314531106501818,
0.07621794193983078,
0.04027340188622475,
0.07117073982954025,
0.014492141082882881,
0.013460730202496052,
0.08983191102743149,
-0.1549053192138672,
0.08864077180624008,
0.039633817970752716,
-0.07984313368797302,
-0.15859271585941315,
-0.09493224322795868,
0.0631798580288887,
0.08329758793115616,
0.09617355465888977,
0.012341630645096302,
0.15392747521400452,
-0.07510404288768768,
0.08498237282037735,
0.18951363861560822,
-0.2577437460422516,
-0.0568234845995903,
0.048279788345098495,
0.04458044841885567,
0.07519375532865524,
-0.11004408448934555,
-0.022421246394515038,
0.051220037043094635,
0.03244501352310181,
0.08305465430021286,
-0.02902130037546158,
-0.13158667087554932,
-0.0038340610917657614,
-0.13155052065849304,
-0.03523831441998482,
0.2258727252483368,
0.0372803620994091,
-0.03834472596645355,
-0.0783817321062088,
-0.06900891661643982,
-0.06891641020774841,
-0.0032683692406862974,
-0.06152724474668503,
0.03739096596837044,
-0.053383469581604004,
-0.034210991114377975,
-0.0699998289346695,
-0.0666339322924614,
-0.07350897043943405,
-0.00013201437832321972,
0.08903693407773972,
0.05076625943183899,
0.025897156447172165,
-0.04315100610256195,
0.10622991621494293,
-0.021000448614358902,
-0.13023345172405243,
-0.03805222734808922,
-0.004790353588759899,
-0.08653642982244492,
-0.06414993107318878,
-0.03571409359574318,
-0.017974214628338814,
0.01007996965199709,
0.18462513387203217,
-0.057441797107458115,
0.044935423880815506,
0.01724006049335003,
-0.0070696803741157055,
-0.003403042210265994,
0.14641842246055603,
-0.03585084155201912,
-0.01832638867199421,
0.003273148089647293,
0.09782083332538605,
0.0013391549000516534,
0.0031774083618074656,
-0.08532146364450455,
-0.04048427194356918,
0.08755528181791306,
0.0709380954504013,
-0.023469947278499603,
0.031649794429540634,
-0.028414292261004448,
-0.04148556664586067,
0.005288174841552973,
-0.14271776378154755,
0.03881720080971718,
-0.02301909774541855,
-0.08232679218053818,
-0.00692357961088419,
0.02379901334643364,
-0.017475992441177368,
-0.03806988149881363,
0.025834431871771812,
-0.08135320991277695,
0.0013286569155752659,
-0.07661320269107819,
-0.06764739006757736,
0.01682310178875923,
-0.11489187180995941,
-0.005301465280354023,
-0.06572609394788742,
-0.2169337421655655,
-0.034524206072092056,
0.027443792670965195,
-0.05468810349702835,
-0.03667185828089714,
-0.04483401030302048,
-0.065058633685112,
-0.0014719560276716948,
-0.004165911581367254,
0.06872191280126572,
-0.042473625391721725,
0.07355441898107529,
0.02927292138338089,
0.035759277641773224,
0.011748109944164753,
0.056826718151569366,
-0.10864167660474777,
0.033257946372032166,
-0.104263074696064,
0.08581322431564331,
-0.09896193444728851,
0.019686030223965645,
-0.11851794272661209,
-0.10532227903604507,
0.01571819558739662,
-0.04247622564435005,
0.05681617558002472,
0.13211210072040558,
-0.1813315451145172,
-0.01678592339158058,
0.12456227838993073,
-0.07132893800735474,
-0.08321183919906616,
0.10047651827335358,
-0.04626242443919182,
0.045499254018068314,
0.06812283396720886,
0.159382164478302,
0.14922021329402924,
-0.15174369513988495,
-0.03334782272577286,
0.040945205837488174,
0.05231102183461189,
0.023128511384129524,
0.051301077008247375,
0.00006515743734780699,
0.05248669162392616,
0.02300345152616501,
-0.08837981522083282,
-0.023972168564796448,
-0.08461549878120422,
-0.09398216754198074,
-0.07716729491949081,
-0.07761580497026443,
0.06736525148153305,
0.04612884297966957,
0.02500673569738865,
-0.06707586348056793,
-0.10137101262807846,
0.13229912519454956,
0.14233465492725372,
-0.04953831434249878,
0.013742349110543728,
-0.073670893907547,
0.04954112321138382,
-0.03790201246738434,
-0.01660226099193096,
-0.20625972747802734,
-0.13242390751838684,
0.049958400428295135,
-0.047475628554821014,
0.04100169613957405,
0.06466318666934967,
0.05249551311135292,
0.05271118879318237,
-0.04625723510980606,
-0.026125596836209297,
-0.1132635548710823,
-0.0016597607173025608,
-0.09420275688171387,
-0.159201517701149,
-0.06373073160648346,
-0.03769109398126602,
0.11084059625864029,
-0.19935593008995056,
0.03193672373890877,
0.00504169799387455,
0.12352097779512405,
0.012477442622184753,
-0.04601646587252617,
0.0004342833417467773,
0.04149504750967026,
-0.01792301796376705,
-0.0954236164689064,
0.03457091748714447,
-0.005495443008840084,
-0.07047072798013687,
-0.09632007777690887,
-0.10464680939912796,
0.03579423204064369,
0.058997929096221924,
0.04655628278851509,
-0.07834162563085556,
-0.004517683759331703,
-0.06384403258562088,
-0.04387815296649933,
-0.07721029222011566,
-0.0170269925147295,
0.18509018421173096,
0.004905475303530693,
0.11733917146921158,
-0.06390872597694397,
-0.06646732240915298,
-0.005736950784921646,
-0.0022077702451497316,
-0.008388535119593143,
0.06791957467794418,
0.05366077274084091,
-0.08749935030937195,
0.10678587108850479,
0.11896851658821106,
-0.03634098917245865,
0.11810782551765442,
-0.061296816915273666,
-0.08785513788461685,
-0.038006462156772614,
0.01148794125765562,
-0.03371623158454895,
0.11430896818637848,
-0.11236309260129929,
0.00021098814613651484,
0.03365495800971985,
0.025908570736646652,
0.026136761531233788,
-0.1649821400642395,
-0.0013271281495690346,
0.029394790530204773,
-0.05693864822387695,
-0.01352662593126297,
-0.010588912293314934,
0.033580586314201355,
0.08413778990507126,
0.013619918376207352,
-0.012486239895224571,
0.024319084361195564,
-0.009258359670639038,
-0.07472304254770279,
0.16763189435005188,
-0.11533831059932709,
-0.15158629417419434,
-0.12165456265211105,
0.06343980133533478,
-0.0818672701716423,
-0.020772775635123253,
0.019154738634824753,
-0.077939473092556,
-0.041181329637765884,
-0.0750875324010849,
0.003002294572070241,
-0.05133955553174019,
-0.024503473192453384,
0.05013911798596382,
0.02187434397637844,
0.09425212442874908,
-0.1418304145336151,
0.007315100636333227,
-0.0033228376414626837,
-0.1271287202835083,
-0.016238968819379807,
0.04853098466992378,
0.1198892891407013,
0.10536643862724304,
-0.028217488899827003,
0.019783534109592438,
-0.03196694701910019,
0.23405225574970245,
-0.05980128049850464,
0.001209666719660163,
0.13624896109104156,
0.01923157274723053,
0.05705423280596733,
0.1187414824962616,
0.026327744126319885,
-0.10014715790748596,
0.020869489759206772,
0.06869207322597504,
-0.018299171701073647,
-0.24558210372924805,
-0.04572593420743942,
-0.027232574298977852,
-0.06431008130311966,
0.09157638996839523,
0.04841461032629013,
0.03193897753953934,
0.04260058328509331,
-0.02116873301565647,
0.04206008464097977,
-0.02939404547214508,
0.08393321931362152,
0.1339603215456009,
0.018680889159440994,
0.08898027241230011,
-0.03791841119527817,
-0.02793711982667446,
0.056323446333408356,
0.03172260895371437,
0.26984068751335144,
-0.013765419833362103,
0.131214439868927,
0.04141021892428398,
0.16002622246742249,
-0.0480344183743,
0.02869403548538685,
0.019703274592757225,
0.010347188450396061,
0.012910814955830574,
-0.054645050317049026,
-0.03246309980750084,
0.027628911659121513,
-0.019444221630692482,
0.06803534924983978,
-0.09869605302810669,
0.049521394073963165,
0.03974498435854912,
0.28085869550704956,
0.045582979917526245,
-0.26136472821235657,
-0.09147869795560837,
0.017097659409046173,
-0.02628224715590477,
-0.04972204193472862,
0.020151523873209953,
0.13295485079288483,
-0.11587291955947876,
0.037551600486040115,
-0.03586356341838837,
0.09757359325885773,
-0.04132215678691864,
0.014824416488409042,
0.04076332226395607,
0.10793127864599228,
0.015759998932480812,
0.10620025545358658,
-0.219105526804924,
0.1951286941766739,
0.0341433510184288,
0.10067401826381683,
-0.05705752223730087,
0.035130277276039124,
-0.002168034901842475,
0.10682924091815948,
0.12746880948543549,
0.0035295395646244287,
-0.03474852815270424,
-0.15809984505176544,
-0.07000283151865005,
0.028415558859705925,
0.11261814832687378,
-0.03190245106816292,
0.09161198884248734,
-0.05566191300749779,
0.018368490040302277,
0.045804593712091446,
-0.07046688348054886,
-0.15145868062973022,
-0.1164470911026001,
0.022360974922776222,
0.01207331009209156,
-0.04858943819999695,
-0.06731431186199188,
-0.08581531792879105,
-0.019927985966205597,
0.16678154468536377,
-0.027511706575751305,
-0.05545381084084511,
-0.13210979104042053,
0.07605688273906708,
0.1286790370941162,
-0.07255923002958298,
-0.0008927789749577641,
0.013135778717696667,
0.10908344388008118,
0.03503197059035301,
-0.09611107409000397,
0.028028802946209908,
-0.06110787019133568,
-0.14679624140262604,
-0.036791153252124786,
0.13154470920562744,
0.04630136862397194,
0.05020265653729439,
0.000024434328224742785,
0.024446725845336914,
0.00307637732475996,
-0.08900968730449677,
0.013973512686789036,
0.08370544761419296,
0.07685396820306778,
0.06944462656974792,
-0.10318673402070999,
0.03765342757105827,
-0.05366915091872215,
0.015393581241369247,
0.13350334763526917,
0.19398018717765808,
-0.10096201300621033,
0.07666594535112381,
0.06412291526794434,
-0.1014639288187027,
-0.18759511411190033,
0.05971158295869827,
0.0837770625948906,
0.004009490367025137,
0.051154933869838715,
-0.19385026395320892,
0.09740965813398361,
0.11330971121788025,
-0.008685708045959473,
0.06392218917608261,
-0.3392048478126526,
-0.1114656925201416,
0.07829619199037552,
0.07661749422550201,
0.008219337090849876,
-0.1357530951499939,
-0.03620562702417374,
-0.017180245369672775,
-0.1385781317949295,
0.07880871742963791,
-0.07172462344169617,
0.10512740164995193,
-0.02155998721718788,
0.10269929468631744,
0.029326675459742546,
-0.029733512550592422,
0.1471337229013443,
0.08159494400024414,
0.0929252952337265,
-0.05361620709300041,
0.0030265026725828648,
0.12697680294513702,
-0.07259204238653183,
0.09166000038385391,
-0.013234433718025684,
0.08675169199705124,
-0.16748309135437012,
-0.01149722933769226,
-0.06878089904785156,
0.07658226042985916,
-0.055572740733623505,
-0.0628754049539566,
-0.05720318481326103,
0.04496905207633972,
0.06204737722873688,
-0.03364036604762077,
0.07877848297357559,
0.0329616516828537,
0.0972164124250412,
0.10217523574829102,
0.0940803736448288,
-0.005360377952456474,
-0.13970234990119934,
-0.0009328153100796044,
0.0012511520180851221,
0.06433937698602676,
-0.11526554822921753,
0.03970538452267647,
0.1487787365913391,
0.06304140388965607,
0.14128059148788452,
0.025416279211640358,
-0.03343109041452408,
-0.0006295187631621957,
0.006860869470983744,
-0.11889106035232544,
-0.183090940117836,
-0.012575944885611534,
-0.060922443866729736,
-0.14857633411884308,
0.04154801741242409,
0.10091245174407959,
-0.05510339513421059,
-0.009963255375623703,
-0.02128206193447113,
0.02290898747742176,
-0.023815268650650978,
0.16984885931015015,
0.04888968914747238,
0.06494107842445374,
-0.077723428606987,
0.13262169063091278,
0.07604783773422241,
-0.07047838717699051,
0.05010171979665756,
0.04526306688785553,
-0.08128809928894043,
-0.03308820724487305,
0.055758360773324966,
0.12829379737377167,
-0.03362153843045235,
-0.027739552780985832,
-0.0916713997721672,
-0.06796562671661377,
0.0453307218849659,
0.11581303179264069,
0.0615144781768322,
-0.0013599756639450788,
-0.04467252269387245,
0.02481093443930149,
-0.14454850554466248,
0.10213083773851395,
0.019965853542089462,
0.06130186468362808,
-0.15597984194755554,
0.0913034975528717,
0.006434676703065634,
0.06984611600637436,
-0.02087363973259926,
0.00495497602969408,
-0.07592552155256271,
-0.009066586382687092,
-0.1509513109922409,
-0.024437857791781425,
-0.02540602907538414,
0.011645298451185226,
-0.011063977144658566,
-0.06208477541804314,
-0.03301796689629555,
0.05738523229956627,
-0.06663869321346283,
-0.05691612511873245,
0.022238513454794884,
0.0685875415802002,
-0.15488778054714203,
-0.018091384321451187,
0.023392628878355026,
-0.08988740295171738,
0.07824331521987915,
0.06608610600233078,
0.019200362265110016,
0.023963162675499916,
-0.06633275747299194,
-0.02539638802409172,
0.01644660159945488,
0.05650154873728752,
0.06885161250829697,
-0.09217622131109238,
-0.00955845508724451,
-0.014087981544435024,
0.042977456003427505,
0.015099970623850822,
0.04081102833151817,
-0.11674845218658447,
-0.01766403205692768,
-0.05974191054701805,
-0.05191439017653465,
-0.06562235206365585,
0.03908028081059456,
0.12147317826747894,
0.04433052986860275,
0.1660931259393692,
-0.06609917432069778,
0.06466809660196304,
-0.20590387284755707,
-0.04295456036925316,
0.008952504955232143,
-0.03063669055700302,
-0.059462010860443115,
-0.04799174144864082,
0.06969648599624634,
-0.061418887227773666,
0.10918175429105759,
-0.014345331117510796,
0.11907564848661423,
0.04050423577427864,
-0.02789655141532421,
-0.03329590708017349,
-0.006126165855675936,
0.16058188676834106,
0.06086023524403572,
-0.022126147523522377,
0.0951426774263382,
-0.01864633895456791,
0.05482717230916023,
0.042557716369628906,
0.16881150007247925,
0.1469680517911911,
-0.03454992547631264,
0.046498287469148636,
0.08873927593231201,
-0.099318727850914,
-0.15915028750896454,
0.0820804014801979,
0.005309872794896364,
0.100504569709301,
-0.04820068180561066,
0.1573077142238617,
0.08742944151163101,
-0.17083102464675903,
0.07420904189348221,
-0.056849490851163864,
-0.11921437829732895,
-0.11445941030979156,
-0.043043844401836395,
-0.07719480246305466,
-0.12026877701282501,
0.017638884484767914,
-0.13163062930107117,
0.030229201540350914,
0.06728248298168182,
-0.0029112673364579678,
-0.01077757216989994,
0.16851717233657837,
-0.037297509610652924,
0.010993517003953457,
0.04840180650353432,
0.0007102217641659081,
-0.0050971489399671555,
-0.0740860104560852,
-0.030115505680441856,
0.07225681841373444,
0.015314452350139618,
0.07503791898488998,
-0.0306603591889143,
0.022956885397434235,
0.009008840657770634,
-0.04257041588425636,
-0.06453902274370193,
0.006092558149248362,
0.03435162082314491,
0.014178951270878315,
0.040577422827482224,
0.05668703466653824,
-0.00866753701120615,
-0.044434066861867905,
0.2507832944393158,
-0.07136289775371552,
-0.09300292283296585,
-0.1316944658756256,
0.21032929420471191,
0.03811578080058098,
-0.010530434548854828,
0.07895093411207199,
-0.1123931035399437,
-0.005957239773124456,
0.14486148953437805,
0.13464029133319855,
-0.044363245368003845,
-0.014174681156873703,
-0.026940608397126198,
-0.010569515638053417,
-0.04919684678316116,
0.08224566280841827,
0.10897039622068405,
0.015905018895864487,
-0.04884318262338638,
-0.016659386456012726,
-0.013401745818555355,
-0.0285461638122797,
-0.08072618395090103,
0.0512169674038887,
0.008741946890950203,
0.02728286013007164,
-0.03450753912329674,
0.06677103042602539,
0.008166354149580002,
-0.17697717249393463,
0.0623372383415699,
-0.15619966387748718,
-0.1554478406906128,
-0.01846163533627987,
0.07670560479164124,
-0.02697865292429924,
0.038236793130636215,
-0.02747086063027382,
-0.0008379055070690811,
0.17238380014896393,
-0.027202224358916283,
-0.0760674849152565,
-0.11105426400899887,
0.11169964075088501,
-0.08583187311887741,
0.193603053689003,
-0.00686546741053462,
0.07902413606643677,
0.09504228085279465,
0.005663042422384024,
-0.14444738626480103,
0.04491740092635155,
0.07132750749588013,
-0.049813829362392426,
0.007193319499492645,
0.15060831606388092,
-0.03536809980869293,
0.1182858943939209,
0.04818301275372505,
-0.12182940542697906,
-0.031856026500463486,
0.0006580629851669073,
-0.02408413030207157,
-0.09265762567520142,
0.0005491330521181226,
-0.0562153235077858,
0.15365253388881683,
0.20461681485176086,
-0.03725522384047508,
0.0124042434617877,
-0.08355175703763962,
0.02451196499168873,
0.0650554746389389,
0.08395952731370926,
-0.026248952373862267,
-0.18196026980876923,
0.021439775824546814,
0.030565861612558365,
0.01936647668480873,
-0.2006663978099823,
-0.10694471746683121,
0.07118986546993256,
-0.04633524641394615,
-0.03549446910619736,
0.10559532791376114,
0.047258198261260986,
0.0444418303668499,
-0.03533153980970383,
-0.13815777003765106,
-0.05432109162211418,
0.143403559923172,
-0.16442087292671204,
-0.042642444372177124
] |
null | null |
transformers
|
# Welcome to Roberta-Marathi-MLM
## Model Description
> This is a small language model for the [Marathi](https://en.wikipedia.org/wiki/Marathi) language, trained on 1M data samples taken from the
[OSCAR page](https://oscar-public.huma-num.fr/shuffled/mr_dedup.txt.gz).
## Training params
- **Dataset** - 1M data samples from the [OSCAR page](https://oscar-corpus.com/) are used to train this model. Even though the dataset is 2.7 GB, due to resource constraints I have picked only 1M samples from the total 2.7 GB. If you are interested in collaboration and have the computational resources to train on the full data, you are most welcome to do so.
- **Preprocessing** - A ByteLevelBPETokenizer is used to tokenize the sentences at the character level, and the vocabulary size is set to 52k as per the standard values given by 🤗
<!-- - **Hyperparameters** - __ByteLevelBPETokenizer__ : vocabulary size = 52_000 and min_frequency = 2
__Trainer__ : num_train_epochs=12 - trained for 12 epochs
per_gpu_train_batch_size=64 - batch size for the datasamples is 64
save_steps=10_000 - save model for every 10k steps
save_total_limit=2 - save limit is set for 2 -->
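A minimal sketch of the tokenizer training described above (the corpus file path is an assumption; the vocabulary size and minimum frequency follow the commented values):

```python
from tokenizers import ByteLevelBPETokenizer

# Train a byte-level BPE tokenizer on the Marathi corpus; the path is hypothetical.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["mr_dedup.txt"],
    vocab_size=52_000,
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)
tokenizer.save_model("roberta-mlm-marathi")  # writes vocab.json and merges.txt
```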
**Intended uses & limitations**
This model is for anyone who wants to make use of Marathi language models for various tasks like language generation, translation, and many more use cases.
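As a minimal sketch (assuming the checkpoint loads with the standard fill-mask pipeline; the example sentence means "My name is <mask>."):

```python
from transformers import pipeline

# Load the masked-language-model checkpoint into a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="deepampatel/roberta-mlm-marathi")

# Any Marathi sentence containing the <mask> token works here.
for prediction in fill_mask("माझे नाव <mask> आहे."):
    print(prediction["token_str"], prediction["score"])
```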
**Whatever else is helpful!**
If you are interested in collaboration, feel free to reach out to me: [Deepam](mailto:[email protected])
|
{"language": "mr"}
|
fill-mask
|
deepampatel/roberta-mlm-marathi
|
[
"transformers",
"pytorch",
"jax",
"roberta",
"fill-mask",
"mr",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #jax #roberta #fill-mask #mr #autotrain_compatible #endpoints_compatible #region-us
|
# Welcome to Roberta-Marathi-MLM
## Model Description
> This is a small language model for the Marathi language, trained on 1M data samples taken from the
OSCAR page.
## Training params
- Dataset - 1M data samples from the OSCAR page (URL) are used to train this model. Even though the dataset is 2.7 GB, due to resource constraints I have picked only 1M samples from the total 2.7 GB. If you are interested in collaboration and have the computational resources to train on the full data, you are most welcome to do so.
- Preprocessing - A ByteLevelBPETokenizer is used to tokenize the sentences at the character level, and the vocabulary size is set to 52k as per the standard values given by 🤗
Intended uses & limitations
This model is for anyone who wants to make use of Marathi language models for various tasks like language generation, translation, and many more use cases.
Whatever else is helpful!
If you are interested in collaboration, feel free to reach out to me: Deepam
|
[
"# Welcome to Roberta-Marathi-MLM",
"## Model Description\n \n> This is a small language model for Marathi language with 1M data samples taken from\n OSCAR page",
"## Training params \n\n- Dataset - 1M data samples are used to train this model from OSCAR page(URL eventhough data set is of 2.7 GB due to resource constraint to train \nI have picked only 1M data from the total 2.7GB data set. If you are interested in collaboration and have computational resources to train on you are most welcome to do so.\n\n- Preprocessing - ByteLevelBPETokenizer is used to tokenize the sentences at character level and vocabulary size is set to 52k as per standard values given by 🤗 \n\n\nIntended uses & limitations\n this is for anyone who wants to make use of marathi language models for various tasks like language generation, translation and many more use cases.\n\nWhatever else is helpful!\n If you are intersted in collaboration feel free to reach me Deepam"
] |
[
"TAGS\n#transformers #pytorch #jax #roberta #fill-mask #mr #autotrain_compatible #endpoints_compatible #region-us \n",
"# Welcome to Roberta-Marathi-MLM",
"## Model Description\n \n> This is a small language model for Marathi language with 1M data samples taken from\n OSCAR page",
"## Training params \n\n- Dataset - 1M data samples are used to train this model from OSCAR page(URL eventhough data set is of 2.7 GB due to resource constraint to train \nI have picked only 1M data from the total 2.7GB data set. If you are interested in collaboration and have computational resources to train on you are most welcome to do so.\n\n- Preprocessing - ByteLevelBPETokenizer is used to tokenize the sentences at character level and vocabulary size is set to 52k as per standard values given by 🤗 \n\n\nIntended uses & limitations\n this is for anyone who wants to make use of marathi language models for various tasks like language generation, translation and many more use cases.\n\nWhatever else is helpful!\n If you are intersted in collaboration feel free to reach me Deepam"
] |
[
42,
12,
24,
183
] |
[
"passage: TAGS\n#transformers #pytorch #jax #roberta #fill-mask #mr #autotrain_compatible #endpoints_compatible #region-us \n# Welcome to Roberta-Marathi-MLM## Model Description\n \n> This is a small language model for Marathi language with 1M data samples taken from\n OSCAR page## Training params \n\n- Dataset - 1M data samples are used to train this model from OSCAR page(URL eventhough data set is of 2.7 GB due to resource constraint to train \nI have picked only 1M data from the total 2.7GB data set. If you are interested in collaboration and have computational resources to train on you are most welcome to do so.\n\n- Preprocessing - ByteLevelBPETokenizer is used to tokenize the sentences at character level and vocabulary size is set to 52k as per standard values given by 🤗 \n\n\nIntended uses & limitations\n this is for anyone who wants to make use of marathi language models for various tasks like language generation, translation and many more use cases.\n\nWhatever else is helpful!\n If you are intersted in collaboration feel free to reach me Deepam"
] |
[
-0.08998450636863708,
0.03386206179857254,
-0.0007262126309797168,
0.07550286501646042,
0.1108567863702774,
-0.04304811358451843,
0.018701594322919846,
0.11236982047557831,
-0.09024783223867416,
-0.003788197645917535,
0.06810689717531204,
0.04550490528345108,
0.0473436638712883,
0.08578629046678543,
-0.004946382716298103,
-0.3631380796432495,
-0.032905757427215576,
0.004636203404515982,
-0.06753060966730118,
0.07295240461826324,
0.16938644647598267,
-0.07709074765443802,
0.05871525779366493,
0.0026935783680528402,
-0.13688494265079498,
0.04586284980177879,
-0.06865595281124115,
-0.0776081457734108,
0.09169359505176544,
0.0346052423119545,
0.028817763552069664,
0.01796780899167061,
0.02498842217028141,
-0.1138780266046524,
0.02473713830113411,
-0.04247399792075157,
0.01355938520282507,
-0.006177917588502169,
0.09954988211393356,
-0.023538628593087196,
0.20177817344665527,
-0.11033893376588821,
-0.016594111919403076,
0.08172929286956787,
-0.07828214764595032,
-0.03748228773474693,
-0.030800994485616684,
-0.047317806631326675,
0.09019043296575546,
0.13895612955093384,
-0.04711630567908287,
0.14952048659324646,
-0.0446433499455452,
0.10463690012693405,
0.09782509505748749,
-0.2661619186401367,
-0.046381838619709015,
0.12125692516565323,
0.0494246631860733,
0.060151852667331696,
-0.02049274928867817,
0.007311341352760792,
0.013606942258775234,
0.02752033993601799,
0.036812324076890945,
-0.07520071417093277,
-0.01884593814611435,
-0.05408325791358948,
-0.06992028653621674,
0.04448837786912918,
0.15845410525798798,
-0.015103540383279324,
-0.009650832042098045,
-0.11593902111053467,
-0.05248453468084335,
0.0371110662817955,
-0.046120576560497284,
0.00295951752923429,
-0.013525457121431828,
0.044255491346120834,
0.06225349381566048,
-0.05725949630141258,
-0.11657252907752991,
0.0011647302890196443,
-0.0024773937184363604,
0.025624901056289673,
0.04847978428006172,
0.013158741407096386,
-0.010344439186155796,
-0.01104350108653307,
-0.21274320781230927,
-0.07504136860370636,
-0.04601067677140236,
-0.058589570224285126,
-0.024843866005539894,
0.03462807834148407,
-0.05179506912827492,
-0.20447294414043427,
0.023269757628440857,
-0.03470747917890549,
-0.014004948548972607,
0.054653000086545944,
0.04420989379286766,
0.02338591404259205,
0.06472353637218475,
0.050831250846385956,
-0.09173181653022766,
-0.09605316817760468,
0.1310838907957077,
-0.08637961745262146,
0.06540986895561218,
0.01922529935836792,
-0.0897691398859024,
-0.07386212050914764,
0.010137424804270267,
0.07751885056495667,
0.0018880496500059962,
0.037414681166410446,
0.018072567880153656,
-0.0169483982026577,
0.10618294775485992,
-0.10081259906291962,
-0.02539125084877014,
-0.01801750436425209,
-0.043549999594688416,
-0.0007649899926036596,
-0.005840995814651251,
0.033414941281080246,
-0.07965125888586044,
-0.11108944565057755,
0.000950933201238513,
0.010323657654225826,
-0.1298646777868271,
-0.10330357402563095,
0.02161160297691822,
-0.07973609864711761,
-0.012268468737602234,
-0.18842124938964844,
-0.18371984362602234,
0.010225693695247173,
0.007639301009476185,
-0.11749997735023499,
-0.03958071768283844,
-0.03525227680802345,
0.030816875398159027,
-0.036512427031993866,
-0.03666146099567413,
0.08799576014280319,
-0.024157768115401268,
0.02670566365122795,
0.01142234168946743,
0.09594160318374634,
-0.05294092372059822,
0.0665501058101654,
-0.05517204850912094,
0.017662374302744865,
-0.12605886161327362,
0.13618238270282745,
-0.06314601004123688,
0.04567436873912811,
-0.09775994718074799,
-0.07588151097297668,
-0.024912867695093155,
-0.006779654882848263,
0.04138965532183647,
0.16313427686691284,
-0.12617667019367218,
0.02386891469359398,
0.11303704231977463,
-0.074740931391716,
-0.07897040247917175,
0.1643325537443161,
-0.0014226543717086315,
0.09152451902627945,
0.05064840614795685,
0.11976473778486252,
0.0897383987903595,
0.0469968356192112,
-0.014622137881815434,
-0.0033087863121181726,
-0.01745084673166275,
-0.15486665070056915,
0.08008041232824326,
0.02115067094564438,
0.04660383239388466,
0.030251581221818924,
0.06815861910581589,
0.0782521516084671,
-0.029540110379457474,
-0.05910652130842209,
-0.025131389498710632,
-0.07595144957304001,
-0.023430468514561653,
0.08566200733184814,
0.1391325294971466,
-0.0674900934100151,
-0.04626436531543732,
-0.06936585158109665,
0.09269967675209045,
-0.05703801289200783,
0.06588360667228699,
-0.08388753235340118,
0.1299198418855667,
-0.07610069960355759,
-0.00138841790612787,
-0.18843482434749603,
0.08155632764101028,
0.027891607955098152,
0.13487814366817474,
0.05731113627552986,
0.045244187116622925,
0.07181718200445175,
0.04880506917834282,
-0.06444661319255829,
0.02676907740533352,
0.11003422737121582,
-0.020351236686110497,
-0.09948166459798813,
-0.023859357461333275,
0.021519044414162636,
-0.0650644302368164,
0.1291503757238388,
-0.08757297694683075,
0.015219922177493572,
0.03997252136468887,
0.09536982327699661,
-0.0056202900595963,
-0.03230512887239456,
0.09644387662410736,
0.11383570730686188,
-0.023969927802681923,
-0.03241138532757759,
0.08513163030147552,
-0.0066597433760762215,
-0.08899497240781784,
0.09725737571716309,
-0.08068037033081055,
-0.057230450212955475,
0.06689932942390442,
-0.09259539842605591,
-0.005764015484601259,
-0.012535703368484974,
0.0025339899584650993,
-0.011272710748016834,
0.0122597711160779,
0.024845898151397705,
0.15873229503631592,
0.03559185564517975,
0.16652433574199677,
-0.07240007817745209,
0.010655212216079235,
-0.026995336636900902,
-0.07917911559343338,
0.0009758083033375442,
0.12123958021402359,
0.05387040600180626,
-0.10521922260522842,
0.12041924893856049,
-0.011526190675795078,
-0.03593054786324501,
0.12156295776367188,
0.02187044732272625,
-0.034298866987228394,
0.03385353833436966,
0.020932722836732864,
0.017371857538819313,
-0.003937484696507454,
-0.17912758886814117,
-0.028357528150081635,
0.060584522783756256,
-0.012158061377704144,
0.0553617961704731,
-0.03891701251268387,
0.0021550487726926804,
0.015379446558654308,
-0.003073466941714287,
0.05519840121269226,
0.11902827024459839,
-0.019273459911346436,
0.03152179345488548,
0.008958855643868446,
0.038801372051239014,
-0.009027538821101189,
-0.006502145901322365,
-0.09938790649175644,
0.16874459385871887,
-0.11168649047613144,
-0.26960766315460205,
-0.11728289723396301,
-0.050116345286369324,
-0.05183606967329979,
-0.008390455506742,
0.0958847776055336,
-0.11965315043926239,
-0.0854083001613617,
-0.05989011749625206,
0.04235228896141052,
-0.03367029130458832,
-0.07234738022089005,
-0.09044329077005386,
0.021222101524472237,
-0.05551701784133911,
-0.09813956916332245,
0.01411077007651329,
0.050943803042173386,
-0.06735531985759735,
0.11490573734045029,
-0.08201123774051666,
0.05688770115375519,
0.05704466998577118,
-0.03218163549900055,
-0.0013641281984746456,
-0.032469965517520905,
0.16539037227630615,
-0.03926484286785126,
0.10717957466840744,
0.2472284734249115,
0.0785212367773056,
0.05597461760044098,
0.09602013975381851,
-0.014519203454256058,
-0.042128145694732666,
0.01446186751127243,
-0.027893342077732086,
-0.06706307083368301,
-0.19944074749946594,
-0.13165970146656036,
-0.10945504158735275,
0.02375895343720913,
-0.014596798457205296,
0.018004603683948517,
-0.0006050898227840662,
0.13239021599292755,
-0.0038476320914924145,
0.00126946484670043,
-0.035057470202445984,
0.10156626254320145,
0.10435254871845245,
-0.03259021043777466,
0.10900042951107025,
-0.06128671020269394,
0.006630270276218653,
0.08868750929832458,
-0.00486655393615365,
0.17520393431186676,
-0.021925188601017,
0.17967446148395538,
0.04447762668132782,
0.1801389902830124,
0.05007051303982735,
0.05519639328122139,
-0.06716801226139069,
0.009349496103823185,
-0.02644989639520645,
-0.023510264232754707,
-0.11206529289484024,
0.0804547518491745,
0.13072432577610016,
-0.07327881455421448,
-0.015185375697910786,
0.04320678859949112,
-0.01634998619556427,
0.1915377378463745,
0.012682083994150162,
-0.1325025111436844,
-0.04876266047358513,
0.03560547158122063,
-0.07742159068584442,
-0.06358084082603455,
0.009501487948000431,
0.09094849228858948,
-0.09765347093343735,
-0.021526871249079704,
-0.05136258527636528,
0.09801369905471802,
-0.015740834176540375,
-0.01881730556488037,
-0.17116641998291016,
0.014302868396043777,
0.029278285801410675,
0.14546695351600647,
-0.1521759033203125,
0.19313237071037292,
0.0296039842069149,
0.028030281886458397,
-0.0901143029332161,
-0.0031376054976135492,
0.0061817471869289875,
0.10550471395254135,
0.07098868489265442,
-0.010586068034172058,
0.001138120424002409,
-0.08170219510793686,
-0.11124084889888763,
0.009779048152267933,
0.019081830978393555,
0.061022039502859116,
0.04999065399169922,
0.01491573080420494,
0.01571439951658249,
-0.03556984290480614,
0.02611372247338295,
-0.1330685019493103,
-0.15052513778209686,
0.04758968576788902,
0.009143375791609287,
0.05026823654770851,
-0.05499301105737686,
-0.05039522796869278,
-0.03162696957588196,
0.14228366315364838,
0.016261223703622818,
-0.1271786242723465,
-0.09376703202724457,
0.05215097591280937,
0.05360701307654381,
-0.10706719756126404,
0.08830233663320541,
-0.033641133457422256,
0.05276766046881676,
-0.034876611083745956,
-0.0031420255545526743,
0.023482419550418854,
-0.07618722319602966,
-0.018476122990250587,
0.03584799915552139,
0.08474134653806686,
0.021169070154428482,
0.021497467532753944,
0.040491510182619095,
0.013133554719388485,
-0.035378966480493546,
-0.037667714059352875,
-0.09615983068943024,
0.01042349822819233,
0.14581722021102905,
0.029901443049311638,
-0.09070931375026703,
-0.045014072209596634,
-0.06000474467873573,
-0.13771143555641174,
0.3017144799232483,
0.14959421753883362,
-0.07202885299921036,
0.10414686799049377,
0.14697976410388947,
-0.03333350270986557,
-0.22103464603424072,
-0.05167660117149353,
0.03773823380470276,
0.10488440841436386,
-0.014733878895640373,
-0.14060541987419128,
0.021573331207036972,
0.048792388290166855,
0.0010069430572912097,
-0.09989053010940552,
-0.16893906891345978,
-0.11989719420671463,
0.05305752158164978,
0.12145435810089111,
0.19945383071899414,
-0.1436726152896881,
-0.002249770099297166,
-0.06713026016950607,
-0.04796905443072319,
-0.03490181639790535,
-0.02130332961678505,
0.13028447329998016,
-0.0381469652056694,
0.09066396206617355,
0.021159213036298752,
-0.06665819138288498,
0.09958399832248688,
-0.06493493169546127,
-0.017271360382437706,
-0.08087696880102158,
0.04062099754810333,
0.003703060792759061,
-0.006419348064810038,
0.11742210388183594,
-0.04233319312334061,
0.00497324438765645,
-0.22716817259788513,
-0.057529643177986145,
-0.03884540870785713,
0.05062325298786163,
0.036287568509578705,
-0.10465270280838013,
-0.028892869129776955,
0.08899394422769547,
0.032933324575424194,
0.003328152233734727,
0.014326000586152077,
-0.11213956028223038,
-0.01921672746539116,
0.1863751858472824,
0.2211693823337555,
-0.08212108910083771,
0.053597643971443176,
-0.025129640474915504,
-0.027485592290759087,
0.02771024778485298,
-0.167860746383667,
-0.010432741604745388,
0.040677253156900406,
-0.025753101333975792,
0.07139268517494202,
-0.018227336928248405,
-0.04317505285143852,
0.08733848482370377,
0.0935276448726654,
0.03202439472079277,
-0.18451373279094696,
-0.021959928795695305,
0.06794466823339462,
-0.013063604012131691,
-0.035004548728466034,
0.06685103476047516,
-0.10749250650405884,
-0.0580194890499115,
-0.031637318432331085,
0.013252966105937958,
-0.04244918003678322,
0.08251774311065674,
0.08108937740325928,
0.03270687162876129,
-0.035745542496442795,
0.06024632975459099,
0.05028511956334114,
-0.09338238835334778,
0.025536727160215378,
0.19667652249336243,
-0.09508433192968369,
-0.11948418617248535,
0.024652834981679916,
0.05229029059410095,
-0.029446454718708992,
0.0009055306436493993,
-0.027524257078766823,
-0.03417881950736046,
-0.0013861048500984907,
0.09725836664438248,
0.031192168593406677,
-0.08498075604438782,
-0.07352574169635773,
-0.01327380072325468,
-0.10028064250946045,
0.06060535088181496,
0.08472619950771332,
-0.011669939383864403,
-0.069130539894104,
0.06536263972520828,
0.00745004927739501,
0.06927490234375,
-0.032602857798337936,
-0.015257628634572029,
-0.03716248273849487,
0.000968615640886128,
-0.12640289962291718,
-0.03249601647257805,
-0.15496976673603058,
-0.0212724506855011,
-0.06312646716833115,
-0.01663779467344284,
-0.0007593609625473619,
0.004938356578350067,
-0.053841929882764816,
-0.007294591516256332,
-0.08642430603504181,
0.055934254080057144,
0.0008437962969765067,
-0.05362887680530548,
0.03313998505473137,
-0.012166888453066349,
0.0807606503367424,
0.012750137597322464,
-0.07629501819610596,
-0.0025033531710505486,
0.031245507299900055,
-0.0000969256361713633,
0.0030582412146031857,
0.0845179334282875,
0.06275554746389389,
-0.10712266713380814,
-0.007084456272423267,
0.040845852345228195,
0.056329164654016495,
0.03829924389719963,
0.15390896797180176,
-0.09178876131772995,
0.06942825019359589,
-0.09847712516784668,
-0.05858667939901352,
-0.07798659056425095,
0.06409143656492233,
0.007946804165840149,
0.06361588835716248,
0.11942889541387558,
-0.0768343061208725,
0.10184488445520401,
-0.07532692700624466,
0.02260446920990944,
-0.02547677978873253,
-0.04887034371495247,
-0.017321456223726273,
-0.0983298048377037,
0.057731688022613525,
-0.008515151217579842,
0.1007644459605217,
0.043037086725234985,
-0.03859628364443779,
0.04726674035191536,
-0.062261104583740234,
0.12818393111228943,
-0.04966878890991211,
0.17436286807060242,
0.0685681700706482,
0.029308469966053963,
0.018856041133403778,
0.1031375452876091,
0.02668306790292263,
0.09052436798810959,
0.015240856446325779,
0.20514196157455444,
0.06498869508504868,
0.09068992733955383,
-0.007640811149030924,
-0.01332738809287548,
-0.02000635676085949,
-0.05324055254459381,
-0.017860040068626404,
0.03767738491296768,
-0.09716054052114487,
0.14329926669597626,
0.18701809644699097,
-0.13997027277946472,
0.05911988019943237,
-0.01419634185731411,
-0.0963086411356926,
-0.0718296617269516,
-0.07758284360170364,
-0.09276295453310013,
-0.09608979523181915,
0.0172100979834795,
-0.10396718233823776,
-0.012203100137412548,
0.086476169526577,
0.07084780186414719,
-0.026397669687867165,
0.16663770377635956,
-0.1024959459900856,
-0.03772226348519325,
0.027325768023729324,
-0.020575668662786484,
0.0576288178563118,
0.006070986855775118,
-0.0020761266350746155,
-0.07224009931087494,
-0.020267082378268242,
0.03688061982393265,
0.0619712769985199,
-0.0520586259663105,
0.023199422284960747,
-0.062259722501039505,
-0.07455887645483017,
-0.04268965870141983,
0.09990403801202774,
0.08178477734327316,
0.28202593326568604,
0.003298432333394885,
-0.030517760664224625,
-0.013810819946229458,
0.1354808509349823,
-0.03426860272884369,
-0.11701856553554535,
-0.12065015733242035,
0.13792067766189575,
0.051638782024383545,
-0.026957251131534576,
-0.00928720086812973,
-0.07555867731571198,
0.004283031914383173,
0.2804434597492218,
0.20225156843662262,
-0.0474514402449131,
-0.027827978134155273,
-0.014690525829792023,
0.009424131363630295,
-0.06082550436258316,
0.16273711621761322,
0.10818411409854889,
0.14581972360610962,
-0.12054337561130524,
0.03188684210181236,
-0.08941618353128433,
-0.021464021876454353,
-0.24844412505626678,
0.08805795013904572,
-0.006731456611305475,
-0.047986630350351334,
0.01821829006075859,
0.11614786833524704,
-0.037199851125478745,
-0.010412617586553097,
-0.024326058104634285,
-0.09805182367563248,
-0.13868287205696106,
-0.019459303468465805,
-0.058866359293460846,
0.024789804592728615,
0.06046334281563759,
0.007886836305260658,
0.07605353742837906,
0.05157986655831337,
0.05976192280650139,
-0.0713304877281189,
-0.00596048217266798,
0.16187401115894318,
-0.01874723844230175,
0.03215690329670906,
-0.02519245631992817,
0.11615443974733353,
0.11257407814264297,
0.02490045502781868,
-0.09477632492780685,
0.07829113304615021,
0.03013765811920166,
-0.0006758379749953747,
0.06497017294168472,
0.10923454165458679,
-0.042652521282434464,
0.019778434187173843,
-0.010566212236881256,
-0.0740002766251564,
0.04389045014977455,
0.05419085547327995,
0.0720389112830162,
-0.07247942686080933,
0.13104453682899475,
-0.09526099264621735,
0.12365159392356873,
0.10144909471273422,
-0.03631366416811943,
-0.08396487683057785,
-0.07929393649101257,
0.002911545103415847,
0.012960544787347317,
-0.026287632063031197,
-0.11547589302062988,
-0.24381862580776215,
-0.04882686212658882,
-0.00046462981845252216,
0.03159913048148155,
-0.2516394257545471,
-0.04926551878452301,
-0.031624920666217804,
0.0020727897062897682,
-0.09151053428649902,
0.10579922795295715,
0.07268711924552917,
-0.048466622829437256,
-0.0025515425950288773,
-0.1684071570634842,
-0.04526020586490631,
0.06115935370326042,
-0.14232121407985687,
-0.08825938403606415
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# output
This model is a fine-tuned version of [hf-test/xls-r-dummy](https://huggingface.co/hf-test/xls-r-dummy) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - AB dataset.
It achieves the following results on the evaluation set:
- Loss: 156.8789
- Wer: 1.3456
## Model description
More information needed
## Intended uses & limitations
More information needed
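Given the reported WER above 1.0 and the 10-step training run, this looks like a smoke test rather than a usable recognizer, but as a sketch the checkpoint could be loaded like this (the audio file path is an assumption; input should be 16 kHz mono speech):

```python
from transformers import pipeline

# Load the fine-tuned Wav2Vec2 checkpoint into an ASR pipeline.
asr = pipeline("automatic-speech-recognition", model="deepdml/output")

# sample.wav is a hypothetical 16 kHz mono recording in Abkhaz.
print(asr("sample.wav")["text"])
```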
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10
- mixed_precision_training: Native AMP
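A hedged reconstruction of these settings as `TrainingArguments` (the `output_dir` is an assumption; `fp16=True` stands in for the Native AMP mixed precision noted above):

```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="output",   # assumed from the model name
    learning_rate=3e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=10,
    fp16=True,  # Native AMP mixed-precision training
)
```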
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
{"language": ["ab"], "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_7_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "output", "results": []}]}
|
automatic-speech-recognition
|
deepdml/output
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_7_0",
"generated_from_trainer",
"ab",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ab"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #ab #dataset-common_voice #endpoints_compatible #region-us
|
# output
This model is a fine-tuned version of hf-test/xls-r-dummy on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - AB dataset.
It achieves the following results on the evaluation set:
- Loss: 156.8789
- Wer: 1.3456
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
[
"# output\n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - AB dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 156.8789\n- Wer: 1.3456",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- training_steps: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #ab #dataset-common_voice #endpoints_compatible #region-us \n",
"# output\n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - AB dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 156.8789\n- Wer: 1.3456",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- training_steps: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
71,
71,
6,
12,
8,
3,
101,
4,
41
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #ab #dataset-common_voice #endpoints_compatible #region-us \n# output\n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - AB dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 156.8789\n- Wer: 1.3456## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- training_steps: 10\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
-0.11410809308290482,
0.11788009107112885,
-0.0038645819295197725,
0.03093365579843521,
0.1420757919549942,
0.012670273892581463,
0.06490922719240189,
0.1582067757844925,
-0.094411201775074,
0.0932377502322197,
0.072003573179245,
0.018965335562825203,
0.09032353013753891,
0.0740947276353836,
-0.008349889889359474,
-0.2572937607765198,
0.020751921460032463,
-0.01622006669640541,
-0.04909496009349823,
0.08421630412340164,
0.13518935441970825,
-0.08303423970937729,
0.031095825135707855,
0.04851124435663223,
-0.1485913246870041,
0.014298614114522934,
-0.028376653790473938,
-0.048995498567819595,
0.08449357748031616,
0.02776942029595375,
0.056935299187898636,
0.004701922181993723,
0.06890033185482025,
-0.2485143095254898,
0.0014626565389335155,
0.06725228577852249,
0.07221459597349167,
0.07712901383638382,
0.06490781903266907,
0.01983778364956379,
0.06945503503084183,
-0.12622176110744476,
0.042032357305288315,
0.08561975508928299,
-0.08203660696744919,
-0.2244582325220108,
-0.08984916657209396,
0.0459698811173439,
0.07739827781915665,
0.1051279753446579,
-0.021497955545783043,
0.11716734617948532,
-0.02652638964354992,
0.0650903657078743,
0.19127510488033295,
-0.2271103411912918,
-0.04197271540760994,
-0.00299051683396101,
0.07986672967672348,
0.07461521029472351,
-0.10352536290884018,
0.011396044865250587,
0.06368618458509445,
0.01290148962289095,
0.042717333883047104,
0.015537585131824017,
-0.006616663653403521,
-0.025518398731946945,
-0.112175852060318,
-0.07427243143320084,
0.23219197988510132,
0.07784435153007507,
-0.06899352371692657,
-0.10580266267061234,
-0.002577045699581504,
-0.12770847976207733,
-0.028508642688393593,
-0.006867008283734322,
0.004315304569900036,
-0.022141335532069206,
-0.021777674555778503,
-0.06215786188840866,
-0.08887340128421783,
-0.06437816470861435,
0.06639137864112854,
0.12925630807876587,
0.03462596237659454,
-0.007148282136768103,
-0.018782922998070717,
0.08558407425880432,
-0.033727679401636124,
-0.1414598822593689,
-0.05425629764795303,
-0.005897045601159334,
-0.07007754594087601,
-0.029745934531092644,
-0.05489519238471985,
-0.04704135283827782,
0.03353048488497734,
0.09027598053216934,
-0.033026114106178284,
0.07922323793172836,
-0.003992450423538685,
0.011022050864994526,
-0.003568635554984212,
0.1743963360786438,
-0.009278719313442707,
-0.03653150051832199,
-0.0009899006690829992,
0.11060168594121933,
-0.00955385621637106,
-0.025809714570641518,
-0.06820174306631088,
-0.029445165768265724,
0.1235651895403862,
0.0837332233786583,
-0.01339734997600317,
0.0029976856894791126,
-0.0852036103606224,
-0.033164385706186295,
0.0031750856433063745,
-0.13587374985218048,
0.03210437670350075,
0.005980128888040781,
-0.060638878494501114,
0.021235089749097824,
0.020162492990493774,
0.0119690066203475,
-0.06527376174926758,
-0.0029715122655034065,
-0.052739351987838745,
-0.01996740698814392,
-0.0696958377957344,
-0.07019693404436111,
0.013231733813881874,
-0.010560431517660618,
0.014531577937304974,
-0.06994315981864929,
-0.13038361072540283,
-0.05074251815676689,
0.024806629866361618,
-0.07318567484617233,
-0.08084656298160553,
-0.018981363624334335,
-0.05688944458961487,
0.025400765240192413,
-0.020882656797766685,
0.11157001554965973,
-0.0219731405377388,
0.07309295237064362,
0.06117240712046623,
0.012974708341062069,
0.04614405706524849,
0.06883803009986877,
-0.03288858011364937,
0.05139075592160225,
-0.0912495031952858,
0.12664112448692322,
-0.1236044391989708,
0.04581601545214653,
-0.15075437724590302,
-0.10027188807725906,
-0.012944784015417099,
-0.024777540937066078,
0.10226138681173325,
0.12769447267055511,
-0.1443968266248703,
-0.04517336189746857,
0.1223326250910759,
-0.05816081166267395,
-0.07172776758670807,
0.12315113842487335,
-0.012838369235396385,
0.021630287170410156,
0.051769476383924484,
0.17974750697612762,
0.14848309755325317,
-0.11863373219966888,
0.0021639596670866013,
0.002918548882007599,
0.08273206651210785,
0.04028824716806412,
0.08027146756649017,
-0.06811417639255524,
-0.023074351251125336,
0.01498294435441494,
-0.058590810745954514,
0.019119800999760628,
-0.07922301441431046,
-0.07546442747116089,
-0.03981956094503403,
-0.06617024540901184,
0.029258696362376213,
0.013870414346456528,
0.010357474908232689,
-0.07934083044528961,
-0.12021725624799728,
0.05377570912241936,
0.13163243234157562,
-0.07418067753314972,
0.02532532811164856,
-0.08453690260648727,
0.04683184623718262,
-0.06190008670091629,
-0.008127572014927864,
-0.18892303109169006,
-0.04019700363278389,
0.06951843202114105,
-0.09998447448015213,
0.04148993268609047,
0.0038307283539325,
0.05083621293306351,
0.0380849651992321,
-0.003080366412177682,
-0.040857456624507904,
-0.10041836649179459,
0.00471071433275938,
-0.07609215378761292,
-0.16162435710430145,
-0.07721517235040665,
-0.02843133546411991,
0.1783212274312973,
-0.22111451625823975,
-0.0038820644840598106,
0.030070936307311058,
0.1029680147767067,
-0.008138247765600681,
-0.06333176046609879,
0.025566527619957924,
0.025269262492656708,
-0.0010560218943282962,
-0.0906691700220108,
0.017520273104310036,
0.02616131864488125,
-0.12301698327064514,
0.003988254349678755,
-0.14600501954555511,
-0.015773305669426918,
0.07087401300668716,
0.050961222499608994,
-0.04914833605289459,
-0.03780553117394447,
-0.04800765588879585,
-0.0506528839468956,
-0.03058541566133499,
-0.06470092386007309,
0.19505847990512848,
0.0173787884414196,
0.12402530759572983,
-0.06904583424329758,
-0.04016590490937233,
0.027159590274095535,
-0.00966718140989542,
-0.02450820803642273,
0.06968401372432709,
-0.03876832500100136,
-0.10611080378293991,
0.0518881119787693,
0.052983928471803665,
-0.027471452951431274,
0.15683741867542267,
-0.07362566143274307,
-0.10394109040498734,
-0.025212442502379417,
0.02257850021123886,
0.015791745856404305,
0.10017139464616776,
-0.16514557600021362,
-0.004154777154326439,
0.05037006735801697,
0.026528773829340935,
0.04072299599647522,
-0.16167616844177246,
0.03339120373129845,
0.05720498785376549,
-0.053365398198366165,
-0.008612534031271935,
-0.007694805506616831,
-0.008248363621532917,
0.06028932332992554,
0.021462397649884224,
-0.008995488286018372,
0.01408228650689125,
-0.03774430975317955,
-0.08749961853027344,
0.14624616503715515,
-0.12338113784790039,
-0.16113939881324768,
-0.1620427519083023,
0.026519563049077988,
-0.030961453914642334,
-0.015679264441132545,
0.027974285185337067,
-0.08756199479103088,
-0.07017625123262405,
-0.06430427730083466,
0.0059364549815654755,
-0.10407677292823792,
-0.008536562323570251,
0.09860152751207352,
-0.017515432089567184,
0.12299025058746338,
-0.13628828525543213,
0.001578141120262444,
0.017278583720326424,
-0.04239847511053085,
-0.029500799253582954,
0.009819827042520046,
0.08377692103385925,
0.11598454415798187,
0.004917338024824858,
0.0251722801476717,
-0.01818864792585373,
0.2551199793815613,
-0.10436952114105225,
-0.00871250219643116,
0.14909303188323975,
0.010151565074920654,
0.060750942677259445,
0.08714675903320312,
0.02330804616212845,
-0.07908593863248825,
0.029874248430132866,
0.018063100054860115,
-0.004239847883582115,
-0.2710329592227936,
-0.03536408394575119,
-0.0654875859618187,
-0.09007382392883301,
0.08278044313192368,
0.04493562877178192,
0.0638267919421196,
0.034359607845544815,
-0.04531172662973404,
0.040190111845731735,
-0.010020515881478786,
0.09605243057012558,
0.12548714876174927,
0.02539299987256527,
0.07521997392177582,
-0.029411908239126205,
-0.006301417946815491,
0.0459212027490139,
0.019417688250541687,
0.25390785932540894,
0.03825058788061142,
0.22072084248065948,
0.03660408407449722,
0.13402235507965088,
0.011301646009087563,
0.033981699496507645,
0.01938551291823387,
0.010779592208564281,
0.016438256949186325,
-0.06293725967407227,
-0.04966631159186363,
0.018890542909502983,
0.07690128684043884,
0.030312780290842056,
-0.09696986526250839,
0.017892738804221153,
-0.010988087393343449,
0.2858976423740387,
0.06185662001371384,
-0.27048107981681824,
-0.08224809169769287,
0.014725650660693645,
-0.04195496812462807,
-0.10968724638223648,
-0.0054664891213178635,
0.08814622461795807,
-0.13865742087364197,
0.046211712062358856,
-0.04389714077115059,
0.10389883816242218,
-0.03888188302516937,
0.023115480318665504,
0.02294090762734413,
0.11906535923480988,
-0.0047599258832633495,
0.09781718254089355,
-0.20832113921642303,
0.19881483912467957,
0.014861038886010647,
0.10351686924695969,
-0.05445472151041031,
0.05564471706748009,
0.011900176294147968,
0.024853313341736794,
0.09469573199748993,
0.015124199911952019,
-0.053149085491895676,
-0.1613284945487976,
-0.08309414982795715,
0.018739890307188034,
0.1252945065498352,
-0.029838763177394867,
0.08775684237480164,
-0.06500531733036041,
0.0028600189834833145,
0.03531840816140175,
-0.02864888310432434,
-0.18716546893119812,
-0.13127049803733826,
0.05008219555020332,
0.05143730714917183,
0.08730170875787735,
-0.088532954454422,
-0.07795939594507217,
-0.020959923043847084,
0.1800340712070465,
-0.042905356734991074,
-0.06136096268892288,
-0.1429327428340912,
0.08801647275686264,
0.1556137651205063,
-0.04769188165664673,
0.032050423324108124,
0.008643894456326962,
0.17166535556316376,
0.0192751195281744,
-0.03524491935968399,
0.043790578842163086,
-0.06634730845689774,
-0.16436725854873657,
-0.035502839833498,
0.177449032664299,
0.007701623719185591,
0.0607648640871048,
0.015148431062698364,
0.0008281906484626234,
-0.010253806598484516,
-0.08159303665161133,
0.03791298717260361,
0.063517726957798,
0.008462440222501755,
0.06143965572118759,
-0.016164714470505714,
0.02206968143582344,
-0.09587551653385162,
-0.04565426707267761,
0.14030270278453827,
0.2323281466960907,
-0.060288991779088974,
0.044085148721933365,
0.05470564216375351,
-0.08051589876413345,
-0.13777293264865875,
0.0162949301302433,
0.14609117805957794,
0.05480406805872917,
0.026637572795152664,
-0.1804560422897339,
0.029897231608629227,
0.0715409442782402,
-0.03385505825281143,
0.03284362703561783,
-0.2780005931854248,
-0.12003615498542786,
0.08392998576164246,
0.05637826398015022,
0.022804662585258484,
-0.13812218606472015,
-0.07905459403991699,
-0.05469242110848427,
-0.10250791907310486,
0.02897169254720211,
0.0017138785915449262,
0.129121333360672,
0.005296314600855112,
0.0848117545247078,
0.04819594696164131,
-0.03259661793708801,
0.18804645538330078,
0.02392730489373207,
0.03773381561040878,
-0.02250015176832676,
0.07051551342010498,
0.0748949870467186,
-0.07041528075933456,
0.07825443148612976,
-0.04726903885602951,
0.05952591076493263,
-0.1748853474855423,
-0.024431169033050537,
-0.06558436900377274,
0.04951188713312149,
-0.048510439693927765,
-0.02900862880051136,
-0.017857607454061508,
0.036343447864055634,
0.045700397342443466,
0.007872221060097218,
0.060471486300230026,
-0.02371666580438614,
0.05164936184883118,
0.14219021797180176,
0.11467825621366501,
0.043734680861234665,
-0.13593052327632904,
-0.003930232487618923,
-0.007652838248759508,
0.07833205163478851,
-0.12479902803897858,
0.03434683382511139,
0.09677527099847794,
0.03617590293288231,
0.1434636116027832,
0.010438108816742897,
-0.13299638032913208,
0.010030217468738556,
0.030614085495471954,
-0.040462423115968704,
-0.1866283267736435,
-0.0601077638566494,
0.06854899972677231,
-0.141091987490654,
-0.006393342278897762,
0.11111637204885483,
-0.06318200379610062,
-0.015275758691132069,
-0.020086757838726044,
0.009697167202830315,
-0.052306097000837326,
0.19358515739440918,
0.04194682091474533,
0.10389579087495804,
-0.07546928524971008,
0.10402052849531174,
0.08632785826921463,
-0.05488232150673866,
0.07583609223365784,
0.031464505940675735,
-0.049880098551511765,
-0.02068527229130268,
0.01903362199664116,
0.06559491902589798,
0.006749439053237438,
-0.06109664589166641,
-0.04583408683538437,
-0.11102105677127838,
0.027766713872551918,
0.005475366488099098,
0.0174822136759758,
0.013380901888012886,
-0.03258746489882469,
0.006668424233794212,
-0.15086323022842407,
0.08579108119010925,
0.05637991800904274,
0.06348694115877151,
-0.14603294432163239,
0.03972743824124336,
0.005932189989835024,
0.04279544949531555,
0.0005523643922060728,
-0.039547961205244064,
-0.059709224849939346,
0.0050499881617724895,
-0.1429479420185089,
0.0003183726512361318,
-0.014965411275625229,
-0.01309188362210989,
-0.0038531930185854435,
-0.05719004198908806,
-0.01962517574429512,
0.08446812629699707,
-0.0719061940908432,
-0.09091034531593323,
-0.00023192536900751293,
0.0717993825674057,
-0.07953204214572906,
-0.001794151496142149,
0.04773091897368431,
-0.12472919374704361,
0.0734688937664032,
0.058310430496931076,
0.021599367260932922,
0.023010393604636192,
-0.060992103070020676,
-0.024570554494857788,
0.0333649106323719,
0.033233821392059326,
0.039554696530103683,
-0.11985297501087189,
-0.019996192306280136,
-0.012819353491067886,
0.03054865635931492,
-0.024487432092428207,
0.012871853075921535,
-0.12303787469863892,
-0.07056605815887451,
-0.04834505543112755,
-0.012806359678506851,
-0.059116411954164505,
0.04249276593327522,
0.062119100242853165,
0.040122140198946,
0.13858751952648163,
-0.06408558785915375,
0.060250621289014816,
-0.19778965413570404,
-0.006526526995003223,
-0.03353805094957352,
-0.034334566444158554,
-0.04381071776151657,
-0.04305574297904968,
0.08287344127893448,
-0.057797934859991074,
0.08877641707658768,
-0.03733455389738083,
0.10555602610111237,
0.031331662088632584,
-0.043589454144239426,
-0.003970818594098091,
0.0014609466306865215,
0.21614933013916016,
0.08461817353963852,
0.0014492934569716454,
0.09340125322341919,
-0.03425358235836029,
0.06164552643895149,
0.10281723737716675,
0.08406322449445724,
0.17937761545181274,
0.006027582101523876,
0.07194878906011581,
0.07380678504705429,
-0.09358914941549301,
-0.1389167308807373,
0.1003899872303009,
-0.0032992991618812084,
0.13371087610721588,
-0.026814881712198257,
0.11564329266548157,
0.1430426388978958,
-0.15958347916603088,
0.0688004121184349,
-0.05561906099319458,
-0.11330974102020264,
-0.09490399062633514,
-0.10775041580200195,
-0.07136814296245575,
-0.13256491720676422,
0.0206808689981699,
-0.10947933793067932,
0.05346503108739853,
0.06009179353713989,
0.00925588607788086,
-0.014162315055727959,
0.15212677419185638,
0.02280491404235363,
-0.0421377457678318,
0.10163339227437973,
-0.0330708809196949,
0.0025998628698289394,
-0.03320617601275444,
-0.06252899020910263,
0.10499255359172821,
0.029068876057863235,
0.10255450755357742,
-0.009813579730689526,
-0.0295676589012146,
0.05317835509777069,
-0.004124297294765711,
-0.12362241744995117,
0.02027035877108574,
-0.004825827199965715,
0.02446518838405609,
0.07058082520961761,
0.050620026886463165,
-0.011454551480710506,
-0.03306281566619873,
0.22291810810565948,
-0.05718304216861725,
-0.07450110465288162,
-0.11934351176023483,
0.15824450552463531,
0.04008752852678299,
-0.014669474214315414,
0.03832335025072098,
-0.11551632732152939,
0.007517674472182989,
0.1403241753578186,
0.10936295241117477,
-0.017274007201194763,
-0.023670619353652,
-0.023231511935591698,
-0.023397792130708694,
-0.07065266370773315,
0.10962122678756714,
0.09453722834587097,
-0.027543606236577034,
-0.025946365669369698,
0.06801097095012665,
-0.015688039362430573,
-0.06223531812429428,
-0.0928335040807724,
0.08363711833953857,
0.020589975640177727,
-0.008451144210994244,
-0.0012775629293173552,
0.10118508338928223,
-0.0021961950697004795,
-0.16848815977573395,
-0.018449490889906883,
-0.10345292091369629,
-0.16882558166980743,
-0.03938180208206177,
0.06698397547006607,
0.044561468064785004,
0.055063918232917786,
-0.031066693365573883,
0.0019903879147022963,
0.1532028615474701,
-0.01379524078220129,
-0.02597995102405548,
-0.07052171230316162,
0.08304393291473389,
-0.09927528351545334,
0.17238816618919373,
0.015144003555178642,
0.07063543796539307,
0.08904135227203369,
0.021023601293563843,
-0.1516495943069458,
0.049156248569488525,
0.05732150003314018,
-0.07622911036014557,
0.06260951608419418,
0.21778279542922974,
-0.011827316135168076,
0.09266411513090134,
0.02659023553133011,
-0.13194319605827332,
-0.03973253071308136,
-0.03881160914897919,
0.0126276146620512,
-0.07340159267187119,
-0.022151879966259003,
-0.06185228377580643,
0.14730443060398102,
0.1660565286874771,
-0.08492579311132431,
-0.03356415405869484,
-0.06597921997308731,
0.010093611665070057,
0.060987383127212524,
0.1333705633878708,
-0.03948013484477997,
-0.20266225934028625,
-0.0002835500054061413,
0.006431426852941513,
0.029649006202816963,
-0.22200782597064972,
-0.08140841871500015,
0.023742742836475372,
-0.047755204141139984,
-0.02779935672879219,
0.10346575081348419,
0.0662137046456337,
0.016997022554278374,
-0.03621665760874748,
-0.11787099391222,
-0.04435913637280464,
0.15581096708774567,
-0.16055427491664886,
-0.05000138655304909
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4798
- Wer: 0.3474
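The usage sections below are still placeholders, so here is a minimal, hedged inference sketch. The model id is this repository's; the audio file name is a hypothetical placeholder, and the snippet assumes the generic Transformers ASR pipeline rather than any script from the original training run.
```python
from transformers import pipeline

# Load this checkpoint through the generic ASR pipeline;
# "speech.wav" is a placeholder for a 16 kHz mono recording.
asr = pipeline(
    "automatic-speech-recognition",
    model="deepdml/wav2vec2-base-timit-demo-colab",
)
print(asr("speech.wav")["text"])
```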
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
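These settings map onto Transformers' `TrainingArguments` roughly as sketched below. This is a reconstruction under stated assumptions, not the original training script: `output_dir` is a placeholder, and the Adam configuration listed above is simply the Trainer's default, so it needs no explicit setup.
```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameter list above (Transformers 4.x API).
training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-colab",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
```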
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.5229 | 4.0 | 500 | 1.6557 | 1.0422 |
| 0.6618 | 8.0 | 1000 | 0.4420 | 0.4469 |
| 0.2211 | 12.0 | 1500 | 0.4705 | 0.4002 |
| 0.1281 | 16.0 | 2000 | 0.4347 | 0.3688 |
| 0.0868 | 20.0 | 2500 | 0.4653 | 0.3590 |
| 0.062 | 24.0 | 3000 | 0.4747 | 0.3519 |
| 0.0472 | 28.0 | 3500 | 0.4798 | 0.3474 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.0+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-base-timit-demo-colab", "results": []}]}
|
automatic-speech-recognition
|
deepdml/wav2vec2-base-timit-demo-colab
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
|
wav2vec2-base-timit-demo-colab
==============================
This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4798
* Wer: 0.3474
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 32
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 30
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.9.0+cu102
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
52,
130,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.11214473098516464,
0.10370084643363953,
-0.0024385200813412666,
0.070782370865345,
0.11895522475242615,
-0.013722525909543037,
0.10932900756597519,
0.1477128565311432,
-0.09011273086071014,
0.06015355885028839,
0.12885257601737976,
0.14541617035865784,
0.04019111022353172,
0.1417086273431778,
-0.049511153250932693,
-0.28136885166168213,
0.03455815836787224,
0.02374921552836895,
-0.009118282236158848,
0.129059299826622,
0.09064991772174835,
-0.1247270405292511,
0.05449943616986275,
0.022639736533164978,
-0.14964641630649567,
-0.000010303616363671608,
0.00516656506806612,
-0.09649047255516052,
0.13179023563861847,
0.012348159216344357,
0.06788723170757294,
0.0313396155834198,
0.08009502291679382,
-0.23875337839126587,
0.003950906451791525,
0.04104439169168472,
0.03849348425865173,
0.06817072629928589,
0.04876779019832611,
-0.022306786850094795,
0.11508095264434814,
-0.09357956051826477,
0.07261358201503754,
0.03519876301288605,
-0.10390425473451614,
-0.2924153804779053,
-0.0828220322728157,
0.036717966198921204,
0.07207076251506805,
0.10476227849721909,
-0.01404489018023014,
0.12643758952617645,
-0.0701049342751503,
0.10255704820156097,
0.26522576808929443,
-0.29694855213165283,
-0.04103218391537666,
-0.047266457229852676,
0.03153088688850403,
0.06162836030125618,
-0.10132772475481033,
-0.02314247563481331,
0.025775443762540817,
0.04542151838541031,
0.12796756625175476,
-0.016505537554621696,
-0.09307383745908737,
0.0011465564602985978,
-0.1457781344652176,
-0.05850032716989517,
0.12493040412664413,
0.03121056593954563,
-0.042980752885341644,
-0.09416098147630692,
-0.048031602054834366,
-0.17648763954639435,
-0.05635672062635422,
-0.005123773589730263,
0.04115751013159752,
-0.03883517161011696,
-0.09882000833749771,
-0.0110187828540802,
-0.07281243056058884,
-0.07461711764335632,
-0.03978709876537323,
0.167136088013649,
0.05332247167825699,
-0.00034231500467285514,
-0.024074777960777283,
0.07600751519203186,
-0.019260354340076447,
-0.13636724650859833,
-0.015807922929525375,
0.03737868368625641,
-0.00853126309812069,
-0.008682428859174252,
-0.06298674643039703,
-0.024536149576306343,
0.02860306203365326,
0.1436900794506073,
-0.09887834638357162,
0.0827849730849266,
-0.006543218623846769,
0.03151746466755867,
-0.11148041486740112,
0.20944760739803314,
-0.03695754334330559,
-0.0019109154818579555,
-0.0010682246647775173,
0.05435805767774582,
0.0279711727052927,
-0.026410676538944244,
-0.09445787221193314,
0.013829929754137993,
0.11809737980365753,
0.04471474885940552,
-0.05429545417428017,
0.05654800683259964,
-0.030761508271098137,
-0.008597606793045998,
-0.0164768248796463,
-0.11702682822942734,
0.03221172094345093,
0.02010699175298214,
-0.07381196320056915,
0.000547310512047261,
0.015503088012337685,
0.01735644042491913,
-0.056133829057216644,
0.07807658612728119,
-0.055622268468141556,
0.029748668894171715,
-0.06312848627567291,
-0.11824479699134827,
0.019697396084666252,
-0.08585454523563385,
0.005331470165401697,
-0.10746414959430695,
-0.12037722766399384,
-0.017954299226403236,
0.03943018987774849,
-0.032584719359874725,
-0.033908355981111526,
-0.08924337476491928,
-0.07779963314533234,
0.03837104141712189,
-0.03886183723807335,
0.09792952239513397,
-0.0723726749420166,
0.10345923155546188,
0.028968386352062225,
0.0773681253194809,
-0.024795232340693474,
0.06444209069013596,
-0.08169813454151154,
0.01176672987639904,
-0.17457133531570435,
0.07717481255531311,
-0.07615338265895844,
0.04829688370227814,
-0.12250902503728867,
-0.12005047500133514,
0.03296753019094467,
-0.007135916035622358,
0.09925027936697006,
0.10554897785186768,
-0.16979269683361053,
-0.08307141065597534,
0.1909225881099701,
-0.0755191519856453,
-0.07880755513906479,
0.12604989111423492,
-0.03525959700345993,
0.005825897678732872,
0.06654960662126541,
0.2581895589828491,
0.05304950848221779,
-0.1118726059794426,
0.026246558874845505,
-0.0320756733417511,
0.06797488778829575,
-0.03552399203181267,
0.061912041157484055,
-0.03830990940332413,
0.030873006209731102,
0.027489615604281425,
-0.02434137649834156,
0.05651959404349327,
-0.08972521871328354,
-0.08367308229207993,
-0.045560259371995926,
-0.09319767355918884,
0.026835614815354347,
0.03410099074244499,
0.061430566012859344,
-0.11332698166370392,
-0.0916142538189888,
0.026727747172117233,
0.09079878032207489,
-0.10286348313093185,
0.062312543392181396,
-0.09599673002958298,
0.0785546749830246,
-0.0051842317916452885,
-0.008452384732663631,
-0.18430572748184204,
0.04598286375403404,
0.033948928117752075,
-0.026058468967676163,
0.03861130401492119,
-0.06257335841655731,
0.0700770691037178,
0.04472716897726059,
-0.034350112080574036,
-0.04164010286331177,
-0.013651310466229916,
0.008428918197751045,
-0.08971744030714035,
-0.20230749249458313,
-0.028348619118332863,
-0.024750815704464912,
0.10116053372621536,
-0.14936310052871704,
0.02370390109717846,
0.030308565124869347,
0.08174645155668259,
0.01639682985842228,
-0.027128340676426888,
-0.012096966616809368,
0.09581369906663895,
-0.015304217115044594,
-0.056514542549848557,
0.06336239725351334,
0.01412904728204012,
-0.09003004431724548,
0.024651991203427315,
-0.12521696090698242,
0.11618523299694061,
0.14318780601024628,
-0.025575609877705574,
-0.06384526193141937,
-0.004643653053790331,
-0.04558837041258812,
-0.027341771870851517,
-0.02477261610329151,
0.040201373398303986,
0.21856409311294556,
0.001820394885726273,
0.14207014441490173,
-0.08715210855007172,
-0.035228431224823,
0.03664784133434296,
-0.02265552617609501,
0.009653492830693722,
0.12749557197093964,
0.0605907142162323,
-0.04544289782643318,
0.11144464462995529,
0.1086469516158104,
-0.08746036142110825,
0.11809761822223663,
-0.06356001645326614,
-0.07827940583229065,
-0.014695772901177406,
-0.0016807183856144547,
0.007264620624482632,
0.09698590636253357,
-0.14143376052379608,
-0.029851486906409264,
0.028943616896867752,
0.03109242022037506,
0.02018818072974682,
-0.2147643268108368,
-0.00022443511988967657,
0.03119872882962227,
-0.08469731360673904,
-0.05462302640080452,
-0.0011999956332147121,
0.011340592056512833,
0.09797565639019012,
0.009314504452049732,
-0.0982418805360794,
0.007025392726063728,
-0.0060068052262067795,
-0.0836508646607399,
0.1839771270751953,
-0.12265569716691971,
-0.16909822821617126,
-0.10666962713003159,
-0.07118130475282669,
-0.03895893692970276,
0.00031513150315731764,
0.08518301695585251,
-0.09967143088579178,
-0.032139360904693604,
-0.07948294281959534,
0.0006032365490682423,
-0.02845202013850212,
0.03823724016547203,
0.016850823536515236,
-0.010347637347877026,
0.06740164011716843,
-0.11459193378686905,
-0.02230917476117611,
-0.042887650430202484,
-0.007366285193711519,
0.04678039997816086,
0.038083504885435104,
0.10739269107580185,
0.15652452409267426,
-0.005742993671447039,
0.039761465042829514,
-0.04685472324490547,
0.20958836376667023,
-0.06523773074150085,
-0.04546833783388138,
0.12707637250423431,
0.004241032991558313,
0.058392446488142014,
0.1188696101307869,
0.0454525351524353,
-0.10177378356456757,
-0.0035963982809334993,
0.00010965496039716527,
-0.037569474428892136,
-0.21074499189853668,
-0.04773736745119095,
-0.05009608343243599,
-0.01995871216058731,
0.11047866195440292,
0.02784000150859356,
0.020861180499196053,
0.022362636402249336,
0.03325217217206955,
0.005611320026218891,
-0.0034533797297626734,
0.08153232932090759,
0.14823701977729797,
0.03284073248505592,
0.12628522515296936,
-0.032634906470775604,
-0.03978986665606499,
0.030139321461319923,
-0.012925286777317524,
0.23105113208293915,
0.014925552532076836,
0.1539011299610138,
0.04959076642990112,
0.19626475870609283,
0.025011278688907623,
0.07069509476423264,
0.004840218462049961,
-0.006353407632559538,
0.0017952604684978724,
-0.054801326245069504,
-0.05883827805519104,
0.0272730253636837,
0.024668853729963303,
0.02011873759329319,
-0.13105231523513794,
-0.02866348624229431,
0.04205209016799927,
0.3515981435775757,
0.03925713896751404,
-0.3181913495063782,
-0.09636355936527252,
-0.006261957343667746,
-0.08268649131059647,
-0.027284272015094757,
0.04782175272703171,
0.08757255971431732,
-0.09015783667564392,
0.052959322929382324,
-0.045165397226810455,
0.09448302537202835,
-0.05502108857035637,
0.03914277255535126,
0.023044403642416,
0.0724099725484848,
0.013659808784723282,
0.04945208504796028,
-0.31089574098587036,
0.2871381342411041,
0.0010210108011960983,
0.07945023477077484,
-0.06592410057783127,
0.0012076152488589287,
0.024141255766153336,
0.009368181228637695,
0.0828392282128334,
-0.01823749765753746,
-0.11799535900354385,
-0.18280121684074402,
-0.07892894744873047,
0.029619229957461357,
0.12419337779283524,
0.005517744459211826,
0.10647755116224289,
-0.024417124688625336,
-0.00893753208220005,
0.05806269496679306,
-0.07276972383260727,
-0.08002743124961853,
-0.09596472233533859,
-0.007280582562088966,
0.08683817088603973,
0.038211170583963394,
-0.07071981579065323,
-0.0964096412062645,
-0.08854390680789948,
0.13418777287006378,
-0.05243276059627533,
-0.0401211716234684,
-0.11372531950473785,
0.025126781314611435,
0.11322059482336044,
-0.08119437843561172,
0.05587690323591232,
0.01527022197842598,
0.09011715650558472,
0.011750027537345886,
-0.06478927284479141,
0.10863977670669556,
-0.06824551522731781,
-0.16889820992946625,
-0.03253576532006264,
0.14240451157093048,
0.031072983518242836,
0.06797859817743301,
-0.004224230535328388,
0.03811056911945343,
-0.03375731408596039,
-0.08354409784078598,
0.02943994477391243,
0.049732379615306854,
0.03756782039999962,
0.006370706018060446,
-0.036681871861219406,
-0.002787303877994418,
-0.09807661175727844,
-0.036111969500780106,
0.20995236933231354,
0.23655082285404205,
-0.09867970645427704,
0.08952978253364563,
0.07601279020309448,
-0.04417261853814125,
-0.17131830751895905,
-0.007960698567330837,
0.0683295875787735,
0.006182876415550709,
-0.019841207191348076,
-0.19033490121364594,
0.03919248655438423,
0.0681341215968132,
-0.0182611383497715,
0.07136701792478561,
-0.3223351240158081,
-0.144471675157547,
0.13935129344463348,
0.11659958213567734,
0.07952062040567398,
-0.13545742630958557,
-0.046535152941942215,
-0.026678016409277916,
-0.0944434180855751,
0.09131497144699097,
-0.05894364416599274,
0.1336066722869873,
-0.020724862813949585,
0.10378265380859375,
0.013541067950427532,
-0.05398892983794212,
0.11577664315700531,
0.012653651647269726,
0.0594291128218174,
-0.04441102594137192,
0.022349808365106583,
0.009064745157957077,
-0.05306662991642952,
0.05307551473379135,
-0.08186600357294083,
0.04023103415966034,
-0.08839966356754303,
-0.0325205959379673,
-0.09008356183767319,
0.022422006353735924,
-0.007737088482826948,
-0.04163924604654312,
-0.03335573896765709,
0.0059208651073277,
0.07429662346839905,
-0.01190114114433527,
0.12461680918931961,
-0.03174266219139099,
0.13496893644332886,
0.1353772133588791,
0.08386766910552979,
-0.08450187742710114,
-0.05783496052026749,
0.00423763133585453,
-0.03718811646103859,
0.0616060309112072,
-0.12272986024618149,
0.028272423893213272,
0.13733036816120148,
0.037074457854032516,
0.13294446468353271,
0.06468986719846725,
-0.051553770899772644,
0.021059177815914154,
0.0413326658308506,
-0.1379575878381729,
-0.11878111213445663,
0.00916452705860138,
0.015747692435979843,
-0.07363291084766388,
0.045963361859321594,
0.11020204424858093,
-0.06498843431472778,
-0.012727120891213417,
-0.01562667451798916,
0.016784273087978363,
-0.05437782034277916,
0.20456214249134064,
0.05171116814017296,
0.05519438907504082,
-0.12491008639335632,
0.08136945962905884,
0.04750414937734604,
-0.13232100009918213,
0.06036315858364105,
0.08407430350780487,
-0.0930643305182457,
-0.03112965263426304,
0.041802749037742615,
0.10912128537893295,
-0.048355259001255035,
-0.0749785453081131,
-0.1293836236000061,
-0.14547471702098846,
0.11105101555585861,
0.1790716052055359,
0.06558671593666077,
0.012747826054692268,
-0.05918903276324272,
0.00678919767960906,
-0.11152871698141098,
0.06830578297376633,
0.03874994069337845,
0.043079715222120285,
-0.11357154697179794,
0.15156683325767517,
0.01661859080195427,
0.03934833034873009,
-0.012428445741534233,
-0.004502950236201286,
-0.10867955535650253,
0.04265259951353073,
-0.12897980213165283,
-0.0002643393527250737,
-0.055513765662908554,
0.007719346322119236,
0.004334737081080675,
-0.0617855079472065,
-0.056935738772153854,
0.0370924212038517,
-0.12029073387384415,
-0.02869153395295143,
-0.0031005453784018755,
0.044574346393346786,
-0.12856027483940125,
-0.022912435233592987,
0.01952691376209259,
-0.08542011678218842,
0.08696286380290985,
0.08693265169858932,
-0.028638822957873344,
0.06656148284673691,
-0.09220506250858307,
-0.028045227751135826,
0.06469467282295227,
0.0018608274403959513,
0.04350791126489639,
-0.1422366052865982,
-0.014607000164687634,
0.017182208597660065,
0.040912337601184845,
0.026529019698500633,
0.10754171013832092,
-0.12228663265705109,
-0.008994745090603828,
-0.021179210394620895,
-0.05434951186180115,
-0.06721638888120651,
0.0390625074505806,
0.09538352489471436,
0.03257935866713524,
0.16740739345550537,
-0.0993146151304245,
0.03742532804608345,
-0.16749030351638794,
0.0011608098866418004,
-0.03061586245894432,
-0.11211036145687103,
-0.0841677263379097,
-0.0407077893614769,
0.08310218155384064,
-0.05301857367157936,
0.1281585395336151,
-0.026150841265916824,
0.05535433441400528,
0.030124478042125702,
-0.06990433484315872,
-0.04020890220999718,
0.03454600274562836,
0.23339562118053436,
0.04173344373703003,
-0.03900148719549179,
0.07568458467721939,
0.02765069343149662,
0.084013931453228,
0.15009421110153198,
0.16643738746643066,
0.17335310578346252,
0.06724227219820023,
0.10671768337488174,
0.04845834895968437,
-0.05828425660729408,
-0.15171591937541962,
0.07574097812175751,
-0.04197431728243828,
0.1230030208826065,
-0.02127816714346409,
0.2538428008556366,
0.0983705148100853,
-0.17110411822795868,
0.06805340945720673,
-0.028057632967829704,
-0.08765391260385513,
-0.12160508334636688,
-0.054831840097904205,
-0.08788406103849411,
-0.17295633256435394,
0.007269852329045534,
-0.1054546907544136,
0.06377717107534409,
0.054311733692884445,
0.03114044852554798,
0.009957077912986279,
0.12709322571754456,
0.018011312931776047,
-0.005804810207337141,
0.09376033395528793,
-0.005617162212729454,
-0.049793556332588196,
-0.08312808722257614,
-0.08770501613616943,
0.03880033269524574,
-0.02120095305144787,
0.05561438202857971,
-0.014248736202716827,
-0.08313890546560287,
0.03790149465203285,
-0.03914784640073776,
-0.08663193881511688,
0.021028239279985428,
0.019264234229922295,
0.08454539626836777,
0.07726339250802994,
0.04108578711748123,
-0.03765487298369408,
-0.0006758076488040388,
0.21894194185733795,
-0.1001923456788063,
-0.11494342982769012,
-0.10940337181091309,
0.2850143015384674,
0.04763634875416756,
-0.008707198314368725,
0.020437924191355705,
-0.062486432492733,
-0.026286480948328972,
0.2309589385986328,
0.17403481900691986,
-0.03481825441122055,
0.003686006646603346,
-0.017961625009775162,
-0.007039464078843594,
-0.03552531450986862,
0.08717235177755356,
0.15895573794841766,
0.060340721160173416,
-0.06771886348724365,
-0.037761423736810684,
-0.05545702204108238,
-0.027595704421401024,
-0.0686144083738327,
0.08542995899915695,
-0.0009603094658814371,
-0.03446202352643013,
-0.039917994290590286,
0.06787480413913727,
-0.09019234031438828,
-0.09515608847141266,
0.015323009341955185,
-0.20145252346992493,
-0.14997340738773346,
0.0007695292588323355,
0.06409071385860443,
0.029127318412065506,
0.03468503803014755,
-0.007933756336569786,
0.0010840954491868615,
0.08668141812086105,
-0.005032334942370653,
-0.07842838019132614,
-0.07977064698934555,
0.08866675943136215,
-0.11472378671169281,
0.160131573677063,
-0.032179754227399826,
0.06230587512254715,
0.1166549101471901,
0.09586525708436966,
-0.07089002430438995,
0.10219445079565048,
0.05118347331881523,
-0.10396434366703033,
0.03305647522211075,
0.14352187514305115,
-0.03758694976568222,
0.10648677498102188,
0.03780881687998772,
-0.11677771061658859,
0.024768052622675896,
-0.0668368861079216,
-0.044409602880477905,
-0.044588807970285416,
-0.04607362672686577,
-0.051399797201156616,
0.11731087416410446,
0.17063267529010773,
-0.05027095228433609,
0.003864109516143799,
-0.05875195562839508,
0.012126524932682514,
0.036490049213171005,
0.010490240529179573,
-0.06235406547784805,
-0.26179805397987366,
0.010455776005983353,
0.030636748299002647,
0.002863713540136814,
-0.25481680035591125,
-0.0944959744811058,
0.010775214992463589,
-0.05180995911359787,
-0.07721804827451706,
0.1017005518078804,
0.06389401108026505,
0.035690873861312866,
-0.05031975731253624,
-0.07863166183233261,
-0.03412298113107681,
0.18644516170024872,
-0.166625514626503,
-0.06471696496009827
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-basque
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4276
- Wer: 0.5962
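For context on the Wer figure above, the sketch below shows how such a score is typically computed with `load_metric` from the Datasets release listed under Framework versions; the Basque strings are toy examples, not actual model outputs.
```python
from datasets import load_metric

wer_metric = load_metric("wer")
# Toy prediction/reference pair; in a real evaluation these come from model
# transcriptions and Common Voice ground-truth sentences.
wer = wer_metric.compute(
    predictions=["kaixo mundua gaur"],
    references=["kaixo mundu gaur"],
)
print(f"WER: {wer:.4f}")  # one substituted word out of three -> 0.3333
```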
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the gradient-accumulation sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 5
- mixed_precision_training: Native AMP
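As noted above, the effective batch size comes from gradient accumulation: gradients from `gradient_accumulation_steps` micro-batches are summed before each optimizer step, so 2 × 2 = 4. The sketch below is a toy PyTorch illustration of that mechanic; the linear layer and random tensors are stand-ins for the real wav2vec2 model and Common Voice batches.
```python
import torch
from torch import nn

model = nn.Linear(10, 1)  # stand-in for the wav2vec2 model
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999), eps=1e-8)
data = [(torch.randn(2, 10), torch.randn(2, 1)) for _ in range(8)]  # micro-batches of 2

accum_steps = 2  # gradient_accumulation_steps
optimizer.zero_grad()
for i, (x, y) in enumerate(data):
    loss = nn.functional.mse_loss(model(x), y) / accum_steps  # scale each micro-batch
    loss.backward()  # gradients accumulate across backward() calls
    if (i + 1) % accum_steps == 0:  # effective batch size: 2 * 2 = 4
        optimizer.step()
        optimizer.zero_grad()
```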
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.9902 | 1.29 | 400 | 2.1257 | 1.0 |
| 0.9625 | 2.59 | 800 | 0.5695 | 0.7452 |
| 0.4605 | 3.88 | 1200 | 0.4276 | 0.5962 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
{"language": "eu", "license": "apache-2.0", "tags": ["automatic-speech-recognition", "basque", "generated_from_trainer", "hf-asr-leaderboard", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer", "cer"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-basque", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "eu"}, "metrics": [{"type": "wer", "value": 51.89, "name": "Test WER"}, {"type": "cer", "value": 10.01, "name": "Test CER"}]}]}]}
|
automatic-speech-recognition
|
deepdml/wav2vec2-large-xls-r-300m-basque
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"basque",
"generated_from_trainer",
"hf-asr-leaderboard",
"robust-speech-event",
"eu",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"eu"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #basque #generated_from_trainer #hf-asr-leaderboard #robust-speech-event #eu #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
wav2vec2-large-xls-r-300m-basque
================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4276
* Wer: 0.5962
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 2
* eval\_batch\_size: 2
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 4
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 5
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.2
* Pytorch 1.10.0+cu111
* Datasets 1.18.3
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #basque #generated_from_trainer #hf-asr-leaderboard #robust-speech-event #eu #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] |
[
102,
158,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #basque #generated_from_trainer #hf-asr-leaderboard #robust-speech-event #eu #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 4\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] |
[
-0.11271389573812485,
0.12051516026258469,
-0.005612845998257399,
0.07368206977844238,
0.09563544392585754,
0.007818025536835194,
0.11470076441764832,
0.15166901051998138,
-0.028191208839416504,
0.1191585436463356,
0.08582285046577454,
0.06538032740354538,
0.07022872567176819,
0.13783246278762817,
-0.021713826805353165,
-0.29381898045539856,
0.01990799605846405,
-0.027519134804606438,
-0.12060409039258957,
0.09530366212129593,
0.08190044015645981,
-0.10088036954402924,
0.04461697116494179,
0.006523539312183857,
-0.06092076748609543,
0.0069642565213143826,
-0.04702096804976463,
-0.053050652146339417,
0.10113269090652466,
0.030479570850729942,
0.03798716515302658,
0.025968674570322037,
0.09657370299100876,
-0.24946825206279755,
0.0003276149509474635,
0.0468326061964035,
0.029725296422839165,
0.06467842310667038,
0.11011353135108948,
-0.014287546277046204,
0.10303176939487457,
-0.09042634814977646,
0.05493561923503876,
0.039713259786367416,
-0.09570403397083282,
-0.24686045944690704,
-0.057816263288259506,
0.0627036988735199,
0.14423757791519165,
0.06789611279964447,
-0.028745245188474655,
0.04030077904462814,
-0.0642780065536499,
0.0924873873591423,
0.22727979719638824,
-0.24089184403419495,
-0.05884353816509247,
-0.04049750044941902,
0.020966123789548874,
0.04184800386428833,
-0.10448706895112991,
-0.026606392115354538,
-0.005682661198079586,
0.021972980350255966,
0.09321940690279007,
-0.013620026409626007,
0.008386035449802876,
0.0020464223343878984,
-0.14651866257190704,
-0.046152058988809586,
0.11797435581684113,
0.06940669566392899,
-0.0002929459442384541,
-0.11174828559160233,
-0.0473238080739975,
-0.17049063742160797,
-0.05572348088026047,
0.014686132781207561,
0.0343107171356678,
-0.05060410499572754,
-0.047750528901815414,
0.017344636842608452,
-0.051373936235904694,
-0.06416395306587219,
0.045333269983530045,
0.12995435297489166,
0.04490187019109726,
-0.019841549918055534,
0.019283819943666458,
0.09318803250789642,
0.037701383233070374,
-0.16886964440345764,
-0.0030296461191028357,
0.05063170939683914,
-0.10747677832841873,
-0.006736887618899345,
0.008500494994223118,
0.03048068657517433,
0.0418817512691021,
0.13170088827610016,
-0.02019551768898964,
0.09221073985099792,
0.02898155339062214,
0.008156144060194492,
-0.0747353583574295,
0.1723000407218933,
-0.07132236659526825,
-0.07915648818016052,
-0.05106499046087265,
0.12713460624217987,
0.009674291126430035,
-0.0043102758936584,
-0.0875314474105835,
0.03861258924007416,
0.09724096953868866,
0.06003202497959137,
-0.013347356580197811,
0.031411733478307724,
-0.048288002610206604,
-0.03067777305841446,
0.022291114553809166,
-0.1296093761920929,
0.05009736120700836,
0.06331253051757812,
-0.07683280855417252,
-0.005276315379887819,
-0.00643956521525979,
0.0002220581955043599,
-0.052231259644031525,
0.09974328428506851,
-0.05503459274768829,
0.012414706870913506,
-0.04536207392811775,
-0.08783867955207825,
0.02995326928794384,
-0.03806546702980995,
-0.018086545169353485,
-0.04774721711874008,
-0.09329863637685776,
-0.048667486757040024,
0.04734831675887108,
-0.07744821906089783,
-0.055949658155441284,
-0.08457484841346741,
-0.08837059140205383,
0.05165178328752518,
-0.010893398895859718,
0.1141926720738411,
-0.05381564423441887,
0.07801110297441483,
0.028604881837964058,
0.0614011324942112,
0.09863527119159698,
0.06448785960674286,
-0.04650231450796127,
0.05704113095998764,
-0.17382968962192535,
0.09398314356803894,
-0.10548262298107147,
0.040443822741508484,
-0.1456606090068817,
-0.1023872122168541,
0.01132228597998619,
-0.0004478020709939301,
0.08293649554252625,
0.12953723967075348,
-0.1678352952003479,
-0.0973021388053894,
0.15834298729896545,
-0.06118667498230934,
-0.08200478553771973,
0.1296449452638626,
-0.012175308540463448,
-0.04564661160111427,
0.01324483286589384,
0.20605914294719696,
0.0892358347773552,
-0.10824691504240036,
-0.004547803662717342,
-0.03954014554619789,
0.11469794809818268,
0.03894440829753876,
0.08368334919214249,
-0.0499168299138546,
0.08625170588493347,
0.0011377896880730987,
-0.06682976335287094,
0.048282913863658905,
-0.06208677217364311,
-0.08667310327291489,
-0.008485294878482819,
-0.06511944532394409,
0.015707412734627724,
0.05570903420448303,
0.01993486098945141,
-0.06583680957555771,
-0.13143131136894226,
0.008877566084265709,
0.09531401097774506,
-0.11353428661823273,
0.025381403043866158,
-0.06379643082618713,
0.06356541067361832,
-0.020695921033620834,
-0.003463268280029297,
-0.14532805979251862,
-0.020482590422034264,
0.038724057376384735,
-0.07121222466230392,
0.028592444956302643,
-0.020159531384706497,
0.08238992094993591,
0.053978174924850464,
-0.038395024836063385,
-0.07309706509113312,
-0.018016405403614044,
-0.000060357822803780437,
-0.05849749594926834,
-0.2401144802570343,
-0.07528162747621536,
-0.015040818601846695,
0.14799098670482635,
-0.19771291315555573,
0.00490428926423192,
0.054917410016059875,
0.1406610757112503,
0.02620665915310383,
-0.045246563851833344,
0.014260170049965382,
0.04712963476777077,
-0.016442643478512764,
-0.07864639908075333,
0.02773362211883068,
-0.018149273470044136,
-0.10259819030761719,
0.0005574393435381353,
-0.15950101613998413,
0.0822080448269844,
0.07986736297607422,
0.022610649466514587,
-0.07694629579782486,
-0.015889830887317657,
-0.058952681720256805,
-0.059973206371068954,
-0.014756213873624802,
-0.012796626426279545,
0.17932136356830597,
0.03546200320124626,
0.11528763920068741,
-0.06810203939676285,
-0.06022018566727638,
0.03494558483362198,
0.005399894900619984,
-0.0116216279566288,
0.16523702442646027,
0.06354513764381409,
-0.0514461025595665,
0.09366873651742935,
0.052102021872997284,
-0.06161407753825188,
0.14551277458667755,
-0.07286545634269714,
-0.08731594681739807,
-0.04146668314933777,
0.03393612802028656,
0.02613096870481968,
0.12502771615982056,
-0.15031270682811737,
-0.022699296474456787,
0.023075606673955917,
0.013782989233732224,
0.00953446701169014,
-0.17689402401447296,
0.010605452582240105,
0.023036783561110497,
-0.0957484319806099,
0.014747054316103458,
0.005224906839430332,
0.0016283048316836357,
0.09324518591165543,
0.008030819706618786,
-0.11014385521411896,
-0.035372957587242126,
-0.040085770189762115,
-0.08277195692062378,
0.17339958250522614,
-0.08230764418840408,
-0.15127433836460114,
-0.11455845087766647,
-0.013128812424838543,
-0.04545329138636589,
-0.015508265234529972,
0.028144244104623795,
-0.08545129001140594,
-0.046746477484703064,
-0.07301826030015945,
0.04007373005151749,
-0.022771669551730156,
0.017359809949994087,
0.02420075237751007,
-0.0026715591084212065,
0.0681791603565216,
-0.093940369784832,
0.0017511387122794986,
-0.02554197795689106,
-0.01780715584754944,
-0.004865046590566635,
0.03544020280241966,
0.09708006680011749,
0.16887277364730835,
0.049657344818115234,
0.049158837646245956,
-0.023204537108540535,
0.18359065055847168,
-0.12529334425926208,
0.023380281403660774,
0.09389849752187729,
-0.011253328062593937,
0.059721093624830246,
0.17485181987285614,
0.05327991396188736,
-0.06699728965759277,
-0.011211605742573738,
0.038360804319381714,
-0.0131843825802207,
-0.19755126535892487,
-0.04841773957014084,
-0.06451928615570068,
0.005564806051552296,
0.11305443197488785,
0.041357047855854034,
-0.015280673280358315,
0.006092972122132778,
-0.0093659907579422,
-0.024356020614504814,
0.037160251289606094,
0.05710376799106598,
0.11170827597379684,
0.04522335156798363,
0.10727226734161377,
-0.011962596327066422,
-0.06752271950244904,
0.02564452402293682,
0.008170977234840393,
0.2106623649597168,
-0.008267892524600029,
0.19065402448177338,
0.025020455941557884,
0.11370661854743958,
-0.019052457064390182,
0.0555434450507164,
0.00902911089360714,
0.0006787051097489893,
0.02704525925219059,
-0.06151389330625534,
0.011667408049106598,
0.03491268679499626,
0.09842251986265182,
0.016413738951086998,
-0.09819785505533218,
0.014870906248688698,
0.04951425641775131,
0.3171800673007965,
0.08236270397901535,
-0.27415478229522705,
-0.06271278113126755,
0.01968293823301792,
-0.07959071546792984,
-0.023628398776054382,
0.025119833648204803,
0.11022278666496277,
-0.08172173798084259,
0.08298865705728531,
-0.06703104823827744,
0.07392042875289917,
-0.09021613001823425,
0.018568169325590134,
0.0940215140581131,
0.08429484069347382,
0.014622356742620468,
0.05569259822368622,
-0.21917764842510223,
0.272403359413147,
-0.0020719931926578283,
0.05253387987613678,
-0.0459543839097023,
0.03979552164673805,
0.020610520616173744,
-0.02782779186964035,
0.10425326228141785,
-0.008384227752685547,
-0.10553040355443954,
-0.17921386659145355,
-0.1065109446644783,
0.008665921166539192,
0.12528583407402039,
-0.06747470051050186,
0.11948510259389877,
-0.028428390622138977,
-0.05921472981572151,
0.0355176143348217,
-0.1041499674320221,
-0.07435118407011032,
-0.10066428780555725,
0.05720435082912445,
0.016580689698457718,
0.018773719668388367,
-0.062183644622564316,
-0.09838347882032394,
-0.07141140848398209,
0.14918112754821777,
-0.13328251242637634,
-0.040870241820812225,
-0.12857723236083984,
0.03530490770936012,
0.18438421189785004,
-0.08477481454610825,
0.026944123208522797,
0.01598605513572693,
0.1137009933590889,
0.023743532598018646,
-0.038390301167964935,
0.11287183314561844,
-0.07953999936580658,
-0.20786555111408234,
-0.03508185222744942,
0.1877332478761673,
0.02230074256658554,
0.06179589033126831,
-0.03466513380408287,
0.0361708328127861,
-0.007043630816042423,
-0.08611348271369934,
0.0718860775232315,
0.017011180520057678,
-0.022073380649089813,
0.0344674251973629,
-0.021477634087204933,
0.004097491968423128,
-0.07349997758865356,
-0.015785403549671173,
0.10748246312141418,
0.27009114623069763,
-0.09183504432439804,
0.0656059980392456,
0.03493818640708923,
-0.056359585374593735,
-0.1582377701997757,
-0.028438648208975792,
0.1150193065404892,
0.024590961635112762,
-0.010009595192968845,
-0.18298791348934174,
0.042622290551662445,
0.053521204739809036,
-0.02789453975856304,
0.08727600425481796,
-0.33471688628196716,
-0.14064094424247742,
0.11552141606807709,
0.04920825734734535,
-0.03683093562722206,
-0.16511861979961395,
-0.059130050241947174,
-0.0003183228545822203,
-0.06007395312190056,
0.044336456805467606,
0.008497410453855991,
0.12087501585483551,
-0.013327076099812984,
0.05900026112794876,
0.020404577255249023,
-0.04017072170972824,
0.1355900764465332,
0.041120998561382294,
0.0377887487411499,
-0.0035812221467494965,
-0.008236240595579147,
0.004591655917465687,
-0.06842009723186493,
0.04542907327413559,
-0.07778749614953995,
0.010064556263387203,
-0.13676612079143524,
-0.013196254149079323,
-0.08676915615797043,
0.026707490906119347,
-0.04328402131795883,
-0.019437260925769806,
-0.011223684065043926,
0.033642519265413284,
0.08621511608362198,
0.012023895047605038,
0.12046356499195099,
-0.04256791993975639,
0.09482093155384064,
0.14991247653961182,
0.09981843084096909,
0.015136287547647953,
-0.13308578729629517,
0.002885876689106226,
0.007713716477155685,
0.02093246765434742,
-0.10263971984386444,
0.04745374992489815,
0.15449248254299164,
0.039262931793928146,
0.13792122900485992,
0.05328880995512009,
-0.08125603199005127,
-0.02005300112068653,
0.054419342428445816,
-0.11345529556274414,
-0.15036918222904205,
-0.016454029828310013,
-0.02289014868438244,
-0.125896617770195,
0.011928919702768326,
0.12354618310928345,
-0.03808031976222992,
-0.001817734562791884,
0.0223722942173481,
0.05858186259865761,
-0.02431035041809082,
0.22469130158424377,
0.03611178323626518,
0.09072677791118622,
-0.10613350570201874,
0.07358328253030777,
0.05096369609236717,
-0.10360685735940933,
0.04942572861909866,
0.12533800303936005,
-0.04848888888955116,
-0.019268283620476723,
0.029454363510012627,
0.11198306083679199,
0.014654500409960747,
-0.04206595569849014,
-0.1372922658920288,
-0.14855971932411194,
0.08167365938425064,
0.09575087577104568,
0.033917032182216644,
0.018550220876932144,
-0.03917436674237251,
0.02731047011911869,
-0.08945515006780624,
0.13651318848133087,
0.10453492403030396,
0.062173012644052505,
-0.12804453074932098,
0.104880191385746,
-0.007575694937258959,
0.018642665818333626,
-0.003294529626145959,
0.0026082496624439955,
-0.12363938242197037,
0.01199323870241642,
-0.08312027901411057,
-0.005901929922401905,
-0.052687160670757294,
0.0026713854167610407,
0.012031015008687973,
-0.054922375828027725,
-0.04633777588605881,
0.018499357625842094,
-0.1086478903889656,
-0.038450535386800766,
-0.02710474282503128,
0.05656026303768158,
-0.12118895351886749,
-0.023509878665208817,
0.02690761350095272,
-0.1271553486585617,
0.11143649369478226,
0.040176551789045334,
0.00614791689440608,
0.01593690551817417,
-0.06954242289066315,
-0.00007512878073612228,
0.035229768604040146,
0.004201857373118401,
0.04624161496758461,
-0.1640777438879013,
-0.015177400782704353,
-0.0362372100353241,
0.02905160002410412,
-0.0015056777046993375,
0.02376909926533699,
-0.11646833270788193,
0.022725142538547516,
-0.061217520385980606,
-0.05573218688368797,
-0.05136895924806595,
0.07550908625125885,
0.061101336032152176,
0.02516464702785015,
0.14342550933361053,
-0.08587948232889175,
0.06980102509260178,
-0.21804997324943542,
-0.0010652386117726564,
0.0014534011716023088,
-0.0727376639842987,
-0.04357120767235756,
-0.02187144011259079,
0.0970226600766182,
-0.07246258854866028,
0.08966492116451263,
-0.00860066432505846,
0.06841478496789932,
0.02917345240712166,
-0.08001219481229782,
0.01075583603233099,
0.06113944575190544,
0.1147935688495636,
0.023303741589188576,
-0.03391273319721222,
0.0660666674375534,
-0.019363155588507652,
0.05850056931376457,
0.1153680682182312,
0.16285985708236694,
0.11722388863563538,
0.03907070308923721,
0.07567856460809708,
0.10020926594734192,
-0.13373993337154388,
-0.15265949070453644,
0.13130125403404236,
-0.05190608650445938,
0.14861592650413513,
-0.0361514613032341,
0.2125856578350067,
0.07854633033275604,
-0.19782015681266785,
0.06690780818462372,
-0.032759781926870346,
-0.09851067513227463,
-0.11327976733446121,
-0.08445824682712555,
-0.07595286518335342,
-0.16221150755882263,
0.013719463720917702,
-0.10172264277935028,
0.06434943526983261,
0.0468844398856163,
0.04173717647790909,
0.035770148038864136,
0.10747972875833511,
0.046128883957862854,
0.01690690591931343,
0.11135688424110413,
0.028809141367673874,
-0.01340461615473032,
-0.02873423509299755,
-0.05735847353935242,
0.02287409082055092,
-0.02418696880340576,
0.052821382880210876,
-0.03339555114507675,
-0.08938595652580261,
0.04991261661052704,
-0.014647646807134151,
-0.10125437378883362,
0.02400212176144123,
-0.01228266954421997,
0.04696708545088768,
0.07691165059804916,
0.039924342185258865,
-0.028423326089978218,
-0.02531634457409382,
0.20484749972820282,
-0.10760550945997238,
-0.05899280309677124,
-0.11162377148866653,
0.2176796793937683,
-0.009948940016329288,
-0.0015793456695973873,
0.018122470006346703,
-0.07360157370567322,
-0.028090227395296097,
0.1717190444469452,
0.16075485944747925,
-0.029515691101551056,
-0.008873571641743183,
0.009700777940452099,
-0.002463273936882615,
-0.010713030584156513,
0.05242031067609787,
0.10818003863096237,
0.10930537432432175,
-0.05315840244293213,
-0.025908149778842926,
-0.020210782065987587,
-0.06671889126300812,
-0.05647922307252884,
0.09279625117778778,
0.027785323560237885,
0.0031373361125588417,
-0.0356438010931015,
0.0929291844367981,
-0.08510788530111313,
-0.14679227769374847,
0.015061645768582821,
-0.20374752581119537,
-0.17811204493045807,
-0.029140667989850044,
0.06615051627159119,
0.04174341633915901,
0.05005732923746109,
-0.004728338215500116,
-0.05920194089412689,
0.11464104056358337,
0.013931717723608017,
-0.06100105866789818,
-0.07503683120012283,
0.0639912411570549,
-0.15355753898620605,
0.16584542393684387,
-0.03349648416042328,
0.039065420627593994,
0.11092962324619293,
0.045143865048885345,
-0.0832686722278595,
0.020540587604045868,
0.0915442407131195,
-0.11198291182518005,
0.04403571039438248,
0.2005251944065094,
-0.03273461386561394,
0.11890586465597153,
0.041600730270147324,
-0.09149519354104996,
0.0039382209070026875,
-0.059460852295160294,
-0.04228554293513298,
-0.051158372312784195,
-0.01854950748383999,
-0.03389173001050949,
0.13178862631320953,
0.20985673367977142,
-0.060912199318408966,
-0.00794307142496109,
-0.049944207072257996,
0.0041684601455926895,
0.025494398549199104,
0.11498899012804031,
-0.03950541093945503,
-0.2462630718946457,
0.022645825520157814,
-0.010763505473732948,
0.029015250504016876,
-0.1915993094444275,
-0.09659569710493088,
0.023990897461771965,
-0.0409429669380188,
-0.07252579927444458,
0.1252732276916504,
0.06460225582122803,
0.047026801854372025,
-0.053038857877254486,
-0.0559120699763298,
-0.02339332364499569,
0.17160837352275848,
-0.17994464933872223,
-0.0485842302441597
] |
null | null | null |
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...); however, validating both models gives a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as the initial checkpoint and training has been resumed for 20K iterations.
Performance of this model is now superior to the Tensorpack model.
Please check: [Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis](https://arxiv.org/abs/1908.07836).
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used with **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instructions in this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
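For orientation, a hedged sketch of such a pipeline is shown below. The calls follow the library's published quickstart (`get_dd_analyzer`, `analyze`, `reset_state`), but attribute names can vary between releases, and "document.pdf" is a placeholder.
```python
import deepdoctection as dd

# Build the default analyzer: layout detection, table recognition and OCR.
analyzer = dd.get_dd_analyzer()
df = analyzer.analyze(path="document.pdf")  # placeholder input document
df.reset_state()  # required before iterating over the dataflow

for page in df:
    print(page.layouts)  # detected layout segments per page
```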
## This is an inference model only
To reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow and its training script. More information can be found in [this model card](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet).
|
{"license": "apache-2.0", "tags": ["Pytorch"], "datasets": ["Publaynet"]}
| null |
deepdoctection/d2_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet_inference_only
|
[
"Pytorch",
"dataset:Publaynet",
"arxiv:1908.07836",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.07836"
] |
[] |
TAGS
#Pytorch #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us
|
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations.
Performance of this model is now superior to the Tensorpack model.
Please check: Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis.
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used with deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instructions in the Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow as well as its training script. More information can be found in this model card.
|
[
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP.\n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
"TAGS\n#Pytorch #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n",
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP.\n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
34,
202,
45,
69
] |
[
"passage: TAGS\n#Pytorch #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP.\n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
-0.09552028030157089,
0.12101567536592484,
-0.002183899050578475,
0.041812244802713394,
0.08461624383926392,
-0.01854313164949417,
0.03671383112668991,
0.12304207682609558,
-0.09116260707378387,
0.12280397862195969,
-0.01891578733921051,
0.07354015856981277,
0.11255139112472534,
0.09117531776428223,
-0.0019812504760921,
-0.18924765288829803,
0.028822824358940125,
0.024892762303352356,
0.06474542617797852,
0.0711844339966774,
0.09829944372177124,
-0.08549904078245163,
0.039110686630010605,
0.027210084721446037,
-0.02865942195057869,
0.04210348799824715,
0.003521735779941082,
-0.041040368378162384,
0.06500270217657089,
0.04044667258858681,
0.07655985653400421,
0.01863139681518078,
0.07054430991411209,
-0.13093507289886475,
0.02697894349694252,
0.08995770663022995,
0.008973458781838417,
0.08969832211732864,
0.11647168546915054,
0.042615339159965515,
0.19138385355472565,
-0.045713238418102264,
0.056767258793115616,
0.05158853903412819,
-0.04103276878595352,
-0.13998721539974213,
-0.11852000653743744,
0.12468729168176651,
0.08727557212114334,
0.13067662715911865,
-0.02056635543704033,
0.131228506565094,
0.00019771262304857373,
0.03621981665492058,
0.07953983545303345,
-0.246975839138031,
-0.0017482985276728868,
0.18533554673194885,
0.026491105556488037,
0.04581271857023239,
-0.058762773871421814,
0.005090881604701281,
-0.022206518799066544,
0.051313355565071106,
0.033582232892513275,
-0.07024337351322174,
-0.004429614637047052,
-0.036020755767822266,
-0.12824669480323792,
-0.019931629300117493,
0.08264889568090439,
-0.039241574704647064,
-0.04517655819654465,
-0.13099785149097443,
-0.05676228925585747,
-0.038523267954587936,
-0.03160058334469795,
-0.07012401521205902,
0.051706522703170776,
0.047095008194446564,
0.11590979248285294,
-0.1704235076904297,
-0.1313440203666687,
-0.0011081983102485538,
-0.0758015513420105,
0.05057597532868385,
0.057191018015146255,
0.030549325048923492,
-0.05074502155184746,
0.12087716162204742,
-0.07358592748641968,
-0.01214100793004036,
-0.0658789873123169,
-0.03670579195022583,
-0.06689435243606567,
-0.016251103952527046,
-0.06402496993541718,
-0.20302699506282806,
-0.016106637194752693,
0.12688563764095306,
0.015671586617827415,
0.04337158426642418,
-0.02512403577566147,
0.044648922979831696,
0.037199366837739944,
0.23908960819244385,
-0.09608307480812073,
0.021633487194776535,
0.12303043901920319,
0.03587568551301956,
0.05972439423203468,
-0.012441514991223812,
-0.07229306548833847,
-0.03791037201881409,
0.018259627744555473,
0.06666383892297745,
0.0513438805937767,
0.03819870576262474,
-0.022861097007989883,
-0.09147602319717407,
0.09435910731554031,
-0.1610497236251831,
0.01906176656484604,
0.011978629976511002,
-0.04209807142615318,
0.13513065874576569,
0.08662881702184677,
-0.005393059924244881,
-0.1682395040988922,
-0.03949623182415962,
-0.047201842069625854,
0.0071350992657244205,
-0.05782468616962433,
-0.04742314666509628,
-0.018318966031074524,
-0.08667502552270889,
-0.05346410721540451,
-0.1192922368645668,
-0.21418820321559906,
-0.08457812666893005,
0.0168624110519886,
0.00036293233279138803,
0.02644944004714489,
-0.010042116045951843,
-0.013964568264782429,
-0.08334705233573914,
0.037513699382543564,
-0.00908053107559681,
0.010933171026408672,
0.007028199266642332,
-0.032849837094545364,
-0.029705261811614037,
-0.08246957510709763,
0.039801716804504395,
-0.103064626455307,
0.028134791180491447,
-0.10229527205228806,
0.09783822298049927,
0.029306363314390182,
-0.06447911262512207,
-0.06627287715673447,
-0.06367260962724686,
-0.12215414643287659,
0.00971539132297039,
0.03565813601016998,
0.08510078489780426,
-0.14193157851696014,
-0.023178521543741226,
0.07618898153305054,
-0.13368430733680725,
0.04448670521378517,
0.07108238339424133,
-0.023486413061618805,
0.02387840859591961,
0.07659216970205307,
0.13801035284996033,
0.1117134764790535,
-0.05509461835026741,
-0.07404112815856934,
-0.01988619565963745,
-0.037883296608924866,
0.011905554682016373,
0.05200406536459923,
-0.08398435264825821,
0.0167328342795372,
0.0357498973608017,
-0.09939896315336227,
-0.06306872516870499,
0.026263462379574776,
-0.047509536147117615,
-0.05475984513759613,
-0.019000008702278137,
-0.008885078132152557,
0.004350918810814619,
0.05338245630264282,
0.056668855249881744,
-0.0879485011100769,
-0.05666109174489975,
0.16649651527404785,
-0.07159421592950821,
0.025028012692928314,
-0.08211860060691833,
0.17273686826229095,
-0.14569273591041565,
0.03261251002550125,
-0.20289944112300873,
0.0196407251060009,
0.06067114695906639,
-0.09166669100522995,
0.04171819984912872,
0.08049578964710236,
0.0309499092400074,
0.04291515052318573,
-0.0022552451118826866,
-0.02042234130203724,
-0.06216149032115936,
-0.039536137133836746,
-0.06662940233945847,
-0.0028653854969888926,
-0.1086188554763794,
-0.04133465886116028,
0.029959561303257942,
-0.21463145315647125,
0.054913152009248734,
-0.07271873950958252,
0.06865181773900986,
0.017324859276413918,
-0.05947631597518921,
0.04345428943634033,
-0.014926659874618053,
-0.03910405933856964,
-0.09956056624650955,
0.003365242388099432,
0.03730002045631409,
-0.04593856260180473,
0.03652835264801979,
-0.15891782939434052,
-0.09352125972509384,
0.051574740558862686,
0.06224054843187332,
-0.011370711028575897,
0.025964148342609406,
-0.06404652446508408,
0.012502919882535934,
-0.037658754736185074,
-0.018905607983469963,
0.24792887270450592,
0.053179774433374405,
0.07776680588722229,
-0.09631365537643433,
-0.009185579605400562,
-0.01722114346921444,
-0.0353676937520504,
0.01102483831346035,
0.0009896544506773353,
0.06336931139230728,
-0.0967743918299675,
0.012038497254252434,
0.017704706639051437,
-0.015923835337162018,
0.07189205288887024,
0.003061376279219985,
-0.10771836340427399,
-0.03726084902882576,
0.005834513809531927,
0.0012085727648809552,
0.057651687413454056,
0.07501574605703354,
0.05829176306724548,
0.038457971066236496,
0.032689984887838364,
0.10890397429466248,
-0.07687430828809738,
0.06546375155448914,
0.05351978912949562,
-0.0029694705735892057,
0.07070216536521912,
0.0011823952663689852,
0.014859997667372227,
0.06604772806167603,
-0.04340551793575287,
0.09908028692007065,
-0.018831683322787285,
-0.020656684413552284,
-0.0925835371017456,
0.1728265881538391,
-0.07093455642461777,
-0.23365387320518494,
-0.1436074674129486,
0.15332888066768646,
-0.06743134558200836,
0.01598348282277584,
-0.03043530508875847,
0.00851866789162159,
-0.08681705594062805,
-0.13513223826885223,
0.0614846795797348,
-0.007526501547545195,
-0.028198163956403732,
-0.07707858085632324,
-0.03690046817064285,
0.025643503293395042,
-0.11218364536762238,
-0.014684154652059078,
-0.009061061777174473,
-0.13436387479305267,
-0.033397823572158813,
-0.0075836568139493465,
0.04262318089604378,
0.15870244801044464,
-0.04352288320660591,
-0.019254904240369797,
0.035467322915792465,
0.007426067721098661,
-0.026497438549995422,
0.07669603079557419,
0.19309869408607483,
0.01978166215121746,
0.014969232492148876,
0.0018258084310218692,
-0.02030794881284237,
-0.034986235201358795,
-0.0242722537368536,
0.052077725529670715,
-0.028056247159838676,
-0.22352848947048187,
-0.05298222228884697,
-0.0480506494641304,
-0.026858871802687645,
0.0678071454167366,
0.07841669768095016,
0.0016693536890670657,
0.06454868614673615,
-0.03700757399201393,
0.0057144807651638985,
-0.004926212597638369,
0.04836060106754303,
0.06936638057231903,
-0.014400876127183437,
0.03364263474941254,
-0.07061369717121124,
0.028656983748078346,
0.1417740434408188,
0.04964796453714371,
0.11799215525388718,
-0.05311761796474457,
0.08308874815702438,
0.018650950863957405,
0.06881444156169891,
0.010486595332622528,
0.07690072059631348,
-0.018290314823389053,
0.027349937707185745,
0.023827429860830307,
-0.03858811780810356,
-0.013551754876971245,
0.0127329770475626,
0.056626733392477036,
0.002410008804872632,
-0.08173442631959915,
0.036284975707530975,
0.03907892853021622,
0.12046625465154648,
-0.026252757757902145,
-0.20346906781196594,
-0.06040613725781441,
-0.04091525450348854,
-0.02397243119776249,
-0.09256595373153687,
0.03322260454297066,
0.1378464549779892,
-0.11397934705018997,
-0.046083513647317886,
-0.0613284707069397,
0.060500532388687134,
-0.10772259533405304,
-0.014278250746428967,
-0.0266280435025692,
0.11716504395008087,
0.026313666254281998,
0.07368182390928268,
-0.10122862458229065,
-0.005580826662480831,
0.04102371633052826,
0.11917495727539062,
-0.07779602706432343,
0.031120313331484795,
0.037963639944791794,
0.0628320574760437,
0.1265617161989212,
-0.007592055015265942,
-0.10639635473489761,
-0.0736483782529831,
-0.15911301970481873,
0.05585205554962158,
0.04168254882097244,
-0.09312142431735992,
0.09246882051229477,
-0.05440583452582359,
0.019102793186903,
-0.040906548500061035,
-0.04941714182496071,
-0.0476040355861187,
-0.19398429989814758,
0.06402339041233063,
0.034729789942502975,
-0.0789899155497551,
-0.0812869742512703,
-0.0056974152103066444,
0.06419504433870316,
0.22077873349189758,
-0.06689608842134476,
-0.06878188252449036,
-0.13213872909545898,
0.07157175987958908,
0.13429860770702362,
-0.05049562081694603,
0.05103396996855736,
-0.006151882465928793,
0.11880199611186981,
-0.0547720231115818,
-0.15845686197280884,
0.011247430928051472,
-0.055803507566452026,
-0.1178140714764595,
0.017351705580949783,
0.11133721470832825,
0.00971421878784895,
0.026362590491771698,
0.06023113429546356,
0.02360956370830536,
-0.011029722169041634,
-0.12458649277687073,
-0.032197486609220505,
0.1548423320055008,
0.056404028087854385,
0.05547674745321274,
-0.12884941697120667,
-0.11834574490785599,
-0.050456453114748,
0.056602198630571365,
0.08407702296972275,
0.09799490123987198,
-0.0652131587266922,
0.060620296746492386,
0.08390406519174576,
-0.0867544487118721,
-0.15872550010681152,
-0.003881296841427684,
-0.007845897227525711,
0.05029743164777756,
0.047050219029188156,
-0.1621093600988388,
0.055842071771621704,
0.07260310649871826,
-0.020783426240086555,
0.11667459458112717,
-0.369501531124115,
-0.08798768371343613,
0.10457827895879745,
0.03737083822488785,
0.08003746718168259,
-0.10921930521726608,
-0.0324292816221714,
0.02857394516468048,
0.022125286981463432,
0.07471425086259842,
-0.08641832321882248,
0.12124098092317581,
-0.024478008970618248,
-0.0027052811346948147,
0.08967157453298569,
-0.054946523159742355,
0.06596001982688904,
-0.05994609370827675,
0.09466520696878433,
-0.05441689118742943,
0.026738232001662254,
0.07165640592575073,
-0.04187984764575958,
0.1783958375453949,
-0.0010742628946900368,
0.12285667657852173,
-0.08427499979734421,
-0.08434292674064636,
-0.07102380692958832,
0.09251531213521957,
-0.01311983447521925,
-0.03982369229197502,
-0.06538855284452438,
0.07377999275922775,
0.07753460109233856,
0.011806405149400234,
-0.002220295835286379,
-0.023164017125964165,
0.01640690304338932,
0.10899494588375092,
0.04019142687320709,
0.14776059985160828,
-0.09812623262405396,
-0.04446061700582504,
0.009060658514499664,
0.07090308517217636,
-0.011508694849908352,
0.009864892810583115,
0.11537796258926392,
0.01565033756196499,
0.06522451341152191,
0.014456856995821,
-0.11991677433252335,
-0.033661454916000366,
0.009812242351472378,
-0.09818267822265625,
-0.13846342265605927,
-0.023954814299941063,
0.10533181577920914,
-0.06770801544189453,
0.039259687066078186,
0.11694443225860596,
-0.031385067850351334,
-0.048085737973451614,
0.02810240350663662,
0.07250015437602997,
-0.033338747918605804,
0.09607591480016708,
0.024606838822364807,
-0.017038580030202866,
-0.03503871336579323,
0.14928720891475677,
0.1068086102604866,
-0.053757522255182266,
0.06080802157521248,
0.03255195915699005,
-0.06049387529492378,
-0.07202813774347305,
0.01226995512843132,
0.10513531416654587,
-0.02518300525844097,
-0.07143595814704895,
-0.0562688484787941,
-0.01986858993768692,
-0.013376624323427677,
-0.011588145978748798,
0.007069827523082495,
0.0966927781701088,
-0.0745398998260498,
-0.05438563600182533,
-0.12200744450092316,
0.03166317567229271,
0.024814583361148834,
0.00653991149738431,
-0.06852768361568451,
0.0840534120798111,
0.009804578498005867,
0.027277635410428047,
-0.02638101764023304,
-0.0012347556184977293,
-0.01912117190659046,
-0.04783613607287407,
-0.0845145508646965,
0.0060691311955451965,
-0.022543705999851227,
-0.04819955304265022,
-0.0008796280017122626,
0.02661454863846302,
-0.00935409776866436,
0.05504992976784706,
-0.0476204939186573,
-0.07169870287179947,
-0.07112943381071091,
0.004561346955597401,
-0.09228304773569107,
0.016793882474303246,
-0.025324417278170586,
-0.06554160267114639,
0.09483658522367477,
0.0708342045545578,
0.00918162614107132,
0.05168769508600235,
-0.08685404062271118,
-0.001750479219481349,
-0.024646924808621407,
0.040492378175258636,
-0.021900353953242302,
-0.07729999721050262,
0.03399181738495827,
0.05354304239153862,
-0.08498367667198181,
-0.05051126331090927,
0.05504169315099716,
-0.1342599242925644,
0.0075710732489824295,
-0.05344191938638687,
0.028722677379846573,
-0.0454234853386879,
0.02988988161087036,
-0.019792286679148674,
0.09183274209499359,
0.10785933583974838,
-0.05260622501373291,
0.08912304788827896,
-0.158593088388443,
-0.022352144122123718,
-0.015254326164722443,
0.05480210855603218,
0.03119158186018467,
-0.03195791691541672,
0.04502412676811218,
-0.012948866933584213,
0.13176961243152618,
-0.034164924174547195,
0.023644644767045975,
-0.002969154389575124,
0.016542991623282433,
0.02919747866690159,
0.0010712856892496347,
0.11228981614112854,
-0.009231386706233025,
-0.04271502047777176,
0.04141321778297424,
0.02291019819676876,
0.021380163729190826,
0.03390491008758545,
0.19723951816558838,
0.11022168397903442,
-0.003938755486160517,
0.09775031358003616,
0.009002403356134892,
-0.056428249925374985,
-0.03651029244065285,
0.04652078449726105,
-0.03940673917531967,
0.0362265445291996,
-0.06895827502012253,
0.0687636062502861,
0.16028772294521332,
-0.11647701263427734,
0.0881134495139122,
0.060823697596788406,
-0.02491707168519497,
-0.11027102172374725,
-0.2598046064376831,
-0.040741920471191406,
-0.006755011156201363,
-0.04218244180083275,
-0.08753720670938492,
-0.002995213493704796,
0.006531535182148218,
0.01625070907175541,
-0.11233222484588623,
0.21753081679344177,
-0.10256951302289963,
-0.08841698616743088,
0.05829097330570221,
0.004703217651695013,
-0.01216450147330761,
0.018793985247612,
-0.01153417956084013,
0.04488837718963623,
0.0004275972896721214,
0.12068731337785721,
0.0321895070374012,
0.10569480061531067,
0.07686319947242737,
-0.03216538205742836,
-0.02651507593691349,
-0.011911093257367611,
-0.05708332359790802,
0.0323202945291996,
0.14182594418525696,
0.04882921651005745,
-0.06578976660966873,
-0.028560049831867218,
0.18799914419651031,
-0.021129939705133438,
-0.019145499914884567,
-0.12933525443077087,
0.1114465594291687,
0.0429898202419281,
-0.07433764636516571,
-0.009983600117266178,
-0.15365329384803772,
0.020370524376630783,
0.21069546043872833,
0.08680564910173416,
0.0038529851008206606,
-0.008848761208355427,
0.029396533966064453,
0.0013043914223089814,
-0.03230077400803566,
0.05574917420744896,
0.002255041850730777,
0.21167847514152527,
-0.028678057715296745,
0.008750063367187977,
-0.002464619232341647,
0.03237391635775566,
-0.05175476893782616,
0.13398508727550507,
-0.03361247479915619,
0.010270818136632442,
-0.011358724907040596,
-0.0035684676840901375,
-0.04523782059550285,
-0.2294021099805832,
0.006479014176875353,
-0.05269515514373779,
-0.06387654691934586,
-0.04514468088746071,
-0.061423659324645996,
-0.04058048129081726,
0.100826695561409,
-0.019155006855726242,
-0.004166833125054836,
0.11285823583602905,
-0.019414126873016357,
-0.0793914869427681,
-0.03649108111858368,
0.06315194070339203,
0.014575649984180927,
0.18178273737430573,
0.01614970527589321,
-0.01934346929192543,
0.06201053038239479,
0.017721055075526237,
-0.12767194211483002,
0.08752522617578506,
-0.045182615518569946,
0.00048181030433624983,
0.03375643864274025,
0.09481150656938553,
0.0052141244523227215,
0.01040735375136137,
0.019489925354719162,
0.013639113865792751,
0.0265639815479517,
-0.11140036582946777,
-0.03998257964849472,
-0.0964699238538742,
0.042783066630363464,
-0.07144608348608017,
0.15454545617103577,
0.11567316204309464,
-0.010469106025993824,
-0.009167398326098919,
-0.07055949419736862,
0.10477589815855026,
0.004455121234059334,
0.07513739913702011,
0.004704105202108622,
-0.08884736895561218,
0.055876512080430984,
-0.09752543270587921,
0.00887841172516346,
-0.20699678361415863,
-0.014531293883919716,
0.038946446031332016,
-0.09537829458713531,
-0.056545548141002655,
0.04750866815447807,
0.05128757655620575,
0.06602753698825836,
-0.03575674816966057,
-0.007716466207057238,
0.017826087772846222,
0.043985769152641296,
-0.13525649905204773,
-0.04871213436126709
] |
null | null | null |
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...); however, validating both models gives a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 50K iterations.
Performance of this model is now superior to the Tensorpack model.
Regarding the dataset, please check: [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained on detecting cells from tables. Note that the dataset contains tables only. Therefore, it is required to perform a table detection task before detecting cells.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used with **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instructions in the [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
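As a hedged illustration of where this cell detector sits in the pipeline, the sketch below runs the default analyzer and prints the table structure reconstructed from the detected cells. It assumes deepdoctection is installed with its PyTorch dependencies and that the default configuration resolves this checkpoint; the path is a placeholder, and `page.tables` / `table.html` follow the library's documented `Page` API and may differ between versions.

```python
import deepdoctection as dd

# The default analyzer chains layout detection, table segmentation
# (including this cell detector) and OCR.
analyzer = dd.get_dd_analyzer()

df = analyzer.analyze(path="path/to/document.pdf")  # placeholder path
df.reset_state()

for page in df:
    for table in page.tables:
        # The HTML is reconstructed from the cells this model detects,
        # combined with the row/column segmentation.
        print(table.html)
```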
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow as well as its training script. More information can be found in [this model card](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_c).
|
{"license": "apache-2.0", "tags": ["Pytorch"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/d2_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_c_inference_only
|
[
"Pytorch",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...); however, validating both models gives a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 50K iterations.
Performance of this model is now superior to the Tensorpack model.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained on detecting cells from tables. Note that the dataset contains tables only. Therefore, it is required to perform a table detection task before detecting cells.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used with deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instructions in the Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow as well as its training script. More information can be found in this model card.
|
[
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 50K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
"TAGS\n#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 50K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
35,
247,
45,
69
] |
[
"passage: TAGS\n#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 50K iterations.\nPerformance of this model is now superior to the Tensorpack model. \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
-0.1135805994272232,
0.11583751440048218,
-0.0026039755903184414,
0.04279102012515068,
0.07372458279132843,
-0.031527671962976456,
0.0807926207780838,
0.10225178301334381,
-0.013451647013425827,
0.09647706151008606,
-0.00535014970228076,
0.04611537232995033,
0.09016627073287964,
0.12516459822654724,
0.05861080065369606,
-0.15799230337142944,
0.01372355967760086,
-0.016479073092341423,
0.03301956132054329,
0.09095130115747452,
0.0846981406211853,
-0.09629929810762405,
0.060541436076164246,
0.02017664723098278,
-0.06533504277467728,
0.02451922558248043,
-0.015757646411657333,
-0.05860119313001633,
0.0878102108836174,
0.009608672931790352,
0.08408522605895996,
0.007567486260086298,
0.07780428230762482,
-0.08428001403808594,
0.044085852801799774,
0.11235901713371277,
0.01340259239077568,
0.08944736421108246,
0.09836965799331665,
0.02446117252111435,
0.18091577291488647,
-0.043949004262685776,
0.0773080587387085,
0.0186900831758976,
-0.041004352271556854,
-0.17811815440654755,
-0.13758310675621033,
0.14084336161613464,
0.07273213565349579,
0.09895136952400208,
-0.012817981652915478,
0.16890190541744232,
0.03656970337033272,
0.04266221076250076,
0.10647531598806381,
-0.29444459080696106,
-0.004876167047768831,
0.15524841845035553,
0.06021099165081978,
0.06475021690130234,
-0.022517934441566467,
0.01440958958119154,
-0.01571693643927574,
0.055093489587306976,
0.07675039768218994,
-0.04541262239217758,
-0.0101652592420578,
-0.017848705872893333,
-0.14755183458328247,
-0.012125262059271336,
0.04301402345299721,
-0.07922744750976562,
-0.06772936135530472,
-0.157223179936409,
-0.04766574129462242,
-0.1100405678153038,
0.009429782629013062,
-0.11114659160375595,
0.039702244102954865,
0.023379037156701088,
0.0935223400592804,
-0.15879502892494202,
-0.11225821077823639,
-0.05376998335123062,
-0.0903502106666565,
-0.009739690460264683,
0.07154947519302368,
0.025536278262734413,
-0.05738877132534981,
0.14596885442733765,
-0.0529477559030056,
-0.013460906222462654,
-0.07047960162162781,
-0.05044126883149147,
-0.11831821501255035,
-0.014283073134720325,
-0.0332409106194973,
-0.16175709664821625,
-0.03572329133749008,
0.07430439442396164,
0.01692080684006214,
0.03428714722394943,
-0.044181063771247864,
0.059129469096660614,
0.04413387551903725,
0.14687351882457733,
-0.09008996933698654,
0.06037784367799759,
0.09223348647356033,
-0.026595378294587135,
0.06125129014253616,
-0.022102590650320053,
-0.08008884638547897,
-0.031178176403045654,
0.031282126903533936,
0.059600308537483215,
0.005042334087193012,
0.012145419605076313,
-0.003742166329175234,
-0.1041104793548584,
0.0753081664443016,
-0.11597590893507004,
0.041844259947538376,
0.05785861983895302,
-0.047645363956689835,
0.08346289396286011,
0.0860808938741684,
-0.0032995317596942186,
-0.10872954875230789,
-0.08023659884929657,
-0.03798921778798103,
0.012612774968147278,
-0.0664171352982521,
-0.08908993005752563,
0.02929181233048439,
-0.12984150648117065,
-0.06694518774747849,
-0.15908925235271454,
-0.2066110521554947,
-0.06908164918422699,
-0.00010591849422780797,
-0.01726244017481804,
0.04720501974225044,
0.008126857690513134,
0.007480471860617399,
-0.07620809227228165,
0.042321570217609406,
0.030396103858947754,
0.017633914947509766,
0.01789812371134758,
-0.06217927113175392,
0.019834311679005623,
-0.022208869457244873,
0.021197015419602394,
-0.09339596331119537,
0.014550203457474709,
-0.13700927793979645,
0.11459853500127792,
0.02605835348367691,
-0.043577373027801514,
-0.09603234380483627,
-0.04578053206205368,
-0.1337009221315384,
-0.006340148393064737,
0.039556290954351425,
0.10019338130950928,
-0.16171742975711823,
-0.01940453052520752,
0.07683286815881729,
-0.1394663155078888,
0.017466560006141663,
0.07161616533994675,
-0.04116280749440193,
0.07062311470508575,
0.08136370778083801,
0.17509521543979645,
0.12116938084363937,
-0.06990119814872742,
-0.08966245502233505,
-0.0338149257004261,
-0.05757475271821022,
-0.017146239057183266,
0.08707521110773087,
-0.028290333226323128,
0.013061284087598324,
-0.0072084409184753895,
-0.08023186028003693,
-0.03491964936256409,
0.003521125065162778,
-0.06193332001566887,
-0.039595503360033035,
-0.038449741899967194,
0.010355009697377682,
-0.024510523304343224,
0.04907643795013428,
0.04879315197467804,
-0.04626660421490669,
-0.054637596011161804,
0.1532253623008728,
-0.05349145829677582,
0.017861995846033096,
-0.08713940531015396,
0.17972925305366516,
-0.08624008297920227,
0.009376750327646732,
-0.2079990804195404,
0.016672499477863312,
0.05393395572900772,
-0.079647958278656,
0.05642441660165787,
0.023083867505192757,
0.019189540296792984,
0.10304258018732071,
0.04903985559940338,
-0.006744435057044029,
-0.01747198775410652,
-0.03832779452204704,
-0.0875512883067131,
-0.06458836793899536,
-0.0918872132897377,
-0.07723510265350342,
0.05674440786242485,
-0.22414585947990417,
0.0377667173743248,
-0.04498422145843506,
0.06874871253967285,
0.03838028758764267,
-0.06327372044324875,
0.0675908774137497,
-0.014049799181520939,
-0.024873245507478714,
-0.09005644172430038,
0.011210598982870579,
0.03360026329755783,
-0.051735978573560715,
0.06842177361249924,
-0.17456674575805664,
-0.058197297155857086,
0.04472222551703453,
0.12220678478479385,
-0.07427917420864105,
-0.012090388685464859,
-0.021026920527219772,
0.004973261617124081,
-0.08673587441444397,
0.027773674577474594,
0.24110917747020721,
0.03809428587555885,
0.09254937618970871,
-0.10597388446331024,
-0.007427908945828676,
0.014354187063872814,
-0.023847442120313644,
0.02283182367682457,
0.06762192398309708,
0.025566836819052696,
-0.13077278435230255,
0.01514868251979351,
0.009477544575929642,
0.00362339336425066,
0.0802871435880661,
0.04293663799762726,
-0.06100788339972496,
-0.06101081520318985,
0.02916114032268524,
0.015477519482374191,
0.07000569254159927,
0.07663455605506897,
0.029054684564471245,
0.028047684580087662,
0.04075215384364128,
0.050303321331739426,
-0.09824687987565994,
0.06199224665760994,
0.06102839112281799,
-0.02080208994448185,
0.06725962460041046,
0.021449558436870575,
-0.010518629103899002,
0.06641069799661636,
-0.03229829668998718,
0.06675256043672562,
-0.02532055415213108,
-0.02749992534518242,
-0.16133199632167816,
0.19363507628440857,
-0.09151863306760788,
-0.23050899803638458,
-0.08941095322370529,
0.07744323462247849,
-0.010085810907185078,
0.018366778269410133,
0.010076469741761684,
0.015929412096738815,
-0.019496450200676918,
-0.13308613002300262,
-0.006642404478043318,
0.029970582574605942,
-0.0107991062104702,
-0.09051404893398285,
-0.029398633167147636,
0.016663145273923874,
-0.087718166410923,
-0.005601802375167608,
-0.035563837736845016,
-0.10509514808654785,
-0.01067611575126648,
0.018674639984965324,
0.026538683101534843,
0.181686669588089,
-0.03240080922842026,
-0.00197386066429317,
0.010296621359884739,
0.014254049398005009,
-0.03504070267081261,
0.06010286509990692,
0.2186555564403534,
0.047152187675237656,
0.003951720427721739,
-0.02244304120540619,
-0.015082767233252525,
-0.003614244982600212,
-0.001205149688757956,
0.012236559763550758,
-0.07049039751291275,
-0.21604536473751068,
-0.06690704077482224,
-0.007326249498873949,
-0.030457403510808945,
0.06209073215723038,
0.0877012386918068,
0.009696544148027897,
0.04088544100522995,
0.023361988365650177,
-0.007672756910324097,
-0.0012690650764852762,
0.06731047481298447,
0.04345152899622917,
-0.015716973692178726,
0.049285002052783966,
-0.0674910843372345,
0.03711646422743797,
0.10933230072259903,
0.06535099446773529,
0.17797669768333435,
-0.04822276905179024,
0.03143719211220741,
0.039499010890722275,
0.1105826199054718,
0.04591695964336395,
0.04110978543758392,
-0.010651580058038235,
0.030142975971102715,
0.0010594421764835715,
-0.07336094975471497,
0.007306130602955818,
0.010738273151218891,
0.017512692138552666,
0.0034797843545675278,
-0.07736258208751678,
0.07845863699913025,
0.01982155814766884,
0.0912661924958229,
0.016961008310317993,
-0.21720801293849945,
-0.05431031808257103,
-0.06017278879880905,
-0.006847284268587828,
-0.07131031155586243,
0.022545624524354935,
0.15021096169948578,
-0.14471372961997986,
-0.0703740045428276,
-0.07925942540168762,
0.06671114265918732,
-0.09188665449619293,
-0.011071646586060524,
-0.020981822162866592,
0.08173228800296783,
0.020915118977427483,
0.062115464359521866,
-0.10613421350717545,
0.04207442328333855,
0.011571047827601433,
0.11491342633962631,
-0.07183122634887695,
-0.004856628365814686,
0.05282030999660492,
0.07566370069980621,
0.12691767513751984,
0.010754800401628017,
-0.1323038935661316,
-0.06627129018306732,
-0.177308589220047,
0.024045560508966446,
0.015523279085755348,
-0.04285987466573715,
0.06649203598499298,
-0.0031468013767153025,
0.01250911783427,
-0.04762948304414749,
-0.011188359931111336,
-0.06720810383558273,
-0.107050321996212,
0.03138842061161995,
0.00521418871358037,
-0.05173265561461449,
-0.0604674406349659,
-0.028680000454187393,
0.058495014905929565,
0.12042614817619324,
-0.1380019187927246,
-0.05622558668255806,
-0.11614682525396347,
0.030862364917993546,
0.08099988102912903,
-0.06129564717411995,
0.03063739649951458,
0.004504069685935974,
0.11096785217523575,
-0.04845445603132248,
-0.15101033449172974,
0.0412461943924427,
-0.07662433385848999,
-0.10527966916561127,
-0.001442145323380828,
0.0685209333896637,
0.06162244454026222,
0.042777758091688156,
0.06296996772289276,
0.02777116745710373,
-0.03112616389989853,
-0.09965012222528458,
0.005467306822538376,
0.1942606270313263,
0.05920575186610222,
0.07742265611886978,
-0.10773822665214539,
-0.06430448591709137,
-0.051132433116436005,
0.03253624960780144,
0.1199469268321991,
0.08169624954462051,
-0.08183392882347107,
0.15661969780921936,
0.07026829570531845,
-0.13307739794254303,
-0.17592184245586395,
0.0033557694405317307,
0.022826984524726868,
0.08863233774900436,
0.08207665383815765,
-0.22658827900886536,
0.07528702914714813,
0.06916973739862442,
-0.026215415447950363,
0.10978808999061584,
-0.39764663577079773,
-0.10356996953487396,
0.09157708287239075,
-0.0015412498032674193,
0.038671012967824936,
-0.07694262266159058,
-0.040919821709394455,
0.04709627106785774,
0.053462155163288116,
0.0987323522567749,
-0.1826409548521042,
0.13066762685775757,
-0.005872893612831831,
-0.012171827256679535,
0.04769054427742958,
-0.050491370260715485,
0.059008534997701645,
-0.023079724982380867,
0.1003735288977623,
-0.029222682118415833,
0.026934007182717323,
0.02717772126197815,
-0.042217738926410675,
0.15971645712852478,
0.02326877787709236,
0.12965704500675201,
-0.11566805839538574,
-0.06564897298812866,
-0.05611059069633484,
0.0892222449183464,
-0.004101997707039118,
-0.010736861266195774,
-0.08713527023792267,
0.10109271109104156,
0.06666005402803421,
0.005756519269198179,
-0.0060420166701078415,
-0.0162592101842165,
0.05428767949342728,
0.12652963399887085,
0.05784296616911888,
0.15386554598808289,
-0.13677802681922913,
-0.04704536497592926,
-0.012128787115216255,
0.08197077363729477,
-0.05051995441317558,
-0.0052904998883605,
0.10176952183246613,
0.014831320382654667,
0.057210229337215424,
0.015300475992262363,
-0.14916175603866577,
0.008768581785261631,
0.003324765246361494,
-0.11909652501344681,
-0.1037721186876297,
-0.012316170148551464,
0.15390273928642273,
-0.04408317431807518,
0.04467795789241791,
0.13203424215316772,
-0.08299553394317627,
-0.05488910898566246,
0.030196303501725197,
0.057040806859731674,
-0.04151713475584984,
0.14080525934696198,
0.036213357001543045,
-0.0031104066874831915,
-0.03942937031388283,
0.13619858026504517,
0.07117968052625656,
-0.012535719200968742,
0.08489405363798141,
0.05841487646102905,
-0.10753364861011505,
-0.07837961614131927,
-0.041806671768426895,
0.14696137607097626,
-0.045441143214702606,
-0.09303023666143417,
-0.03660742565989494,
-0.016754308715462685,
0.0005466443253681064,
0.0710553228855133,
0.031095484271645546,
0.0657740980386734,
-0.10557813942432404,
-0.07243011891841888,
-0.09481193125247955,
0.05063944682478905,
0.032663166522979736,
0.006225202698260546,
-0.08409378677606583,
0.15225636959075928,
0.03588074818253517,
0.059148598462343216,
-0.040894798934459686,
-0.045116446912288666,
-0.010676747187972069,
-0.03405372053384781,
-0.11430373787879944,
0.005885080434381962,
-0.006396291311830282,
-0.05619574710726738,
0.013028169982135296,
0.050298672169446945,
0.009537866339087486,
0.05456266924738884,
-0.028331762179732323,
-0.027642225846648216,
-0.08949817717075348,
0.00032051856396719813,
-0.1189243271946907,
-0.01313415914773941,
-0.03457546979188919,
-0.036618296056985855,
0.10838760435581207,
0.10196780413389206,
-0.007200911175459623,
0.0008386486442759633,
-0.03916893154382706,
0.0031058616004884243,
-0.019406940788030624,
0.025881949812173843,
-0.013683691620826721,
-0.15636591613292694,
0.020179687067866325,
0.02885233610868454,
-0.07882111519575119,
-0.010842723771929741,
0.09057149291038513,
-0.10153898596763611,
0.018684254959225655,
-0.09418637305498123,
0.07575242966413498,
-0.06439311802387238,
0.06866618245840073,
-0.03167634457349777,
0.1452290564775467,
0.08878614753484726,
-0.07325248420238495,
0.0835392028093338,
-0.12460393458604813,
-0.0359218455851078,
-0.035752829164266586,
0.045944441109895706,
-0.04212391376495361,
-0.011395260691642761,
0.044555194675922394,
0.006491257343441248,
0.11910615861415863,
0.001645474461838603,
-0.02201143652200699,
-0.015083263628184795,
0.01860239915549755,
-0.01115378551185131,
-0.006177179981023073,
0.061711087822914124,
-0.007891764864325523,
-0.020881539210677147,
0.08513347059488297,
0.05793710798025131,
0.013812802731990814,
0.07523277401924133,
0.19609616696834564,
0.13467393815517426,
0.05047338828444481,
0.07211781293153763,
-0.003976228181272745,
-0.04079616442322731,
-0.08339154720306396,
0.04766473546624184,
-0.05195239931344986,
0.0322587825357914,
-0.08396324515342712,
0.056208305060863495,
0.14833177626132965,
-0.1232708990573883,
0.07698225229978561,
0.04723168909549713,
-0.04004654660820961,
-0.12679889798164368,
-0.25462138652801514,
-0.03554675728082657,
-0.049831684678792953,
-0.06692369282245636,
-0.07455138862133026,
0.0010610173922032118,
0.0016297973925247788,
0.0431838296353817,
-0.05783353000879288,
0.16543588042259216,
-0.10969026386737823,
-0.05711952969431877,
0.04488767683506012,
-0.004047485068440437,
0.02451252192258835,
0.008923949673771858,
-0.0061558266170322895,
0.08197133243083954,
-0.02752644196152687,
0.11163950711488724,
0.031261492520570755,
0.1292564570903778,
0.07230155915021896,
-0.03934039548039436,
-0.03735867142677307,
-0.017960619181394577,
-0.04379400610923767,
0.020753948017954826,
0.14148381352424622,
0.10125478357076645,
-0.10178060084581375,
-0.03442247211933136,
0.21964727342128754,
-0.03482218086719513,
-0.00010124156688107178,
-0.13171805441379547,
0.20104381442070007,
0.011832602322101593,
-0.05180017650127411,
-0.026083124801516533,
-0.11151815950870514,
0.001254085567779839,
0.18353237211704254,
0.14424949884414673,
-0.03830312192440033,
-0.01433270052075386,
0.019036294892430305,
-0.009736621752381325,
-0.034469377249479294,
0.05065324530005455,
0.024861464276909828,
0.25082260370254517,
-0.054712601006031036,
0.03846646472811699,
-0.046708039939403534,
0.02135239541530609,
-0.10917004942893982,
0.05967501923441887,
-0.052831247448921204,
0.005999470595270395,
0.0021364532876759768,
0.040426064282655716,
-0.0734345093369484,
-0.23686571419239044,
0.02191615290939808,
-0.03710747882723808,
-0.08318088948726654,
-0.022019218653440475,
-0.05381445586681366,
0.02093338593840599,
0.07746513187885284,
-0.02912743389606476,
0.02838161401450634,
0.08650966733694077,
0.019267622381448746,
-0.06516756117343903,
-0.0644187480211258,
0.07688890397548676,
0.030561214312911034,
0.17554569244384766,
-0.016349155455827713,
0.03412068635225296,
0.06551382690668106,
0.011521462351083755,
-0.13589246571063995,
0.03362908586859703,
-0.04878663644194603,
-0.02269391342997551,
0.008825710974633694,
0.07881569117307663,
0.011850464157760143,
0.04952102527022362,
0.027145419269800186,
0.014257046394050121,
0.007139351684600115,
-0.12282902747392654,
0.009311440400779247,
-0.0917591005563736,
0.04607842490077019,
-0.04382190853357315,
0.1611679196357727,
0.09377444535493851,
-0.023958798497915268,
0.011421732604503632,
-0.028200695291161537,
0.06352578103542328,
0.018500274047255516,
0.018735481426119804,
0.05364587903022766,
-0.13023371994495392,
0.043994855135679245,
-0.09356334805488586,
0.028375888243317604,
-0.21879857778549194,
-0.05817161872982979,
0.03877200558781624,
-0.09242849797010422,
-0.015365764498710632,
0.0500652976334095,
0.05915900319814682,
0.058559197932481766,
-0.0482269823551178,
0.04562278836965561,
0.010831977240741253,
0.0877220407128334,
-0.13807493448257446,
-0.06772138178348541
] |
null | null | null |
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...); however, validating both models gives a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations. Performance of this model is now superior to the Tensorpack model.
Regarding the dataset, please check: [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained on detecting rows and columns for tables. As row and column bounding boxes are not a priori an element of the annotations, they are calculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used with **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instructions in the [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
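For orientation, here is a minimal sketch of how the row and column predictions surface in the end result. It assumes deepdoctection is installed and that the default pipeline resolves this checkpoint; the path is a placeholder, and cell attributes such as `row_number` and `column_number` follow the library's documented `Page` API and may differ between versions.

```python
import deepdoctection as dd

analyzer = dd.get_dd_analyzer()

df = analyzer.analyze(path="path/to/document.pdf")  # placeholder path
df.reset_state()

for page in df:
    for table in page.tables:
        # Row/column assignments come from intersecting the detected cells
        # with the rows and columns predicted by this model.
        for cell in table.cells:
            print(cell.row_number, cell.column_number, cell.text)
```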
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow as well as its training script. More information can be found in [this model card](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc).
|
{"license": "apache-2.0", "tags": ["Pytorch"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/d2_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc_inference_only
|
[
"Pytorch",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script.
The Tensorflow and Pytorch models differ slightly (padding ...); however, validating both models gives a difference of less than 0.03 mAP.
A second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations. Performance of this model is now superior to the Tensorpack model.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained on detecting rows and columns for tables. As row and column bounding boxes are not a priori an element of the annotations, they are calculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used with deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instructions in the Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please use Tensorflow as well as its training script. More information can be found in this model card.
|
[
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations. Performance of this model is now superior to the Tensorpack model.\n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
"TAGS\n#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations. Performance of this model is now superior to the Tensorpack model.\n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
35,
271,
45,
69
] |
[
"passage: TAGS\n#Pytorch #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Detectron2 Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and has been trained with the Tensorflow training toolkit Tensorpack and then transferred to Pytorch using a conversion script. \nThe Tensorflow and Pytorch models differ slightly (padding ...), however validating both models give a difference of less than 0.03 mAP. \n\nA second model has been added where the Tensorpack model has been used as initial checkpoint and training has been resumed for 20K iterations. Performance of this model is now superior to the Tensorpack model.\n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please use Tensorflow, as well as its training script. More information can be found in this this model card."
] |
[
-0.11005895584821701,
0.11076460778713226,
-0.0029226308688521385,
0.045908283442258835,
0.08833616226911545,
-0.03912823274731636,
0.06549076735973358,
0.1067618578672409,
-0.05289266258478165,
0.09984432905912399,
-0.054944247007369995,
0.0009984204079955816,
0.10802897065877914,
0.13466064631938934,
0.03804417699575424,
-0.14701196551322937,
0.03695866838097572,
-0.01292472891509533,
0.010376691818237305,
0.07783874869346619,
0.09623870253562927,
-0.09479904919862747,
0.07401370257139206,
0.01330987736582756,
-0.04616151750087738,
0.038871005177497864,
-0.04037601500749588,
-0.04769434407353401,
0.07500292360782623,
0.01495317928493023,
0.10025538504123688,
0.014740445651113987,
0.08578889071941376,
-0.1403712034225464,
0.033903904259204865,
0.10294418781995773,
-0.0026433556340634823,
0.07882610708475113,
0.10400164872407913,
0.06516580283641815,
0.20674245059490204,
-0.06917424499988556,
0.059268657118082047,
0.02218019962310791,
-0.05437159165740013,
-0.13643935322761536,
-0.15279097855091095,
0.11478365212678909,
0.09335801750421524,
0.09630224108695984,
-0.023517321795225143,
0.12376943975687027,
0.0576055571436882,
0.052153993397951126,
0.12185332179069519,
-0.2888479232788086,
0.003996460698544979,
0.1332899034023285,
0.07705563306808472,
0.06684539467096329,
-0.05372420698404312,
-0.01712159439921379,
-0.001907127327285707,
0.04930026829242706,
0.07941657304763794,
-0.057696688920259476,
-0.010742861777544022,
-0.0060314154252409935,
-0.15657657384872437,
-0.017532089725136757,
0.12156371027231216,
-0.03359845280647278,
-0.08802641183137894,
-0.13052847981452942,
-0.04401722922921181,
-0.08718504756689072,
0.013051754795014858,
-0.10090626776218414,
0.04376305639743805,
0.029150892049074173,
0.07953427731990814,
-0.17542141675949097,
-0.12312949448823929,
-0.019860392436385155,
-0.08182262629270554,
0.022952787578105927,
0.0676269680261612,
-0.009055446833372116,
-0.07645045220851898,
0.13033238053321838,
0.024412956088781357,
-0.027806038036942482,
-0.07098326086997986,
-0.029051637277007103,
-0.0937931090593338,
-0.029365893453359604,
-0.0740349292755127,
-0.21207034587860107,
-0.06067020073533058,
0.09769576042890549,
0.01252499409019947,
0.03893912956118584,
-0.03162252902984619,
0.044175054877996445,
0.034956347197294235,
0.22285743057727814,
-0.05562557652592659,
0.0344432070851326,
0.06799975782632828,
0.02512625977396965,
0.034459371119737625,
-0.0160264503210783,
-0.08684439212083817,
-0.0387880839407444,
0.04934599623084068,
0.053322430700063705,
-0.0057120234705507755,
0.04169052466750145,
-0.023164916783571243,
-0.08702775090932846,
0.031448736786842346,
-0.16658997535705566,
0.04753299430012703,
0.0214016605168581,
-0.06608026474714279,
0.033379264175891876,
0.09640572965145111,
-0.016241174191236496,
-0.13856008648872375,
-0.05281192436814308,
-0.05446227267384529,
0.004936595913022757,
-0.05383289232850075,
-0.07555952668190002,
0.022117646411061287,
-0.1342499554157257,
-0.05429786071181297,
-0.13960888981819153,
-0.15680275857448578,
-0.0823289006948471,
-0.014574026688933372,
0.017811844125390053,
0.058785662055015564,
-0.0423240140080452,
-0.010799980722367764,
-0.08212646842002869,
0.030789874494075775,
-0.016517091542482376,
0.01973051391541958,
-0.0013284304877743125,
-0.057686787098646164,
0.015445055440068245,
-0.03719040006399155,
0.028104128316044807,
-0.10457166284322739,
0.02363160252571106,
-0.0734386146068573,
0.11327245086431503,
0.03317166864871979,
-0.0499173179268837,
-0.08558320999145508,
-0.04207317903637886,
-0.12507502734661102,
0.008138769306242466,
0.04650116711854935,
0.10567306727170944,
-0.13479265570640564,
-0.03802710026502609,
0.12689585983753204,
-0.11780060082674026,
0.00846271961927414,
0.04603186994791031,
-0.03520902246236801,
0.06483808159828186,
0.08334922045469284,
0.13088402152061462,
0.06868317723274231,
-0.07077953219413757,
-0.11813566088676453,
-0.038875892758369446,
-0.0381048284471035,
0.03546902537345886,
0.07531282305717468,
-0.03515110909938812,
0.03204070031642914,
0.004964728374034166,
-0.0503547340631485,
-0.05188140645623207,
0.03694862499833107,
-0.05709264054894447,
-0.024865558370947838,
0.002354307798668742,
-0.01741965115070343,
-0.03426843136548996,
0.022839423269033432,
0.058278996497392654,
-0.07303149253129959,
-0.09755980968475342,
0.14466017484664917,
-0.0776023417711258,
0.04676799103617668,
-0.09938538074493408,
0.13773073256015778,
-0.08678316324949265,
0.024690872058272362,
-0.22275923192501068,
-0.026361972093582153,
0.0557456836104393,
-0.10101758688688278,
0.06683780997991562,
0.03584066405892372,
0.008398955687880516,
0.04847230017185211,
0.03662283718585968,
-0.02350623905658722,
-0.06733845174312592,
-0.03789285942912102,
-0.07430528849363327,
-0.061676472425460815,
-0.1011618822813034,
-0.08397532254457474,
0.09229405969381332,
-0.19585958123207092,
0.05650067329406738,
-0.0033736685290932655,
0.07656041532754898,
0.05731065943837166,
-0.07058000564575195,
0.04905233532190323,
0.0004950134898535907,
-0.05591973662376404,
-0.09461630880832672,
-0.009031649678945541,
0.04630284011363983,
-0.0656469464302063,
0.06703298538923264,
-0.16719017922878265,
-0.11816936731338501,
0.0490991473197937,
0.11880775541067123,
-0.05130226910114288,
-0.03651260584592819,
-0.027278712019324303,
0.013566019013524055,
-0.05451342836022377,
0.008312521502375603,
0.25146913528442383,
0.042519744485616684,
0.11320212483406067,
-0.11040251702070236,
-0.029515614733099937,
0.0031290266197174788,
-0.03480102866888046,
-0.028977559879422188,
0.010103525593876839,
0.02380518987774849,
-0.1364779770374298,
0.019066600129008293,
0.044522225856781006,
0.028262322768568993,
0.0650668740272522,
0.024162162095308304,
-0.09156017005443573,
-0.04556647315621376,
0.03108769655227661,
0.01828589104115963,
0.06690342724323273,
0.05672270804643631,
0.011963112279772758,
0.007703234441578388,
0.03165897727012634,
0.08246245980262756,
-0.09423661231994629,
0.09179870039224625,
0.06374402344226837,
-0.013909601606428623,
0.07322763651609421,
-0.007291707210242748,
-0.004376988857984543,
0.0675942525267601,
-0.019595222547650337,
0.04748771712183952,
-0.03818270191550255,
-0.043141573667526245,
-0.15034040808677673,
0.17996162176132202,
-0.11307449638843536,
-0.22969231009483337,
-0.15206778049468994,
0.05269981175661087,
-0.022178418934345245,
0.02258552610874176,
0.0026475556660443544,
-0.002453855238854885,
-0.05022246763110161,
-0.15825331211090088,
0.017317242920398712,
0.011851162649691105,
0.003520883386954665,
-0.05114998295903206,
-0.010782112367451191,
0.05186721682548523,
-0.08375959098339081,
-0.007481821347028017,
-0.020821545273065567,
-0.11867415904998779,
-0.034788887947797775,
-0.010665781795978546,
0.07369349896907806,
0.16033323109149933,
-0.04089930281043053,
-0.022228294983506203,
0.0015163164352998137,
0.027019022032618523,
-0.04322051256895065,
0.06453517079353333,
0.12334268540143967,
-0.01837798021733761,
0.020844245329499245,
0.014399049803614616,
-0.04520630091428757,
-0.019543418660759926,
0.018659524619579315,
0.013789083808660507,
-0.03471946716308594,
-0.2448127567768097,
-0.0677080973982811,
-0.02559126913547516,
-0.04246469959616661,
0.0943429097533226,
0.08868969976902008,
0.01559550128877163,
0.03843819350004196,
-0.015320218168199062,
0.021477166563272476,
0.012467151507735252,
0.08286889642477036,
0.06207447499036789,
-0.0001639463589526713,
0.02592024952173233,
-0.060369931161403656,
0.019835660234093666,
0.11624211817979813,
0.07202738523483276,
0.17327181994915009,
-0.05675779655575752,
0.059973008930683136,
0.02325163222849369,
0.07612504810094833,
0.011363609693944454,
0.05438815802335739,
-0.01462238933891058,
0.05452948063611984,
-0.008823440410196781,
-0.04926062002778053,
-0.0037810869980603456,
0.0339980386197567,
0.03548497334122658,
0.004299456719309092,
-0.1167464628815651,
0.052946437150239944,
0.027202986180782318,
0.07227974385023117,
0.017740556970238686,
-0.22798636555671692,
-0.028729330748319626,
-0.044798869639635086,
-0.020807567983865738,
-0.07044917345046997,
0.015775909647345543,
0.15531426668167114,
-0.12388978153467178,
-0.03782821446657181,
-0.07825498282909393,
0.06559743732213974,
-0.10206620395183563,
-0.014735699631273746,
-0.011118235066533089,
0.09305364638566971,
0.01852046512067318,
0.06290320307016373,
-0.10174098610877991,
0.03944792598485947,
0.03061722405254841,
0.11213533580303192,
-0.039662618190050125,
0.0327678881585598,
0.05427287146449089,
0.06037179380655289,
0.12759223580360413,
0.008352531120181084,
-0.165241539478302,
-0.09197847545146942,
-0.1281672716140747,
0.04319150745868683,
0.07132664322853088,
-0.04621178284287453,
0.11165028065443039,
-0.030606942251324654,
0.010256890207529068,
-0.05031023919582367,
-0.04131239280104637,
-0.05021957680583,
-0.17179515957832336,
0.03927454352378845,
0.035572849214076996,
-0.0426076240837574,
-0.07911821454763412,
-0.022239670157432556,
0.030494550243020058,
0.15096667408943176,
-0.1186818927526474,
-0.015900537371635437,
-0.10991113632917404,
0.07179342955350876,
0.0953330397605896,
-0.08010097593069077,
0.042682189494371414,
0.0030541415326297283,
0.11275846511125565,
-0.05264009162783623,
-0.1398809254169464,
0.029543796554207802,
-0.04423311725258827,
-0.10059716552495956,
-0.024325545877218246,
0.06613700091838837,
0.07167265564203262,
0.020586993545293808,
0.0726492777466774,
0.02166113071143627,
-0.0018853923538699746,
-0.08088000118732452,
-0.003275902010500431,
0.16774731874465942,
0.08479034900665283,
0.08507445454597473,
-0.1564376950263977,
-0.09342212975025177,
-0.08789177983999252,
0.03757854178547859,
0.09179820120334625,
0.10110484063625336,
-0.061734605580568314,
0.09155246615409851,
0.09541595727205276,
-0.14755362272262573,
-0.16884496808052063,
-0.007251954637467861,
0.013948018662631512,
0.08580077439546585,
0.08837532997131348,
-0.22561660408973694,
0.037113651633262634,
0.037401970475912094,
-0.013543082401156425,
0.05196966230869293,
-0.3258039951324463,
-0.08506160974502563,
0.08365742117166519,
0.011903472244739532,
-0.012052065692842007,
-0.0663173720240593,
-0.039432842284440994,
0.06258627772331238,
0.01103205792605877,
0.038008272647857666,
-0.11918088793754578,
0.11336425691843033,
0.011443077586591244,
-0.019201189279556274,
0.06417495757341385,
-0.02510886825621128,
0.07905823737382889,
0.002640943042933941,
0.08440984785556793,
-0.044116757810115814,
0.001760129234753549,
0.05879530310630798,
-0.06965618580579758,
0.15787602961063385,
0.0008371861185878515,
0.11221509426832199,
-0.11763598769903183,
-0.0609772764146328,
-0.022716034203767776,
0.0827808827161789,
-0.004904051776975393,
-0.0015360232209786773,
-0.11295828968286514,
0.10293632745742798,
0.08497212082147598,
0.00794193521142006,
0.05620834603905678,
-0.007198987528681755,
0.032772429287433624,
0.09746144711971283,
0.07561057806015015,
0.12463942170143127,
-0.07685434818267822,
-0.021907441318035126,
0.013936814852058887,
0.07368452101945877,
-0.03911472484469414,
0.027704879641532898,
0.11620832979679108,
-0.011654808185994625,
0.09883899986743927,
0.03241272270679474,
-0.1659480333328247,
0.0023721856996417046,
0.0426466129720211,
-0.11022154241800308,
-0.07337154448032379,
-0.031289052218198776,
0.08844008296728134,
-0.08929746598005295,
0.06776324659585953,
0.13614384829998016,
-0.04106513410806656,
-0.045331139117479324,
0.013717814348638058,
0.06200031936168671,
-0.030341338366270065,
0.09867513179779053,
0.04906841367483139,
-0.021041708067059517,
-0.01359603088349104,
0.16060195863246918,
0.10288862884044647,
-0.09747195988893509,
0.09226562082767487,
0.048181623220443726,
-0.09587999433279037,
-0.05600149184465408,
-0.0375719889998436,
0.1478167623281479,
-0.015315921045839787,
-0.09806753695011139,
-0.04521964490413666,
-0.01621418632566929,
0.008605019189417362,
0.05173170194029808,
0.02545313723385334,
0.09569954127073288,
-0.09620315581560135,
-0.041668351739645004,
-0.1222301721572876,
0.06278282403945923,
0.053287480026483536,
0.022740565240383148,
-0.08858202397823334,
0.08066417276859283,
0.025404036045074463,
0.05821416899561882,
-0.02847653068602085,
-0.057745542377233505,
-0.0028546741232275963,
-0.05048374831676483,
-0.05551808699965477,
0.008844808675348759,
-0.04698900133371353,
-0.0523955412209034,
0.014355354942381382,
0.03348175808787346,
0.006311116274446249,
0.06519830971956253,
-0.034417834132909775,
-0.06479500979185104,
-0.09455416351556778,
0.02404790371656418,
-0.1234942302107811,
-0.0033441430423408747,
-0.017650926485657692,
-0.06656051427125931,
0.10445638746023178,
0.06431932747364044,
-0.007551938761025667,
0.038161877542734146,
-0.054612111300230026,
-0.009942058473825455,
-0.002551644342020154,
0.04285815358161926,
0.003952110186219215,
-0.1107632964849472,
0.02064235880970955,
0.03841736540198326,
-0.07102975249290466,
-0.045422304421663284,
0.09095792472362518,
-0.10038524866104126,
-0.00019992377201560885,
-0.0800604298710823,
0.05626589432358742,
-0.05516422912478447,
0.099174864590168,
-0.006289828568696976,
0.1279432773590088,
0.12579843401908875,
-0.05726150423288345,
0.0645807757973671,
-0.13499557971954346,
-0.01732892543077469,
-0.023086978122591972,
0.02832501009106636,
0.015356895513832569,
-0.05935106426477432,
0.034876007586717606,
-0.0016632016049697995,
0.11346256732940674,
-0.008468794636428356,
-0.014038893394172192,
-0.0048280623741447926,
-0.01626109890639782,
-0.04404280334711075,
0.002297796308994293,
0.07050266116857529,
-0.0030293050222098827,
-0.0437222458422184,
0.08006612211465836,
0.02005109004676342,
0.02973126247525215,
0.0749443843960762,
0.1962384730577469,
0.07374818623065948,
0.08350945264101028,
0.12943924963474274,
-0.010506375692784786,
-0.037801869213581085,
-0.05987704545259476,
0.06606078892946243,
-0.051251593977212906,
0.04236423969268799,
-0.040235746651887894,
0.06730420142412186,
0.14217911660671234,
-0.1304381936788559,
0.08124373853206635,
0.0659584030508995,
-0.03913654759526253,
-0.11958365142345428,
-0.20960325002670288,
-0.05157405510544777,
-0.03223295509815216,
-0.05320688709616661,
-0.08438092470169067,
-0.013037719763815403,
-0.015138661488890648,
0.012529910542070866,
-0.0641079694032669,
0.17827044427394867,
-0.11325450986623764,
-0.09293816238641739,
0.06946027278900146,
0.012756388634443283,
0.024556562304496765,
0.027024293318390846,
0.005542270839214325,
0.048599421977996826,
0.010852399282157421,
0.11240182816982269,
0.030063411220908165,
0.14616920053958893,
0.08038346469402313,
-0.03842683508992195,
-0.02469100058078766,
-0.008285352028906345,
-0.03917648270726204,
0.0015465093310922384,
0.09638580679893494,
0.08590143173933029,
-0.08725091814994812,
-0.016861209645867348,
0.2047450840473175,
-0.03165709599852562,
0.0176790039986372,
-0.13459263741970062,
0.10790051519870758,
0.06816317141056061,
-0.05450477823615074,
-0.01638094335794449,
-0.13806886970996857,
0.016874583438038826,
0.14595292508602142,
0.11429639160633087,
-0.01857924833893776,
-0.009231344796717167,
0.020945625379681587,
0.005008137784898281,
-0.0175175704061985,
0.05993478745222092,
0.02893195115029812,
0.21777772903442383,
-0.011336187832057476,
0.03500223159790039,
-0.041156232357025146,
0.030902927741408348,
-0.04349874332547188,
0.07351479679346085,
-0.0577944852411747,
0.011272836476564407,
-0.03175078704953194,
0.01690823584794998,
-0.021067297086119652,
-0.28183406591415405,
0.02438909001648426,
-0.03942250460386276,
-0.06610462814569473,
-0.01800256222486496,
-0.010972847230732441,
0.012092843651771545,
0.07897806167602539,
-0.0055725486017763615,
0.01698552444577217,
0.14774011075496674,
0.013848434202373028,
-0.06412861496210098,
-0.04203113168478012,
0.07882656157016754,
-0.07901834696531296,
0.20910851657390594,
-0.01757676526904106,
0.013631203211843967,
0.05302494019269943,
0.02417551912367344,
-0.13127250969409943,
0.021331531926989555,
-0.03435301035642624,
0.04773425683379173,
0.031822387129068375,
0.07809334248304367,
0.011085943318903446,
0.07154197990894318,
0.045387815684080124,
0.0546664297580719,
0.04682542756199837,
-0.13554731011390686,
-0.03481484204530716,
-0.0959746465086937,
0.04192216321825981,
-0.06406138837337494,
0.16395075619220734,
0.09291350841522217,
-0.01534055732190609,
0.02756703458726406,
-0.012135971337556839,
0.04685921221971512,
0.01105096098035574,
0.10416079312562943,
0.05264550819993019,
-0.07918011397123337,
0.01665673777461052,
-0.10485595464706421,
0.03931184858083725,
-0.22420799732208252,
-0.037652622908353806,
0.04121518135070801,
-0.08198122680187225,
-0.02767235040664673,
0.08949621766805649,
0.05759381875395775,
0.04024956747889519,
-0.05497034266591072,
0.03823219984769821,
0.015425293706357479,
0.10432315617799759,
-0.15459021925926208,
-0.05429048836231232
] |
null | null | null |
# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model and its training code have been mainly taken from: [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Please check: [Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis](https://arxiv.org/abs/1908.07836).
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used with the **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
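A minimal usage sketch is shown below; it is not taken from the tutorial, and `get_dd_analyzer` together with the `page.get_text()` accessor are assumptions about the framework's high-level API that may differ between versions, so consult the Get_started notebook for the authoritative calls.
```python
# Hedged sketch of a full pipeline run (layout detection, table recognition, OCR).
# `get_dd_analyzer` and `page.get_text()` are assumed names from the high-level
# API; verify them against the Get_started notebook of your installed version.
from deep_doctection.analyzer import get_dd_analyzer

analyzer = get_dd_analyzer()                         # default pipeline, includes this layout model
df = analyzer.analyze(path="/path/to/document.pdf")  # placeholder input path

for page in df:              # the analyzer streams fully processed pages
    print(page.get_text())   # assumed accessor for the OCR text of a page
```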
## How this model was trained
To recreate the training run on the **deep**doctection framework, run:
```python
import os

from deep_doctection.datasets import DatasetRegistry
from deep_doctection.eval import MetricRegistry
from deep_doctection.utils import get_configs_dir_path
from deep_doctection.train import train_faster_rcnn

# Publaynet is registered in the framework and serves as training and validation set.
publaynet = DatasetRegistry.get_dataset("publaynet")

# Tensorpack Cascade-RCNN layout config shipped with the framework.
path_config_yaml = os.path.join(get_configs_dir_path(), "tp/layout/conf_frcnn_layout.yaml")
path_weights = ""  # no initial checkpoint: training starts from scratch

dataset_train = publaynet
config_overwrite = ["TRAIN.STEPS_PER_EPOCH=500", "TRAIN.EVAL_PERIOD=200", "TRAIN.STARTING_EPOCH=1",
                    "PREPROC.TRAIN_SHORT_EDGE_SIZE=[800,1200]", "TRAIN.CHECKPOINT_PERIOD=50",
                    "BACKBONE.FREEZE_AT=0"]  # train the full backbone, no frozen stages
build_train_config = ["max_datapoints=335703"]  # full Publaynet training split

dataset_val = publaynet
build_val_config = ["max_datapoints=2000"]  # evaluate on a 2K-sample subset

# COCO-style mAP metric for periodic evaluation during training.
coco_metric = MetricRegistry.get_metric("coco")

train_faster_rcnn(path_config_yaml=path_config_yaml,
                  dataset_train=dataset_train,
                  path_weights=path_weights,
                  config_overwrite=config_overwrite,
                  log_dir="/path/to/dir",
                  build_train_config=build_train_config,
                  dataset_val=dataset_val,
                  build_val_config=build_val_config,
                  metric=coco_metric,
                  pipeline_component_name="ImageLayoutService")
```
## How to fine-tune this model
To fine-tune this model, please check this [Fine-tune](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Fine_Tune.ipynb) tutorial.
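Conceptually, fine-tuning reuses the `train_faster_rcnn` entry point from the training snippet above, with `path_weights` pointing at this card's checkpoint instead of an empty string. Below is a hypothetical sketch under that assumption; the checkpoint path is a placeholder and the notebook remains the authoritative recipe.
```python
# Hypothetical fine-tuning sketch, reusing the variables defined in the
# training snippet above. Only `path_weights` changes: it now names the
# downloaded checkpoint of this model (placeholder path) so training
# resumes from it instead of starting from scratch.
path_weights = "/path/to/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet/checkpoint"

train_faster_rcnn(path_config_yaml=path_config_yaml,
                  dataset_train=dataset_train,
                  path_weights=path_weights,
                  config_overwrite=config_overwrite,
                  log_dir="/path/to/dir",
                  build_train_config=build_train_config,
                  dataset_val=dataset_val,
                  build_val_config=build_val_config,
                  metric=coco_metric,
                  pipeline_component_name="ImageLayoutService")
```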
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Publaynet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet
|
[
"Tensorflow",
"dataset:Publaynet",
"arxiv:1908.07836",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.07836"
] |
[] |
TAGS
#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us
|
# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model and its training code have been mainly taken from: Tensorpack.
Please check: Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis.
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.
## How this model was trained
To recreate the training run on the deepdoctection framework, run:
## How to fine-tune this model
To fine-tune this model, please check this Fine-tune tutorial.
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
"TAGS\n#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
33,
111,
45,
24,
22
] |
[
"passage: TAGS\n#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
-0.11590853333473206,
0.09498300403356552,
-0.0007088291458785534,
0.06817779690027237,
0.06649432331323624,
-0.03362129256129265,
0.09063306450843811,
0.016235506162047386,
-0.0953846275806427,
0.07847549766302109,
0.022897645831108093,
0.035845011472702026,
0.08202984929084778,
0.17055636644363403,
-0.020176557824015617,
-0.24346235394477844,
-0.008790245279669762,
-0.0008073552744463086,
0.02438233233988285,
0.0609436072409153,
0.09424835443496704,
-0.08749383687973022,
0.044795989990234375,
0.033733174204826355,
-0.06863871216773987,
0.04215167835354805,
-0.02340099588036537,
-0.00465551158413291,
0.04373142123222351,
0.060754209756851196,
0.06044191122055054,
0.04139091819524765,
0.10947113484144211,
-0.05261828005313873,
0.05813102424144745,
0.04187605902552605,
-0.008514861576259136,
0.11370451748371124,
0.1537468433380127,
0.07213462144136429,
0.17968915402889252,
-0.10417204350233078,
-0.008935568854212761,
0.028473157435655594,
-0.020301956683397293,
-0.11313352733850479,
-0.13241326808929443,
0.16812841594219208,
0.09510398656129837,
0.10432557016611099,
-0.009950537234544754,
0.1250782161951065,
-0.03478991985321045,
0.02938252128660679,
0.06933146715164185,
-0.3358953893184662,
-0.07820011675357819,
0.2301880270242691,
0.06061660125851631,
0.05372686684131622,
-0.07306031882762909,
0.007424549665302038,
0.00633219163864851,
0.029591528698801994,
-0.027770167216658592,
-0.0703570544719696,
0.05273336544632912,
-0.02851089835166931,
-0.13428795337677002,
0.031174521893262863,
0.15795603394508362,
0.043260619044303894,
-0.014504462480545044,
-0.12026824057102203,
-0.07066960632801056,
-0.009466593153774738,
-0.0010953975142911077,
-0.09847584366798401,
0.03845445066690445,
0.036383409053087234,
0.10841477662324905,
-0.14536114037036896,
-0.11307904124259949,
-0.011716397479176521,
-0.08761143684387207,
-0.005969353020191193,
0.03195476159453392,
0.025352638214826584,
-0.0719667375087738,
0.09873844683170319,
-0.0440782830119133,
-0.05578009411692619,
0.006435145158320665,
-0.040712300688028336,
-0.12632890045642853,
-0.027524016797542572,
-0.030743731185793877,
-0.1098327711224556,
0.06601212173700333,
0.18347962200641632,
-0.02109980396926403,
0.04500442370772362,
0.0066778091713786125,
0.051311880350112915,
0.05219216272234917,
0.1678282767534256,
-0.14605525135993958,
0.00190968147944659,
0.1423526406288147,
-0.012672508135437965,
0.04582865536212921,
-0.02374352514743805,
-0.09964063763618469,
-0.07643239945173264,
-0.04778315871953964,
0.023485861718654633,
0.06311791390180588,
-0.02589409239590168,
0.020184146240353584,
-0.09899362176656723,
0.17211876809597015,
-0.10049334168434143,
0.05609649419784546,
0.04505107179284096,
-0.06507261097431183,
0.10814625024795532,
0.021415285766124725,
0.013737031258642673,
-0.10466593503952026,
-0.04295671358704567,
-0.026340244337916374,
-0.055278170853853226,
-0.057362060993909836,
-0.02249198779463768,
-0.0055953809060156345,
-0.08865510672330856,
-0.01040076743811369,
-0.09323828667402267,
-0.20067234337329865,
-0.11760083585977554,
0.04847244545817375,
-0.015302776359021664,
-0.06737550348043442,
-0.030707169324159622,
-0.011718759313225746,
-0.024479970335960388,
0.04983486980199814,
0.08767472952604294,
0.0016195208299905062,
-0.04606587067246437,
-0.06231153383851051,
-0.023530425503849983,
-0.09021631628274918,
-0.00312182423658669,
-0.06838472187519073,
0.032778676599264145,
0.005211257375776768,
0.03923407196998596,
0.053629230707883835,
-0.06214264780282974,
-0.08864836394786835,
-0.03240185230970383,
-0.088215671479702,
-0.026795344427227974,
-0.006140111014246941,
0.09567137807607651,
-0.19289512932300568,
0.02899670973420143,
0.03903893381357193,
-0.07254505902528763,
0.041573479771614075,
0.10313685983419418,
-0.02558029070496559,
-0.031150778755545616,
0.03855574503540993,
0.10506819188594818,
0.15923476219177246,
-0.07727531343698502,
-0.017018785700201988,
-0.01700831577181816,
-0.09339696913957596,
-0.0417679101228714,
0.0861511379480362,
-0.008492833003401756,
0.021661164239048958,
-0.011712377890944481,
-0.07662032544612885,
-0.030205655843019485,
-0.026374898850917816,
-0.052558038383722305,
-0.054298605769872665,
-0.04631669819355011,
-0.02567755989730358,
-0.021268852055072784,
0.058650124818086624,
0.039021894335746765,
-0.03999536111950874,
-0.12105334550142288,
0.15488527715206146,
-0.036209188401699066,
-0.030184516683220863,
-0.08449506014585495,
0.16168975830078125,
-0.0868988111615181,
0.03129866346716881,
-0.1789311319589615,
-0.008591976948082447,
0.050802361220121384,
0.07143493741750717,
0.10048026591539383,
0.13079839944839478,
0.04899505898356438,
0.04780236631631851,
0.010437671095132828,
-0.058912456035614014,
0.010789296589791775,
-0.046820759773254395,
-0.008541980758309364,
-0.07465330511331558,
-0.0480484776198864,
-0.06949760019779205,
0.06353549659252167,
-0.32244065403938293,
0.004110564012080431,
-0.11136866360902786,
0.08162571489810944,
0.009056886658072472,
0.008388054557144642,
0.053452324122190475,
-0.01989528350532055,
-0.01253086794167757,
-0.0707196369767189,
0.01564452424645424,
0.04984085261821747,
-0.052961092442274094,
0.0076684048399329185,
-0.08726844191551208,
0.023105505853891373,
0.021953655406832695,
0.08722902834415436,
-0.02031813934445381,
-0.03971606865525246,
-0.056548863649368286,
-0.018486909568309784,
-0.07090093195438385,
0.09879403561353683,
0.2543604075908661,
0.004898392129689455,
0.027146687731146812,
-0.07330071181058884,
0.05671960115432739,
-0.011961497366428375,
-0.03193286806344986,
0.0114225372672081,
0.1086917594075203,
0.0118166608735919,
-0.17508424818515778,
-0.014305991120636463,
0.04410412162542343,
-0.06099911779165268,
0.0932588279247284,
-0.023148899897933006,
-0.06247582659125328,
-0.05088255554437637,
0.06344921141862869,
0.01193470973521471,
0.036250755190849304,
0.1436316967010498,
0.07230356335639954,
0.03017309494316578,
0.008249138481914997,
0.0718916803598404,
-0.06936517357826233,
0.03325361758470535,
0.035061221569776535,
-0.02816474810242653,
0.10307081043720245,
0.04122316837310791,
-0.003972969483584166,
0.04686662182211876,
-0.03290436416864395,
-0.02549716643989086,
-0.00034761620918288827,
-0.018693186342716217,
-0.10882396996021271,
0.1583888828754425,
-0.0547519214451313,
-0.11011172831058502,
-0.1180739477276802,
0.10804606229066849,
-0.07314477115869522,
-0.0020392900332808495,
-0.05222691223025322,
-0.007794261444360018,
-0.045732174068689346,
-0.0783727765083313,
0.00602558022364974,
0.06954316794872284,
-0.029654810205101967,
-0.04109223559498787,
-0.04426681622862816,
-0.011665149591863155,
-0.07880280911922455,
0.011370607651770115,
-0.00824968982487917,
-0.14814957976341248,
-0.011361109092831612,
-0.0038133070338517427,
0.08484293520450592,
0.15930554270744324,
-0.03604379668831825,
0.018109377473592758,
0.08070153743028641,
0.05954670533537865,
-0.07062286138534546,
0.08973229676485062,
0.21935391426086426,
0.060686562210321426,
-0.003130178200080991,
0.047848522663116455,
-0.004379054997116327,
-0.0029798741452395916,
0.0086491284891963,
0.011359415017068386,
-0.02793251723051071,
-0.20328712463378906,
-0.08439385890960693,
-0.05401674285531044,
-0.04110126197338104,
0.044612959027290344,
0.1020725890994072,
-0.03300588205456734,
0.04228676110506058,
0.01693660020828247,
-0.016860373318195343,
0.02653435617685318,
-0.011332491412758827,
0.04350600764155388,
0.0421614907681942,
0.0053993649780750275,
-0.07431457936763763,
0.018308864906430244,
0.13591431081295013,
0.0312788300216198,
0.12426140904426575,
0.015352463349699974,
0.11796116083860397,
0.005141371861100197,
0.08241520822048187,
0.06038743257522583,
0.09163285791873932,
0.02012547105550766,
-0.02126896008849144,
0.03405631333589554,
-0.05633468180894852,
0.06966033577919006,
0.023260969668626785,
0.06301508843898773,
0.020135851576924324,
-0.03438669443130493,
0.09470620006322861,
-0.020637551322579384,
0.11813773959875107,
-0.004757160320878029,
-0.21629777550697327,
-0.03647318482398987,
-0.04916367679834366,
0.031140748411417007,
-0.05641914904117584,
0.033370550721883774,
0.10598210245370865,
-0.12513743340969086,
-0.029265567660331726,
-0.023486556485295296,
0.11132865399122238,
-0.10714396834373474,
-0.020434558391571045,
-0.058245375752449036,
0.1617565155029297,
0.031209561973810196,
0.08170867711305618,
-0.0709221288561821,
0.0458938367664814,
0.01473059132695198,
0.12599535286426544,
-0.06122085824608803,
-0.016023483127355576,
0.06978318095207214,
0.11846145242452621,
0.12451416999101639,
-0.008581588044762611,
-0.07264812290668488,
0.03685172274708748,
-0.17677776515483856,
0.037345241755247116,
-0.02612576261162758,
-0.031160589307546616,
0.009423685260117054,
0.009062321856617928,
-0.036259133368730545,
-0.04239653795957565,
0.05632196366786957,
-0.15156789124011993,
-0.10085563361644745,
-0.00722958380356431,
0.06790408492088318,
-0.09268470853567123,
-0.06457027047872543,
-0.04170180857181549,
0.02179991453886032,
0.18862426280975342,
0.011916844174265862,
0.0009079158189706504,
-0.09645340591669083,
0.05628463998436928,
0.1311779022216797,
-0.07183616608381271,
-0.007222570013254881,
-0.002752680331468582,
0.11764030903577805,
-0.05186241492629051,
-0.1542721539735794,
0.015000496059656143,
-0.06292466074228287,
-0.09366602450609207,
0.012009681202471256,
0.05430871620774269,
0.09994297474622726,
0.023855550214648247,
0.07342419773340225,
0.039252642542123795,
-0.051686882972717285,
-0.14267376065254211,
-0.021916834637522697,
0.08821730315685272,
0.019954947754740715,
0.04443693533539772,
-0.04836311936378479,
-0.14331981539726257,
-0.009223016910254955,
0.05142996832728386,
0.1084015890955925,
0.11620022356510162,
-0.08627232164144516,
0.016007110476493835,
0.11918950080871582,
-0.06033238768577576,
-0.16406609117984772,
-0.03420514613389969,
0.01348976232111454,
0.1380968689918518,
0.007156095001846552,
-0.1698511838912964,
0.16317486763000488,
0.0854472890496254,
-0.029507847502827644,
0.12543407082557678,
-0.34434518218040466,
-0.11187431961297989,
0.20029500126838684,
0.018658237531781197,
0.034824926406145096,
-0.08082278817892075,
-0.06277246028184891,
0.0010790681699290872,
0.025232939049601555,
0.1260155588388443,
-0.23162461817264557,
0.08966106176376343,
-0.03525547310709953,
-0.035452548414468765,
0.01052696444094181,
-0.03112107515335083,
0.08297643810510635,
-0.04030061140656471,
0.09552908688783646,
-0.08907664567232132,
0.06592772156000137,
0.01669042371213436,
-0.025999773293733597,
0.1354788839817047,
0.0135463522747159,
0.11892785131931305,
-0.1507260650396347,
-0.055164627730846405,
-0.06761913746595383,
0.07235738635063171,
-0.024502862244844437,
-0.010606879368424416,
-0.10237601399421692,
0.0916593000292778,
0.03614458069205284,
0.02731049619615078,
0.028722360730171204,
0.00832574162632227,
-0.01947961561381817,
0.12012895196676254,
0.08503247052431107,
0.04814508557319641,
-0.08228509873151779,
-0.06267011910676956,
0.004248719196766615,
0.054065950214862823,
0.04870201647281647,
-0.039149463176727295,
0.14891792833805084,
0.0017250953242182732,
0.0608239583671093,
0.014649429358541965,
-0.0995369479060173,
-0.03199058026075363,
0.01992015540599823,
-0.11428368836641312,
-0.1604790985584259,
-0.05661609396338463,
0.0575849749147892,
-0.01985849067568779,
-0.014442211017012596,
0.08654936403036118,
-0.056935664266347885,
-0.02179398387670517,
0.047682762145996094,
0.028368880972266197,
-0.0767766460776329,
0.13046738505363464,
0.06432928144931793,
0.012203914113342762,
-0.03323287516832352,
0.09628598392009735,
0.1329367756843567,
-0.01484820805490017,
-0.02146470732986927,
0.16973257064819336,
-0.06815619766712189,
-0.06399686634540558,
0.12207832932472229,
0.18453000485897064,
-0.034800995141267776,
-0.1206563413143158,
-0.04654192179441452,
-0.05058282986283302,
0.029008297249674797,
-0.029609760269522667,
0.048793915659189224,
0.0018633621511980891,
-0.06746140122413635,
-0.10721138119697571,
-0.08321204036474228,
-0.000795298139564693,
0.10110393166542053,
0.0013633165508508682,
-0.11871914565563202,
-0.03893150016665459,
-0.02413378469645977,
0.049253806471824646,
-0.05144314467906952,
-0.024269646033644676,
-0.05809629708528519,
-0.03478291258215904,
-0.09345796704292297,
0.062035344541072845,
-0.05843909829854965,
-0.012919164262712002,
-0.06200044974684715,
0.020262010395526886,
-0.02127249352633953,
0.04413532093167305,
-0.017485598102211952,
-0.04786117002367973,
-0.06938830018043518,
0.03399330750107765,
-0.09848073124885559,
0.02348465472459793,
-0.03046131134033203,
-0.02204674668610096,
0.0923319160938263,
0.03420674055814743,
-0.02210817113518715,
-0.0006052443059161305,
-0.1913689225912094,
-0.04208603501319885,
-0.01175775472074747,
0.030670208856463432,
0.02035488933324814,
-0.1483531892299652,
0.01808982715010643,
0.06697452068328857,
-0.07605955749750137,
-0.047503191977739334,
0.05804911628365517,
-0.09530285000801086,
-0.029254043474793434,
-0.12573382258415222,
0.011779570020735264,
-0.07861804962158203,
0.04710819199681282,
0.03801373392343521,
0.13315825164318085,
0.1258775144815445,
-0.07355934381484985,
0.07883371412754059,
-0.12026573717594147,
-0.02252882719039917,
-0.005793470423668623,
0.026431923732161522,
0.05025269091129303,
-0.01826147362589836,
0.03815966472029686,
-0.013924025930464268,
0.07434254884719849,
-0.07874225825071335,
-0.07651328295469284,
-0.020187675952911377,
0.07262246310710907,
-0.022238990291953087,
0.02892650105059147,
0.10466286540031433,
0.010754749178886414,
-0.006712269503623247,
0.08080683648586273,
0.06426896154880524,
0.017678068950772285,
0.15980148315429688,
0.21693171560764313,
0.0777926966547966,
0.0026634116657078266,
0.10876043140888214,
0.05661015957593918,
-0.08370647579431534,
0.010286024771630764,
0.0923905149102211,
-0.06742656975984573,
0.046272147446870804,
-0.08522237092256546,
-0.05983326584100723,
0.20607349276542664,
-0.12455089390277863,
0.015357510186731815,
0.04065708443522453,
-0.06515155732631683,
-0.11891121417284012,
-0.29264193773269653,
-0.10438656061887741,
0.0005597032723017037,
-0.027594387531280518,
-0.08003820478916168,
-0.02459876425564289,
-0.042685721069574356,
0.05448513850569725,
-0.0888325572013855,
0.1535150110721588,
-0.06904558837413788,
-0.04353098198771477,
0.07353519648313522,
-0.03525719791650772,
0.009672806598246098,
0.02323017083108425,
-0.004001608118414879,
0.032133638858795166,
-0.0570465512573719,
0.07278814166784286,
0.028342004865407944,
0.11232557892799377,
0.08882708102464676,
-0.0330149307847023,
-0.012427370063960552,
-0.009188206866383553,
-0.016230948269367218,
0.06538762152194977,
0.12284978479146957,
0.06367861479520798,
-0.07357942312955856,
-0.02102389559149742,
0.17407166957855225,
0.008651752024888992,
-0.06896243244409561,
-0.10276199132204056,
0.054793164134025574,
-0.02739896811544895,
-0.06757710874080658,
0.010127360932528973,
-0.09957322478294373,
-0.005196442361921072,
0.15927396714687347,
0.24681665003299713,
-0.04251110181212425,
-0.00043580791680142283,
0.027972031384706497,
-0.0010430586989969015,
-0.0539366789162159,
0.07030591368675232,
0.02368132770061493,
0.2052496373653412,
-0.04326208680868149,
-0.007272885646671057,
-0.08404900878667831,
-0.005877990275621414,
-0.10535552352666855,
0.024952583014965057,
0.02860860899090767,
0.005244558677077293,
0.02131657674908638,
0.10509996861219406,
-0.11911986023187637,
-0.09402026981115341,
-0.03938090056180954,
-0.02256838232278824,
-0.12611311674118042,
-0.07691418379545212,
-0.07929887622594833,
-0.007129892706871033,
0.06272857636213303,
-0.06062085181474686,
0.027761733159422874,
-0.018807511776685715,
-0.023181429132819176,
-0.030591826885938644,
-0.01830834336578846,
0.050569891929626465,
0.01886868290603161,
0.17796418070793152,
0.022559301927685738,
0.02920793555676937,
0.07590692490339279,
0.011886601336300373,
-0.16324059665203094,
0.08191487193107605,
-0.04073086380958557,
0.009285888634622097,
0.031131042167544365,
0.04961513727903366,
-0.0020545513834804296,
-0.007373080123215914,
0.007908424362540245,
-0.02817133627831936,
0.025380177423357964,
-0.09081053733825684,
0.049171917140483856,
-0.12372167408466339,
0.11690598726272583,
-0.1041441410779953,
0.1599993258714676,
0.09617836773395538,
-0.012198991142213345,
0.020647253841161728,
-0.045448411256074905,
0.10491969436407089,
0.015400567092001438,
0.038350578397512436,
0.016419952735304832,
-0.06482554227113724,
0.030581550672650337,
-0.08998221904039383,
0.013851040042936802,
-0.13106001913547516,
-0.040266942232847214,
-0.07053001970052719,
-0.07665407657623291,
-0.004634660203009844,
0.056018877774477005,
0.053312867879867554,
0.04182944819331169,
-0.017909439280629158,
-0.06809829920530319,
-0.0006270275916904211,
0.00041846930980682373,
-0.1614784300327301,
-0.060627881437540054
] |
null | null | null |
# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model and its training code have been mainly taken from: [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Please check: [Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis](https://arxiv.org/abs/1908.07836).
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used with the **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
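If the installed release registers this checkpoint in its model catalog, the weights can also be resolved programmatically. The sketch below is an assumption: `ModelCatalog`, `ModelDownloadManager` and the catalog key are hypothetical names modeled on deepdoctection's public API and may be absent or named differently in your version.
```python
# Hedged sketch: resolving this card's weights through the framework's model
# registry. `ModelCatalog`, `ModelDownloadManager` and the catalog key below
# are assumptions and must be checked against the installed release.
from deep_doctection.extern.model import ModelCatalog, ModelDownloadManager

weights = "layout/model-800000_inf_only.data-00000-of-00001"  # hypothetical catalog key
ModelDownloadManager.maybe_download_weights_and_configs(weights)
print(ModelCatalog.get_full_path_weights(weights))  # local path after download
```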
## This is an inference model only
To reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please check [this model](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet).
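As a sanity check (not part of the original card), TensorFlow's checkpoint reader can list what the stripped file still contains; an inference-only checkpoint should hold model weights but no optimizer slot variables (e.g. Momentum accumulators) or counters such as `global_step`.
```python
# Sketch: inspect the variables stored in the inference-only checkpoint.
# Optimizer slots and training counters were removed, so only model weights
# should be listed. The checkpoint path is a placeholder.
import tensorflow as tf

reader = tf.train.load_checkpoint("/path/to/inference_only_checkpoint")
shape_map = reader.get_variable_to_shape_map()
for name in sorted(shape_map):
    print(name, shape_map[name])
```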
## How this model was trained
To recreate the training run on the **deep**doctection framework, run:
```python
import os

from deep_doctection.datasets import DatasetRegistry
from deep_doctection.eval import MetricRegistry
from deep_doctection.utils import get_configs_dir_path
from deep_doctection.train import train_faster_rcnn

# Publaynet is registered in the framework and serves as training and validation set.
publaynet = DatasetRegistry.get_dataset("publaynet")

# Tensorpack Cascade-RCNN layout config shipped with the framework.
path_config_yaml = os.path.join(get_configs_dir_path(), "tp/layout/conf_frcnn_layout.yaml")
path_weights = ""  # no initial checkpoint: training starts from scratch

dataset_train = publaynet
config_overwrite = ["TRAIN.STEPS_PER_EPOCH=500", "TRAIN.EVAL_PERIOD=200", "TRAIN.STARTING_EPOCH=1",
                    "PREPROC.TRAIN_SHORT_EDGE_SIZE=[800,1200]", "TRAIN.CHECKPOINT_PERIOD=50",
                    "BACKBONE.FREEZE_AT=0"]  # train the full backbone, no frozen stages
build_train_config = ["max_datapoints=335703"]  # full Publaynet training split

dataset_val = publaynet
build_val_config = ["max_datapoints=2000"]  # evaluate on a 2K-sample subset

# COCO-style mAP metric for periodic evaluation during training.
coco_metric = MetricRegistry.get_metric("coco")

train_faster_rcnn(path_config_yaml=path_config_yaml,
                  dataset_train=dataset_train,
                  path_weights=path_weights,
                  config_overwrite=config_overwrite,
                  log_dir="/path/to/dir",
                  build_train_config=build_train_config,
                  dataset_val=dataset_val,
                  build_val_config=build_val_config,
                  metric=coco_metric,
                  pipeline_component_name="ImageLayoutService")
```
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Publaynet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_publaynet_inference_only
|
[
"Tensorflow",
"dataset:Publaynet",
"arxiv:1908.07836",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1908.07836"
] |
[] |
TAGS
#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us
|
# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis
The model and its training code have been mainly taken from: Tensorpack.
Please check: Xu Zhong et al. - PubLayNet: largest dataset ever for document layout analysis.
This model is different from the model used in the paper.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine-tune this model, please check this model.
## How this model was trained
To recreate the training run on the deepdoctection framework, run:
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
"TAGS\n#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
33,
111,
45,
50,
24
] |
[
"passage: TAGS\n#Tensorflow #dataset-Publaynet #arxiv-1908.07836 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Publaynet for Document Layout Analysis\n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nPlease check: Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis. \n\nThis model is different from the model used the paper. \n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
-0.07992123812437057,
0.0879099890589714,
-0.0009542672778479755,
0.08442889153957367,
0.07757414877414703,
-0.005030848551541567,
0.05960748344659805,
0.055472370237112045,
-0.0019959418568760157,
0.13048473000526428,
0.0269180741161108,
0.05283697694540024,
0.093532033264637,
0.08972115069627762,
-0.04877869039773941,
-0.17403730750083923,
-0.0045628417283296585,
0.0004391807597130537,
0.09600663185119629,
0.06242476403713226,
0.0750536397099495,
-0.08764327317476273,
0.04767661541700363,
-0.00315386475995183,
-0.04198082908987999,
0.05874794349074364,
0.00580449216067791,
-0.019981134682893753,
0.03992311656475067,
0.09085017442703247,
0.04344712942838669,
0.018720846623182297,
0.048874665051698685,
-0.1070990189909935,
0.043975748121738434,
0.06616919487714767,
-0.017471078783273697,
0.08886057138442993,
0.11365175992250443,
0.01645643450319767,
0.09563855826854706,
-0.06857278943061829,
0.020998740568757057,
0.06068376451730728,
-0.04674766957759857,
-0.09562595933675766,
-0.12416688352823257,
0.15674418210983276,
0.031149564310908318,
0.13975194096565247,
-0.01725364848971367,
0.16597892343997955,
-0.001945953699760139,
0.021476516500115395,
0.07540798932313919,
-0.28210169076919556,
-0.031064322218298912,
0.18576844036579132,
0.049260277301073074,
0.09108255803585052,
-0.05821298062801361,
0.018619243055582047,
0.01512272097170353,
0.03412569314241409,
-0.012564283795654774,
-0.07231105864048004,
-0.02219559997320175,
-0.056856006383895874,
-0.10135933756828308,
-0.007607343606650829,
0.14497627317905426,
0.01831572689116001,
-0.02444404922425747,
-0.10261311382055283,
-0.07531416416168213,
0.0030693986918777227,
0.008268171921372414,
-0.08746249973773956,
0.029223492369055748,
0.06556715816259384,
0.1516856551170349,
-0.1669037789106369,
-0.10480690002441406,
0.008087247610092163,
-0.07256799191236496,
-0.004481676500290632,
0.028958065435290337,
0.04817340150475502,
-0.07500489801168442,
0.09724390506744385,
-0.13816343247890472,
-0.05236285552382469,
-0.02391507290303707,
-0.07448705285787582,
-0.11330683529376984,
-0.01726982370018959,
-0.01761574298143387,
-0.10215426981449127,
0.021375169977545738,
0.18080811202526093,
-0.015291360206902027,
0.046714723110198975,
-0.034626610577106476,
0.05640638247132301,
0.02231699787080288,
0.1216607391834259,
-0.13752613961696625,
-0.008577752858400345,
0.14845146238803864,
-0.011845722794532776,
0.08460607379674911,
-0.03892672434449196,
-0.09628307074308395,
-0.044443339109420776,
-0.05829213932156563,
0.0489351786673069,
0.07780702412128448,
-0.030157431960105896,
-0.02103082835674286,
-0.09702043235301971,
0.13538549840450287,
-0.07923174649477005,
0.033749811351299286,
0.051537178456783295,
-0.050492063164711,
0.14265576004981995,
0.05157027766108513,
0.03258577734231949,
-0.09903200715780258,
-0.06878432631492615,
-0.04181194677948952,
-0.0001807644439395517,
-0.05038699880242348,
-0.05844510346651077,
-0.00015271581651177257,
-0.09225382655858994,
-0.00843773689121008,
-0.14539368450641632,
-0.22697192430496216,
-0.09857579320669174,
0.019913561642169952,
-0.04850936308503151,
0.00677909841760993,
0.008998462930321693,
-0.017819659784436226,
-0.05107559263706207,
0.03439750894904137,
0.024616742506623268,
0.011631718836724758,
-0.01037587970495224,
-0.03533779829740524,
-0.029299404472112656,
-0.1315356194972992,
0.013545433059334755,
-0.11559993773698807,
0.026359738782048225,
-0.022350357845425606,
0.05170119181275368,
0.053955014795064926,
-0.026448028162121773,
-0.08563201129436493,
-0.014555033296346664,
-0.07336260378360748,
-0.023135609924793243,
0.018570194020867348,
0.10214129835367203,
-0.20160646736621857,
0.032420847564935684,
0.0004743461322505027,
-0.114247627556324,
0.009366612881422043,
0.11727394163608551,
-0.02656807377934456,
-0.0009712667088024318,
0.04781682789325714,
0.1372651308774948,
0.11007489264011383,
-0.043675389140844345,
-0.05879589170217514,
-0.05996747314929962,
-0.02852248214185238,
0.020879076793789864,
0.05334815755486488,
-0.021987222135066986,
-0.008002067916095257,
0.014967234805226326,
-0.03786647319793701,
-0.05551460012793541,
-0.012755476869642735,
-0.06756381690502167,
-0.06149221211671829,
-0.041905246675014496,
-0.01772204414010048,
0.016471857205033302,
0.04439524933695793,
0.04670447111129761,
-0.07364551723003387,
-0.11564724892377853,
0.11816221475601196,
-0.044703416526317596,
-0.004966790787875652,
-0.059573329985141754,
0.18885178864002228,
-0.11591017246246338,
0.03371842950582504,
-0.1737372726202011,
0.004701282829046249,
0.07047424465417862,
0.0517379455268383,
0.08907090127468109,
0.12165726721286774,
0.033374495804309845,
0.04705497622489929,
-0.007473774254322052,
-0.05108893662691116,
0.011073325760662556,
-0.01529610250145197,
-0.047447532415390015,
-0.06273864209651947,
-0.052491191774606705,
-0.06607149541378021,
0.047249678522348404,
-0.31629639863967896,
0.011815145611763,
-0.1013772040605545,
0.08684970438480377,
-0.001969730481505394,
-0.010687999427318573,
0.047561220824718475,
-0.007034494075924158,
-0.05527222901582718,
-0.07528544217348099,
0.032232705503702164,
0.014377160929143429,
-0.06491483747959137,
-0.012383980676531792,
-0.18075977265834808,
-0.06969739496707916,
0.01629744842648506,
0.0346488393843174,
-0.04380378499627113,
-0.03344067558646202,
-0.024199260398745537,
-0.013095920905470848,
-0.060812026262283325,
0.027220265939831734,
0.23828522861003876,
0.002521534450352192,
0.049219511449337006,
-0.06417299062013626,
0.052177414298057556,
0.013797434978187084,
-0.03829294443130493,
0.018768036738038063,
0.07416436821222305,
0.014084650203585625,
-0.12225141376256943,
0.031467463821172714,
0.042661454528570175,
-0.05032385140657425,
0.10096678882837296,
0.02545573003590107,
-0.05536268651485443,
-0.05203529819846153,
0.05782495066523552,
0.009421436116099358,
0.05609801784157753,
0.09944868087768555,
0.09390048682689667,
0.0468294732272625,
0.037717342376708984,
0.04332839325070381,
-0.038255926221609116,
0.041671279817819595,
0.04003704711794853,
0.013817493803799152,
0.10462255775928497,
0.012998261488974094,
0.009640073403716087,
0.054004471749067307,
-0.053125765174627304,
0.04954339191317558,
0.028150534257292747,
-0.021766455844044685,
-0.12079962342977524,
0.17133180797100067,
-0.054491739720106125,
-0.1698518991470337,
-0.07028277218341827,
0.1164230927824974,
-0.09011282026767731,
-0.00833080243319273,
0.011692589148879051,
0.005115274339914322,
-0.06725342571735382,
-0.1199318915605545,
-0.015746191143989563,
0.04002172872424126,
-0.04559827595949173,
-0.11916070431470871,
-0.06188950687646866,
-0.013712246902287006,
-0.0692954733967781,
0.007182188797742128,
-0.029825765639543533,
-0.13897313177585602,
-0.00003824499071924947,
0.014942541718482971,
0.07950204610824585,
0.134095698595047,
-0.03420337289571762,
0.015056655742228031,
0.045786403119564056,
0.02975020930171013,
-0.02785145677626133,
0.09971990436315536,
0.2361416518688202,
0.044984523206949234,
-0.0031642536632716656,
0.07733394205570221,
-0.032625358551740646,
-0.033109862357378006,
-0.005185001529753208,
0.03442232683300972,
-0.057072918862104416,
-0.22097736597061157,
-0.09649448841810226,
-0.06583397835493088,
-0.05380529165267944,
0.03836653381586075,
0.07590862363576889,
0.03453165292739868,
0.059619586914777756,
…remaining values of a 768-dimensional embedding vector omitted…
] |
null | null | null |
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Regarding the dataset, please check [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained to detect cells in tables. Note that the dataset contains tables only; a table detection step is therefore required before
cells can be detected.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used within a full **deep**doctection pipeline, along with table recognition and OCR. Check the general instructions in this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
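Below is a minimal usage sketch. It is an illustration rather than verbatim **deep**doctection API: the import path `deep_doctection.analyzer`, the factory `get_dd_analyzer` and the `page.tables` / `table.html` accessors are assumptions that may differ between versions.
```python
>>> from deep_doctection.analyzer import get_dd_analyzer  # import path assumed, may differ by version

analyzer = get_dd_analyzer()                  # default pipeline: layout detection, table recognition, OCR
df = analyzer.analyze(path="/path/to/scans")  # directory of document images or a PDF
df.reset_state()                              # some versions require this before iterating
for page in df:
    for table in page.tables:                 # tables whose cells were detected by this model
        print(table.html)                     # table structure serialized as HTML
```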
## How this model was trained
To recreate the training run within the **deep**doctection framework, run:
```python
>>> import os
>>> from deep_doctection.datasets import DatasetRegistry
>>> from deep_doctection.eval import MetricRegistry
>>> from deep_doctection.utils import get_configs_dir_path
>>> from deep_doctection.train import train_faster_rcnn
pubtabnet = DatasetRegistry.get_dataset("pubtabnet")
pubtabnet.dataflow.categories.filter_categories(categories="CELL")  # keep only the cell annotations
path_config_yaml=os.path.join(get_configs_dir_path(),"tp/cell/conf_frcnn_cell.yaml")
path_weights = ""  # no initial checkpoint given
dataset_train = pubtabnet
config_overwrite=["TRAIN.STEPS_PER_EPOCH=500","TRAIN.STARTING_EPOCH=1",
                  "TRAIN.CHECKPOINT_PERIOD=50","BACKBONE.FREEZE_AT=0",   # train the full backbone, no frozen stages
                  "PREPROC.TRAIN_SHORT_EDGE_SIZE=[200,600]"]             # multi-scale training: short edge sampled from [200,600]
build_train_config=["max_datapoints=500000"]  # cap the number of training samples
dataset_val = pubtabnet
build_val_config = ["max_datapoints=4000"]
coco_metric = MetricRegistry.get_metric("coco")
coco_metric.set_params(max_detections=[50,200,600], area_range=[[0,1000000],[0,200],[200,800],[800,1000000]])  # raised caps and area ranges for the many small cells per table
train_faster_rcnn(path_config_yaml=path_config_yaml,
dataset_train=dataset_train,
path_weights=path_weights,
config_overwrite=config_overwrite,
log_dir="/path/to/dir",
build_train_config=build_train_config,
dataset_val=dataset_val,
build_val_config=build_val_config,
metric=coco_metric,
pipeline_component_name="ImageLayoutService"
)
```
## How to fine-tune this model
To fine-tune this model, please check the [Fine-tune](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Fine_Tune.ipynb) tutorial.
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_c
|
[
"Tensorflow",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from Tensorpack.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained to detect cells in tables. Note that the dataset contains tables only; a table detection step is therefore required before
cells can be detected.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used within a full deepdoctection pipeline, along with table recognition and OCR. Check the general instructions in the Get_started tutorial.
## How this model was trained
To recreate the training run within the deepdoctection framework, run:
## How to fine-tune this model
To fine-tune this model, please check the Fine-tune tutorial.
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
"TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
34,
156,
45,
24,
22
] |
[
"passage: TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
…768-dimensional embedding vector, values omitted…
] |
null | null | null |
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Regarding the dataset, please check [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained to detect cells in tables. Note that the dataset contains tables only; a table detection step is therefore required before
cells can be detected.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used within a full **deep**doctection pipeline, along with table recognition and OCR. Check the general instructions in this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
## This is an inference model only
To reduce the size of the checkpoint, all variables that are not necessary for inference have been removed. The checkpoint can therefore not be used for fine-tuning. To fine-tune this model, please check this [model](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_c). A sketch of the idea behind the stripping follows.
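The stripping script itself is not part of this card. The following is a minimal sketch of the idea for a TF1-style (Tensorpack) checkpoint; the name patterns used to recognize optimizer slots (`Momentum`, `global_step`, `learning_rate`) are assumptions about what the training checkpoint contains.
```python
import tensorflow.compat.v1 as tf  # Tensorpack models use TF1-style checkpoints

tf.disable_eager_execution()
reader = tf.train.NewCheckpointReader("/path/to/trained_model.ckpt")
var_shapes = reader.get_variable_to_shape_map()

# keep only weights needed for inference; the skip patterns are assumptions
skip_patterns = ("Momentum", "global_step", "learning_rate")
keep = [name for name in var_shapes
        if not any(p in name for p in skip_patterns)]

with tf.Graph().as_default(), tf.Session() as sess:
    for name in keep:
        tf.Variable(initial_value=reader.get_tensor(name), name=name)  # rebuild variable under its original name
    sess.run(tf.global_variables_initializer())
    tf.train.Saver().save(sess, "/path/to/inference_only.ckpt")
```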
## How this model was trained
To recreate the training run within the **deep**doctection framework, run:
```python
>>> import os
>>> from deep_doctection.datasets import DatasetRegistry
>>> from deep_doctection.eval import MetricRegistry
>>> from deep_doctection.utils import get_configs_dir_path
>>> from deep_doctection.train import train_faster_rcnn
pubtabnet = DatasetRegistry.get_dataset("pubtabnet")
pubtabnet.dataflow.categories.filter_categories(categories="CELL")
path_config_yaml=os.path.join(get_configs_dir_path(),"tp/cell/conf_frcnn_cell.yaml")
path_weights = ""
dataset_train = pubtabnet
config_overwrite=["TRAIN.STEPS_PER_EPOCH=500","TRAIN.STARTING_EPOCH=1",
"TRAIN.CHECKPOINT_PERIOD=50","BACKBONE.FREEZE_AT=0", "PREPROC.TRAIN_SHORT_EDGE_SIZE=[200,600]"]
build_train_config=["max_datapoints=500000"]
dataset_val = pubtabnet
build_val_config = ["max_datapoints=4000"]
coco_metric = MetricRegistry.get_metric("coco")
coco_metric.set_params(max_detections=[50,200,600], area_range=[[0,1000000],[0,200],[200,800],[800,1000000]])
train_faster_rcnn(path_config_yaml=path_config_yaml,
dataset_train=dataset_train,
path_weights=path_weights,
config_overwrite=config_overwrite,
log_dir="/path/to/dir",
build_train_config=build_train_config,
dataset_val=dataset_val,
build_val_config=build_val_config,
metric=coco_metric,
pipeline_component_name="ImageLayoutService"
)
```
## How to fine-tune this model
To fine-tune this model, please check the [Fine-tune](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Fine_Tune.ipynb) tutorial.
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_c_inference_only
|
[
"Tensorflow",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from Tensorpack.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained to detect cells in tables. Note that the dataset contains tables only; a table detection step is therefore required before
cells can be detected.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used within a full deepdoctection pipeline, along with table recognition and OCR. Check the general instructions in the Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint, all variables that are not necessary for inference have been removed. The checkpoint can therefore not be used for fine-tuning. To fine-tune this model, please check this model.
## How this model was trained
To recreate the training run within the deepdoctection framework, run:
## How to fine-tune this model
To fine-tune this model, please check the Fine-tune tutorial.
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model .",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
"TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model .",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
34,
156,
45,
51,
24,
22
] |
[
"passage: TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting cells from tables. Note, that the datasets contains tables only. Therefore, it is required to perform a table detection task before \ndetecting cells.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model .## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:## How to fine-tune this model\n\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
…768-dimensional embedding vector, values omitted…
] |
null | null | null |
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Regarding the dataset, please check [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained to detect rows and columns in tables. As row and column bounding boxes are not a priori part of the annotations, they are
derived from the bounding boxes of the cells and the intrinsic structure of the enclosed HTML, as sketched below.
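The derivation happens inside the dataset mapping of **deep**doctection; the following is a minimal standalone sketch of the idea, with a hypothetical cell representation (`row`/`col` indices taken from the HTML structure, `bbox` in (x1, y1, x2, y2) format).
```python
from collections import defaultdict

def derive_row_col_boxes(cells):
    """cells: list of dicts {"row": int, "col": int, "bbox": (x1, y1, x2, y2)},
    with row/col indices taken from the enclosed HTML table structure."""
    rows, cols = defaultdict(list), defaultdict(list)
    for cell in cells:
        rows[cell["row"]].append(cell["bbox"])
        cols[cell["col"]].append(cell["bbox"])

    def union(boxes):
        # smallest axis-aligned box enclosing all given boxes
        return (min(b[0] for b in boxes), min(b[1] for b in boxes),
                max(b[2] for b in boxes), max(b[3] for b in boxes))

    return ({r: union(bs) for r, bs in rows.items()},
            {c: union(bs) for c, bs in cols.items()})
```
A cell spanning several rows or columns would simply contribute its box to every index it covers.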
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used within a full **deep**doctection pipeline, along with table recognition and OCR. Check the general instructions in this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
## How this model was trained
To recreate the training run within the **deep**doctection framework, run:
```python
>>> import os
>>> from deep_doctection.datasets import DatasetRegistry
>>> from deep_doctection.eval import MetricRegistry
>>> from deep_doctection.utils import get_configs_dir_path
>>> from deep_doctection.train import train_faster_rcnn
pubtabnet = DatasetRegistry.get_dataset("pubtabnet")
pubtabnet.dataflow.categories.set_cat_to_sub_cat({"ITEM":"row_col"})  # re-label ITEM annotations by their row/column sub-category
pubtabnet.dataflow.categories.filter_categories(categories=["ROW","COLUMN"])  # keep only row and column annotations
path_config_yaml=os.path.join(get_configs_dir_path(),"tp/rows/conf_frcnn_rows.yaml")
path_weights = ""  # no initial checkpoint given
dataset_train = pubtabnet
config_overwrite=["TRAIN.STEPS_PER_EPOCH=500","TRAIN.STARTING_EPOCH=1", "TRAIN.CHECKPOINT_PERIOD=50"]
build_train_config=["max_datapoints=500000","rows_and_cols=True"]  # build the dataflow with derived row/column boxes
dataset_val = pubtabnet
build_val_config = ["max_datapoints=2000","rows_and_cols=True"]
coco_metric = MetricRegistry.get_metric("coco")
coco_metric.set_params(max_detections=[50,200,600], area_range=[[0,1000000],[0,200],[200,800],[800,1000000]])
train_faster_rcnn(path_config_yaml=path_config_yaml,
dataset_train=dataset_train,
path_weights=path_weights,
config_overwrite=config_overwrite,
log_dir="/path/to/dir",
build_train_config=build_train_config,
dataset_val=dataset_val,
build_val_config=build_val_config,
metric=coco_metric,
pipeline_component_name="ImageLayoutService"
)
```
## How to fine-tune this model
To fine-tune this model, please check this [Fine-tune](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Fine_Tune.ipynb) tutorial.
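In outline, fine-tuning reuses the same `train_faster_rcnn` call shown above, with `path_weights` pointing at this checkpoint instead of being empty. The snippet below is a sketch under that assumption; the checkpoint path is a placeholder and the overrides are illustrative, so the Fine-tune tutorial remains the authoritative reference.
```python
# A sketch of fine-tuning, reusing the training setup from the block above.
# The checkpoint path and the config overrides are placeholders/assumptions.
path_weights = "/path/to/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc/checkpoint"  # placeholder

train_faster_rcnn(path_config_yaml=path_config_yaml,
                  dataset_train=dataset_train,   # e.g. your own annotated dataset
                  path_weights=path_weights,     # start from this model's weights
                  config_overwrite=["TRAIN.STEPS_PER_EPOCH=500",
                                    "TRAIN.STARTING_EPOCH=1",
                                    "TRAIN.CHECKPOINT_PERIOD=50"],
                  log_dir="/path/to/dir",
                  build_train_config=build_train_config,
                  dataset_val=dataset_val,
                  build_val_config=build_val_config,
                  metric=coco_metric,
                  pipeline_component_name="ImageLayoutService")
```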
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc
|
[
"Tensorflow",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from: Tensorpack.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained to detect rows and columns in tables. As row and column bounding boxes are not a priori part of the annotations, they are
calculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used within a full deepdoctection pipeline, along with table recognition and OCR. For general instructions, check this Get_started tutorial.
## How this model was trained.
To recreate the training run on the deepdoctection framework, run:
## How to fine-tune this model
To fine-tune this model, please check this Fine-tune tutorial.
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \r\n\r\nThe model and its training code has been mainly taken from: Tensorpack . \r\n\r\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \r\n\r\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\r\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\r\n\r\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\r\n\r\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \r\n\r\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\r\n\r\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
"TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \r\n\r\nThe model and its training code has been mainly taken from: Tensorpack . \r\n\r\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \r\n\r\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\r\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\r\n\r\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\r\n\r\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## How this model was trained. \r\n\r\nTo recreate the model run on the deepdoctection framework, run:",
"## How to fine-tune this model\r\n\r\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
34,
180,
45,
24,
22
] |
[
"passage: TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \r\n\r\nThe model and its training code has been mainly taken from: Tensorpack . \r\n\r\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \r\n\r\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\r\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\r\n\r\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\r\n\r\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## How this model was trained. \r\n\r\nTo recreate the model run on the deepdoctection framework, run:## How to fine-tune this model\r\n\r\nTo fine tune this model, please check this Fine-tune tutorial."
] |
[
-0.08436226844787598,
0.13389331102371216,
-0.00294904294423759,
0.055595800280570984,
0.09529847651720047,
-0.007990616373717785,
0.06740986555814743,
0.05010956898331642,
-0.06163270026445389,
0.09468410909175873,
-0.023353170603513718,
0.010656292550265789,
0.09910040348768234,
0.18979579210281372,
0.012730485759675503,
-0.22229351103305817,
-0.011069011874496937,
-0.03535090759396553,
0.01917230151593685,
0.06016081944108009,
0.09268886595964432,
-0.09342502057552338,
0.05472293123602867,
0.011367086321115494,
-0.051508378237485886,
0.06468546390533447,
-0.021840645000338554,
-0.003703794442117214,
0.04556049779057503,
0.07038692384958267,
0.06871534883975983,
-0.029657403007149696,
0.1038317084312439,
-0.08208028227090836,
0.05048030614852905,
0.09491704404354095,
-0.024013662710785866,
0.08232516795396805,
0.09638715535402298,
0.009467126801609993,
0.13954071700572968,
-0.04967961087822914,
0.0283361729234457,
0.0624656081199646,
-0.08212898671627045,
-0.10066668689250946,
-0.14543083310127258,
0.1353696882724762,
0.12129592895507812,
0.1066562682390213,
-0.018711071461439133,
0.1022370234131813,
0.015028796158730984,
0.05086374655365944,
0.12104442715644836,
-0.29360881447792053,
-0.04363038018345833,
0.19650961458683014,
0.07437316328287125,
0.04612516611814499,
-0.06747417896986008,
0.007197228725999594,
-0.006945034023374319,
0.02626178227365017,
-0.014190166257321835,
-0.06837181001901627,
-0.020912762731313705,
-0.012429551221430302,
-0.1423860490322113,
0.003235868876799941,
0.09339329600334167,
-0.009673180058598518,
-0.033322762697935104,
-0.11251863092184067,
-0.09610660374164581,
-0.06753675639629364,
-0.0210527665913105,
-0.08273626863956451,
0.05713030695915222,
0.0453217588365078,
0.1136881485581398,
-0.17078185081481934,
-0.08617471158504486,
-0.031198464334011078,
-0.04333098232746124,
-0.027454694733023643,
0.03002467192709446,
0.025212837383151054,
-0.07875619828701019,
0.11202110350131989,
-0.054750774055719376,
-0.0496525838971138,
-0.03971582651138306,
-0.04023416340351105,
-0.14149776101112366,
-0.038967497646808624,
-0.04365203157067299,
-0.08557132631540298,
-0.006574778817594051,
0.2421947419643402,
0.016861818730831146,
0.06382106244564056,
-0.031984228640794754,
0.017805321142077446,
0.039461199194192886,
0.15754061937332153,
-0.10951478779315948,
0.005780959501862526,
0.12511448562145233,
-0.018008219078183174,
0.029189204797148705,
-0.03554331511259079,
-0.07676740735769272,
-0.023483576253056526,
-0.027636319398880005,
0.07467640191316605,
0.013704854995012283,
-0.025881553068757057,
-0.01773728057742119,
-0.08602380752563477,
0.12149465829133987,
-0.12021907418966293,
0.0627293735742569,
0.07266558706760406,
-0.08553414791822433,
0.016289392486214638,
0.06409565359354019,
0.009795339778065681,
-0.10152685642242432,
-0.06152341142296791,
-0.04865797236561775,
-0.028811907395720482,
-0.09238870441913605,
-0.06952260434627533,
-0.01210838370025158,
-0.09591761976480484,
-0.021422170102596283,
-0.10062717646360397,
-0.21758270263671875,
-0.1301937699317932,
0.03573767468333244,
-0.03449631854891777,
-0.0008531397907063365,
-0.02780924178659916,
-0.008811704814434052,
-0.054181672632694244,
0.07342883944511414,
0.03200804069638252,
0.024580297991633415,
-0.0323868952691555,
-0.05860898271203041,
-0.013881616294384003,
-0.013064667582511902,
0.012732788920402527,
-0.10342466831207275,
0.026872500777244568,
-0.050243791192770004,
0.04862334951758385,
0.04665223881602287,
-0.04070444405078888,
-0.11572719365358353,
-0.0025437078438699245,
-0.07372772693634033,
-0.02428128942847252,
0.018853766843676567,
0.116713747382164,
-0.18897956609725952,
0.02542409487068653,
0.10370256006717682,
-0.05212786793708801,
0.010279289446771145,
0.07833855599164963,
-0.046660229563713074,
0.028489569202065468,
0.036162249743938446,
0.09297920018434525,
0.14788444340229034,
-0.05634480342268944,
-0.0123182637616992,
-0.08123975247144699,
-0.029039297252893448,
0.04576076939702034,
0.066096730530262,
-0.03308624029159546,
0.06742747873067856,
-0.030416246503591537,
-0.040769971907138824,
-0.059502068907022476,
-0.00422914233058691,
-0.07455195486545563,
0.00042358614155091345,
-0.034636594355106354,
-0.02589079737663269,
-0.03534223884344101,
0.03789936751127243,
0.05464107543230057,
-0.04981580749154091,
-0.1297454535961151,
0.13046197593212128,
-0.049336690455675125,
-0.005451068747788668,
-0.09125465899705887,
0.1327044814825058,
-0.03265822306275368,
0.02838589809834957,
-0.1854809671640396,
-0.07481607049703598,
0.04658051207661629,
-0.011133476160466671,
0.08200286328792572,
0.0586068369448185,
0.03202895447611809,
0.061086274683475494,
0.04526534304022789,
-0.04522637277841568,
-0.014224477112293243,
-0.04629002511501312,
-0.024405136704444885,
-0.053680647164583206,
-0.07741302996873856,
-0.07375572621822357,
0.11133615672588348,
-0.2589956820011139,
0.017098993062973022,
-0.08240560442209244,
0.12335250526666641,
0.007915081456303596,
-0.04769159108400345,
0.04870565980672836,
-0.020387331023812294,
0.00827495101839304,
-0.08027081936597824,
0.00897595752030611,
0.03604144603013992,
-0.04418351501226425,
0.01868617907166481,
-0.2008671909570694,
-0.053281232714653015,
0.01811736449599266,
0.11540544778108597,
-0.034796323627233505,
-0.04645562916994095,
-0.013713176362216473,
-0.008360200561583042,
-0.060307979583740234,
0.06183788552880287,
0.21557945013046265,
0.03994058445096016,
0.05825183913111687,
-0.06271808594465256,
0.019561652094125748,
-0.012544331140816212,
-0.03313576802611351,
0.02735206112265587,
0.08633655309677124,
0.027560707181692123,
-0.20414648950099945,
0.0015317347133532166,
-0.008783568628132343,
-0.017594212666153908,
0.10015182942152023,
0.017574824392795563,
-0.056370075792074203,
-0.05719223991036415,
0.026041299104690552,
0.023118508979678154,
0.05022334307432175,
0.10923230648040771,
0.03675628453493118,
0.011961190961301327,
0.015218265354633331,
0.0565679557621479,
-0.06703755259513855,
0.06175857409834862,
0.04824886843562126,
-0.013119950890541077,
0.10389703512191772,
0.050140876322984695,
-0.007048229221254587,
0.01737174391746521,
-0.026869015768170357,
0.038055744022130966,
-0.025842079892754555,
-0.04537872597575188,
-0.14072732627391815,
0.1740998476743698,
-0.10293075442314148,
-0.12534672021865845,
-0.13471080362796783,
0.022088540717959404,
-0.04927961900830269,
0.01171052735298872,
-0.003959489054977894,
0.018726471811532974,
-0.06318610161542892,
-0.08564132452011108,
0.01091384794563055,
0.031390778720378876,
-0.05463796854019165,
-0.04199527949094772,
-0.055110424757003784,
-0.003523294348269701,
-0.1145731508731842,
0.009070899337530136,
-0.0006273090839385986,
-0.1790228635072708,
-0.012313884682953358,
0.012559100054204464,
0.0596662312746048,
0.14733263850212097,
-0.02825693041086197,
0.017251309007406235,
0.05002039670944214,
0.11587610095739365,
-0.05949661508202553,
0.09850405156612396,
0.21250514686107635,
0.022280683740973473,
0.017754018306732178,
0.061096906661987305,
-0.002510835649445653,
-0.04082099720835686,
0.025275124236941338,
0.054681871086359024,
-0.04580160230398178,
-0.18868158757686615,
-0.08204998075962067,
-0.05705592408776283,
-0.04790396988391876,
0.051712941378355026,
0.07428233325481415,
0.0058286599814891815,
0.022603899240493774,
-0.02298557013273239,
-0.010805828496813774,
0.03247273713350296,
0.02040397748351097,
0.06515003740787506,
0.027299275621771812,
0.02234463579952717,
-0.04327905550599098,
0.046274952590465546,
0.09182040393352509,
0.05771423131227493,
0.1370687633752823,
0.013281223364174366,
0.08581550419330597,
0.00964557845145464,
0.0667986124753952,
0.032320115715265274,
0.10192699730396271,
-0.011454696767032146,
0.01221819594502449,
0.00792699959129095,
-0.06787815690040588,
0.01597648486495018,
0.05765639245510101,
0.0533158965408802,
-0.011630462482571602,
-0.027714869007468224,
0.07737396657466888,
-0.0035362078342586756,
0.06960486620664597,
0.02952560968697071,
-0.20634886622428894,
-0.025241872295737267,
-0.0218947920948267,
0.03202071413397789,
-0.07409939169883728,
0.03513288125395775,
0.04720795899629593,
-0.1551024168729782,
-0.0650578960776329,
-0.06947556883096695,
0.1076732873916626,
-0.031892329454422,
-0.01471992302685976,
-0.007096634246408939,
0.15274016559123993,
0.02819308266043663,
0.06229119375348091,
-0.09111624211072922,
0.018334198743104935,
0.03794635087251663,
0.13756515085697174,
-0.051235880702733994,
0.005603364668786526,
0.06826578080654144,
0.033261802047491074,
0.14942702651023865,
-0.004897386766970158,
-0.08543739467859268,
-0.027650661766529083,
-0.13632555305957794,
0.03962476924061775,
0.028109407052397728,
-0.0765770971775055,
0.07801034301519394,
-0.006177882198244333,
-0.024043086916208267,
-0.052398767322301865,
0.06435631215572357,
-0.1311744600534439,
-0.11770536005496979,
0.010619359090924263,
0.0067054759711027145,
-0.008939657360315323,
-0.051909808069467545,
-0.0006387860630638897,
0.04660450667142868,
0.11441347002983093,
-0.0410488061606884,
-0.014241667464375496,
-0.10933223366737366,
0.0520814023911953,
0.12771447002887726,
-0.08707307279109955,
0.05600028857588768,
0.0015824318397790194,
0.11706777662038803,
-0.04956858232617378,
-0.15589390695095062,
0.059941865503787994,
-0.0897434800863266,
-0.09940532594919205,
-0.004359602928161621,
0.053683776408433914,
0.10140657424926758,
0.017636192962527275,
0.05448945239186287,
0.0849844217300415,
-0.03625982999801636,
-0.1218331903219223,
-0.017846424132585526,
0.1418067067861557,
0.021026071161031723,
0.07977995276451111,
-0.1029752865433693,
-0.07718097418546677,
-0.04167011380195618,
0.06974286586046219,
0.10360273718833923,
0.14359985291957855,
-0.07143767923116684,
0.05623643100261688,
0.1016215831041336,
-0.09790315479040146,
-0.19827014207839966,
-0.019409378990530968,
-0.021393224596977234,
0.14124079048633575,
0.0632244348526001,
-0.2167333960533142,
0.09686850011348724,
0.04768799617886543,
-0.032931581139564514,
0.09462172538042068,
-0.3140455484390259,
-0.10067003220319748,
0.15668287873268127,
0.007551172748208046,
-0.007800975814461708,
-0.09571938216686249,
-0.04822637513279915,
0.03743509575724602,
-0.04072245955467224,
0.07491002976894379,
-0.19423426687717438,
0.08963192999362946,
-0.004257366061210632,
0.006156352814286947,
0.02169107459485531,
-0.028789468109607697,
0.05419726297259331,
-0.04419316351413727,
0.0760699212551117,
-0.035985853523015976,
0.06761384755373001,
0.027014514431357384,
-0.04565553367137909,
0.12222447991371155,
0.05061858892440796,
0.13483679294586182,
-0.1555299311876297,
-0.023000195622444153,
-0.036243196576833725,
0.08048698306083679,
-0.051177993416786194,
-0.024548567831516266,
-0.06729115545749664,
0.10258156061172485,
0.08220755308866501,
0.01800350286066532,
0.009288879111409187,
-0.0006167852552607656,
0.028320612385869026,
0.1526460200548172,
0.039507895708084106,
0.06790906935930252,
-0.10005418211221695,
-0.026345334947109222,
0.011073430068790913,
0.06807070225477219,
0.034848231822252274,
0.0016048297984525561,
0.13059397041797638,
0.02704652212560177,
0.07895912230014801,
0.027940087020397186,
-0.140968918800354,
-0.01522013172507286,
0.0007579969824291766,
-0.11186918616294861,
-0.14603397250175476,
-0.054992467164993286,
0.08924634754657745,
-0.04978998005390167,
0.0052937110885977745,
0.12363453209400177,
-0.05224817991256714,
-0.05017184093594551,
0.03303631767630577,
0.055593039840459824,
-0.04822782427072525,
0.11502252519130707,
0.05099058896303177,
0.013104226440191269,
-0.04359149932861328,
0.11475610733032227,
0.10651601850986481,
-0.04293878749012947,
0.019984310492873192,
0.18197867274284363,
-0.0760919526219368,
-0.07250870019197464,
0.014972279779613018,
0.17639248073101044,
-0.07208166271448135,
-0.06107586994767189,
-0.05221796780824661,
-0.005887545645236969,
0.0017029967857524753,
0.0466868095099926,
0.01613514870405197,
0.018144790083169937,
-0.0881943330168724,
-0.06938289105892181,
-0.08877942711114883,
0.03102085180580616,
0.12137163430452347,
-0.0017890414455905557,
-0.08262790739536285,
0.017861690372228622,
-0.028194397687911987,
0.02164529822766781,
-0.03888066112995148,
-0.02579847350716591,
-0.06709781289100647,
-0.023752747103571892,
-0.088560089468956,
-0.009364728815853596,
-0.03625936433672905,
-0.022525032982230186,
-0.02982676960527897,
0.04213999956846237,
-0.012225216254591942,
0.06255233287811279,
-0.015836967155337334,
-0.06723693013191223,
-0.096287801861763,
0.030376510694622993,
-0.15313492715358734,
0.012068252079188824,
-0.014432678930461407,
-0.04946129396557808,
0.07940284162759781,
0.06958817690610886,
-0.0010595632484182715,
0.015212108381092548,
-0.11549665778875351,
-0.04363409802317619,
-0.02238629199564457,
0.023139355704188347,
0.032988812774419785,
-0.15663379430770874,
-0.00042774275061674416,
0.012509399093687534,
-0.07471096515655518,
-0.04774688929319382,
0.03712598979473114,
-0.09434913843870163,
-0.019181648269295692,
-0.10680308938026428,
0.055314283818006516,
-0.07453719526529312,
0.05095771700143814,
-0.013750693760812283,
0.12861080467700958,
0.09463177621364594,
-0.05670860409736633,
0.10881288349628448,
-0.12028779834508896,
-0.028549591079354286,
-0.009593133814632893,
0.050388939678668976,
-0.004817209206521511,
-0.030008958652615547,
0.05031439661979675,
-0.020037995651364326,
0.048870522528886795,
-0.03754299506545067,
-0.017249511554837227,
0.0038512242026627064,
0.03722425550222397,
-0.04133213311433792,
0.02021312341094017,
0.05322922021150589,
0.03310711681842804,
-0.01442734245210886,
0.07041504979133606,
0.05951836332678795,
0.0020327221136540174,
0.08511195331811905,
0.1995985358953476,
0.07735844701528549,
0.06505092233419418,
0.10289698094129562,
0.016681428998708725,
-0.0964256078004837,
-0.0276477187871933,
0.08191651105880737,
-0.018100986257195473,
0.0290574561804533,
-0.08911912888288498,
-0.06815949082374573,
0.2102193385362625,
-0.13696135580539703,
0.06263992190361023,
0.019808202981948853,
-0.06597892194986343,
-0.13681694865226746,
-0.213053897023201,
-0.0902913510799408,
-0.0067031229846179485,
-0.029824133962392807,
-0.09377460926771164,
0.047766659408807755,
-0.03341921046376228,
0.024351220577955246,
-0.03907453268766403,
0.16328345239162445,
-0.11120986938476562,
-0.06084740534424782,
0.02755814418196678,
0.00031475149444304407,
0.042718883603811264,
0.07208655774593353,
0.021040091291069984,
0.041747257113456726,
-0.041305601596832275,
0.0907534658908844,
0.03244330361485481,
0.13046404719352722,
0.10431955754756927,
-0.04643316939473152,
-0.03238469362258911,
-0.012521626427769661,
-0.020244406536221504,
0.01878071017563343,
0.1328379511833191,
0.07518352568149567,
-0.06035672873258591,
-0.01725752465426922,
0.2138238102197647,
-0.038435738533735275,
-0.04475020617246628,
-0.13497494161128998,
0.11537498235702515,
0.0306860264390707,
-0.028773566707968712,
-0.019125306978821754,
-0.12066492438316345,
0.010953322052955627,
0.13040505349636078,
0.2244260460138321,
-0.06036914885044098,
0.01598568633198738,
0.051260970532894135,
0.00730870570987463,
-0.04512787610292435,
0.028513146564364433,
0.04039093479514122,
0.21013350784778595,
-0.045801836997270584,
0.044251251965761185,
-0.03913005068898201,
-0.038632798939943314,
-0.10121530294418335,
0.04360295087099075,
-0.008759270422160625,
0.01381614152342081,
0.04061612859368324,
0.06603991985321045,
-0.1121576726436615,
-0.16026653349399567,
-0.024725912138819695,
-0.03832075372338295,
-0.09907499700784683,
-0.06847438216209412,
-0.04182429611682892,
0.03699316084384918,
0.09751251339912415,
-0.025334147736430168,
0.006767251528799534,
0.10243567079305649,
-0.0012869563652202487,
-0.014565565623342991,
-0.06633736193180084,
0.05795697122812271,
0.001254147500731051,
0.214553564786911,
0.01674242503941059,
-0.039474986493587494,
0.07202467322349548,
0.006155402399599552,
-0.14089320600032806,
0.035327356308698654,
-0.0016608299920335412,
-0.022518634796142578,
0.009438647888600826,
0.07326821982860565,
-0.0068846167996525764,
0.05016091465950012,
-0.009298206306993961,
-0.1001138687133789,
0.03659932315349579,
-0.07448466867208481,
0.04007893428206444,
-0.10102985054254532,
0.09972584247589111,
-0.08920512348413467,
0.17552059888839722,
0.1170399859547615,
-0.025280466303229332,
0.03826529532670975,
-0.019948024302721024,
0.08169890940189362,
0.04302862286567688,
0.08204998821020126,
0.025814557448029518,
-0.062139950692653656,
0.03423822671175003,
-0.07947143167257309,
0.008182487450540066,
-0.1759471595287323,
-0.03927085921168327,
-0.022763865068554878,
-0.0557180717587471,
-0.015218282118439674,
0.0832734927535057,
0.0734141543507576,
0.018166393041610718,
-0.025166932493448257,
-0.09188371151685715,
0.007725628558546305,
0.03351938724517822,
-0.1349027305841446,
-0.08775576949119568
] |
null | null | null |
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from: [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
Regarding the dataset, please check: [Xu Zhong et al. - Image-based table recognition: data, model, and evaluation](https://arxiv.org/abs/1911.10683).
The model has been trained to detect rows and columns in tables. As row and column bounding boxes are not a priori part of the annotations, they are
calculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.
The code has been adapted so that it can be used in a **deep**doctection pipeline.
## How this model can be used
This model can be used within a full **deep**doctection pipeline, along with table recognition and OCR. For general instructions, check this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. This checkpoint can therefore not be used for fine-tuning. To fine-tune this model, please check this [model](https://huggingface.co/deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc).
## How this model was trained.
To recreate the training run on the **deep**doctection framework, run:
```python
import os

from deep_doctection.datasets import DatasetRegistry
from deep_doctection.eval import MetricRegistry
from deep_doctection.utils import get_configs_dir_path
from deep_doctection.train import train_faster_rcnn

pubtabnet = DatasetRegistry.get_dataset("pubtabnet")
pubtabnet.dataflow.categories.set_cat_to_sub_cat({"ITEM": "row_col"})
pubtabnet.dataflow.categories.filter_categories(categories=["ROW", "COLUMN"])

path_config_yaml = os.path.join(get_configs_dir_path(), "tp/rows/conf_frcnn_rows.yaml")
path_weights = ""
dataset_train = pubtabnet
config_overwrite = ["TRAIN.STEPS_PER_EPOCH=500", "TRAIN.STARTING_EPOCH=1", "TRAIN.CHECKPOINT_PERIOD=50"]
build_train_config = ["max_datapoints=500000", "rows_and_cols=True"]
dataset_val = pubtabnet
build_val_config = ["max_datapoints=2000", "rows_and_cols=True"]

coco_metric = MetricRegistry.get_metric("coco")
coco_metric.set_params(max_detections=[50, 200, 600],
                       area_range=[[0, 1000000], [0, 200], [200, 800], [800, 1000000]])

train_faster_rcnn(path_config_yaml=path_config_yaml,
                  dataset_train=dataset_train,
                  path_weights=path_weights,
                  config_overwrite=config_overwrite,
                  log_dir="/path/to/dir",
                  build_train_config=build_train_config,
                  dataset_val=dataset_val,
                  build_val_config=build_val_config,
                  metric=coco_metric,
                  pipeline_component_name="ImageLayoutService")
```
|
{"license": "apache-2.0", "tags": ["Tensorflow"], "datasets": ["Pubtabnet"]}
| null |
deepdoctection/tp_casc_rcnn_X_32xd4_50_FPN_GN_2FC_pubtabnet_rc_inference_only
|
[
"Tensorflow",
"dataset:Pubtabnet",
"arxiv:1911.10683",
"license:apache-2.0",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1911.10683"
] |
[] |
TAGS
#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us
|
# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables.
The model and its training code have been mainly taken from: Tensorpack.
Regarding the dataset, please check: Xu Zhong et al. - Image-based table recognition: data, model, and evaluation.
The model has been trained to detect rows and columns in tables. As row and column bounding boxes are not a priori part of the annotations, they are
calculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.
The code has been adapted so that it can be used in a deepdoctection pipeline.
## How this model can be used
This model can be used within a full deepdoctection pipeline, along with table recognition and OCR. For general instructions, check this Get_started tutorial.
## This is an inference model only
To reduce the size of the checkpoint, we removed all variables that are not necessary for inference. This checkpoint can therefore not be used for fine-tuning. To fine-tune this model, please check this model.
## How this model was trained.
To recreate the training run on the deepdoctection framework, run:
|
[
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
"TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n",
"# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.",
"## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.",
"## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.",
"## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
34,
180,
45,
50,
24
] |
[
"passage: TAGS\n#Tensorflow #dataset-Pubtabnet #arxiv-1911.10683 #license-apache-2.0 #region-us \n# Tensorpacks Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50 trained on Pubtabnet for Semantic Segmentation of tables. \n\nThe model and its training code has been mainly taken from: Tensorpack . \n\nRegarding the dataset, please check: Xu Zhong et. all. - Image-based table recognition: data, model, and evaluation. \n\nThe model has been trained on detecting rows and columns for tables. As rows and column bounding boxes are not a priori an element of the annotations they are\ncalculated using the bounding boxes of the cells and the intrinsic structure of the enclosed HTML.\n\nThe code has been adapted so that it can be used in a deepdoctection pipeline.## How this model can be used\n\nThis model can be used with the deepdoctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this Get_started tutorial.## This is an inference model only\n\nTo reduce the size of the checkpoint we removed all variables that are not necessary for inference. Therefore it cannot be used for fine-tuning. To fine tune this model please check this model.## How this model was trained. \n\nTo recreate the model run on the deepdoctection framework, run:"
] |
[
-0.08817637711763382,
0.14101476967334747,
-0.0021282939705997705,
0.09068407118320465,
0.10064439475536346,
-0.007046042941510677,
0.02978384681046009,
0.08825709670782089,
-0.018535982817411423,
0.15234099328517914,
-0.020486917346715927,
0.05518411099910736,
0.08700387924909592,
0.12731792032718658,
-0.06783264130353928,
-0.1749681532382965,
0.020543992519378662,
-0.010245271027088165,
0.045210979878902435,
0.07508859038352966,
0.098048634827137,
-0.08545734733343124,
0.042249590158462524,
-0.002871191129088402,
-0.03047667071223259,
0.028974972665309906,
0.030140237882733345,
-0.011870862916111946,
0.03659255802631378,
0.10507499426603317,
0.042010217905044556,
-0.014551127329468727,
0.028935033828020096,
-0.12799736857414246,
0.029194289818406105,
0.09567861258983612,
-0.014521719887852669,
0.09623079746961594,
0.06941778212785721,
0.02857092209160328,
0.10136693716049194,
-0.06762583553791046,
0.03569521754980087,
0.0826554074883461,
-0.07149376720190048,
-0.08653939515352249,
-0.1368217170238495,
0.12338273972272873,
0.10844126343727112,
0.1312422752380371,
-0.017725862562656403,
0.18653133511543274,
0.017101168632507324,
0.037439677864313126,
0.09651149064302444,
-0.2348392903804779,
-0.02323315478861332,
0.15400157868862152,
0.024947836995124817,
0.04365743324160576,
-0.07111185789108276,
0.01895798370242119,
0.009861445054411888,
0.03444052115082741,
0.02109360694885254,
-0.09601836651563644,
-0.057863786816596985,
-0.06281112134456635,
-0.0884900838136673,
-0.030823564156889915,
0.08220982551574707,
0.012392748147249222,
-0.03627818077802658,
-0.12955403327941895,
-0.08431393653154373,
-0.04173898324370384,
-0.036908939480781555,
-0.07076618820428848,
0.03936304524540901,
0.04198818281292915,
0.13439075648784637,
-0.16106091439723969,
-0.09818746149539948,
0.007353161461651325,
-0.036560188978910446,
-0.0015631759306415915,
0.014709848910570145,
0.04800485074520111,
-0.07533900439739227,
0.07353192567825317,
-0.16022326052188873,
-0.04892408847808838,
-0.0433659590780735,
-0.08333684504032135,
-0.13290956616401672,
-0.02889348194003105,
-0.03163605183362961,
-0.08430349081754684,
-0.02010812982916832,
0.1598333865404129,
-0.0537957027554512,
0.038399696350097656,
-0.014202602207660675,
0.0279193464666605,
0.05752846226096153,
0.1508629471063614,
-0.08560159057378769,
-0.049745846539735794,
0.13422589004039764,
0.019928166642785072,
0.05077727884054184,
-0.02322881668806076,
-0.04941585659980774,
-0.013536280952394009,
-0.03009226731956005,
0.06419072300195694,
0.06164240092039108,
-0.009219648316502571,
-0.04603369161486626,
-0.09663630276918411,
0.15549559891223907,
-0.10974788665771484,
0.04112996906042099,
0.032406121492385864,
-0.021115386858582497,
0.1580866128206253,
0.0469948910176754,
0.0028353689704090357,
-0.12156476080417633,
-0.04798933491110802,
-0.075788713991642,
0.022954214364290237,
-0.06560540199279785,
-0.11632659286260605,
-0.010944986715912819,
-0.10067319124937057,
-0.03830760717391968,
-0.13988912105560303,
-0.24168777465820312,
-0.10004039853811264,
0.01095398049801588,
-0.04333748668432236,
0.018392562866210938,
0.0385274775326252,
-0.029402997344732285,
-0.09042329341173172,
0.025270340964198112,
-0.03577449917793274,
0.025623543187975883,
-0.0060697393491864204,
-0.014159345999360085,
-0.020400581881403923,
-0.09817803651094437,
0.010464334860444069,
-0.10401983559131622,
0.047645993530750275,
-0.12154343724250793,
0.09154029935598373,
0.03389197960495949,
-0.010727224871516228,
-0.1181110367178917,
-0.047269877046346664,
-0.08356254547834396,
-0.004960944410413504,
0.031964391469955444,
0.10358615219593048,
-0.22090144455432892,
0.014181453734636307,
0.04651573300361633,
-0.12092049419879913,
0.004218581598252058,
0.1017555370926857,
-0.030599648132920265,
0.010492530651390553,
0.09892860054969788,
0.10804427415132523,
0.1275230348110199,
-0.010441075079143047,
-0.11358115077018738,
-0.0657682716846466,
0.006102453451603651,
0.0606633797287941,
0.03863202780485153,
-0.06925731897354126,
0.0343494713306427,
0.016451075673103333,
-0.024149687960743904,
-0.07205819338560104,
0.017123719677329063,
-0.06723795086145401,
-0.019650954753160477,
-0.04843941330909729,
-0.013540293090045452,
0.007602526340633631,
0.025820869952440262,
0.06179044395685196,
-0.06520771980285645,
-0.06218482926487923,
0.12978434562683105,
-0.04668392986059189,
0.025314321741461754,
-0.05376143753528595,
0.13046139478683472,
-0.13749687373638153,
0.040927547961473465,
-0.2244924157857895,
-0.021457644179463387,
0.04523749649524689,
-0.04443417489528656,
0.032731957733631134,
0.0877937451004982,
0.019226327538490295,
0.017608871683478355,
-0.008731067180633545,
0.011724192649126053,
-0.006819558329880238,
-0.013487610965967178,
-0.09691550582647324,
-0.0372391939163208,
-0.08877246826887131,
-0.05154477804899216,
0.07455310225486755,
-0.2660354971885681,
0.03649093583226204,
-0.0804886668920517,
0.07912080734968185,
0.031888216733932495,
-0.04791121929883957,
0.03887664154171944,
-0.009813684970140457,
-0.0435253381729126,
-0.09857568144798279,
-0.006055909208953381,
-0.006331086158752441,
-0.028912533074617386,
-0.01021898165345192,
-0.21038508415222168,
-0.12465714663267136,
0.053907036781311035,
0.11867081373929977,
-0.032195448875427246,
-0.03735420107841492,
-0.0433199480175972,
0.014780412428081036,
-0.059138163924217224,
-0.020908478647470474,
0.14370691776275635,
0.05875520035624504,
0.06821660697460175,
-0.06338310241699219,
0.01604272425174713,
0.0024141541216522455,
-0.030449040234088898,
0.02662564255297184,
0.0203775092959404,
0.05156060308218002,
-0.1763526201248169,
0.028330186381936073,
0.0141544034704566,
-0.02166362851858139,
0.051679469645023346,
0.05206083878874779,
-0.06982240080833435,
-0.04759279265999794,
0.014487325213849545,
0.02583734504878521,
0.10828348994255066,
0.1185402125120163,
0.08889595419168472,
0.05295111611485481,
0.04420779272913933,
0.047512345016002655,
-0.010258388705551624,
0.0692814290523529,
0.05148059502243996,
0.01857069320976734,
0.06418909877538681,
0.013095450587570667,
0.01328185573220253,
0.043576519936323166,
-0.05585787072777748,
0.09793046861886978,
-0.01653427630662918,
-0.02767537347972393,
-0.11459991335868835,
0.18129827082157135,
-0.062361426651477814,
-0.22470709681510925,
-0.06960970908403397,
0.06782329082489014,
-0.0767534151673317,
0.010341387242078781,
0.020332975313067436,
0.022053182125091553,
-0.10530716180801392,
-0.12194401025772095,
-0.019252222031354904,
0.026770824566483498,
-0.03267704322934151,
-0.08071491122245789,
-0.05135739967226982,
0.04386308789253235,
-0.08412059396505356,
0.01209502387791872,
-0.009030124172568321,
-0.060899365693330765,
-0.005538937635719776,
0.02359558641910553,
0.055132247507572174,
0.12957558035850525,
-0.04247868061065674,
0.005359983537346125,
0.020970765501260757,
0.1014476791024208,
-0.015032164752483368,
0.10602999478578568,
0.20234617590904236,
0.03088018111884594,
0.012079737149178982,
0.07442182302474976,
-0.030594583600759506,
-0.03897034004330635,
-0.018137628212571144,
0.07845229655504227,
-0.04896332323551178,
-0.18359589576721191,
-0.08689211308956146,
-0.09201555699110031,
-0.02554800733923912,
0.07605539262294769,
0.06009066104888916,
0.01961296610534191,
0.07359138131141663,
-0.07388622313737869,
0.04987644776701927,
-0.02577846124768257,
0.06449313461780548,
0.09119419753551483,
0.0064376541413366795,
0.04001015052199364,
-0.048712849617004395,
0.07091346383094788,
0.13048340380191803,
0.08149243891239166,
0.13230405747890472,
-0.03783201426267624,
0.11053832620382309,
0.007906980812549591,
0.04172533005475998,
0.024081025272607803,
0.1025887131690979,
-0.022312797605991364,
0.016674095764756203,
-0.0034555571619421244,
-0.06502752006053925,
0.012707827612757683,
0.043559927493333817,
0.049646519124507904,
0.021169940009713173,
-0.05804543197154999,
0.0753980502486229,
0.05324907228350639,
0.03290465474128723,
0.04731651023030281,
-0.1869134157896042,
-0.05031590908765793,
0.003366326680406928,
0.015261983498930931,
-0.0932735949754715,
-0.0019555166363716125,
0.11927669495344162,
-0.11312071979045868,
-0.04801439866423607,
-0.062216855585575104,
0.11031447350978851,
-0.041184067726135254,
0.016089782118797302,
-0.041701264679431915,
0.12659935653209686,
0.02154357358813286,
0.0820075124502182,
-0.11201712489128113,
-0.01507784053683281,
0.06423808634281158,
0.11227013170719147,
-0.0434160940349102,
0.013662436045706272,
0.027198446914553642,
0.01686869002878666,
0.1409684419631958,
-0.006892725359648466,
-0.053374841809272766,
-0.039889998733997345,
-0.16032786667346954,
0.05734927952289581,
-0.0054214829578995705,
-0.05700412765145302,
0.09766661375761032,
-0.013219459913671017,
0.007360107731074095,
-0.02872418984770775,
0.006245740223675966,
-0.11341962218284607,
-0.19832974672317505,
0.04842892661690712,
0.04672921448945999,
-0.0772831067442894,
-0.010985760018229485,
0.028430357575416565,
0.042648058384656906,
0.18634435534477234,
-0.06342078745365143,
-0.06051529571413994,
-0.1546955704689026,
0.08753000944852829,
0.15809905529022217,
-0.055719029158353806,
0.04606441780924797,
-0.0018339325906708837,
0.10563158988952637,
-0.07035473734140396,
-0.11919445544481277,
0.0748034343123436,
-0.09969871491193771,
-0.11217881739139557,
-0.009770583361387253,
0.07662152498960495,
0.012558380141854286,
0.032440539449453354,
0.07183995842933655,
0.025372764095664024,
-0.035096149891614914,
-0.14621283113956451,
-0.07083936780691147,
0.17909398674964905,
0.05712110176682472,
0.07579300552606583,
-0.07876256853342056,
-0.1405591368675232,
-0.04468525946140289,
0.09534618258476257,
0.09192832559347153,
0.21902213990688324,
-0.0971163883805275,
0.0819479376077652,
0.09380809217691422,
-0.04541098698973656,
-0.11296751350164413,
0.010387904942035675,
-0.05222481116652489,
0.05833015963435173,
0.10647795349359512,
-0.08661659806966782,
0.03935645520687103,
0.06033693253993988,
-0.019559213891625404,
0.11682204157114029,
-0.2768300771713257,
-0.07833229750394821,
0.08889690041542053,
0.012021047994494438,
0.10205741971731186,
-0.11588311940431595,
-0.03552322834730148,
-0.0301385298371315,
-0.05573778599500656,
0.09424144774675369,
-0.13870924711227417,
0.09811731427907944,
-0.014566773548722267,
-0.01758839376270771,
0.06082558259367943,
-0.055395472794771194,
0.09130535274744034,
-0.02962813898921013,
0.07847286015748978,
0.0027865918818861246,
0.05086670070886612,
0.08099982142448425,
-0.06668686121702194,
0.1642608940601349,
-0.03151213005185127,
0.12704846262931824,
-0.12934310734272003,
-0.05978594347834587,
-0.04626442492008209,
0.09400055557489395,
-0.03640514984726906,
-0.04939921945333481,
-0.05275357887148857,
0.05090026929974556,
0.1381751149892807,
0.014103056862950325,
-0.03510810807347298,
-0.009076690301299095,
-0.005028494168072939,
0.18695007264614105,
0.04000212252140045,
0.11491777747869492,
-0.16654379665851593,
-0.02355816215276718,
-0.00023274245904758573,
0.09431663155555725,
-0.03102518618106842,
0.02233997732400894,
0.10855310410261154,
0.02886871248483658,
0.07658485323190689,
-0.003460389794781804,
-0.08110417425632477,
-0.02428334765136242,
-0.025827519595623016,
-0.10309235006570816,
-0.11067134886980057,
-0.013383721001446247,
0.14086925983428955,
-0.09349946677684784,
-0.013048689812421799,
0.12436658143997192,
-0.05585913360118866,
-0.043313123285770416,
0.02564784698188305,
0.06780007481575012,
-0.006644241977483034,
0.12560606002807617,
0.019339153543114662,
0.022199278697371483,
-0.08694690465927124,
0.0905899703502655,
0.09194618463516235,
-0.0895000547170639,
0.04177887737751007,
0.08360927551984787,
-0.09342819452285767,
-0.0948575884103775,
-0.02368900552392006,
0.07291664183139801,
-0.06231623888015747,
-0.03945712372660637,
-0.04139276221394539,
-0.023058757185935974,
-0.034016069024801254,
-0.00048258304013870656,
0.0047193653881549835,
0.058655042201280594,
-0.06322396546602249,
-0.08855365961790085,
-0.15100838243961334,
0.03904133662581444,
0.09140314161777496,
-0.005981030408293009,
-0.09676798433065414,
0.0004984119441360235,
-0.018696634098887444,
-0.0005918355309404433,
-0.02060948871076107,
0.025140464305877686,
-0.04845014587044716,
-0.05938471481204033,
-0.13176825642585754,
0.01922641135752201,
-0.04894410818815231,
-0.03881118819117546,
-0.021147657185792923,
0.03321663290262222,
-0.007227073889225721,
0.03896746784448624,
-0.025629287585616112,
-0.06753037869930267,
-0.05865682661533356,
0.0006723530241288245,
-0.12438468635082245,
0.005129569675773382,
-0.04572783410549164,
-0.052313730120658875,
0.10685401409864426,
0.06171063333749771,
-0.0014976650709286332,
0.0010363522451370955,
-0.05562739074230194,
-0.03649843484163284,
0.01743903197348118,
0.03326745703816414,
0.050671547651290894,
-0.09242991358041763,
0.008977510966360569,
0.026306234300136566,
-0.03799530491232872,
-0.04073736071586609,
0.015620315447449684,
-0.11556036025285721,
-0.016756780445575714,
-0.09517418593168259,
0.03815204277634621,
-0.03508595749735832,
0.010455123148858547,
-0.022421017289161682,
0.08897048979997635,
0.1372017115354538,
-0.02240092307329178,
0.10580798983573914,
-0.17559608817100525,
-0.034383900463581085,
-0.024738984182476997,
0.055443961173295975,
-0.008960684761404991,
0.01502551231533289,
0.061236001551151276,
-0.015473898500204086,
0.10195738077163696,
-0.025236910209059715,
0.05061646178364754,
0.008386310189962387,
-0.010293320752680302,
0.014299143105745316,
-0.01277617271989584,
0.12219404429197311,
0.016125613823533058,
-0.022138046100735664,
0.03243483230471611,
0.047498371452093124,
0.041825663298368454,
0.07043004035949707,
0.17916113138198853,
0.1464664489030838,
-0.016972219571471214,
0.10497936606407166,
-0.02153017930686474,
-0.07123219221830368,
-0.040989864617586136,
0.04663989320397377,
-0.02146448753774166,
0.016194643452763557,
-0.07015840709209442,
-0.06952651590108871,
0.19382885098457336,
-0.12156149744987488,
0.07689822465181351,
0.03040800616145134,
-0.05076112598180771,
-0.10514969378709793,
-0.19812113046646118,
-0.05567917972803116,
0.01605025678873062,
-0.07020027190446854,
-0.11054256558418274,
0.0057281674817204475,
0.02681216038763523,
-0.007862089201807976,
-0.06487022340297699,
0.17307958006858826,
-0.09537503868341446,
-0.07261309027671814,
0.006624897941946983,
0.03421160206198692,
-0.003416073275730014,
0.07281956076622009,
0.05299527943134308,
0.04276443272829056,
-0.04862711951136589,
0.10470845550298691,
0.05132467672228813,
0.15483617782592773,
0.08211901783943176,
-0.010940592736005783,
-0.067062608897686,
-0.04172634333372116,
0.003360629780218005,
0.03983630985021591,
0.2151273787021637,
0.054961562156677246,
-0.07558432221412659,
-0.04721454158425331,
0.1770205795764923,
-0.04651038721203804,
-0.0242091566324234,
-0.15085583925247192,
0.18137919902801514,
0.027757184579968452,
-0.04357714578509331,
-0.051926299929618835,
-0.16244228184223175,
0.06688998639583588,
0.1752917319536209,
0.14902076125144958,
-0.0588015615940094,
-0.004445391707122326,
-0.011452392674982548,
-0.001713919686153531,
-0.04318804293870926,
0.010819033719599247,
-0.007547601591795683,
0.27026259899139404,
-0.04110634699463844,
0.052964165806770325,
-0.0016426389338448644,
-0.007615027483552694,
-0.08682140707969666,
0.09373906999826431,
-0.0010892448481172323,
0.007604571059346199,
0.019722187891602516,
0.029270587489008904,
-0.05229966342449188,
-0.16032250225543976,
0.02470487728714943,
-0.055426161736249924,
-0.06501543521881104,
-0.04194844886660576,
-0.061438605189323425,
0.014243627898395061,
0.09857592731714249,
-0.05646353214979172,
-0.012561030685901642,
0.13573573529720306,
-0.046102967113256454,
-0.05616040527820587,
-0.03268282115459442,
0.06394916027784348,
0.07727230340242386,
0.14117540419101715,
0.014194080606102943,
-0.01748008280992508,
0.07800305634737015,
-0.010908834636211395,
-0.14834298193454742,
0.038794271647930145,
-0.017049837857484818,
-0.03729703277349472,
0.04106594994664192,
0.10212376713752747,
-0.007444981951266527,
0.04426722601056099,
0.03237958997488022,
-0.060381922870874405,
-0.02239489182829857,
-0.07893113791942596,
-0.002829046919941902,
-0.0935744121670723,
0.04898685961961746,
-0.04165639728307724,
0.15556006133556366,
0.09713467210531235,
-0.04209482669830322,
0.013356095179915428,
-0.047284647822380066,
0.10621535032987595,
0.03421645611524582,
0.07452327013015747,
-0.004625463392585516,
-0.09442668408155441,
0.06315115094184875,
-0.05208403244614601,
-0.010322146117687225,
-0.19658058881759644,
-0.026673099026083946,
-0.012347080744802952,
-0.08910170197486877,
-0.043632470071315765,
0.05648563429713249,
0.06013687327504158,
0.05966036021709442,
-0.0447440929710865,
-0.06129767373204231,
-0.006268179975450039,
0.061555150896310806,
-0.08088841289281845,
-0.0774528756737709
] |
null | null |
transformers
|
# Poster2Plot
An image captioning model that generates a movie/TV show plot from a poster. It generates decent plots, but it is by no means perfect. We are still working on improving the model.
## Live demo on Hugging Face Spaces: https://huggingface.co/spaces/deepklarity/poster2plot
# Model Details
The base model uses a Vision Transformer (ViT) model as an image encoder and GPT-2 as a decoder.
We used the following models (see the sketch after this list for how they are combined):
* Encoder: [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k)
* Decoder: [gpt2](https://huggingface.co/gpt2)
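For context, this encoder-decoder pairing is exactly what the `VisionEncoderDecoderModel` class is designed for. The snippet below is a sketch of how such a base model is typically assembled from the two pretrained checkpoints before fine-tuning; it reflects the generic Transformers API, not necessarily the exact training script used for this model.
```python
# A sketch of assembling the base model from the two checkpoints using the
# standard Transformers API; the actual poster2plot training script may differ.
from transformers import VisionEncoderDecoderModel

model = VisionEncoderDecoderModel.from_encoder_decoder_pretrained(
    "google/vit-base-patch16-224-in21k",  # ViT image encoder
    "gpt2",                               # GPT-2 text decoder
)
# The decoder's cross-attention weights are newly initialised here and are
# learned during fine-tuning on poster/plot pairs.
```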
# Datasets
Publicly available IMDb datasets were used to train the model.
# How to use
## In PyTorch
```python
import re

import requests
import torch
from PIL import Image
from transformers import AutoFeatureExtractor, AutoTokenizer, VisionEncoderDecoderModel

# Pattern to ignore all the text after 2 or more full stops
regex_pattern = "[.]{2,}"


def post_process(text):
    try:
        text = text.strip()
        text = re.split(regex_pattern, text)[0]
    except Exception as e:
        print(e)
    return text


def predict(image, max_length=64, num_beams=4):
    pixel_values = feature_extractor(images=image, return_tensors="pt").pixel_values
    pixel_values = pixel_values.to(device)

    with torch.no_grad():
        output_ids = model.generate(
            pixel_values,
            max_length=max_length,
            num_beams=num_beams,
            return_dict_in_generate=True,
        ).sequences

    preds = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
    return post_process(preds[0])


model_name_or_path = "deepklarity/poster2plot"
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Load model.
model = VisionEncoderDecoderModel.from_pretrained(model_name_or_path)
model.to(device)
print("Loaded model")

feature_extractor = AutoFeatureExtractor.from_pretrained(model.encoder.name_or_path)
print("Loaded feature_extractor")

tokenizer = AutoTokenizer.from_pretrained(model.decoder.name_or_path, use_fast=True)
if model.decoder.name_or_path == "gpt2":
    # GPT-2 has no padding token; reuse the end-of-sequence token instead.
    tokenizer.pad_token = tokenizer.eos_token
print("Loaded tokenizer")

url = "https://upload.wikimedia.org/wikipedia/en/2/26/Moana_Teaser_Poster.jpg"
with Image.open(requests.get(url, stream=True).raw) as image:
    pred = predict(image)
    print(pred)
```
|
{"language": "en", "tags": ["image-classification", "image-captioning"]}
|
image-classification
|
deepklarity/poster2plot
|
[
"transformers",
"pytorch",
"vision-encoder-decoder",
"image-classification",
"image-captioning",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #vision-encoder-decoder #image-classification #image-captioning #en #endpoints_compatible #has_space #region-us
|
# Poster2Plot
An image captioning model that generates a movie/TV show plot from a poster. It generates decent plots, but it is by no means perfect. We are still working on improving the model.
## Live demo on Hugging Face Spaces: URL
# Model Details
The base model uses a Vision Transformer (ViT) model as an image encoder and GPT-2 as a decoder.
We used the following models:
* Encoder: google/vit-base-patch16-224-in21k
* Decoder: gpt2
# Datasets
Publicly available IMDb datasets were used to train the model.
# How to use
## In PyTorch
|
[
"# Poster2Plot\n\nAn image captioning model to generate movie/t.v show plot from poster. It generates decent plots but is no way perfect. We are still working on improving the model.",
"## Live demo on Hugging Face Spaces: URL",
"# Model Details\n\nThe base model uses a Vision Transformer (ViT) model as an image encoder and GPT-2 as a decoder.\n\nWe used the following models:\n\n* Encoder: google/vit-base-patch16-224-in21k\n* Decoder: gpt2",
"# Datasets\n\nPublicly available IMDb datasets were used to train the model.",
"# How to use",
"## In PyTorch"
] |
[
"TAGS\n#transformers #pytorch #vision-encoder-decoder #image-classification #image-captioning #en #endpoints_compatible #has_space #region-us \n",
"# Poster2Plot\n\nAn image captioning model to generate movie/t.v show plot from poster. It generates decent plots but is no way perfect. We are still working on improving the model.",
"## Live demo on Hugging Face Spaces: URL",
"# Model Details\n\nThe base model uses a Vision Transformer (ViT) model as an image encoder and GPT-2 as a decoder.\n\nWe used the following models:\n\n* Encoder: google/vit-base-patch16-224-in21k\n* Decoder: gpt2",
"# Datasets\n\nPublicly available IMDb datasets were used to train the model.",
"# How to use",
"## In PyTorch"
] |
[
46,
45,
11,
65,
20,
4,
5
] |
[
"passage: TAGS\n#transformers #pytorch #vision-encoder-decoder #image-classification #image-captioning #en #endpoints_compatible #has_space #region-us \n# Poster2Plot\n\nAn image captioning model to generate movie/t.v show plot from poster. It generates decent plots but is no way perfect. We are still working on improving the model.## Live demo on Hugging Face Spaces: URL# Model Details\n\nThe base model uses a Vision Transformer (ViT) model as an image encoder and GPT-2 as a decoder.\n\nWe used the following models:\n\n* Encoder: google/vit-base-patch16-224-in21k\n* Decoder: gpt2# Datasets\n\nPublicly available IMDb datasets were used to train the model.# How to use## In PyTorch"
] |
[
-0.0663357749581337,
0.08692321181297302,
-0.0008628002251498401,
0.06018760800361633,
0.1242682933807373,
0.020877046510577202,
0.0023561841808259487,
0.11762915551662445,
-0.0889994204044342,
-0.01985223777592182,
0.12699182331562042,
0.030884625390172005,
0.05556478723883629,
0.20091864466667175,
-0.0021910720970481634,
-0.33928269147872925,
-0.04752300679683685,
0.11190152168273926,
0.018868044018745422,
0.13587519526481628,
0.07751838117837906,
-0.09782835841178894,
0.11082987487316132,
0.02704240009188652,
-0.21502643823623657,
-0.023196138441562653,
-0.021568039432168007,
-0.044026412069797516,
0.12576521933078766,
0.03166867792606354,
0.06680727750062943,
0.02876163274049759,
0.11145733296871185,
0.025776343420147896,
0.03039299137890339,
0.04738430306315422,
-0.0050318301655352116,
0.03999613597989082,
0.0516442209482193,
0.08318374305963516,
0.19517236948013306,
-0.07011811435222626,
0.013396747410297394,
-0.023048099130392075,
-0.10065286606550217,
-0.11817609518766403,
-0.009804058820009232,
0.02434668503701687,
0.06635918468236923,
0.040854353457689285,
-0.025444703176617622,
0.08429358899593353,
-0.04714500904083252,
0.09591864794492722,
0.056898247450590134,
-0.1545170545578003,
-0.05268128216266632,
0.08313383162021637,
0.032440297305583954,
-0.003100379602983594,
-0.05349885672330856,
0.09482179582118988,
0.03880029544234276,
0.03917344659566879,
0.05359083414077759,
-0.026063550263643265,
-0.03505226969718933,
-0.09340444207191467,
-0.14518648386001587,
-0.08731015771627426,
0.16382138431072235,
0.009265631437301636,
-0.06924160569906235,
-0.08406491577625275,
-0.08304360508918762,
0.008913896977901459,
-0.019387224689126015,
0.005017056129872799,
-0.019016316160559654,
-0.015855465084314346,
-0.11820975691080093,
-0.1299365907907486,
-0.11833343654870987,
-0.0745483785867691,
-0.07095035165548325,
-0.03140623867511749,
0.06233429163694382,
0.07984146475791931,
-0.12285695225000381,
0.1810322403907776,
-0.1511000692844391,
-0.0379326194524765,
0.019098153337836266,
-0.1513519585132599,
0.03228071704506874,
0.03069932386279106,
-0.03417942300438881,
-0.06762786954641342,
-0.10729742795228958,
0.0356474407017231,
-0.020507503300905228,
-0.009385773912072182,
0.010446485131978989,
0.08356829732656479,
0.034627653658390045,
0.18872995674610138,
-0.15238027274608612,
0.1048056036233902,
0.08970122784376144,
-0.02553415298461914,
0.003431714605540037,
-0.02352098934352398,
-0.12211707979440689,
-0.024960948154330254,
0.04575492814183235,
-0.0051279086619615555,
0.02181815542280674,
0.07745294272899628,
-0.07163940370082855,
-0.020837033167481422,
0.07775422930717468,
-0.010216179303824902,
0.02393580973148346,
-0.025356337428092957,
-0.05713919550180435,
0.0690365582704544,
0.16586965322494507,
-0.037995897233486176,
-0.08841639012098312,
0.00898758601397276,
-0.047595325857400894,
0.013101523742079735,
-0.07763320952653885,
-0.08388759940862656,
0.006428142543882132,
-0.12335702776908875,
0.024491244927048683,
-0.12163855135440826,
-0.02351677231490612,
0.03766469284892082,
0.012491747736930847,
-0.028033502399921417,
0.0037557699251919985,
-0.0038176937960088253,
-0.0020061652176082134,
0.006527970544993877,
0.015618208795785904,
-0.05231472849845886,
0.000336514669470489,
0.06113704666495323,
-0.020661991089582443,
0.11030031740665436,
-0.0833197832107544,
0.02733016014099121,
-0.11893745511770248,
0.03269825130701065,
-0.19004788994789124,
0.08082306385040283,
0.05499968305230141,
-0.04401182010769844,
-0.03896495699882507,
-0.15089206397533417,
0.0220333281904459,
-0.0114260483533144,
0.040691882371902466,
0.19227510690689087,
-0.17233221232891083,
-0.00431465357542038,
0.16718560457229614,
-0.10167703777551651,
-0.09556400775909424,
0.10778521746397018,
0.0118909552693367,
0.011156992055475712,
0.018444597721099854,
0.08121635019779205,
0.05232832580804825,
-0.19617068767547607,
0.02774590626358986,
0.0964210033416748,
-0.10806111246347427,
0.000013443503121379763,
0.07948572188615799,
0.048816271126270294,
-0.03936639800667763,
-0.04002702981233597,
-0.1952074021100998,
0.037817616015672684,
-0.12315770238637924,
-0.05870416387915611,
-0.035783618688583374,
0.003077689092606306,
0.09441030025482178,
0.08787268400192261,
0.043583087623119354,
-0.029103679582476616,
-0.1427403688430786,
-0.045497309416532516,
0.0652269497513771,
-0.08397090435028076,
0.09029876440763474,
-0.12402812391519547,
0.11498893797397614,
-0.09515401721000671,
-0.016341574490070343,
-0.08060333877801895,
-0.07197754830121994,
-0.018865659832954407,
0.06532552093267441,
-0.000024979288355098106,
0.06232526898384094,
0.03594173863530159,
0.09122284501791,
-0.033391959965229034,
-0.05516182258725166,
-0.07322975993156433,
0.004479904659092426,
0.010508892126381397,
-0.11196548491716385,
-0.06587379425764084,
-0.06683243066072464,
0.06901238858699799,
-0.038127075880765915,
-0.04039748013019562,
0.10004650056362152,
0.15097153186798096,
-0.01487572118639946,
-0.051087163388729095,
0.06038854271173477,
0.005235159769654274,
-0.0033368912991136312,
-0.12251974642276764,
0.10113825649023056,
-0.025632444769144058,
-0.058877915143966675,
0.10760340094566345,
-0.019682148471474648,
-0.049048781394958496,
0.13432450592517853,
-0.2214174121618271,
-0.0640752986073494,
0.12243487685918808,
-0.06567084789276123,
0.02550436183810234,
-0.05591687187552452,
-0.049644701182842255,
0.15698030591011047,
-0.03701537474989891,
0.1389877051115036,
-0.06146019324660301,
0.025698885321617126,
0.04498196020722389,
-0.06618611514568329,
-0.0835149809718132,
0.04133565351366997,
0.16505639255046844,
-0.12017696350812912,
0.0422680489718914,
0.0701259896159172,
0.06192721799015999,
0.13140909373760223,
0.041278962045907974,
-0.05873890221118927,
0.000004888613602815894,
0.001273640664294362,
-0.010822548530995846,
0.13131971657276154,
-0.20695115625858307,
-0.07770296186208725,
0.012598244473338127,
-0.07648351043462753,
0.03038671426475048,
-0.17741608619689941,
-0.004504019860178232,
0.01823057048022747,
0.005428167060017586,
-0.015828225761651993,
0.06656146049499512,
-0.050225719809532166,
0.06534380465745926,
0.048867207020521164,
-0.05016444995999336,
0.07605886459350586,
0.003806039923802018,
-0.07055198401212692,
0.11324365437030792,
-0.0806502178311348,
-0.36122754216194153,
-0.07865790277719498,
0.06425411999225616,
0.035143960267305374,
0.0390482135117054,
0.026845403015613556,
-0.0669865533709526,
-0.06355252861976624,
-0.018488027155399323,
-0.019021429121494293,
-0.10627720504999161,
0.012823238968849182,
-0.01002425979822874,
-0.060722604393959045,
0.02793392352759838,
-0.08480441570281982,
-0.016456807032227516,
-0.02000422775745392,
-0.11296151578426361,
0.12096516788005829,
-0.07459287345409393,
0.12099827080965042,
0.1380389779806137,
-0.10397521406412125,
0.07062822580337524,
-0.033483851701021194,
0.2966465353965759,
-0.11818044632673264,
0.10046888142824173,
0.141070157289505,
0.010154531337320805,
0.06570851057767868,
0.04767422005534172,
0.01811792142689228,
-0.05324990674853325,
-0.008005608804523945,
-0.036495648324489594,
-0.10951454192399979,
-0.05076523497700691,
-0.05870574712753296,
-0.07088465988636017,
-0.014150850474834442,
0.08140382915735245,
0.014428346417844296,
0.0003242752281948924,
0.09711902588605881,
-0.027820711955428123,
0.0174101535230875,
-0.004070613533258438,
0.05921186879277229,
0.11800413578748703,
-0.05872517451643944,
0.042495936155319214,
-0.03205662965774536,
-0.07196004688739777,
0.07443677634000778,
-0.026833422482013702,
0.14021579921245575,
-0.09906183183193207,
-0.1273144781589508,
0.06408190727233887,
0.09578726440668106,
0.09364736825227737,
0.021446816623210907,
-0.07107150554656982,
0.01163864228874445,
-0.045212604105472565,
-0.03800656646490097,
-0.015531565062701702,
0.026253217831254005,
0.0943169817328453,
-0.12097170203924179,
-0.09321662038564682,
0.059350885450839996,
0.04022990167140961,
-0.014848471619188786,
0.20499733090400696,
-0.2460685819387436,
0.03108469396829605,
-0.014990773983299732,
0.09964244067668915,
-0.10973574966192245,
0.036678072065114975,
0.14427103102207184,
-0.143891379237175,
0.06272806227207184,
-0.052350349724292755,
0.07997315376996994,
-0.091216541826725,
-0.004154960624873638,
0.017663462087512016,
-0.020308706909418106,
0.012661412358283997,
0.0828569158911705,
-0.11686941981315613,
0.09099561721086502,
-0.011397228576242924,
0.07168665528297424,
-0.044017840176820755,
-0.016471924260258675,
0.061182696372270584,
0.23392312228679657,
0.22298085689544678,
0.07329273968935013,
0.09179888665676117,
-0.05010315030813217,
-0.03444557264447212,
0.005874797701835632,
0.06554561853408813,
0.06204242631793022,
0.030625496059656143,
-0.01061455812305212,
-0.033474624156951904,
-0.001480619190260768,
0.01538093388080597,
-0.0720730721950531,
-0.06775426864624023,
0.0418180488049984,
-0.047266636043787,
0.02263970486819744,
-0.034124478697776794,
-0.0498022697865963,
-0.06847133487462997,
0.08327551931142807,
0.02347601018846035,
-0.13023869693279266,
-0.09100544452667236,
0.08474326133728027,
0.0816526785492897,
-0.03580267354846001,
0.129534512758255,
-0.03724079579114914,
0.09345576912164688,
-0.025919364765286446,
-0.1197454184293747,
0.03597274795174599,
-0.08964966237545013,
-0.09026698023080826,
-0.025827370584011078,
0.104493647813797,
-0.05563700571656227,
-0.013191440142691135,
0.034034885466098785,
0.01844988763332367,
-0.042751211673021317,
-0.08008260279893875,
0.10798026621341705,
-0.024265820160508156,
0.001732981763780117,
0.06001419574022293,
0.03084508515894413,
0.058453526347875595,
0.008368940092623234,
0.03477891907095909,
0.15526171028614044,
0.12449560314416885,
-0.12845167517662048,
0.00617695227265358,
0.0427464134991169,
-0.04174813628196716,
-0.30308008193969727,
-0.028321020305156708,
0.013189881108701229,
-0.003584873164072633,
0.00608406076207757,
-0.11164311319589615,
0.002006692346185446,
-0.03799787536263466,
-0.0019622291438281536,
-0.01765614189207554,
-0.3339640498161316,
-0.08048296719789505,
0.06881973892450333,
0.14079272747039795,
0.1763935536146164,
-0.024442370980978012,
0.014437774196267128,
0.07825621962547302,
-0.13010966777801514,
0.0927649512887001,
0.023338252678513527,
0.08941841125488281,
-0.018022136762738228,
0.02713054046034813,
0.04162897914648056,
-0.038927458226680756,
0.08113549649715424,
-0.1274842470884323,
0.0068414718843996525,
-0.08348678052425385,
-0.06455081701278687,
0.1915474832057953,
-0.007452177349478006,
0.09199277311563492,
0.12031041830778122,
0.1099393293261528,
-0.10574569553136826,
-0.024570241570472717,
-0.10197441279888153,
0.025221535935997963,
-0.0043679955415427685,
-0.07520953565835953,
-0.12273760139942169,
0.06221022084355354,
0.024708746001124382,
0.003497806843370199,
0.05262061208486557,
-0.02491859346628189,
-0.05667974427342415,
0.04191557317972183,
0.02067120000720024,
0.003078618785366416,
-0.12939129769802094,
-0.013151814229786396,
-0.0348101444542408,
0.1225723922252655,
-0.10511259734630585,
0.03883516415953636,
0.13208310306072235,
0.0322076678276062,
0.08112917840480804,
0.07059181481599808,
-0.039441078901290894,
0.06385374814271927,
0.10841624438762665,
-0.1269470751285553,
-0.19803787767887115,
-0.1187644824385643,
-0.06047409400343895,
0.12492066621780396,
0.09326162189245224,
0.08689773827791214,
-0.11179130524396896,
-0.005422690883278847,
-0.03142611309885979,
0.03691699355840683,
-0.026843106374144554,
0.07364494353532791,
0.09925489872694016,
-0.008723820559680462,
-0.11579647660255432,
0.08780939877033234,
0.009289935231208801,
-0.0430578775703907,
0.03594029322266579,
0.05837557464838028,
-0.09194515645503998,
-0.03733629360795021,
0.021602239459753036,
0.0459151454269886,
-0.12419912219047546,
-0.039270203560590744,
-0.011121890507638454,
-0.06338032335042953,
0.025836031883955002,
0.027219133451581,
0.05032132938504219,
0.05426827445626259,
-0.060613442212343216,
-0.02062281221151352,
-0.14279218018054962,
0.042875878512859344,
0.038065310567617416,
0.03222827985882759,
-0.216960147023201,
0.1275244504213333,
0.04902253299951553,
0.15915149450302124,
-0.09822577983140945,
-0.054968319833278656,
-0.05526413768529892,
0.016175802797079086,
-0.036322224885225296,
-0.009416898712515831,
-0.047551389783620834,
-0.004470176994800568,
-0.012869263999164104,
-0.026137273758649826,
-0.031605467200279236,
0.06581997126340866,
-0.04549073800444603,
-0.02619895152747631,
-0.0013179085217416286,
-0.05114154517650604,
-0.04163394495844841,
-0.02404860034584999,
0.0376841202378273,
-0.06464313715696335,
0.06339361518621445,
-0.016934651881456375,
-0.0677303820848465,
-0.019132787361741066,
-0.10420622676610947,
-0.11205755174160004,
0.07366751879453659,
-0.005304080434143543,
0.010508463717997074,
0.017371876165270805,
0.048868414014577866,
0.013399630784988403,
-0.03752965107560158,
-0.02286512777209282,
0.1494055837392807,
-0.07043783366680145,
-0.0028783564921468496,
-0.03193556144833565,
0.015162751078605652,
-0.04972881078720093,
0.16314612329006195,
0.015592721290886402,
0.11513936519622803,
0.044137366116046906,
-0.06149613857269287,
0.047029439359903336,
-0.05420764163136482,
-0.01743508130311966,
-0.026762591674923897,
-0.021969227120280266,
0.07857955992221832,
-0.021355291828513145,
0.040365006774663925,
-0.03510497510433197,
0.13032114505767822,
0.07084312289953232,
-0.00027842921554110944,
0.0028404826298356056,
0.12286622822284698,
-0.033941321074962616,
-0.021634541451931,
0.13934996724128723,
-0.06782954931259155,
-0.01563238352537155,
-0.034922707825899124,
0.050622738897800446,
0.04805319383740425,
0.09423806518316269,
0.07547920197248459,
-0.04438110068440437,
-0.014491518959403038,
0.07918985188007355,
0.031836383044719696,
0.04075032100081444,
-0.20523914694786072,
-0.0564083494246006,
0.03855413198471069,
0.11446762084960938,
-0.08589007705450058,
-0.028063833713531494,
0.11146195977926254,
-0.040430206805467606,
0.05220664665102959,
0.04817096143960953,
-0.0685751661658287,
-0.049851931631565094,
-0.14886882901191711,
-0.05577092617750168,
-0.13990521430969238,
0.06893858313560486,
-0.08047359436750412,
0.031156480312347412,
0.03433237597346306,
0.05185157433152199,
-0.09537050873041153,
0.21816527843475342,
-0.01658329926431179,
-0.11475633084774017,
0.14135147631168365,
0.03432516008615494,
0.05734774097800255,
0.01556575857102871,
0.0067501007579267025,
0.04295150935649872,
0.057709723711013794,
0.015147126279771328,
-0.013448968529701233,
-0.007921368815004826,
0.050203267484903336,
0.046857550740242004,
-0.04785004258155823,
-0.0539996512234211,
0.05487743392586708,
0.03962628170847893,
0.06668402999639511,
-0.02564116008579731,
0.0014572606887668371,
-0.044196952134370804,
0.12390387803316116,
-0.08079750090837479,
-0.08247010409832001,
-0.07978039234876633,
0.16136090457439423,
-0.04077814146876335,
0.0070633795112371445,
0.07560979574918747,
-0.11458918452262878,
-0.003705167444422841,
0.21641017496585846,
0.15853387117385864,
-0.06255493313074112,
-0.040536485612392426,
-0.011676434427499771,
0.005805053282529116,
0.02468438632786274,
0.1505027711391449,
0.0011893247719854116,
0.19067822396755219,
-0.10714127123355865,
0.022946413606405258,
-0.07809978723526001,
0.0013872752897441387,
-0.0023126478772610426,
-0.031083034351468086,
0.07444176822900772,
-0.03084965981543064,
-0.12468865513801575,
0.04709940776228905,
-0.03593109920620918,
-0.08336705714464188,
0.1705884337425232,
-0.11355538666248322,
-0.013355717994272709,
0.015369070693850517,
0.0974196046590805,
0.004485364072024822,
0.09483915567398071,
-0.0701717883348465,
0.0569441020488739,
0.08236896991729736,
0.056403595954179764,
-0.1310456395149231,
-0.014779405668377876,
0.05393664538860321,
-0.1769329309463501,
0.23013731837272644,
-0.030160292983055115,
0.08744390308856964,
0.03795749694108963,
-0.005048959981650114,
-0.09669341892004013,
0.03367270156741142,
-0.02960866689682007,
-0.004124847240746021,
-0.04068480804562569,
0.13005954027175903,
-0.02082349732518196,
-0.028617344796657562,
-0.03674234822392464,
-0.0960283949971199,
-0.014140329323709011,
0.01716848649084568,
0.003851089160889387,
-0.057493600994348526,
0.004866485949605703,
-0.11276040971279144,
0.10839910805225372,
0.08737626671791077,
-0.029621442779898643,
-0.0062398226000368595,
-0.10352636128664017,
-0.010819959454238415,
0.06527885049581528,
0.029364854097366333,
0.055837418884038925,
-0.0597928948700428,
-0.03503039479255676,
-0.05536431446671486,
0.03867245838046074,
-0.045839063823223114,
-0.027890672907233238,
-0.09676048904657364,
-0.016826584935188293,
-0.04991171509027481,
0.09880843013525009,
0.06235424429178238,
0.010002966970205307,
-0.0005064816796220839,
0.1392245590686798,
-0.01670060306787491,
0.048048797994852066,
-0.08172480762004852,
-0.01587703265249729
] |
null | null | null |
RoBERTa-base training attempt on Hindi datasets.
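A hedged usage sketch (not part of the original card): assuming this repository hosts standard RoBERTa-base weights and tokenizer files, it should load with the Transformers Auto classes like any other checkpoint.

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepklarity/roberta-base-hindi")
model = AutoModel.from_pretrained("deepklarity/roberta-base-hindi")
```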
|
{}
| null |
deepklarity/roberta-base-hindi
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
RoBERTa-base training attempt on Hindi datasets.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
[
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null |
transformers
|
# Perceiver IO for language
Perceiver IO model pre-trained on the Masked Language Modeling (MLM) task proposed in [BERT](https://arxiv.org/abs/1810.04805) using a large text corpus obtained by combining [English Wikipedia](https://huggingface.co/datasets/wikipedia) and [C4](https://huggingface.co/datasets/c4). It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to apply the self-attention mechanism to a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only for cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow the model to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For masked language modeling, the output is a tensor containing the prediction scores of the language modeling head, of shape (batch_size, seq_length, vocab_size).
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
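To make the encode/decode pattern above concrete, here is a minimal PyTorch sketch (not the authors' implementation; all sizes are illustrative and `nn.MultiheadAttention` stands in for the paper's attention blocks):

```python
import torch
import torch.nn as nn

batch_size, seq_len, d_model = 2, 2048, 768      # the input can be large
num_latents, d_latents = 256, 1280               # the latent array stays small

inputs = torch.randn(batch_size, seq_len, d_model)   # embedded inputs
latents = torch.randn(num_latents, d_latents)        # learned latent array
latents = latents.unsqueeze(0).expand(batch_size, -1, -1)

# Encoder cross-attention: latents are the queries, inputs the keys/values,
# so the cost is O(num_latents * seq_len) rather than O(seq_len^2).
cross_attn = nn.MultiheadAttention(embed_dim=d_latents, num_heads=8,
                                   kdim=d_model, vdim=d_model, batch_first=True)
latents, _ = cross_attn(latents, inputs, inputs)

# Latent self-attention: cost depends only on the number of latents.
self_attn = nn.MultiheadAttention(embed_dim=d_latents, num_heads=8, batch_first=True)
latents, _ = self_attn(latents, latents, latents)

# Decoder: one query per output position cross-attends to the latents,
# so outputs of arbitrary size and semantics can be produced.
decoder_queries = torch.randn(batch_size, seq_len, d_latents)
decoder_attn = nn.MultiheadAttention(embed_dim=d_latents, num_heads=8, batch_first=True)
decoded, _ = decoder_attn(decoder_queries, latents, latents)
print(decoded.shape)  # torch.Size([2, 2048, 1280])
```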
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors train the model directly on raw UTF-8 bytes, rather than on subwords as is done in models like BERT, RoBERTa and GPT-2. This has many benefits: one doesn't need to train a tokenizer before training the model, one doesn't need to maintain a (fixed) vocabulary file, and this also doesn't hurt model performance as shown by [Bostrom et al., 2020](https://arxiv.org/abs/2004.03720).
Through pre-training, the model learns an inner representation of language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the Perceiver model as inputs (a minimal sketch follows).
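A sketch of that feature-extraction recipe, assuming the standard Transformers `output_hidden_states=True` flag and that the last entry of `hidden_states` holds the final latent array; the pooling choice and the linear head with 2 labels are placeholders:

```python
import torch
from transformers import PerceiverTokenizer, PerceiverForMaskedLM

tokenizer = PerceiverTokenizer.from_pretrained("deepmind/language-perceiver")
model = PerceiverForMaskedLM.from_pretrained("deepmind/language-perceiver")

encoding = tokenizer(["a labeled example sentence"], padding="max_length",
                     return_tensors="pt")
with torch.no_grad():
    outputs = model(inputs=encoding.input_ids,
                    attention_mask=encoding.attention_mask,
                    output_hidden_states=True)

# Mean-pool the final latent states into one feature vector per example.
features = outputs.hidden_states[-1].mean(dim=1)

# Placeholder classifier head: train it on 'features' with your own labels.
classifier = torch.nn.Linear(features.size(-1), 2)
logits = classifier(features)
```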
## Intended uses & limitations
You can use the raw model for masked language modeling, but the model is intended to be fine-tuned on a labeled dataset. See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
```python
import torch
from transformers import PerceiverTokenizer, PerceiverForMaskedLM

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = PerceiverTokenizer.from_pretrained("deepmind/language-perceiver")
model = PerceiverForMaskedLM.from_pretrained("deepmind/language-perceiver").to(device)

text = "This is an incomplete sentence where some words are missing."
# prepare input
encoding = tokenizer(text, padding="max_length", return_tensors="pt")
# mask " missing." Note that the model performs much better if the masked span starts with a space.
encoding.input_ids[0, 52:61] = tokenizer.mask_token_id
inputs, input_mask = encoding.input_ids.to(device), encoding.attention_mask.to(device)
# forward pass
outputs = model(inputs=inputs, attention_mask=input_mask)
logits = outputs.logits
# read the predictions at the masked positions
masked_tokens_predictions = logits[0, 52:61].argmax(dim=-1)
print(tokenizer.decode(masked_tokens_predictions))
# should print " missing."
```
## Training data
This model was pretrained on a combination of [English Wikipedia](https://huggingface.co/datasets/wikipedia) and [C4](https://huggingface.co/datasets/c4). 70% of the training tokens were sampled from the C4 dataset and the remaining 30% from Wikipedia. The authors concatenate 10 documents before splitting into crops to reduce wasteful computation on padding tokens.
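A toy illustration of that sampling scheme (hypothetical `c4_docs`/`wiki_docs` lists of document strings; this approximates the mixture at the document level, while the real pipeline samples at the token level):

```python
import random

def sample_training_crop(c4_docs, wiki_docs, p_c4=0.7, num_concat=10, crop_len=2048):
    """Pick a source 70/30, concatenate 10 documents, take a random byte crop."""
    docs = c4_docs if random.random() < p_c4 else wiki_docs
    blob = " ".join(random.choice(docs) for _ in range(num_concat)).encode("utf-8")
    start = random.randrange(max(1, len(blob) - crop_len + 1))
    return blob[start:start + crop_len]
```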
## Training procedure
### Preprocessing
Text preprocessing is trivial: it only involves encoding text into UTF-8 bytes, and padding them up to the same length (2048).
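In plain Python, that amounts to something like the following (a sketch only; the actual `PerceiverTokenizer` additionally reserves IDs for special tokens and offsets the byte values):

```python
def encode_utf8(text: str, max_length: int = 2048, pad_id: int = 0):
    """Encode text to UTF-8 byte values, then truncate/pad to a fixed length."""
    ids = list(text.encode("utf-8"))[:max_length]
    return ids + [pad_id] * (max_length - len(ids))

ids = encode_utf8("Döner macht schöner.")  # non-ASCII chars span multiple bytes
print(len(ids), ids[:10])
```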
### Pretraining
Hyperparameter details can be found in table 9 of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
This model is able to achieve an average score of 81.8 on GLUE. For more details, we refer to table 3 of the original paper.
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"language": ["en"], "license": "apache-2.0", "datasets": ["wikipedia", "c4"], "inference": false}
|
fill-mask
|
deepmind/language-perceiver
|
[
"transformers",
"pytorch",
"perceiver",
"fill-mask",
"en",
"dataset:wikipedia",
"dataset:c4",
"arxiv:1810.04805",
"arxiv:2107.14795",
"arxiv:2004.03720",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1810.04805",
"2107.14795",
"2004.03720"
] |
[
"en"
] |
TAGS
#transformers #pytorch #perceiver #fill-mask #en #dataset-wikipedia #dataset-c4 #arxiv-1810.04805 #arxiv-2107.14795 #arxiv-2004.03720 #license-apache-2.0 #autotrain_compatible #has_space #region-us
|
# Perceiver IO for language
Perceiver IO model pre-trained on the Masked Language Modeling (MLM) task proposed in BERT using a large text corpus obtained by combining English Wikipedia and C4. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to apply the self-attention mechanism to a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only for cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow the model to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For masked language modeling, the output is a tensor containing the prediction scores of the language modeling head, of shape (batch_size, seq_length, vocab_size).
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors train the model directly on raw UTF-8 bytes, rather than on subwords as is done in models like BERT, RoBERTa and GPT-2. This has many benefits: one doesn't need to train a tokenizer before training the model, one doesn't need to maintain a (fixed) vocabulary file, and this also doesn't hurt model performance as shown by Bostrom et al., 2020.
Through pre-training, the model learns an inner representation of language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the Perceiver model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but the model is intended to be fine-tuned on a labeled dataset. See the model hub to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
## Training data
This model was pretrained on a combination of English Wikipedia and C4. 70% of the training tokens were sampled from the C4 dataset and the remaining 30% from Wikipedia. The authors concatenate 10 documents before splitting into crops to reduce wasteful computation on padding tokens.
## Training procedure
### Preprocessing
Text preprocessing is trivial: it only involves encoding text into UTF-8 bytes, and padding them up to the same length (2048).
### Pretraining
Hyperparameter details can be found in table 9 of the paper.
## Evaluation results
This model is able to achieve an average score of 81.8 on GLUE. For more details, we refer to table 3 of the original paper.
### BibTeX entry and citation info
|
[
"# Perceiver IO for language\n\nPerceiver IO model pre-trained on the Masked Language Modeling (MLM) task proposed in BERT using a large text corpus obtained by combining English Wikipedia and C4. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For masked language modeling, the output is a tensor containing the prediction scores of the language modeling head, of shape (batch_size, seq_length, vocab_size).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors train the model directly on raw UTF-8 bytes, rather than on subwords as is done in models like BERT, RoBERTa and GPT-2. This has many benefits: one doesn't need to train a tokenizer before training the model, one doesn't need to maintain a (fixed) vocabulary file, and this also doesn't hurt model performance as shown by Bostrom et al., 2020.\n\nBy pre-training the model, it learns an inner representation of language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the Perceiver model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but the model is intended to be fine-tuned on a labeled dataset. See the model hub to look for fine-tuned versions on a task that interests you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on a combination of English Wikipedia and C4. 70% of the training tokens were sampled from the C4 dataset and the remaining 30% from Wikipedia. The authors concatenate 10 documents before splitting into crops to reduce wasteful computation on padding tokens.",
"## Training procedure",
"### Preprocessing\n\nText preprocessing is trivial: it only involves encoding text into UTF-8 bytes, and padding them up to the same length (2048).",
"### Pretraining\n\nHyperparameter details can be found in table 9 of the paper.",
"## Evaluation results\n\nThis model is able to achieve an average score of 81.8 on GLUE. For more details, we refer to table 3 of the original paper.",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #fill-mask #en #dataset-wikipedia #dataset-c4 #arxiv-1810.04805 #arxiv-2107.14795 #arxiv-2004.03720 #license-apache-2.0 #autotrain_compatible #has_space #region-us \n",
"# Perceiver IO for language\n\nPerceiver IO model pre-trained on the Masked Language Modeling (MLM) task proposed in BERT using a large text corpus obtained by combining English Wikipedia and C4. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For masked language modeling, the output is a tensor containing the prediction scores of the language modeling head, of shape (batch_size, seq_length, vocab_size).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors train the model directly on raw UTF-8 bytes, rather than on subwords as is done in models like BERT, RoBERTa and GPT-2. This has many benefits: one doesn't need to train a tokenizer before training the model, one doesn't need to maintain a (fixed) vocabulary file, and this also doesn't hurt model performance as shown by Bostrom et al., 2020.\n\nBy pre-training the model, it learns an inner representation of language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the Perceiver model as inputs.",
"## Intended uses & limitations\n\nYou can use the raw model for masked language modeling, but the model is intended to be fine-tuned on a labeled dataset. See the model hub to look for fine-tuned versions on a task that interests you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on a combination of English Wikipedia and C4. 70% of the training tokens were sampled from the C4 dataset and the remaining 30% from Wikipedia. The authors concatenate 10 documents before splitting into crops to reduce wasteful computation on padding tokens.",
"## Training procedure",
"### Preprocessing\n\nText preprocessing is trivial: it only involves encoding text into UTF-8 bytes, and padding them up to the same length (2048).",
"### Pretraining\n\nHyperparameter details can be found in table 9 of the paper.",
"## Evaluation results\n\nThis model is able to achieve an average score of 81.8 on GLUE. For more details, we refer to table 3 of the original paper.",
"### BibTeX entry and citation info"
] |
[
82,
135,
436,
61,
17,
70,
3,
41,
18,
34,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #fill-mask #en #dataset-wikipedia #dataset-c4 #arxiv-1810.04805 #arxiv-2107.14795 #arxiv-2004.03720 #license-apache-2.0 #autotrain_compatible #has_space #region-us \n# Perceiver IO for language\n\nPerceiver IO model pre-trained on the Masked Language Modeling (MLM) task proposed in BERT using a large text corpus obtained by combining English Wikipedia and C4. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.06256050616502762,
0.024924056604504585,
-0.002379684243351221,
0.09102912247180939,
0.07485388964414597,
0.058226048946380615,
0.18488681316375732,
0.047881871461868286,
0.09730695188045502,
0.04362297058105469,
0.11323139816522598,
-0.015594417229294777,
0.062176413834095,
0.18655507266521454,
0.06666634976863861,
-0.18582232296466827,
0.05032951012253761,
0.02970164641737938,
0.0076988027431070805,
0.10849414020776749,
0.09387098252773285,
-0.0766664445400238,
0.1265362799167633,
0.031540557742118835,
-0.15855897963047028,
-0.010170672088861465,
-0.08547535538673401,
-0.05129330977797508,
0.08590836822986603,
-0.030652949586510658,
0.09879107773303986,
-0.026676738634705544,
0.04883527010679245,
-0.0005024191341362894,
0.03718949109315872,
0.06779111921787262,
0.024830175563693047,
0.08635684847831726,
0.043655697256326675,
-0.020173845812678337,
0.06334520131349564,
-0.02465665340423584,
0.026326986029744148,
0.00640395563095808,
-0.08342695981264114,
-0.23528483510017395,
-0.012454000301659107,
0.16836097836494446,
-0.053360603749752045,
0.08168362826108932,
-0.006073862314224243,
0.09936024248600006,
0.0481671504676342,
0.05540340393781662,
0.0972827598452568,
-0.1827649027109146,
-0.03676912933588028,
-0.019626349210739136,
-0.008968310430645943,
-0.007629459258168936,
-0.023209135979413986,
-0.02110629715025425,
0.0477822981774807,
0.024813013151288033,
0.09340634196996689,
-0.03502502292394638,
-0.011704951524734497,
-0.08740223199129105,
-0.10437916219234467,
-0.03937060758471489,
0.20487020909786224,
-0.009203661233186722,
-0.08376092463731766,
-0.043471671640872955,
-0.08618860691785812,
0.024913422763347626,
0.0516728051006794,
-0.056947704404592514,
0.08007702976465225,
-0.024590788409113884,
0.13661043345928192,
-0.11614971607923508,
-0.07568112760782242,
-0.06525956094264984,
-0.18002481758594513,
0.08640715479850769,
0.027280155569314957,
0.07848571240901947,
-0.06963221728801727,
0.033093441277742386,
-0.2102607786655426,
-0.07305587083101273,
-0.03154722601175308,
-0.10120174288749695,
0.02691538818180561,
0.06295913457870483,
-0.005100710317492485,
-0.09945206344127655,
-0.010624413378536701,
0.06202588975429535,
0.06444726139307022,
0.008312082849442959,
0.011796030215919018,
0.10469067841768265,
0.09618088603019714,
0.10753089934587479,
-0.09310656040906906,
0.015789205208420753,
0.07492774724960327,
0.05141672492027283,
0.055712901055812836,
-0.05976568162441254,
-0.04953227937221527,
0.04628932476043701,
0.010017480701208115,
-0.04479140415787697,
0.05470254272222519,
0.10038717836141586,
-0.014300117269158363,
-0.039150405675172806,
0.09943032264709473,
-0.12360342592000961,
-0.01473542582243681,
-0.005054122302681208,
-0.040269773453474045,
0.06535088270902634,
0.17477700114250183,
-0.014198563992977142,
-0.03509592264890671,
-0.02752295881509781,
-0.05919447913765907,
-0.004313457291573286,
-0.08401135355234146,
-0.14256159961223602,
0.02003643289208412,
-0.20587804913520813,
-0.005579638760536909,
-0.1532200425863266,
-0.1294170618057251,
0.02187936380505562,
0.11425695568323135,
-0.03425276279449463,
-0.015594747848808765,
-0.007905220612883568,
0.031581100076436996,
-0.028108226135373116,
0.011555173434317112,
-0.06397534161806107,
-0.008107260800898075,
-0.005952549632638693,
-0.04531418904662132,
0.1429935097694397,
-0.2019275426864624,
0.0625516027212143,
-0.14059676229953766,
0.06286018341779709,
-0.06249116361141205,
-0.020844509825110435,
0.00726320780813694,
-0.028538841754198074,
-0.0894380435347557,
-0.037004515528678894,
-0.05823870748281479,
0.0614042654633522,
0.059140969067811966,
0.11805276572704315,
-0.06903492659330368,
-0.04676283150911331,
0.11032924056053162,
-0.18233472108840942,
-0.09832059592008591,
0.12756530940532684,
-0.004676461685448885,
0.15622493624687195,
-0.039174869656562805,
0.1692499816417694,
0.008680833503603935,
-0.10100990533828735,
0.021077211946249008,
0.03588316962122917,
-0.06964636594057083,
-0.016126016154885292,
0.08778998255729675,
-0.000818450003862381,
0.005880124866962433,
0.035368118435144424,
-0.027516020461916924,
0.007047723978757858,
-0.032969675958156586,
-0.03959112614393234,
-0.0018975461134687066,
-0.10853850841522217,
0.015373843722045422,
-0.0008569422061555088,
0.03212643042206764,
0.004747579339891672,
-0.03427135571837425,
-0.06128103286027908,
0.03547674044966698,
-0.07643423974514008,
0.020803408697247505,
-0.07231658697128296,
0.1162220686674118,
-0.10223112255334854,
0.022503141313791275,
-0.12617984414100647,
0.03848014026880264,
0.014988508075475693,
-0.07800990343093872,
0.08578053116798401,
0.10883915424346924,
0.06666552275419235,
0.09418854862451553,
0.014058214612305164,
-0.026146771386265755,
-0.013097879476845264,
-0.02339138649404049,
-0.02285870350897312,
-0.1623470038175583,
-0.04172312468290329,
-0.08328498899936676,
0.1283426284790039,
0.04678339511156082,
0.03852469101548195,
-0.11844214051961899,
0.0015569787938147783,
0.0019523968221619725,
-0.03140838071703911,
0.02764463983476162,
0.015554739162325859,
-0.03349703922867775,
-0.02920585684478283,
0.11652621626853943,
0.00976409949362278,
0.0462716706097126,
0.12045973539352417,
-0.1273232400417328,
-0.05881053954362869,
0.1626417636871338,
-0.01737624779343605,
-0.011589344590902328,
0.06679709255695343,
-0.019179251044988632,
-0.018539709970355034,
0.06562352925539017,
-0.026017416268587112,
0.2607419192790985,
-0.02534627728164196,
0.09898263216018677,
-0.08314511924982071,
0.03699703887104988,
0.07390224188566208,
-0.07116147130727768,
-0.10718794167041779,
0.060736607760190964,
0.08253970742225647,
-0.02822708897292614,
0.1284143179655075,
0.13375021517276764,
-0.05692639574408531,
0.13822659850120544,
0.045844659209251404,
-0.02367190271615982,
0.0016842182958498597,
-0.02558928355574608,
0.0056480523198843,
0.10094433277845383,
-0.24636390805244446,
-0.09036387503147125,
0.09667406231164932,
-0.023309392854571342,
0.011442722752690315,
-0.07464700937271118,
0.03381897881627083,
0.04450015351176262,
0.015851551666855812,
-0.07228654623031616,
0.04904978349804878,
-0.05141415819525719,
0.06210409849882126,
0.03999226167798042,
-0.06078726053237915,
0.023352613672614098,
0.0054184990003705025,
-0.09264472872018814,
0.09738194197416306,
-0.08222829550504684,
-0.27296435832977295,
-0.138036847114563,
-0.03189767897129059,
-0.10642329603433609,
0.03317669779062271,
0.01880321279168129,
-0.014049269258975983,
-0.035817455500364304,
-0.10246260464191437,
0.0740121454000473,
-0.0062033142894506454,
-0.02389209344983101,
0.10954058915376663,
-0.07247111201286316,
0.05129262059926987,
-0.10607749968767166,
-0.005813885014504194,
-0.04143230989575386,
-0.043773677200078964,
0.09386676549911499,
-0.04723271727561951,
0.139432892203331,
0.06324665248394012,
-0.06639227271080017,
0.03824293240904808,
-0.007350986357778311,
0.19872842729091644,
0.012534997425973415,
0.05832870677113533,
0.2553759813308716,
-0.0051549575291574,
0.03304550051689148,
0.11949633061885834,
-0.04490230605006218,
-0.037417978048324585,
-0.03623098134994507,
-0.023404235020279884,
-0.11446205526590347,
-0.1362966001033783,
-0.038010843098163605,
-0.043129995465278625,
0.09212742745876312,
0.0883655920624733,
0.031181173399090767,
-0.039845265448093414,
0.09656548500061035,
0.027414854615926743,
0.0731687992811203,
-0.0019867669325321913,
0.05579482018947601,
-0.06874839216470718,
-0.018359320238232613,
0.07783005386590958,
-0.05947807431221008,
0.024561436846852303,
0.1091793105006218,
0.034595977514982224,
0.1561310738325119,
-0.02951844222843647,
-0.008127782493829727,
0.07608066499233246,
0.09791065007448196,
0.04776821658015251,
0.19606992602348328,
-0.13143867254257202,
0.05962257459759712,
-0.042384713888168335,
-0.049397144466638565,
-0.0964188501238823,
0.06584000587463379,
-0.04390690103173256,
0.03099146857857704,
-0.016058120876550674,
0.06199737638235092,
0.013271183706820011,
0.2148190289735794,
0.025165831670165062,
-0.2309427261352539,
-0.040090613067150116,
-0.005091827362775803,
0.09156181663274765,
-0.15590690076351166,
-0.0044172825291752815,
0.055298227816820145,
-0.035793486982584,
0.07697465270757675,
-0.03549686819314957,
0.042546629905700684,
-0.10259024798870087,
0.0281470138579607,
0.006086318287998438,
0.11462700366973877,
0.0636366605758667,
0.07330860942602158,
-0.14345845580101013,
0.1964232325553894,
-0.00836122129112482,
0.005477983970195055,
-0.10194823890924454,
0.01887570321559906,
0.009585647843778133,
0.08692914247512817,
0.21915031969547272,
0.031199999153614044,
0.1545659601688385,
-0.09336820244789124,
-0.11211539059877396,
0.02578125149011612,
0.06547358632087708,
-0.024250062182545662,
-0.00467428844422102,
0.01231998112052679,
0.0025783267337828875,
-0.0132204070687294,
0.22882048785686493,
-0.03962413966655731,
-0.1430053412914276,
0.05485786125063896,
0.010286412201821804,
0.028500093147158623,
-0.016999991610646248,
-0.07636214047670364,
-0.13691100478172302,
0.14043553173542023,
-0.017567722126841545,
-0.06481266766786575,
-0.10384368151426315,
-0.040125567466020584,
0.04159039631485939,
-0.055088672786951065,
0.03433657065033913,
-0.12300650030374527,
0.07096245884895325,
-0.0936611071228981,
-0.14455512166023254,
0.05889301747083664,
-0.09643886983394623,
-0.03026316873729229,
-0.015091823413968086,
0.019028151407837868,
0.0465545617043972,
0.03410949185490608,
-0.0037586649414151907,
0.028719985857605934,
-0.11206495761871338,
-0.048134446144104004,
0.0979602187871933,
0.13851864635944366,
0.004564790055155754,
0.03374598175287247,
-0.09572693705558777,
-0.15528598427772522,
0.060304392129182816,
0.03029518574476242,
0.0862012580037117,
0.1571250557899475,
-0.018427236005663872,
0.15801990032196045,
0.2243693470954895,
-0.10423781722784042,
-0.3210165798664093,
-0.03841223567724228,
-0.01635851338505745,
0.07926296442747116,
-0.07574325054883957,
-0.1519801914691925,
0.015323936007916927,
0.045795783400535583,
0.0066034309566020966,
0.044341105967760086,
-0.05502597987651825,
-0.10617055743932724,
0.11829469352960587,
0.11191854625940323,
0.2112448662519455,
-0.09213631600141525,
0.010393830947577953,
-0.0513492077589035,
-0.11796656250953674,
0.04233003780245781,
-0.13867153227329254,
0.08566352725028992,
-0.03961249813437462,
0.0032325158827006817,
-0.004307513125240803,
-0.057350076735019684,
0.12073130160570145,
-0.08762213587760925,
0.03603352978825569,
-0.09011120349168777,
-0.02944844588637352,
0.11681070923805237,
-0.02885560877621174,
0.1824081689119339,
0.0015174042200669646,
-0.013246080838143826,
-0.029060013592243195,
-0.05877939239144325,
-0.11787979304790497,
0.022420722991228104,
0.012307330965995789,
-0.10324525833129883,
-0.03400520980358124,
0.04227104410529137,
0.044274527579545975,
0.014802306890487671,
0.01907147467136383,
-0.05154181644320488,
0.07312344759702682,
0.1596575230360031,
0.11996755748987198,
-0.07579617947340012,
-0.08273284882307053,
-0.0419180765748024,
-0.046362150460481644,
0.05793146789073944,
-0.14234066009521484,
-0.0011950330808758736,
0.07538292557001114,
-0.014779996126890182,
0.07093676179647446,
0.031206104904413223,
-0.11130021512508392,
0.03268652781844139,
0.03071518987417221,
-0.1787109225988388,
-0.026036353781819344,
0.032374005764722824,
-0.01571248471736908,
-0.010212563909590244,
0.05744967609643936,
0.13640761375427246,
-0.09033269435167313,
-0.009337883442640305,
-0.032339729368686676,
0.03740088269114494,
-0.01498465333133936,
-0.0068786293268203735,
0.0792907178401947,
0.023049240931868553,
-0.07244785875082016,
0.12476438283920288,
0.062142565846443176,
-0.04028603062033653,
0.07092767208814621,
0.03074926882982254,
-0.09241105616092682,
-0.10665521025657654,
0.013067968189716339,
0.19441881775856018,
0.059469856321811676,
-0.11567769199609756,
-0.052747081965208054,
-0.09760048985481262,
-0.009650086984038353,
0.08635782450437546,
0.03959174454212189,
-0.028216397389769554,
-0.04497654363512993,
-0.03040860965847969,
-0.0768732950091362,
0.04468819871544838,
-0.06267257779836655,
-0.011718238703906536,
-0.046646080911159515,
-0.0059541985392570496,
0.04744976386427879,
0.12557899951934814,
-0.07391176372766495,
-0.05406034365296364,
-0.06237081065773964,
-0.011737964116036892,
-0.004676661919802427,
0.026946352794766426,
-0.04314861446619034,
-0.035173166543245316,
0.01395559310913086,
-0.023670539259910583,
-0.034283190965652466,
0.058149419724941254,
-0.07341533154249191,
0.003003376070410013,
-0.029719000682234764,
-0.004128537140786648,
-0.07015829533338547,
-0.03717762604355812,
0.0007448455435223877,
-0.013570929877460003,
0.10213277488946915,
-0.025028809905052185,
-0.09850207716226578,
-0.06566482782363892,
-0.16625826060771942,
-0.009975938126444817,
0.017994653433561325,
0.03194567188620567,
0.022551612928509712,
-0.09154178202152252,
-0.02080729231238365,
0.02859627455472946,
-0.09820286184549332,
-0.004563681781291962,
0.20733121037483215,
-0.00453609973192215,
0.07973546534776688,
0.08549326658248901,
-0.018162278458476067,
-0.03530985116958618,
0.05304925516247749,
0.07975985109806061,
0.057525426149368286,
0.004775511100888252,
-0.016970673575997353,
0.06648002564907074,
-0.0991416871547699,
-0.001696683932095766,
0.03768274933099747,
-0.07434911280870438,
0.0011267663212493062,
-0.055006496608257294,
0.03301111236214638,
0.014585119672119617,
0.18116073310375214,
0.08862940967082977,
-0.033871106803417206,
-0.01851011998951435,
0.0660254955291748,
-0.054420582950115204,
0.00691383657976985,
-0.012886087410151958,
-0.05734190717339516,
0.006642022170126438,
0.04275771975517273,
0.0855577141046524,
0.03089555725455284,
0.17677859961986542,
0.10910201817750931,
0.036581069231033325,
0.19561322033405304,
0.09409265965223312,
0.05798907205462456,
-0.010294206440448761,
-0.024883145466446877,
-0.09931331872940063,
0.018109949305653572,
0.0931611880660057,
-0.049442537128925323,
0.1002943217754364,
0.04938337206840515,
-0.058372411876916885,
0.05760499835014343,
-0.04699328541755676,
-0.05414453148841858,
-0.054181333631277084,
-0.23153235018253326,
-0.0331091545522213,
-0.051386989653110504,
-0.01820903643965721,
-0.02691812254488468,
-0.0233721062541008,
0.07894349098205566,
0.03859059140086174,
-0.015792569145560265,
0.11023160070180893,
-0.15569429099559784,
-0.026905490085482597,
0.07810681313276291,
0.004947388544678688,
-0.03581199049949646,
-0.10435076802968979,
-0.0557401105761528,
0.023519689217209816,
0.0486847385764122,
0.04158385843038559,
0.056748565286397934,
0.08618482202291489,
0.028470411896705627,
-0.02424662932753563,
-0.12002900242805481,
-0.03770258277654648,
0.0067422520369291306,
0.007260559592396021,
0.07327710837125778,
-0.04266417399048805,
-0.027775591239333153,
0.0020481767132878304,
-0.014542060904204845,
0.009725244715809822,
-0.030314605683088303,
-0.08058745414018631,
0.07598935812711716,
-0.10266849398612976,
-0.014832224696874619,
0.0009576489683240652,
-0.06748438626527786,
-0.0868045762181282,
0.28949475288391113,
0.28393813967704773,
-0.03430946543812752,
0.0058339620009064674,
0.027139492332935333,
0.018526090309023857,
-0.006871058605611324,
0.04770791530609131,
0.09214600920677185,
0.14848217368125916,
-0.06826569139957428,
0.060384154319763184,
-0.056219663470983505,
-0.023128364235162735,
-0.03537827730178833,
0.09138748794794083,
0.007526574190706015,
-0.012980624102056026,
-0.06983508169651031,
-0.07339847087860107,
-0.06064992398023605,
-0.16833031177520752,
0.17637738585472107,
-0.14372500777244568,
-0.06702141463756561,
-0.0419265441596508,
0.07666663825511932,
0.060115814208984375,
0.04912220314145088,
-0.009802884422242641,
0.018231594935059547,
0.10000459104776382,
0.012540400959551334,
-0.1405276358127594,
-0.09737240523099899,
0.039529260247945786,
-0.1272251307964325,
0.1494162380695343,
0.0012254841858521104,
0.06621452420949936,
0.03280232474207878,
-0.017264030873775482,
-0.09317464381456375,
0.020825549960136414,
-0.019518088549375534,
-0.0006656922050751746,
-0.04212546348571777,
0.08257804811000824,
-0.03725587576627731,
-0.01912933960556984,
-0.005789006128907204,
0.019764097407460213,
-0.015864690765738487,
0.014444827102124691,
-0.09918244183063507,
0.0028632006142288446,
0.05372164398431778,
-0.12585128843784332,
0.12257718294858932,
0.07736793160438538,
0.031011337414383888,
-0.10481690615415573,
-0.049315694719552994,
0.07600750774145126,
-0.03472502902150154,
-0.08504175394773483,
-0.0010411926778033376,
-0.11303941905498505,
-0.007050748448818922,
-0.11025245487689972,
0.05762026086449623,
-0.24157822132110596,
-0.0742628201842308,
-0.11534897983074188,
-0.02778441831469536,
-0.07881437987089157,
-0.0262443907558918,
0.08304677903652191,
0.018836945295333862,
-0.052706390619277954,
0.0741858184337616,
0.052797187119722366,
0.053349822759628296,
-0.08208788186311722,
-0.08468558639287949
] |
null | null |
transformers
|
# Perceiver IO for multimodal autoencoding
Perceiver IO model trained on [Kinetics-700-2020](https://arxiv.org/abs/2010.10864) for auto-encoding videos that consist of images, audio and a class label. It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
The goal of multimodal autoencoding is to learn a model that can accurately reconstruct multimodal inputs in the presence of a bottleneck induced by an architecture.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only to perform cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which make it possible to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For multimodal autoencoding, the output contains the reconstructions of the 3 modalities: images, audio and the class label.
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model by padding the inputs (images, audio, class label) with modality-specific embeddings and serialize all of them into a 2D input array (i.e. concatenate along the time dimension). Decoding the final hidden states of the latents is done by using queries containing Fourier-based position embeddings (for video and audio) and modality embeddings.
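To make this decoupling concrete, here is a minimal, conceptual PyTorch sketch of the latent bottleneck (an illustration only, not the actual Perceiver IO implementation; the latent count, width, and input length are assumed values):

```python
import torch
import torch.nn as nn

d_model, num_latents = 512, 256  # e.g. 256 latent vectors of width 512
latents = nn.Parameter(torch.randn(1, num_latents, d_model))

cross_attn = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)
self_attn = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)

# A long, serialized (multimodal) input array; its length only affects the cross-attention.
inputs = torch.randn(1, 50_000, d_model)

# Cross-attention: latents are the queries, inputs are keys/values -> O(num_latents * input_len).
hidden, _ = cross_attn(latents, inputs, inputs)
# Self-attention among the latents only -> O(num_latents^2), independent of the input size.
hidden, _ = self_attn(hidden, hidden, hidden)
print(hidden.shape)  # torch.Size([1, 256, 512])
```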
## Intended uses & limitations
You can use the raw model for multimodal autoencoding. Note that by masking the class label during evaluation, the auto-encoding model becomes a video classifier.
See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for other versions on a task that may interest you.
### How to use
We refer to the [tutorial notebook](https://github.com/NielsRogge/Transformers-Tutorials/blob/master/Perceiver/Perceiver_for_Multimodal_Autoencoding.ipynb) regarding using the Perceiver for multimodal autoencoding.
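For a quick start, the following is a minimal sketch of a forward pass with the 🤗 Transformers implementation, closely following the documented `PerceiverForMultimodalAutoencoding` usage; random tensors stand in for a real preprocessed video, and the chunked decoding mirrors the paper's setup:

```python
import numpy as np
import torch
from transformers import PerceiverForMultimodalAutoencoding

model = PerceiverForMultimodalAutoencoding.from_pretrained("deepmind/multimodal-perceiver")

# Dummy multimodal inputs: 16 RGB frames at 224x224, 30720 audio samples, and a (masked) label.
images = torch.randn((1, 16, 3, 224, 224))
audio = torch.randn((1, 30720, 1))
inputs = dict(image=images, audio=audio, label=torch.zeros((images.shape[0], 700)))

# Videos are auto-encoded in chunks: each chunk subsamples different index dimensions
# of the image and audio decoder queries to keep memory in check.
nchunks = 128
image_chunk_size = np.prod((16, 224, 224)) // nchunks
audio_chunk_size = audio.shape[1] // model.config.samples_per_patch // nchunks
chunk_idx = 0  # process only the first chunk here
subsampling = {
    "image": torch.arange(image_chunk_size * chunk_idx, image_chunk_size * (chunk_idx + 1)),
    "audio": torch.arange(audio_chunk_size * chunk_idx, audio_chunk_size * (chunk_idx + 1)),
    "label": None,
}

outputs = model(inputs=inputs, subsampled_output_points=subsampling)
reconstructions = outputs.logits  # per-modality reconstructions for this chunk
```

Note that passing an all-zeros label, as above, is exactly the masking trick that turns the autoencoder into a video classifier: the label reconstruction then acts as the class prediction.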
## Training data
This model was trained on [Kinetics-700-2020](https://arxiv.org/abs/2010.10864), a dataset consisting of videos that belong to one of 700 classes.
## Training procedure
### Preprocessing
The authors train on 16 frames at 224x224 resolution, preprocessed into 50k 4x4 image patches, together with 30k raw audio samples patched into a total of 1920 16-dimensional vectors, plus one 700-dimensional one-hot representation of the class label.
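As a quick back-of-the-envelope check of these figures (plain arithmetic, not official preprocessing code; the exact audio sample count of 30,720 is an assumption consistent with 1920 × 16):

```python
# 16 frames of 224x224, split into 4x4 patches
frames, height, width, patch = 16, 224, 224, 4
print(frames * (height // patch) * (width // patch))  # 50176 -> the "50k" patches above

# 30720 raw audio samples, grouped into 16-dimensional vectors
audio_samples, samples_per_vector = 30720, 16
print(audio_samples // samples_per_vector)  # 1920 vectors
```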
### Pretraining
Hyperparameter details can be found in Appendix F of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
For evaluation results, we refer to table 5 of the [paper](https://arxiv.org/abs/2107.14795).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"license": "apache-2.0", "datasets": ["kinetics-700-2020"]}
| null |
deepmind/multimodal-perceiver
|
[
"transformers",
"pytorch",
"perceiver",
"dataset:kinetics-700-2020",
"arxiv:2010.10864",
"arxiv:2107.14795",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10864",
"2107.14795"
] |
[] |
TAGS
#transformers #pytorch #perceiver #dataset-kinetics-700-2020 #arxiv-2010.10864 #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #region-us
|
# Perceiver IO for multimodal autoencoding
Perceiver IO model trained on Kinetics-700-2020 for auto-encoding videos that consist of images, audio and a class label. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
The goal of multimodal autoencoding is to learn a model that can accurately reconstruct multimodal inputs in the presence of a bottleneck induced by an architecture.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For multimodal autoencoding, the output contains the reconstructions of the 3 modalities: images, audio and the class label.
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model by padding the inputs (images, audio, class label) with modality-specific embeddings and serialize all of them into a 2D input array (i.e. concatenate along the time dimension). Decoding the final hidden states of the latents is done by using queries containing Fourier-based position embeddings (for video and audio) and modality embeddings.
## Intended uses & limitations
You can use the raw model for multimodal autoencoding. Note that by masking the class label during evaluation, the auto-encoding model becomes a video classifier.
See the model hub to look for other versions on a task that may interest you.
### How to use
We refer to the tutorial notebook regarding using the Perceiver for multimodal autoencoding.
## Training data
This model was trained on Kinetics-700-2020, a dataset consisting of videos that belong to one of 700 classes.
## Training procedure
### Preprocessing
The authors train on 16 frames at 224x224 resolution, preprocessed into 50k 4x4 patches as well as 30k raw audio samples, patched into a total of 1920 16-dimensional vectors and one 700-dimensional one-hot representation of the class label.
### Pretraining
Hyperparameter details can be found in Appendix F of the paper.
## Evaluation results
For evaluation results, we refer to table 5 of the paper.
### BibTeX entry and citation info
|
[
"# Perceiver IO for multimodal autoencoding\n\nPerceiver IO model trained on Kinetics-700-2020 for auto-encoding videos that consist of images, audio and a class label. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nThe goal of multimodal autoencoding is to learn a model that can accurately reconstruct multimodal inputs in the presence of a bottleneck induced by an architecture.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For multimodal autoencoding, the output contains the reconstructions of the 3 modalities: images, audio and the class label.\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model by padding the inputs (images, audio, class label) with modality-specific embeddings and serialize all of them into a 2D input array (i.e. concatenate along the time dimension). Decoding the final hidden states of the latents is done by using queries containing Fourier-based position embeddings (for video and audio) and modality embeddings.",
"## Intended uses & limitations\n\nYou can use the raw model for multimodal autoencoding. Note that by masking the class label during evaluation, the auto-encoding model becomes a video classifier.\n\nSee the model hub to look for other versions on a task that may interest you.",
"### How to use\n\nWe refer to the tutorial notebook regarding using the Perceiver for multimodal autoencoding.",
"## Training data\n\nThis model was trained on Kinetics-700-200, a dataset consisting of videos that belong to one of 700 classes.",
"## Training procedure",
"### Preprocessing\n\nThe authors train on 16 frames at 224x224 resolution, preprocessed into 50k 4x4 patches as well as 30k raw audio samples, patched into a total of 1920 16-dimensional vectors and one 700-dimensional one-hot representation of the class label.",
"### Pretraining\n\nHyperparameter details can be found in Appendix F of the paper.",
"## Evaluation results\n\nFor evaluation results, we refer to table 5 of the paper.",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #dataset-kinetics-700-2020 #arxiv-2010.10864 #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Perceiver IO for multimodal autoencoding\n\nPerceiver IO model trained on Kinetics-700-2020 for auto-encoding videos that consist of images, audio and a class label. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nThe goal of multimodal autoencoding is to learn a model that can accurately reconstruct multimodal inputs in the presence of a bottleneck induced by an architecture.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For multimodal autoencoding, the output contains the reconstructions of the 3 modalities: images, audio and the class label.\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model by padding the inputs (images, audio, class label) with modality-specific embeddings and serialize all of them into a 2D input array (i.e. concatenate along the time dimension). Decoding the final hidden states of the latents is done by using queries containing Fourier-based position embeddings (for video and audio) and modality embeddings.",
"## Intended uses & limitations\n\nYou can use the raw model for multimodal autoencoding. Note that by masking the class label during evaluation, the auto-encoding model becomes a video classifier.\n\nSee the model hub to look for other versions on a task that may interest you.",
"### How to use\n\nWe refer to the tutorial notebook regarding using the Perceiver for multimodal autoencoding.",
"## Training data\n\nThis model was trained on Kinetics-700-200, a dataset consisting of videos that belong to one of 700 classes.",
"## Training procedure",
"### Preprocessing\n\nThe authors train on 16 frames at 224x224 resolution, preprocessed into 50k 4x4 patches as well as 30k raw audio samples, patched into a total of 1920 16-dimensional vectors and one 700-dimensional one-hot representation of the class label.",
"### Pretraining\n\nHyperparameter details can be found in Appendix F of the paper.",
"## Evaluation results\n\nFor evaluation results, we refer to table 5 of the paper.",
"### BibTeX entry and citation info"
] |
[
60,
167,
360,
65,
26,
31,
3,
67,
20,
17,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #dataset-kinetics-700-2020 #arxiv-2010.10864 #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #region-us \n# Perceiver IO for multimodal autoencoding\n\nPerceiver IO model trained on Kinetics-700-2020 for auto-encoding videos that consist of images, audio and a class label. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nThe goal of multimodal autoencoding is to learn a model that can accurately reconstruct multimodal inputs in the presence of a bottleneck induced by an architecture.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.08104176074266434,
-0.039976317435503006,
0.00027418689569458365,
0.09463632106781006,
0.13279643654823303,
0.04299332946538925,
0.14917486906051636,
0.03196999430656433,
0.05955786630511284,
0.06414782255887985,
0.08900796622037888,
0.052211176604032516,
0.037747256457805634,
0.24020491540431976,
0.04633051156997681,
-0.16826733946800232,
0.0546109676361084,
0.067265085875988,
0.18443354964256287,
0.10324542224407196,
0.0955648273229599,
-0.08366332203149796,
0.1265467405319214,
0.018090270459651947,
-0.26534488797187805,
-0.039823830127716064,
-0.07153238356113434,
-0.0010024654911831021,
0.06762784719467163,
-0.0060644252225756645,
0.05318484827876091,
-0.008581314235925674,
0.062282055616378784,
0.023502502590417862,
0.023698056116700172,
0.060023121535778046,
0.0025759730488061905,
0.052359290421009064,
0.028003500774502754,
-0.0298030823469162,
0.11528114229440689,
0.017524147406220436,
0.0043378304690122604,
0.023574309423565865,
-0.04167412966489792,
-0.182297021150589,
-0.04098885878920555,
0.15965500473976135,
0.04190234839916229,
0.05827289819717407,
-0.012185166589915752,
0.10441169142723083,
0.05597398430109024,
0.08550991863012314,
0.04204272851347923,
-0.1378239095211029,
-0.046923793852329254,
0.056601349264383316,
0.006914702709764242,
-0.015119762159883976,
-0.017223043367266655,
-0.005896500777453184,
0.06977126002311707,
0.03806473687291145,
0.06909617781639099,
-0.04121426120400429,
0.07047650963068008,
-0.08945630490779877,
-0.08809428662061691,
-0.054403070360422134,
0.2226867973804474,
-0.0010849435348063707,
-0.12244940549135208,
-0.07832423597574234,
-0.044659774750471115,
0.02526244893670082,
0.008046053349971771,
-0.032404035329818726,
0.04453522711992264,
-0.03509800136089325,
0.06031864881515503,
-0.04988865554332733,
-0.06263087689876556,
-0.1034667119383812,
-0.13781897723674774,
0.013521206565201283,
0.026457728818058968,
0.07700343430042267,
-0.06557111442089081,
0.08668152987957001,
-0.1741436868906021,
-0.04236766695976257,
-0.0558643639087677,
-0.11351653933525085,
-0.01680370420217514,
0.02826567366719246,
-0.037729766219854355,
-0.013937214389443398,
-0.07945200055837631,
0.1102447435259819,
0.13685579597949982,
0.0269180815666914,
0.05343068391084671,
0.0966959148645401,
0.1433558464050293,
0.04405581206083298,
-0.018635068088769913,
0.009458597749471664,
0.03604511544108391,
0.05539257079362869,
0.028425468131899834,
-0.0734044536948204,
-0.050007548183202744,
0.07585998624563217,
0.06437420845031738,
-0.07243192195892334,
0.015421202406287193,
0.06394599378108978,
-0.01337161473929882,
-0.047250643372535706,
0.13205336034297943,
-0.06609301269054413,
-0.0023156199604272842,
-0.03868242725729942,
-0.03699687868356705,
0.1339973360300064,
0.1262475997209549,
-0.005728802178055048,
-0.020414123311638832,
0.04622573405504227,
-0.06095043942332268,
0.02083609253168106,
-0.11465606838464737,
-0.09214656054973602,
0.021751904860138893,
-0.18222372233867645,
0.024813830852508545,
-0.15157778561115265,
-0.20485550165176392,
-0.04416453465819359,
0.08688122779130936,
-0.05961545184254646,
-0.04506456106901169,
-0.01138241682201624,
0.06902972608804703,
-0.015054219402372837,
-0.009531963616609573,
-0.017887664958834648,
0.012163465842604637,
0.01878795400261879,
-0.0014427019050344825,
0.11696217954158783,
-0.14279434084892273,
0.061710845679044724,
-0.12040910124778748,
0.03903704136610031,
-0.07003552466630936,
0.004523991607129574,
0.01249659713357687,
0.0034041604958474636,
-0.04657177999615669,
-0.03734717518091202,
-0.07376661151647568,
0.08441802859306335,
0.0483761690557003,
0.09834647178649902,
-0.09583786875009537,
-0.05990532413125038,
0.11173494905233383,
-0.20299582183361053,
-0.11593282222747803,
0.10451159626245499,
-0.013374797999858856,
0.18373632431030273,
-0.0025163404643535614,
0.11042837053537369,
0.061704035848379135,
-0.09240392595529556,
-0.03692859783768654,
0.027515046298503876,
-0.11417526751756668,
-0.0894019603729248,
0.04277190938591957,
0.06602226942777634,
-0.022230537608265877,
0.047098249197006226,
-0.012826526537537575,
-0.005640268325805664,
-0.05321347713470459,
-0.03149287402629852,
-0.02910473570227623,
-0.09248993545770645,
-0.06273844838142395,
0.02725452184677124,
0.04559134319424629,
-0.007685541640967131,
-0.02696811966598034,
-0.017680374905467033,
0.05318083241581917,
-0.09869243204593658,
0.026545926928520203,
-0.03643673285841942,
0.11806128919124603,
-0.11778976768255234,
0.029927683994174004,
-0.14209046959877014,
0.01046103797852993,
0.013374140486121178,
-0.10405930876731873,
0.04096527770161629,
0.10555064678192139,
0.047535475343465805,
0.10791407525539398,
0.008766017854213715,
-0.049323439598083496,
0.018767142668366432,
-0.022756824269890785,
-0.040644705295562744,
-0.11646950244903564,
-0.052180856466293335,
-0.10136645287275314,
0.16700775921344757,
0.007389349397271872,
0.024971546605229378,
-0.12791797518730164,
0.01666608452796936,
-0.028567355126142502,
-0.05024510994553566,
-0.008568493649363518,
0.03663986921310425,
-0.04101195186376572,
-0.024615682661533356,
0.1537131369113922,
0.031621143221855164,
0.07719611376523972,
0.139799565076828,
-0.16767261922359467,
-0.030868353322148323,
0.1666289120912552,
-0.020627079531550407,
-0.052410900592803955,
-0.0010416270233690739,
-0.00793477613478899,
-0.040246542543172836,
0.05950959771871567,
-0.04928615689277649,
0.15257154405117035,
-0.045689404010772705,
0.09750107675790787,
-0.06634771823883057,
0.008971001952886581,
0.10050637274980545,
-0.07364208996295929,
-0.0687524750828743,
0.05113353207707405,
0.09002334624528885,
-0.0780760794878006,
0.12884357571601868,
0.08856634795665741,
-0.046648941934108734,
0.1518881618976593,
0.022418025881052017,
-0.047069571912288666,
0.015500708483159542,
-0.014687726274132729,
-0.006375941447913647,
0.08395414799451828,
-0.15453827381134033,
-0.10884516686201096,
0.08001459389925003,
-0.021840233355760574,
0.004137407522648573,
-0.07136379927396774,
0.045833881944417953,
0.03964921087026596,
0.06145139038562775,
-0.04006030410528183,
0.05069315806031227,
-0.09238914400339127,
0.05038737133145332,
-0.0012148551177233458,
-0.059586502611637115,
0.056384216994047165,
0.01711711473762989,
-0.10969927906990051,
0.08654975146055222,
-0.039489589631557465,
-0.18793629109859467,
-0.16130733489990234,
-0.08768239617347717,
-0.09973912686109543,
0.011202231049537659,
0.026001375168561935,
0.02253137342631817,
-0.04524768888950348,
-0.10273581743240356,
-0.018083758652210236,
-0.01528745237737894,
-0.039515119045972824,
0.1184697151184082,
-0.0665447935461998,
0.02908731997013092,
-0.08575659990310669,
-0.021661993116140366,
-0.062288474291563034,
-0.060686592012643814,
0.12171722203493118,
-0.050198376178741455,
0.1480373740196228,
0.08438349515199661,
-0.007591220550239086,
0.03548853471875191,
-0.01244202721863985,
0.16115473210811615,
0.015405166894197464,
0.051113929599523544,
0.24422499537467957,
0.04135119169950485,
0.05350949242711067,
0.0719006210565567,
-0.04419289529323578,
-0.06489075720310211,
0.00881273951381445,
-0.0027238279581069946,
-0.15063554048538208,
-0.030712898820638657,
-0.05160171538591385,
-0.04270258918404579,
0.12454022467136383,
0.07033301144838333,
0.035171378403902054,
0.02402433007955551,
0.12974104285240173,
0.030749762430787086,
0.06385651230812073,
-0.033476751297712326,
0.07963589578866959,
-0.07137037813663483,
-0.029455749318003654,
0.13126996159553528,
-0.06722907721996307,
0.036823567003011703,
0.11427585035562515,
0.02056983672082424,
0.18663956224918365,
-0.05635334923863411,
-0.09700888395309448,
0.06308621168136597,
0.05376653000712395,
0.04594312235713005,
0.22590743005275726,
-0.13449910283088684,
0.010194824077188969,
-0.05382242426276207,
-0.0391421914100647,
-0.10455655306577682,
0.10225099325180054,
-0.06609849631786346,
-0.022500617429614067,
-0.05062786862254143,
0.06950349360704422,
0.00458701653406024,
0.10764676332473755,
0.03139607608318329,
-0.2857297360897064,
-0.06377808749675751,
0.006228753365576267,
0.10666266828775406,
-0.15543939173221588,
-0.014499593526124954,
0.04378259927034378,
-0.06391661614179611,
0.02927035093307495,
-0.049190454185009,
0.05756184831261635,
-0.08643754571676254,
0.044013358652591705,
0.03386732190847397,
0.12121736258268356,
0.07958542555570602,
0.07899221032857895,
-0.16087308526039124,
0.11091575026512146,
-0.028690272942185402,
0.03807765990495682,
-0.07689058035612106,
0.017802489921450615,
0.006313720252364874,
0.06899479776620865,
0.21337281167507172,
0.03697189688682556,
0.02302822656929493,
-0.15473543107509613,
-0.09050840884447098,
0.007021436467766762,
0.06972699612379074,
-0.0044014835730195045,
0.026474086567759514,
0.01997441053390503,
0.011778872460126877,
-0.035488273948431015,
0.1586056500673294,
-0.022557778283953667,
-0.14558598399162292,
0.041649796068668365,
-0.0043795290403068066,
0.01777973398566246,
-6.089479143156495e-7,
-0.08119645714759827,
-0.11125173419713974,
0.10565552115440369,
0.011943520046770573,
-0.038619011640548706,
-0.11894449591636658,
-0.0762740895152092,
0.011168931610882282,
-0.05076213181018829,
0.05914008244872093,
-0.08131624758243561,
0.034415438771247864,
-0.08562171459197998,
-0.1578100323677063,
0.057541634887456894,
-0.11747116595506668,
-0.025242241099476814,
-0.029055412858724594,
0.03643135726451874,
0.024560997262597084,
0.004676962271332741,
-0.013703495264053345,
0.04427090287208557,
-0.11345196515321732,
-0.0819496139883995,
0.10790576040744781,
0.12855711579322815,
0.0466177873313427,
-0.009529991075396538,
-0.06462900340557098,
-0.07091943919658661,
0.07589346170425415,
-0.0013481304049491882,
0.05608140304684639,
0.18792715668678284,
-0.011152698658406734,
0.14264602959156036,
0.21500609815120697,
-0.08232922852039337,
-0.3230557441711426,
-0.08638902008533478,
0.00047629105392843485,
0.07335416972637177,
-0.04289378225803375,
-0.1334225982427597,
0.06153321638703346,
0.05202388018369675,
-0.005079102702438831,
0.022894330322742462,
-0.03428427875041962,
-0.09555622935295105,
0.08419390022754669,
0.12941159307956696,
0.3104721009731293,
-0.06554923206567764,
0.023823169991374016,
-0.04008331522345543,
-0.2173803746700287,
0.06538143008947372,
-0.09822820872068405,
0.0790858268737793,
-0.04045258089900017,
0.019644657149910927,
-0.01169548463076353,
-0.07647506147623062,
0.0823584720492363,
-0.01314808800816536,
0.06409099698066711,
-0.09331745654344559,
0.020275460556149483,
0.08313708752393723,
-0.03898998722434044,
0.20166905224323273,
-0.015711065381765366,
0.006678753066807985,
-0.0944494903087616,
-0.07126568257808685,
-0.09997643530368805,
-0.001894815475679934,
0.011601651087403297,
-0.08399137854576111,
-0.006613823119550943,
0.06210837513208389,
-0.007780825719237328,
-0.025046488270163536,
-0.06422112137079239,
-0.058808714151382446,
0.004924063105136156,
0.1934966892004013,
0.08678824454545975,
-0.13197575509548187,
-0.11574116349220276,
-0.07742753624916077,
-0.0709669291973114,
0.0509500727057457,
-0.10431549698114395,
0.008024984039366245,
0.08119788765907288,
-0.0333312526345253,
0.06933857500553131,
0.04479213431477547,
-0.08586057275533676,
0.035524897277355194,
0.047493692487478256,
-0.18009305000305176,
0.01186317391693592,
0.011445225216448307,
-0.04372682794928551,
0.02190210483968258,
0.07814236730337143,
0.09353645890951157,
-0.07158210873603821,
0.022498056292533875,
-0.059212785214185715,
0.0337798073887825,
-0.01967748813331127,
0.0010478445328772068,
0.03159463405609131,
0.0009323461563326418,
-0.09756352752447128,
0.07155558466911316,
0.01259748637676239,
-0.06336523592472076,
0.08678395301103592,
0.012118595652282238,
-0.09146806597709656,
-0.12026294320821762,
-0.04844176769256592,
0.2003079652786255,
-0.04452407732605934,
-0.1184074878692627,
-0.03233272209763527,
-0.10584756731987,
0.005064333323389292,
0.03672359883785248,
0.07016290724277496,
0.018166158348321915,
-0.05772534757852554,
-0.018249452114105225,
-0.1654597669839859,
0.0330725833773613,
-0.01604592055082321,
0.030771251767873764,
-0.08547162264585495,
-0.013858720660209656,
0.06623271107673645,
0.12741602957248688,
-0.09245501458644867,
-0.05766380578279495,
-0.06404474377632141,
-0.009261843748390675,
0.02038293518126011,
-0.02401021122932434,
-0.08109976351261139,
-0.046008747071027756,
0.020993856713175774,
0.007337236776947975,
-0.017660783603787422,
0.059788789600133896,
-0.0605187714099884,
0.01293034665286541,
0.0033495351672172546,
0.01961353048682213,
-0.06813416630029678,
-0.012408491224050522,
-0.00832201074808836,
-0.013341711834073067,
0.0784250944852829,
0.03848282992839813,
-0.0419967882335186,
-0.053568534553050995,
-0.14850939810276031,
0.0018648456316441298,
0.047757942229509354,
-0.0030712548177689314,
0.017126521095633507,
-0.09193973988294601,
-0.01830804906785488,
0.06714383512735367,
-0.07621336728334427,
-0.00884534977376461,
0.15428055822849274,
-0.02385316602885723,
0.09078598022460938,
0.04675484448671341,
-0.034780800342559814,
-0.06424803286790848,
0.0565972663462162,
0.06462474912405014,
0.08775997906923294,
0.03871041163802147,
-0.03136047348380089,
0.023115670308470726,
-0.12277372926473618,
-0.014971503987908363,
0.06566306203603745,
-0.07424402981996536,
-0.006944669876247644,
-0.11699976027011871,
0.02131824754178524,
0.0426114946603775,
0.17925004661083221,
0.05621727183461189,
-0.050062842667102814,
-0.041755907237529755,
0.06497149169445038,
-0.04790734872221947,
-0.019637230783700943,
0.007289097644388676,
-0.018414024263620377,
0.03794678673148155,
0.008564700372517109,
0.11680983006954193,
0.0419531986117363,
0.12641914188861847,
0.08306971192359924,
-0.008894361555576324,
0.11501138657331467,
0.12211469560861588,
0.0368611104786396,
-0.010136792436242104,
-0.05544755607843399,
-0.0153639642521739,
-0.05881991237401962,
0.07312886416912079,
-0.0567568875849247,
0.117234006524086,
0.05736827105283737,
-0.04266873002052307,
0.021505247801542282,
-0.01743362657725811,
-0.06334885954856873,
-0.059976037591695786,
-0.23481591045856476,
-0.04835617169737816,
-0.09056870639324188,
0.008716505952179432,
-0.0010682488791644573,
-0.024691084399819374,
0.07941999286413193,
0.008655461482703686,
-0.02493017539381981,
0.09926406294107437,
-0.14352266490459442,
0.03801576420664787,
0.0621185265481472,
-0.026458049193024635,
-0.049921564757823944,
-0.05061695724725723,
-0.04269031807780266,
0.0054386374540627,
0.019201001152396202,
0.08189035952091217,
0.02742679789662361,
0.09769906848669052,
0.0531046986579895,
0.009563456289470196,
-0.11726299673318863,
-0.04146542772650719,
0.031210169196128845,
0.020493850111961365,
0.03487570211291313,
-0.04066411033272743,
-0.0026764068752527237,
-0.02147289179265499,
-0.013459546491503716,
0.009994779713451862,
-0.0028545509558171034,
-0.0833052545785904,
0.19046564400196075,
-0.05057411640882492,
-0.0029495765920728445,
0.006008664611726999,
-0.057463519275188446,
-0.0571795292198658,
0.230965718626976,
0.19408085942268372,
-0.053157057613134384,
-0.022625261917710304,
0.017514938488602638,
0.007114896550774574,
-0.013143951073288918,
0.10833366215229034,
0.08355515450239182,
0.18573865294456482,
-0.066129170358181,
0.10316388309001923,
-0.0580560676753521,
-0.04936778172850609,
-0.005052926950156689,
0.06563396006822586,
0.006319581530988216,
-0.010078375227749348,
-0.05257366970181465,
-0.07743512094020844,
-0.0445365309715271,
-0.040286026895046234,
0.19398586452007294,
-0.0748392641544342,
-0.039159610867500305,
0.0007113516330718994,
0.10418058186769485,
0.04876227676868439,
0.053017888218164444,
-0.044698864221572876,
0.04771781340241432,
0.11812872439622879,
-0.013173293322324753,
-0.1304437816143036,
-0.043280843645334244,
0.007238869089633226,
-0.11678583174943924,
0.10208992660045624,
-0.015963295474648476,
0.06139414757490158,
0.0264405757188797,
-0.0034127249382436275,
-0.1201796755194664,
0.0760744959115982,
-0.07055459171533585,
-0.043626926839351654,
-0.02112451381981373,
0.15777966380119324,
-0.03624517843127251,
-0.002765194745734334,
-0.027654802426695824,
-0.04222942143678665,
-0.014399949461221695,
-0.019824689254164696,
-0.06201597675681114,
0.00484564108774066,
0.056953415274620056,
-0.09738782793283463,
0.15396131575107574,
0.05386993661522865,
0.04430083557963371,
-0.07777542620897293,
-0.04184403270483017,
0.10201752930879593,
-0.0218963660299778,
-0.049070458859205246,
0.007101237308233976,
-0.09725222736597061,
0.018457533791661263,
-0.06484569609165192,
0.029574662446975708,
-0.2542373538017273,
-0.07757006585597992,
-0.11720532178878784,
-0.031173745170235634,
-0.07050424814224243,
-0.010954434052109718,
0.08391619473695755,
0.02923850156366825,
-0.06713039427995682,
0.06092069298028946,
0.03169450908899307,
0.06845782697200775,
-0.04544894024729729,
-0.12494069337844849
] |
null | null |
transformers
|
# Perceiver IO for optical flow
Perceiver IO model trained on [AutoFlow](https://autoflow-google.github.io/). It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
Optical flow is a decades-old open problem in computer vision. Given two images of the same scene (e.g. two consecutive frames of a video), the task is to estimate the 2D displacement for each pixel in the first image. This has many broader applications, such as navigation and visual odometry in robots, estimation of 3D geometry, and even aiding the transfer of more complex, learned inference, such as 3D human pose estimation, from synthetic to real images.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only to perform cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which make it possible to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For optical flow, the output is a tensor containing the predicted flow of shape (batch_size, height, width, 2).
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model on raw pixel values, by concatenating a pair of images and extracting a 3x3 patch around each pixel.
The model obtains state-of-the-art results on important optical flow benchmarks, including [Sintel](http://sintel.is.tue.mpg.de/) and [KITTI](http://www.cvlibs.net/datasets/kitti/eval_scene_flow.php?benchmark=flow).
## Intended uses & limitations
You can use the raw model for predicting optical flow between a pair of images. See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for other versions on a task that may interest you.
### How to use
We refer to the [tutorial notebook](https://github.com/NielsRogge/Transformers-Tutorials/blob/master/Perceiver/Perceiver_for_Optical_Flow.ipynb) regarding using the Perceiver for optical flow.
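For a quick start, here is a minimal sketch of a forward pass with the 🤗 Transformers implementation, following the documented `PerceiverForOpticalFlow` usage; a random tensor stands in for a real preprocessed frame pair:

```python
import torch
from transformers import PerceiverForOpticalFlow

model = PerceiverForOpticalFlow.from_pretrained("deepmind/optical-flow-perceiver")

# A 3x3 patch around each pixel of an RGB frame gives 3 * 3 * 3 = 27 values per pixel;
# the two preprocessed frames are stacked, at the training resolution of 368x496.
patches = torch.randn(1, 2, 27, 368, 496)  # (batch_size, num_frames, channels, height, width)

outputs = model(inputs=patches)
predicted_flow = outputs.logits  # (batch_size, height, width, 2)
print(predicted_flow.shape)
```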
## Training data
This model was trained on [AutoFlow](https://autoflow-google.github.io/), a synthetic dataset consisting of 400,000 annotated image pairs.
## Training procedure
### Preprocessing
Frames are resized to a resolution of 368x496. The authors concatenate the frames along the channel dimension and extract a 3x3 patch around each pixel (leading to 3x3x3x2 = 54 values for each pixel).
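A hypothetical sketch of this per-pixel patch extraction (the official preprocessing lives in the repository linked above; the shapes here simply follow the description):

```python
import torch
import torch.nn.functional as F

height, width = 368, 496
frame1 = torch.randn(1, 3, height, width)  # two consecutive RGB frames, already resized
frame2 = torch.randn(1, 3, height, width)

def per_pixel_patches(frame):
    # Unfold a 3x3 neighborhood around every pixel: (1, 3*3*3, H*W) -> (1, 27, H, W).
    patches = F.unfold(frame, kernel_size=3, padding=1)
    return patches.view(1, 27, height, width)

# Concatenating both frames along the channel dimension gives 2 * 27 = 54 values per pixel.
pixel_features = torch.cat([per_pixel_patches(frame1), per_pixel_patches(frame2)], dim=1)
print(pixel_features.shape)  # torch.Size([1, 54, 368, 496])
```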
### Pretraining
Hyperparameter details can be found in Appendix E of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
The model achieves an average end-point error (EPE) of 1.81 on Sintel.clean, 2.42 on Sintel.final, and 4.98 on KITTI. For evaluation results, we refer to table 4 of the [paper](https://arxiv.org/abs/2107.14795).
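For reference, end-point error is the standard optical flow metric (this definition comes from the optical flow literature, not from the card itself): the Euclidean distance between the predicted and ground-truth flow vectors, averaged over all pixels,

```latex
\mathrm{EPE} = \frac{1}{N} \sum_{i=1}^{N} \left\lVert \hat{f}_i - f_i \right\rVert_2
```

where $\hat{f}_i$ and $f_i$ are the predicted and ground-truth 2D flow vectors at pixel $i$, and $N$ is the number of pixels.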
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"license": "apache-2.0", "datasets": ["autoflow"]}
| null |
deepmind/optical-flow-perceiver
|
[
"transformers",
"pytorch",
"perceiver",
"dataset:autoflow",
"arxiv:2107.14795",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.14795"
] |
[] |
TAGS
#transformers #pytorch #perceiver #dataset-autoflow #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #has_space #region-us
|
# Perceiver IO for optical flow
Perceiver IO model trained on AutoFlow. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
Optical flow is a decades-old open problem in computer vision. Given two images of the same scene (e.g. two consecutive frames of a video), the task is to estimate the 2D displacement for each pixel in the first image. This has many broader applications, such as navigation and visual odometry in robots, estimation of 3D geometry, and even to aid transfer of more complex, learned inference such as 3D human pose estimation from synthetic to real images.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For optical flow, the output is a tensor containing the predicted flow of shape (batch_size, height, width, 2).
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model on raw pixel values, by concatenating a pair of images and extracting a 3x3 patch around each pixel.
The model obtains state-of-the-art results on important optical flow benchmarks, including Sintel and KITTI.
## Intended uses & limitations
You can use the raw model for predicting optical flow between a pair of images. See the model hub to look for other versions on a task that may interest you.
### How to use
We refer to the tutorial notebook regarding using the Perceiver for optical flow.
## Training data
This model was trained on AutoFlow, a synthetic dataset consisting of 400,000 annotated image pairs.
## Training procedure
### Preprocessing
Frames are resized to a resolution of 368x496. The authors concatenate the frames along the channel dimension and extract a 3x3 patch around each pixel (leading to 3x3x3x2 = 54 values for each pixel).
### Pretraining
Hyperparameter details can be found in Appendix E of the paper.
## Evaluation results
The model achieves an average end-point error (EPE) of 1.81 on URL, 2.42 on URL and 4.98 on KITTI. For evaluation results, we refer to table 4 of the paper.
### BibTeX entry and citation info
|
[
"# Perceiver IO for optical flow\n\nPerceiver IO model trained on AutoFlow. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nOptical flow is a decades-old open problem in computer vision. Given two images of the same scene (e.g. two consecutive frames of a video), the task is to estimate the 2D displacement for each pixel in the first image. This has many broader applications, such as navigation and visual odometry in robots, estimation of 3D geometry, and even to aid transfer of more complex, learned inference such as 3D human pose estimation from synthetic to real images.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For optical flow, the output is a tensor containing the predicted flow of shape (batch_size, height, width, 2).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model on raw pixel values, by concatenating a pair of images and extracting a 3x3 patch around each pixel. \n\nThe model obtains state-of-the-art results on important optical flow benchmarks, including Sintel and KITTI.",
"## Intended uses & limitations\n\nYou can use the raw model for predicting optical flow between a pair of images. See the model hub to look for other versions on a task that may interest you.",
"### How to use\n\nWe refer to the tutorial notebook regarding using the Perceiver for optical flow.",
"## Training data\n\nThis model was trained on AutoFlow, a synthetic dataset consisting of 400,000 annotated image pairs.",
"## Training procedure",
"### Preprocessing\n\nFrames are resized to a resolution of 368x496. The authors concatenate the frames along the channel dimension and extract a 3x3 patch around each pixel (leading to 3x3x3x2 = 54 values for each pixel).",
"### Pretraining\n\nHyperparameter details can be found in Appendix E of the paper.",
"## Evaluation results\n\nThe model achieves a average end-point error (EPE) of 1.81 on URL, 2.42 on URL and 4.98 on KITTI. For evaluation results, we refer to table 4 of the paper.",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #dataset-autoflow #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n",
"# Perceiver IO for optical flow\n\nPerceiver IO model trained on AutoFlow. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nOptical flow is a decades-old open problem in computer vision. Given two images of the same scene (e.g. two consecutive frames of a video), the task is to estimate the 2D displacement for each pixel in the first image. This has many broader applications, such as navigation and visual odometry in robots, estimation of 3D geometry, and even to aid transfer of more complex, learned inference such as 3D human pose estimation from synthetic to real images.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For optical flow, the output is a tensor containing the predicted flow of shape (batch_size, height, width, 2).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model on raw pixel values, by concatenating a pair of images and extracting a 3x3 patch around each pixel. \n\nThe model obtains state-of-the-art results on important optical flow benchmarks, including Sintel and KITTI.",
"## Intended uses & limitations\n\nYou can use the raw model for predicting optical flow between a pair of images. See the model hub to look for other versions on a task that may interest you.",
"### How to use\n\nWe refer to the tutorial notebook regarding using the Perceiver for optical flow.",
"## Training data\n\nThis model was trained on AutoFlow, a synthetic dataset consisting of 400,000 annotated image pairs.",
"## Training procedure",
"### Preprocessing\n\nFrames are resized to a resolution of 368x496. The authors concatenate the frames along the channel dimension and extract a 3x3 patch around each pixel (leading to 3x3x3x2 = 54 values for each pixel).",
"### Pretraining\n\nHyperparameter details can be found in Appendix E of the paper.",
"## Evaluation results\n\nThe model achieves a average end-point error (EPE) of 1.81 on URL, 2.42 on URL and 4.98 on KITTI. For evaluation results, we refer to table 4 of the paper.",
"### BibTeX entry and citation info"
] |
[
52,
220,
321,
45,
23,
32,
3,
62,
20,
48,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #dataset-autoflow #arxiv-2107.14795 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n# Perceiver IO for optical flow\n\nPerceiver IO model trained on AutoFlow. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nOptical flow is a decades-old open problem in computer vision. Given two images of the same scene (e.g. two consecutive frames of a video), the task is to estimate the 2D displacement for each pixel in the first image. This has many broader applications, such as navigation and visual odometry in robots, estimation of 3D geometry, and even to aid transfer of more complex, learned inference such as 3D human pose estimation from synthetic to real images.\n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.07540944963693619,
-0.04453696683049202,
0.0014336196472868323,
0.026074660941958427,
0.09089723229408264,
0.004992200993001461,
0.1433471441268921,
0.05784861743450165,
0.12235159426927567,
0.11821702867746353,
0.045655008405447006,
-0.061896856874227524,
0.07270623743534088,
0.13925811648368835,
0.03433864563703537,
-0.25168782472610474,
0.01676204800605774,
-0.05792657285928726,
0.04662815108895302,
0.03965313360095024,
0.039080798625946045,
-0.11637716740369797,
0.0972273200750351,
0.012786197476089,
-0.13518232107162476,
-0.03895493224263191,
-0.08814320713281631,
-0.0506388284265995,
0.12511415779590607,
0.033487480133771896,
0.049403708428144455,
-0.011718408204615116,
0.122653529047966,
-0.07446252554655075,
0.0047866650857031345,
0.019042354077100754,
0.02850637398660183,
0.03265851363539696,
0.06640629470348358,
0.005310736130923033,
0.12756483256816864,
-0.0469207689166069,
0.09402237087488174,
-0.001841337769292295,
-0.04510011523962021,
-0.15711930394172668,
-0.023417484015226364,
0.06529344618320465,
0.058682918548583984,
0.04350355267524719,
0.0009733528131619096,
0.0483727864921093,
0.0016491672722622752,
0.053037580102682114,
0.03247720003128052,
-0.1631447970867157,
-0.010198698379099369,
0.04080645367503166,
0.008601340465247631,
0.011093934066593647,
-0.05504375323653221,
-0.021571237593889236,
0.05804379656910896,
0.03176227584481239,
0.03983470797538757,
0.0020757666788995266,
0.10672564804553986,
-0.07777631282806396,
-0.10477880388498306,
-0.042541708797216415,
0.10820472240447998,
0.043198056519031525,
-0.08029518276453018,
-0.17625968158245087,
-0.04644380509853363,
0.09458324313163757,
-0.022018907591700554,
-0.06369522958993912,
-0.014436396770179272,
-0.03793436661362648,
0.02947062999010086,
-0.056768350303173065,
-0.09475921839475632,
-0.05856037884950638,
-0.11398857086896896,
0.03648744523525238,
0.025148503482341766,
0.11248825490474701,
-0.029773255810141563,
0.10971442610025406,
-0.10145974904298782,
-0.004458005074411631,
-0.008229747414588928,
-0.0864691510796547,
-0.07830234616994858,
0.04924318939447403,
-0.029200518503785133,
0.006088593974709511,
-0.10493212193250656,
0.0453791506588459,
0.1604510247707367,
0.0037992410361766815,
0.037174176424741745,
0.08660534769296646,
0.10354533046483994,
0.09014955163002014,
-0.08430241048336029,
-0.005364237818866968,
-0.015534810721874237,
0.0506419874727726,
0.01128108985722065,
-0.06425356864929199,
-0.04856213554739952,
0.04692002385854721,
0.013675128109753132,
-0.048709992319345474,
0.016648372635245323,
0.024582432582974434,
-0.01718866638839245,
-0.04419688135385513,
0.031605176627635956,
-0.08008990436792374,
-0.008771008811891079,
-0.015171246603131294,
-0.13577789068222046,
0.1020594909787178,
0.1521475911140442,
0.014507338404655457,
-0.04635338485240936,
-0.0442609041929245,
-0.03806895390152931,
-0.009278657846152782,
-0.047331228852272034,
-0.08490400016307831,
-0.005632451269775629,
-0.11137297004461288,
-0.010682797059416771,
-0.172962948679924,
-0.15521039068698883,
-0.022251253947615623,
0.060152288526296616,
-0.062313251197338104,
0.03474913164973259,
-0.049285564571619034,
0.07521812617778778,
-0.04779510572552681,
-0.00437151687219739,
-0.0722387433052063,
0.006799494847655296,
0.03348838537931442,
-0.08485671132802963,
0.1252877116203308,
-0.050100862979888916,
0.04393778368830681,
-0.08545446395874023,
0.03784314543008804,
-0.0470161959528923,
0.1322200745344162,
-0.010245688259601593,
-0.0013193581253290176,
-0.08740536123514175,
-0.06390835344791412,
-0.0007665241719223559,
0.054667387157678604,
0.015713157132267952,
0.11642009764909744,
-0.09599988907575607,
-0.04501181095838547,
0.011115008033812046,
-0.08130235224962234,
-0.06979267299175262,
0.11015553772449493,
-0.03037021867930889,
0.1635502278804779,
0.004938808269798756,
0.10906026512384415,
0.0180811807513237,
-0.1383150964975357,
0.036155764013528824,
-0.00251004914753139,
-0.04700955003499985,
0.06259189546108246,
0.02927382104098797,
0.08805263787508011,
0.030315399169921875,
0.03373981639742851,
-0.03486631438136101,
0.041122179478406906,
-0.05931493639945984,
-0.07665715366601944,
0.009738896042108536,
-0.07428645342588425,
-0.0647888034582138,
-0.0033626959193497896,
0.08100719004869461,
0.03851001709699631,
-0.01623152568936348,
-0.06065158545970917,
0.08326970785856247,
-0.12083863466978073,
0.010218595154583454,
-0.023239145055413246,
0.09758950024843216,
-0.11066888272762299,
-0.0032619410194456577,
-0.15713249146938324,
-0.05427972599864006,
0.04732084646821022,
-0.014265565201640129,
0.018224794417619705,
0.1268213838338852,
0.0704522579908371,
0.12831103801727295,
0.052344344556331635,
-0.05413730442523956,
0.02466546557843685,
-0.040424589067697525,
-0.030735908076167107,
-0.05536000058054924,
-0.0814288929104805,
-0.10184742510318756,
0.11912370473146439,
-0.05009429156780243,
0.011833866126835346,
-0.12330061942338943,
0.04240599647164345,
0.02559916488826275,
-0.03736153990030289,
-0.04317427799105644,
-0.01075571309775114,
-0.013314824551343918,
-0.09001027792692184,
0.07955795526504517,
0.0022627797443419695,
-0.04764031991362572,
0.06846155971288681,
-0.17393961548805237,
-0.0762191191315651,
0.11567527055740356,
-0.024459702894091606,
-0.08150549978017807,
-0.07726506888866425,
-0.0758274495601654,
-0.02236097678542137,
-0.009121335111558437,
-0.01968350261449814,
0.1715836226940155,
0.0014886847930029035,
0.08334391564130783,
-0.03295251354575157,
0.027588892728090286,
0.1061481162905693,
-0.0394291989505291,
-0.10100171715021133,
0.03282523527741432,
0.009570825845003128,
0.012072260491549969,
0.06109253689646721,
0.014417254365980625,
0.013016914948821068,
0.09478972852230072,
0.047004926949739456,
-0.08694586902856827,
-0.0013676652451977134,
0.03832576796412468,
-0.002710496075451374,
0.12541265785694122,
-0.07974275946617126,
-0.09973204135894775,
0.0790049210190773,
-0.022686941549181938,
-0.0089372294023633,
-0.14000143110752106,
0.027294276282191277,
0.03536049649119377,
0.046940311789512634,
0.05822146311402321,
-0.001735981204546988,
-0.07096471637487411,
0.08651192486286163,
0.021724950522184372,
-0.09084119647741318,
0.03231954202055931,
-0.029786832630634308,
-0.09523020684719086,
0.11663087457418442,
0.010119013488292694,
-0.24145159125328064,
-0.07442445307970047,
0.00966613832861185,
-0.05079224333167076,
0.03278682008385658,
-0.021494397893548012,
0.01906609535217285,
-0.0878436490893364,
-0.11585190147161484,
0.05816908925771713,
0.037163227796554565,
-0.02002343162894249,
0.07611914724111557,
-0.05284499004483223,
0.017018739134073257,
-0.036395154893398285,
-0.038364145904779434,
-0.07634666562080383,
-0.0844159945845604,
0.08401894569396973,
-0.07837764918804169,
0.10663314908742905,
0.09474959969520569,
-0.03752791881561279,
0.013182681053876877,
-0.03490632772445679,
0.15169478952884674,
-0.02527519315481186,
0.04586450010538101,
0.2264564037322998,
0.019124029204249382,
0.0495028980076313,
0.045248813927173615,
-0.02812768518924713,
-0.061657000333070755,
0.06776336580514908,
0.020380746573209763,
-0.09366673976182938,
-0.008128734305500984,
-0.08989908546209335,
-0.029775315895676613,
0.10728324949741364,
0.144632950425148,
0.0083067761734128,
-0.001678137923590839,
0.0783340111374855,
-0.027106227353215218,
0.12102007120847702,
0.05089965835213661,
0.07617763429880142,
-0.06410998106002808,
-0.0725937932729721,
0.08419125527143478,
-0.022641457617282867,
-0.03981277719140053,
0.08129248768091202,
0.06343904882669449,
0.22936195135116577,
-0.06671471893787384,
-0.0851537436246872,
-0.007042452227324247,
0.12298355996608734,
0.016947630792856216,
0.11013945192098618,
-0.04963739216327667,
0.04208698868751526,
-0.0588383749127388,
-0.025020236149430275,
-0.08208880573511124,
0.15738838911056519,
0.08232211321592331,
-0.02152862586081028,
-0.09967406094074249,
0.06697734445333481,
0.025339756160974503,
0.20595158636569977,
0.03525568172335625,
-0.282751202583313,
-0.06147749349474907,
0.027170812711119652,
0.04880853369832039,
-0.12920884788036346,
0.031445808708667755,
0.10439129918813705,
-0.11024286597967148,
-0.027552498504519463,
-0.07395245879888535,
0.00985046662390232,
-0.1093866378068924,
-0.0011408432619646192,
-0.03328704461455345,
0.031874753534793854,
0.0399983786046505,
0.06670250743627548,
-0.1559116542339325,
0.04471852630376816,
-0.027075165882706642,
0.034820448607206345,
-0.10672030597925186,
-0.02111947163939476,
0.04689885303378105,
-0.046771250665187836,
0.23665392398834229,
-0.005103252828121185,
0.06030746176838875,
-0.14374184608459473,
-0.10234209895133972,
0.04275776445865631,
0.06403753906488419,
-0.059122879058122635,
0.03481185436248779,
-0.0034352566581219435,
0.03019564040005207,
-0.0036977953277528286,
0.06891142576932907,
0.005933868233114481,
-0.1576133817434311,
0.003292195266112685,
-0.0873161107301712,
0.003705323673784733,
-0.03680608421564102,
-0.07912040501832962,
-0.08190961927175522,
0.07809963077306747,
-0.01924270950257778,
-0.007249870337545872,
-0.12755973637104034,
0.0691860169172287,
0.043866466730833054,
-0.03489042818546295,
0.07767727971076965,
-0.020670216530561447,
0.08277558535337448,
-0.0931227058172226,
-0.1661781221628189,
0.06044063717126846,
-0.09270825982093811,
-0.10602860152721405,
-0.03572491928935051,
-0.020483359694480896,
0.08676417171955109,
0.01699734292924404,
-0.02884289249777794,
0.00040204342803917825,
-0.08398039638996124,
-0.03925279155373573,
0.09358423203229904,
0.2087569236755371,
0.03809782862663269,
0.08450990915298462,
-0.049839675426483154,
-0.1166011169552803,
-0.039349909871816635,
0.0402078852057457,
0.08789631724357605,
0.13549040257930756,
-0.03847556561231613,
0.08392027020454407,
0.16505135595798492,
-0.036919452250003815,
-0.2916279733181,
-0.009267347864806652,
0.05941306799650192,
0.09303810447454453,
-0.009404062293469906,
-0.07575313746929169,
0.02282818593084812,
0.03136187791824341,
-0.013066473416984081,
-0.007080424576997757,
-0.1785472184419632,
-0.09977712482213974,
0.13275589048862457,
0.15520671010017395,
0.24073435366153717,
-0.07099012285470963,
0.06698638200759888,
-0.0032971862237900496,
-0.06020984798669815,
0.10544514656066895,
0.03322456032037735,
0.0743115022778511,
-0.01790785603225231,
0.08190163224935532,
0.04057953506708145,
-0.04785001277923584,
0.13073404133319855,
-0.050364118069410324,
0.11006555706262589,
-0.0790833979845047,
-0.0018323920667171478,
0.008435835130512714,
-0.05928647145628929,
0.18392504751682281,
0.07265091687440872,
0.03025667928159237,
-0.06890667974948883,
-0.08215197175741196,
-0.08008985966444016,
0.029455814510583878,
0.027497269213199615,
-0.0974612906575203,
-0.08328454196453094,
0.09864473342895508,
0.07348185777664185,
0.02762364223599434,
0.026987072080373764,
0.0010028628166764975,
-0.05107899010181427,
0.12638883292675018,
0.0052456362172961235,
0.054139427840709686,
-0.09857340902090073,
-0.06588831543922424,
-0.004741090815514326,
0.0811101645231247,
-0.18020012974739075,
0.01012515276670456,
0.148892343044281,
0.006117300130426884,
0.1288587898015976,
0.0657624751329422,
-0.12135772407054901,
0.09145289659500122,
-0.003479692619293928,
-0.11266662925481796,
-0.10879629105329514,
0.002105463994666934,
0.049414634704589844,
0.012469306588172913,
0.03650803491473198,
0.06732262670993805,
-0.13312365114688873,
0.027949446812272072,
-0.0504094734787941,
0.0704578161239624,
-0.027943303808569908,
0.050635021179914474,
0.015959350392222404,
-0.014151383191347122,
-0.06469187140464783,
0.14975731074810028,
0.028064079582691193,
-0.11031584441661835,
0.07430394738912582,
-0.07379495352506638,
-0.06009208410978317,
-0.030093248933553696,
-0.021667758002877235,
0.10824724286794662,
-0.06606583297252655,
-0.06377992033958435,
-0.023891562595963478,
-0.156410813331604,
0.015166603960096836,
0.0019483723444864154,
0.019987769424915314,
0.007061040960252285,
-0.09547137469053268,
0.02811279520392418,
-0.09929961711168289,
0.05974912643432617,
0.028925301507115364,
-0.012489542365074158,
-0.07094454765319824,
0.0367160364985466,
0.036147162318229675,
0.10958046466112137,
-0.07623220980167389,
-0.12255369871854782,
-0.0672500804066658,
0.020660080015659332,
-0.0888250395655632,
0.002715226961299777,
-0.08772581070661545,
-0.009551354683935642,
0.0029797409661114216,
0.01896735280752182,
0.04403950646519661,
0.04183443635702133,
-0.05605259910225868,
-0.004430064465850592,
-0.033268071711063385,
-0.01748112589120865,
-0.04535486549139023,
-0.08996754884719849,
-0.01565580815076828,
-0.07550293952226639,
0.08312530815601349,
0.05427888408303261,
-0.030172748491168022,
-0.06161300092935562,
-0.07037941366434097,
-0.019490910694003105,
0.077584408223629,
0.051756151020526886,
0.019900010898709297,
-0.03225003182888031,
-0.04297919571399689,
0.010250586085021496,
-0.11372246593236923,
-0.04339474439620972,
0.14064712822437286,
-0.021497778594493866,
0.03986988589167595,
-0.034856703132390976,
-0.009165812283754349,
-0.11040425300598145,
0.0715172216296196,
0.012325464747846127,
0.12613099813461304,
-0.018915381282567978,
-0.012458516284823418,
0.058164484798908234,
-0.12235937267541885,
-0.036518219858407974,
0.035710353404283524,
-0.005131886340677738,
0.07008736580610275,
-0.0892646387219429,
0.05582631751894951,
0.019898056983947754,
0.1354326456785202,
0.0650288537144661,
0.017960507422685623,
-0.0723733901977539,
0.01602807268500328,
0.008538373745977879,
0.0321536511182785,
-0.0009414314990863204,
-0.023612581193447113,
0.03722041845321655,
0.004981998819857836,
0.09977015107870102,
0.049868032336235046,
0.21083621680736542,
0.0007292857044376433,
-0.05262691527605057,
0.05597831308841705,
0.108108751475811,
0.06709285080432892,
-0.09009916335344315,
0.001406218041665852,
-0.012544962577521801,
-0.046887174248695374,
0.06615319848060608,
-0.04499216750264168,
0.019163012504577637,
0.06951434910297394,
-0.13491162657737732,
0.045163825154304504,
-0.011228559538722038,
-0.07919266819953918,
-0.0039032918866723776,
-0.1948275864124298,
0.011441812850534916,
-0.06966366618871689,
0.018737375736236572,
-0.015334706753492355,
-0.03801441565155983,
0.08689946681261063,
0.04891175776720047,
-0.011875132098793983,
0.0873730257153511,
-0.20862846076488495,
-0.022522984072566032,
0.10675778239965439,
0.03660419210791588,
0.007712396327406168,
-0.03684087097644806,
-0.06658158451318741,
0.04859273135662079,
0.031141910701990128,
0.043623536825180054,
0.022601086646318436,
0.07876322418451309,
0.09210716187953949,
0.045669566839933395,
-0.056256961077451706,
-0.02593575417995453,
-0.04202472046017647,
-0.022610118612647057,
0.09162168949842453,
0.017964737489819527,
-0.017786020413041115,
-0.011110877618193626,
0.07028853893280029,
-0.06802535802125931,
0.025451311841607094,
-0.14179272949695587,
0.1949175000190735,
-0.09281948208808899,
0.020896658301353455,
-0.00681389681994915,
-0.08760251849889755,
-0.0656621977686882,
0.2008839249610901,
0.11663341522216797,
-0.09812407195568085,
-0.04345688223838806,
0.007410327438265085,
-0.009767037816345692,
-0.06172192096710205,
0.0778379738330841,
0.08441606909036636,
0.2805961072444916,
-0.08882518112659454,
0.08789681643247604,
-0.011441856622695923,
-0.05457023158669472,
0.006746580824255943,
0.07136648893356323,
-0.012532197870314121,
0.035579364746809006,
-0.10178790241479874,
-0.04152832552790642,
-0.07573415338993073,
-0.18868234753608704,
0.27541494369506836,
-0.07839269191026688,
-0.03300132229924202,
0.0015037633711472154,
-0.005307055078446865,
0.016727613285183907,
0.08799955248832703,
-0.07336858659982681,
0.05731426551938057,
0.14941714704036713,
0.010377447120845318,
-0.09015757590532303,
0.004144009668380022,
-0.006978592369705439,
-0.04310884326696396,
0.09409269690513611,
-0.014772712253034115,
0.09570734202861786,
0.004749482963234186,
0.0336153618991375,
-0.03539717197418213,
0.02646629698574543,
-0.022149275988340378,
0.014012315310537815,
-0.027773912996053696,
0.14667683839797974,
-0.01852666214108467,
0.06485506147146225,
0.04620329663157463,
-0.056758396327495575,
0.0214872807264328,
0.018141765147447586,
-0.09243959188461304,
-0.05060012266039848,
-0.0015706997364759445,
-0.044589970260858536,
0.14122270047664642,
0.026872383430600166,
0.019895566627383232,
-0.022867253050208092,
-0.03600726276636124,
0.047365717589855194,
-0.011293938383460045,
0.04429253190755844,
0.04745872691273689,
-0.0934707447886467,
0.010218927636742592,
-0.14827129244804382,
0.013230657204985619,
-0.1754041463136673,
-0.1265907734632492,
-0.04584893956780434,
-0.07810700684785843,
0.037962961941957474,
0.03201252222061157,
0.060206178575754166,
-0.006368317175656557,
-0.08591271936893463,
-0.005570732522755861,
0.0441034696996212,
0.10147635638713837,
-0.010098623111844063,
-0.05775850638747215
] |
null | null |
transformers
|
# Perceiver IO for vision (convolutional processing)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to apply the self-attention mechanism to a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only to perform cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which make it possible to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
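To make the latent-bottleneck design described above concrete, here is a minimal PyTorch sketch (illustrative only, with made-up sizes; the real model stacks multiple blocks with more heads and learned preprocessing):

```python
import torch
from torch import nn

batch_size, num_inputs, d_model = 2, 4096, 64  # hypothetical sizes: 4096 flattened input elements
num_latents, num_labels = 256, 1000

latents = nn.Parameter(torch.randn(num_latents, d_model))  # learned latent array
decoder_query = nn.Parameter(torch.randn(1, d_model))      # one query -> one output vector

encode = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)   # inputs -> latents
process = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)  # latent self-attention
decode = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)   # latents -> outputs
to_logits = nn.Linear(d_model, num_labels)

inputs = torch.randn(batch_size, num_inputs, d_model)
lat = latents.unsqueeze(0).expand(batch_size, -1, -1)

lat, _ = encode(lat, inputs, inputs)  # cross-attention: cost O(num_inputs * num_latents)
lat, _ = process(lat, lat, lat)       # self-attention over latents only: O(num_latents**2)
query = decoder_query.unsqueeze(0).expand(batch_size, -1, -1)
out, _ = decode(query, lat, lat)      # decoder query reads the latents back out

logits = to_logits(out.squeeze(1))
print(logits.shape)  # torch.Size([2, 1000])
```

Because self-attention touches only the latents, growing `num_inputs` changes the cost of the cross-attention steps linearly, not quadratically.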
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values rather than on patches, as is done in ViT. This particular model applies a simple 2D conv+maxpool preprocessing network to the pixel values before using them for cross-attention with the latents.
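For intuition, a rough sketch of this kind of conv+maxpool stem is shown below; the kernel sizes, strides, and channel counts are illustrative assumptions, not the exact values of this checkpoint:

```python
import torch
from torch import nn

# hypothetical conv+maxpool stem: shrinks the 224x224 pixel grid before flattening it
# into a sequence that cross-attends with the latents
preprocess = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),  # 224x224 -> 112x112
    nn.MaxPool2d(kernel_size=3, stride=2, padding=1),      # 112x112 -> 56x56
)

pixels = torch.randn(1, 3, 224, 224)
features = preprocess(pixels)                 # (1, 64, 56, 56)
inputs = features.flatten(2).transpose(1, 2)  # (1, 3136, 64): sequence of cross-attention inputs
print(inputs.shape)
```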
Through pre-training, the model learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images, for instance, you can train a standard classifier by replacing the classification decoder.
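In practice with the transformers library, replacing the classification decoder can be as simple as re-initializing the head for a new label count (the 10-class setup below is hypothetical; `num_labels` and `ignore_mismatched_sizes` are standard `from_pretrained` arguments):

```python
from transformers import PerceiverForImageClassificationConvProcessing

# load the pre-trained backbone, but re-initialize the classification decoder
# for a hypothetical 10-class downstream dataset
model = PerceiverForImageClassificationConvProcessing.from_pretrained(
    "deepmind/vision-perceiver-conv",
    num_labels=10,
    ignore_mismatched_sizes=True,  # discard the 1000-class ImageNet head weights
)
# ...then fine-tune on the labeled dataset as usual
```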
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import PerceiverFeatureExtractor, PerceiverForImageClassificationConvProcessing
import requests
from PIL import Image
feature_extractor = PerceiverFeatureExtractor.from_pretrained("deepmind/vision-perceiver-conv")
model = PerceiverForImageClassificationConvProcessing.from_pretrained("deepmind/vision-perceiver-conv")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# prepare input
inputs = feature_extractor(image, return_tensors="pt").pixel_values
# forward pass
outputs = model(inputs)
logits = outputs.logits
print("Predicted class:", model.config.id2label[logits.argmax(-1).item()])
# should print: Predicted class: tabby, tabby cat
```
## Training data
This model was pretrained on [ImageNet](http://www.image-net.org/), a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
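An approximate torchvision equivalent of this eval-time pipeline is sketched below (the exact resize/crop order and normalization constants are assumptions; the feature extractor shown in the usage example applies the values configured for this checkpoint):

```python
from torchvision import transforms

eval_transform = transforms.Compose([
    transforms.Resize(256),      # resize the shorter side to 256
    transforms.CenterCrop(224),  # center crop to 224x224
    transforms.ToTensor(),       # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),  # per-channel normalization
])
```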
### Pretraining
Hyperparameter details can be found in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
This model achieves a top-1 accuracy of 82.1% on ImageNet-1k.
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"license": "apache-2.0", "datasets": ["imagenet"]}
|
image-classification
|
deepmind/vision-perceiver-conv
|
[
"transformers",
"pytorch",
"perceiver",
"image-classification",
"dataset:imagenet",
"arxiv:2107.14795",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.14795"
] |
[] |
TAGS
#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# Perceiver IO for vision (convolutional processing)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model employs a simple 2D conv+maxpool preprocessing network on the pixel values, before using the inputs for cross-attention with the latents.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.
## Intended uses & limitations
You can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
## Training data
This model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.
### Pretraining
Hyperparameter details can be found in Appendix H of the paper.
## Evaluation results
This model is able to achieve a top-1 accuracy of 82.1 on ImageNet-1k.
### BibTeX entry and citation info
|
[
"# Perceiver IO for vision (convolutional processing)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model employs a simple 2D conv+maxpool preprocessing network on the pixel values, before using the inputs for cross-attention with the latents.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 82.1 on ImageNet-1k.",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# Perceiver IO for vision (convolutional processing)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model employs a simple 2D conv+maxpool preprocessing network on the pixel values, before using the inputs for cross-attention with the latents.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 82.1 on ImageNet-1k.",
"### BibTeX entry and citation info"
] |
[
65,
124,
384,
42,
17,
27,
3,
53,
20,
25,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# Perceiver IO for vision (convolutional processing)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.0962994247674942,
0.05063332989811897,
-0.0003458890423644334,
0.09329332411289215,
0.10633520781993866,
0.05700497329235077,
0.16075699031352997,
0.03291630372405052,
0.032281264662742615,
0.044721029698848724,
0.10041463375091553,
0.03686508908867836,
0.04260146617889404,
0.15258194506168365,
0.02504293993115425,
-0.1800212562084198,
0.061478693038225174,
0.10199747234582901,
0.06895741820335388,
0.12150002270936966,
0.0832938551902771,
-0.10364475846290588,
0.1282775104045868,
0.02288929373025894,
-0.24952583014965057,
-0.015158301219344139,
-0.06502890586853027,
-0.05802762508392334,
0.06804991513490677,
-0.023035017773509026,
0.055665139108896255,
0.003975899890065193,
0.0919654443860054,
0.035514600574970245,
0.013186974450945854,
0.08135847002267838,
-0.0052392068319022655,
0.0901188850402832,
0.047503240406513214,
0.010006614960730076,
0.07077119499444962,
0.010568970814347267,
0.007986154407262802,
-0.014142188243567944,
-0.04604151472449303,
-0.23192833364009857,
-0.007450715638697147,
0.24331629276275635,
-0.019504638388752937,
0.08376103639602661,
0.010278545320034027,
0.09348955005407333,
0.050069548189640045,
0.057904258370399475,
0.054398540407419205,
-0.15903353691101074,
-0.06506019830703735,
0.007135841529816389,
-0.09898587316274643,
0.029599850997328758,
-0.047034699469804764,
-0.019884860143065453,
0.07889097183942795,
0.04383998364210129,
0.03921288996934891,
-0.008730032481253147,
-0.004695155657827854,
-0.11023500561714172,
-0.08542293310165405,
-0.06669008731842041,
0.19166158139705658,
0.01512762438505888,
-0.07999496161937714,
-0.051361795514822006,
-0.09367024898529053,
-0.06921181827783585,
0.05002200976014137,
-0.02830427885055542,
0.047706056386232376,
-0.03320963680744171,
0.06897256523370743,
-0.05739755183458328,
-0.10701242834329605,
-0.08755765110254288,
-0.18228855729103088,
0.04818569868803024,
0.040106866508722305,
0.13394668698310852,
-0.0521005243062973,
0.07959380000829697,
-0.12623842060565948,
-0.07347168028354645,
-0.05622072145342827,
-0.09932219237089157,
0.06399373710155487,
0.07667825371026993,
-0.02159019745886326,
-0.0575813427567482,
-0.08018007129430771,
0.06856314092874527,
0.09067332744598389,
-0.00737403891980648,
0.043347906321287155,
0.13852137327194214,
0.08500582724809647,
0.09120696038007736,
-0.0758916363120079,
0.08666859567165375,
0.031101467087864876,
0.03609021380543709,
0.039489395916461945,
-0.04821988195180893,
-0.052587226033210754,
0.030224427580833435,
0.07647596299648285,
-0.11092153191566467,
-0.003751181298866868,
0.07841876894235611,
0.0017170376377180219,
-0.07566779106855392,
0.15226243436336517,
-0.06570262461900711,
-0.030431393533945084,
-0.028947170823812485,
-0.07053887844085693,
0.15539564192295074,
0.18518078327178955,
-0.039913732558488846,
-0.02874632738530636,
0.00009750256140250713,
-0.07065247744321823,
-0.0032119061797857285,
-0.08193916082382202,
-0.06801912933588028,
0.02010529302060604,
-0.21451762318611145,
-0.003667445620521903,
-0.17631565034389496,
-0.14895561337471008,
0.010303511284291744,
0.12066332995891571,
-0.04705237224698067,
-0.02358497865498066,
0.0538557805120945,
0.04150534048676491,
-0.014486018568277359,
-0.011378061957657337,
-0.04053275287151337,
0.012166795320808887,
0.06129895895719528,
-0.052980758249759674,
0.13480594754219055,
-0.19403627514839172,
0.08077558875083923,
-0.159065842628479,
0.039462797343730927,
-0.05531485378742218,
0.01156557071954012,
0.01939735934138298,
-0.03097664564847946,
-0.06490075588226318,
-0.0953403040766716,
-0.061434052884578705,
0.04509966820478439,
0.03687341511249542,
0.08223311603069305,
-0.07009884715080261,
-0.024967238306999207,
0.029560402035713196,
-0.2303050309419632,
-0.07655498385429382,
0.09731195867061615,
0.006799745373427868,
0.13049902021884918,
-0.0269265566021204,
0.15599912405014038,
0.046008430421352386,
-0.06105823814868927,
-0.011178063228726387,
0.0178590789437294,
-0.07952944934368134,
-0.13448065519332886,
0.05643121525645256,
0.07940245419740677,
0.06147471442818642,
0.04480048641562462,
-0.053243596106767654,
0.005886188242584467,
-0.059641752392053604,
-0.050680700689554214,
-0.03003605455160141,
-0.11731855571269989,
-0.01897159218788147,
0.04683198407292366,
0.0819263756275177,
0.033968403935432434,
-0.010097213089466095,
0.015210340730845928,
0.04213172569870949,
-0.09357544034719467,
0.007557386998087168,
-0.04785935580730438,
0.10998078435659409,
-0.10163948684930801,
-0.004282429348677397,
-0.12170318514108658,
0.05800466611981392,
0.010165679268538952,
-0.07278372347354889,
0.042064324021339417,
0.13948100805282593,
0.06403683125972748,
0.10230912268161774,
0.008817270398139954,
-0.04333144798874855,
-0.0016238624230027199,
-0.019608044996857643,
-0.054425258189439774,
-0.08808866143226624,
-0.07771280407905579,
-0.049219075590372086,
0.11944969743490219,
-0.0266705472022295,
0.022544095292687416,
-0.04205423220992088,
0.037120770663022995,
-0.019994936883449554,
-0.04900025203824043,
-0.032457489520311356,
-0.015065188519656658,
-0.05203790217638016,
-0.041658274829387665,
0.11665598303079605,
0.015099593438208103,
0.09136530011892319,
0.09304354339838028,
-0.06907059252262115,
0.038806118071079254,
0.17705312371253967,
-0.07940691709518433,
-0.028096381574869156,
0.042219724506139755,
-0.04639478772878647,
-0.0396367609500885,
0.05120624229311943,
-0.0007741457666270435,
0.16892185807228088,
-0.047439660876989365,
0.10315399616956711,
-0.07128825783729553,
0.04106368497014046,
0.11074581742286682,
-0.06411316990852356,
-0.12192008644342422,
0.04635845497250557,
0.09103967994451523,
-0.08773701637983322,
0.1638869196176529,
0.11855048686265945,
-0.08247846364974976,
0.09477576613426208,
0.015163770876824856,
-0.05090909078717232,
0.03171989321708679,
-0.02307393029332161,
-0.012614298611879349,
0.1402408629655838,
-0.2222605049610138,
-0.09345981478691101,
0.09994412958621979,
-0.030532678589224815,
0.025621507316827774,
-0.11617699265480042,
0.025782983750104904,
0.012014191597700119,
0.051744841039180756,
0.004976009950041771,
0.05160870775580406,
-0.057088326662778854,
0.07403187453746796,
-0.003579039592295885,
-0.05223478376865387,
0.06035979837179184,
0.017155678942799568,
-0.06847336888313293,
0.08169329166412354,
-0.029942113906145096,
-0.2617427110671997,
-0.14422422647476196,
0.022709866985678673,
-0.11299290508031845,
0.0235152468085289,
0.016383664682507515,
0.001774114673025906,
-0.04355242848396301,
-0.05514056608080864,
-0.027246860787272453,
0.008772493340075016,
-0.020984094589948654,
0.13428132236003876,
-0.06990364193916321,
0.04987342655658722,
-0.09074671566486359,
-0.009360585361719131,
-0.07259435206651688,
-0.09367382526397705,
0.11350835114717484,
-0.05183228850364685,
0.11363210529088974,
0.07998385280370712,
-0.08006183058023453,
0.053158972412347794,
0.03570764884352684,
0.19093739986419678,
-0.015353123657405376,
0.051663029938936234,
0.2817400395870209,
0.04576679691672325,
0.038303013890981674,
0.0825176015496254,
-0.03934727981686592,
-0.0338226854801178,
-0.04954403266310692,
0.0007023399230092764,
-0.1261124163866043,
-0.08150449395179749,
-0.02801699750125408,
-0.020101912319660187,
0.032939355820417404,
0.12819400429725647,
0.0581657737493515,
0.03166592866182327,
0.1687966138124466,
0.011136719956994057,
0.08770498633384705,
-0.021297525614500046,
0.07118101418018341,
-0.029918286949396133,
-0.006671954412013292,
0.09864810854196548,
-0.07125525921583176,
0.01875261403620243,
0.12571948766708374,
0.025940753519535065,
0.2010277658700943,
-0.0788746103644371,
-0.10286369919776917,
0.049436476081609726,
0.16287072002887726,
0.0764661431312561,
0.18242184817790985,
-0.12379831075668335,
0.03674783930182457,
-0.04097278043627739,
-0.041532643139362335,
-0.10226668417453766,
0.07465850561857224,
-0.09056589752435684,
0.01233987882733345,
-0.0065603540278971195,
0.03591011464595795,
-0.0014833310851827264,
0.19370953738689423,
0.054274849593639374,
-0.28466489911079407,
-0.06898551434278488,
-0.018785342574119568,
0.08058277517557144,
-0.17758776247501373,
0.02146158739924431,
0.09641338139772415,
-0.041009269654750824,
0.057222671806812286,
-0.06507885456085205,
0.0327838659286499,
-0.14445526897907257,
0.007717054337263107,
0.11730635911226273,
0.02367366850376129,
0.10012687742710114,
0.06188371405005455,
-0.10878019034862518,
0.16437497735023499,
-0.04065649211406708,
-0.03099730610847473,
-0.11961131542921066,
0.0014595333486795425,
-0.013832569122314453,
0.12481134384870529,
0.18207506835460663,
0.04270119220018387,
0.14579205214977264,
-0.12651574611663818,
-0.0636180192232132,
-0.009516437537968159,
0.04603865370154381,
-0.0023090264294296503,
-0.04046567529439926,
0.013768650591373444,
0.02715275250375271,
-0.021328547969460487,
0.1896214783191681,
0.005686384625732899,
-0.13438840210437775,
0.07044512778520584,
0.013101283460855484,
0.04328606277704239,
-0.024725468829274178,
-0.11388496309518814,
-0.12962567806243896,
0.11846456676721573,
-0.03534329682588577,
-0.07997407019138336,
-0.08653992414474487,
0.007802811451256275,
0.04102377966046333,
-0.045853037387132645,
0.06362096220254898,
-0.11793415993452072,
0.037904851138591766,
-0.04115862771868706,
-0.18286313116550446,
0.03274141997098923,
-0.07150755822658539,
-0.07653164118528366,
0.001907181809656322,
-0.007228446193039417,
0.025136200711131096,
-0.012585856020450592,
-0.01190172228962183,
0.030271554365754128,
-0.11236288398504257,
-0.045779868960380554,
0.11752942949533463,
0.09410379827022552,
0.018510960042476654,
-0.01070982776582241,
-0.012145867571234703,
-0.1654479205608368,
0.07325410842895508,
0.04414346069097519,
0.0011472785845398903,
0.0619250126183033,
-0.05755286663770676,
0.06768925487995148,
0.23596426844596863,
-0.05206327140331268,
-0.3051014840602875,
-0.02223563753068447,
0.013658931478857994,
0.024867741391062737,
-0.11663339287042618,
-0.11083263158798218,
0.07790867984294891,
0.14241865277290344,
-0.030839357525110245,
0.0949736014008522,
-0.1281360536813736,
-0.0851445272564888,
0.07842857390642166,
0.14287292957305908,
0.2677106559276581,
-0.13051234185695648,
0.02335774153470993,
-0.012030391953885555,
-0.16323915123939514,
0.07645373046398163,
-0.030760107561945915,
0.08616317808628082,
-0.06357768177986145,
0.05285681039094925,
0.0022747074253857136,
-0.07436854392290115,
0.09998569637537003,
-0.05842817202210426,
0.07474713772535324,
-0.11995051056146622,
-0.04311200976371765,
0.14384573698043823,
-0.025537217035889626,
0.15886610746383667,
0.03506094589829445,
0.026410240679979324,
-0.02500802092254162,
-0.04683404788374901,
-0.13023433089256287,
-0.003757257480174303,
0.03647765889763832,
-0.07598879188299179,
-0.028885552659630775,
0.05460938811302185,
-0.008042634464800358,
0.0012549555394798517,
0.038797035813331604,
0.008099817670881748,
0.02927052229642868,
0.19955074787139893,
-0.01577819138765335,
-0.0820404589176178,
-0.15686248242855072,
-0.1308773308992386,
-0.040142789483070374,
0.09244643896818161,
-0.12907721102237701,
-0.009021137841045856,
0.09618263691663742,
0.017722545191645622,
0.0536259263753891,
0.026632124558091164,
-0.05714667961001396,
0.04330157861113548,
0.039259400218725204,
-0.18362019956111908,
-0.011507870629429817,
0.019687537103891373,
0.023449338972568512,
0.058813728392124176,
0.15695679187774658,
0.11930292844772339,
-0.12420608103275299,
0.010111374780535698,
-0.03877336531877518,
0.03235166519880295,
-0.024144573137164116,
0.026365317404270172,
0.07363365590572357,
0.016492009162902832,
-0.09352286905050278,
0.13497737050056458,
0.07106278091669083,
-0.049293674528598785,
0.02892516739666462,
0.013346823863685131,
-0.08057279884815216,
-0.10415325313806534,
-0.010217768140137196,
0.03551081568002701,
-0.029455389827489853,
-0.08240799605846405,
-0.005167135503143072,
-0.09885042905807495,
-0.0010293640661984682,
0.07459557801485062,
0.07877780497074127,
0.007346491329371929,
-0.050623003393411636,
-0.0206072349101305,
-0.0997212752699852,
0.0019198490772396326,
-0.04732148349285126,
0.03951985761523247,
-0.14385445415973663,
-0.01657770201563835,
0.0635850727558136,
0.11229415237903595,
-0.08354287594556808,
-0.06856674700975418,
-0.026366861537098885,
0.0077906204387545586,
-0.022830523550510406,
0.03033554181456566,
-0.04648025706410408,
-0.022676460444927216,
0.012674245983362198,
-0.03745075687766075,
-0.0472022145986557,
0.06419271975755692,
-0.07389823347330093,
0.0255034901201725,
0.01885918714106083,
-0.01453353837132454,
-0.07293891161680222,
-0.04480279982089996,
-0.040324077010154724,
-0.001996882725507021,
0.10391788929700851,
0.01977638341486454,
-0.09254255890846252,
-0.08615937829017639,
-0.09895312786102295,
-0.04286249354481697,
0.07533371448516846,
0.05106526240706444,
0.0185895673930645,
0.006231848616153002,
0.005502806976437569,
0.05494823679327965,
-0.08048398047685623,
0.002571666147559881,
0.19255048036575317,
-0.034874189645051956,
0.03943030536174774,
0.042584385722875595,
-0.02524392493069172,
-0.07792302221059799,
0.06348927319049835,
0.09869883209466934,
0.07748492807149887,
0.018133113160729408,
-0.011834219098091125,
0.04322640597820282,
-0.10797244310379028,
-0.002123410813510418,
0.05572113022208214,
-0.09916244447231293,
0.005032923072576523,
-0.0565747432410717,
0.01307789608836174,
0.010231158696115017,
0.17145757377147675,
0.08164233714342117,
-0.08686426281929016,
-0.03326099365949631,
0.10663159936666489,
-0.09274839609861374,
0.008429720997810364,
0.07492165267467499,
-0.029623955488204956,
0.04162158817052841,
0.03279957175254822,
0.10514429956674576,
0.009672481566667557,
0.16441930830478668,
0.06653391569852829,
0.07678955048322678,
0.06107937544584274,
0.09816766530275345,
0.06970231980085373,
-0.003952060826122761,
-0.06302919238805771,
-0.07543855160474777,
-0.05821256339550018,
0.09231673181056976,
-0.09705204516649246,
0.07559096813201904,
0.03856714442372322,
-0.0469268299639225,
-0.00003944383570342325,
-0.041525572538375854,
-0.049077972769737244,
-0.011416374705731869,
-0.2856755256652832,
-0.04078550636768341,
-0.09003772586584091,
0.015639590099453926,
0.015553266741335392,
-0.0015750994207337499,
0.12408997863531113,
0.048267487436532974,
-0.05263770371675491,
0.08649291098117828,
-0.14002782106399536,
-0.020933998748660088,
0.09589534252882004,
0.02721572108566761,
-0.056270599365234375,
-0.13828358054161072,
-0.04538482800126076,
0.04011854901909828,
0.07175004482269287,
0.060596026480197906,
0.027660051360726357,
0.11182112991809845,
0.019780177623033524,
0.016137611120939255,
-0.11213960498571396,
-0.04387906938791275,
0.015419048257172108,
0.02536771260201931,
0.02461910806596279,
-0.08805441856384277,
0.036204349249601364,
-0.03476593643426895,
0.0012175278970971704,
-0.0187865998595953,
-0.015503576956689358,
-0.05086468532681465,
0.09726212918758392,
-0.1269182413816452,
-0.019424090161919594,
0.04080137610435486,
-0.06336904317140579,
-0.07583469152450562,
0.2872997522354126,
0.2122873067855835,
-0.041305266320705414,
-0.016366582363843918,
0.05424023047089577,
-0.00009219321509590372,
-0.01969975046813488,
0.06785783916711807,
0.08889692276716232,
0.18714268505573273,
-0.05617189034819603,
0.046592384576797485,
-0.06114044412970543,
-0.012659953907132149,
-0.011859126389026642,
0.08334296941757202,
-0.015429701656103134,
0.0035226764157414436,
-0.14202216267585754,
-0.0849190354347229,
-0.01727752573788166,
-0.07279481738805771,
0.24402008950710297,
-0.1258227825164795,
-0.06323837488889694,
-0.020267978310585022,
0.1062692329287529,
0.018952734768390656,
0.005602353718131781,
-0.03545842319726944,
0.04681982845067978,
0.06132739037275314,
0.012713602744042873,
-0.1457195281982422,
-0.08475875109434128,
0.043150436133146286,
-0.10552898794412613,
0.16781416535377502,
-0.028257114812731743,
0.07943842560052872,
0.020185744389891624,
0.01060167234390974,
-0.11018753051757812,
-0.003373143495991826,
-0.07094495743513107,
-0.07101406902074814,
-0.06994111835956573,
0.08447065204381943,
-0.019364094361662865,
-0.03613583743572235,
-0.022957468405365944,
-0.07904792577028275,
-0.0648830309510231,
0.017826728522777557,
-0.04985291138291359,
-0.02530013397336006,
0.02927573025226593,
-0.08888513594865799,
0.12493284046649933,
0.05639396607875824,
0.027880162000656128,
-0.12455864995718002,
-0.0711509957909584,
0.056024566292762756,
-0.004993706941604614,
-0.11984552443027496,
0.04885377362370491,
-0.11142705380916595,
-0.012195153161883354,
-0.1426004320383072,
0.047543998807668686,
-0.1316470056772232,
-0.057698871940374374,
-0.09470677375793457,
-0.0646696463227272,
-0.0692184790968895,
-0.02356891706585884,
0.07487881928682327,
0.02154003083705902,
-0.050235457718372345,
0.1040215790271759,
0.008645489811897278,
0.04449249431490898,
-0.049523841589689255,
-0.08492623269557953
] |
null | null |
transformers
|
# Perceiver IO for vision (fixed Fourier position embeddings)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied to any modality (text, images, audio, video, ...). The core idea is to apply the self-attention mechanism to a not-too-large set of latent vectors (e.g. 256 or 512), and to use the inputs only to perform cross-attention with the latents. As a result, the time and memory requirements of the self-attention mechanism do not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which make it possible to flexibly decode the final hidden states of the latents into outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values rather than on patches, as is done in ViT. This particular model adds only fixed 2D Fourier position embeddings to the pixel values.
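A minimal sketch of such fixed 2D Fourier features is shown below (the band count and maximum frequency are illustrative assumptions; the checkpoint's own configuration may differ). Each (row, col) position in [-1, 1]^2 is encoded with sines and cosines at log-spaced frequencies, and the resulting features are concatenated to the pixel values:

```python
import math
import torch

def fourier_features_2d(height, width, num_bands=4, max_freq=10.0):
    rows = torch.linspace(-1.0, 1.0, height)
    cols = torch.linspace(-1.0, 1.0, width)
    pos = torch.stack(torch.meshgrid(rows, cols, indexing="ij"), dim=-1)  # (H, W, 2)
    # log-spaced frequencies from 1 up to max_freq / 2 (the Nyquist frequency)
    freqs = torch.logspace(0.0, math.log10(max_freq / 2), num_bands)
    angles = math.pi * pos.unsqueeze(-1) * freqs             # (H, W, 2, num_bands)
    feats = torch.cat([angles.sin(), angles.cos()], dim=-1)  # (H, W, 2, 2 * num_bands)
    return torch.cat([pos, feats.flatten(2)], dim=-1)        # (H, W, 2 + 4 * num_bands)

pos_emb = fourier_features_2d(224, 224)
print(pos_emb.shape)  # torch.Size([224, 224, 18])
```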
Through pre-training, the model learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images, for instance, you can train a standard classifier by replacing the classification decoder.
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import PerceiverImageProcessor, PerceiverForImageClassificationFourier
import requests
from PIL import Image
processor = PerceiverImageProcessor.from_pretrained("deepmind/vision-perceiver-fourier")
model = PerceiverForImageClassificationFourier.from_pretrained("deepmind/vision-perceiver-fourier")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# prepare input
inputs = processor(image, return_tensors="pt").pixel_values
# forward pass
outputs = model(inputs)
logits = outputs.logits
print("Predicted class:", model.config.id2label[logits.argmax(-1).item()])
# should print: Predicted class: tabby, tabby cat
```
## Training data
This model was pretrained on [ImageNet](http://www.image-net.org/), a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
### Pretraining
Hyperparameter details can be found in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
This model achieves a top-1 accuracy of 79.0% on ImageNet-1k, and 84.5% when pre-trained on JFT-300M, a large-scale internal dataset of Google.
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"license": "apache-2.0", "datasets": ["imagenet"]}
|
image-classification
|
deepmind/vision-perceiver-fourier
|
[
"transformers",
"pytorch",
"perceiver",
"image-classification",
"dataset:imagenet",
"arxiv:2107.14795",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.14795"
] |
[] |
TAGS
#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# Perceiver IO for vision (fixed Fourier position embeddings)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds fixed Fourier 2D position embeddings to the pixel values.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.
## Intended uses & limitations
You can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
## Training data
This model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.
### Pretraining
Hyperparameter details can be found in Appendix H of the paper.
## Evaluation results
This model is able to achieve a top-1 accuracy of 79.0 on ImageNet-1k, and 84.5 when pre-trained on a large-scale dataset (JFT-300M, an internal dataset of Google).
### BibTeX entry and citation info
|
[
"# Perceiver IO for vision (fixed Fourier position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds fixed Fourier 2D position embeddings to the pixel values.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 79.0 on ImageNet-1k, and 84.5 when pre-trained on a large-scale dataset (JFT-300M, an internal dataset of Google).",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# Perceiver IO for vision (fixed Fourier position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds fixed Fourier 2D position embeddings to the pixel values.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 79.0 on ImageNet-1k, and 84.5 when pre-trained on a large-scale dataset (JFT-300M, an internal dataset of Google).",
"### BibTeX entry and citation info"
] |
[
65,
128,
367,
42,
17,
27,
3,
53,
20,
55,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# Perceiver IO for vision (fixed Fourier position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.10816529393196106,
0.04923240467905998,
-0.0005244706408120692,
0.1068199947476387,
0.11613842099905014,
0.05041851848363876,
0.16244040429592133,
0.032475151121616364,
0.0260093342512846,
0.059708576649427414,
0.08991403877735138,
0.046667322516441345,
0.03876509144902229,
0.13875128328800201,
0.018366605043411255,
-0.18155549466609955,
0.044677458703517914,
0.08757901191711426,
0.02982036955654621,
0.11256139725446701,
0.08651795238256454,
-0.09887488931417465,
0.1291598677635193,
0.02874349057674408,
-0.24858248233795166,
-0.011530978605151176,
-0.057360053062438965,
-0.06435175240039825,
0.0858398973941803,
-0.0014247652143239975,
0.054452795535326004,
-0.006984248291701078,
0.08222085237503052,
0.029506823047995567,
0.012139406055212021,
0.09662146866321564,
-0.00884472019970417,
0.09060578048229218,
0.06339626759290695,
-0.013603228144347668,
0.04859576374292374,
0.0003873704990837723,
0.017758654430508614,
-0.012779301963746548,
-0.057060357183218,
-0.20709851384162903,
-0.010680868290364742,
0.23434892296791077,
-0.032285720109939575,
0.09248048067092896,
0.005575691815465689,
0.1009209156036377,
0.05506522208452225,
0.06667587161064148,
0.0503559336066246,
-0.16997982561588287,
-0.052328336983919144,
-0.02186642959713936,
-0.12102672457695007,
0.03306002914905548,
-0.06107059121131897,
-0.024185558781027794,
0.07777539640665054,
0.03391224518418312,
0.03572544455528259,
-0.010596456937491894,
-0.015230333432555199,
-0.11097559332847595,
-0.08961403369903564,
-0.052756767719984055,
0.19576019048690796,
0.03058186173439026,
-0.08202435821294785,
-0.04770629107952118,
-0.09782657772302628,
-0.0575362890958786,
0.04610464721918106,
-0.01768621616065502,
0.052236009389162064,
-0.034396085888147354,
0.05864381417632103,
-0.03219027817249298,
-0.10811900347471237,
-0.06281141936779022,
-0.20074622333049774,
0.04132216051220894,
0.02825622446835041,
0.12672394514083862,
-0.05695413053035736,
0.06714844703674316,
-0.1360539346933365,
-0.07659927755594254,
-0.04940938577055931,
-0.09431891143321991,
0.05356666073203087,
0.07047316431999207,
-0.026845259591937065,
-0.062274590134620667,
-0.08068383485078812,
0.08654885739088058,
0.0999232679605484,
0.0009289434528909624,
0.023768868297338486,
0.134065642952919,
0.10170216113328934,
0.09442359209060669,
-0.08674339950084686,
0.08449628949165344,
0.04630688950419426,
0.025867488235235214,
0.04092806950211525,
-0.05900394171476364,
-0.039581913501024246,
0.04243740439414978,
0.08242802321910858,
-0.11306915432214737,
-0.004339640028774738,
0.07663804292678833,
-0.005836923141032457,
-0.07675442099571228,
0.14216166734695435,
-0.07529278844594955,
-0.013685229234397411,
-0.01697980985045433,
-0.0669817104935646,
0.15170800685882568,
0.18242110311985016,
-0.03448489308357239,
-0.022647704929113388,
0.009989920072257519,
-0.07711177319288254,
0.0005096469540148973,
-0.06303727626800537,
-0.08323431015014648,
0.02154986932873726,
-0.20004300773143768,
0.006416414398699999,
-0.1737985908985138,
-0.10113636404275894,
-0.006019486580044031,
0.11226948350667953,
-0.053494371473789215,
-0.016202621161937714,
0.048935674130916595,
0.0462561659514904,
-0.013334804214537144,
-0.007264542859047651,
-0.043093856424093246,
0.009063011035323143,
0.06068350747227669,
-0.03607368841767311,
0.12966710329055786,
-0.18997633457183838,
0.08205290883779526,
-0.16781480610370636,
0.03743227943778038,
-0.03838510438799858,
0.016768213361501694,
0.018664151430130005,
-0.028036637231707573,
-0.06800460070371628,
-0.08692695945501328,
-0.06635402143001556,
0.045593272894620895,
0.045586224645376205,
0.0731315165758133,
-0.04259170964360237,
-0.03920334577560425,
0.02661203034222126,
-0.22114092111587524,
-0.07852310687303543,
0.09568663686513901,
0.00035578792449086905,
0.11956651508808136,
-0.029125530272722244,
0.14920753240585327,
0.051202110946178436,
-0.05135999619960785,
-0.01885325461626053,
0.014667707495391369,
-0.07639031112194061,
-0.12383224070072174,
0.04202740266919136,
0.07649899274110794,
0.0684562399983406,
0.03992997109889984,
-0.04246947541832924,
0.01289505884051323,
-0.05662746727466583,
-0.0506584569811821,
-0.029952874407172203,
-0.10308769345283508,
-0.036193519830703735,
0.048354923725128174,
0.08204599469900131,
0.015606645494699478,
-0.01597747579216957,
0.006556185428053141,
0.03994624316692352,
-0.09463714063167572,
0.013488054275512695,
-0.04659072309732437,
0.10795402526855469,
-0.10270369797945023,
-0.013649207539856434,
-0.14033935964107513,
0.05199132859706879,
0.0015697510680183768,
-0.04878848046064377,
0.04928586632013321,
0.12492011487483978,
0.06831406056880951,
0.1096767857670784,
0.014598914422094822,
-0.04780149832367897,
0.004637597128748894,
-0.019585339352488518,
-0.05608363449573517,
-0.11501539498567581,
-0.06840577721595764,
-0.049467217177152634,
0.15508070588111877,
-0.03551727533340454,
0.018711337819695473,
-0.028235986828804016,
0.04804205894470215,
-0.016796138137578964,
-0.05040229856967926,
-0.03444889560341835,
-0.00029403454391285777,
-0.06133977696299553,
-0.045795001089572906,
0.11098293215036392,
0.01148319710046053,
0.09155692905187607,
0.08269190788269043,
-0.08296588808298111,
0.03250560164451599,
0.18583519756793976,
-0.08648932725191116,
-0.025234373286366463,
0.04087977856397629,
-0.03387382999062538,
-0.028120843693614006,
0.043372612446546555,
0.004711820278316736,
0.17172552645206451,
-0.03686234727501869,
0.10351251810789108,
-0.05946787819266319,
0.03676344454288483,
0.11603433638811111,
-0.07211286574602127,
-0.13445128500461578,
0.051491010934114456,
0.10646399110555649,
-0.11141527444124222,
0.17476913332939148,
0.11457041651010513,
-0.06502597033977509,
0.0749669075012207,
0.012715701945126057,
-0.04554340988397598,
0.03813783451914787,
-0.02463768981397152,
-0.003300471929833293,
0.12994344532489777,
-0.23429132997989655,
-0.09586839377880096,
0.09679334610700607,
-0.035407017916440964,
0.012051925994455814,
-0.11911287903785706,
0.03552431985735893,
0.02305731736123562,
0.06197237968444824,
-0.0009583213832229376,
0.037385813891887665,
-0.0531483069062233,
0.07375101000070572,
-0.003997030667960644,
-0.05555348843336105,
0.06125493720173836,
0.020322274416685104,
-0.07807093858718872,
0.08045657724142075,
-0.04261406883597374,
-0.26493650674819946,
-0.14834530651569366,
-0.0017743080388754606,
-0.12855812907218933,
0.018529433757066727,
0.016441892832517624,
-0.00013583534746430814,
-0.04881022870540619,
-0.05849788337945938,
-0.016225598752498627,
0.011968071572482586,
-0.01272845733910799,
0.1336853951215744,
-0.07010432332754135,
0.06657898426055908,
-0.10020695626735687,
-0.010966992005705833,
-0.07475491613149643,
-0.0807562842965126,
0.12074097990989685,
-0.034819308668375015,
0.11942705512046814,
0.0768834576010704,
-0.07358871400356293,
0.04649445042014122,
0.031035682186484337,
0.21027910709381104,
-0.015394303016364574,
0.04345099627971649,
0.2655118703842163,
0.0437794029712677,
0.044726140797138214,
0.0865430235862732,
-0.035368453711271286,
-0.03945672884583473,
-0.044826406985521317,
0.019480707123875618,
-0.10944444686174393,
-0.07489053159952164,
-0.03649346902966499,
-0.02584248036146164,
0.023998189717531204,
0.1329079568386078,
0.06457798182964325,
0.01981234923005104,
0.15944841504096985,
0.006286223419010639,
0.08514420688152313,
-0.03305041790008545,
0.06915489584207535,
-0.010592591017484665,
0.0009000581339932978,
0.10806183516979218,
-0.06849870085716248,
0.013380328193306923,
0.1295681893825531,
0.023821450769901276,
0.18361689150333405,
-0.07791762053966522,
-0.10657209903001785,
0.04226623475551605,
0.17579558491706848,
0.0767790898680687,
0.18352605402469635,
-0.13042089343070984,
0.037368398159742355,
-0.03757605701684952,
-0.03927937522530556,
-0.11337999999523163,
0.07093129307031631,
-0.08692505955696106,
0.0041755251586437225,
-0.019581420347094536,
0.037280529737472534,
0.008561587892472744,
0.17771092057228088,
0.06237267702817917,
-0.2891570031642914,
-0.07958939671516418,
-0.021707970649003983,
0.07466515898704529,
-0.17529939115047455,
0.008376013487577438,
0.08363702148199081,
-0.0383639931678772,
0.06113847345113754,
-0.07193725556135178,
0.03150845691561699,
-0.14716807007789612,
0.006137460470199585,
0.10070371627807617,
0.0361366868019104,
0.08448832482099533,
0.06384696066379547,
-0.12176861613988876,
0.1611950546503067,
-0.03768305107951164,
-0.03880949690937996,
-0.11921466141939163,
0.011390620842576027,
-0.012657275423407555,
0.1236552968621254,
0.16568657755851746,
0.04129638895392418,
0.1423759013414383,
-0.1564933806657791,
-0.05709027498960495,
-0.012271800078451633,
0.053813621401786804,
0.009576719254255295,
-0.03131555765867233,
0.01536313071846962,
0.021643301472067833,
-0.01200732123106718,
0.17810684442520142,
0.0037433290854096413,
-0.15516646206378937,
0.06606636196374893,
-0.0008093229844234884,
0.03351794183254242,
-0.01989975944161415,
-0.11224650591611862,
-0.13126814365386963,
0.10581419616937637,
-0.04387005418539047,
-0.06491237878799438,
-0.0934639424085617,
0.035352207720279694,
0.029146159067749977,
-0.05569831281900406,
0.06308497488498688,
-0.12134547531604767,
0.0376829169690609,
-0.041523244231939316,
-0.17663565278053284,
0.048086024820804596,
-0.07305064052343369,
-0.06793548166751862,
-0.008469323627650738,
0.0036538110580295324,
0.02038760669529438,
-0.0205281563103199,
-0.01909813843667507,
0.032231446355581284,
-0.09831143170595169,
-0.04592301696538925,
0.12281268835067749,
0.07651861757040024,
0.0008640117011964321,
-0.010078405030071735,
-0.022400308400392532,
-0.1604083627462387,
0.058325495570898056,
0.04707234352827072,
0.02116416022181511,
0.06673147529363632,
-0.041022852063179016,
0.06850896030664444,
0.22411593794822693,
-0.05044156312942505,
-0.2919216752052307,
-0.009202690795063972,
0.0066346656531095505,
0.03720744326710701,
-0.10919199883937836,
-0.11998441070318222,
0.08919544517993927,
0.13924965262413025,
-0.020637502893805504,
0.08497956395149231,
-0.13087651133537292,
-0.08626164495944977,
0.08843417465686798,
0.14856809377670288,
0.24952250719070435,
-0.1253383606672287,
0.024024099111557007,
-0.01249169372022152,
-0.14345788955688477,
0.07926488667726517,
-0.022123636677861214,
0.09088216722011566,
-0.049593355506658554,
0.03919772803783417,
0.0020324818324297667,
-0.06483433395624161,
0.09442665427923203,
-0.05781136825680733,
0.07480771094560623,
-0.11607575416564941,
-0.04707184433937073,
0.15752021968364716,
-0.015547367744147778,
0.1396246999502182,
0.03665734454989433,
0.019943654537200928,
-0.03908013552427292,
-0.0418727844953537,
-0.10877227038145065,
0.010421630926430225,
0.034600868821144104,
-0.07427963614463806,
-0.031016696244478226,
0.051133736968040466,
0.0029679483268409967,
-0.0027416134253144264,
0.03869978338479996,
0.014481003396213055,
0.00526476139202714,
0.19782935082912445,
-0.020881015807390213,
-0.08232399076223373,
-0.17073781788349152,
-0.1300658881664276,
-0.03290427848696709,
0.09202838689088821,
-0.1253797858953476,
0.004949126858264208,
0.09663833677768707,
0.018082553520798683,
0.05637401342391968,
0.027181267738342285,
-0.06413021683692932,
0.04656000807881355,
0.04712327569723129,
-0.17295949161052704,
0.0015086587518453598,
0.03209572657942772,
0.004863068461418152,
0.05631306394934654,
0.14825771749019623,
0.14081308245658875,
-0.10881832242012024,
0.00819343701004982,
-0.03981250897049904,
0.03276431933045387,
-0.023565107956528664,
0.03510722890496254,
0.08564188331365585,
0.00807166937738657,
-0.09677616506814957,
0.1395816057920456,
0.08145184069871902,
-0.05257854238152504,
0.03731091320514679,
0.01144473161548376,
-0.0698007270693779,
-0.09690430015325546,
0.0038313670083880424,
0.061655789613723755,
-0.027331972494721413,
-0.07634106278419495,
-0.01664954610168934,
-0.09201152622699738,
-0.005795877426862717,
0.0734214335680008,
0.0678020715713501,
-0.000052357689128257334,
-0.06421427428722382,
-0.019985685124993324,
-0.11669504642486572,
0.0034316331148147583,
-0.03070693090558052,
0.04556220769882202,
-0.1495826542377472,
-0.022582031786441803,
0.05233945697546005,
0.10088184475898743,
-0.07420480251312256,
-0.07375283539295197,
-0.031986214220523834,
-0.002740701427683234,
0.0009474129765294492,
0.02096419222652912,
-0.049751125276088715,
-0.014615454711019993,
0.018821900710463524,
-0.04067903012037277,
-0.035445015877485275,
0.06536364555358887,
-0.06592490524053574,
0.013899104669690132,
0.014998550526797771,
-0.013707492500543594,
-0.0925520807504654,
-0.05558791384100914,
-0.032334230840206146,
-0.014652309007942677,
0.10793940722942352,
0.022831596434116364,
-0.08837538212537766,
-0.08799668401479721,
-0.1227283701300621,
-0.050449058413505554,
0.07772015035152435,
0.05724846571683884,
0.021510889753699303,
-0.007277123164385557,
0.012868063524365425,
0.05504319816827774,
-0.07802578061819077,
-0.0010582196991890669,
0.213752880692482,
-0.03605544939637184,
0.03002895787358284,
0.0354427807033062,
-0.031003927811980247,
-0.0724446102976799,
0.07218806445598602,
0.09991347789764404,
0.058179449290037155,
0.028268838301301003,
-0.019847050309181213,
0.042609039694070816,
-0.1203857958316803,
-0.003066220786422491,
0.05493812635540962,
-0.1047358438372612,
-0.0003250009031035006,
-0.059255439788103104,
0.021237529814243317,
0.004767165053635836,
0.16257548332214355,
0.08915761858224869,
-0.07286041229963303,
-0.0272012110799551,
0.10655710846185684,
-0.0849677249789238,
0.015931066125631332,
0.0771869346499443,
-0.01923815906047821,
0.03991306573152542,
0.03369871526956558,
0.10590893030166626,
0.027545396238565445,
0.17538666725158691,
0.06240612268447876,
0.07100364565849304,
0.059574682265520096,
0.08746493607759476,
0.07913308590650558,
-0.018819743767380714,
-0.06367218494415283,
-0.06945078819990158,
-0.05953844264149666,
0.09180477261543274,
-0.08843513578176498,
0.055239882320165634,
0.025942198932170868,
-0.052785828709602356,
0.002393541391938925,
-0.045679815113544464,
-0.05660078302025795,
-0.005495260003954172,
-0.28186890482902527,
-0.04484493285417557,
-0.10044051706790924,
0.013672880828380585,
0.0030271040741354227,
-0.0071476721204817295,
0.11481758207082748,
0.04479000344872475,
-0.050084296613931656,
0.08091410249471664,
-0.15053866803646088,
-0.020561402663588524,
0.09293675422668457,
0.027792148292064667,
-0.05564548447728157,
-0.12772122025489807,
-0.040434230118989944,
0.03659595921635628,
0.07505836337804794,
0.05785612761974335,
0.033592190593481064,
0.12964847683906555,
0.03365863859653473,
0.01587478071451187,
-0.12132735550403595,
-0.03909875079989433,
0.01074789185076952,
0.01660533994436264,
0.041164886206388474,
-0.07576427608728409,
0.047405507415533066,
-0.036713141947984695,
0.015789037570357323,
-0.01672287844121456,
-0.010930110700428486,
-0.05645860731601715,
0.11013460159301758,
-0.11121047288179398,
-0.012370530515909195,
0.03975670412182808,
-0.08800500631332397,
-0.06589546799659729,
0.2842481732368469,
0.20513468980789185,
-0.05384351313114166,
-0.011799558065831661,
0.052455659955739975,
-0.004627242684364319,
-0.013220568187534809,
0.0634269267320633,
0.09545525908470154,
0.1720310002565384,
-0.05728069692850113,
0.038676466792821884,
-0.05804714560508728,
-0.018217215314507484,
-0.010657180100679398,
0.10312213748693466,
-0.00827985443174839,
0.0006326433504000306,
-0.13874372839927673,
-0.0810193195939064,
-0.0021757343783974648,
-0.10037419199943542,
0.25499382615089417,
-0.13314343988895416,
-0.07530313730239868,
-0.02930579148232937,
0.09537502378225327,
0.03665501996874809,
0.01555714663118124,
-0.024664636701345444,
0.039347801357507706,
0.07544447481632233,
0.029979707673192024,
-0.14985913038253784,
-0.08811833709478378,
0.031292691826820374,
-0.13239631056785583,
0.15653470158576965,
-0.022113792598247528,
0.07565486431121826,
0.029582137241959572,
0.02315189316868782,
-0.10292764008045197,
-0.0012704016407951713,
-0.07187853753566742,
-0.06642942130565643,
-0.06196579709649086,
0.08196671307086945,
-0.008960792794823647,
-0.022875260561704636,
-0.01943528652191162,
-0.09283984452486038,
-0.0591411255300045,
0.013271113857626915,
-0.049885399639606476,
-0.028348149731755257,
0.03235189989209175,
-0.09501615166664124,
0.12325337529182434,
0.06480685621500015,
0.023578256368637085,
-0.12118573486804962,
-0.07288970798254013,
0.045703787356615067,
-0.008753789588809013,
-0.12390793859958649,
0.047785576432943344,
-0.11996088922023773,
-0.0033682554494589567,
-0.14082005620002747,
0.051212795078754425,
-0.1259588897228241,
-0.06420020759105682,
-0.07967503368854523,
-0.06586853414773941,
-0.0698045939207077,
-0.016198260709643364,
0.07960421591997147,
0.02009289525449276,
-0.056808970868587494,
0.11443787813186646,
0.013288712128996849,
0.050790879875421524,
-0.03818422555923462,
-0.07326456904411316
] |
null | null |
transformers
|
# Perceiver IO for vision (learned position embeddings)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Jaegle et al. and first released in [this repository](https://github.com/deepmind/deepmind-research/tree/master/perceiver).
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow the model to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/perceiver_architecture.jpg" alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds learned 1D position embeddings to the pixel values, hence it is given no privileged information about the 2D structure of images.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.
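To make the latent bottleneck concrete, here is a minimal, self-contained PyTorch sketch of the Perceiver IO pattern described above. It is illustrative only, not the DeepMind implementation: all sizes are assumptions chosen for readability, the real model stacks several attention blocks with layer norms and MLPs, and the learned position embeddings are omitted for brevity.
```python
import torch
import torch.nn as nn

class TinyPerceiverIO(nn.Module):
    def __init__(self, input_dim=3, latent_dim=128, num_latents=256, num_labels=1000):
        super().__init__()
        # A small, fixed-size set of latent vectors: attention cost becomes
        # O(num_latents * num_inputs) instead of O(num_inputs ** 2).
        self.latents = nn.Parameter(torch.randn(num_latents, latent_dim))
        self.input_proj = nn.Linear(input_dim, latent_dim)
        self.cross_attn = nn.MultiheadAttention(latent_dim, num_heads=1, batch_first=True)
        self.self_attn = nn.MultiheadAttention(latent_dim, num_heads=8, batch_first=True)
        # One decoder query per desired output; here a single classification query.
        self.decoder_query = nn.Parameter(torch.randn(1, latent_dim))
        self.decoder_attn = nn.MultiheadAttention(latent_dim, num_heads=1, batch_first=True)
        self.classifier = nn.Linear(latent_dim, num_labels)

    def forward(self, inputs):                      # inputs: (batch, num_inputs, input_dim)
        b = inputs.size(0)
        x = self.input_proj(inputs)
        lat = self.latents.expand(b, -1, -1)
        lat, _ = self.cross_attn(lat, x, x)         # encode: latents attend to the inputs
        lat, _ = self.self_attn(lat, lat, lat)      # process: self-attention over latents only
        q = self.decoder_query.expand(b, -1, -1)
        out, _ = self.decoder_attn(q, lat, lat)     # decode: query attends to the latents
        return self.classifier(out.squeeze(1))      # logits: (batch, num_labels)

pixels = torch.randn(2, 32 * 32, 3)                # a 32x32 RGB image flattened to a pixel sequence (small for speed)
print(TinyPerceiverIO()(pixels).shape)             # torch.Size([2, 1000])
```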
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=deepmind/perceiver) to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import PerceiverFeatureExtractor, PerceiverForImageClassificationLearned
import requests
from PIL import Image
feature_extractor = PerceiverFeatureExtractor.from_pretrained("deepmind/vision-perceiver-learned")
model = PerceiverForImageClassificationLearned.from_pretrained("deepmind/vision-perceiver-learned")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# prepare input
encoding = feature_extractor(image, return_tensors="pt")
inputs = encoding.pixel_values
# forward pass
outputs = model(inputs)
logits = outputs.logits
print("Predicted class:", model.config.id2label[logits.argmax(-1).item()])
# should print: Predicted class: tabby, tabby cat
```
## Training data
This model was pretrained on [ImageNet](http://www.image-net.org/), a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
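The `PerceiverFeatureExtractor` shown above handles this automatically; a hedged torchvision sketch of roughly equivalent preprocessing follows. The resize value and the normalization statistics are assumptions (standard ImageNet values); see Appendix H of the paper for the authoritative details.
```python
# A hedged sketch of the preprocessing described above (illustrative only).
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),                # resize the shorter side (assumed value)
    transforms.CenterCrop(224),            # center crop to 224x224
    transforms.ToTensor(),                 # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # assumed standard ImageNet stats
                         std=[0.229, 0.224, 0.225]),
])
```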
### Pretraining
Hyperparameter details can be found in Appendix H of the [paper](https://arxiv.org/abs/2107.14795).
## Evaluation results
This model is able to achieve a top-1 accuracy of 72.7 on ImageNet-1k, despite having no privileged information about the 2D structure of images.
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2107-14795,
author = {Andrew Jaegle and
Sebastian Borgeaud and
Jean{-}Baptiste Alayrac and
Carl Doersch and
Catalin Ionescu and
David Ding and
Skanda Koppula and
Daniel Zoran and
Andrew Brock and
Evan Shelhamer and
Olivier J. H{\'{e}}naff and
Matthew M. Botvinick and
Andrew Zisserman and
Oriol Vinyals and
Jo{\~{a}}o Carreira},
title = {Perceiver {IO:} {A} General Architecture for Structured Inputs {\&}
Outputs},
journal = {CoRR},
volume = {abs/2107.14795},
year = {2021},
url = {https://arxiv.org/abs/2107.14795},
eprinttype = {arXiv},
eprint = {2107.14795},
timestamp = {Tue, 03 Aug 2021 14:53:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2107-14795.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
{"license": "apache-2.0", "datasets": ["imagenet"]}
|
image-classification
|
deepmind/vision-perceiver-learned
|
[
"transformers",
"pytorch",
"perceiver",
"image-classification",
"dataset:imagenet",
"arxiv:2107.14795",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2107.14795"
] |
[] |
TAGS
#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# Perceiver IO for vision (learned position embeddings)
Perceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository.
Disclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Perceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs.
To decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).
<img src="URL alt="drawing" width="600"/>
<small> Perceiver IO architecture.</small>
As the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds learned 1D position embeddings to the pixel values, hence it is given no privileged information about the 2D structure of images.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.
## Intended uses & limitations
You can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.
### How to use
Here is how to use this model in PyTorch:
## Training data
This model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.
## Training procedure
### Preprocessing
Images are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.
### Pretraining
Hyperparameter details can be found in Appendix H of the paper.
## Evaluation results
This model is able to achieve a top-1 accuracy of 72.7 on ImageNet-1k, despite having no privileged information about the 2D structure of images.
### BibTeX entry and citation info
|
[
"# Perceiver IO for vision (learned position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds learned 1D position embeddings to the pixel values, hence it is given no privileged information about the 2D structure of images.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 72.7 on ImageNet-1k, despite having no privileged information about the 2D structure of images.",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Perceiver IO for vision (learned position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team.",
"## Model description\n\nPerceiver IO is a transformer encoder model that can be applied on any modality (text, images, audio, video, ...). The core idea is to employ the self-attention mechanism on a not-too-large set of latent vectors (e.g. 256 or 512), and only use the inputs to perform cross-attention with the latents. This allows for the time and memory requirements of the self-attention mechanism to not depend on the size of the inputs. \n\nTo decode, the authors employ so-called decoder queries, which allow to flexibly decode the final hidden states of the latents to produce outputs of arbitrary size and semantics. For image classification, the output is a tensor containing the logits, of shape (batch_size, num_labels).\n\n<img src=\"URL alt=\"drawing\" width=\"600\"/>\n\n<small> Perceiver IO architecture.</small>\n\nAs the time and memory requirements of the self-attention mechanism don't depend on the size of the inputs, the Perceiver IO authors can train the model directly on raw pixel values, rather than on patches as is done in ViT. This particular model only adds learned 1D position embeddings to the pixel values, hence it is given no privileged information about the 2D structure of images.\n\nBy pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by replacing the classification decoder.",
"## Intended uses & limitations\n\nYou can use the raw model for image classification. See the model hub to look for other fine-tuned versions on a task that may interest you.",
"### How to use\n\nHere is how to use this model in PyTorch:",
"## Training data\n\nThis model was pretrained on ImageNet, a dataset consisting of 14 million images and 1k classes.",
"## Training procedure",
"### Preprocessing\n\nImages are center cropped and resized to a resolution of 224x224 and normalized across the RGB channels. Note that data augmentation was used during pre-training, as explained in Appendix H of the paper.",
"### Pretraining\n\nHyperparameter details can be found in Appendix H of the paper.",
"## Evaluation results\n\nThis model is able to achieve a top-1 accuracy of 72.7 on ImageNet-1k, despite having no privileged information about the 2D structure of images.",
"### BibTeX entry and citation info"
] |
[
61,
127,
382,
42,
17,
27,
3,
53,
20,
41,
11
] |
[
"passage: TAGS\n#transformers #pytorch #perceiver #image-classification #dataset-imagenet #arxiv-2107.14795 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Perceiver IO for vision (learned position embeddings)\n\nPerceiver IO model pre-trained on ImageNet (14 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper Perceiver IO: A General Architecture for Structured Inputs & Outputs by Jaegle et al. and first released in this repository. \n\nDisclaimer: The team releasing Perceiver IO did not write a model card for this model so this model card has been written by the Hugging Face team."
] |
[
-0.09674649685621262,
0.04328707978129387,
-0.0004558498039841652,
0.09901127219200134,
0.12690170109272003,
0.05291036143898964,
0.16433173418045044,
0.03160715475678444,
0.04638105630874634,
0.04087941348552704,
0.0880332812666893,
0.04908977076411247,
0.040882714092731476,
0.13687588274478912,
0.03404347226023674,
-0.18784449994564056,
0.05706731975078583,
0.10475073009729385,
0.07364531606435776,
0.12063145637512207,
0.08121217787265778,
-0.10057774931192398,
0.12943950295448303,
0.027214227244257927,
-0.25753292441368103,
-0.01791957952082157,
-0.0668472945690155,
-0.06292878836393356,
0.06768681854009628,
-0.022672489285469055,
0.06210682913661003,
-0.008615458384156227,
0.1004631444811821,
0.034159157425165176,
0.012416389770805836,
0.09141874313354492,
-0.006950309965759516,
0.09119242429733276,
0.054881948977708817,
-0.006962755229324102,
0.0446569062769413,
0.021579736843705177,
0.012132016010582447,
-0.012919705361127853,
-0.04656239226460457,
-0.22423599660396576,
0.0026165081653743982,
0.25231876969337463,
-0.026002099737524986,
0.09150323271751404,
0.010732072405517101,
0.0997154712677002,
0.05314801260828972,
0.05631508305668831,
0.05853823199868202,
-0.1631792038679123,
-0.060149986296892166,
0.005617372691631317,
-0.11827123165130615,
0.03143760561943054,
-0.05751359090209007,
-0.03125421330332756,
0.07327941805124283,
0.0434432215988636,
0.03408914431929588,
0.0006173055153340101,
0.021652741357684135,
-0.11819443851709366,
-0.08434795588254929,
-0.06115946173667908,
0.1711411327123642,
0.019167425110936165,
-0.08210057765245438,
-0.05057657137513161,
-0.09092824906110764,
-0.059436291456222534,
0.04404723644256592,
-0.02649421989917755,
0.04878531023859978,
-0.027133462950587273,
0.06116332486271858,
-0.056960105895996094,
-0.09526863694190979,
-0.09064259380102158,
-0.18679678440093994,
0.035998426377773285,
0.042287494987249374,
0.1305665671825409,
-0.06657133996486664,
0.07896587997674942,
-0.11764151602983475,
-0.06859593838453293,
-0.060407429933547974,
-0.10583380609750748,
0.06097579374909401,
0.0716552883386612,
-0.009351913817226887,
-0.051251236349344254,
-0.09012624621391296,
0.07238901406526566,
0.10943585634231567,
0.005140059627592564,
0.0309948418289423,
0.14095772802829742,
0.0834333524107933,
0.09299299120903015,
-0.07643679529428482,
0.10910755395889282,
0.040206123143434525,
0.03444890305399895,
0.026554279029369354,
-0.047739047557115555,
-0.048566658049821854,
0.04559804126620293,
0.09030568599700928,
-0.11034861952066422,
-0.007036407943814993,
0.08004368096590042,
0.008136927150189877,
-0.08611983805894852,
0.1452985554933548,
-0.06599327176809311,
-0.03227057680487633,
-0.028900370001792908,
-0.06767626851797104,
0.17531509697437286,
0.1895575374364853,
-0.04310816153883934,
-0.03320171311497688,
0.006720138248056173,
-0.07085666060447693,
0.00034037872683256865,
-0.08078526705503464,
-0.06783970445394516,
0.01605231501162052,
-0.21224713325500488,
0.00013769371435046196,
-0.18016231060028076,
-0.15554073452949524,
0.008666087873280048,
0.12091652303934097,
-0.044087816029787064,
-0.019090861082077026,
0.042871903628110886,
0.04302456974983215,
-0.025533489882946014,
-0.013540173880755901,
-0.062292274087667465,
0.015626750886440277,
0.05800800025463104,
-0.04958142712712288,
0.1256030946969986,
-0.18653911352157593,
0.08407050371170044,
-0.1644384264945984,
0.039745964109897614,
-0.05895955488085747,
0.023095538839697838,
0.024220900610089302,
-0.02738102525472641,
-0.07034053653478622,
-0.09547728300094604,
-0.06844919919967651,
0.046799954026937485,
0.03309589996933937,
0.06871102005243301,
-0.051927655935287476,
-0.029632724821567535,
0.04838681221008301,
-0.2368001937866211,
-0.07962585240602493,
0.08159290999174118,
-0.00248371297493577,
0.14091648161411285,
-0.011800502426922321,
0.14905132353305817,
0.05447300150990486,
-0.06016504764556885,
-0.010481172241270542,
0.011781108565628529,
-0.08968862891197205,
-0.13671977818012238,
0.04968056082725525,
0.0803857147693634,
0.050076961517333984,
0.045191001147031784,
-0.06771131604909897,
0.01593165285885334,
-0.06252274662256241,
-0.046400200575590134,
-0.02314058504998684,
-0.11714217811822891,
-0.03029564581811428,
0.05122828856110573,
0.08371471613645554,
0.03611559420824051,
-0.011752909980714321,
0.03672344982624054,
0.03482593223452568,
-0.11050263792276382,
0.007690804544836283,
-0.05553516745567322,
0.09670726209878922,
-0.1032148003578186,
-0.0036863975692540407,
-0.1279178112745285,
0.04130655899643898,
0.011602095328271389,
-0.08502393960952759,
0.03975361958146095,
0.12111872434616089,
0.0711069405078888,
0.10936713218688965,
0.007332696113735437,
-0.050404150038957596,
-0.010324690490961075,
-0.02491126023232937,
-0.04759823903441429,
-0.0894518569111824,
-0.07878783345222473,
-0.04536602273583412,
0.11458740383386612,
-0.014475357718765736,
0.014969214797019958,
-0.05021506920456886,
0.04554612934589386,
-0.016469119116663933,
-0.052431944757699966,
-0.040074270218610764,
-0.01226150244474411,
-0.042528536170721054,
-0.047079578042030334,
0.1251792162656784,
0.014730383642017841,
0.10024017840623856,
0.09074285626411438,
-0.07175891101360321,
0.006986485794186592,
0.18364317715168,
-0.0862196609377861,
-0.021586323156952858,
0.04821879789233208,
-0.03973831608891487,
-0.036946143954992294,
0.0463457852602005,
-0.0014929830795153975,
0.1924891620874405,
-0.040022071450948715,
0.10724400728940964,
-0.07302933931350708,
0.041213296353816986,
0.11176194995641708,
-0.05583232641220093,
-0.12098220735788345,
0.04361307621002197,
0.10859030485153198,
-0.1046145036816597,
0.1713675707578659,
0.09961280971765518,
-0.06806620955467224,
0.08625789731740952,
0.00902030523866415,
-0.052283670753240585,
0.04101325199007988,
-0.014496290124952793,
-0.006858778651803732,
0.12602443993091583,
-0.22491693496704102,
-0.09301144629716873,
0.09371434897184372,
-0.025378769263625145,
0.02240581624209881,
-0.11651337891817093,
0.0319245308637619,
0.014178067445755005,
0.05173882469534874,
0.006183297839015722,
0.04182848706841469,
-0.062282267957925797,
0.06466678529977798,
-0.0008734787697903812,
-0.05308431014418602,
0.07191582769155502,
0.017177967354655266,
-0.07723934203386307,
0.07976291328668594,
-0.034351397305727005,
-0.2585808336734772,
-0.1457715779542923,
0.0105988048017025,
-0.11286979913711548,
0.024210812523961067,
0.013564036227762699,
-0.0013768995413556695,
-0.04472658038139343,
-0.047783371061086655,
-0.02753289043903351,
0.013425436802208424,
-0.026241039857268333,
0.15226377546787262,
-0.0663008987903595,
0.05364304780960083,
-0.09095830470323563,
-0.01178047712892294,
-0.0733129009604454,
-0.09036741405725479,
0.10650801658630371,
-0.06266141682863235,
0.10873701423406601,
0.07339514046907425,
-0.07562118768692017,
0.06091843545436859,
0.03544285520911217,
0.20679564774036407,
-0.009492841549217701,
0.03544304892420769,
0.2777853012084961,
0.04241437092423439,
0.03960103914141655,
0.06860682368278503,
-0.043535321950912476,
-0.04064134880900383,
-0.050239115953445435,
0.013861780054867268,
-0.12677913904190063,
-0.07702040672302246,
-0.025276338681578636,
-0.013391044922173023,
0.029574712738394737,
0.14007095992565155,
0.05857924744486809,
0.02260695956647396,
0.16760540008544922,
0.008548807352781296,
0.08649694174528122,
-0.018125424161553383,
0.07623632997274399,
-0.05101456120610237,
0.004779509734362364,
0.09792158007621765,
-0.06902449578046799,
0.014520754106342793,
0.1142083927989006,
0.02340712584555149,
0.20546801388263702,
-0.0719127207994461,
-0.09783647209405899,
0.04225859045982361,
0.16007207334041595,
0.0897577777504921,
0.19275523722171783,
-0.13380175828933716,
0.03801177069544792,
-0.032744329422712326,
-0.038905251771211624,
-0.10973941534757614,
0.07062895596027374,
-0.10196996480226517,
0.016148937866091728,
-0.0038244612514972687,
0.048059940338134766,
0.002554224105551839,
0.19557027518749237,
0.042859628796577454,
-0.29709288477897644,
-0.07105915993452072,
-0.028695495799183846,
0.07956692576408386,
-0.18152864277362823,
0.021704191341996193,
0.07553326338529587,
-0.04879103973507881,
0.054085031151771545,
-0.07854608446359634,
0.0383947528898716,
-0.16074959933757782,
0.00703382259234786,
0.12313132733106613,
0.033331554383039474,
0.10169702023267746,
0.06401073932647705,
-0.10695824027061462,
0.1574506014585495,
-0.04620980843901634,
-0.022570522502064705,
-0.12194564938545227,
0.000003474919822110678,
-0.012336470186710358,
0.11220699548721313,
0.18968217074871063,
0.041907090693712234,
0.14980851113796234,
-0.13397669792175293,
-0.06588143855333328,
-0.01119213830679655,
0.05504877492785454,
-0.005601903889328241,
-0.03292783722281456,
0.015627199783921242,
0.024759456515312195,
-0.01961384154856205,
0.16337618231773376,
0.005362285766750574,
-0.13407978415489197,
0.06357017904520035,
-0.0020083002746105194,
0.06352001428604126,
-0.019521286711096764,
-0.11124920845031738,
-0.1338941752910614,
0.10145173221826553,
-0.04733650013804436,
-0.08144807070493698,
-0.08692898601293564,
0.016892632469534874,
0.02632327564060688,
-0.05126906931400299,
0.06420647352933884,
-0.1147213876247406,
0.04467281326651573,
-0.03805078938603401,
-0.1965666562318802,
0.035718999803066254,
-0.07185990363359451,
-0.06945235282182693,
0.009498764760792255,
-0.0030033967923372984,
0.03979462757706642,
-0.024088265374302864,
-0.010264047421514988,
0.02507970668375492,
-0.10045895725488663,
-0.04476504027843475,
0.12017375975847244,
0.09669984132051468,
0.008915216661989689,
-0.008125602267682552,
-0.022202901542186737,
-0.14283046126365662,
0.06579383462667465,
0.05902961269021034,
0.008711115457117558,
0.04850165918469429,
-0.04614053666591644,
0.06892482191324234,
0.2352575808763504,
-0.05643872916698456,
-0.3025001585483551,
-0.016412515193223953,
0.022668780758976936,
0.024280520156025887,
-0.12197650223970413,
-0.12803524732589722,
0.09491535276174545,
0.14361576735973358,
-0.023641375824809074,
0.07574084401130676,
-0.12423936277627945,
-0.08226613700389862,
0.07673189789056778,
0.16204406321048737,
0.25587624311447144,
-0.11997269839048386,
0.023422205820679665,
-0.017773957923054695,
-0.16446425020694733,
0.08864665031433105,
-0.022492503747344017,
0.08935824781656265,
-0.05751931294798851,
0.037259411066770554,
-0.0009617205359973013,
-0.07467624545097351,
0.08763787895441055,
-0.0501011498272419,
0.07458364963531494,
-0.11383538693189621,
-0.03481753543019295,
0.152066171169281,
-0.01655193231999874,
0.15875692665576935,
0.04783986881375313,
0.023939678445458412,
-0.016761237755417824,
-0.05350349843502045,
-0.12638576328754425,
-0.013718738220632076,
0.04301303252577782,
-0.0805472731590271,
-0.019313765689730644,
0.05475044623017311,
0.00569562055170536,
0.007019151002168655,
0.049583613872528076,
0.01452337671071291,
0.0181173924356699,
0.18024064600467682,
-0.016333507373929024,
-0.0835408866405487,
-0.18550603091716766,
-0.14536307752132416,
-0.037068434059619904,
0.09467590600252151,
-0.11609219759702682,
-0.0001566254795761779,
0.10059843212366104,
0.018281295895576477,
0.059768881648778915,
0.03208419680595398,
-0.06395474821329117,
0.04054545611143112,
0.033385761082172394,
-0.17571254074573517,
-0.01328072976320982,
0.016135288402438164,
0.018820641562342644,
0.07146298140287399,
0.15304245054721832,
0.11949001997709274,
-0.12289959192276001,
0.011403493583202362,
-0.039390239864587784,
0.026315053924918175,
-0.022393377497792244,
0.017482295632362366,
0.0773836001753807,
0.01504459884017706,
-0.09855671972036362,
0.14787407219409943,
0.0698341503739357,
-0.03879636526107788,
0.03862879052758217,
0.013309959322214127,
-0.0896906927227974,
-0.09608852863311768,
-0.0025592572055757046,
0.06655386835336685,
-0.03715139999985695,
-0.08665292710065842,
-0.011781272478401661,
-0.09476155042648315,
0.0024009195622056723,
0.08220647275447845,
0.07966160029172897,
-0.0009878395358100533,
-0.054141685366630554,
-0.019115664064884186,
-0.11539856344461441,
-0.002748390892520547,
-0.05412907525897026,
0.037592194974422455,
-0.1493932157754898,
-0.03209682181477547,
0.06913038343191147,
0.11081516742706299,
-0.08010905981063843,
-0.07264012843370438,
-0.03350052982568741,
0.01663901098072529,
-0.0172441266477108,
0.03258982673287392,
-0.048561159521341324,
-0.017071379348635674,
0.021140454337000847,
-0.028421269729733467,
-0.03321313485503197,
0.07180424779653549,
-0.0747673436999321,
0.01816038228571415,
0.024644911289215088,
-0.019226662814617157,
-0.07171758264303207,
-0.045378852635622025,
-0.04032224416732788,
-0.002269020536914468,
0.10226235538721085,
0.024772735312581062,
-0.09276165813207626,
-0.07335951179265976,
-0.10593986511230469,
-0.04952852427959442,
0.07642433792352676,
0.0530608594417572,
0.010774281807243824,
-0.0020719922613352537,
0.012446408160030842,
0.05417879298329353,
-0.09055284410715103,
0.001579353935085237,
0.2061062455177307,
-0.037645403295755386,
0.03057243488729,
0.047939855605363846,
-0.010959237813949585,
-0.07570730894804001,
0.07270989567041397,
0.09478253126144409,
0.07319542020559311,
0.013621404767036438,
-0.015060127712786198,
0.04592248424887657,
-0.10909218341112137,
-0.01132963690906763,
0.0606980137526989,
-0.09629432111978531,
0.009905079379677773,
-0.056815434247255325,
0.012044115923345089,
0.005576490890234709,
0.15452374517917633,
0.07281716167926788,
-0.07630256563425064,
-0.030219828709959984,
0.10251184552907944,
-0.09706117957830429,
0.009428535588085651,
0.09135556221008301,
-0.025269849225878716,
0.04424094036221504,
0.050262805074453354,
0.1093427911400795,
0.014658864587545395,
0.16603946685791016,
0.06786514818668365,
0.0768362283706665,
0.04684880003333092,
0.09341377764940262,
0.08309539407491684,
0.0011150249047204852,
-0.06397467851638794,
-0.07028601318597794,
-0.04750135913491249,
0.08513364940881729,
-0.09464886784553528,
0.06810497492551804,
0.03230302408337593,
-0.037828292697668076,
-0.00041550429887138307,
-0.037283290177583694,
-0.05429963394999504,
-0.013321210630238056,
-0.2972763478755951,
-0.043727610260248184,
-0.09979379922151566,
0.022702476009726524,
0.01502235233783722,
-0.017769189551472664,
0.13480712473392487,
0.05027567222714424,
-0.05559314787387848,
0.08908388018608093,
-0.14526914060115814,
-0.025366825982928276,
0.09456438571214676,
0.02171614021062851,
-0.05562851205468178,
-0.13320203125476837,
-0.04534315690398216,
0.04615313187241554,
0.07300619035959244,
0.05769423022866249,
0.02867908962070942,
0.11795473098754883,
0.028028758242726326,
0.023918181657791138,
-0.12099969387054443,
-0.0358615517616272,
0.017178239300847054,
0.010989501141011715,
0.022179841995239258,
-0.0895237997174263,
0.03336533531546593,
-0.04033614322543144,
0.001889724750071764,
-0.0077498238533735275,
-0.011224820278584957,
-0.055474936962127686,
0.11560825258493423,
-0.11069818586111069,
-0.01578553579747677,
0.0489749014377594,
-0.0684855654835701,
-0.06875808537006378,
0.2968830168247223,
0.19375383853912354,
-0.041296593844890594,
-0.018090421333909035,
0.053621649742126465,
-0.0009443778544664383,
-0.011739839799702168,
0.07122887670993805,
0.08965996652841568,
0.16959165036678314,
-0.06094176694750786,
0.04472927749156952,
-0.057069774717092514,
-0.023417362943291664,
-0.009842040948569775,
0.07703138142824173,
-0.009873759001493454,
0.0037089858669787645,
-0.1456524133682251,
-0.0823451355099678,
-0.02823064476251602,
-0.06680107861757278,
0.2543430030345917,
-0.111483134329319,
-0.07033972442150116,
-0.03174546733498573,
0.09256260842084885,
0.025155290961265564,
0.001932295854203403,
-0.03227914497256279,
0.04417044296860695,
0.07359332591295242,
0.02408348023891449,
-0.14938747882843018,
-0.0974903404712677,
0.026507116854190826,
-0.10008013248443604,
0.16362513601779938,
-0.023796403780579567,
0.06982450187206268,
0.018269771710038185,
0.014920886605978012,
-0.10543438047170639,
-0.0013797236606478691,
-0.07600937783718109,
-0.06925927847623825,
-0.0708402767777443,
0.0886334478855133,
-0.022257806733250618,
-0.027729758992791176,
-0.032509759068489075,
-0.08882910013198853,
-0.0723624899983406,
0.017965028062462807,
-0.056275319308042526,
-0.02440771460533142,
0.027055926620960236,
-0.09321475774049759,
0.12458700686693192,
0.059576261788606644,
0.029865799471735954,
-0.12640628218650818,
-0.07254128158092499,
0.06254036724567413,
-0.003215165575966239,
-0.1323530375957489,
0.04790357127785683,
-0.11471682786941528,
-0.019443560391664505,
-0.1409839242696762,
0.04875437542796135,
-0.12536445260047913,
-0.0592188686132431,
-0.08569958060979843,
-0.07181825488805771,
-0.07235554605722427,
-0.02795299142599106,
0.07987742871046066,
0.02634517289698124,
-0.05522864684462547,
0.12659434974193573,
0.007773322518914938,
0.043649062514305115,
-0.05154651775956154,
-0.08992312103509903
] |
null | null |
transformers
|
# Aeona | Chatbot

A generative AI made using [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small).
It is recommended to use this model along with an [AIML Chatbot](https://github.com/deepsarda/Aeona-Aiml) to reduce load, get better replies, and add a name and personality to your bot.
Using an AIML chatbot also lets you hardcode some replies.
# AEONA
Aeona is a chatbot which hopes to be able to talk with humans as if it were a friend!
Its main target platform is Discord.
You can invite the bot [here](https://aeona.xyz).
To learn more about this project and chat with the AI, you can use this [website](https://aeona.xyz/).
Aeona works by using the context of the previous messages, guessing the personality of the human it is talking with, and adapting its own personality to better talk with the user.
# Participate and help the AI improve, or just hang out, at [Hugging Face discussions](https://huggingface.co/deepparag/Aeona/discussions)
## Goals
The goal is to create an AI which will work with AIML in order to produce the most human-like AI.
#### Why not an AI on its own?
For an AI on its own it is not realistically possible to learn about the user and store data on them, whereas an AIML chatbot can even execute code!
The goal of the AI is to generate responses where the AIML fails.
Hence the goal becomes to make an AI which has a wide variety of knowledge, yet is as small as possible!
So we use 3 datasets:
1. [Movielines](https://www.kaggle.com/Cornell-University/movie-dialog-corpus): the movie lines promote longer and more thought-out responses, but they can be very random. About 200k lines!
2. [Discord Messages](https://www.kaggle.com/jef1056/discord-data): the messages cover a wide variety of topics and were filtered to remove spam, which makes the AI highly varied but gives it a ready response to everyday questions. About 120 million messages!
3. A custom dataset scraped from my messages. These messages are very narrow; training on them teaches the AI to say sorry a lot of the time rather than send a random reply!
## Training
The Discord Messages dataset simply dwarfs the other datasets, hence the smaller datasets are repeated.
This leads to them covering each other's issues!
The AI has a context of 6 messages, which means it will look back as far as the user's 4th-most-recent message when replying.
[Example](https://huggingface.co/deepparag/Aeona-Beta/discussions/1)
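As a toy illustration of the balancing trick above, here is a hedged sketch of oversampling the smaller datasets; the counts and repeat factors below are made-up assumptions, not the actual training recipe.
```python
# Toy sketch: repeat the smaller datasets so the huge Discord corpus
# doesn't drown them out. All sizes and factors below are illustrative.
movielines = ["<movie line>"] * 200        # stand-in for ~200k movie lines
custom     = ["<personal message>"] * 50   # stand-in for the narrow custom data
discord    = ["<discord message>"] * 10_000

# Oversample the small sets until each source contributes a comparable share.
balanced = discord + movielines * 20 + custom * 100
print(len(balanced))  # 10_000 + 4_000 + 5_000 = 19_000 examples
```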
## Tips for Hugging Face inference
I recommend sending the user input together with the previous 3 AI and human responses.
Using more context than this will lead to useless responses; using less is alright, but the responses may be more random.
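A minimal sketch of this tip follows, assuming a simple alternating turn history (the `history` contents below are made-up example data); separating turns with `eos_token` follows the DialoGPT convention this model is based on.
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepparag/Aeona")

history = [  # (user, bot) exchanges, oldest first -- illustrative assumption
    ("hi", "Hello! How are you?"),
    ("good, you?", "I'm doing great!"),
    ("what are you up to?", "Just chatting with you."),
]
new_user_input = "nice, tell me a joke"

# Keep only the last 3 exchanges, then append the new user input.
turns = [t for pair in history[-3:] for t in pair] + [new_user_input]
prompt = tokenizer.eos_token.join(turns) + tokenizer.eos_token
input_ids = tokenizer.encode(prompt, return_tensors="pt")  # feed this to model.generate(...)
```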
## Evaluation
Below is a comparison of Aeona vs. other baselines on the mixed dataset given above using automatic evaluation metrics.
| Model | Perplexity |
|---|---|
| Seq2seq Baseline [3] | 29.8 |
| Wolf et al. [5] | 16.3 |
| GPT-2 baseline | 99.5 |
| DialoGPT baseline | 56.6 |
| DialoGPT finetuned | 11.4 |
| PersonaGPT | 10.2 |
| **Aeona** | **7.9** |
## Usage
Example:
```python
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("deepparag/Aeona")
model = AutoModelWithLMHead.from_pretrained("deepparag/Aeona")
# Let's chat for 4 lines
for step in range(4):
    # encode the new user input, add the eos_token and return a tensor in PyTorch
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')
    # print(new_user_input_ids)
    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids
    # generate a response while limiting the total chat history to 200 tokens
    chat_history_ids = model.generate(
        bot_input_ids, max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=4,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8
    )
    # pretty print the last output tokens from the bot
    print("Aeona: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
|
{"license": "mit", "tags": ["conversational"], "datasets": ["blended_skill_talk"], "metrics": ["accuracy", "f1", "perplexity"], "thumbnail": "https://images-ext-2.discordapp.net/external/Wvtx1L98EbA7DR2lpZPbDxDuO4qmKt03nZygATZtXgk/%3Fsize%3D4096/https/cdn.discordapp.com/avatars/931226824753700934/338a9e413bbceaeb9095a29e97d4fac0.png", "pipeline_tag": "conversational"}
|
text-generation
|
deepparag/Aeona
|
[
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"dataset:blended_skill_talk",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #dataset-blended_skill_talk #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
Aeona | Chatbot
===============
!Aeona Banner
A generative AI made using microsoft/DialoGPT-small.
It is recommended to use this model along with an AIML Chatbot to reduce load, get better replies, and add a name and personality to your bot.
Using an AIML chatbot also lets you hardcode some replies.
AEONA
=====
Aeona is a chatbot which hopes to be able to talk with humans as if it were a friend!
Its main target platform is Discord.
You can invite the bot here.
To learn more about this project and chat with the AI, you can use this website.
Aeona works by using the context of the previous messages, guessing the personality of the human it is talking with, and adapting its own personality to better talk with the user.
Participate and help the AI improve, or just hang out, at Hugging Face discussions
================================================================================
Goals
-----
The goal is to create an AI which will work with AIML in order to produce the most human-like AI.
#### Why not an AI on its own?
For an AI on its own it is not realistically possible to learn about the user and store data on them, whereas an AIML chatbot can even execute code!
The goal of the AI is to generate responses where the AIML fails.
Hence the goal becomes to make an AI which has a wide variety of knowledge, yet is as small as possible!
So we use 3 datasets:
1. Movielines: the movie lines promote longer and more thought-out responses, but they can be very random. About 200k lines!
2. Discord Messages: the messages cover a wide variety of topics and were filtered to remove spam, which makes the AI highly varied but gives it a ready response to everyday questions. About 120 million messages!
3. A custom dataset scraped from my messages. These messages are very narrow; training on them teaches the AI to say sorry a lot of the time rather than send a random reply!
Training
--------
The Discord Messages dataset simply dwarfs the other datasets, hence the smaller datasets are repeated.
This leads to them covering each other's issues!
The AI has a context of 6 messages, which means it will look back as far as the user's 4th-most-recent message when replying.
Example
Tips for Hugging Face inference
----------------------------------
```
I recommend sending the user input together with the
previous 3 AI and human responses.
Using more context than this will lead to useless responses; using less is alright, but the responses may be more random.
```
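As a concrete illustration of that tip, here is a minimal sketch that trims the chat history to the last seven turns (the new user input plus the previous three AI and three human responses) before generating. The trimming helper is an illustrative assumption; the generation parameters mirror this card's own example.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("deepparag/Aeona")
model = AutoModelForCausalLM.from_pretrained("deepparag/Aeona")

history = []  # one tensor of token ids per turn, user and bot alternating

def chat(user_text, max_turns=7):
    # 7 turns = the new user input plus the previous 3 AI and 3 human responses
    history.append(tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt"))
    context = torch.cat(history[-max_turns:], dim=-1)
    output_ids = model.generate(
        context,
        max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=4,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8,
    )
    reply_ids = output_ids[:, context.shape[-1]:]  # keep only the newly generated turn
    history.append(reply_ids)
    return tokenizer.decode(reply_ids[0], skip_special_tokens=True)
```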
Evaluation
----------
Below is a comparison of Aeona vs. other baselines on the mixed dataset given above using automatic evaluation metrics.
Usage
-----
Example:
|
[
"#### Why not an AI on its own?\n\n\nFor AI it is not possible (realistically) to learn about the user and store data on them, when compared to an AIML which can even execute code!\nThe goal of the AI is to generate responses where the AIML fails.\n\n\nHence the goals becomes to make an AI which has a wide variety of knowledge, yet be as small as possible!\nSo we use 3 dataset:-\n\n\n1. Movielines The movie lines promote longer and more thought out responses but it can be very random. About 200k lines!\n2. Discord Messages The messages are on a wide variety of topics filtered and removed spam which makes the AI highly random but gives it a very random response to every days questions! about 120 million messages!\n3. Custom dataset scrapped from my messages, These messages are very narrow teaching this dataset and sending a random reply will make the AI say sorry loads of time!\n\n\nTraining\n--------\n\n\nThe Discord Messages Dataset simply dwarfs the other datasets, Hence the data sets are repeated.\nThis leads to them covering each others issues!\n\n\nThe AI has a context of 6 messages which means it will reply until the 4th message from user.\nExample\n\n\nTips for Hugging Face interference\n----------------------------------\n\n\n\n```\nI recommend send the user input,\nprevious 3 AI and human responses.\n\nUsing more context than this will lead to useless responses but using less is alright but the responses may be random. \n\n```\n\nEvaluation\n----------\n\n\nBelow is a comparison of Aeona vs. other baselines on the mixed dataset given above using automatic evaluation metrics.\n\n\n\nUsage\n-----\n\n\nExample:"
] |
[
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #dataset-blended_skill_talk #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"#### Why not an AI on its own?\n\n\nFor AI it is not possible (realistically) to learn about the user and store data on them, when compared to an AIML which can even execute code!\nThe goal of the AI is to generate responses where the AIML fails.\n\n\nHence the goals becomes to make an AI which has a wide variety of knowledge, yet be as small as possible!\nSo we use 3 dataset:-\n\n\n1. Movielines The movie lines promote longer and more thought out responses but it can be very random. About 200k lines!\n2. Discord Messages The messages are on a wide variety of topics filtered and removed spam which makes the AI highly random but gives it a very random response to every days questions! about 120 million messages!\n3. Custom dataset scrapped from my messages, These messages are very narrow teaching this dataset and sending a random reply will make the AI say sorry loads of time!\n\n\nTraining\n--------\n\n\nThe Discord Messages Dataset simply dwarfs the other datasets, Hence the data sets are repeated.\nThis leads to them covering each others issues!\n\n\nThe AI has a context of 6 messages which means it will reply until the 4th message from user.\nExample\n\n\nTips for Hugging Face interference\n----------------------------------\n\n\n\n```\nI recommend send the user input,\nprevious 3 AI and human responses.\n\nUsing more context than this will lead to useless responses but using less is alright but the responses may be random. \n\n```\n\nEvaluation\n----------\n\n\nBelow is a comparison of Aeona vs. other baselines on the mixed dataset given above using automatic evaluation metrics.\n\n\n\nUsage\n-----\n\n\nExample:"
] |
[
77,
358
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #dataset-blended_skill_talk #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n#### Why not an AI on its own?\n\n\nFor AI it is not possible (realistically) to learn about the user and store data on them, when compared to an AIML which can even execute code!\nThe goal of the AI is to generate responses where the AIML fails.\n\n\nHence the goals becomes to make an AI which has a wide variety of knowledge, yet be as small as possible!\nSo we use 3 dataset:-\n\n\n1. Movielines The movie lines promote longer and more thought out responses but it can be very random. About 200k lines!\n2. Discord Messages The messages are on a wide variety of topics filtered and removed spam which makes the AI highly random but gives it a very random response to every days questions! about 120 million messages!\n3. Custom dataset scrapped from my messages, These messages are very narrow teaching this dataset and sending a random reply will make the AI say sorry loads of time!\n\n\nTraining\n--------\n\n\nThe Discord Messages Dataset simply dwarfs the other datasets, Hence the data sets are repeated.\nThis leads to them covering each others issues!\n\n\nThe AI has a context of 6 messages which means it will reply until the 4th message from user.\nExample\n\n\nTips for Hugging Face interference\n----------------------------------\n\n\n\n```\nI recommend send the user input,\nprevious 3 AI and human responses.\n\nUsing more context than this will lead to useless responses but using less is alright but the responses may be random. \n\n```\n\nEvaluation\n----------\n\n\nBelow is a comparison of Aeona vs. other baselines on the mixed dataset given above using automatic evaluation metrics.\n\n\n\nUsage\n-----\n\n\nExample:"
] |
[
-0.0006305525312200189,
0.05374990031123161,
-0.0037794781383126974,
0.067367322742939,
0.12123514711856842,
0.05027877539396286,
0.08440403640270233,
0.08415666967630386,
0.082182876765728,
0.12573988735675812,
0.01524052768945694,
-0.027766646817326546,
0.1017942726612091,
0.1537107676267624,
0.01278162281960249,
-0.19576716423034668,
0.04338866472244263,
-0.10423028469085693,
0.1720658540725708,
0.07625432312488556,
0.09510686993598938,
-0.08656313270330429,
0.0702720582485199,
-0.06050102785229683,
-0.015438955277204514,
-0.010968477465212345,
-0.019951537251472473,
0.003615055000409484,
0.07614479959011078,
0.0757153108716011,
0.020255086943507195,
0.02743089571595192,
0.05614246055483818,
-0.1523769497871399,
0.03714985027909279,
0.05194217711687088,
0.044708870351314545,
0.0451606921851635,
0.03209736943244934,
0.05907987058162689,
0.10608778893947601,
0.03680458664894104,
0.017252527177333832,
0.07970600575208664,
-0.08012817054986954,
-0.06272788345813751,
-0.135122612118721,
0.022440586239099503,
0.09316377341747284,
0.07979917526245117,
-0.0783807560801506,
0.13143739104270935,
-0.07143954932689667,
0.010514424182474613,
0.23871830105781555,
-0.1670733541250229,
-0.0006529003148898482,
0.0300727691501379,
-0.024128740653395653,
0.08918904513120651,
-0.08674844354391098,
-0.008360477164387703,
0.08376342803239822,
0.01251479797065258,
-0.046005625277757645,
0.032527096569538116,
0.041174545884132385,
-0.018353719264268875,
-0.07628172636032104,
-0.06907221674919128,
0.05775216221809387,
0.057330064475536346,
-0.10832607001066208,
-0.17337752878665924,
-0.011426434852182865,
-0.07422631233930588,
-0.03912915289402008,
-0.0013822235632687807,
-0.019950633868575096,
0.025076111778616905,
0.10095584392547607,
0.01938360370695591,
-0.061279796063899994,
-0.011003201827406883,
-0.026586370542645454,
0.02593904174864292,
0.053536556661129,
0.04696807265281677,
0.02441839687526226,
0.09678827971220016,
0.02605246566236019,
-0.03701414540410042,
-0.055190980434417725,
-0.0372733473777771,
-0.12249696254730225,
-0.007679655682295561,
-0.04743470624089241,
-0.01823371835052967,
-0.001490308903157711,
0.13960909843444824,
-0.007733630482107401,
0.038413166999816895,
-0.048095572739839554,
0.042257219552993774,
0.09804708510637283,
0.03181224316358566,
-0.1310182362794876,
-0.02639518491923809,
0.047911085188388824,
0.08614157885313034,
0.09095482528209686,
-0.05542858690023422,
0.003899170784279704,
0.06146416813135147,
0.06305130571126938,
0.07987183332443237,
0.015722883865237236,
0.05910647660493851,
-0.1060103178024292,
-0.003961301874369383,
0.13294242322444916,
-0.10656637698411942,
0.019800854846835136,
0.0516006238758564,
-0.08162353187799454,
0.0170144010335207,
-0.010833500884473324,
0.009280141443014145,
-0.02917639911174774,
0.09736304730176926,
-0.03461506590247154,
0.0008022553520277143,
-0.0978907123208046,
-0.09189385175704956,
0.05511479079723358,
0.02058403566479683,
-0.12153797596693039,
-0.10582152754068375,
-0.11572720110416412,
-0.02489604614675045,
-0.03546677529811859,
-0.0824771448969841,
-0.032386038452386856,
0.030086912214756012,
-0.02534562163054943,
-0.045087698847055435,
-0.005581453442573547,
-0.04278191551566124,
-0.04651649668812752,
0.0333278626203537,
-0.0702524483203888,
0.06294196844100952,
0.056327734142541885,
-0.0042604743503034115,
-0.06740046292543411,
0.009759770706295967,
-0.21818646788597107,
0.10948919504880905,
-0.12788984179496765,
-0.03347665071487427,
-0.05644683912396431,
-0.032524097710847855,
0.04440886899828911,
-0.0020864433608949184,
-0.0029531586915254593,
0.18984170258045197,
-0.2683698832988739,
-0.00945565477013588,
0.012197581119835377,
-0.1531594842672348,
-0.03776790574193001,
0.2227419912815094,
-0.053533732891082764,
-0.034217797219753265,
0.07236418128013611,
0.06050879880785942,
-0.1503801792860031,
-0.005693970248103142,
-0.15000203251838684,
-0.06876520067453384,
-0.0896797627210617,
0.2709287703037262,
0.04321913421154022,
0.021537411957979202,
-0.020323503762483597,
0.026187898591160774,
0.053225498646497726,
0.0028513020370155573,
-0.04904227703809738,
-0.05818009749054909,
0.014084781520068645,
-0.03231355547904968,
0.008456573821604252,
0.030436478555202484,
-0.008253934793174267,
-0.047342460602521896,
-0.07014647871255875,
-0.11116489768028259,
0.12245629727840424,
0.0002816690248437226,
0.03898175433278084,
-0.15269304811954498,
0.06368338316679001,
0.03775535523891449,
0.03413921967148781,
-0.11673349142074585,
-0.04108109325170517,
0.08597858995199203,
-0.048804547637701035,
0.08307328075170517,
0.024046948179602623,
0.017832238227128983,
0.07746049761772156,
-0.0578683502972126,
-0.004578989930450916,
-0.01627003774046898,
-0.002876238664612174,
-0.06940791010856628,
-0.1592935174703598,
-0.05631347373127937,
-0.03959931060671806,
0.1876733899116516,
-0.11771077662706375,
-0.008178545162081718,
0.034851014614105225,
0.1348278522491455,
0.006990357302129269,
-0.10876768827438354,
0.055479854345321655,
-0.0016758195124566555,
-0.07702584564685822,
-0.042248137295246124,
-0.012242467142641544,
-0.0424409881234169,
-0.0640551894903183,
0.04456229507923126,
-0.17546617984771729,
-0.16120052337646484,
0.08574897050857544,
0.11330296844244003,
-0.14234088361263275,
0.10069834440946579,
-0.06869283318519592,
-0.011675143614411354,
-0.07429534196853638,
-0.09383904933929443,
0.08552087098360062,
0.033380065113306046,
0.07808341085910797,
-0.06195170059800148,
0.03503518924117088,
0.0054712118580937386,
-0.03336377069354057,
-0.04135328158736229,
0.06179134547710419,
0.025782126933336258,
-0.07629469782114029,
0.062310684472322464,
-0.10684879124164581,
0.016094479709863663,
0.09452684968709946,
0.0035978169180452824,
-0.08652973920106888,
-0.050514090806245804,
0.05808379873633385,
0.061261970549821854,
0.04991006851196289,
0.043827325105667114,
0.05215634033083916,
0.06258122622966766,
-0.03461487591266632,
-0.01651979610323906,
-0.07141344994306564,
-0.0035605409648269415,
-0.004641389939934015,
-0.03763548657298088,
0.015604167245328426,
-0.015787625685334206,
0.03299039974808693,
0.1571493148803711,
-0.03311023861169815,
-0.00012907976633869112,
-0.01562461256980896,
-0.02623671293258667,
-0.11272665858268738,
0.10788878798484802,
-0.038823164999485016,
-0.1850469708442688,
-0.07448218017816544,
0.006081182509660721,
-0.06332910805940628,
0.004734111484140158,
0.020027529448270798,
-0.08633973449468613,
-0.10041699558496475,
-0.16751697659492493,
0.007123610936105251,
0.12108172476291656,
-0.01226651668548584,
0.07744227349758148,
-0.028866196051239967,
0.011816170997917652,
-0.06736534088850021,
0.011774156242609024,
-0.0262942872941494,
-0.05956197902560234,
0.062051426619291306,
0.006157036405056715,
0.08304768055677414,
0.09475447237491608,
0.01933734305202961,
-0.011099574156105518,
-0.015009826049208641,
0.23323240876197815,
-0.0664341002702713,
0.11032218486070633,
0.059799924492836,
-0.04751628637313843,
0.04867579787969589,
0.12262909859418869,
0.031981226056814194,
-0.06995980441570282,
0.06652316451072693,
0.08543556183576584,
-0.03803572803735733,
-0.18076062202453613,
-0.07525230199098587,
-0.012617882341146469,
0.06344028562307358,
-0.01640818454325199,
0.03740702196955681,
-0.00015321218234021217,
-0.018404752016067505,
-0.11710049957036972,
-0.06883291155099869,
0.05115934833884239,
0.07140399515628815,
-0.13145983219146729,
-0.004573971964418888,
0.05240984633564949,
-0.059301748871803284,
-0.016262007877230644,
0.16771407425403595,
-0.08886970579624176,
0.19930259883403778,
-0.05194561183452606,
0.11679491400718689,
0.016735799610614777,
0.051518600434064865,
0.03238140046596527,
0.01501509826630354,
-0.039156556129455566,
-0.024722430855035782,
-0.0715942531824112,
-0.04083580896258354,
0.007474223151803017,
0.10755916684865952,
0.05008243769407272,
-0.038041360676288605,
-0.01421481091529131,
0.04183264076709747,
0.080813467502594,
0.21568799018859863,
-0.011148635298013687,
-0.1037776991724968,
-0.04767997935414314,
0.0498836413025856,
-0.033854495733976364,
-0.0022512427531182766,
0.0017591420328244567,
0.11433181166648865,
-0.125013068318367,
0.07162471115589142,
-0.02715696021914482,
0.0502752847969532,
-0.08730760216712952,
0.005174499005079269,
-0.11444605141878128,
0.029246171936392784,
-0.028173815459012985,
0.06499330699443817,
-0.14258775115013123,
0.04386129975318909,
-0.011697428300976753,
0.04498680680990219,
-0.04885342717170715,
-0.011214202269911766,
0.05941150709986687,
0.05620996281504631,
0.12264731526374817,
0.0028326972387731075,
-0.06548308581113815,
-0.16221578419208527,
-0.06548215448856354,
-0.03293731436133385,
0.05438639596104622,
-0.0029680179432034492,
0.08586371690034866,
0.01228976622223854,
0.03972761332988739,
-0.00796574354171753,
0.11569825559854507,
-0.0850980132818222,
-0.14663873612880707,
0.07967869192361832,
0.03421687334775925,
0.01891225203871727,
-0.032552022486925125,
-0.07750128209590912,
-0.09796188771724701,
0.15599700808525085,
-0.11007634550333023,
-0.057080939412117004,
-0.07008199393749237,
0.029209276661276817,
0.09918612241744995,
-0.04907864332199097,
-0.06714913249015808,
-0.018466154113411903,
0.13169480860233307,
-0.0742151141166687,
-0.03581811487674713,
0.04234810173511505,
-0.06638062745332718,
-0.25080105662345886,
-0.04255983978509903,
0.04194849729537964,
0.042542293667793274,
0.09397287666797638,
-0.03446217626333237,
-0.02209177426993847,
0.042188119143247604,
-0.10242044925689697,
0.008991437964141369,
0.16855299472808838,
0.023296577855944633,
0.13514426350593567,
-0.034881338477134705,
-0.10980558395385742,
-0.14243918657302856,
-0.09124203026294708,
0.07824841886758804,
0.29851794242858887,
-0.10443158447742462,
0.10826340317726135,
0.11487983912229538,
-0.03909081965684891,
-0.21162916719913483,
-0.055187877267599106,
0.04259902238845825,
-0.012773683294653893,
0.07177677005529404,
-0.09146524965763092,
-0.048926420509815216,
0.08620187640190125,
-0.0377388522028923,
-0.053866252303123474,
-0.18961040675640106,
-0.11673547327518463,
0.022833388298749924,
0.02629994973540306,
0.07361249625682831,
-0.1358298361301422,
-0.006109029985964298,
-0.060239553451538086,
-0.04500552639365196,
0.06055649742484093,
0.12205629795789719,
0.09147346764802933,
-0.002550243865698576,
-0.003769197268411517,
0.05536704882979393,
-0.08166565746068954,
0.1519736796617508,
0.003812233218923211,
0.10356364399194717,
-0.09547930210828781,
-0.07181749492883682,
-0.030100753530859947,
-0.01803005300462246,
0.17836737632751465,
-0.04275093972682953,
0.0018087276257574558,
-0.0674397274851799,
-0.022667404264211655,
-0.0846220999956131,
0.017643408849835396,
0.0066809807904064655,
-0.00806089211255312,
-0.11832071840763092,
0.09219184517860413,
0.0755404531955719,
0.01341318804770708,
0.020353876054286957,
-0.04390420764684677,
0.02854405902326107,
0.21562907099723816,
0.04013447463512421,
-0.08002634346485138,
-0.04096773639321327,
-0.011640502139925957,
-0.0077376337721943855,
0.06431977450847626,
-0.027247929945588112,
0.02066277340054512,
0.06633798778057098,
-0.03679213672876358,
0.21830829977989197,
-0.01117006316781044,
-0.15978805720806122,
0.05667129158973694,
0.06352341175079346,
-0.09872712939977646,
-0.3581806421279907,
0.03327467292547226,
0.10865066945552826,
-0.046745020896196365,
-0.054123349487781525,
0.059262964874506,
-0.07394040375947952,
0.008625604212284088,
-0.0011926525039598346,
0.10611391067504883,
0.07018432766199112,
0.06775008141994476,
-0.05542318895459175,
0.054529376327991486,
-0.1270381212234497,
0.09119383245706558,
0.08082396537065506,
-0.07131421566009521,
0.07756365090608597,
0.09149082750082016,
-0.09671176970005035,
-0.016225425526499748,
-0.074663445353508,
0.07882947474718094,
0.031467437744140625,
-0.005931140389293432,
-0.04835217446088791,
-0.07076133042573929,
0.042806725949048996,
-0.03581903129816055,
0.008751897141337395,
0.06768978387117386,
0.010616680607199669,
-0.010902696289122105,
-0.0700628012418747,
0.05001988261938095,
0.06700652092695236,
0.03806142508983612,
-0.026783382520079613,
0.11819276958703995,
0.04674525558948517,
0.021078020334243774,
-0.030911855399608612,
-0.010123677551746368,
-0.031094882637262344,
-0.026235492900013924,
-0.18171082437038422,
-0.013434551656246185,
-0.08367320150136948,
-0.048962291330099106,
0.0004908511182293296,
-0.04944451525807381,
0.011054223403334618,
0.017179902642965317,
-0.043891649693250656,
-0.039324332028627396,
-0.009567221626639366,
0.06149809807538986,
-0.12106452137231827,
-0.03833664581179619,
0.08006983250379562,
-0.04763549938797951,
0.15721619129180908,
0.06547762453556061,
-0.04609610512852669,
-0.08180728554725647,
-0.020085925236344337,
-0.027632923796772957,
0.03529204800724983,
0.03240922838449478,
0.013583118095993996,
-0.1212407574057579,
0.003544325940310955,
-0.035010792315006256,
-0.06276490539312363,
0.05080009624361992,
0.1701854169368744,
-0.0879702940583229,
0.042258791625499725,
0.052702754735946655,
-0.025440046563744545,
-0.07629939913749695,
0.0007553389295935631,
0.03592856600880623,
0.01027265377342701,
0.13214416801929474,
-0.05052906274795532,
0.019283464178442955,
-0.1637403666973114,
-0.03786354511976242,
0.00316688884049654,
0.016587337478995323,
-0.07823245227336884,
0.001978927990421653,
0.06392232328653336,
-0.03365546092391014,
0.11506564170122147,
-0.058085642755031586,
-0.10654736310243607,
0.032522719353437424,
0.0736086517572403,
0.06080028787255287,
-0.03491092473268509,
-0.0660216361284256,
-0.030029870569705963,
-0.041577644646167755,
-0.03915608674287796,
-0.06175876781344414,
0.005579681135714054,
-0.05162579193711281,
0.07739315927028656,
0.06926154345273972,
0.12460924685001373,
-0.021090786904096603,
-0.006906584836542606,
-0.08551657944917679,
-0.0329420268535614,
0.0673723816871643,
-0.1413286030292511,
0.06444166600704193,
-0.01828988827764988,
0.07765042781829834,
0.18309438228607178,
-0.04879342019557953,
0.13204264640808105,
-0.09785376489162445,
-0.03801397606730461,
-0.025477727875113487,
-0.15360400080680847,
-0.044035062193870544,
-0.07692310214042664,
0.04408751800656319,
-0.07402864098548889,
0.08394307643175125,
0.09757792204618454,
0.03465529531240463,
-0.05825766548514366,
0.1131819635629654,
-0.11043020337820053,
-0.03557805344462395,
0.011558834463357925,
0.001964304130524397,
-0.056488003581762314,
0.1125374585390091,
-0.02354848012328148,
0.054894570261240005,
-0.01356619130820036,
0.07008544355630875,
0.1204974427819252,
0.06728382408618927,
0.05352753400802612,
-0.007076213601976633,
-0.08861067146062851,
0.03778534010052681,
-0.05301598459482193,
0.02950071170926094,
0.21141882240772247,
0.03981102257966995,
0.055844638496637344,
-0.016882577911019325,
0.13432583212852478,
-0.048252593725919724,
-0.00844285823404789,
-0.18632806837558746,
0.20240123569965363,
-0.02431178279221058,
0.021578310057520866,
-0.029164446517825127,
-0.07677356898784637,
-0.056362301111221313,
0.12336412817239761,
0.104268379509449,
-0.1293126344680786,
-0.0327647440135479,
-0.08469599485397339,
0.0017299192259088159,
-0.029332073405385017,
0.14127090573310852,
0.03539375215768814,
0.15524886548519135,
-0.05207062512636185,
0.17377418279647827,
-0.08914180845022202,
0.018014883622527122,
-0.09173217415809631,
0.2546197772026062,
-0.018288902938365936,
0.005083369091153145,
-0.1106223464012146,
0.008165127597749233,
-0.028782838955521584,
-0.22469967603683472,
-0.09018785506486893,
-0.13906501233577728,
-0.09130298346281052,
0.0560024157166481,
0.0469069704413414,
0.024533221498131752,
0.06402993947267532,
0.03196602314710617,
0.06543540209531784,
0.07578877359628677,
-0.035376567393541336,
-0.048209596425294876,
-0.0080981170758605,
0.033609237521886826,
-0.10614734888076782,
0.12000459432601929,
0.052387673407793045,
0.06230970099568367,
0.07634461671113968,
-0.056682150810956955,
-0.13021105527877808,
0.13123691082000732,
0.0396285355091095,
-0.014881815761327744,
-0.003478601574897766,
0.20496974885463715,
0.006538375746458769,
0.06603989750146866,
0.061153594404459,
-0.014846819452941418,
0.024809418246150017,
-0.009361844509840012,
-0.027493471279740334,
-0.10566767305135727,
0.08536293357610703,
-0.04908294603228569,
0.1089162603020668,
0.12679684162139893,
-0.033411454409360886,
0.02877972275018692,
-0.0073149437084794044,
-0.032536424696445465,
-0.004130282439291477,
0.1144782155752182,
0.006129554472863674,
-0.20627307891845703,
0.056068480014801025,
-0.0023977900855243206,
0.11471085250377655,
-0.1620451807975769,
-0.04352801665663719,
-0.02075754851102829,
-0.0596502311527729,
0.026400884613394737,
0.0916418507695198,
0.06132740154862404,
-0.0009042739402502775,
-0.013726347126066685,
-0.1416676789522171,
0.06101226061582565,
0.09996802359819412,
-0.03614290803670883,
-0.033583302050828934
] |
null | null |
transformers
|
A generative AI made using [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small).
Trained on:
https://www.kaggle.com/Cornell-University/movie-dialog-corpus
https://www.kaggle.com/jef1056/discord-data
Important:
The AI can be a bit weird at times, as it is still undergoing training!
At times it sends text like :<random_weird_words>:, since these are Discord emotes.
It also sends random @RandomName mentions, as it is trying to ping people.
This works well on Discord, but not so much on the web; such artifacts are easy enough to remove with [re.sub](https://docs.python.org/3/library/re.html#re.sub), as sketched below.
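For reference, a minimal cleanup along those lines might look like this; the exact regular expressions are illustrative assumptions and may need tuning for your data.

```python
import re

def clean_reply(text):
    text = re.sub(r":[A-Za-z0-9_]+:", "", text)  # drop Discord-style :emote: tokens
    text = re.sub(r"@\w+", "", text)             # drop stray @RandomName pings
    return re.sub(r"\s{2,}", " ", text).strip()  # collapse the leftover whitespace
```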
Issues:
Like all conversational AIs, the AI lacks a stable character and changes its name far too often. This can be solved by using an AIML chatbot to give it a stable character!
[Live Demo](https://dumbot-331213.uc.r.appspot.com/)
Example:
```python
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("deepparag/DumBot")
model = AutoModelWithLMHead.from_pretrained("deepparag/DumBot")

# Let's chat for 4 lines
for step in range(4):
    # encode the new user input, add the eos_token and return a PyTorch tensor
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 200 tokens
    chat_history_ids = model.generate(
        bot_input_ids, max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=4,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8
    )

    # pretty print the last output tokens from the bot
    print("DumBot: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
|
{"license": "mit", "tags": ["conversational"], "thumbnail": "https://cdn.discordapp.com/app-icons/870239976690970625/c02cae78ae105f07969cfd8f8ea3d0a0.png"}
|
text-generation
|
deepparag/DumBot-Beta
|
[
"transformers",
"pytorch",
"gpt_neo",
"text-generation",
"conversational",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt_neo #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
A generative AI made using microsoft/DialoGPT-small.
Trained on:
URL
URL
Important:
The AI can be a bit weird at times, as it is still undergoing training!
At times it sends text like :<random_weird_words>:, since these are Discord emotes.
It also sends random @RandomName mentions, as it is trying to ping people.
This works well on Discord, but not so much on the web; such artifacts are easy enough to remove using URL
Issues:
Like all conversational AIs, the AI lacks a stable character and changes its name far too often. This can be solved by using an AIML chatbot to give it a stable character!
Live Demo
Example:
|
[] |
[
"TAGS\n#transformers #pytorch #gpt_neo #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
48
] |
[
"passage: TAGS\n#transformers #pytorch #gpt_neo #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.012249067425727844,
0.049002762883901596,
-0.005731964483857155,
0.006306023336946964,
0.13584865629673004,
0.022059865295886993,
0.11178325116634369,
0.13175678253173828,
-0.003722151042893529,
-0.05755901336669922,
0.14100919663906097,
0.22235743701457977,
-0.009073399938642979,
0.08186548203229904,
-0.05247195065021515,
-0.2769356369972229,
0.0657941997051239,
0.06847848743200302,
0.051143307238817215,
0.11515828222036362,
0.10635673254728317,
-0.04623867943882942,
0.06820302456617355,
0.005524468142539263,
-0.15246263146400452,
0.004866791423410177,
0.03687358275055885,
-0.1184389516711235,
0.12296588718891144,
0.06455177813768387,
0.06430112570524216,
0.046249374747276306,
-0.06963103264570236,
-0.1459026038646698,
0.030186865478754044,
-0.014319771900773048,
-0.08413688093423843,
0.024707432836294174,
0.04886164143681526,
-0.042409539222717285,
0.15102766454219818,
0.11013579368591309,
-0.0034574109595268965,
0.08136606961488724,
-0.1545637995004654,
-0.13877619802951813,
-0.0682561844587326,
0.057151757180690765,
0.049767110496759415,
0.08906492590904236,
-0.013939391821622849,
0.15605908632278442,
-0.10613973438739777,
0.0777021273970604,
0.07197828590869904,
-0.3682616353034973,
-0.003279378404840827,
0.10807133466005325,
0.06058641895651817,
0.04232795909047127,
-0.03127194195985794,
0.07363533228635788,
0.030371520668268204,
0.008581161499023438,
-0.036916058510541916,
-0.07150588184595108,
-0.06328806281089783,
0.05418230965733528,
-0.08186172693967819,
-0.050417497754096985,
0.23929503560066223,
-0.04802493751049042,
0.0468418151140213,
-0.05378488078713417,
-0.05435328185558319,
-0.005393299274146557,
-0.03298303484916687,
-0.010436191223561764,
-0.050028786063194275,
0.1071195974946022,
0.016713786870241165,
-0.08891807496547699,
-0.1418488323688507,
-0.004855267237871885,
-0.21702179312705994,
0.1439635455608368,
0.04552127793431282,
0.05558148771524429,
-0.15206176042556763,
0.07387471199035645,
-0.019568951800465584,
-0.06684524565935135,
-0.02285359613597393,
-0.09960941225290298,
0.08734620362520218,
-0.001750814262777567,
-0.053117554634809494,
0.03196975216269493,
0.08617507666349411,
0.15013402700424194,
0.0338047593832016,
-0.009683072566986084,
-0.028798721730709076,
0.10959910601377487,
-0.007134866435080767,
0.05529975891113281,
0.042314913123846054,
-0.023841220885515213,
0.057796869426965714,
-0.16466124355793,
0.026920216158032417,
-0.04131140932440758,
-0.18353518843650818,
-0.01756293699145317,
-0.02104547806084156,
0.08831218630075455,
0.03485485166311264,
0.09915734827518463,
-0.03607060760259628,
0.007540606427937746,
0.09088407456874847,
-0.03151343762874603,
-0.008154054172337055,
-0.02582979016005993,
0.02106625959277153,
0.08762531727552414,
-0.0021767897997051477,
0.0202944353222847,
-0.07671457529067993,
0.06001148372888565,
-0.06684006005525589,
-0.0293559692800045,
-0.03604314103722572,
-0.023138336837291718,
0.048139333724975586,
-0.06490501016378403,
0.02622133493423462,
-0.1598779261112213,
-0.17846137285232544,
0.01918182335793972,
0.008763393387198448,
-0.016509894281625748,
-0.0903969332575798,
-0.015494401566684246,
-0.002804709831252694,
0.01924826391041279,
-0.08829634636640549,
-0.054391562938690186,
-0.0831519365310669,
0.11040516942739487,
-0.030920084565877914,
0.04485701024532318,
-0.1752510815858841,
0.07739592343568802,
-0.11826501786708832,
-0.04288419708609581,
-0.04130818322300911,
0.028717974200844765,
-0.03502530977129936,
0.12287655472755432,
0.006918778643012047,
-0.02845912054181099,
-0.06463278830051422,
0.0643080398440361,
-0.08180462568998337,
0.16125646233558655,
-0.0757519081234932,
-0.10456221550703049,
0.2820027768611908,
-0.10100947320461273,
-0.17428332567214966,
0.11013005673885345,
0.010218650102615356,
0.060077134519815445,
0.13104362785816193,
0.19609656929969788,
0.02654334530234337,
-0.0496942438185215,
0.07730130106210709,
0.09347636997699738,
-0.12911424040794373,
-0.12749820947647095,
0.032853078097105026,
-0.0455610491335392,
-0.10023678094148636,
0.054105862975120544,
0.016883675009012222,
0.09744250029325485,
-0.03362274542450905,
-0.046690359711647034,
-0.025695810094475746,
-0.007613616995513439,
0.06247224286198616,
0.017612986266613007,
0.08270915597677231,
-0.10034307837486267,
-0.04168548062443733,
-0.0064352164044976234,
0.0010134915355592966,
0.04196496307849884,
0.03204869478940964,
-0.0724017545580864,
0.10423140972852707,
0.04631216078996658,
0.04651815444231033,
-0.11254076659679413,
-0.07922010123729706,
-0.025963453575968742,
0.10278407484292984,
0.07031848281621933,
0.15062281489372253,
0.0275556743144989,
-0.04213658347725868,
-0.021060924977064133,
0.027943281456828117,
0.14726009964942932,
0.011435896158218384,
-0.013496119529008865,
-0.06726338714361191,
0.09579480439424515,
-0.0387408509850502,
0.02981865033507347,
-0.03631645068526268,
0.004588298965245485,
0.07871276885271072,
0.08524885773658752,
-0.04622102156281471,
0.06011127680540085,
-0.0465075708925724,
0.02398812584578991,
-0.055141471326351166,
0.014183321967720985,
0.120051808655262,
0.0386679545044899,
-0.07549531757831573,
0.24721704423427582,
-0.15730978548526764,
0.23648713529109955,
0.22452400624752045,
-0.22155407071113586,
-0.0005283859791234136,
-0.0736691802740097,
-0.03791789337992668,
0.004907302092760801,
0.06284107267856598,
-0.004582809284329414,
0.12411756813526154,
-0.020207174122333527,
0.1652490794658661,
-0.05654369294643402,
-0.03941424563527107,
-0.02533758617937565,
-0.04954761266708374,
-0.02767106145620346,
0.0550057478249073,
0.14561231434345245,
-0.19744232296943665,
0.21940301358699799,
0.19909405708312988,
0.04743945971131325,
0.21530066430568695,
-0.011006261222064495,
0.024377303197979927,
0.05220266431570053,
-0.011722017079591751,
-0.03366449847817421,
-0.012490116991102695,
-0.21307604014873505,
-0.01746106892824173,
0.08554521948099136,
0.013327722437679768,
0.0958114042878151,
-0.14412856101989746,
-0.08456682413816452,
-0.014574657194316387,
-0.027012448757886887,
0.01742803119122982,
0.1194600984454155,
0.020603887736797333,
0.10613621771335602,
-0.016075871884822845,
-0.035505738109350204,
0.1251087784767151,
0.030337627977132797,
-0.07154674828052521,
0.1453801989555359,
-0.15834029018878937,
-0.303773432970047,
-0.12894898653030396,
-0.17768557369709015,
-0.03799306973814964,
0.03563178330659866,
0.14527589082717896,
-0.07328806072473526,
-0.030751429498195648,
0.03647215664386749,
0.013870466500520706,
-0.08390077948570251,
-0.014456242322921753,
-0.08555371314287186,
0.03002605028450489,
-0.10924658179283142,
-0.10555863380432129,
-0.07831002026796341,
-0.021988309919834137,
-0.0961788073182106,
0.1151948794722557,
-0.09817806631326675,
0.0646129697561264,
0.15044978260993958,
0.028233826160430908,
0.05805652588605881,
-0.07198275625705719,
0.17294661700725555,
-0.09173253923654556,
-0.01097830943763256,
0.20732638239860535,
0.02012474462389946,
0.07378273457288742,
0.15158842504024506,
0.029408587142825127,
-0.07168286293745041,
0.015590324997901917,
-0.052278127521276474,
-0.08632690459489822,
-0.2168901562690735,
-0.1558571457862854,
-0.13747188448905945,
0.0690465047955513,
0.0030086152255535126,
0.0723981186747551,
0.19773901998996735,
0.06861533224582672,
-0.046401601284742355,
-0.031108777970075607,
0.021583007648587227,
0.10063444077968597,
0.3410639762878418,
-0.05327043682336807,
0.14195378124713898,
-0.07357075810432434,
-0.11199769377708435,
0.1226927787065506,
0.056482817977666855,
0.11938674747943878,
0.1123334988951683,
0.08026449382305145,
0.06672121584415436,
0.12795454263687134,
0.13798023760318756,
0.043327946215867996,
0.046439047902822495,
-0.034112315624952316,
-0.036420855671167374,
-0.027160797268152237,
-0.02566816285252571,
0.05508521571755409,
0.06615292280912399,
-0.18479539453983307,
-0.002824634313583374,
-0.14954999089241028,
0.06561906635761261,
0.053764041513204575,
0.0202480461448431,
-0.18362785875797272,
0.005204673390835524,
0.06528598815202713,
-0.005602819379419088,
-0.09366512298583984,
0.08746214956045151,
-0.05425247177481651,
-0.14547725021839142,
0.09669147431850433,
-0.023126065731048584,
0.10454806685447693,
-0.0008005491108633578,
0.07207867503166199,
-0.04206039384007454,
-0.0978512093424797,
0.038896892219781876,
0.1325279176235199,
-0.31494826078414917,
0.23261544108390808,
-0.01393983606249094,
-0.024500148370862007,
-0.08572899550199509,
-0.011335136368870735,
0.018626568838953972,
0.15950773656368256,
0.10951443016529083,
-0.003407503478229046,
-0.06336256861686707,
-0.0294339619576931,
0.014848529361188412,
0.029322730377316475,
0.08637427538633347,
-0.005204329267144203,
-0.0519639253616333,
-0.060435328632593155,
0.00353814335539937,
-0.015896853059530258,
-0.011147964745759964,
-0.000527078693266958,
-0.19493171572685242,
0.08259357511997223,
0.07040222734212875,
0.06522004306316376,
-0.0023138520773500204,
-0.018989993259310722,
-0.11687406897544861,
0.2190253883600235,
-0.07438122481107712,
-0.09118514508008957,
-0.10614849627017975,
-0.10791749507188797,
0.014160062186419964,
-0.06386269629001617,
0.06167882680892944,
-0.09340805560350418,
-0.004537708126008511,
-0.08441250771284103,
-0.18372799456119537,
0.11925160139799118,
-0.09707064181566238,
-0.02996150776743889,
-0.03649343550205231,
0.18195928633213043,
-0.07884290814399719,
0.015585575252771378,
0.044431980699300766,
0.021854732185602188,
-0.12260624021291733,
-0.11570632457733154,
-0.001567576196976006,
-0.02357831597328186,
0.05740809068083763,
-0.0386137031018734,
-0.07989652454853058,
-0.02859911508858204,
-0.018755758181214333,
-0.08220020681619644,
0.2862620949745178,
0.19378797709941864,
-0.059891022741794586,
0.1996472030878067,
0.14418531954288483,
-0.0836072638630867,
-0.31367456912994385,
-0.12825967371463776,
-0.15447452664375305,
-0.07887224853038788,
-0.03856498748064041,
-0.1633101850748062,
0.0669545978307724,
0.042361143976449966,
-0.05480370298027992,
0.12089236080646515,
-0.20620796084403992,
-0.09718060493469238,
0.17236702144145966,
-0.030651569366455078,
0.37178996205329895,
-0.1336265206336975,
-0.10561801493167877,
-0.03579758480191231,
-0.15537145733833313,
0.17460471391677856,
0.03383317217230797,
0.09980908036231995,
-0.018191907554864883,
0.11429305374622345,
0.018375881016254425,
-0.03433363884687424,
0.09259071946144104,
-0.04374115914106369,
-0.01802315004169941,
-0.12656237185001373,
-0.03467437997460365,
0.05829605832695961,
0.03474951162934303,
0.024503521621227264,
-0.06423432379961014,
0.00486725103110075,
-0.08315606415271759,
-0.05337224528193474,
-0.10578950494527817,
0.07607460021972656,
0.021353967487812042,
-0.08471216261386871,
-0.046389319002628326,
-0.01375446654856205,
-0.027837689965963364,
-0.016736391931772232,
0.19044184684753418,
-0.05678305774927139,
0.18762017786502838,
0.03515775501728058,
0.0874270647764206,
-0.1925969272851944,
-0.01793329231441021,
-0.09486759454011917,
-0.07706146687269211,
0.058870937675237656,
-0.053388141095638275,
0.007050744723528624,
0.11362024396657944,
-0.04568180814385414,
0.07655699551105499,
0.09220363199710846,
-0.009664068929851055,
-0.009222960099577904,
0.13299599289894104,
-0.21688342094421387,
-0.07797618210315704,
-0.05882538855075836,
0.04841262102127075,
0.14055059850215912,
0.061913829296827316,
0.13002751767635345,
0.0007747893105261028,
-0.044269874691963196,
0.010284749791026115,
0.007751959841698408,
-0.05043522268533707,
0.004296678118407726,
0.02365328185260296,
-0.0034212120808660984,
-0.15053234994411469,
0.04839516431093216,
0.020622797310352325,
-0.15532852709293365,
-0.007347651291638613,
0.12926343083381653,
-0.12544861435890198,
-0.13151559233665466,
-0.0568719245493412,
0.05392124503850937,
-0.18578730523586273,
-0.05175306275486946,
-0.01340430323034525,
-0.14527878165245056,
0.07227054238319397,
0.1335563063621521,
0.07020816206932068,
0.09346722811460495,
-0.03473092243075371,
-0.017580600455403328,
-0.0010850598337128758,
-0.01277335174381733,
-0.0702723041176796,
0.020632831379771233,
-0.06670907139778137,
0.0515761636197567,
-0.0010570454178377986,
0.10607122629880905,
-0.07681027054786682,
-0.07187917083501816,
-0.13882730901241302,
0.03476519510149956,
-0.05300183966755867,
-0.08174020797014236,
-0.1259097307920456,
-0.045731544494628906,
0.018159471452236176,
-0.049423057585954666,
-0.05027712136507034,
-0.038844939321279526,
-0.12640899419784546,
0.021460644900798798,
-0.0027027209289371967,
0.056953154504299164,
-0.09590408205986023,
0.0012444353196769953,
0.09691134840250015,
0.011175412684679031,
0.12088154256343842,
0.10418660938739777,
-0.07730735093355179,
0.10553126782178879,
-0.14351359009742737,
-0.07625448703765869,
0.10025633126497269,
0.03493395447731018,
0.040884461253881454,
0.07780515402555466,
0.018934905529022217,
0.11146469414234161,
0.015662286430597305,
0.0693574920296669,
-0.00056478037731722,
-0.13772419095039368,
0.025176076218485832,
-0.0014767285902053118,
-0.13831113278865814,
-0.01900855451822281,
-0.06012744456529617,
0.0757073163986206,
0.0038057672791182995,
0.14088225364685059,
-0.05497356504201889,
0.0532233864068985,
-0.03993082791566849,
0.023217875510454178,
-0.008580229245126247,
-0.1808282732963562,
-0.04805506765842438,
-0.08369839936494827,
-0.008409316651523113,
0.02841249480843544,
0.2975298762321472,
0.030329830944538116,
-0.05788365751504898,
0.06853384524583817,
0.10396338254213333,
0.03038226254284382,
-0.019904915243387222,
0.23988476395606995,
0.09490204602479935,
-0.032089076936244965,
-0.10155369341373444,
0.07538599520921707,
-0.02252069115638733,
-0.09963911026716232,
0.12199230492115021,
0.04227960854768753,
0.021249504759907722,
0.04568271338939667,
0.03805205971002579,
0.011955090798437595,
-0.10703131556510925,
-0.16083677113056183,
0.01052097138017416,
0.06161532551050186,
-0.04190748929977417,
0.13787445425987244,
0.1595102995634079,
-0.04470818489789963,
0.04552879557013512,
-0.032386310398578644,
-0.026421358808875084,
-0.17320170998573303,
-0.16995979845523834,
-0.0688595324754715,
-0.1556079089641571,
0.03142613545060158,
-0.06998912990093231,
0.05865192413330078,
0.04794394597411156,
0.06449270993471146,
-0.09364529699087143,
0.04531434550881386,
-0.02366747334599495,
-0.1023751050233841,
0.026530368253588676,
-0.03943566977977753,
0.05637279897928238,
-0.06441641598939896,
-0.022143740206956863,
-0.121025949716568,
-0.02414691634476185,
-0.012842953205108643,
0.05409131571650505,
-0.04133643954992294,
0.019086189568042755,
-0.14171773195266724,
-0.05173737555742264,
-0.04780691862106323,
0.06556716561317444,
-0.00593840004876256,
0.1432768553495407,
-0.005484726745635271,
0.006947752553969622,
0.0687648132443428,
0.19053958356380463,
-0.03777540102601051,
-0.10561830550432205,
-0.006132352165877819,
0.16916921734809875,
0.05998756363987923,
0.09166300296783447,
0.0026950945612043142,
0.03157255798578262,
-0.05743153393268585,
0.33528459072113037,
0.32739728689193726,
-0.03973012417554855,
0.021618841215968132,
0.008777703158557415,
0.04168492555618286,
0.10792719572782516,
0.14862732589244843,
0.09119109809398651,
0.30000826716423035,
-0.06803705543279648,
-0.030932903289794922,
-0.04280340299010277,
0.009867323562502861,
-0.12347240000963211,
0.060818806290626526,
0.026928909122943878,
-0.08003076165914536,
-0.012291047722101212,
0.11529748141765594,
-0.20037633180618286,
0.17301198840141296,
-0.07062963396310806,
-0.16492827236652374,
-0.0486336313188076,
0.008376134559512138,
0.15650779008865356,
-0.00016821749159134924,
0.0688432902097702,
-0.0032920464873313904,
-0.08078102022409439,
0.0539923831820488,
0.025309838354587555,
-0.2656105160713196,
-0.04429003968834877,
0.10075674951076508,
0.004843710921704769,
0.054684117436409,
-0.02429237775504589,
0.056473713368177414,
0.06928127259016037,
0.06274232268333435,
-0.03154357895255089,
0.0944388285279274,
0.026516802608966827,
-0.08910882472991943,
0.018166767433285713,
-0.05460406094789505,
0.017261607572436333,
-0.09248588979244232,
0.05807221680879593,
-0.11333122849464417,
0.07397489994764328,
0.004136587958782911,
-0.05735277757048607,
-0.028945300728082657,
0.0800483226776123,
-0.09963076561689377,
0.07779213041067123,
0.034079134464263916,
-0.004071264993399382,
-0.039859309792518616,
-0.06616104394197464,
-0.017025412991642952,
0.03833020478487015,
-0.08716744929552078,
-0.074739009141922,
-0.07642904669046402,
-0.09621842950582504,
0.08742217719554901,
0.010199184529483318,
-0.15595589578151703,
-0.012035669758915901,
-0.09999629855155945,
0.05975433811545372,
-0.17523962259292603,
0.06509187817573547,
0.08609765022993088,
0.0008505068253725767,
0.007804687134921551,
-0.06760196387767792,
0.03639288991689682,
0.0297650508582592,
-0.1162659227848053,
-0.07191837579011917
] |
null | null |
transformers
|
# THIS AI IS OUTDATED. See [Aeona](https://huggingface.co/deepparag/Aeona)
A generative AI made using [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small).
Trained on:
https://www.kaggle.com/Cornell-University/movie-dialog-corpus
https://www.kaggle.com/jef1056/discord-data
[Live Demo](https://dumbot-331213.uc.r.appspot.com/)
Example:
```python
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("deepparag/DumBot")
model = AutoModelWithLMHead.from_pretrained("deepparag/DumBot")

# Let's chat for 4 lines
for step in range(4):
    # encode the new user input, add the eos_token and return a PyTorch tensor
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 200 tokens
    chat_history_ids = model.generate(
        bot_input_ids, max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=4,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8
    )

    # pretty print the last output tokens from the bot
    print("DumBot: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
|
{"license": "mit", "tags": ["conversational"], "thumbnail": "https://cdn.discordapp.com/app-icons/870239976690970625/c02cae78ae105f07969cfd8f8ea3d0a0.png"}
|
text-generation
|
deepparag/DumBot
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# THIS AI IS OUTDATED. See Aeona
A generative AI made using microsoft/DialoGPT-small.
Trained on:
URL
URL
Live Demo
Example:
|
[
"# THIS AI IS OUTDATED. See Aeona\nAn generative AI made using microsoft/DialoGPT-small.\n\nTrained on:\n\n URL\n\n URL\n\n\n \nLive Demo\n \nExample:"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# THIS AI IS OUTDATED. See Aeona\nAn generative AI made using microsoft/DialoGPT-small.\n\nTrained on:\n\n URL\n\n URL\n\n\n \nLive Demo\n \nExample:"
] |
[
56,
42
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# THIS AI IS OUTDATED. See Aeona\nAn generative AI made using microsoft/DialoGPT-small.\n\nTrained on:\n\n URL\n\n URL\n\n\n \nLive Demo\n \nExample:"
] |
[
-0.015592156909406185,
0.0358826145529747,
-0.0024390239268541336,
0.018374916166067123,
0.15722103416919708,
0.014799325726926327,
0.22847865521907806,
0.11785368621349335,
0.06237064674496651,
0.0027541096787899733,
0.14302510023117065,
0.1539483368396759,
0.07268708199262619,
0.1455671489238739,
-0.030491938814520836,
-0.19723285734653473,
0.07931632548570633,
0.054024260491132736,
0.11271105706691742,
0.09322741627693176,
0.08511294424533844,
-0.0339931845664978,
0.08306469023227692,
0.03352933004498482,
-0.09875987470149994,
-0.029751939699053764,
0.002345107262954116,
-0.12251845747232437,
0.13998949527740479,
0.09291097521781921,
0.04441061615943909,
-0.012210092507302761,
0.003375174943357706,
-0.07965296506881714,
0.027726173400878906,
-0.03342229872941971,
-0.004335146397352219,
0.06923183798789978,
-0.02466190792620182,
0.007252922281622887,
0.22981670498847961,
0.2072608917951584,
-0.04122605547308922,
0.04782455414533615,
-0.10509198158979416,
-0.05496647208929062,
-0.030025389045476913,
-0.03837982937693596,
0.059332482516765594,
0.15699107944965363,
-0.016681315377354622,
0.18136066198349,
0.01269763708114624,
0.03769588842988014,
0.0705718919634819,
-0.3179049789905548,
-0.00371698010712862,
0.04909703880548477,
0.04299827665090561,
-0.04960688576102257,
-0.01727975532412529,
0.10145565867424011,
0.05878734216094017,
0.03802080824971199,
-0.0011421056697145104,
0.00917134527117014,
-0.08738327771425247,
-0.006318498402833939,
-0.06691313534975052,
-0.045022133737802505,
0.2725536525249481,
-0.034682415425777435,
0.009677251800894737,
-0.07480252534151077,
-0.05714726820588112,
-0.08301715552806854,
-0.015701936557888985,
-0.057024192065000534,
-0.0759853720664978,
0.07671736925840378,
0.06554338335990906,
-0.10072983056306839,
-0.11657197028398514,
-0.040239959955215454,
-0.1871674656867981,
0.2701423764228821,
0.04276144877076149,
0.040006399154663086,
-0.16033649444580078,
0.08117199689149857,
0.0822918638586998,
-0.04610210657119751,
0.0028481516055762768,
-0.06488148868083954,
0.07389906793832779,
0.05818432569503784,
-0.06496129930019379,
-0.01668907143175602,
0.08740503340959549,
0.09844507277011871,
0.0428137481212616,
-0.0458112470805645,
0.014739317819476128,
0.08153890073299408,
0.016986114904284477,
0.07036063820123672,
-0.034937676042318344,
-0.012723423540592194,
0.07967617362737656,
-0.017904486507177353,
0.07216694951057434,
-0.08219770342111588,
-0.17215581238269806,
0.053568217903375626,
0.013714725151658058,
0.04757054150104523,
-0.0037088459357619286,
0.0888824462890625,
-0.11710367351770401,
-0.02003777027130127,
0.016782864928245544,
-0.02849024347960949,
-0.02306610718369484,
-0.0002545099996495992,
-0.039710745215415955,
0.07950824499130249,
0.035611286759376526,
0.03055962175130844,
-0.08767133951187134,
-0.06096520274877548,
-0.056992307305336,
-0.03575160354375839,
-0.04094612970948219,
-0.0243153627961874,
0.005994027946144342,
-0.07605726271867752,
0.03879043459892273,
-0.20724274218082428,
-0.12865617871284485,
0.042884886264801025,
-0.005098741967231035,
-0.07766852527856827,
-0.07078768312931061,
-0.0843735784292221,
-0.00029002100927755237,
-0.007572411093860865,
-0.07673536241054535,
-0.06008433923125267,
-0.05530068278312683,
0.09107819944620132,
-0.00812339223921299,
0.0926496833562851,
-0.2174810767173767,
0.0757998675107956,
-0.07727538794279099,
-0.06956838816404343,
0.07447630167007446,
0.044669367372989655,
0.02658727392554283,
0.03209466114640236,
-0.04421955347061157,
0.028616642579436302,
-0.10694081336259842,
0.06909137219190598,
0.040608394891023636,
0.16913966834545135,
-0.08747141808271408,
-0.05741187557578087,
0.059913795441389084,
-0.07738742977380753,
-0.15794873237609863,
0.15448136627674103,
0.012728636153042316,
0.12670069932937622,
0.11455538868904114,
0.15281832218170166,
-0.17110346257686615,
-0.011494085192680359,
-0.02579224854707718,
0.00009057992429006845,
-0.0818847268819809,
-0.04779497906565666,
0.02665671892464161,
0.05093751102685928,
-0.07915323972702026,
0.036843396723270416,
0.04092884436249733,
0.09893633425235748,
-0.05374090373516083,
-0.06250366568565369,
0.01245739497244358,
-0.07543749362230301,
0.027320781722664833,
0.0005776231409981847,
0.10344948619604111,
0.011682250536978245,
-0.07438737899065018,
-0.06821537017822266,
0.04424755275249481,
-0.02059224806725979,
0.005997087340801954,
-0.17156973481178284,
0.14098457992076874,
0.011925943195819855,
0.07091156393289566,
-0.1234302818775177,
-0.0408765934407711,
0.021333852782845497,
0.14482928812503815,
0.09668996930122375,
0.09692488610744476,
0.005398618057370186,
-0.0497346967458725,
0.03732917457818985,
0.033708348870277405,
0.08289333432912827,
-0.015469619072973728,
-0.09888748079538345,
-0.08150395005941391,
0.04297243431210518,
-0.02999795414507389,
0.04140501096844673,
-0.06701581925153732,
-0.003587462706491351,
-0.012056897394359112,
0.07333926111459732,
-0.050127070397138596,
0.02021150104701519,
0.042120128870010376,
0.005221907515078783,
-0.13900715112686157,
0.024132251739501953,
0.08959457278251648,
0.009276316501200199,
-0.1625756323337555,
0.1882653683423996,
-0.14032065868377686,
0.1340407133102417,
0.17999695241451263,
-0.09162641316652298,
-0.010844708420336246,
0.04216122627258301,
-0.015733571723103523,
0.014946100302040577,
0.0003609558043535799,
-0.03214559331536293,
0.30209171772003174,
-0.06620979309082031,
0.15812279284000397,
-0.06302168220281601,
-0.0052358973771333694,
-0.0016533900052309036,
-0.023467861115932465,
-0.019509490579366684,
0.022299034520983696,
0.10897380858659744,
-0.13263081014156342,
0.1209353655576706,
0.0858340710401535,
-0.01278204657137394,
0.17944321036338806,
0.0717213973402977,
-0.012785333208739758,
0.03881753608584404,
0.016517488285899162,
-0.00801134668290615,
0.04190744087100029,
-0.22682268917560577,
-0.007389573380351067,
0.057816244661808014,
-0.01116924174129963,
0.07078037410974503,
-0.11807221174240112,
-0.055128127336502075,
-0.028577888384461403,
-0.050486430525779724,
0.09895633161067963,
0.06764255464076996,
-0.027530191466212273,
0.16156764328479767,
-0.00955766811966896,
-0.12037357687950134,
0.05734953284263611,
0.04497518390417099,
-0.05286985635757446,
0.14537259936332703,
-0.05321352928876877,
-0.3399096727371216,
-0.0956154465675354,
-0.056758519262075424,
-0.05108650028705597,
0.06937260925769806,
0.08827359229326248,
-0.058202099055051804,
-0.03557531163096428,
-0.05958378687500954,
0.054241422563791275,
0.03260146081447601,
0.009880644269287586,
0.09770365059375763,
-0.018629034981131554,
-0.03152253478765488,
-0.09109950810670853,
-0.047481194138526917,
-0.09144754707813263,
-0.06613399088382721,
0.09631725400686264,
-0.06742610782384872,
0.06919991225004196,
0.12051104754209518,
0.010225923731923103,
0.03493665158748627,
-0.026897739619016647,
0.1577792763710022,
-0.10648317635059357,
-0.008792893961071968,
0.23603197932243347,
-0.0016166535206139088,
-0.014079919084906578,
0.05216965079307556,
-0.004298910032957792,
-0.14898115396499634,
0.02328776754438877,
-0.036436423659324646,
-0.1125774011015892,
-0.18528206646442413,
-0.1394547075033188,
-0.06368991732597351,
0.04739682003855705,
0.021465027704834938,
0.08978806436061859,
0.04619394987821579,
0.08735711872577667,
-0.0638352781534195,
0.05304788425564766,
-0.037831854075193405,
0.04536086320877075,
0.12118345499038696,
-0.04156438633799553,
0.1520613431930542,
-0.03861226141452789,
-0.10658111423254013,
0.15853992104530334,
0.10717974603176117,
0.10244258493185043,
0.059127602726221085,
0.01879824511706829,
0.04244484007358551,
0.014913740567862988,
0.07519558817148209,
0.03369433805346489,
0.028595823794603348,
-0.06042760983109474,
-0.0682736411690712,
-0.010842264629900455,
-0.08150690048933029,
0.12065821886062622,
-0.020218366757035255,
-0.12066756933927536,
0.001022503711283207,
0.014585715718567371,
0.07210744172334671,
0.092921182513237,
0.04828304052352905,
-0.3460814952850342,
-0.062472403049468994,
0.03916189819574356,
-0.05579961836338043,
-0.09322261810302734,
0.101308174431324,
0.030436482280492783,
-0.10923583060503006,
0.06406921148300171,
-0.01989545114338398,
0.07321617752313614,
-0.14341183006763458,
0.07059301435947418,
-0.07101187855005264,
-0.072959303855896,
0.047332752496004105,
0.06947227567434311,
-0.24152235686779022,
0.1923818737268448,
-0.036558669060468674,
0.03949643298983574,
-0.09312287718057632,
0.004248786251991987,
0.06534093618392944,
0.03529168665409088,
0.12988106906414032,
-0.04682658240199089,
0.08866521716117859,
-0.05650607869029045,
-0.08795661479234695,
0.07851824909448624,
0.023024607449769974,
-0.03132404014468193,
-0.0023227182682603598,
-0.021698517724871635,
0.029295673593878746,
-0.031512465327978134,
-0.09110629558563232,
-0.0017322874628007412,
-0.12878549098968506,
0.06979008764028549,
0.2333316057920456,
0.13672195374965668,
0.03343196213245392,
-0.05338117480278015,
-0.07982124388217926,
0.24527643620967865,
-0.0471513569355011,
-0.08169720321893692,
-0.07270507514476776,
0.02998758852481842,
-0.025926506146788597,
-0.02072814479470253,
0.019730180501937866,
-0.049208153039216995,
0.0015374895883724093,
-0.0488821305334568,
-0.13795821368694305,
0.10269739478826523,
-0.09708400815725327,
-0.09364193677902222,
-0.034456148743629456,
0.04772593453526497,
-0.005046738311648369,
0.018858039751648903,
0.05106908082962036,
-0.013765143230557442,
-0.1244339570403099,
-0.08642153441905975,
0.021987859159708023,
0.11711271107196808,
-0.02510513737797737,
0.0042272708378732204,
-0.09985410422086716,
-0.13880567252635956,
-0.07664139568805695,
-0.05524328723549843,
0.23563207685947418,
0.1651589274406433,
-0.061094410717487335,
0.10157853364944458,
0.15516464412212372,
-0.08662781119346619,
-0.3148633539676666,
-0.18719789385795593,
-0.07920140773057938,
0.0036945310421288013,
-0.010749156586825848,
-0.1810431033372879,
0.039165060967206955,
0.04944536089897156,
-0.05465097352862358,
0.06719446927309036,
-0.29734358191490173,
-0.08801517635583878,
0.13803887367248535,
0.03936971351504326,
0.34766048192977905,
-0.13607102632522583,
-0.04090133681893349,
-0.029855525121092796,
0.023437418043613434,
0.1045953780412674,
0.09506750851869583,
0.16243144869804382,
-0.025643453001976013,
0.09004423767328262,
0.018461771309375763,
0.007604877930134535,
0.07908937335014343,
-0.05180087313055992,
-0.020181337371468544,
-0.07544214278459549,
-0.03942263871431351,
0.046774063259363174,
0.028346870094537735,
0.1262749433517456,
-0.1151973307132721,
-0.01858474873006344,
-0.10402001440525055,
-0.05968927964568138,
-0.045620329678058624,
0.05591671168804169,
0.07930197566747665,
-0.051044952124357224,
-0.06507338583469391,
-0.01459942851215601,
-0.03573282063007355,
0.07055383175611496,
0.16425564885139465,
-0.046822261065244675,
0.0922340676188469,
0.047095224261283875,
0.17264625430107117,
-0.13879160583019257,
0.06688263267278671,
-0.03382718935608864,
-0.052404750138521194,
0.13245689868927002,
-0.09906909614801407,
-0.004669057205319405,
0.10153276473283768,
-0.04963501915335655,
0.037465061992406845,
0.062049441039562225,
-0.05464242398738861,
0.12417976558208466,
0.12355569005012512,
-0.19144387543201447,
-0.1371268630027771,
-0.035406190901994705,
0.16065280139446259,
0.12440130859613419,
0.17081841826438904,
0.17728666961193085,
-0.05850568041205406,
-0.04388204216957092,
-0.005021399352699518,
0.033447835594415665,
-0.026570839807391167,
0.016750099137425423,
-0.11231634020805359,
0.0025096856988966465,
-0.13214635848999023,
0.10254786163568497,
0.038829073309898376,
-0.08899421989917755,
0.12371093779802322,
0.0654916986823082,
-0.10515111684799194,
-0.11831150949001312,
-0.16311101615428925,
0.014267608523368835,
-0.1490621566772461,
-0.0678548514842987,
-0.09844119846820831,
-0.10250837355852127,
0.042304955422878265,
0.03542815148830414,
0.07233017683029175,
0.05987904220819473,
-0.07400836795568466,
-0.04208594188094139,
-0.031556062400341034,
-0.02304498665034771,
-0.03321671113371849,
-0.0056854733265936375,
-0.09583654254674911,
-0.08299020677804947,
0.049734413623809814,
0.04739827662706375,
-0.09627880156040192,
-0.09421561658382416,
-0.08061303198337555,
0.03776675835251808,
-0.08098096400499344,
-0.0714372992515564,
-0.08669782429933548,
-0.07222671061754227,
0.0044811926782131195,
-0.06684806197881699,
-0.07011133432388306,
0.010764464735984802,
-0.07335787266492844,
0.0197964645922184,
-0.01799631677567959,
0.04147297143936157,
-0.03917441517114639,
0.03297605365514755,
-0.0015117611037567258,
-0.015964802354574203,
0.1900479942560196,
0.07282545417547226,
-0.08646422624588013,
-0.0023377276957035065,
-0.11604662239551544,
0.008373119868338108,
0.13145814836025238,
0.01388090942054987,
0.03397885710000992,
-0.018012119457125664,
0.023813212290406227,
0.10697416216135025,
0.057986412197351456,
0.031123671680688858,
0.15788260102272034,
-0.08420835435390472,
0.024360233917832375,
-0.01860102266073227,
-0.03130629286170006,
-0.03147329017519951,
-0.03850652277469635,
0.07375101745128632,
0.03653176501393318,
0.12602077424526215,
-0.022750452160835266,
0.0006688786670565605,
-0.02136668935418129,
0.082217276096344,
-0.009999333880841732,
-0.1420062780380249,
-0.14324791729450226,
-0.039752282202243805,
0.009537679143249989,
-0.003309535561129451,
0.22524040937423706,
-0.00522816414013505,
-0.0900353193283081,
0.025454139336943626,
0.0888785570859909,
-0.0023974142968654633,
-0.023521622642874718,
0.09924661368131638,
0.06564205139875412,
0.01243874616920948,
-0.09627605229616165,
0.02445799857378006,
0.06561724096536636,
0.0844615176320076,
0.14950031042099,
0.002394819399341941,
0.010236099362373352,
0.14075492322444916,
0.031564485281705856,
-0.006127793341875076,
-0.22433918714523315,
-0.11543920636177063,
-0.1265910118818283,
0.10611765086650848,
-0.02227105386555195,
0.06846620887517929,
0.19570870697498322,
0.0427524708211422,
-0.004196834750473499,
-0.04280005767941475,
-0.034096334129571915,
-0.11876165866851807,
-0.20034201443195343,
-0.0714992880821228,
-0.12375452369451523,
0.017091304063796997,
-0.07697644084692001,
0.007897410541772842,
0.09501324594020844,
0.03301042318344116,
-0.12017036229372025,
0.10740922391414642,
-0.01304300781339407,
-0.06794119626283646,
0.07062496989965439,
-0.02064804919064045,
-0.006026009563356638,
-0.06527643650770187,
-0.07001753896474838,
-0.10576940327882767,
0.058963507413864136,
0.038162607699632645,
0.06987633556127548,
0.006521693430840969,
0.033096540719270706,
-0.08492143452167511,
-0.05222385376691818,
-0.046278905123472214,
0.008945520035922527,
0.0013838056474924088,
0.10093101859092712,
0.021336296573281288,
-0.046471040695905685,
0.04798831790685654,
0.17400959134101868,
-0.031460799276828766,
-0.012179103679955006,
-0.11686093360185623,
0.2692624628543854,
-0.03502760827541351,
0.03921026736497879,
-0.054872214794158936,
0.03161704167723656,
-0.09794384241104126,
0.3736419379711151,
0.2270008772611618,
-0.08312883228063583,
-0.027816081419587135,
-0.03605065867304802,
0.028733553364872932,
0.004218180663883686,
0.14700044691562653,
0.09857915341854095,
0.24167531728744507,
-0.04217023774981499,
-0.07756714522838593,
-0.06218547746539116,
0.03477887809276581,
-0.09700387716293335,
0.06803274154663086,
0.021180346608161926,
-0.02620246633887291,
-0.06659696251153946,
0.0316573828458786,
-0.22930270433425903,
0.04241398349404335,
-0.13436070084571838,
-0.16317935287952423,
-0.0994548425078392,
0.016881518065929413,
0.05927092954516411,
-0.022435475140810013,
0.1081417128443718,
-0.019618051126599312,
-0.028283245861530304,
-0.011879329569637775,
-0.017373837530612946,
-0.17463824152946472,
0.05667770653963089,
0.03563876822590828,
-0.12278872728347778,
0.09424594789743423,
-0.022303199395537376,
0.007728603668510914,
0.05640382692217827,
0.041592635214328766,
-0.07326307147741318,
0.13385999202728271,
-0.013551046140491962,
-0.00020954465435352176,
0.06939724087715149,
0.007725379429757595,
0.03536275774240494,
0.07701173424720764,
0.09636170417070389,
-0.011073442175984383,
0.03980335593223572,
-0.06602806597948074,
-0.046684544533491135,
0.006311787758022547,
0.03246258199214935,
-0.08022012561559677,
0.10395585000514984,
0.035313379019498825,
-0.04600462317466736,
-0.06479527801275253,
-0.014895369298756123,
0.04970020055770874,
-0.04154611751437187,
-0.04866841807961464,
-0.05681897699832916,
-0.2409530133008957,
-0.07905251532793045,
-0.0020226342603564262,
0.02189943566918373,
-0.19515152275562286,
0.012728902511298656,
-0.14019344747066498,
0.03294411674141884,
-0.08233492076396942,
0.02209964022040367,
0.15753622353076935,
-0.0319121815264225,
-0.0054800985381007195,
-0.04119393974542618,
0.05705181136727333,
0.04484649375081062,
-0.0748756155371666,
-0.12717074155807495
] |
null | null |
transformers
|
This is a BERT base cased model fine-tuned on SQuAD v2 for extractive question answering.
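Below is a minimal usage sketch, assuming a recent `transformers` release with the `question-answering` pipeline; the question and context strings are illustrative only:

```python
from transformers import pipeline

# Load the checkpoint in a question-answering pipeline
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

# Hypothetical inputs; a SQuAD v2 model can also predict "no answer"
# when the context does not contain one.
result = qa(
    question="What is the capital of France?",
    context="Paris is the capital and most populous city of France.",
)
print(result["answer"], result["score"])
```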
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"], "model-index": [{"name": "deepset/bert-base-cased-squad2", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 71.1517, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGZlNmQ1YzIzMWUzNTg4YmI4NWVhYThiMzE2ZGZmNWUzNDM3NWI0ZGJkNzliNGUxNTY2MDA5MWVkYjAwYWZiMCIsInZlcnNpb24iOjF9.iUvVdy5c4hoXkwlThJankQqG9QXzNilvfF1_4P0oL8X-jkY5Q6YSsZx6G6cpgXogqFpn7JlE_lP6_OT0VIamCg"}, {"type": "f1", "value": 74.6714, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWE5OGNjODhmY2Y0NWIyZDIzMmQ2NmRjZGYyYTYzOWMxZDUzYzg4YjBhNTRiNTY4NTc0M2IxNjI5NWI5ZDM0NCIsInZlcnNpb24iOjF9.IqU9rbzUcKmDEoLkwCUZTKSH0ZFhtqgnhOaEDKKnaRMGBJLj98D5V4VirYT6jLh8FlR0FiwvMTMjReBcfTisAQ"}]}]}]}
|
question-answering
|
deepset/bert-base-cased-squad2
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us
|
This is a BERT base cased model fine-tuned on SQuAD v2 for extractive question answering.
|
[] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
65
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n"
] |
[
-0.08711686730384827,
0.10855642706155777,
-0.004670148249715567,
0.019531184807419777,
0.03634446859359741,
0.03583719953894615,
0.1052190512418747,
0.09741074591875076,
0.09691087901592255,
0.0037842150777578354,
0.15773411095142365,
0.16274189949035645,
-0.020829342305660248,
0.036590710282325745,
-0.06451952457427979,
-0.1492329090833664,
0.10681743919849396,
0.04698137566447258,
-0.05541244521737099,
0.10581796616315842,
0.10823728144168854,
-0.1353597342967987,
0.06454658508300781,
-0.0023318715393543243,
-0.08999733626842499,
0.001459990511648357,
0.026046931743621826,
-0.06025748327374458,
0.13602933287620544,
0.000902316125575453,
0.1357259452342987,
0.11567200720310211,
0.0103986207395792,
-0.11915203928947449,
0.025593159720301628,
0.016014793887734413,
-0.1018102765083313,
0.06470799446105957,
0.03023640252649784,
0.011705094948410988,
0.04372934624552727,
0.0028630627784878016,
0.003131957957521081,
0.04460286721587181,
-0.13116654753684998,
-0.24980001151561737,
-0.09489139169454575,
0.058147113770246506,
0.057758163660764694,
0.04757721349596977,
0.00199432997033,
0.22980253398418427,
-0.1544552445411682,
0.043898630887269974,
0.12638333439826965,
-0.39568856358528137,
-0.005450936034321785,
0.2130184769630432,
0.15446749329566956,
0.00998770073056221,
-0.05121086537837982,
0.08010748773813248,
0.04614384099841118,
0.0018300729570910335,
0.08623276650905609,
-0.05619833618402481,
-0.07288292795419693,
0.08682522177696228,
-0.10976476222276688,
-0.09068841487169266,
0.29635271430015564,
0.03349332511425018,
0.04210778325796127,
-0.04205779731273651,
-0.0598076656460762,
0.0468413420021534,
0.0032084560953080654,
-0.03214116767048836,
0.01958174630999565,
0.036540888249874115,
-0.017122071236371994,
0.0026279157027602196,
-0.1608506590127945,
-0.04264280945062637,
-0.1618187576532364,
0.05427895113825798,
-0.007135735359042883,
0.10709676146507263,
-0.17338500916957855,
0.07058777660131454,
-0.024210261180996895,
-0.09361469000577927,
-0.0030807105358690023,
-0.09472309052944183,
0.005588933359831572,
-0.015137912705540657,
-0.04574138671159744,
0.08432076871395111,
0.1122579500079155,
0.12790654599666595,
-0.0460701584815979,
-0.05871042236685753,
0.07074107229709625,
0.0942380428314209,
0.034512829035520554,
0.04876136779785156,
-0.09834055602550507,
0.004479727242141962,
0.014836925081908703,
-0.02410544641315937,
0.004996459931135178,
-0.024949248880147934,
-0.08905740827322006,
-0.06867014616727829,
0.0437270849943161,
0.08975548297166824,
0.11059612035751343,
0.045224033296108246,
-0.03134728968143463,
0.02805105783045292,
0.10987567156553268,
-0.007413480430841446,
0.0226502176374197,
0.015203324146568775,
0.05151684582233429,
-0.03162809833884239,
-0.018830597400665283,
0.030440330505371094,
0.02388990856707096,
0.013028069399297237,
-0.10168030112981796,
-0.042134951800107956,
-0.025582205504179,
-0.08644591271877289,
0.0803455039858818,
-0.10017981380224228,
0.056028664112091064,
-0.14733609557151794,
-0.09710710495710373,
0.02584598772227764,
0.03574344143271446,
-0.004634176380932331,
-0.05470466613769531,
0.06890395283699036,
-0.09163405746221542,
-0.00595767330378294,
-0.07116973400115967,
0.009592205286026001,
-0.08145400136709213,
0.06453438103199005,
-0.05209832265973091,
0.05926625803112984,
-0.11411275714635849,
0.016272451728582382,
-0.11063632369041443,
0.023101627826690674,
0.00798952579498291,
-0.05243675783276558,
-0.05298657342791557,
0.09238115698099136,
-0.03927476704120636,
-0.04876834154129028,
-0.016851214691996574,
0.018451103940606117,
-0.024058617651462555,
0.16668973863124847,
-0.160683736205101,
-0.008075506426393986,
0.14908941090106964,
-0.09148717671632767,
-0.2572755813598633,
0.061497025191783905,
0.006461287848651409,
-0.013390828855335712,
0.03680234029889107,
0.15573976933956146,
0.007779376115649939,
-0.14661090075969696,
-0.05987478420138359,
0.12214552611112595,
-0.10645741969347,
-0.15152743458747864,
0.09154524654150009,
0.01544538140296936,
-0.04065856337547302,
0.012844850309193134,
-0.07147863507270813,
0.018499284982681274,
-0.0882272720336914,
-0.07974088191986084,
-0.07121017575263977,
-0.045013427734375,
0.03736238554120064,
0.05142123997211456,
0.046354833990335464,
-0.06629937142133713,
-0.03618572652339935,
-0.011169440113008022,
0.029761767014861107,
0.04680194333195686,
-0.012507464736700058,
-0.09298601746559143,
0.19642309844493866,
-0.13612782955169678,
-0.018620628863573074,
-0.12973564863204956,
-0.1134788990020752,
-0.02344685047864914,
0.07740150392055511,
-0.062252409756183624,
0.23556214570999146,
0.07223587483167648,
-0.06273853033781052,
-0.008122110739350319,
-0.036026481539011,
0.06989243626594543,
0.0824776440858841,
-0.06113813817501068,
-0.14171214401721954,
-0.05552031844854355,
-0.07951188832521439,
-0.01733226142823696,
-0.1017628014087677,
0.006797292269766331,
0.0856928601861,
0.16175565123558044,
-0.028057653456926346,
0.07487820833921432,
0.002101019024848938,
-0.014685741625726223,
-0.028264673426747322,
0.006144736427813768,
0.1001862958073616,
0.012674971483647823,
-0.04098515957593918,
0.18321752548217773,
-0.048921555280685425,
0.3336278796195984,
0.1923334300518036,
-0.16274942457675934,
0.05268095061182976,
0.0070433602668344975,
-0.039202552288770676,
0.049389902502298355,
-0.01953110843896866,
-0.01847820170223713,
-0.06753327697515488,
-0.028258176520466805,
0.08226920664310455,
-0.08027317374944687,
-0.0348719097673893,
0.02383340708911419,
-0.05139277130365372,
-0.07514718919992447,
0.04574381187558174,
0.0955965593457222,
-0.17196057736873627,
0.18395669758319855,
0.33660218119621277,
0.043576501309871674,
0.06876001507043839,
-0.05068028345704079,
-0.019996076822280884,
-0.0517619363963604,
-0.04043826460838318,
-0.06873337179422379,
0.18191398680210114,
-0.17266033589839935,
-0.0029056009370833635,
0.10464689135551453,
0.00012023314775433391,
0.027504069730639458,
-0.1589961051940918,
-0.09035903960466385,
0.017606031149625778,
-0.003071183804422617,
-0.11597605794668198,
0.12796342372894287,
0.042783789336681366,
0.12101737409830093,
-0.04670156538486481,
-0.10323867201805115,
0.07310709357261658,
-0.021759865805506706,
-0.07064354419708252,
0.14820490777492523,
-0.06754432618618011,
-0.17721204459667206,
-0.022284040227532387,
0.005402505397796631,
0.016224397346377373,
-0.0044806464575231075,
0.07748466730117798,
-0.028552044183015823,
-0.02358917146921158,
-0.0250072218477726,
-0.05079798400402069,
-0.09038487821817398,
0.03593602031469345,
-0.07363583147525787,
0.06943593919277191,
-0.016909489408135414,
-0.10790152847766876,
-0.05114580690860748,
-0.029850373044610023,
-0.07482876628637314,
0.1107301339507103,
-0.0063398187048733234,
0.0823858454823494,
0.07689187675714493,
-0.022621573880314827,
0.025887135416269302,
-0.04159603640437126,
0.26162296533584595,
-0.04377741739153862,
0.01325925998389721,
0.20130570232868195,
0.0488676056265831,
0.04435543715953827,
0.21345888078212738,
0.055647168308496475,
-0.04883324354887009,
-0.04432711750268936,
-0.07126801460981369,
-0.06247636675834656,
-0.17764362692832947,
-0.08593636006116867,
-0.11093071848154068,
-0.005219230428338051,
0.01570907235145569,
0.040568649768829346,
0.03344539925456047,
0.10502760857343674,
-0.016026386991143227,
-0.07988684624433517,
-0.07204963266849518,
0.045507367700338364,
0.17548426985740662,
-0.05177626013755798,
0.11905521154403687,
-0.06352677196264267,
-0.06918266415596008,
0.05994270741939545,
0.0828617662191391,
0.10564203560352325,
0.0843888372182846,
-0.07181324809789658,
0.10921747982501984,
0.2575988173484802,
0.12171273678541183,
0.0807550922036171,
0.01936255767941475,
-0.07295581698417664,
-0.013790218159556389,
-0.0381523072719574,
-0.006726639810949564,
0.06510058790445328,
0.11850782483816147,
-0.073091059923172,
-0.049778714776039124,
-0.15383844077587128,
0.0664626881480217,
0.09774339199066162,
0.13881778717041016,
-0.22288592159748077,
0.006844345945864916,
0.08054644614458084,
0.03833330050110817,
-0.03555014356970787,
0.048567429184913635,
0.07786380499601364,
-0.07635822892189026,
0.011191194877028465,
-0.027038902044296265,
0.08731617033481598,
0.07548518478870392,
0.05678150802850723,
-0.021463241428136826,
-0.1230248436331749,
0.030454473569989204,
0.09806977957487106,
-0.19463710486888885,
0.2876279652118683,
0.008820392191410065,
-0.03461967036128044,
-0.06746599823236465,
-0.037536829710006714,
-0.01967770792543888,
0.1947535127401352,
0.1898929625749588,
0.03314397111535072,
-0.08430905640125275,
-0.06623955070972443,
-0.0024949731305241585,
0.06466666609048843,
-0.019250551238656044,
-0.029462207108736038,
0.02464788779616356,
-0.02002445049583912,
0.008925979025661945,
0.0171425212174654,
0.1631302833557129,
-0.05846824496984482,
-0.07792236655950546,
0.018669093027710915,
0.07138597965240479,
0.00298004737123847,
-0.07570993155241013,
-0.04137787967920303,
-0.1308753788471222,
0.08026165515184402,
-0.05123619735240936,
-0.03516637906432152,
-0.08147692680358887,
-0.07912952452898026,
0.11853528767824173,
-0.03704604506492615,
0.02901780791580677,
-0.04584325850009918,
0.008932667784392834,
-0.0606602318584919,
-0.11098960041999817,
0.10040625184774399,
-0.14990781247615814,
-0.04709262400865555,
-0.08029354363679886,
0.13433697819709778,
-0.10258479416370392,
0.03715994209051132,
0.0642763003706932,
0.09186577796936035,
-0.12797829508781433,
-0.08654620498418808,
0.0416271798312664,
-0.038880929350852966,
0.11008081585168839,
-0.004724564030766487,
-0.009990978986024857,
-0.04348552227020264,
0.08018450438976288,
-0.0011052160989493132,
0.16055133938789368,
0.2929772138595581,
-0.137710303068161,
0.0902462974190712,
0.14771459996700287,
-0.002490503014996648,
-0.2909335196018219,
-0.09573238343000412,
-0.1619882881641388,
-0.004349839873611927,
0.07805813103914261,
0.009772646240890026,
0.08250603079795837,
0.004919479601085186,
-0.09413401037454605,
0.018859153613448143,
-0.2256212681531906,
-0.05569332093000412,
0.17037729918956757,
0.018463855609297752,
0.30411583185195923,
-0.13004715740680695,
-0.04903576523065567,
0.018780872225761414,
-0.2250831425189972,
0.14839918911457062,
-0.1191333681344986,
0.033129237592220306,
-0.04755253717303276,
0.004106263164430857,
0.008766750805079937,
-0.08487557619810104,
0.18254853785037994,
-0.035043079406023026,
0.06225055083632469,
-0.07081930339336395,
-0.10869326442480087,
0.1728609949350357,
-0.03278414532542229,
0.03867591544985771,
-0.06725377589464188,
0.06620165705680847,
-0.15693481266498566,
0.014349470846354961,
-0.12192784249782562,
0.08745481073856354,
-0.03677929565310478,
-0.06055797263979912,
-0.04589131101965904,
0.03527297079563141,
0.0034838602878153324,
-0.04287336766719818,
0.20214006304740906,
-0.01660008728504181,
0.1593509018421173,
0.12852323055267334,
0.07955563068389893,
-0.15555520355701447,
-0.10280972719192505,
-0.002369498834013939,
-0.0677618756890297,
0.10122436285018921,
-0.10818293690681458,
0.037876978516578674,
0.11060905456542969,
0.005244651343673468,
0.0417637936770916,
0.08402268588542938,
0.0027229958213865757,
-0.023221293464303017,
0.1324009895324707,
-0.18767230212688446,
-0.14760489761829376,
0.015744969248771667,
0.027108266949653625,
-0.018275441601872444,
0.09976571053266525,
0.08009669929742813,
-0.02817431651055813,
-0.02218199148774147,
-0.022287091240286827,
0.01372710894793272,
-0.07370493561029434,
0.03611280024051666,
0.10568326711654663,
0.07011536508798599,
-0.10438486933708191,
0.045836251229047775,
0.007736686151474714,
-0.11139814555644989,
-0.05700967460870743,
-0.028784459456801414,
-0.097989521920681,
-0.14976155757904053,
-0.048631709069013596,
0.028844185173511505,
-0.10344508290290833,
-0.053730081766843796,
-0.05014031380414963,
-0.13600338995456696,
0.04488201439380646,
0.12037933617830276,
0.09526095539331436,
0.08227194100618362,
0.03038179688155651,
-0.08973235636949539,
0.00043781555723398924,
0.034602146595716476,
-0.058318883180618286,
0.0012674826430156827,
-0.1603362113237381,
-0.025392616167664528,
-0.03515396639704704,
0.1147916316986084,
-0.08113409578800201,
0.009612170979380608,
-0.1556803435087204,
0.018485452979803085,
-0.12990567088127136,
-0.057002823799848557,
-0.13865093886852264,
-0.041723404079675674,
0.008676324971020222,
-0.12334369122982025,
-0.08466696739196777,
0.01162714697420597,
-0.10539984703063965,
0.04195336624979973,
0.024178000167012215,
0.09734378010034561,
-0.12040608376264572,
-0.04888961464166641,
0.056221120059490204,
-0.028429944068193436,
0.09316506236791611,
0.04385976865887642,
-0.057864245027303696,
0.038494180887937546,
-0.11993785202503204,
-0.11592288315296173,
0.042962025851011276,
0.03148466348648071,
0.09452967345714569,
-0.03770579397678375,
0.006534157320857048,
0.095021553337574,
-0.007119862828403711,
0.057928863912820816,
-0.04272422939538956,
-0.08159031718969345,
-0.04341175779700279,
0.01948072575032711,
-0.06509767472743988,
-0.0036618749145418406,
-0.06968297064304352,
0.16845408082008362,
-0.005358606111258268,
0.1582619696855545,
0.012797734700143337,
0.03907722607254982,
-0.1724882274866104,
0.01078544370830059,
-0.06952519714832306,
-0.12620356678962708,
-0.07529203593730927,
-0.020628340542316437,
0.026984378695487976,
-0.05004093796014786,
0.29258301854133606,
0.04592735692858696,
-0.03176894038915634,
0.052299268543720245,
0.0931253731250763,
0.07360571622848511,
0.012357317842543125,
0.2618342339992523,
0.03161998838186264,
-0.02660341002047062,
-0.048253707587718964,
0.05407097935676575,
-0.03508862480521202,
-0.032917797565460205,
0.06783051043748856,
0.15236851572990417,
0.09454216063022614,
0.06144813820719719,
0.11949077248573303,
-0.04340964928269386,
-0.023308679461479187,
-0.11797046661376953,
0.013978737406432629,
0.05260956287384033,
-0.04713675007224083,
0.03535597398877144,
0.19391141831874847,
-0.0820113867521286,
0.026013189926743507,
-0.08859332650899887,
0.004543881863355637,
-0.13268041610717773,
-0.12607645988464355,
-0.07661119848489761,
-0.12327343225479126,
0.011879375204443932,
-0.09571638703346252,
-0.018522704020142555,
0.14160475134849548,
0.029760543256998062,
-0.03195366635918617,
0.0006456703413277864,
0.05310380831360817,
-0.0016396150458604097,
0.011637507937848568,
0.0330970399081707,
0.044765837490558624,
-0.035081468522548676,
0.04627098888158798,
-0.01680569536983967,
-0.0648643970489502,
-0.060247696936130524,
-0.004092021379619837,
-0.06241607666015625,
0.023063071072101593,
-0.08127833157777786,
-0.09161672741174698,
-0.061564840376377106,
0.025092026218771935,
-0.0066347504034638405,
0.13200737535953522,
0.01545737124979496,
0.056588999927043915,
0.04206423461437225,
0.21436481177806854,
-0.06403141468763351,
-0.09103362262248993,
-0.06461624801158905,
0.1436678022146225,
-0.01686711795628071,
0.05906299874186516,
0.0377948172390461,
-0.020102770999073982,
-0.008509594015777111,
0.18123626708984375,
0.31553930044174194,
-0.08479191362857819,
0.07146182656288147,
0.007777115795761347,
-0.003955093678086996,
-0.0085982671007514,
0.01928151771426201,
0.06443099677562714,
0.1958080679178238,
-0.12175015360116959,
0.025592895224690437,
-0.06292412430047989,
0.02084200270473957,
-0.008599256165325642,
0.03280062600970268,
0.03432944044470787,
-0.07462868839502335,
-0.029208922758698463,
0.08978330343961716,
-0.048043809831142426,
0.048790715634822845,
0.07964412868022919,
-0.1498139500617981,
-0.05673293396830559,
-0.0021257209591567516,
0.14334280788898468,
0.009659252129495144,
0.06846068799495697,
-0.0961916521191597,
-0.019902873784303665,
0.045431092381477356,
-0.024306882172822952,
-0.1744595170021057,
-0.059026483446359634,
0.14093734323978424,
0.044092580676078796,
0.14237387478351593,
-0.03125089034438133,
0.12002864480018616,
0.11406126618385315,
0.017986420542001724,
-0.1331930160522461,
0.05161937326192856,
0.06943230330944061,
-0.1022452861070633,
-0.0380394421517849,
-0.09022258222103119,
0.023011859506368637,
-0.014151856303215027,
0.07313664257526398,
-0.15032805502414703,
0.05691114440560341,
-0.012578349560499191,
-0.04060923308134079,
-0.08369362354278564,
0.047501303255558014,
-0.014878612011671066,
0.08812426030635834,
0.02220270037651062,
-0.05977529659867287,
-0.052856963127851486,
-0.022740133106708527,
0.03600096330046654,
0.07350745797157288,
-0.08576440811157227,
-0.0598103329539299,
-0.030979454517364502,
0.01669272407889366,
0.07970058172941208,
-0.008370120078325272,
-0.05029335618019104,
-0.06040991097688675,
-0.01890474371612072,
-0.014062649570405483,
-0.05192720144987106,
0.03358498960733414,
0.1287160962820053,
0.02790956199169159,
-0.014954843558371067,
-0.03593766689300537,
-0.0041211387142539024,
0.06301914900541306,
-0.1131141409277916,
-0.09938940405845642
] |
null | null |
transformers
|
This is German BERT v1 (https://deepset.ai/german-bert), fine-tuned for hate speech detection on the GermEval18Coarse dataset.
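A minimal inference sketch, assuming a recent `transformers` release; the German input sentence is illustrative only:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint in a text-classification pipeline
classifier = pipeline(
    "text-classification",
    model="deepset/bert-base-german-cased-hatespeech-GermEval18Coarse",
)

# Illustrative input; the GermEval 2018 coarse-grained task distinguishes
# offensive from non-offensive tweets.
print(classifier("Das ist ein ganz normaler Satz."))
```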
|
{"license": "cc-by-4.0"}
|
text-classification
|
deepset/bert-base-german-cased-hatespeech-GermEval18Coarse
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"text-classification",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #safetensors #bert #text-classification #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
|
This is German BERT v1 (URL), fine-tuned for hate speech detection on the GermEval18Coarse dataset
|
[] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #bert #text-classification #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
53
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #text-classification #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.04729612544178963,
0.0846203863620758,
-0.006335574202239513,
0.030475320294499397,
0.11667154729366302,
0.018078986555337906,
0.1426430344581604,
0.09385145455598831,
0.05297777056694031,
-0.05056314170360565,
0.1410708725452423,
0.2355819046497345,
-0.024207206442952156,
0.09993260353803635,
-0.11860299855470657,
-0.210224911570549,
0.07000324130058289,
0.03712200000882149,
0.04050077125430107,
0.10849898308515549,
0.10708671808242798,
-0.08307524025440216,
0.04492489621043205,
-0.03420690819621086,
-0.1188141480088234,
0.005475277546793222,
0.07523097842931747,
-0.1346309632062912,
0.10058131068944931,
0.026412729173898697,
0.14114415645599365,
0.09964283555746078,
-0.015531162731349468,
-0.16465525329113007,
0.024158252403140068,
-0.015532500110566616,
-0.1079573929309845,
0.05395163968205452,
0.08280159533023834,
-0.0481962189078331,
0.015923388302326202,
0.039043355733156204,
0.017604516819119453,
0.05800924077630043,
-0.11801692098379135,
-0.1540025919675827,
-0.06622212380170822,
0.08571509271860123,
0.09292116016149521,
0.06448211520910263,
0.01891462504863739,
0.1795433759689331,
-0.1263352483510971,
0.08409146219491959,
0.06468396633863449,
-0.37044480443000793,
0.012063358910381794,
0.11359035223722458,
0.06031502038240433,
0.04539739340543747,
-0.06142398715019226,
0.05520912632346153,
0.04737744480371475,
-0.004350175615400076,
0.04318474978208542,
-0.07065542042255402,
-0.07160955667495728,
0.024582261219620705,
-0.059059761464595795,
-0.04895617812871933,
0.2194012701511383,
-0.03405226022005081,
0.014026479795575142,
-0.053538549691438675,
-0.05254993215203285,
-0.011097243055701256,
-0.020761510357260704,
0.026775848120450974,
-0.0023027202114462852,
0.07270895689725876,
0.015044483356177807,
0.02597225457429886,
-0.14569537341594696,
0.002670581918209791,
-0.1805064082145691,
0.13261204957962036,
0.012571793980896473,
0.059112656861543655,
-0.13944727182388306,
0.06096021831035614,
0.023413142189383507,
-0.10073933750391006,
0.006219984497874975,
-0.09281224757432938,
0.09176680445671082,
-0.04075122997164726,
-0.03992575779557228,
0.06989987939596176,
0.10792417079210281,
0.161514550447464,
-0.02460731379687786,
0.015998821705579758,
-0.050545789301395416,
0.10999102890491486,
-0.01639094203710556,
0.04829319566488266,
0.058383602648973465,
0.03557872772216797,
0.10685614496469498,
-0.07500852644443512,
0.045995160937309265,
-0.02896808087825775,
-0.1549459546804428,
-0.008203878067433834,
0.05163101479411125,
0.13770753145217896,
-0.0027112746611237526,
0.06820875406265259,
-0.058372046798467636,
0.03342832997441292,
0.18097274005413055,
-0.05080306902527809,
0.022892644628882408,
0.025041477754712105,
0.05854339897632599,
0.00048507461906410754,
-0.0010095228208228946,
0.015929048880934715,
-0.03551056608557701,
0.1441391408443451,
-0.058588117361068726,
-0.008190092630684376,
-0.017149796709418297,
-0.05133333429694176,
0.08381544798612595,
-0.08857527375221252,
0.05125519633293152,
-0.18945017457008362,
-0.124130979180336,
0.031468745321035385,
0.014074989594519138,
0.02172030322253704,
-0.056394241750240326,
0.017016781494021416,
-0.013143242336809635,
0.016007820144295692,
-0.08207947760820389,
-0.08429624140262604,
-0.0872267410159111,
0.10703602433204651,
-0.04430587217211723,
0.02365664765238762,
-0.1473279893398285,
0.028013572096824646,
-0.11800447106361389,
-0.020894834771752357,
-0.07936733961105347,
-0.03618048503994942,
-0.07961063832044601,
0.1775650680065155,
-0.004819449502974749,
-0.04596177488565445,
-0.002269812859594822,
0.03017427586019039,
-0.07050879299640656,
0.15467111766338348,
-0.07855464518070221,
-0.06457629799842834,
0.20704147219657898,
-0.13665294647216797,
-0.19840401411056519,
0.08628854900598526,
-0.005589632783085108,
-0.006348794791847467,
0.10137061029672623,
0.1657983511686325,
0.0671529546380043,
-0.08435550332069397,
0.05574402958154678,
0.11769778281450272,
-0.07523844391107559,
-0.16544665396213531,
0.02841390296816826,
-0.031708456575870514,
-0.15049335360527039,
0.044061705470085144,
-0.006446341518312693,
0.05537313222885132,
-0.0349372923374176,
-0.05859338864684105,
-0.03939194604754448,
-0.02490914985537529,
0.04551141709089279,
0.03083738312125206,
0.07506145536899567,
-0.10640380531549454,
-0.007730068638920784,
-0.028213802725076675,
-0.0010147623252123594,
0.03795905038714409,
0.019084643572568893,
-0.09188908338546753,
0.10312781482934952,
0.0031497059389948845,
0.012538664974272251,
-0.14194881916046143,
-0.05942355841398239,
0.0076810335740447044,
0.09419847279787064,
-0.002861935645341873,
0.05696835741400719,
0.0407363586127758,
-0.015404774807393551,
-0.021078553050756454,
-0.04042840003967285,
0.16414083540439606,
0.04275759309530258,
-0.04040290415287018,
-0.11652093380689621,
0.058396995067596436,
-0.05292843282222748,
0.02893633395433426,
-0.10257565230131149,
0.014388554729521275,
0.1071716696023941,
0.08687712997198105,
-0.012090571224689484,
0.08261176943778992,
-0.037943124771118164,
0.040335651487112045,
-0.05316659063100815,
0.02703174389898777,
0.12619741261005402,
0.036287833005189896,
-0.06821360439062119,
0.1988307237625122,
-0.1346558928489685,
0.345467209815979,
0.21971626579761505,
-0.20448482036590576,
0.008858997374773026,
-0.04810534790158272,
0.004620024934411049,
0.021340813487768173,
0.01279731560498476,
0.02819225937128067,
0.03529282659292221,
-0.017362181097269058,
0.18648499250411987,
-0.07250998914241791,
-0.039596445858478546,
0.010226130485534668,
-0.0488012470304966,
-0.03978263586759567,
0.0671355128288269,
0.09901190549135208,
-0.21465188264846802,
0.18824239075183868,
0.3033634126186371,
0.02514571137726307,
0.10834815353155136,
-0.05417812988162041,
0.045750245451927185,
0.031510189175605774,
-0.006258746143430471,
-0.000629139831289649,
0.008420761674642563,
-0.09257421642541885,
-0.013077497482299805,
0.07270710915327072,
0.011897810734808445,
0.03168858215212822,
-0.16373774409294128,
-0.07148413360118866,
-0.008045153692364693,
-0.00455687427893281,
-0.07448475807905197,
0.06793587654829025,
0.021128717809915543,
0.11238144338130951,
-0.03988747298717499,
-0.11915034055709839,
0.1399887204170227,
-0.008060231804847717,
-0.10343031585216522,
0.17194823920726776,
-0.14925545454025269,
-0.2717238664627075,
-0.14356650412082672,
-0.13952502608299255,
0.0033603552728891373,
0.04195066913962364,
0.11860427260398865,
-0.043820060789585114,
-0.05140971392393112,
-0.01249871775507927,
-0.10922577977180481,
-0.03968428820371628,
0.025293434038758278,
-0.06965713202953339,
0.0751311406493187,
-0.0187021866440773,
-0.08641496300697327,
-0.07241640239953995,
-0.009302530437707901,
-0.06175275892019272,
0.13506504893302917,
-0.06778782606124878,
0.07330300658941269,
0.11890163272619247,
-0.019276561215519905,
0.027679329738020897,
-0.07559419423341751,
0.12960366904735565,
-0.05322540923953056,
-0.0279086884111166,
0.18727773427963257,
-0.05344611406326294,
0.0725640282034874,
0.16995182633399963,
0.05609075352549553,
-0.06094895303249359,
0.016982004046440125,
-0.09096948057413101,
-0.07288887351751328,
-0.23258446156978607,
-0.11291462182998657,
-0.09220687299966812,
0.06247592344880104,
0.06415648758411407,
0.08226890861988068,
0.11708039790391922,
0.08233464509248734,
-0.009968146681785583,
-0.01676439866423607,
0.04609621688723564,
0.09435439854860306,
0.24353435635566711,
0.005347133614122868,
0.133165180683136,
-0.08838290721178055,
-0.0835200622677803,
0.08726446330547333,
0.03783406689763069,
0.09967538714408875,
0.12959535419940948,
0.017830748111009598,
0.05977533012628555,
0.16759458184242249,
0.17401650547981262,
0.11787508428096771,
0.039598457515239716,
-0.03134423866868019,
-0.013932827860116959,
-0.030894597992300987,
-0.03835740685462952,
0.014017502777278423,
-0.013481715694069862,
-0.11608064919710159,
-0.07900263369083405,
-0.16530552506446838,
0.06841100007295609,
0.0997566282749176,
0.037727925926446915,
-0.212862029671669,
0.029643025249242783,
0.09294140338897705,
0.01534915529191494,
-0.0606081523001194,
0.09914778172969818,
-0.04995667189359665,
-0.09407878667116165,
0.11189864575862885,
-0.047322385013103485,
0.10462474822998047,
-0.0328373983502388,
0.058635562658309937,
-0.025187738239765167,
-0.1345280259847641,
0.028344573453068733,
0.11090454459190369,
-0.2856384217739105,
0.22339332103729248,
0.014153675176203251,
-0.01169020589441061,
-0.06134333461523056,
-0.022375930100679398,
0.032122980803251266,
0.2598021924495697,
0.1261238306760788,
0.0016114319441840053,
-0.1221383735537529,
-0.12030712515115738,
-0.03911536559462547,
0.02231493592262268,
0.07499445229768753,
-0.010625943541526794,
-0.03154703602194786,
-0.04800819233059883,
-0.018472563475370407,
-0.0025286716409027576,
-0.037260085344314575,
-0.04121755436062813,
-0.1440514326095581,
0.033861350268125534,
0.08933790773153305,
0.07837218046188354,
-0.05405355244874954,
-0.04006298631429672,
-0.1491067111492157,
0.1705280840396881,
-0.15072345733642578,
-0.06365475058555603,
-0.09676802903413773,
-0.15454092621803284,
0.006060509476810694,
-0.05234706774353981,
0.06593792140483856,
-0.07089944183826447,
0.0103513328358531,
-0.08233599364757538,
-0.1805177479982376,
0.12680168449878693,
-0.1314513385295868,
-0.07477734237909317,
-0.06396019458770752,
0.16256646811962128,
-0.10640076547861099,
0.007828515954315662,
0.051798079162836075,
0.031126925721764565,
-0.07274169474840164,
-0.1026153415441513,
0.00441932724788785,
-0.011674358509480953,
0.08558718860149384,
-0.002541557652875781,
-0.10712384432554245,
-0.12478230893611908,
0.004094304982572794,
-0.03307565301656723,
0.21717779338359833,
0.2758559286594391,
-0.07585366815328598,
0.15440316498279572,
0.21073536574840546,
-0.07599582523107529,
-0.3241501450538635,
-0.11434103548526764,
-0.16359566152095795,
-0.07489524781703949,
-0.010590987280011177,
-0.0941852554678917,
0.0891607403755188,
0.0378381721675396,
-0.08232268691062927,
0.11606922745704651,
-0.1427382230758667,
-0.09633829444646835,
0.17881552875041962,
0.0175599567592144,
0.33114755153656006,
-0.14893560111522675,
-0.09136315435171127,
-0.05826939269900322,
-0.14100056886672974,
0.15905801951885223,
-0.05111883953213692,
0.053216103464365005,
-0.011482667177915573,
-0.003007011953741312,
0.0018575361464172602,
-0.058332618325948715,
0.10640687495470047,
-0.047018177807331085,
0.06610563397407532,
-0.11628688126802444,
-0.06503079831600189,
0.07859986275434494,
-0.012126718647778034,
0.010134969837963581,
-0.11373452842235565,
0.018131909891963005,
-0.08985075354576111,
-0.029249804094433784,
-0.05644771084189415,
0.08058683574199677,
-0.010490994900465012,
-0.04397016763687134,
-0.001746234716847539,
-0.014764927327632904,
-0.02348012663424015,
-0.03819160535931587,
0.26970475912094116,
-0.025702260434627533,
0.16476160287857056,
0.1392083764076233,
0.14032617211341858,
-0.1511956751346588,
0.026239069178700447,
-0.06110503524541855,
-0.10470584034919739,
0.0742655023932457,
-0.0852099061012268,
0.049371808767318726,
0.11212192475795746,
-0.045669928193092346,
0.07293222099542618,
0.10131774097681046,
0.032197754830121994,
-0.02762601524591446,
0.16301953792572021,
-0.22046583890914917,
0.014596272259950638,
-0.021337805315852165,
0.0779072642326355,
0.08085811883211136,
0.08750016242265701,
0.11671848595142365,
0.0009805340087041259,
-0.031941819936037064,
-0.0009385467856191099,
0.01444372907280922,
-0.024486036971211433,
0.03236377239227295,
0.053353846073150635,
0.03290669247508049,
-0.1245374009013176,
0.0781019926071167,
0.032148607075214386,
-0.1660444289445877,
-0.03825279697775841,
0.09981024265289307,
-0.17005926370620728,
-0.1323775351047516,
-0.01563205197453499,
0.0883876383304596,
-0.11191462725400925,
-0.10404044389724731,
-0.05693166330456734,
-0.16491450369358063,
0.04708503931760788,
0.19390641152858734,
0.09815497696399689,
0.08235318958759308,
0.013793639838695526,
-0.05969928577542305,
-0.017719488590955734,
0.02618839032948017,
-0.08526185154914856,
0.03885186091065407,
-0.13838519155979156,
0.014104707166552544,
0.000551415141671896,
0.08048239350318909,
-0.07835722714662552,
-0.006650138646364212,
-0.1551646888256073,
0.03582103177905083,
-0.0704585537314415,
0.019526824355125427,
-0.09023326635360718,
-0.008060582913458347,
0.007030722685158253,
-0.04277316853404045,
-0.021118799224495888,
-0.023226333782076836,
-0.10554662346839905,
0.031141214072704315,
0.0060213771648705006,
0.061486922204494476,
-0.09206261485815048,
-0.05641177296638489,
0.05538071319460869,
-0.02395067922770977,
0.10824815928936005,
0.0638156533241272,
-0.0754103809595108,
0.09210904687643051,
-0.1927480250597,
-0.07983016222715378,
0.11928461492061615,
0.017987946048378944,
0.040185101330280304,
0.025121882557868958,
0.037098199129104614,
0.12582537531852722,
-0.028938189148902893,
0.0730358362197876,
0.03475409746170044,
-0.13452166318893433,
0.003018548246473074,
0.011465663090348244,
-0.13503581285476685,
-0.017290890216827393,
-0.07570868730545044,
0.1330687701702118,
-0.02243674360215664,
0.20326712727546692,
-0.06712888181209564,
0.03910326212644577,
-0.057723477482795715,
0.005049991887062788,
-0.03962935134768486,
-0.19599713385105133,
-0.16771158576011658,
-0.061766110360622406,
-0.0031188251450657845,
-0.010394696146249771,
0.26885902881622314,
0.04064325988292694,
-0.04464341700077057,
0.08293019980192184,
0.049181822687387466,
0.016362786293029785,
0.027861060574650764,
0.24734115600585938,
0.05116918310523033,
-0.03849861025810242,
-0.09216362237930298,
0.03028014302253723,
0.004906482994556427,
-0.11334283649921417,
0.08317089825868607,
0.1018962636590004,
-0.003646082943305373,
0.04609745368361473,
0.0646437332034111,
0.016280289739370346,
-0.07265118509531021,
-0.16147328913211823,
-0.016170047223567963,
0.06751204282045364,
0.0024005211889743805,
0.05771571025252342,
0.1511182337999344,
-0.02642744965851307,
-0.007340565789490938,
-0.07202344387769699,
-0.025957521051168442,
-0.18648618459701538,
-0.10478468984365463,
-0.10162192583084106,
-0.09593334794044495,
0.0298962090164423,
-0.05322004854679108,
-0.004976966883987188,
0.06496605277061462,
0.04784918576478958,
-0.0652889683842659,
0.0026065404526889324,
-0.010262506082654,
-0.025471383705735207,
0.06855493783950806,
-0.010285470634698868,
0.011032775044441223,
-0.0237300843000412,
-0.04846491664648056,
-0.09979524463415146,
-0.09021607041358948,
-0.047816310077905655,
0.02874644100666046,
-0.024078283458948135,
0.034798309206962585,
-0.11320793628692627,
-0.09182526916265488,
-0.024296674877405167,
0.06237690895795822,
0.0017384049715474248,
0.1684277057647705,
0.006486565340310335,
0.017733663320541382,
0.0791158452630043,
0.1706554889678955,
-0.04456587880849838,
-0.11988460272550583,
-0.010048762895166874,
0.2606355845928192,
0.050189949572086334,
0.09381920099258423,
0.022354548797011375,
0.0022798634599894285,
0.0037346924655139446,
0.2706383764743805,
0.30041611194610596,
-0.04485565423965454,
0.06420420110225677,
-0.030049938708543777,
0.02039472386240959,
0.08628060668706894,
0.1478390097618103,
0.08367515355348587,
0.23410874605178833,
-0.07048232853412628,
-0.0116987070068717,
-0.03362049162387848,
0.029246343299746513,
-0.12379106134176254,
0.04695429280400276,
-0.004042357672005892,
-0.05581178888678551,
-0.045911870896816254,
0.10593817383050919,
-0.11561360210180283,
0.11959775537252426,
0.04937466233968735,
-0.1338445097208023,
-0.022746652364730835,
-0.008267958648502827,
0.16911141574382782,
-0.015829527750611305,
0.03741355612874031,
-0.042344700545072556,
-0.06459426879882812,
0.02373373880982399,
-0.010895421728491783,
-0.18925310671329498,
-0.020942429080605507,
0.052893947809934616,
-0.008624152280390263,
0.10672438144683838,
0.0019228050950914621,
0.10928823053836823,
0.08052076399326324,
0.05404146760702133,
-0.07680966705083847,
0.11528406292200089,
0.017144877463579178,
-0.07878519594669342,
0.019492460414767265,
-0.07830395549535751,
-0.006530393846333027,
-0.022742269560694695,
0.048246633261442184,
-0.1001528948545456,
0.0721980482339859,
-0.07336211949586868,
-0.09860645234584808,
-0.05477318912744522,
0.0649687796831131,
-0.0477997288107872,
0.06427404284477234,
0.023853080347180367,
-0.016813598573207855,
-0.045985862612724304,
-0.04221019148826599,
-0.0056973835453391075,
0.039598461240530014,
-0.15022525191307068,
-0.05729189142584801,
-0.06691606342792511,
-0.023221468552947044,
0.13858194649219513,
0.034737758338451385,
-0.1722090095281601,
-0.006547162309288979,
-0.10968952625989914,
0.02826952561736107,
-0.19788874685764313,
0.05420532822608948,
0.0929330587387085,
0.017818372696638107,
-0.017105011269450188,
-0.07733577489852905,
0.027970759198069572,
0.03629624471068382,
-0.09339030832052231,
-0.09794473648071289
] |
null | null |
transformers
|
<a href="https://huggingface.co/exbert/?model=bert-base-german-cased">
  <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
# German BERT with old vocabulary
For details see the related [FARM issue](https://github.com/deepset-ai/FARM/issues/60).
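As a minimal sketch, the checkpoint can be loaded like any other fill-mask model (assuming a recent `transformers` release; the example sentence is illustrative only):

```python
from transformers import pipeline

# Load the old-vocabulary German BERT in a fill-mask pipeline
unmasker = pipeline("fill-mask", model="deepset/bert-base-german-cased-oldvocab")

# Illustrative German sentence with a masked token
for prediction in unmasker("Berlin ist die [MASK] von Deutschland."):
    print(prediction["token_str"], prediction["score"])
```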
## About us

We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "thumbnail": "https://static.tildacdn.com/tild6438-3730-4164-b266-613634323466/german_bert.png"}
|
fill-mask
|
deepset/bert-base-german-cased-oldvocab
|
[
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"exbert",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #jax #bert #fill-mask #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
<a href="URL">
  <img width="300px" src="URL">
</a>
# German BERT with old vocabulary
For details see the related FARM issue.
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German BERT with old vocabulary\nFor details see the related FARM issue.",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #jax #bert #fill-mask #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# German BERT with old vocabulary\nFor details see the related FARM issue.",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
49,
17,
129
] |
[
"passage: TAGS\n#transformers #pytorch #jax #bert #fill-mask #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# German BERT with old vocabulary\nFor details see the related FARM issue.## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.02285728231072426,
0.10548131912946701,
0.002864976180717349,
0.05970723181962967,
0.07597914338111877,
-0.05165277421474457,
0.05877460166811943,
0.07256433367729187,
0.2487581968307495,
-0.018766755238175392,
0.13817977905273438,
0.012065771967172623,
0.01433270052075386,
0.11248250305652618,
0.010159078985452652,
-0.19840717315673828,
0.04356660693883896,
0.002552791265770793,
-0.05110719054937363,
0.025518083944916725,
0.15664444863796234,
-0.08130115270614624,
0.10729915648698807,
0.01294627320021391,
-0.16248832643032074,
0.07668211311101913,
-0.04944198578596115,
-0.009036622941493988,
0.1757381409406662,
0.05394487455487251,
0.05723917856812477,
-0.09762348234653473,
-0.02441483922302723,
-0.08857855945825577,
0.031151805073022842,
-0.03301335498690605,
-0.025351762771606445,
0.04036562889814377,
-0.00676512299105525,
0.0483565516769886,
0.025616614148020744,
-0.00531046325340867,
-0.06835812330245972,
0.047784075140953064,
-0.10328595340251923,
-0.04145768657326698,
-0.05815510451793671,
0.1179892048239708,
-0.005364049691706896,
0.015002026222646236,
0.041094932705163956,
0.06921246647834778,
-0.15411075949668884,
0.028160905465483665,
0.06671641767024994,
-0.2794437110424042,
-0.01049510482698679,
0.026816492900252342,
0.08317701518535614,
-0.06457288563251495,
-0.14161071181297302,
0.07765255123376846,
0.07575442641973495,
0.04175733029842377,
-0.11890088766813278,
-0.054178934544324875,
0.07807398587465286,
-0.01546415314078331,
-0.09225324541330338,
0.002773892367258668,
0.20867322385311127,
0.02214258909225464,
-0.08813004195690155,
-0.10565183311700821,
0.0337621346116066,
0.14728398621082306,
-0.01933392323553562,
-0.05594092980027199,
-0.051369428634643555,
-0.001374994171783328,
-0.02907560020685196,
-0.0976787582039833,
-0.1075950637459755,
0.018267614766955376,
-0.06750776618719101,
0.29996010661125183,
0.02812105417251587,
0.015826307237148285,
-0.016946330666542053,
0.018807826563715935,
-0.10691998153924942,
-0.11105196177959442,
-0.04096798971295357,
-0.09245739877223969,
0.06504566967487335,
-0.03276277706027031,
-0.03652690723538399,
-0.039744168519973755,
0.12416138499975204,
0.08893058449029922,
-0.05400952324271202,
-0.032301947474479675,
-0.04456060379743576,
0.06114323437213898,
0.08415355533361435,
0.12827946245670319,
-0.00515431584790349,
-0.22434008121490479,
0.048576220870018005,
-0.1533762663602829,
-0.04241590201854706,
0.034223735332489014,
-0.1278662383556366,
-0.04931892454624176,
-0.01025253627449274,
0.010046502575278282,
0.03321053460240364,
0.055095572024583817,
-0.07951927930116653,
-0.02961653284728527,
0.10417729616165161,
-0.041460875421762466,
0.017584657296538353,
-0.030375361442565918,
-0.054025158286094666,
0.11301715672016144,
-0.1216544508934021,
0.018998965620994568,
-0.024286825209856033,
-0.01597520522773266,
-0.05887426435947418,
-0.08462589979171753,
-0.0866413339972496,
-0.11735276877880096,
0.10507366806268692,
-0.06011020019650459,
0.060431938618421555,
-0.13434702157974243,
-0.09568988531827927,
0.03078419715166092,
0.07390919327735901,
0.015051042661070824,
-0.05701742693781853,
0.01493636704981327,
0.0063887350261211395,
0.0355144701898098,
0.0036612525582313538,
-0.12468697875738144,
0.005742799025028944,
0.040968164801597595,
-0.13073456287384033,
0.028698859736323357,
-0.20162521302700043,
0.02333948202431202,
-0.15413711965084076,
-0.01647469587624073,
-0.2162093222141266,
-0.012621412053704262,
-0.12373489141464233,
0.1220165342092514,
-0.12278011441230774,
0.033790912479162216,
0.02638866938650608,
0.059591107070446014,
-0.0259382501244545,
0.10233201831579208,
-0.10774809122085571,
-0.06154131889343262,
0.11960742622613907,
-0.08519843965768814,
-0.06438221782445908,
0.17554786801338196,
-0.05207771807909012,
0.08246807008981705,
0.17187291383743286,
0.22258760035037994,
0.15069830417633057,
-0.2003801465034485,
-0.03345028683543205,
0.07384173572063446,
-0.05335190147161484,
-0.0864366739988327,
0.05313035473227501,
-0.02258971706032753,
-0.18337738513946533,
0.040860459208488464,
-0.09516382217407227,
0.09210667759180069,
-0.03757668659090996,
-0.01199556514620781,
0.033127788454294205,
-0.03663821145892143,
0.13433629274368286,
-0.008137914352118969,
0.024558234959840775,
-0.12259778380393982,
-0.04591425880789757,
0.06622035801410675,
0.02388780377805233,
0.006414939183741808,
-0.010529574938118458,
-0.007809313014149666,
0.0787365734577179,
0.01850808598101139,
0.05068262666463852,
-0.030676912516355515,
-0.16392427682876587,
0.067534901201725,
-0.08034754544496536,
0.13711263239383698,
0.1801515817642212,
0.056201137602329254,
0.01071262452751398,
0.009932608343660831,
0.03208531439304352,
-0.0636371448636055,
-0.02517467737197876,
-0.038021281361579895,
-0.14007818698883057,
0.02894449792802334,
-0.10615446418523788,
-0.05611584708094597,
0.04121582582592964,
-0.03318753093481064,
0.040151432156562805,
0.09280688315629959,
0.03313744068145752,
0.0006476472481153905,
-0.08714177459478378,
0.07303737103939056,
-0.011519801802933216,
0.056960031390190125,
0.056143298745155334,
-0.029226765036582947,
-0.04641292616724968,
0.18076160550117493,
0.17849545180797577,
0.16451387107372284,
0.06594989448785782,
-0.014935450628399849,
-0.059696923941373825,
-0.10137121379375458,
-0.06985704600811005,
0.02903068996965885,
-0.021655447781085968,
-0.14289595186710358,
0.2554777264595032,
0.016446786001324654,
0.01523960754275322,
-0.04307475686073303,
-0.015123623423278332,
-0.030062448233366013,
-0.08175035566091537,
-0.0663287416100502,
0.10557849705219269,
-0.02874092012643814,
-0.1852024495601654,
0.1673627644777298,
0.19216875731945038,
0.08560793101787567,
0.30109724402427673,
-0.012176730670034885,
-0.019423728808760643,
-0.013873674906790257,
0.008137171156704426,
-0.07530061155557632,
0.18035979568958282,
-0.1040811836719513,
-0.023777280002832413,
0.03654700145125389,
-0.008068390190601349,
0.0279975775629282,
-0.025707177817821503,
-0.008606182411313057,
-0.03263820707798004,
9.62093508860562e-7,
0.12061504274606705,
0.04596409201622009,
-0.011666030623018742,
0.06974909454584122,
0.08446839451789856,
-0.13238854706287384,
0.11037639528512955,
0.021847117692232132,
-0.008259951137006283,
0.1668047457933426,
-0.049872446805238724,
-0.3236967921257019,
-0.16130824387073517,
-0.18920612335205078,
-0.14396697282791138,
0.005949317943304777,
0.06715156883001328,
-0.030270837247371674,
-0.131459042429924,
0.07285543531179428,
0.09741964936256409,
0.0166497640311718,
-0.049605876207351685,
-0.0405023992061615,
-0.08629900217056274,
-0.0066188727505505085,
-0.13128681480884552,
-0.05941268429160118,
-0.009833692573010921,
-0.01584050990641117,
0.0022435644641518593,
0.04309811070561409,
0.10211911797523499,
-0.029044128954410553,
-0.05846268683671951,
-0.005219428334385157,
-0.0028573458548635244,
0.23870739340782166,
-0.08304135501384735,
0.06630007177591324,
0.05270833894610405,
-0.08855275064706802,
0.0764140784740448,
0.1706867665052414,
0.06874268501996994,
0.0004370269307401031,
-0.02255888283252716,
-0.005979446694254875,
-0.06326320767402649,
-0.13793988525867462,
-0.15807098150253296,
-0.03296029567718506,
0.08007857203483582,
0.01585562340915203,
0.04363519698381424,
0.05353677645325661,
0.013159161433577538,
-0.0079095009714365,
-0.024213874712586403,
-0.11022298038005829,
0.10584153234958649,
0.11902423202991486,
-0.038442522287368774,
0.023433661088347435,
-0.028946204110980034,
-0.10115122050046921,
0.07245734333992004,
0.1263580024242401,
0.0067276302725076675,
0.15264904499053955,
0.05497094616293907,
0.05952346697449684,
0.05138368904590607,
0.10854972153902054,
0.030825279653072357,
-0.0986223891377449,
-0.054507412016391754,
-0.061287589371204376,
-0.035665225237607956,
0.04336113855242729,
0.030815057456493378,
0.05143648013472557,
-0.05353974923491478,
-0.03326655551791191,
-0.13691231608390808,
0.06339270621538162,
0.12416107207536697,
0.03966484218835831,
-0.05692632496356964,
-0.10852959752082825,
-0.012441705912351608,
-0.1430543065071106,
-0.005556604824960232,
0.0302251148968935,
0.04182774946093559,
-0.17055737972259521,
-0.00848609209060669,
-0.023836450651288033,
0.08751215040683746,
0.03608183190226555,
0.08487629890441895,
-0.02720058523118496,
0.02945670858025551,
0.0007306499755941331,
0.11728306114673615,
-0.20711204409599304,
0.19214577972888947,
0.012361540459096432,
0.009187069721519947,
-0.10462134331464767,
-0.05285860598087311,
-0.03180338814854622,
-0.07529343664646149,
0.10954488813877106,
0.012032102793455124,
0.049700383096933365,
-0.05209726840257645,
-0.09838578850030899,
0.03959967941045761,
0.03853680565953255,
-0.03747502714395523,
0.01870454102754593,
-0.011127064004540443,
0.019751835614442825,
-0.044016093015670776,
0.10673319548368454,
-0.05906694009900093,
-0.052524738013744354,
-0.020190179347991943,
-0.00041895866161212325,
0.10551401972770691,
-0.015663346275687218,
-0.023282401263713837,
-0.14825096726417542,
0.1961185336112976,
-0.10978026688098907,
-0.05788305774331093,
-0.09955596923828125,
-0.027171730995178223,
-0.040219973772764206,
-0.0982164666056633,
0.061358485370874405,
-0.05013718083500862,
0.05960055813193321,
-0.07112722098827362,
-0.1345418244600296,
0.0650995522737503,
-0.09883781522512436,
-0.015415363013744354,
0.016636259853839874,
0.16997884213924408,
0.12429432570934296,
0.03789738938212395,
0.10269458591938019,
-0.06321906298398972,
-0.05193506181240082,
-0.13152481615543365,
-0.0083926972001791,
-0.08616076409816742,
-0.01317878719419241,
0.07882990688085556,
-0.16876934468746185,
-0.10559151321649551,
0.027506623417139053,
0.05642500892281532,
0.30170464515686035,
0.12464207410812378,
-0.02422948367893696,
0.20106063783168793,
0.24715635180473328,
-0.019445281475782394,
-0.3621733784675598,
-0.00849907472729683,
0.0324254184961319,
0.05248989537358284,
-0.029259584844112396,
-0.20323561131954193,
0.21152068674564362,
0.076349638402462,
-0.043249160051345825,
0.0583881139755249,
-0.10027719289064407,
-0.06234384700655937,
0.1304446905851364,
-0.05901498720049858,
0.275160014629364,
0.0012447627959772944,
-0.03378938511013985,
-0.0326959453523159,
-0.09243188053369522,
0.17706961929798126,
-0.055719323456287384,
0.07822452485561371,
-0.01577463001012802,
0.17423677444458008,
0.029939543455839157,
-0.05936571583151817,
0.06730762124061584,
0.07545044273138046,
0.026538236066699028,
-0.04340246319770813,
-0.0702601820230484,
-0.00524006225168705,
0.018548820167779922,
0.08446130901575089,
-0.07068415731191635,
-0.026971912011504173,
-0.1516023874282837,
-0.018940480425953865,
-0.0953756794333458,
0.2436210811138153,
-0.009226112626492977,
-0.11815348267555237,
-0.05906695872545242,
0.1125599592924118,
-0.02128000743687153,
0.05484295263886452,
0.11535542458295822,
-0.039691854268312454,
0.09797891974449158,
0.08566927909851074,
0.05795637145638466,
-0.06194029748439789,
-0.0275411456823349,
-0.037393294274806976,
-0.0803447738289833,
0.1120368018746376,
0.036684416234493256,
0.0510915145277977,
0.12725089490413666,
0.01032425370067358,
0.123383067548275,
-0.012370739132165909,
-0.07664411514997482,
0.03340255841612816,
0.09215286374092102,
-0.13633237779140472,
-0.11975125968456268,
-0.0802767276763916,
-0.02806997299194336,
-0.0010880492627620697,
0.08101583272218704,
0.1967792510986328,
-0.02772895060479641,
-0.006545542273670435,
-0.0025939487386494875,
-0.0033418824896216393,
-0.08345240354537964,
0.01590476743876934,
0.09756675362586975,
-0.007374980486929417,
-0.0828145444393158,
0.020793631672859192,
-0.020454298704862595,
-0.025539295747876167,
0.06879057735204697,
0.007230326533317566,
-0.014054305851459503,
-0.0472891703248024,
-0.02184315025806427,
0.18857797980308533,
-0.2317158728837967,
-0.039785873144865036,
0.06149560958147049,
-0.08890857547521591,
0.01163899339735508,
0.11397013068199158,
0.06432299315929413,
0.029346918687224388,
-0.013016946613788605,
-0.007437232881784439,
-0.005189425777643919,
0.038167309015989304,
0.025906825438141823,
0.009169021621346474,
-0.012282269075512886,
-0.004670803435146809,
-0.11052912473678589,
0.07088921219110489,
-0.07726921141147614,
-0.058133140206336975,
-0.210887148976326,
-0.07161669433116913,
-0.16311155259609222,
-0.05469891056418419,
-0.03298521041870117,
0.020238876342773438,
-0.022114049643278122,
-0.09994705021381378,
0.038519375026226044,
-0.006670346483588219,
-0.10729697346687317,
0.007798385806381702,
0.07127367705106735,
0.07741068303585052,
-0.09309890866279602,
-0.004693416878581047,
0.14187440276145935,
0.045809730887413025,
0.06106216087937355,
0.018489213660359383,
0.03391387313604355,
0.09613130241632462,
-0.15635690093040466,
-0.04982951283454895,
0.03485400974750519,
-0.010692763142287731,
0.10444673895835876,
0.06756983697414398,
-0.02333652228116989,
0.043175891041755676,
0.0009388688486069441,
0.057275913655757904,
0.02696104347705841,
-0.09852362424135208,
0.06799624860286713,
0.0687786117196083,
-0.1321517676115036,
-0.02777511067688465,
0.010151921771466732,
0.10300172120332718,
0.06112607568502426,
0.12474784255027771,
-0.06258661299943924,
0.04945971444249153,
0.0710260421037674,
0.060438696295022964,
-0.0034663190599530935,
-0.1155809611082077,
-0.04802683740854263,
-0.0239284485578537,
-0.04224817827343941,
0.002355010947212577,
0.21969656646251678,
0.0658814013004303,
-0.058931317180395126,
0.033691588789224625,
0.005738376174122095,
-0.05278880149126053,
-0.05105772614479065,
0.14862897992134094,
-0.020788585767149925,
0.03808675333857536,
-0.09423994272947311,
0.0252962876111269,
-0.013086443766951561,
-0.04978126659989357,
0.13785946369171143,
0.15462400019168854,
0.003327806480228901,
-0.04794954136013985,
0.09778670966625214,
0.033507123589515686,
-0.1576337367296219,
-0.14538389444351196,
0.07124826312065125,
0.10479437559843063,
-0.04994666948914528,
0.14937573671340942,
0.027596525847911835,
-0.16552668809890747,
0.03499796241521835,
0.05359039455652237,
-0.004674691706895828,
-0.025721512734889984,
-0.07082349807024002,
-0.051103588193655014,
-0.0448995865881443,
0.03358464315533638,
-0.10393712669610977,
0.059734683483839035,
-0.06864069402217865,
0.05830319970846176,
-0.09251143783330917,
0.14278791844844818,
-0.17830604314804077,
-0.06641155481338501,
0.08805570006370544,
0.015986377373337746,
0.04093825817108154,
-0.023791514337062836,
-0.050464678555727005,
-0.19052940607070923,
0.10747459530830383,
-0.03658599033951759,
0.0494844913482666,
0.03177905082702637,
-0.06932329386472702,
-0.10186797380447388,
-0.05782179534435272,
-0.0726902186870575,
0.01480148360133171,
0.04453163221478462,
0.0792878270149231,
0.004784770775586367,
-0.028406906872987747,
-0.010950756259262562,
0.10310548543930054,
0.020818166434764862,
-0.08771990984678268,
-0.03448811173439026,
0.11573712527751923,
0.01467800047248602,
0.061606135219335556,
-0.05323576554656029,
-0.04221275821328163,
-0.0011861887760460377,
0.1392057240009308,
0.23994725942611694,
-0.10172132402658463,
0.027300305664539337,
-0.08615612238645554,
0.034835539758205414,
0.05135437101125717,
0.1632809042930603,
-0.03861764818429947,
0.3171459436416626,
-0.02435389719903469,
0.01872853934764862,
-0.0034404152538627386,
-0.007786607835441828,
-0.17946426570415497,
0.05286858603358269,
0.033544011414051056,
-0.12787644565105438,
-0.14530974626541138,
0.0823165699839592,
-0.13559383153915405,
-0.1623142957687378,
-0.10534423589706421,
-0.08852138370275497,
-0.13841108977794647,
-0.0839059054851532,
0.0016559440409764647,
0.06966958940029144,
0.09032821655273438,
-0.005904266610741615,
0.02714548632502556,
0.042179789394140244,
0.0060995048843324184,
-0.11299239099025726,
-0.029024602845311165,
0.1724662035703659,
0.08910694718360901,
0.12952160835266113,
0.004176849499344826,
0.0011737237218767405,
0.08313213288784027,
0.007233956828713417,
-0.06308568269014359,
-0.0821690633893013,
0.008733417838811874,
-0.1836266964673996,
-0.06723099201917648,
-0.006609363481402397,
0.00953493919223547,
0.07036998122930527,
0.02144674025475979,
-0.04917402192950249,
-0.012131921015679836,
0.11042346060276031,
-0.025223424658179283,
-0.08951251953840256,
0.11439453065395355,
-0.09310715645551682,
0.12193147838115692,
0.14401446282863617,
-0.009008117020130157,
-0.06291457265615463,
-0.06745841354131699,
0.058590248227119446,
0.09440192580223083,
-0.10444953292608261,
-0.0925154760479927,
-0.15134204924106598,
-0.03340374678373337,
-0.021551141515374184,
-0.05777643248438835,
-0.10559752583503723,
-0.05581885948777199,
-0.06745962798595428,
0.1584843397140503,
-0.060354236513376236,
0.027929171919822693,
0.11017199605703354,
-0.03332921862602234,
0.06370138376951218,
0.03956181928515434,
-0.008604058995842934,
-0.010656136088073254,
-0.019695589318871498,
-0.09341897070407867
] |
null | null |
transformers
|
# bert-base-uncased for QA
## Overview
**Language model:** bert-base-uncased
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Infrastructure:** 1x Tesla V100
## Hyperparameters
```
batch_size = 32
n_epochs = 3
base_LM_model = "bert-base-uncased"
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride = 128
max_query_length = 64
```
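Here `LinearWarmup` with `warmup_proportion = 0.2` means the learning rate ramps up linearly over the first 20% of the training steps. A minimal sketch of such a schedule (the linear decay after warmup is an assumption; the exact post-warmup behavior is not stated above):

```python
def linear_warmup_lr(step, total_steps, base_lr=3e-5, warmup_proportion=0.2):
    # Ramp up linearly for the first `warmup_proportion` of steps,
    # then decay linearly back to zero (assumed decay shape).
    warmup_steps = max(1, int(total_steps * warmup_proportion))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```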
## Performance
```
"exact": 73.67977764676156
"f1": 77.87647139308865
```
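The scores above are exact-match and F1 on the SQuAD 2.0 dev set. For inference, the model can be loaded with a standard Transformers question-answering pipeline; a minimal sketch (the question and context below are illustrative):

```python
from transformers import pipeline

# Model id taken from this card; inputs are illustrative.
qa = pipeline("question-answering", model="deepset/bert-base-uncased-squad2")
prediction = qa(
    question="What is the capital of France?",
    context="Paris is the capital and largest city of France.",
)
print(prediction["answer"], prediction["score"])
```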
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"], "model-index": [{"name": "deepset/bert-base-uncased-squad2", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 75.6529, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTY2YmQ0ZDFjMjRlZWRiZWQ2YWQ4MTM0ODkyYTQ0NmYwMzBlNWViZWQ0ODFhMGJmMmY4ZGYwOTQyMDAyZGNjYyIsInZlcnNpb24iOjF9.UyqonQTsCB0BW86LfPy17kLt3a4r3wMeh04MDam5t_UhElp6N02YpiKOqcb1ethNHjAR0WGyxrcV3TI4d-wFAQ"}, {"type": "f1", "value": 78.6191, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWRkZWVjMDU2YTcxYWVkZTU1YmUzY2FkNWI5NDJkM2YwMjFmMmE0Njc3MjI5N2Q0NDdhZDNkZWNjMWE5YTRmZiIsInZlcnNpb24iOjF9.ol0Zacd9ZryXazXjgVssGFYG4s5FzbhGGaj1ZEDLVN2ziyzx23bo4GH9PSuGTFxRK2BO5_dxvDupLRqJOF59Bg"}]}]}]}
|
question-answering
|
deepset/bert-base-uncased-squad2
|
[
"transformers",
"pytorch",
"safetensors",
"bert",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us
|
# bert-base-uncased for QA
## Overview
Language model: bert-base-uncased
Language: English
Downstream-task: Extractive QA
Training data: SQuAD 2.0
Eval data: SQuAD 2.0
Infrastructure: 1x Tesla v100
## Hyperparameters
## Performance
## Authors
- Timo Möller: 'timo.moeller [at] URL'
- Julian Risch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
- Michel Bartels: 'michel.bartels [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# bert-base-uncased for QA",
"## Overview\nLanguage model: bert-base-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# bert-base-uncased for QA",
"## Overview\nLanguage model: bert-base-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
62,
12,
51,
5,
2,
63,
129
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n# bert-base-uncased for QA## Overview\nLanguage model: bert-base-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nInfrastructure: 1x Tesla v100## Hyperparameters## Performance## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.054955240339040756,
0.13798126578330994,
-0.0019011758267879486,
0.041783835738897324,
0.08702215552330017,
-0.01555041316896677,
0.1482595056295395,
0.09094793349504471,
0.12331552803516388,
0.030578117817640305,
0.01530672237277031,
0.04332752898335457,
0.08677744120359421,
0.17502903938293457,
0.005801141262054443,
-0.1793920397758484,
-0.003802796360105276,
-0.08087325096130371,
-0.07942906022071838,
0.07143500447273254,
0.14988155663013458,
-0.10069624334573746,
0.09057160466909409,
0.006785176694393158,
0.0049651977606117725,
0.0043478733859956264,
-0.016376033425331116,
-0.06638327240943909,
0.10682567954063416,
0.07111477851867676,
0.05633332580327988,
-0.0027449815534055233,
0.006666807923465967,
-0.1718776375055313,
0.0434880331158638,
0.018211523070931435,
0.005860389210283756,
0.051333602517843246,
0.1003510057926178,
0.0039259591139853,
0.06494547426700592,
-0.022242344915866852,
0.012035020627081394,
0.09023479372262955,
-0.07424943894147873,
-0.12405576556921005,
-0.07929983735084534,
0.08806660771369934,
0.03559517115354538,
0.061974283307790756,
-0.017102576792240143,
0.07931428402662277,
-0.13678374886512756,
0.041975509375333786,
0.07847634702920914,
-0.2873469293117523,
-0.05574987456202507,
0.036775458604097366,
0.06869881600141525,
0.07135824114084244,
-0.1435679942369461,
0.05254485458135605,
0.02491648867726326,
0.01536698266863823,
-0.00558685977011919,
-0.01768822781741619,
-0.024547096341848373,
0.016798973083496094,
-0.08326943218708038,
0.01679491437971592,
0.21265898644924164,
0.030063049867749214,
-0.052327658981084824,
-0.18051466345787048,
0.001144774490967393,
0.16125604510307312,
-0.029958516359329224,
-0.009481614455580711,
0.01579720713198185,
-0.00899647455662489,
-0.016879642382264137,
-0.05726221576333046,
-0.09851962327957153,
0.014351538382470608,
-0.020717808976769447,
0.10206819325685501,
0.021956881508231163,
0.04995405673980713,
0.024681340903043747,
0.053332023322582245,
0.0019288966432213783,
-0.14217442274093628,
-0.03882844001054764,
-0.17431491613388062,
-0.04743500426411629,
-0.030584564432501793,
0.06113211065530777,
0.048365265130996704,
0.15609118342399597,
0.08982616662979126,
-0.08741917461156845,
-0.005269494839012623,
0.012621183879673481,
-0.004607511684298515,
0.05025773122906685,
0.1276872605085373,
-0.08340580761432648,
-0.20337654650211334,
0.0031405906192958355,
-0.028856495395302773,
-0.06594842672348022,
0.018354879692196846,
-0.03991999849677086,
-0.00027391736512072384,
-0.06537142395973206,
0.0349799208343029,
0.0648253858089447,
0.03559727594256401,
-0.08770482987165451,
-0.0813891813158989,
0.12193423509597778,
-0.06842723488807678,
0.011652348563075066,
0.025177625939249992,
-0.06906921416521072,
0.12047557532787323,
-0.10486866533756256,
0.056830648332834244,
-0.018398134037852287,
0.029921317473053932,
-0.05931348353624344,
-0.05750744417309761,
-0.08647526055574417,
-0.09988106787204742,
0.08348042517900467,
0.00020640587899833918,
0.016152819618582726,
-0.09533281624317169,
-0.08581116795539856,
-0.027135763317346573,
0.08698438107967377,
-0.03324472904205322,
-0.05961218103766441,
0.02142738737165928,
-0.02548035979270935,
0.026127930730581284,
-0.0171721950173378,
-0.0024033028166741133,
-0.06681597232818604,
0.03414100781083107,
-0.13978633284568787,
0.013257619924843311,
-0.0010553685715422034,
0.015979396179318428,
-0.07448729127645493,
-0.037914030253887177,
-0.17957539856433868,
0.04635295644402504,
-0.18945558369159698,
0.14171379804611206,
-0.14826369285583496,
-0.013011237606406212,
0.04218656197190285,
0.021052971482276917,
-0.03545021265745163,
0.1341940313577652,
-0.09237711876630783,
-0.046125560998916626,
0.17279286682605743,
-0.08034378290176392,
-0.07440720498561859,
0.13525860011577606,
-0.034049153327941895,
0.021934010088443756,
0.1300002485513687,
0.14523369073867798,
0.1361691802740097,
-0.15066808462142944,
-0.028893662616610527,
0.01644730567932129,
-0.007047335151582956,
-0.0137399323284626,
0.12438564747571945,
-0.06396207958459854,
-0.041689712554216385,
0.030430510640144348,
-0.06426943838596344,
0.018453646451234818,
-0.058851856738328934,
-0.05496128275990486,
0.06331052631139755,
-0.054040078073740005,
0.042539726942777634,
-0.0065320259891450405,
0.007725412026047707,
-0.07650893181562424,
-0.10444752126932144,
-0.007492863107472658,
0.0433339849114418,
0.019915372133255005,
-0.07194659113883972,
-0.051614910364151,
0.08175045996904373,
0.015211784280836582,
0.03020472079515457,
-0.058700740337371826,
-0.156368225812912,
0.06939249485731125,
-0.09037817269563675,
0.08563454449176788,
0.1238274797797203,
0.052804626524448395,
-0.010424409992992878,
-0.02699081040918827,
0.004650777205824852,
-0.04453326389193535,
-0.027051324024796486,
-0.008345992304384708,
-0.2287808507680893,
0.013603703118860722,
-0.09736882150173187,
0.06440412253141403,
-0.08037760853767395,
-0.026399383321404457,
0.06723067909479141,
0.07428492605686188,
0.060054488480091095,
-0.0023464150726795197,
-0.00501240324229002,
0.013114940375089645,
0.03312408924102783,
-0.010148844681680202,
0.027653491124510765,
-0.020123174414038658,
-0.04044918715953827,
0.09940978139638901,
0.06611916422843933,
0.18948832154273987,
0.06528361886739731,
0.008678143844008446,
-0.04660022631287575,
-0.11681359261274338,
-0.06928890943527222,
0.019077688455581665,
-0.07522497326135635,
-0.06610830128192902,
0.15011408925056458,
0.03211190924048424,
0.012298107147216797,
-0.08962205052375793,
-0.057636868208646774,
-0.042465850710868835,
-0.00892672874033451,
-0.0022636796347796917,
0.15433955192565918,
0.0014708703383803368,
-0.1462930291891098,
0.11156313866376877,
0.1204250305891037,
0.12125765532255173,
0.2663520872592926,
-0.04757538437843323,
-0.040581945329904556,
0.0163210891187191,
-0.0018990482203662395,
-0.07010602205991745,
0.1520572304725647,
-0.08331829309463501,
-0.006416722666472197,
0.07565914839506149,
-0.013456118293106556,
0.019322801381349564,
-0.043616387993097305,
-0.0024782486725598574,
-0.02965913899242878,
-0.0415952168405056,
0.0036856939550489187,
0.022086836397647858,
0.018025914207100868,
0.11613323539495468,
0.07899240404367447,
-0.032899849116802216,
-0.0016435198485851288,
-0.05557244271039963,
-0.06036197021603584,
0.1722678542137146,
-0.07378571480512619,
-0.12365861237049103,
-0.09278969466686249,
-0.06064196676015854,
-0.0951288715004921,
-0.02019168809056282,
0.03617250174283981,
-0.06665878742933273,
-0.12521293759346008,
-0.023111652582883835,
0.06587116420269012,
0.05375325307250023,
-0.0742272287607193,
-0.007467879913747311,
-0.007577565964311361,
-0.005046206060796976,
-0.14494910836219788,
-0.03205849975347519,
-0.021915066987276077,
-0.04427667334675789,
0.0010275811655446887,
0.02544797956943512,
0.07243629544973373,
0.045004356652498245,
0.02530977502465248,
0.006802812218666077,
-0.04205803573131561,
0.2876133322715759,
-0.11394631117582321,
0.07487108558416367,
0.11443375051021576,
0.03664456307888031,
0.06109321117401123,
0.2518393397331238,
0.10735446214675903,
-0.011854065582156181,
-0.006447130348533392,
0.009638870134949684,
0.013367103412747383,
-0.30752885341644287,
-0.11543725430965424,
-0.05612960457801819,
-0.014705791138112545,
-0.020334791392087936,
0.04007278382778168,
0.006711361929774284,
0.030420053750276566,
-0.08669915795326233,
-0.05569804087281227,
0.06273715198040009,
0.06366115063428879,
0.08067163825035095,
0.013720168732106686,
0.06816473603248596,
-0.03169264644384384,
-0.028435638174414635,
0.13369257748126984,
0.0647992193698883,
0.1685550957918167,
0.07119292765855789,
0.13900582492351532,
0.09019434452056885,
0.04729463532567024,
0.042866453528404236,
0.02893519401550293,
-0.029403559863567352,
0.02907458506524563,
-0.038917798548936844,
-0.06756015866994858,
0.05035343021154404,
0.07048821449279785,
0.028494887053966522,
-0.06505420058965683,
-0.026291372254490852,
-0.05656058341264725,
0.1265902817249298,
0.24457964301109314,
-0.0023574987426400185,
-0.10120932757854462,
-0.09388037770986557,
0.047165803611278534,
-0.10847175121307373,
-0.009997579269111156,
0.02107476443052292,
0.0623260997235775,
-0.17158785462379456,
0.04809727147221565,
0.0005861875251866877,
0.11228765547275543,
0.0050465078093111515,
0.02982441522181034,
0.060880161821842194,
0.03230613097548485,
-0.018786581233143806,
0.10329306870698929,
-0.1878194659948349,
0.24088944494724274,
0.026333346962928772,
0.07728099077939987,
-0.07095655053853989,
0.018350129947066307,
0.0233541801571846,
-0.07112379372119904,
0.14038656651973724,
-0.014573656022548676,
-0.04322131723165512,
-0.07643148303031921,
-0.10920523852109909,
0.05808201804757118,
0.029635988175868988,
-0.10103197395801544,
0.08207521587610245,
-0.013620656915009022,
-0.01022784411907196,
-0.026564378291368484,
0.007458205800503492,
-0.1404617577791214,
-0.12217891961336136,
0.02810523472726345,
-0.07620728760957718,
0.04193362221121788,
-0.06262156367301941,
-0.05692262575030327,
-0.13829730451107025,
0.1338111013174057,
-0.13310155272483826,
-0.11876563727855682,
-0.12947602570056915,
-0.030375352129340172,
0.06135516241192818,
-0.09820955246686935,
0.042717449367046356,
-0.029685458168387413,
0.06889589875936508,
-0.003172870259732008,
-0.07531458884477615,
0.05524248257279396,
-0.10148032009601593,
-0.1303180754184723,
-0.005500981118530035,
0.16023698449134827,
0.049258142709732056,
0.047751590609550476,
0.04752693697810173,
-0.050781045109033585,
-0.07039666175842285,
-0.15857471525669098,
-0.010509207844734192,
0.09035075455904007,
0.024890489876270294,
0.05723108351230621,
-0.10050436109304428,
-0.16121214628219604,
-0.06124002858996391,
-0.01374772097915411,
0.11680556833744049,
0.15889526903629303,
-0.06674280762672424,
0.1802430897951126,
0.20604197680950165,
-0.0329119935631752,
-0.22062623500823975,
-0.04291635751724243,
0.02824612893164158,
-0.011618039570748806,
0.006245754659175873,
-0.1657695323228836,
0.16076603531837463,
0.02633720263838768,
-0.0585206001996994,
0.016200052574276924,
-0.1688002347946167,
-0.09252745658159256,
0.0669533982872963,
-0.022633397951722145,
0.03032800555229187,
-0.11390038579702377,
-0.05911177396774292,
-0.042964234948158264,
-0.1445619761943817,
0.11286133527755737,
-0.1307414025068283,
0.041658271104097366,
0.028854992240667343,
0.06394466757774353,
0.017593996599316597,
-0.0780634954571724,
0.09115435183048248,
0.024073738604784012,
0.01792873814702034,
-0.04336979612708092,
0.0060969688929617405,
0.034601952880620956,
-0.006135794334113598,
0.054882198572158813,
-0.00952841155230999,
0.015139018185436726,
-0.13194377720355988,
-0.05766439437866211,
-0.06870651245117188,
0.1331605613231659,
-0.03995603695511818,
-0.07181783020496368,
-0.06391866505146027,
0.12449266761541367,
0.051086414605379105,
0.017874188721179962,
0.013230289332568645,
-0.062299735844135284,
0.10292451083660126,
0.0594763420522213,
0.17977145314216614,
-0.05071036145091057,
-0.02192222699522972,
-0.038196977227926254,
-0.026916537433862686,
0.08849221467971802,
-0.08680932223796844,
0.040482860058546066,
0.16346456110477448,
-0.007746860384941101,
0.0845416858792305,
-0.009411726146936417,
-0.10619556903839111,
0.01502392441034317,
0.06830199062824249,
-0.13442642986774445,
-0.19098611176013947,
-0.06385066360235214,
0.018938781693577766,
-0.061107978224754333,
0.017091093584895134,
0.14440380036830902,
0.015356370247900486,
-0.02754630148410797,
0.018596593290567398,
0.053134024143218994,
-0.0005539901903830469,
0.10885677486658096,
0.045296810567379,
0.026791337877511978,
-0.12060192972421646,
0.039941221475601196,
0.09374820441007614,
-0.03686872869729996,
-0.020323464646935463,
0.0567069873213768,
-0.060102175921201706,
-0.0671306774020195,
0.00927390530705452,
0.12954159080982208,
-0.09128552675247192,
-0.0427829846739769,
0.028957728296518326,
-0.13577228784561157,
0.008157684467732906,
0.06832225620746613,
0.023073583841323853,
0.005628914572298527,
0.05201330408453941,
-0.00567824300378561,
0.05172997713088989,
0.1400509774684906,
0.05753859132528305,
-0.024210594594478607,
-0.06940572708845139,
-0.05510322004556656,
-0.05445586144924164,
0.022939926013350487,
-0.039714131504297256,
-0.005792897194623947,
-0.1684829443693161,
-0.05338354781270027,
-0.14835426211357117,
0.020720820873975754,
-0.0416790172457695,
0.05152932181954384,
-0.04566819965839386,
-0.08407609909772873,
-0.02751162461936474,
0.00669255992397666,
-0.10459066182374954,
0.0229105856269598,
0.007461176253855228,
0.14965471625328064,
-0.15091603994369507,
-0.006642146967351437,
0.09431368857622147,
-0.002025127876549959,
0.12044990062713623,
0.004502734635025263,
-0.004725878592580557,
0.024162381887435913,
-0.1705273538827896,
-0.011289644986391068,
-0.055326394736766815,
0.03153049945831299,
0.0534525103867054,
-0.049754850566387177,
0.0011608880013227463,
0.014380075968801975,
-0.033625438809394836,
0.00502785062417388,
-0.060529742389917374,
-0.09370023757219315,
0.03359101712703705,
0.040490880608558655,
-0.09953341633081436,
-0.022095564752817154,
0.006024508737027645,
0.1383749097585678,
0.01689824089407921,
0.13177715241909027,
-0.06475301086902618,
0.05734739825129509,
-0.09838517755270004,
0.026817571371793747,
0.027599111199378967,
-0.031698551028966904,
-0.10126955807209015,
-0.01024387776851654,
0.04398122429847717,
-0.022309590131044388,
0.24394133687019348,
0.061547815799713135,
0.0272207111120224,
0.05346124619245529,
-0.04651046171784401,
-0.01681133545935154,
0.0258941650390625,
0.11081413924694061,
0.003131531411781907,
0.04971142113208771,
-0.0030783754773437977,
-0.07275039702653885,
-0.04196716845035553,
-0.03358564153313637,
0.15281367301940918,
0.27888017892837524,
0.08867435902357101,
-0.07101713120937347,
0.11271718144416809,
-0.04690423607826233,
-0.12987570464611053,
-0.04573643207550049,
0.030995560809969902,
0.07556065171957016,
-0.11581572890281677,
0.10795433819293976,
0.10024870932102203,
-0.16402527689933777,
0.06829365342855453,
-0.018016034737229347,
-0.022831987589597702,
-0.07421284914016724,
-0.11195694655179977,
-0.0789353996515274,
-0.09465248882770538,
0.026399947702884674,
-0.1510450392961502,
0.038622207939624786,
0.054251719266176224,
0.06017005071043968,
-0.07921430468559265,
0.07148858159780502,
-0.0979001596570015,
-0.06510432809591293,
0.12588120996952057,
0.03902881219983101,
0.02521318756043911,
-0.02863568440079689,
-0.0225551538169384,
-0.10456933081150055,
0.09388578683137894,
-0.02909609116613865,
0.03779347613453865,
-0.08023512363433838,
-0.060220759361982346,
-0.051003597676754,
-0.0696331337094307,
-0.03929532319307327,
0.016408056020736694,
-0.011862155981361866,
0.13282737135887146,
0.008768977597355843,
0.0017051524482667446,
0.021786686033010483,
0.21529801189899445,
-0.007278618402779102,
-0.11053633689880371,
-0.14192774891853333,
0.0417385995388031,
-0.048810988664627075,
0.03343365713953972,
0.00595560809597373,
-0.09643121808767319,
0.00570844067260623,
0.2063993662595749,
0.24538813531398773,
-0.11501143872737885,
0.010350736789405346,
-0.03775356709957123,
0.017304228618741035,
0.02229810319840908,
0.11544531583786011,
0.009743176400661469,
0.3051148056983948,
-0.03309495747089386,
0.04337478429079056,
0.037815336138010025,
0.022218899801373482,
-0.1244708001613617,
0.11818578094244003,
0.02886432409286499,
-0.0842665508389473,
-0.10535356402397156,
0.15389671921730042,
-0.10771059989929199,
-0.14700473845005035,
-0.090716652572155,
-0.07013175636529922,
-0.14544677734375,
-0.05563150718808174,
0.02613319270312786,
0.06878484785556793,
0.04614686965942383,
0.0004017421160824597,
-0.05302656441926956,
0.13077597320079803,
0.03705896809697151,
-0.09104927629232407,
-0.038322996348142624,
0.180592879652977,
-0.01764017716050148,
0.2119038850069046,
-0.0053795939311385155,
0.033745087683200836,
0.10143332928419113,
-0.033444736152887344,
-0.09825140237808228,
-0.054298240691423416,
0.08627687394618988,
-0.21924270689487457,
0.007105057593435049,
0.02616344764828682,
0.006318224593997002,
0.08661268651485443,
0.08388252556324005,
-0.057766932994127274,
0.021901646628975868,
0.10976282507181168,
-0.03966186195611954,
-0.1017574593424797,
0.11108751595020294,
-0.12551817297935486,
0.1321847289800644,
0.13659532368183136,
-0.050015222281217575,
-0.026076337322592735,
-0.0363854318857193,
0.06882841885089874,
0.008738887496292591,
0.023062212392687798,
-0.05780160427093506,
-0.17614677548408508,
-0.005024662707000971,
-0.02889559231698513,
0.05578280985355377,
-0.09297962486743927,
-0.012420379556715488,
-0.051396578550338745,
0.12256214767694473,
-0.04639550670981407,
0.10119704157114029,
0.13406360149383545,
-0.028711393475532532,
0.0009396314271725714,
-0.039027974009513855,
-0.018437553197145462,
0.08104303479194641,
-0.08182588964700699,
-0.019183360040187836
] |
null | null |
transformers
|
# bert-large-uncased-whole-word-masking-squad2
This is a bert-large model, fine-tuned using the SQuAD2.0 dataset for the task of question answering.
## Overview
**Language model:** bert-large
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system)
## Usage
### In Haystack
Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
from haystack.nodes import FARMReader, TransformersReader  # import path assumes Haystack v1.x

reader = FARMReader(model_name_or_path="deepset/bert-large-uncased-whole-word-masking-squad2")
# or
reader = TransformersReader(model_name_or_path="deepset/bert-large-uncased-whole-word-masking-squad2", tokenizer="deepset/bert-large-uncased-whole-word-masking-squad2")
```
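Once loaded, the reader is typically combined with a retriever in an end-to-end pipeline. A minimal sketch, assuming the Haystack v1.x API (the document store, retriever choice, and sample document are illustrative):

```python
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import FARMReader, TfidfRetriever
from haystack.pipelines import ExtractiveQAPipeline

# Illustrative setup: a tiny in-memory store with a single document.
document_store = InMemoryDocumentStore()
document_store.write_documents(
    [{"content": "Haystack is an NLP framework by deepset."}]
)

retriever = TfidfRetriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/bert-large-uncased-whole-word-masking-squad2")

pipe = ExtractiveQAPipeline(reader, retriever)
prediction = pipe.run(
    query="Who develops Haystack?",
    params={"Retriever": {"top_k": 1}, "Reader": {"top_k": 1}},
)
print(prediction["answers"][0].answer)
```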
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/bert-large-uncased-whole-word-masking-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/) which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"], "model-index": [{"name": "deepset/bert-large-uncased-whole-word-masking-squad2", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 80.8846, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2E5ZGNkY2ExZWViZGEwNWE3OGRmMWM2ZmE4ZDU4ZDQ1OGM3ZWE0NTVmZjFmYmZjZmJmNjJmYTc3NTM3OTk3OSIsInZlcnNpb24iOjF9.aSblF4ywh1fnHHrN6UGL392R5KLaH3FCKQlpiXo_EdQ4XXEAENUCjYm9HWDiFsgfSENL35GkbSyz_GAhnefsAQ"}, {"type": "f1", "value": 83.8765, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGFlNmEzMTk2NjRkNTI3ZTk3ZTU1NWNlYzIyN2E0ZDFlNDA2ZjYwZWJlNThkMmRmMmE0YzcwYjIyZDM5NmRiMCIsInZlcnNpb24iOjF9.-rc2_Bsp_B26-o12MFYuAU0Ad2Hg9PDx7Preuk27WlhYJDeKeEr32CW8LLANQABR3Mhw2x8uTYkEUrSDMxxLBw"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad", "type": "squad", "config": "plain_text", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 85.904, "name": "Exact Match"}, {"type": "f1", "value": 92.586, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "adversarial_qa", "type": "adversarial_qa", "config": "adversarialQA", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 28.233, "name": "Exact Match"}, {"type": "f1", "value": 41.17, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_adversarial", "type": "squad_adversarial", "config": "AddOneSent", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 78.064, "name": "Exact Match"}, {"type": "f1", "value": 83.591, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts amazon", "type": "squadshifts", "config": "amazon", "split": "test"}, "metrics": [{"type": "exact_match", "value": 65.615, "name": "Exact Match"}, {"type": "f1", "value": 80.733, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts new_wiki", "type": "squadshifts", "config": "new_wiki", "split": "test"}, "metrics": [{"type": "exact_match", "value": 81.57, "name": "Exact Match"}, {"type": "f1", "value": 91.199, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts nyt", "type": "squadshifts", "config": "nyt", "split": "test"}, "metrics": [{"type": "exact_match", "value": 83.279, "name": "Exact Match"}, {"type": "f1", "value": 91.09, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts reddit", "type": "squadshifts", "config": "reddit", "split": "test"}, "metrics": [{"type": "exact_match", "value": 69.305, "name": "Exact Match"}, {"type": "f1", "value": 82.405, "name": "F1"}]}]}]}
|
question-answering
|
deepset/bert-large-uncased-whole-word-masking-squad2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us
|
# bert-large-uncased-whole-word-masking-squad2
This is a bert-large model, fine-tuned using the SQuAD2.0 dataset for the task of question answering.
## Overview
Language model: bert-large
Language: English
Downstream-task: Extractive QA
Training data: SQuAD 2.0
Eval data: SQuAD 2.0
Code: See an example QA pipeline on Haystack
## Usage
### In Haystack
Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in Haystack:
### In Transformers
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# bert-large-uncased-whole-word-masking-squad2\n\nThis is a berta-large model, fine-tuned using the SQuAD2.0 dataset for the task of question answering.",
"## Overview\nLanguage model: bert-large \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack",
"## Usage",
"### In Haystack\nHaystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in Haystack:",
"### In Transformers",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2\n\nThis is a berta-large model, fine-tuned using the SQuAD2.0 dataset for the task of question answering.",
"## Overview\nLanguage model: bert-large \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack",
"## Usage",
"### In Haystack\nHaystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in Haystack:",
"### In Transformers",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
68,
51,
53,
3,
51,
6,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n# bert-large-uncased-whole-word-masking-squad2\n\nThis is a berta-large model, fine-tuned using the SQuAD2.0 dataset for the task of question answering.## Overview\nLanguage model: bert-large \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack## Usage### In Haystack\nHaystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in Haystack:### In Transformers## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")"
] |
[
-0.09096361696720123,
0.1670721471309662,
-0.004543938208371401,
0.037845827639102936,
0.09551984816789627,
0.04824783653020859,
0.11570664495229721,
0.12518665194511414,
0.08544853329658508,
0.1162230521440506,
0.008697941899299622,
0.040784865617752075,
0.07938652485609055,
0.08572381734848022,
0.029929591342806816,
-0.2035992443561554,
0.00022744912712369114,
-0.1258096992969513,
-0.039119064807891846,
0.05774562433362007,
0.0904911607503891,
-0.08511709421873093,
0.08319734036922455,
-0.01817377656698227,
-0.029089583083987236,
0.01444860640913248,
-0.04285372421145439,
-0.020044706761837006,
0.06592821329832077,
0.05706164985895157,
0.02234148234128952,
0.0005392554448917508,
0.03537403792142868,
-0.20533937215805054,
0.029281897470355034,
0.07602519541978836,
0.016422607004642487,
0.08641175925731659,
0.07609792798757553,
-0.0019967439584434032,
0.03557552397251129,
-0.1518384963274002,
0.005522338207811117,
0.07833027094602585,
-0.0467543751001358,
-0.1654151976108551,
-0.11362844705581665,
0.12192708998918533,
0.04055961221456528,
0.024229060858488083,
-0.009318220429122448,
0.04691004753112793,
-0.03534546494483948,
0.05491957813501358,
0.15882597863674164,
-0.2779981791973114,
-0.03445570543408394,
0.059872061014175415,
0.022463856264948845,
0.02273022197186947,
-0.10563433170318604,
0.02888486348092556,
-0.03844105452299118,
0.024388905614614487,
-0.03674107789993286,
-0.039370663464069366,
0.03556058928370476,
-0.008852796629071236,
-0.06610750406980515,
0.002357631456106901,
0.13173823058605194,
0.03424150496721268,
-0.041760124266147614,
-0.205733522772789,
-0.016757531091570854,
0.1012389212846756,
-0.055285289883613586,
0.014644337818026543,
0.01809314265847206,
-0.039260540157556534,
0.03210718557238579,
-0.09872536361217499,
-0.09012991189956665,
0.030756786465644836,
0.0014919140376150608,
0.022945327684283257,
0.032471779733896255,
0.014292384497821331,
0.02305091917514801,
0.09779093414545059,
-0.0776468962430954,
-0.1456168293952942,
-0.003236051881685853,
-0.07446549087762833,
-0.14305533468723297,
-0.018680084496736526,
-0.012673016637563705,
-0.03950634226202965,
0.11946754902601242,
0.17950604856014252,
-0.0455608144402504,
0.06297608464956284,
-0.010432761162519455,
0.008030526340007782,
0.025804150849580765,
0.14042538404464722,
-0.08919219672679901,
-0.16130474209785461,
0.012906458228826523,
-0.014033463783562183,
-0.00889083556830883,
-0.011053316295146942,
-0.07383735477924347,
-0.0432235524058342,
-0.04583193361759186,
0.0553768016397953,
0.06589216738939285,
0.03350786119699478,
-0.03704235330224037,
-0.07053742557764053,
0.15257738530635834,
-0.1345900148153305,
0.06154865399003029,
0.06641180068254471,
-0.006706328596919775,
0.09650896489620209,
-0.012967412360012531,
0.011861474253237247,
-0.07366565614938736,
0.03942427784204483,
-0.03472672402858734,
-0.017648005858063698,
-0.05632192641496658,
-0.07979030162096024,
0.03126126527786255,
-0.030181091278791428,
-0.06460008025169373,
-0.05677248537540436,
-0.07679704576730728,
-0.04710548371076584,
0.0635470598936081,
-0.043727390468120575,
-0.015996407717466354,
-0.008158573880791664,
-0.03697817400097847,
0.0606737919151783,
0.016119014471769333,
0.06929489970207214,
-0.03540055826306343,
0.02618795447051525,
-0.12187574803829193,
0.03509693592786789,
0.03291366994380951,
0.015466502867639065,
-0.028314750641584396,
0.017504515126347542,
-0.2291359156370163,
0.11192665249109268,
-0.1042354479432106,
0.0220969095826149,
-0.1460254192352295,
-0.02767142280936241,
0.06439869850873947,
0.019744034856557846,
-0.01663069985806942,
0.11702790856361389,
-0.13378694653511047,
-0.035105329006910324,
0.13466300070285797,
-0.021160919219255447,
-0.05731959640979767,
0.09626022726297379,
-0.04114719480276108,
0.0074725463055074215,
0.0648103877902031,
0.12448722124099731,
0.16186247766017914,
-0.12234863638877869,
-0.09531361609697342,
0.050279390066862106,
-0.010142479091882706,
0.09182387590408325,
0.07543711364269257,
-0.07346775382757187,
0.06832210719585419,
0.0356682725250721,
-0.05412575602531433,
-0.007305104751139879,
0.00778989540413022,
-0.030347827821969986,
0.005848143715411425,
-0.01816522143781185,
0.025289766490459442,
-0.054971229285001755,
-0.015334825031459332,
-0.022581659257411957,
-0.1356995403766632,
0.0059177023358643055,
0.08757992833852768,
-0.021780934184789658,
0.005468192510306835,
-0.05350768193602562,
0.030464062467217445,
-0.025187065824866295,
0.017012732103466988,
-0.13994817435741425,
-0.1473657190799713,
0.04359704256057739,
-0.0985422432422638,
0.07464569807052612,
0.04613237828016281,
0.05492010340094566,
0.05200124904513359,
-0.013980845920741558,
-0.0337924063205719,
0.006944986060261726,
-0.01884225197136402,
0.005564361345022917,
-0.18036240339279175,
-0.0696890726685524,
-0.03224826976656914,
0.12814974784851074,
-0.07340522110462189,
0.015376333147287369,
0.040910135954618454,
0.13028845191001892,
0.03205343708395958,
-0.032187122851610184,
0.038078900426626205,
-0.035480376332998276,
0.017843324691057205,
-0.0510571114718914,
0.01281837560236454,
-0.006947592832148075,
-0.0640682652592659,
0.024511082097887993,
-0.1115955114364624,
-0.059070561081171036,
0.03173236548900604,
0.07766254246234894,
-0.06983263045549393,
-0.06278528273105621,
-0.032162901014089584,
-0.026659149676561356,
-0.052847400307655334,
-0.07433366030454636,
0.14591631293296814,
0.061806920915842056,
0.07097946852445602,
-0.05353783071041107,
-0.07776129990816116,
-0.02451261132955551,
0.00826185755431652,
0.012585139833390713,
0.10989902913570404,
-0.025456661358475685,
-0.11389277875423431,
0.07777107506990433,
0.13725130259990692,
0.05158250778913498,
0.18317481875419617,
-0.013169354759156704,
-0.07408410310745239,
-0.04224930703639984,
0.043040141463279724,
0.00708678737282753,
0.04131532087922096,
-0.022978495806455612,
0.03253095597028732,
0.08064427971839905,
0.004353460390120745,
-0.0026786925736814737,
-0.04494372382760048,
0.04789732024073601,
-0.03837835043668747,
-0.02966289594769478,
0.02398970164358616,
0.054098039865493774,
0.07818109542131424,
0.08461815863847733,
0.04291592538356781,
0.060685042291879654,
-0.00024578464217484,
-0.06079433858394623,
-0.06478523463010788,
0.16969311237335205,
-0.1066921129822731,
-0.2051319032907486,
-0.143034890294075,
-0.09701457619667053,
-0.1111951470375061,
-0.03758798539638519,
0.04085676744580269,
-0.06515686959028244,
-0.07943577319383621,
-0.05727987363934517,
0.06729792058467865,
0.08706602454185486,
-0.0623280331492424,
-0.06584355235099792,
0.019855540245771408,
0.06816402077674866,
-0.13616149127483368,
-0.009474369697272778,
0.03183954209089279,
-0.07341253757476807,
-0.0185635257512331,
0.10281740874052048,
0.03808069974184036,
0.0660616010427475,
0.05697735771536827,
0.017491910606622696,
0.021408123895525932,
0.1942637711763382,
-0.10801036655902863,
0.11371766030788422,
0.12032382190227509,
-0.048476170748472214,
0.07295961678028107,
0.14456391334533691,
0.07410415261983871,
0.002131069777533412,
-0.010703541338443756,
0.05317310243844986,
-0.004310494754463434,
-0.21980951726436615,
-0.08409617841243744,
-0.045565165579319,
-0.016503583639860153,
0.03669050708413124,
0.055690232664346695,
-0.08990649878978729,
0.01697871834039688,
-0.0998966172337532,
-0.022135717794299126,
0.04775938391685486,
0.06394223868846893,
0.05575531721115112,
0.009531918913125992,
0.04516949504613876,
-0.06283888965845108,
-0.01783670112490654,
0.11086271703243256,
0.07921747118234634,
0.08084007352590561,
-0.025837495923042297,
0.1802694946527481,
0.038283735513687134,
0.05790354683995247,
-0.023457657545804977,
0.015591351315379143,
-0.0033115779515355825,
0.006157238967716694,
-0.006434113718569279,
-0.0996813103556633,
0.0545056089758873,
0.08221390098333359,
0.08866582810878754,
-0.015352514572441578,
-0.01438251044601202,
0.0019805054180324078,
0.1031746193766594,
0.2077811360359192,
0.07658794522285461,
-0.12245526909828186,
-0.046865180134773254,
0.050580985844135284,
-0.05626358836889267,
-0.08040334284305573,
-0.014095226302742958,
0.1018073633313179,
-0.17396238446235657,
0.08957168459892273,
-0.020980361849069595,
0.07543263584375381,
-0.07416978478431702,
0.016886860132217407,
0.036600034683942795,
0.1652768850326538,
-0.009141627699136734,
0.09826350957155228,
-0.14889098703861237,
0.05013265088200569,
0.03922111541032791,
0.016877679154276848,
-0.07754168659448624,
0.05502723529934883,
0.05440973863005638,
-0.0515563003718853,
0.16664129495620728,
-0.007652061991393566,
0.0005696593434549868,
-0.051844071596860886,
-0.11685539036989212,
0.019183460623025894,
0.0974668487906456,
-0.130197674036026,
0.06838018447160721,
-0.012653244659304619,
-0.06277990341186523,
-0.057315923273563385,
0.05313429608941078,
-0.0825912207365036,
-0.16973017156124115,
0.04384458437561989,
-0.03601457551121712,
-0.023322120308876038,
-0.03544622287154198,
0.002083922503516078,
-0.14879000186920166,
0.20760612189769745,
-0.1387358158826828,
-0.090287946164608,
-0.11734099686145782,
-0.0036881130654364824,
0.13718527555465698,
-0.09544137865304947,
0.027106085792183876,
-0.04064327850937843,
0.11208166182041168,
-0.023531006649136543,
-0.10107531398534775,
0.056510139256715775,
-0.07421255111694336,
-0.14270222187042236,
-0.027399739250540733,
0.15841805934906006,
0.008003047667443752,
0.02974805235862732,
-0.009000190533697605,
0.016411874443292618,
-0.027634410187602043,
-0.11295895278453827,
0.013978430069983006,
0.12418215721845627,
0.07644739001989365,
0.12230423092842102,
-0.11778903752565384,
-0.052728861570358276,
-0.02665472775697708,
0.051137182861566544,
0.13295577466487885,
0.21434539556503296,
-0.09332555532455444,
0.157138854265213,
0.08757035434246063,
-0.05191781371831894,
-0.2309674769639969,
-0.029842618852853775,
0.06837131828069687,
0.023412615060806274,
0.07991740852594376,
-0.17039445042610168,
0.16352596879005432,
0.05758489668369293,
-0.01719547063112259,
0.08037929236888885,
-0.22014959156513214,
-0.10749658197164536,
0.05956663936376572,
-0.03402971848845482,
-0.12583135068416595,
-0.12140575796365738,
-0.05955154448747635,
-0.06954922527074814,
-0.07112325727939606,
0.07839357852935791,
-0.11236422508955002,
0.07373058050870895,
0.02353908121585846,
0.048313431441783905,
0.042190082371234894,
-0.06391144543886185,
0.11583973467350006,
0.054791904985904694,
0.03483846038579941,
-0.06045078858733177,
-0.014689872041344643,
0.02659636363387108,
-0.07895027101039886,
0.09029441326856613,
-0.06737504154443741,
0.04926016926765442,
-0.0965566337108612,
0.005197476129978895,
-0.07933729141950607,
0.08404840528964996,
-0.0672387033700943,
-0.038424454629421234,
-0.04609926417469978,
0.08206269890069962,
0.05235496163368225,
0.0064808982424438,
0.09294053167104721,
-0.03143549710512161,
0.038722068071365356,
0.12267794460058212,
0.11254695057868958,
0.02968345396220684,
-0.15462124347686768,
-0.04254048690199852,
-0.01396534126251936,
0.03736088424921036,
-0.10563591122627258,
0.06512744724750519,
0.08002917468547821,
0.008543631061911583,
0.1304531693458557,
-0.026696255430579185,
-0.11756806820631027,
-0.006387111730873585,
0.07266802340745926,
-0.11814402788877487,
-0.17446459829807281,
-0.04613644257187843,
0.026179606094956398,
-0.11374687403440475,
-0.025090422481298447,
0.16667385399341583,
0.05052344873547554,
-0.03180718049407005,
0.03857416287064552,
0.05467042326927185,
-0.013072540983557701,
0.0722600594162941,
0.023613663390278816,
0.05914874002337456,
-0.10086079686880112,
0.0759042352437973,
0.06844498962163925,
-0.022767219692468643,
0.01915927790105343,
0.15639953315258026,
-0.0935671254992485,
-0.06654085218906403,
-0.050458766520023346,
0.1132735088467598,
-0.021790262311697006,
-0.0005486494046635926,
0.015172459185123444,
-0.040903788059949875,
0.028161948546767235,
-0.024845663458108902,
0.015678932890295982,
-0.014054798521101475,
-0.00654549291357398,
-0.030660148710012436,
-0.013662505894899368,
0.13061854243278503,
0.009227201342582703,
0.027642132714390755,
-0.08369886875152588,
0.0025080309715121984,
-0.020914573222398758,
0.039985477924346924,
-0.0006314117927104235,
-0.022946706041693687,
-0.0962754562497139,
-0.036036014556884766,
-0.18808400630950928,
0.013659370131790638,
-0.07973171770572662,
0.009036863222718239,
-0.01519277598708868,
-0.012118794023990631,
0.028390608727931976,
0.0033084717579185963,
-0.0675591453909874,
-0.0796855166554451,
-0.03955864906311035,
0.1257856786251068,
-0.14678426086902618,
0.008907905779778957,
0.03216841071844101,
-0.08207499235868454,
0.1530611366033554,
0.013494493439793587,
-0.030885711312294006,
0.011500800959765911,
-0.09532324224710464,
-0.08624906837940216,
-0.03959498926997185,
0.046979643404483795,
0.036601778119802475,
-0.10647890716791153,
0.010896680876612663,
-0.06288831681013107,
-0.03873135522007942,
-0.01013140007853508,
0.0019980513025075197,
-0.10272175073623657,
0.11376059055328369,
-0.024516893550753593,
-0.01681104302406311,
-0.08539889752864838,
0.019440650939941406,
0.05795807018876076,
0.03030703030526638,
0.10924604535102844,
-0.0719514787197113,
0.08268643915653229,
-0.12540043890476227,
-0.027887554839253426,
0.01424119807779789,
-0.03469272702932358,
-0.05145158991217613,
-0.03146527335047722,
0.06559007614850998,
-0.04240305721759796,
0.0972539409995079,
-0.06710841506719589,
0.01606900244951248,
0.032219573855400085,
0.006588649936020374,
-0.045816827565431595,
0.03265433758497238,
0.0501813180744648,
-0.010150239802896976,
0.017994966357946396,
-0.007413015700876713,
-0.004769371822476387,
-0.0053749713115394115,
0.03659724444150925,
0.12560197710990906,
0.1648544818162918,
0.08419032394886017,
-0.003028041450306773,
0.10366057604551315,
-0.08512555807828903,
-0.09399370104074478,
0.1601548045873642,
-0.03695381060242653,
0.09745802730321884,
-0.09503950923681259,
-0.013893691822886467,
0.10092531144618988,
-0.17941537499427795,
0.0729273334145546,
0.020994320511817932,
-0.04220006242394447,
-0.08337965607643127,
-0.14972993731498718,
-0.07215260714292526,
-0.0068145496770739555,
0.006041341926902533,
-0.10524562746286392,
0.061841510236263275,
0.09487009793519974,
0.04681568965315819,
0.0066675180569291115,
0.1318579465150833,
-0.10792303830385208,
-0.0652294009923935,
0.09167303144931793,
0.04739801958203316,
0.030394554138183594,
0.05491561070084572,
0.006829617545008659,
0.004592916928231716,
0.04069004952907562,
0.051266491413116455,
0.06437984108924866,
0.03621020168066025,
-0.015088495798408985,
-0.07851719111204147,
-0.09682675451040268,
-0.007588108070194721,
-0.0044494192115962505,
-0.018723413348197937,
0.09908390045166016,
0.057510070502758026,
-0.03655695542693138,
-0.03860878571867943,
0.1580076366662979,
-0.070462666451931,
-0.09245819598436356,
-0.14274287223815918,
0.08711789548397064,
-0.03439594432711601,
0.01466337963938713,
0.0197947695851326,
-0.11511436849832535,
-0.03958495706319809,
0.09874677658081055,
0.24246187508106232,
-0.10283359885215759,
0.022783083841204643,
0.042295753955841064,
0.002474808134138584,
-0.03101552091538906,
0.10257598757743835,
0.02583801932632923,
0.23261693120002747,
-0.03373502194881439,
0.05947835370898247,
0.041586026549339294,
-0.05366605147719383,
-0.11094500124454498,
0.1268940567970276,
-0.0357968769967556,
-0.017380714416503906,
-0.0017622570740059018,
0.10508225858211517,
-0.06409481912851334,
-0.24069000780582428,
-0.024815982207655907,
-0.07346858829259872,
-0.1671379953622818,
-0.029101502150297165,
0.03312025964260101,
0.04067433625459671,
0.036371566355228424,
-0.008917107246816158,
-0.056772951036691666,
0.18440471589565277,
-0.019105365499854088,
-0.020442118868231773,
-0.03660915791988373,
0.06923259794712067,
-0.0710671991109848,
0.12858256697654724,
0.0343402698636055,
0.06858345866203308,
0.10783837735652924,
-0.02851245366036892,
-0.10074086487293243,
-0.028638219460844994,
0.08102114498615265,
-0.13773104548454285,
0.028991691768169403,
0.10815012454986572,
-0.012770649045705795,
0.11866271495819092,
0.09168202430009842,
-0.09621033817529678,
0.040001124143600464,
0.06598219275474548,
-0.03386002033948898,
-0.10183160752058029,
0.07052838802337646,
-0.07429873198270798,
0.14535531401634216,
0.15466052293777466,
-0.008389666676521301,
-0.01569712720811367,
-0.025351211428642273,
0.03799423947930336,
0.03709333389997482,
0.07384980469942093,
-0.05382737144827843,
-0.1221255287528038,
0.06172432005405426,
-0.06889673322439194,
0.09084801375865936,
-0.14904700219631195,
-0.06585235148668289,
0.041441213339567184,
0.045368216931819916,
-0.06009003892540932,
0.12703746557235718,
0.05352536216378212,
0.007719038054347038,
-0.012456797063350677,
-0.03753921017050743,
-0.010450550355017185,
0.10063362121582031,
-0.11703325808048248,
-0.04631619527935982
] |
null | null |
transformers
|
## Overview
**Language model:** deepset/bert-medium-squad2-distilled
**Language:** English
**Training data:** SQuAD 2.0 training set
**Eval data:** SQuAD 2.0 dev set
**Infrastructure**: 1x V100 GPU
**Published**: Apr 21st, 2021
## Details
- haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.
## Hyperparameters
```
batch_size = 6
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 5
distillation_loss_weight = 1
```
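The `temperature` and `distillation_loss_weight` above parameterize a standard soft-label distillation objective. The snippet below is not the training code itself, just a minimal PyTorch sketch of how these two hyperparameters typically enter the loss (tensor names are hypothetical):
```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=5.0, distillation_loss_weight=1.0):
    # soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across temperatures
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # hard targets: ordinary cross-entropy against the gold labels
    hard = F.cross_entropy(student_logits, labels)
    # with distillation_loss_weight = 1 (as above), only the soft term remains
    return distillation_loss_weight * soft + (1 - distillation_loss_weight) * hard
```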
## Performance
```
"exact": 68.6431398972458
"f1": 72.7637083790805
```
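## Usage
A minimal usage sketch, assuming the standard Transformers question-answering pipeline:
```python
from transformers import pipeline

model_name = "deepset/bert-medium-squad2-distilled"
nlp = pipeline("question-answering", model=model_name, tokenizer=model_name)
res = nlp(question="Why is model conversion important?",
          context="The option to convert models between FARM and transformers gives freedom to the user.")
print(res)
```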
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "mit", "tags": ["exbert"], "datasets": ["squad_v2"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg", "model-index": [{"name": "deepset/bert-medium-squad2-distilled", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 69.8231, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmE4MGRkZTVjNmViMGNjYjVhY2E1NzcyOGQ1OWE1MWMzMjY5NWU0MmU0Y2I4OWU4YTU5OWQ5YTI2NWE1NmM0ZSIsInZlcnNpb24iOjF9.tnCJvWzMctTwiQu5yig_owO2ZI1t1MZz1AN2lQy4COAGOzuMovD-74acQvMbxJQoRfNNkIetz2hqYivf1lJKDw"}, {"type": "f1", "value": 72.9232, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTMwNzk0ZDRjNGUyMjQyNzc1NzczZmUwMTU2MTM5MGQ3M2NhODlmOTU4ZDI0YjhlNTVjNDA1MGEwM2M1MzIyZSIsInZlcnNpb24iOjF9.eElGmTOXH_qHTNaPwZ-dUJfVz9VMvCutDCof_6UG_625MwctT_j7iVkWcGwed4tUnunuq1BPm-0iRh1RuuB-AQ"}]}]}]}
|
question-answering
|
deepset/bert-medium-squad2-distilled
|
[
"transformers",
"pytorch",
"safetensors",
"bert",
"question-answering",
"exbert",
"en",
"dataset:squad_v2",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #safetensors #bert #question-answering #exbert #en #dataset-squad_v2 #license-mit #model-index #endpoints_compatible #has_space #region-us
|
## Overview
Language model: deepset/roberta-base-squad2-distilled
Language: English
Training data: SQuAD 2.0 training set
Eval data: SQuAD 2.0 dev set
Infrastructure: 1x V100 GPU
Published: Apr 21st, 2021
## Details
- haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.
## Hyperparameters
## Performance
## Authors
- Timo Möller: 'timo.moeller [at] URL'
- Julian Risch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
- Michel Bartels: 'michel.bartels [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: deepset/roberta-base-squad2-distilled \nLanguage: English \nTraining data: SQuAD 2.0 training set \nEval data: SQuAD 2.0 dev set \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.",
"## Hyperparameters",
"## Performance",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #bert #question-answering #exbert #en #dataset-squad_v2 #license-mit #model-index #endpoints_compatible #has_space #region-us \n",
"## Overview\nLanguage model: deepset/roberta-base-squad2-distilled \nLanguage: English \nTraining data: SQuAD 2.0 training set \nEval data: SQuAD 2.0 dev set \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.",
"## Hyperparameters",
"## Performance",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
61,
57,
47,
5,
2,
63,
129
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #bert #question-answering #exbert #en #dataset-squad_v2 #license-mit #model-index #endpoints_compatible #has_space #region-us \n## Overview\nLanguage model: deepset/roberta-base-squad2-distilled \nLanguage: English \nTraining data: SQuAD 2.0 training set \nEval data: SQuAD 2.0 dev set \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021## Details\n- haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.## Hyperparameters## Performance## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.03329724445939064,
0.08736185729503632,
-0.0030456525273621082,
0.06409167498350143,
0.07546447962522507,
0.006702073384076357,
0.1453578770160675,
0.09575534611940384,
0.1008153110742569,
0.07329614460468292,
0.0065531181171536446,
-0.03821446746587753,
0.09345376491546631,
0.12345194071531296,
0.03557119518518448,
-0.20919585227966309,
0.0037311033811420202,
-0.10852610319852829,
-0.11380473524332047,
0.10056903958320618,
0.12801489233970642,
-0.09695252776145935,
0.10823915898799896,
-0.013436486013233662,
-0.0133076636120677,
0.0195449385792017,
-0.024569327011704445,
-0.024653982371091843,
0.11395161598920822,
0.04933525621891022,
0.08224564790725708,
-0.006030228454619646,
0.023461418226361275,
-0.20939871668815613,
0.031669262796640396,
0.07256097346544266,
0.021011095494031906,
0.0391099639236927,
0.03864426910877228,
0.00040272276964969933,
-0.02353634499013424,
-0.11608049273490906,
0.04036795720458031,
0.058163922280073166,
-0.11833624541759491,
-0.12956848740577698,
-0.1312899887561798,
0.06291557848453522,
0.028134038671851158,
0.03178718686103821,
-0.02768709324300289,
0.06140947341918945,
-0.1344103366136551,
0.028661111369729042,
0.09860016405582428,
-0.3227139413356781,
-0.06307519972324371,
0.003632645821198821,
0.04944152757525444,
0.08806713670492172,
-0.15093334019184113,
0.048362940549850464,
0.023687047883868217,
0.018512075766921043,
0.0069354986771941185,
-0.004721829667687416,
0.06666313111782074,
0.004649735521525145,
-0.07275380939245224,
-0.020373733714222908,
0.12062682956457138,
0.01507245097309351,
-0.04338773712515831,
-0.20613403618335724,
-0.0010396743891760707,
0.07674408704042435,
-0.012783367186784744,
-0.05948429927229881,
0.0072001684457063675,
-0.019355054944753647,
-0.034371230751276016,
-0.045628637075424194,
-0.0912441611289978,
0.022521184757351875,
0.010538388043642044,
0.08967085927724838,
0.033240269869565964,
0.029392611235380173,
-0.00040646755951456726,
0.07735918462276459,
0.029441460967063904,
-0.1277577131986618,
-0.028249310329556465,
-0.13320599496364594,
-0.06404583901166916,
-0.006382529158145189,
0.00546012818813324,
0.025502294301986694,
0.1018412709236145,
0.18967783451080322,
-0.05992421135306358,
0.017996374517679214,
0.018646471202373505,
-0.010043499991297722,
0.036566346883773804,
0.20288921892642975,
-0.02968224324285984,
-0.18881547451019287,
-0.015089127235114574,
0.0012684897519648075,
-0.032733701169490814,
-0.018108492717146873,
-0.04236983880400658,
-0.0006970653194002807,
-0.033636465668678284,
0.04583803564310074,
0.030892936512827873,
0.036308132112026215,
-0.06773921847343445,
-0.08522595465183258,
0.10808170586824417,
-0.09355922788381577,
0.04617198556661606,
0.05361870303750038,
-0.03393886610865593,
0.09415842592716217,
-0.07134947180747986,
0.028778795152902603,
-0.023484081029891968,
0.07964061945676804,
-0.018797820433974266,
-0.03754596412181854,
-0.08195298910140991,
-0.09877212345600128,
0.04873031750321388,
0.0005595145630650222,
-0.049308281391859055,
-0.08000761270523071,
-0.028417939320206642,
-0.06270691007375717,
0.06762535870075226,
-0.024783752858638763,
-0.013426670804619789,
-0.0222481656819582,
-0.01536048948764801,
0.049529291689395905,
0.012503473088145256,
-0.050083693116903305,
-0.04039764776825905,
0.058891572058200836,
-0.11355312913656235,
0.013312875293195248,
0.0018147547962144017,
0.030976513400673866,
-0.0542924702167511,
-0.004342700820416212,
-0.1742832362651825,
0.06803446263074875,
-0.1409311443567276,
0.08588403463363647,
-0.11880163848400116,
-0.037831176072359085,
0.01263369433581829,
0.03252165764570236,
0.0008577904081903398,
0.11581136286258698,
-0.1333300769329071,
-0.073606476187706,
0.14999254047870636,
-0.060083333402872086,
-0.07957054674625397,
0.16399800777435303,
-0.0692899078130722,
0.0010207145242020488,
0.107354536652565,
0.1717449128627777,
0.15030114352703094,
-0.11856105923652649,
-0.018207736313343048,
-0.04241035133600235,
0.03861067444086075,
0.07836327701807022,
0.09318380802869797,
-0.0445709265768528,
0.03969234973192215,
0.008178004994988441,
-0.09197082370519638,
-0.01115824282169342,
-0.03716650977730751,
-0.0764276459813118,
0.0447910875082016,
-0.019391436129808426,
0.13045819103717804,
-0.026386601850390434,
-0.005607925821095705,
-0.07234309613704681,
-0.12127561867237091,
-0.039653144776821136,
0.00851861760020256,
0.00021988473599776626,
-0.01661001145839691,
-0.052301157265901566,
0.014576654881238937,
0.11338270455598831,
0.01918051578104496,
-0.07623619586229324,
-0.15573805570602417,
0.08483722805976868,
-0.05032724887132645,
0.1191859319806099,
0.04922223463654518,
0.06947098672389984,
0.006037859711796045,
-0.04413291811943054,
-0.02201898768544197,
-0.10130833834409714,
-0.026245461776852608,
0.017652787268161774,
-0.17622539401054382,
0.008617423474788666,
-0.07134614884853363,
0.038182225078344345,
-0.05350235477089882,
-0.0272345170378685,
0.08169106394052505,
0.10733040422201157,
0.07292373478412628,
-0.01483891624957323,
-0.04284604638814926,
0.03616447001695633,
0.05554826185107231,
-0.015258993953466415,
0.015279177576303482,
-0.01459453348070383,
-0.022005664184689522,
0.05071501433849335,
-0.002074841409921646,
0.08348596841096878,
0.05049007013440132,
0.03840366750955582,
-0.06409880518913269,
-0.08665783703327179,
-0.06598036736249924,
0.0018363040871918201,
-0.05872243270277977,
-0.08044325560331345,
0.17111678421497345,
0.027699191123247147,
0.023870104923844337,
-0.072297602891922,
-0.08379120379686356,
-0.06119387596845627,
0.01776861399412155,
-0.003767840564250946,
0.11931682378053665,
-0.040220823138952255,
-0.13231350481510162,
0.12064393609762192,
0.19245685636997223,
0.0767282173037529,
0.22028791904449463,
-0.06626537442207336,
-0.02839989773929119,
-0.023386768996715546,
0.039173293858766556,
-0.05159880593419075,
0.12523271143436432,
-0.008848827332258224,
-0.017946645617485046,
0.04557950049638748,
-0.00783111434429884,
-0.020495954900979996,
-0.0807492583990097,
0.01717340387403965,
-0.01438406016677618,
-0.024677084758877754,
0.013455175794661045,
0.024532584473490715,
0.06159728765487671,
0.09690143913030624,
0.08724058419466019,
0.01761859655380249,
-0.015559851191937923,
-0.0623224638402462,
-0.059012118726968765,
0.14961829781532288,
-0.12443097680807114,
-0.201480895280838,
-0.1046992689371109,
-0.008601905778050423,
-0.0979149118065834,
-0.045218318700790405,
0.0371866337954998,
-0.10923787206411362,
-0.08761760592460632,
0.0016524250386282802,
0.08526693284511566,
0.08815562725067139,
-0.05622128024697304,
-0.009511361829936504,
-0.010125233791768551,
0.0015595946460962296,
-0.1346769481897354,
-0.03153754770755768,
-0.019176790490746498,
-0.05762345716357231,
-0.011867131106555462,
0.048470765352249146,
0.03241119161248207,
0.0314522460103035,
0.02315385267138481,
0.021296031773090363,
-0.047760408371686935,
0.2920621931552887,
-0.13819250464439392,
0.07970279455184937,
0.06787160784006119,
-0.015057183802127838,
0.07036801427602768,
0.20965784788131714,
0.0907907783985138,
-0.008526607416570187,
0.015969237312674522,
0.05007832869887352,
0.02024037390947342,
-0.2561410963535309,
-0.10859573632478714,
-0.04371470585465431,
-0.012758360244333744,
0.008363798260688782,
0.04955238476395607,
-0.013492384925484657,
-0.02573736384510994,
-0.10916073620319366,
-0.03447095304727554,
0.08610789477825165,
0.0431579053401947,
0.09072080254554749,
0.01221825834363699,
0.04096382111310959,
-0.03620127961039543,
-0.05758201703429222,
0.11044313758611679,
0.09800945222377777,
0.15382221341133118,
0.07325823605060577,
0.1388229876756668,
0.0634046345949173,
0.0764698013663292,
0.014771681278944016,
-0.004146933555603027,
-0.005005209241062403,
0.02933688275516033,
-0.033745549619197845,
-0.07661358267068863,
0.01768314838409424,
0.07060188055038452,
0.08098218590021133,
-0.06865713745355606,
-0.02542232908308506,
-0.05726048722863197,
0.13964757323265076,
0.2544064223766327,
0.003964553587138653,
-0.06438594311475754,
-0.0886964350938797,
0.028896858915686607,
-0.06708980351686478,
-0.03608773648738861,
0.01680467277765274,
0.07661574333906174,
-0.18404529988765717,
0.031143618747591972,
-0.021454188972711563,
0.0953642800450325,
-0.021239861845970154,
0.03239014744758606,
0.07815005630254745,
0.03467023745179176,
-0.015056123957037926,
0.05628589913249016,
-0.2332141399383545,
0.20361313223838806,
0.01066649705171585,
0.06077398732304573,
-0.03503718227148056,
0.025598956272006035,
0.045355599373579025,
-0.044503021985292435,
0.11852941662073135,
0.017297016456723213,
-0.036323949694633484,
-0.035500023514032364,
-0.08479602634906769,
0.024600913748145103,
0.12495940178632736,
-0.12149937450885773,
0.05858798697590828,
-0.03763347491621971,
-0.010696318000555038,
-0.02759411372244358,
0.0603504553437233,
-0.12382147461175919,
-0.12632127106189728,
0.026349710300564766,
-0.09425055235624313,
0.007482440676540136,
-0.04534875229001045,
-0.051051314920186996,
-0.1257382035255432,
0.1565493494272232,
-0.15232664346694946,
-0.06938904523849487,
-0.1062016636133194,
-0.04383840039372444,
0.048141684383153915,
-0.09537354856729507,
0.03695780411362648,
-0.013197937048971653,
0.07964472472667694,
-0.032064709812402725,
-0.10428287833929062,
0.06308960914611816,
-0.0760984867811203,
-0.1467427760362625,
-0.025661826133728027,
0.16166473925113678,
0.041879381984472275,
0.04237751290202141,
0.011596065014600754,
0.002469646744430065,
-0.040525175631046295,
-0.12800182402133942,
0.023688074201345444,
0.11760057508945465,
-0.020167764276266098,
0.07884559780359268,
-0.11244022846221924,
-0.11961299180984497,
-0.08644027262926102,
0.0297300536185503,
0.11405804008245468,
0.14083121716976166,
-0.07527834922075272,
0.19336706399917603,
0.14445212483406067,
-0.04386918246746063,
-0.24890999495983124,
-0.03419722244143486,
0.048798322677612305,
0.019164297729730606,
0.016517123207449913,
-0.16421930491924286,
0.09317543357610703,
-0.0006854418898001313,
-0.03800875321030617,
-0.001038238755427301,
-0.20885780453681946,
-0.12119311839342117,
0.03327289968729019,
-0.048978712409734726,
-0.004883562680333853,
-0.09295487403869629,
-0.07584614306688309,
-0.03417174518108368,
-0.11106814444065094,
0.1224166750907898,
-0.0439131036400795,
0.05965831130743027,
0.029028424993157387,
0.05573354661464691,
0.005827931687235832,
-0.02344273403286934,
0.09678361564874649,
0.011648822575807571,
0.03504706174135208,
-0.050022318959236145,
-0.03317476436495781,
0.02729222923517227,
-0.05225510895252228,
0.024156605824828148,
-0.03977058455348015,
0.010244254022836685,
-0.1643100082874298,
-0.015539735555648804,
-0.09018325805664062,
0.08029516041278839,
-0.07372792065143585,
-0.05063128098845482,
-0.10062295198440552,
0.13879340887069702,
0.06861819326877594,
0.003643064061179757,
0.010561528615653515,
0.01032065600156784,
0.09962297976016998,
0.056025732308626175,
0.13737694919109344,
0.02787085995078087,
-0.09726084768772125,
-0.034860603511333466,
-0.02111203595995903,
0.0952358990907669,
-0.06354474276304245,
0.054201848804950714,
0.14896531403064728,
-0.001835367758758366,
0.12255436182022095,
0.004153137560933828,
-0.09707742184400558,
-0.020446155220270157,
0.08174146711826324,
-0.13890573382377625,
-0.2357875555753708,
-0.07642808556556702,
-0.010861780494451523,
-0.05265619978308678,
0.0132111432030797,
0.1579812467098236,
0.013403510674834251,
-0.04873979464173317,
0.02512950822710991,
0.0630728006362915,
-0.01255539245903492,
0.07478313893079758,
0.03726675361394882,
0.03589727729558945,
-0.09239927679300308,
0.07662705332040787,
0.07676498591899872,
-0.04511263966560364,
0.017215600237250328,
0.11333128809928894,
-0.05210188031196594,
-0.030393993481993675,
0.04308493435382843,
0.09664624184370041,
-0.042601678520441055,
-0.010328763164579868,
-0.002988979686051607,
-0.12823227047920227,
-0.004778052214533091,
0.06359291821718216,
0.022082116454839706,
0.033293940126895905,
0.021470598876476288,
0.014244363643229008,
0.07412872463464737,
0.14446209371089935,
0.06684457510709763,
-0.003222202882170677,
-0.03746437281370163,
-0.0013553923927247524,
-0.0395207405090332,
0.024781817570328712,
-0.017480533570051193,
-0.02457539364695549,
-0.17053104937076569,
-0.034578774124383926,
-0.10373447090387344,
-0.0035660858266055584,
0.004642501473426819,
0.03329050913453102,
-0.034381091594696045,
-0.0765509158372879,
0.0013646517181769013,
0.02744448371231556,
-0.08178891241550446,
-0.01756192184984684,
-0.008629820309579372,
0.13649694621562958,
-0.1875709891319275,
0.008311796933412552,
0.09320908784866333,
-0.048585452139377594,
0.11262568086385727,
-0.009349913336336613,
-0.03494253382086754,
0.03941016271710396,
-0.11868595331907272,
-0.049771592020988464,
-0.0639086440205574,
0.03576463460922241,
0.04556994512677193,
-0.09092167019844055,
-0.01163329929113388,
-0.015736108645796776,
-0.035492535680532455,
0.026205264031887054,
0.017627183347940445,
-0.06787338107824326,
0.06052510067820549,
0.0021429338958114386,
-0.11972616612911224,
-0.02084878645837307,
0.015162460505962372,
0.09940247982740402,
0.034820806235075,
0.11881601810455322,
-0.08811190724372864,
0.06420761346817017,
-0.11545417457818985,
-0.0037034298293292522,
0.05938947573304176,
-0.03618365526199341,
-0.15255720913410187,
-0.0206458643078804,
0.04954611137509346,
-0.007258072029799223,
0.14095182716846466,
0.029832901433110237,
-0.020771954208612442,
0.046668097376823425,
-0.03305552899837494,
-0.1122199222445488,
0.05534689873456955,
0.06413954496383667,
-0.02713770605623722,
-0.0005516664241440594,
-0.036739230155944824,
-0.07757776230573654,
-0.04997338727116585,
0.024793433025479317,
0.20424853265285492,
0.27291741967201233,
0.12766876816749573,
-0.0161049272865057,
0.1315162628889084,
0.003220480866730213,
-0.12785422801971436,
-0.012085908092558384,
0.026314768940210342,
0.07417241483926773,
-0.12826520204544067,
0.06336057186126709,
0.08751366287469864,
-0.20392778515815735,
0.10450424253940582,
-0.05917482078075409,
-0.03574657067656517,
-0.05983809381723404,
-0.10287777334451675,
-0.07616420835256577,
-0.09226369857788086,
0.0173247791826725,
-0.12870506942272186,
0.05151218920946121,
0.026331139728426933,
0.0669533759355545,
-0.07137153297662735,
0.12525561451911926,
-0.1318027824163437,
-0.04901321604847908,
0.14862193167209625,
0.04480629786849022,
0.062057483941316605,
0.015435425564646721,
-0.013682621531188488,
-0.06166866049170494,
0.09340579807758331,
0.014339247718453407,
0.06337910890579224,
-0.04349370673298836,
-0.06807272881269455,
-0.058359548449516296,
-0.07019755989313126,
0.006229096557945013,
-0.016770534217357635,
-0.0162075012922287,
0.15389248728752136,
0.02457616850733757,
-0.0200699120759964,
0.00860091857612133,
0.1853453814983368,
-0.038504671305418015,
-0.09994997829198837,
-0.18328599631786346,
0.08526941388845444,
-0.02482662722468376,
0.030489113181829453,
0.005399525631219149,
-0.09014997631311417,
-0.03033881075680256,
0.10533183068037033,
0.2433689385652542,
-0.09404560178518295,
0.022464284673333168,
-0.013423418626189232,
0.033819764852523804,
0.03996674716472626,
0.14576533436775208,
-0.0005201379535719752,
0.2445661872625351,
-0.0226763729006052,
0.018854757770895958,
0.023902764543890953,
-0.04545017331838608,
-0.10059889405965805,
0.13501033186912537,
0.03859420865774155,
-0.06345056742429733,
-0.12374984472990036,
0.16003556549549103,
-0.05973755195736885,
-0.14418533444404602,
-0.038311224430799484,
-0.19098225235939026,
-0.14995968341827393,
-0.05312560126185417,
0.06291145831346512,
0.057084787636995316,
0.06249869614839554,
0.04062838852405548,
-0.05583178624510765,
0.1355930119752884,
0.015539278276264668,
-0.008154301904141903,
-0.03580040484666824,
0.13295522332191467,
-0.04336564615368843,
0.20680208504199982,
0.04463879391551018,
0.0342378243803978,
0.07938959449529648,
-0.0025921452324837446,
-0.04381096363067627,
-0.0711514800786972,
0.06432992964982986,
-0.1436590552330017,
-0.05590870976448059,
0.05717974156141281,
0.006600013002753258,
0.07210537791252136,
0.08555576950311661,
-0.04796163737773895,
0.05567960813641548,
0.13410964608192444,
-0.043820153921842575,
-0.1147465780377388,
0.12380452454090118,
-0.1132357269525528,
0.13152703642845154,
0.17755180597305298,
-0.022634150460362434,
0.012900114990770817,
-0.02566744200885296,
0.021642478182911873,
0.01838311366736889,
0.06112717092037201,
-0.05602690950036049,
-0.1730637103319168,
0.006974638905376196,
-0.04395857825875282,
0.05560225620865822,
-0.11577395349740982,
-0.0909661129117012,
-0.012278217822313309,
0.12075024098157883,
-0.042434435337781906,
0.15106552839279175,
0.08504464477300644,
-0.023836426436901093,
0.00810421071946621,
-0.1147715374827385,
-0.00783612858504057,
0.08382309228181839,
-0.07257365435361862,
0.006012731231749058
] |
null | null |
transformers
|
# electra-base for QA
## Overview
**Language model:** electra-base
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See [example](https://github.com/deepset-ai/FARM/blob/master/examples/question_answering.py) in [FARM](https://github.com/deepset-ai/FARM)
**Infrastructure**: 1x Tesla v100
## Hyperparameters
```
seed = 42
batch_size = 32
n_epochs = 5
base_LM_model = "google/electra-base-discriminator"
max_seq_len = 384
learning_rate = 1e-4
lr_schedule = LinearWarmup
warmup_proportion = 0.1
doc_stride = 128
max_query_length = 64
```
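`max_seq_len`, `doc_stride`, and `max_query_length` control how long contexts are split into overlapping windows. A minimal sketch of that chunking with a Transformers fast tokenizer (the question and context strings are placeholders):
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/electra-base-discriminator")

question = "Why is model conversion important?"  # FARM truncates questions to max_query_length tokens
context = "A very long document. " * 500         # placeholder context spanning several windows

enc = tokenizer(
    question,
    context,
    max_length=384,                  # max_seq_len
    stride=128,                      # doc_stride: overlap between consecutive windows
    truncation="only_second",        # only the context is split, never the question
    return_overflowing_tokens=True,  # emit one feature per window
)
print(len(enc["input_ids"]))         # number of overlapping windows
```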
## Performance
Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).
```
"exact": 77.30144024256717,
"f1": 81.35438272008543,
"total": 11873,
"HasAns_exact": 74.34210526315789,
"HasAns_f1": 82.45961302894314,
"HasAns_total": 5928,
"NoAns_exact": 80.25231286795626,
"NoAns_f1": 80.25231286795626,
"NoAns_total": 5945
```
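Note that `NoAns_exact` equals `NoAns_f1` by construction: for unanswerable questions the official script awards 1 only when the prediction is empty, so exact match and F1 coincide.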
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/electra-base-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
### In FARM
```python
from farm.modeling.adaptive_model import AdaptiveModel
from farm.modeling.tokenization import Tokenizer
from farm.infer import Inferencer
model_name = "deepset/electra-base-squad2"
# a) Get predictions
nlp = Inferencer.load(model_name, task_type="question_answering")
QA_input = [{"questions": ["Why is model conversion important?"],
"text": "The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks."}]
res = nlp.inference_from_dicts(dicts=QA_input)
# b) Load model & tokenizer
model = AdaptiveModel.convert_from_transformers(model_name, device="cpu", task_type="question_answering")
tokenizer = Tokenizer.load(model_name)
```
### In haystack
For doing QA at scale (i.e. many docs instead of a single paragraph), you can also load the model in [haystack](https://github.com/deepset-ai/haystack/):
```python
# import path assumes Haystack 1.x; older releases exposed the readers under haystack.reader
from haystack.nodes import FARMReader, TransformersReader

reader = FARMReader(model_name_or_path="deepset/electra-base-squad2")
# or
reader = TransformersReader(model="deepset/electra-base-squad2", tokenizer="deepset/electra-base-squad2")
```
## Authors
Vaishali Pal: `vaishali.pal [at] deepset.ai`
Branden Chan: `branden.chan [at] deepset.ai`
Timo Möller: `timo.moeller [at] deepset.ai`
Malte Pietsch: `malte.pietsch [at] deepset.ai`
Tanay Soni: `tanay.soni [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"], "model-index": [{"name": "deepset/electra-base-squad2", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 77.6074, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzE5NTRmMmUwYTk1MTI0NjM0ZmQwNDFmM2Y4Mjk4ZWYxOGVmOWI3ZGFiNWM4OTUxZDQ2ZjdmNmU3OTk5ZjRjYyIsInZlcnNpb24iOjF9.0VZRewdiovE4z3K5box5R0oTT7etpmd0BX44FJBLRFfot-uJ915b-bceSv3luJQ7ENPjaYSa7o7jcHlDzn3oAw"}, {"type": "f1", "value": 81.7181, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2VlMzM0Y2UzYjhhNTJhMTFiYWZmMDNjNjRiZDgwYzc5NWE3N2M4ZGFlYWQ0ZjVkZTE2MDU0YmMzMDc1MTY5MCIsInZlcnNpb24iOjF9.jRV58UxOM7CJJSsmxJuZvlt00jMGA1thp4aqtcFi1C8qViQ1kW7NYz8rg1gNTDZNez2UwPS1NgN_HnnwBHPbCQ"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad", "type": "squad", "config": "plain_text", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 80.407, "name": "Exact Match"}, {"type": "f1", "value": 88.942, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "adversarial_qa", "type": "adversarial_qa", "config": "adversarialQA", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 23.533, "name": "Exact Match"}, {"type": "f1", "value": 36.521, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_adversarial", "type": "squad_adversarial", "config": "AddOneSent", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 73.867, "name": "Exact Match"}, {"type": "f1", "value": 81.381, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts amazon", "type": "squadshifts", "config": "amazon", "split": "test"}, "metrics": [{"type": "exact_match", "value": 64.512, "name": "Exact Match"}, {"type": "f1", "value": 80.166, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts new_wiki", "type": "squadshifts", "config": "new_wiki", "split": "test"}, "metrics": [{"type": "exact_match", "value": 76.568, "name": "Exact Match"}, {"type": "f1", "value": 87.706, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts nyt", "type": "squadshifts", "config": "nyt", "split": "test"}, "metrics": [{"type": "exact_match", "value": 77.884, "name": "Exact Match"}, {"type": "f1", "value": 87.858, "name": "F1"}]}, {"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squadshifts reddit", "type": "squadshifts", "config": "reddit", "split": "test"}, "metrics": [{"type": "exact_match", "value": 64.399, "name": "Exact Match"}, {"type": "f1", "value": 78.096, "name": "F1"}]}]}]}
|
question-answering
|
deepset/electra-base-squad2
|
[
"transformers",
"pytorch",
"safetensors",
"electra",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #safetensors #electra #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us
|
# electra-base for QA
## Overview
Language model: electra-base
Language: English
Downstream-task: Extractive QA
Training data: SQuAD 2.0
Eval data: SQuAD 2.0
Code: See example in FARM
Infrastructure: 1x Tesla v100
## Hyperparameters
## Performance
Evaluated on the SQuAD 2.0 dev set with the official eval script.
## Usage
### In Transformers
### In FARM
### In haystack
For doing QA at scale (i.e. many docs instead of a single paragraph), you can load the model also in haystack:
## Authors
Vaishali Pal 'URL [at] URL'
Branden Chan: 'URL [at] URL'
Timo Möller: 'timo.moeller [at] URL'
Malte Pietsch: 'malte.pietsch [at] URL'
Tanay Soni: 'URL [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# electra-base for QA",
"## Overview\nLanguage model: electra-base \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See example in FARM \nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.",
"## Usage",
"### In Transformers",
"### In FARM",
"### In haystack\nFor doing QA at scale (i.e. many docs instead of a single paragraph), you can load the model also in haystack:",
"## Authors\nVaishali Pal 'URL [at] URL' \nBranden Chan: 'URL [at] URL' \nTimo Möller: 'timo.moeller [at] URL' \nMalte Pietsch: 'malte.pietsch [at] URL' \nTanay Soni: 'URL [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n\nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #electra #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# electra-base for QA",
"## Overview\nLanguage model: electra-base \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See example in FARM \nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.",
"## Usage",
"### In Transformers",
"### In FARM",
"### In haystack\nFor doing QA at scale (i.e. many docs instead of a single paragraph), you can load the model also in haystack:",
"## Authors\nVaishali Pal 'URL [at] URL' \nBranden Chan: 'URL [at] URL' \nTimo Möller: 'timo.moeller [at] URL' \nMalte Pietsch: 'malte.pietsch [at] URL' \nTanay Soni: 'URL [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n\nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
63,
8,
54,
5,
19,
3,
6,
5,
37,
67,
129
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #electra #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n# electra-base for QA## Overview\nLanguage model: electra-base \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See example in FARM \nInfrastructure: 1x Tesla v100## Hyperparameters## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.## Usage### In Transformers### In FARM### In haystack\nFor doing QA at scale (i.e. many docs instead of a single paragraph), you can load the model also in haystack:## Authors\nVaishali Pal 'URL [at] URL' \nBranden Chan: 'URL [at] URL' \nTimo Möller: 'timo.moeller [at] URL' \nMalte Pietsch: 'malte.pietsch [at] URL' \nTanay Soni: 'URL [at] URL'## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n\nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.018373293802142143,
0.14797022938728333,
-0.005033206660300493,
0.04284340888261795,
0.1013653501868248,
-0.010208062827587128,
0.13389702141284943,
0.0965534895658493,
0.026136230677366257,
0.04740932583808899,
0.009151484817266464,
0.040719788521528244,
0.10847601294517517,
0.13226018846035004,
0.04007529094815254,
-0.1869620978832245,
-0.008877738378942013,
-0.09026572108268738,
-0.08227916061878204,
0.09907685220241547,
0.1368774026632309,
-0.08905480057001114,
0.11931591480970383,
0.014550847932696342,
0.0021541111636906862,
0.022328147664666176,
-0.02274506166577339,
-0.046317968517541885,
0.08910471946001053,
0.04577156901359558,
0.08556836098432541,
-0.014369465410709381,
0.03015219047665596,
-0.24405528604984283,
0.030395207926630974,
0.047898948192596436,
0.038432709872722626,
0.037019677460193634,
0.111154705286026,
0.007312927395105362,
0.10304483771324158,
-0.07123328745365143,
0.021941468119621277,
0.09394451975822449,
-0.08503372967243195,
-0.13287222385406494,
-0.09428294748067856,
0.10676077008247375,
0.05671435222029686,
0.07058218121528625,
-0.018092898651957512,
0.06729160249233246,
-0.1039995402097702,
0.06344413012266159,
0.0818285197019577,
-0.24463851749897003,
-0.04311449080705643,
0.0008192639797925949,
0.0464838370680809,
0.04081675782799721,
-0.11786188930273056,
0.03718702867627144,
0.03440995514392853,
0.020883597433567047,
0.01281728781759739,
-0.005326803307980299,
0.009161138907074928,
0.011739896610379219,
-0.08368854969739914,
-0.024426689371466637,
0.18053603172302246,
0.039564039558172226,
-0.0584738552570343,
-0.14164039492607117,
0.0011227515060454607,
0.1381016969680786,
-0.011363251134753227,
-0.04496745392680168,
0.03364121913909912,
0.0040174247696995735,
0.009712815284729004,
-0.0606880858540535,
-0.14170920848846436,
0.05206548795104027,
0.01846042089164257,
0.04612784832715988,
0.025253430008888245,
0.033744849264621735,
-0.018579859286546707,
0.033652082085609436,
0.05102294683456421,
-0.13160936534404755,
-0.014125228859484196,
-0.14183056354522705,
-0.022056279703974724,
-0.02714105322957039,
0.0415872186422348,
-0.00709941890090704,
0.15914702415466309,
0.09437257796525955,
-0.06775083392858505,
0.003795898286625743,
-0.028180962428450584,
-0.004429439548403025,
0.04635977745056152,
0.18779928982257843,
-0.08042771369218826,
-0.1924986094236374,
-0.004143793601542711,
-0.0026644552126526833,
-0.02827504649758339,
0.017461227253079414,
-0.034726355224847794,
-0.0008443646947853267,
-0.045196037739515305,
0.052113126963377,
0.08984292298555374,
0.04162672534584999,
-0.08436815440654755,
-0.06437094509601593,
0.12759444117546082,
-0.10474840551614761,
0.05147207900881767,
0.020774144679307938,
-0.03768859803676605,
0.1970783770084381,
-0.07192948460578918,
0.04316006600856781,
-0.04709461331367493,
0.04103328660130501,
-0.05788540840148926,
-0.016102738678455353,
-0.09676431119441986,
-0.0823117196559906,
0.09752284735441208,
0.02113224007189274,
-0.010379362851381302,
-0.09164687991142273,
-0.06030019745230675,
-0.03043423779308796,
0.07134354114532471,
-0.017756810411810875,
-0.03838448226451874,
-0.02432030439376831,
-0.047574058175086975,
0.028124220669269562,
0.002415173454210162,
-0.036890022456645966,
-0.05180029571056366,
0.03000284917652607,
-0.09907252341508865,
0.0266415998339653,
0.05372166261076927,
0.017224283888936043,
-0.07544894516468048,
-0.01795273832976818,
-0.15322713553905487,
0.04123619198799133,
-0.11400467157363892,
0.1536743938922882,
-0.14494897425174713,
-0.03466426581144333,
0.014711053110659122,
0.028063658624887466,
-0.025214292109012604,
0.12200602889060974,
-0.12600453197956085,
-0.0426640585064888,
0.19065506756305695,
-0.0916157215833664,
-0.08116275072097778,
0.14828062057495117,
0.0031218898948282003,
0.014924121089279652,
0.12511798739433289,
0.12089752405881882,
0.1254728138446808,
-0.17881564795970917,
-0.03917209804058075,
-0.006264461670070887,
-0.07553096115589142,
-0.0007945594261400402,
0.10506316274404526,
-0.06363438814878464,
0.050020888447761536,
0.02496967278420925,
-0.032043978571891785,
0.021780535578727722,
-0.03833527863025665,
-0.07072030007839203,
0.059188418090343475,
-0.024040015414357185,
0.0018640055786818266,
-0.014835435897111893,
-0.06336917728185654,
-0.06575103104114532,
-0.11327515542507172,
-0.04989085718989372,
0.03687095642089844,
0.014058059081435204,
-0.011881373822689056,
-0.08078448474407196,
0.03730238974094391,
0.03513411805033684,
0.028861314058303833,
-0.043722204864025116,
-0.12849105894565582,
0.050929415971040726,
-0.0860481709241867,
0.09500007331371307,
0.05299694463610649,
0.04519427940249443,
-0.00034356871037743986,
-0.01839767023921013,
-0.017039988189935684,
-0.06098437309265137,
-0.03151838108897209,
0.018109489232301712,
-0.23087354004383087,
0.01467785146087408,
-0.07025164365768433,
0.022421130910515785,
-0.0627843514084816,
-0.015293968841433525,
0.08545877784490585,
0.08314969390630722,
0.05014192312955856,
0.011447345837950706,
-0.004030912648886442,
0.006494604982435703,
0.015422262251377106,
-0.026662958785891533,
0.021282704547047615,
-0.028900472447276115,
-0.03451501578092575,
0.045598581433296204,
0.01160590723156929,
0.07515885680913925,
0.06475098431110382,
-0.06997337937355042,
-0.05001232773065567,
-0.06957420706748962,
-0.07081650197505951,
-0.013503299094736576,
-0.05617155134677887,
-0.09063511341810226,
0.11313978582620621,
0.04445421323180199,
0.003582445904612541,
-0.07807750254869461,
-0.03945007175207138,
-0.04780580848455429,
-0.04643278196454048,
-0.03936903923749924,
0.10389506816864014,
0.01797420345246792,
-0.1351446658372879,
0.09249719232320786,
0.13445283472537994,
0.10832257568836212,
0.225608691573143,
-0.03192010521888733,
-0.055825937539339066,
-0.015506535768508911,
0.03584716469049454,
-0.043679457157850266,
0.07595188915729523,
-0.05454028397798538,
0.016281327232718468,
0.052243392914533615,
-0.023213928565382957,
0.023163001984357834,
-0.0678393542766571,
0.018130047246813774,
-0.006434454582631588,
-0.04585939645767212,
0.018811190500855446,
-0.007197817321866751,
0.035879138857126236,
0.10026255249977112,
0.06410782039165497,
-0.016967417672276497,
-0.0069819726049900055,
-0.055894769728183746,
-0.060011956840753555,
0.1355823129415512,
-0.09705471992492676,
-0.20684200525283813,
-0.1435643583536148,
-0.019550060853362083,
-0.07036472856998444,
-0.0394299291074276,
0.044836197048425674,
-0.08886382728815079,
-0.12678761780261993,
-0.037916701287031174,
0.07830725610256195,
0.06005565822124481,
-0.07462259382009506,
0.013396828435361385,
0.04618476703763008,
0.007497075013816357,
-0.14210161566734314,
-0.03749929368495941,
0.012899402529001236,
-0.04138369485735893,
-0.007390983868390322,
0.04470185190439224,
0.08597591519355774,
0.06336726248264313,
0.013648736290633678,
-0.0005371436709538102,
-0.017763976007699966,
0.2454751431941986,
-0.1030028685927391,
0.09221867471933365,
0.1550525724887848,
0.019257089123129845,
0.08897305279970169,
0.2196444422006607,
0.07906907796859741,
0.003437071805819869,
0.009761294350028038,
0.03070713020861149,
0.005688612349331379,
-0.29083940386772156,
-0.0830303207039833,
-0.051896657794713974,
-0.013043791987001896,
-0.028030699118971825,
0.03484313562512398,
-0.0011284522479400039,
0.0013765622861683369,
-0.10223094373941422,
-0.045144110918045044,
0.05577454715967178,
0.06904121488332748,
0.09714560955762863,
0.017187852412462234,
0.04552365466952324,
-0.04759465903043747,
-0.05327378958463669,
0.09580305218696594,
0.06995806843042374,
0.14151859283447266,
0.07268297672271729,
0.1622491329908371,
0.10438838601112366,
0.04887542873620987,
0.00826501939445734,
0.031163454055786133,
-0.02195620723068714,
0.04471791163086891,
-0.034038443118333817,
-0.08635500818490982,
0.026667024940252304,
0.09005288034677505,
0.06993414461612701,
-0.06423833966255188,
0.019251100718975067,
0.01356715988367796,
0.13078683614730835,
0.22896169126033783,
-0.005267444532364607,
-0.1412631869316101,
-0.03630521148443222,
0.05150994285941124,
-0.042612239718437195,
-0.011861653998494148,
0.016785012558102608,
0.05088311433792114,
-0.1401047259569168,
0.06140459328889847,
0.006143257953226566,
0.10765457898378372,
-0.01029888354241848,
0.03986262157559395,
0.05405140295624733,
0.07586424797773361,
-0.00933497678488493,
0.09588079899549484,
-0.19403113424777985,
0.16234663128852844,
0.023255040869116783,
0.09987538307905197,
-0.03696227818727493,
0.051271963864564896,
0.023922095075249672,
-0.059404026716947556,
0.11609505861997604,
0.0019865569192916155,
-0.07530311495065689,
-0.09719320386648178,
-0.10872731357812881,
0.027567265555262566,
0.08909476548433304,
-0.04081939905881882,
0.10122333467006683,
-0.033133309334516525,
-0.03746189549565315,
-0.027133366093039513,
0.024526340886950493,
-0.1447954922914505,
-0.1334449201822281,
0.04182758927345276,
-0.06388411670923233,
0.03807312622666359,
-0.05159885063767433,
-0.027680685743689537,
-0.1011599600315094,
0.14904537796974182,
-0.1504068523645401,
-0.13015985488891602,
-0.11817570775747299,
-0.0450553223490715,
0.0792212188243866,
-0.10212887078523636,
0.028717514127492905,
-0.022917624562978745,
0.07780428230762482,
0.009532693773508072,
-0.06714559346437454,
0.02435116097331047,
-0.07666873931884766,
-0.13932283222675323,
-0.004810792859643698,
0.17277133464813232,
0.020231716334819794,
0.03210188448429108,
0.03819885104894638,
-0.05143871158361435,
-0.03830815106630325,
-0.14581900835037231,
-0.0047120884992182255,
0.11418924480676651,
0.006397306453436613,
0.06279820948839188,
-0.09068531543016434,
-0.18736694753170013,
-0.11261783540248871,
-0.0039004895370453596,
0.0907180905342102,
0.17057381570339203,
-0.042631275951862335,
0.12619124352931976,
0.20578588545322418,
-0.06769196689128876,
-0.18801826238632202,
-0.06961418688297272,
0.0028688220772892237,
-0.015199277549982071,
0.023144496604800224,
-0.13161563873291016,
0.1391954869031906,
-0.004078536294400692,
-0.0258847177028656,
0.040530286729335785,
-0.19089247286319733,
-0.09565835446119308,
0.03090248629450798,
-0.018568020313978195,
-0.042244549840688705,
-0.11575158685445786,
-0.0427071675658226,
-0.046914804726839066,
-0.08611070364713669,
0.10355456918478012,
-0.05650767683982849,
0.04597412049770355,
0.03153926506638527,
0.07554011791944504,
0.03174368664622307,
-0.07106155902147293,
0.0811547189950943,
0.007734723389148712,
0.04269295558333397,
-0.047932978719472885,
-0.012731359340250492,
0.07298169285058975,
-0.014651622623205185,
0.09187991917133331,
0.00693432055413723,
-0.01785915158689022,
-0.06771164387464523,
-0.04984920471906662,
-0.07172912359237671,
0.09732471406459808,
-0.06884190440177917,
-0.07395528256893158,
-0.0818651095032692,
0.09820380806922913,
0.07642010599374771,
0.02158311940729618,
0.016394782811403275,
-0.03592692315578461,
0.04350787028670311,
0.10161096602678299,
0.1627112776041031,
-0.025677796453237534,
-0.02199658565223217,
-0.052809178829193115,
-0.02168055810034275,
0.06394042074680328,
-0.0780075415968895,
0.05710994079709053,
0.15275496244430542,
-0.022935062646865845,
0.09377743303775787,
0.0020827518310397863,
-0.12698347866535187,
-0.005140677094459534,
0.10221435874700546,
-0.0935220867395401,
-0.1951962560415268,
-0.06374681740999222,
-0.022763675078749657,
-0.09083490818738937,
-0.0023514707572758198,
0.1548241674900055,
-0.011124111711978912,
-0.01952822133898735,
0.011069871485233307,
0.09411365538835526,
-0.010020957328379154,
0.0810922309756279,
0.049515072256326675,
0.020943205803632736,
-0.0792129710316658,
0.05437994375824928,
0.08275431394577026,
-0.05032455921173096,
-0.010997163131833076,
0.09163302183151245,
-0.04203518480062485,
-0.05725174397230148,
0.010816937312483788,
0.10058218240737915,
-0.06481137871742249,
-0.07476620376110077,
0.008660716935992241,
-0.1149909570813179,
-0.0076474086381495,
-0.005886008031666279,
0.005780769046396017,
0.050620198249816895,
0.0428251177072525,
0.005833297036588192,
-0.004940500482916832,
0.14577576518058777,
0.04651017114520073,
-0.03957885503768921,
-0.0970061644911766,
0.01706746034324169,
-0.023034969344735146,
0.009456057101488113,
-0.00975485797971487,
-0.01038380153477192,
-0.12117434293031693,
-0.029467104002833366,
-0.10921251028776169,
0.039190102368593216,
-0.0673140436410904,
0.03403220325708389,
-0.0070218234322965145,
-0.058846183121204376,
-0.01928996481001377,
0.015380803495645523,
-0.07286779582500458,
-0.03117244504392147,
-0.0012132760602980852,
0.15713296830654144,
-0.18963609635829926,
-0.01905878446996212,
0.08236845582723618,
-0.03499210625886917,
0.07633603364229202,
-0.01946909725666046,
-0.02938271500170231,
0.03166622295975685,
-0.17996039986610413,
-0.0008042342378757894,
-0.020678676664829254,
0.048237405717372894,
0.03193378075957298,
-0.0424790121614933,
-0.013932858593761921,
0.012278487905859947,
-0.05729595944285393,
-0.01752890832722187,
-0.04987921193242073,
-0.09793996065855026,
0.06247182562947273,
0.0302924495190382,
-0.14161303639411926,
-0.029030149802565575,
0.004996045492589474,
0.10770253092050552,
-0.004384427331387997,
0.15724346041679382,
-0.06551127135753632,
0.06912744790315628,
-0.10735634714365005,
0.025957968086004257,
0.031156277284026146,
-0.00419754721224308,
-0.1026829332113266,
-0.017280694097280502,
0.03997474163770676,
-0.03444034978747368,
0.12308793514966965,
0.03970503807067871,
0.05170103535056114,
0.037937842309474945,
-0.05967123806476593,
-0.04137652739882469,
0.04082684963941574,
0.019818592816591263,
-0.030291104689240456,
0.01156292762607336,
-0.006788984406739473,
-0.11258368194103241,
-0.019565608352422714,
-0.05033888667821884,
0.16488808393478394,
0.25690940022468567,
0.1110171377658844,
-0.010921879671514034,
0.10858645290136337,
-0.029277384281158447,
-0.17884978652000427,
-0.03487497940659523,
0.02916628308594227,
0.05095705762505531,
-0.08161187171936035,
0.07063475251197815,
0.1694752424955368,
-0.17063435912132263,
0.06890425831079483,
-0.038807213306427,
-0.02807079255580902,
-0.05816248059272766,
-0.15432453155517578,
-0.07716044783592224,
-0.03895208239555359,
0.011906572617590427,
-0.1432334929704666,
0.056212324649095535,
0.03706912323832512,
0.04073375463485718,
-0.10046139359474182,
0.08057005703449249,
-0.05712313577532768,
-0.061718832701444626,
0.09687162190675735,
0.04294286668300629,
0.04541199654340744,
0.04666675999760628,
0.003004661062732339,
-0.07725363224744797,
0.09902086108922958,
0.0028013004921376705,
0.07221883535385132,
-0.04355574771761894,
-0.03824342042207718,
-0.08432726562023163,
-0.08566181361675262,
-0.010416953824460506,
-0.028116760775446892,
-0.02525158040225506,
0.06882103532552719,
0.021164782345294952,
-0.017873039469122887,
0.025446420535445213,
0.16339579224586487,
-0.025706574320793152,
-0.13188590109348297,
-0.1274505853652954,
0.039238665252923965,
-0.01717269979417324,
0.04573890194296837,
-0.0040680887177586555,
-0.09410732239484787,
-0.0253704022616148,
0.1867932826280594,
0.20702490210533142,
-0.06881396472454071,
-0.00505306338891387,
-0.055103715509176254,
0.0254550538957119,
-0.017551546916365623,
0.12458298355340958,
0.008958409540355206,
0.2777879536151886,
-0.027294699102640152,
0.06883768737316132,
-0.01256130076944828,
0.002969927852973342,
-0.1321268230676651,
0.05223479121923447,
0.033324070274829865,
-0.030644608661532402,
-0.07645469903945923,
0.17385254800319672,
-0.09566868841648102,
-0.1257452368736267,
-0.0554339773952961,
-0.06315074115991592,
-0.10839840769767761,
-0.015923643484711647,
0.09489969909191132,
0.061298370361328125,
0.06931275874376297,
-0.0034789172932505608,
-0.056163739413022995,
0.13708381354808807,
0.02869456447660923,
-0.08245328068733215,
0.026289109140634537,
0.1135721430182457,
-0.05617356672883034,
0.22000561654567719,
0.023964913561940193,
0.035077907145023346,
0.1034981906414032,
-0.02583649754524231,
-0.10479982197284698,
-0.03953850269317627,
0.06747131049633026,
-0.1401170939207077,
0.015005407854914665,
0.01737087219953537,
0.011491382494568825,
0.06768887490034103,
0.08853021264076233,
-0.032068680971860886,
0.011416075751185417,
0.13523639738559723,
0.00582471676170826,
-0.07837145775556564,
0.08114880323410034,
-0.12082535028457642,
0.12021149694919586,
0.12726597487926483,
-0.03692598268389702,
-0.015886951237916946,
-0.031940508633852005,
0.05385041609406471,
0.017103174701333046,
0.04167560487985611,
-0.06195753812789917,
-0.15360061824321747,
-0.013058925978839397,
-0.030678199604153633,
0.07712101936340332,
-0.05552666634321213,
-0.0554993636906147,
-0.02549869567155838,
0.10283268988132477,
-0.08707846701145172,
0.10512056201696396,
0.09876637160778046,
-0.018913131207227707,
0.011146417818963528,
-0.07716292142868042,
-0.020470522344112396,
0.10978561639785767,
-0.10846613347530365,
-0.026895679533481598
] |
null | null |
transformers
|

## Overview
**Language model:** gbert-base-germandpr
**Language:** German
**Training data:** GermanDPR train set (~ 56MB)
**Eval data:** GermanDPR test set (~ 6MB)
**Infrastructure**: 4x V100 GPU
**Published**: Apr 26th, 2021
## Details
- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.
- The dataset is GermanDPR, a new German-language dataset, which we hand-annotated and published [online](https://deepset.ai/germanquad).
- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.
Each pair comes with one positive context and three hard negative contexts.
- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).
- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.
See https://deepset.ai/germanquad for more details and dataset download.
## Hyperparameters
```
batch_size = 40
n_epochs = 20
num_training_steps = 4640
num_warmup_steps = 460
max_seq_len = 32 tokens for question encoder and 300 tokens for passage encoder
learning_rate = 1e-6
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
num_hard_negatives = 2
```
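For illustration, these settings roughly map onto haystack's DPR training API as sketched below. This is a minimal sketch, not the exact training script: argument names follow `DensePassageRetriever.train()` as of haystack v1, the file paths are placeholders, and the base checkpoints are assumed to be `deepset/gbert-base`.
```python
# Hypothetical sketch of the training call; paths and some argument names are
# placeholders and may differ between haystack versions.
from haystack.retriever.dense import DensePassageRetriever

retriever = DensePassageRetriever(
    document_store=document_store,          # an initialized haystack DocumentStore
    query_embedding_model="deepset/gbert-base",
    passage_embedding_model="deepset/gbert-base",
    max_seq_len_query=32,                   # matches max_seq_len above
    max_seq_len_passage=300,
)
retriever.train(
    data_dir="data/germandpr",              # placeholder path
    train_filename="GermanDPR_train.json",  # placeholder filename
    batch_size=40,
    n_epochs=20,
    num_warmup_steps=460,
    learning_rate=1e-6,
    num_hard_negatives=2,
)
```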
## Performance
During training, we monitored the in-batch average rank and the loss, and evaluated different batch sizes, numbers of epochs, and numbers of hard negatives on a dev set split from the train set.
The dev split contained 1030 question/answer pairs.
Even without thorough hyperparameter tuning, we observed stable learning; multiple restarts with different seeds produced very similar results.
Note that the in-batch average rank is influenced by the batch size and the number of hard negatives: a smaller number of hard negatives makes the task easier.
After fixing the hyperparameters we trained the model on the full GermanDPR train set.
We further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.

## Usage
### In haystack
You can load the model in [haystack](https://github.com/deepset-ai/haystack/) as a retriever for doing QA at scale:
```python
from haystack.retriever.dense import DensePassageRetriever  # import path varies by haystack version

retriever = DensePassageRetriever(
    document_store=document_store,  # an initialized haystack DocumentStore
    query_embedding_model="deepset/gbert-base-germandpr-question_encoder",
    passage_embedding_model="deepset/gbert-base-germandpr-ctx_encoder",
)
```
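Outside of haystack, the checkpoint can presumably also be loaded directly with the DPR classes in transformers; a sketch, assuming the uploaded weights are compatible with `DPRContextEncoder`:
```python
from transformers import DPRContextEncoder, DPRContextEncoderTokenizer

tokenizer = DPRContextEncoderTokenizer.from_pretrained("deepset/gbert-base-germandpr-ctx_encoder")
model = DPRContextEncoder.from_pretrained("deepset/gbert-base-germandpr-ctx_encoder")

inputs = tokenizer("Die Zugspitze ist der höchste Berg Deutschlands.", return_tensors="pt")
passage_embedding = model(**inputs).pooler_output  # one 768-dim vector per passage
```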
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "datasets": ["deepset/germandpr"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg"}
| null |
deepset/gbert-base-germandpr-ctx_encoder
|
[
"transformers",
"pytorch",
"dpr",
"exbert",
"de",
"dataset:deepset/germandpr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #dpr #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us
|
!bert_image
## Overview
Language model: gbert-base-germandpr
Language: German
Training data: GermanDPR train set (~ 56MB)
Eval data: GermanDPR test set (~ 6MB)
Infrastructure: 4x V100 GPU
Published: Apr 26th, 2021
## Details
- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.
- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.
- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.
For each pair, there are one positive context and three hard negative contexts.
- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).
- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.
See URL for more details and dataset download.
## Hyperparameters
## Performance
During training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.
The dev split contained 1030 question/answer pairs.
Even without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.
Note that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.
After fixing the hyperparameters we trained the model on the full GermanDPR train set.
We further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.
!performancetable
## Usage
### In haystack
You can load the model in haystack as a retriever for doing QA at scale:
## Authors
- Timo Möller: 'timo.moeller [at] URL'
- Julian Risch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021",
"## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.",
"## Hyperparameters",
"## Performance\nDuring training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.\nThe dev split contained 1030 question/answer pairs.\nEven without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.\nNote that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.\nAfter fixing the hyperparameters we trained the model on the full GermanDPR train set.\n \nWe further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.\n!performancetable",
"## Usage",
"### In haystack\nYou can load the model in haystack as a retriever for doing QA at scale:",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #dpr #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us \n",
"## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021",
"## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.",
"## Hyperparameters",
"## Performance\nDuring training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.\nThe dev split contained 1030 question/answer pairs.\nEven without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.\nNote that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.\nAfter fixing the hyperparameters we trained the model on the full GermanDPR train set.\n \nWe further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.\n!performancetable",
"## Usage",
"### In haystack\nYou can load the model in haystack as a retriever for doing QA at scale:",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
48,
59,
187,
5,
229,
3,
27,
47,
118
] |
[
"passage: TAGS\n#transformers #pytorch #dpr #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us \n## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.## Hyperparameters"
] |
[
-0.10844545066356659,
0.11103019118309021,
0.0005179223953746259,
0.07334968447685242,
0.05571523681282997,
0.01724947988986969,
0.17428110539913177,
0.1168983057141304,
0.059548549354076385,
0.03896645829081535,
0.006519480142742395,
-0.12892886996269226,
0.04112428054213524,
0.1385737955570221,
0.06349482387304306,
-0.25598931312561035,
0.03992544487118721,
-0.05240015685558319,
-0.03731405735015869,
0.07824084162712097,
0.13041602075099945,
-0.09764422476291656,
0.05842101201415062,
-0.044952500611543655,
-0.09987328946590424,
0.13280270993709564,
-0.047075215727090836,
-0.0030210595577955246,
0.10843861103057861,
-0.012744871899485588,
0.09380752593278885,
-0.05010095238685608,
0.0927620530128479,
-0.11481030285358429,
0.0037890125531703234,
0.018165303394198418,
0.03405491262674332,
0.03749152645468712,
0.0028984027449041605,
-0.05200342461466789,
0.10609766095876694,
-0.10770631581544876,
0.02258843369781971,
0.021405676379799843,
-0.06188860908150673,
-0.19493870437145233,
-0.11475947499275208,
0.14844952523708344,
0.025764774531126022,
0.11792496591806412,
-0.025348376482725143,
0.04058554768562317,
-0.10233735293149948,
-0.010269173420965672,
0.09835977107286453,
-0.1125078797340393,
-0.02344711683690548,
0.08005665987730026,
-0.022807424888014793,
0.06582989543676376,
-0.10930826514959335,
-0.026277482509613037,
0.052926115691661835,
0.035385094583034515,
-0.03359981253743172,
-0.010743514634668827,
-0.01084899716079235,
-0.015555545687675476,
-0.0897049605846405,
-0.03681952506303787,
0.19260519742965698,
-0.01812772825360298,
-0.08306898176670074,
-0.19256970286369324,
-0.01649351976811886,
0.035209354013204575,
0.04505746066570282,
0.011876916512846947,
-0.010052508674561977,
-0.02061968855559826,
-0.03530681133270264,
-0.04811585694551468,
-0.11587295681238174,
-0.014124835841357708,
-0.06765321642160416,
0.1827060878276825,
0.04374699294567108,
0.031048784032464027,
0.002961433492600918,
0.14097149670124054,
-0.12677329778671265,
-0.11051950603723526,
-0.04202038794755936,
-0.06431017816066742,
0.005305592902004719,
0.0020295577123761177,
-0.07027114182710648,
-0.1364395171403885,
0.020079078152775764,
0.15987738966941833,
-0.03597471863031387,
-0.002822915790602565,
-0.038323961198329926,
0.012371508404612541,
0.11607617139816284,
0.17965471744537354,
-0.0003369925543665886,
-0.06318681687116623,
-0.004682921804487705,
-0.007574690971523523,
0.06715807318687439,
0.017853448167443275,
-0.048673633486032486,
-0.05775001272559166,
0.0026208478957414627,
0.05766380950808525,
-0.02486838586628437,
0.017258085310459137,
-0.03464524820446968,
-0.04054083302617073,
0.158238023519516,
-0.1565673053264618,
-0.0034873990807682276,
0.020544247701764107,
-0.10558901727199554,
0.08628150075674057,
-0.02624841220676899,
-0.02605632320046425,
-0.07825382053852081,
0.09539040178060532,
-0.016197003424167633,
0.00032357065356336534,
-0.14597027003765106,
-0.12221014499664307,
0.057150695472955704,
0.01408325880765915,
-0.012274066917598248,
-0.07258618623018265,
-0.1519593596458435,
-0.02531648613512516,
0.05319349467754364,
0.0013774522813037038,
0.09033339470624924,
-0.03890534117817879,
0.034329719841480255,
0.030297376215457916,
-0.025136448442935944,
-0.07764942944049835,
-0.015420823357999325,
0.0153163131326437,
-0.05621293559670448,
0.007014842238277197,
-0.1119135394692421,
0.03584650531411171,
-0.13253498077392578,
-0.0535426028072834,
-0.22255031764507294,
0.07159453630447388,
-0.10716193914413452,
-0.04041672870516777,
-0.10422991961240768,
-0.06306244432926178,
-0.027709776535630226,
0.014278951101005077,
0.029418861493468285,
0.1508904993534088,
-0.1903674602508545,
-0.05436546355485916,
0.16163741052150726,
-0.164127916097641,
-0.0058876462280750275,
0.1386697143316269,
-0.0751374214887619,
-0.018648240715265274,
0.08736740797758102,
0.17610733211040497,
0.07697321474552155,
-0.18127915263175964,
-0.10453791916370392,
0.02323245070874691,
0.03973793238401413,
0.045733172446489334,
0.09441214054822922,
-0.06848426908254623,
0.08969073742628098,
-0.0033012342173606157,
-0.07587005943059921,
0.014399749226868153,
-0.023029685020446777,
-0.03299159184098244,
-0.004365919623523951,
-0.03794689103960991,
0.10532893240451813,
0.009414485655725002,
-0.012890852056443691,
-0.0733763724565506,
-0.068766750395298,
0.0943707525730133,
0.15627045929431915,
-0.09441298991441727,
0.008898114785552025,
-0.0006973127601668239,
-0.04392797872424126,
-0.0005679064197465777,
-0.038482680916786194,
-0.12415745109319687,
-0.1818457990884781,
0.06538522243499756,
-0.1273157149553299,
0.07881400734186172,
0.05236634984612465,
0.07524716109037399,
0.03731058910489082,
-0.09569814056158066,
-0.009991784580051899,
-0.14566916227340698,
-0.0442451536655426,
0.006632368545979261,
-0.12053733319044113,
-0.03913191333413124,
-0.03639623150229454,
0.04997757449746132,
-0.024690289050340652,
-0.025746960192918777,
0.02605734020471573,
0.09941165894269943,
0.023905184119939804,
-0.06080549210309982,
-0.04505191743373871,
0.033962734043598175,
-0.019187677651643753,
-0.036488696932792664,
-0.029293736442923546,
-0.01901954971253872,
-0.045986223965883255,
0.05534486100077629,
-0.005706363823264837,
-0.03574652597308159,
0.08618803322315216,
0.10815707594156265,
-0.07522127032279968,
-0.012514106929302216,
-0.08920017629861832,
-0.022299444302916527,
-0.1460869163274765,
-0.09269461780786514,
0.209975928068161,
0.008305668830871582,
0.03603889420628548,
-0.04717293754220009,
-0.04848480597138405,
-0.047952331602573395,
-0.015662219375371933,
-0.06495005637407303,
0.07697585970163345,
-0.025535572320222855,
-0.08872034400701523,
0.1202756017446518,
0.051771316677331924,
-0.016480546444654465,
0.24563953280448914,
-0.056927695870399475,
-0.10950063169002533,
0.0007023422513157129,
0.003886060556396842,
-0.04224380850791931,
0.1857854723930359,
0.052719760686159134,
-0.017142126336693764,
0.03986372426152229,
0.05568461865186691,
0.057483866810798645,
-0.08464527130126953,
0.019572364166378975,
-0.024884499609470367,
-0.05860273167490959,
-0.07516111433506012,
0.029641257598996162,
-0.015552377328276634,
0.08843663334846497,
-0.0022571405861526728,
0.055210649967193604,
-0.004808052442967892,
-0.04462302848696709,
-0.108952596783638,
0.1427404135465622,
-0.11203774809837341,
-0.20688796043395996,
-0.13228371739387512,
0.08059452474117279,
-0.09588181227445602,
0.008652543649077415,
0.0005311148124746978,
-0.0767124816775322,
-0.09388840943574905,
-0.05474141985177994,
0.12856590747833252,
-0.02739771455526352,
-0.06756275147199631,
-0.07013341039419174,
-0.0038285967893898487,
0.026306837797164917,
-0.1809285432100296,
-0.02620367519557476,
-0.01730245351791382,
-0.06606465578079224,
-0.025767067447304726,
0.008439261466264725,
0.042428333312273026,
0.019134489819407463,
-0.018406258895993233,
-0.017849400639533997,
-0.034273624420166016,
0.11021767556667328,
-0.12227360159158707,
0.03987513855099678,
0.16119123995304108,
-0.06856013834476471,
0.005392129998654127,
0.07624059170484543,
0.06231606379151344,
-0.04973515868186951,
0.043764885514974594,
0.044744763523340225,
-0.044229116290807724,
-0.23892554640769958,
-0.13209602236747742,
-0.05687404051423073,
-0.0015452896477654576,
0.07257881760597229,
0.03971453011035919,
-0.013269350863993168,
0.00796953309327364,
-0.08980198949575424,
-0.047280170023441315,
0.008391693234443665,
0.03645092248916626,
0.06295031309127808,
0.008340838365256786,
0.03748093917965889,
-0.09373615682125092,
-0.057960253208875656,
0.11859013140201569,
0.011614575982093811,
0.20057600736618042,
-0.04235108941793442,
0.004425502382218838,
0.03065185807645321,
0.09096625447273254,
0.008183661848306656,
0.11621488630771637,
-0.0022106687538325787,
-0.024768300354480743,
-0.011735327541828156,
-0.07183327525854111,
-0.0022005955688655376,
0.07157020270824432,
0.04124733805656433,
-0.019026193767786026,
-0.06439029425382614,
-0.08060136437416077,
0.10257072001695633,
0.1779673546552658,
0.09430570900440216,
-0.10302656888961792,
-0.13079965114593506,
-0.01180176343768835,
-0.056745871901512146,
-0.015045051462948322,
0.019459586590528488,
0.10612449049949646,
-0.13684110343456268,
0.060920026153326035,
-0.010041429661214352,
0.06441324204206467,
-0.14808133244514465,
-0.002570181153714657,
0.026948412880301476,
0.05150184780359268,
-0.020724236965179443,
0.11726744472980499,
-0.2564564049243927,
0.15125426650047302,
-0.0005669370293617249,
0.09937787801027298,
-0.1124003529548645,
0.032867904752492905,
0.03407833352684975,
-0.1149907335639,
0.09964346885681152,
0.039699114859104156,
-0.11609472334384918,
0.011799542233347893,
-0.14606551826000214,
0.0318964459002018,
0.11758706718683243,
-0.07805280387401581,
0.054571446031332016,
-0.0015685532707720995,
0.017261669039726257,
0.0009850402129814029,
0.03673003613948822,
-0.1510726362466812,
-0.10869056731462479,
0.02926192618906498,
-0.01395620871335268,
-0.06983522325754166,
-0.05430765450000763,
-0.06869902461767197,
0.03917867690324783,
0.1654302179813385,
-0.06519440561532974,
-0.06649215519428253,
-0.13000158965587616,
0.10026322305202484,
0.1441102772951126,
-0.07267878204584122,
0.046246808022260666,
0.07031618058681488,
0.09604918956756592,
-0.043081946671009064,
-0.15156260132789612,
0.04712699353694916,
-0.10524051636457443,
-0.08438042551279068,
-0.05227859690785408,
0.12448016554117203,
0.1527855545282364,
0.0386962816119194,
-0.0329306423664093,
0.0179218091070652,
0.044500164687633514,
-0.11251003295183182,
0.030011383816599846,
0.11164472997188568,
0.010788420215249062,
0.13416017591953278,
-0.10873933136463165,
-0.11132442206144333,
-0.014714325778186321,
-0.03982522711157799,
0.09445617347955704,
0.05387646332383156,
-0.036338794976472855,
0.09574048966169357,
0.13936670124530792,
-0.0638384222984314,
-0.3150402903556824,
0.05058303102850914,
0.09321891516447067,
0.053434647619724274,
-0.028408534824848175,
-0.2748658061027527,
0.12696239352226257,
0.09607266634702682,
-0.03879125788807869,
0.007460216525942087,
-0.28859132528305054,
-0.1254618912935257,
0.09679928421974182,
-0.0405237078666687,
0.10775476694107056,
-0.013258404098451138,
-0.01003173179924488,
-0.03487494960427284,
-0.10092687606811523,
0.05744194984436035,
-0.06277903914451599,
0.06422074139118195,
0.016147995367646217,
0.11563209444284439,
0.04388728365302086,
-0.05595122277736664,
0.08623349666595459,
0.10806910693645477,
0.07925799489021301,
-0.06446830183267593,
-0.049711924046278,
-0.01314211543649435,
-0.027132650837302208,
0.21892710030078888,
-0.006855303421616554,
0.0466967448592186,
-0.08742456138134003,
-0.02362888865172863,
-0.02210894413292408,
0.10865902155637741,
-0.02044440433382988,
-0.07948999106884003,
-0.10788647830486298,
0.11162079125642776,
0.0050046141259372234,
-0.010674926452338696,
0.05659535527229309,
0.010076655074954033,
-0.01086396910250187,
0.008973751217126846,
0.08007372170686722,
0.1843392252922058,
-0.08278747648000717,
-0.02404242567718029,
-0.026270003989338875,
0.06658942997455597,
-0.11211765557527542,
0.054350532591342926,
0.1533854603767395,
0.028958085924386978,
0.11098326742649078,
0.00639569154009223,
-0.10656675696372986,
0.02591572515666485,
0.061489544808864594,
-0.1431882381439209,
-0.13203832507133484,
-0.03784361109137535,
-0.06657303869724274,
-0.07005652040243149,
0.054898038506507874,
0.16648972034454346,
-0.08110731095075607,
-0.01222437247633934,
-0.029837995767593384,
0.03868520259857178,
-0.056660059839487076,
0.14720577001571655,
0.035343803465366364,
0.008273446001112461,
-0.07595577090978622,
0.19930419325828552,
0.03825342282652855,
-0.11586875468492508,
0.08179514855146408,
0.05472096428275108,
-0.045463208109140396,
-0.00259764539077878,
-0.006744559854269028,
0.1046452596783638,
-0.132552370429039,
-0.058388467878103256,
-0.047979071736335754,
-0.07358584553003311,
-0.005112594924867153,
-0.005732513032853603,
0.061823923140764236,
0.07355774194002151,
-0.04312436282634735,
0.034251030534505844,
-0.10207479447126389,
0.07807090878486633,
0.07087772339582443,
0.03356880694627762,
0.014442171901464462,
0.019396329298615456,
-0.03220956027507782,
-0.02545134164392948,
-0.03720840811729431,
0.004839382134377956,
-0.0713593065738678,
-0.0322134755551815,
-0.09559758752584457,
-0.0701078325510025,
0.02788035199046135,
-0.02573523484170437,
-0.007958082482218742,
-0.07141385227441788,
0.02928738482296467,
0.06995167583227158,
-0.07618673145771027,
-0.004431052599102259,
-0.034165527671575546,
0.05757917836308479,
-0.17544950544834137,
0.006081927102059126,
0.0486622229218483,
-0.04536035284399986,
0.11981451511383057,
0.14244462549686432,
0.008023742586374283,
0.07059159874916077,
-0.0916760265827179,
-0.051568061113357544,
-0.0111375218257308,
0.05345763638615608,
0.0297139473259449,
-0.05128086730837822,
0.0031981957145035267,
0.06568779051303864,
-0.004806273616850376,
0.02999112196266651,
0.013133211992681026,
-0.066154345870018,
0.029372824355959892,
0.028664110228419304,
-0.06820037215948105,
-0.06403578817844391,
0.09634979814291,
0.12166820466518402,
0.08084645867347717,
0.08842582255601883,
-0.1029096245765686,
0.05021268501877785,
-0.09120916575193405,
0.017340006306767464,
0.03925790637731552,
-0.055033087730407715,
-0.08654315769672394,
0.018738064914941788,
0.054320089519023895,
-0.030644947662949562,
0.22101925313472748,
0.016358258202672005,
-0.010003124363720417,
0.03299219533801079,
-0.016162468120455742,
0.012669255025684834,
0.03589467704296112,
0.14898905158042908,
-0.04699451103806496,
-0.015948036685585976,
-0.0697372704744339,
-0.005305215250700712,
-0.020581934601068497,
0.058240510523319244,
0.21462401747703552,
0.15259245038032532,
0.10753154009580612,
0.04159494489431381,
0.024786248803138733,
-0.07094286382198334,
-0.09086009860038757,
-0.03538211062550545,
0.011656788177788258,
0.07320670783519745,
-0.02618139237165451,
0.10508602857589722,
0.07065185159444809,
-0.1805318146944046,
0.10704206675291061,
-0.014372128061950207,
-0.04467659443616867,
-0.06835085898637772,
-0.07743292301893234,
-0.027903815731406212,
-0.09558811038732529,
0.03778179734945297,
-0.1438523381948471,
0.008589902892708778,
0.05853631719946861,
0.08588780462741852,
-0.07100500166416168,
0.20982670783996582,
-0.09714575111865997,
-0.06308972835540771,
0.11738038808107376,
0.017335647717118263,
0.014904019422829151,
0.05533752590417862,
-0.02821269817650318,
0.0035384432412683964,
0.05253276973962784,
0.0863063856959343,
0.021417900919914246,
0.04048745706677437,
-0.023116005584597588,
-0.03460824862122536,
-0.033599432557821274,
-0.028570381924510002,
-0.03661993891000748,
0.03600131720304489,
0.1723492294549942,
0.050333909690380096,
0.0023694620467722416,
-0.018914086744189262,
0.1519089937210083,
-0.053286030888557434,
-0.11672275513410568,
-0.13192762434482574,
0.19278427958488464,
0.03345879912376404,
0.01995810493826866,
0.015901483595371246,
-0.10704055428504944,
-0.018476126715540886,
0.07194871455430984,
0.2532210946083069,
-0.0048520504496991634,
0.0036579836159944534,
-0.0024081976152956486,
0.0032181208953261375,
0.02659413404762745,
0.09691102802753448,
0.00476633757352829,
0.2855558693408966,
-0.010682126507163048,
0.04679735377430916,
-0.007968958467245102,
-0.016820350661873817,
0.004863150417804718,
0.16208983957767487,
-0.024753373116254807,
-0.05968084931373596,
-0.13805322349071503,
0.11188813298940659,
0.019221851602196693,
-0.1754949688911438,
-0.01744171790778637,
-0.10459711402654648,
-0.1282246857881546,
0.0131342438980937,
0.08320099115371704,
0.09476660192012787,
0.07878347486257553,
-0.008377655409276485,
0.04387558996677399,
0.1494503617286682,
-0.0009255733457393944,
-0.0688793733716011,
-0.08710341155529022,
0.08685869723558426,
-0.05852803587913513,
0.21327559649944305,
0.00531345559284091,
0.10239004343748093,
0.06499277055263519,
-0.016750521957874298,
-0.08779395371675491,
0.002239423571154475,
0.027379201725125313,
-0.0883188545703888,
-0.035757746547460556,
0.1593467891216278,
-0.05977647006511688,
0.12502524256706238,
0.05546391010284424,
-0.006676909979432821,
0.031053686514496803,
0.05208607017993927,
-0.04299025610089302,
-0.13253110647201538,
0.1049896627664566,
-0.08922769874334335,
0.1412609964609146,
0.18054133653640747,
-0.016474375501275063,
0.021919941529631615,
-0.06005677580833435,
0.0505194216966629,
0.04864463582634926,
0.08509500324726105,
0.023384157568216324,
-0.15707777440547943,
0.015954071655869484,
-0.02696741744875908,
-0.012128024362027645,
-0.15671473741531372,
-0.035745684057474136,
-0.01142642367631197,
-0.010375184006989002,
-0.03264185041189194,
0.07322558015584946,
0.0210009403526783,
0.007700544316321611,
-0.0074449703097343445,
-0.042179469019174576,
0.011505087837576866,
0.010660726577043533,
-0.05763555318117142,
-0.0664246454834938
] |
null | null |
transformers
|

## Overview
**Language model:** gbert-base-germandpr
**Language:** German
**Training data:** GermanDPR train set (~ 56MB)
**Eval data:** GermanDPR test set (~ 6MB)
**Infrastructure**: 4x V100 GPU
**Published**: Apr 26th, 2021
## Details
- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.
- The dataset is GermanDPR, a new German-language dataset, which we hand-annotated and published [online](https://deepset.ai/germanquad).
- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.
Each pair comes with one positive context and three hard negative contexts.
- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).
- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.
See https://deepset.ai/germanquad for more details and dataset download.
## Hyperparameters
```
batch_size = 40
n_epochs = 20
num_training_steps = 4640
num_warmup_steps = 460
max_seq_len = 32 tokens for question encoder and 300 tokens for passage encoder
learning_rate = 1e-6
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
num_hard_negatives = 2
```
## Performance
During training, we monitored the in-batch average rank and the loss, and evaluated different batch sizes, numbers of epochs, and numbers of hard negatives on a dev set split from the train set.
The dev split contained 1030 question/answer pairs.
Even without thorough hyperparameter tuning, we observed stable learning; multiple restarts with different seeds produced very similar results.
Note that the in-batch average rank is influenced by the batch size and the number of hard negatives: a smaller number of hard negatives makes the task easier.
After fixing the hyperparameters we trained the model on the full GermanDPR train set.
We further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.

## Usage
### In haystack
You can load the model in [haystack](https://github.com/deepset-ai/haystack/) as a retriever for doing QA at scale:
```python
from haystack.retriever.dense import DensePassageRetriever  # import path varies by haystack version

retriever = DensePassageRetriever(
    document_store=document_store,  # an initialized haystack DocumentStore
    query_embedding_model="deepset/gbert-base-germandpr-question_encoder",
    passage_embedding_model="deepset/gbert-base-germandpr-ctx_encoder",
)
```
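Outside of haystack, a sketch of loading the question encoder directly with the DPR classes in transformers (assuming the uploaded weights are compatible with `DPRQuestionEncoder`; `passage_embeddings` is assumed to be precomputed with the matching ctx_encoder model):
```python
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer

tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
model = DPRQuestionEncoder.from_pretrained("deepset/gbert-base-germandpr-question_encoder")

inputs = tokenizer("Wie hoch ist die Zugspitze?", return_tensors="pt")
query_embedding = model(**inputs).pooler_output  # 768-dim query vector

# Retrieval scores are dot products between the query vector and passage vectors
# produced by deepset/gbert-base-germandpr-ctx_encoder (assumed precomputed here).
scores = query_embedding @ passage_embeddings.T
```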
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "datasets": ["deepset/germandpr"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg"}
|
feature-extraction
|
deepset/gbert-base-germandpr-question_encoder
|
[
"transformers",
"pytorch",
"safetensors",
"dpr",
"feature-extraction",
"exbert",
"de",
"dataset:deepset/germandpr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #safetensors #dpr #feature-extraction #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us
|
!bert_image
## Overview
Language model: gbert-base-germandpr
Language: German
Training data: GermanDPR train set (~ 56MB)
Eval data: GermanDPR test set (~ 6MB)
Infrastructure: 4x V100 GPU
Published: Apr 26th, 2021
## Details
- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.
- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.
- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.
For each pair, there are one positive context and three hard negative contexts.
- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).
- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.
See URL for more details and dataset download.
## Hyperparameters
## Performance
During training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.
The dev split contained 1030 question/answer pairs.
Even without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.
Note that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.
After fixing the hyperparameters we trained the model on the full GermanDPR train set.
We further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.
!performancetable
## Usage
### In haystack
You can load the model in haystack as a retriever for doing QA at scale:
## Authors
- Timo Möller: 'timo.moeller [at] URL'
- Julian Risch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021",
"## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.",
"## Hyperparameters",
"## Performance\nDuring training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.\nThe dev split contained 1030 question/answer pairs.\nEven without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.\nNote that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.\nAfter fixing the hyperparameters we trained the model on the full GermanDPR train set.\n \nWe further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.\n!performancetable",
"## Usage",
"### In haystack\nYou can load the model in haystack as a retriever for doing QA at scale:",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #dpr #feature-extraction #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us \n",
"## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021",
"## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.",
"## Hyperparameters",
"## Performance\nDuring training, we monitored the in-batch average rank and the loss and evaluated different batch sizes, numbers of epochs, and number of hard negatives on a dev set split from the train set.\nThe dev split contained 1030 question/answer pairs.\nEven without thorough hyperparameter tuning, we observed quite stable learning. Multiple restarts with different seeds produced quite similar results.\nNote that the in-batch average rank is influenced by settings for batch size and number of hard negatives. A smaller number of hard negatives makes the task easier.\nAfter fixing the hyperparameters we trained the model on the full GermanDPR train set.\n \nWe further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.\n!performancetable",
"## Usage",
"### In haystack\nYou can load the model in haystack as a retriever for doing QA at scale:",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
59,
59,
187,
5,
229,
3,
27,
47,
118
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #dpr #feature-extraction #exbert #de #dataset-deepset/germandpr #license-mit #endpoints_compatible #has_space #region-us \n## Overview\nLanguage model: gbert-base-germandpr \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 4x V100 GPU \nPublished: Apr 26th, 2021## Details\n- We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.\n- The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.\n- It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.\nFor each pair, there are one positive context and three hard negative contexts.\n- As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).\n- The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.\n\nSee URL for more details and dataset download.## Hyperparameters"
] |
[
-0.11299958825111389,
0.07363506406545639,
-0.0006193685694597661,
0.06938125193119049,
0.054069940000772476,
0.03945962339639664,
0.1644134819507599,
0.10362701117992401,
0.015287978574633598,
0.049251802265644073,
-0.0035605509765446186,
-0.12323492765426636,
0.04297959804534912,
0.10546499490737915,
0.0337318554520607,
-0.2138376533985138,
0.08376523852348328,
-0.04453754425048828,
-0.016559813171625137,
0.07574483752250671,
0.12319812178611755,
-0.07617664337158203,
0.04766033962368965,
-0.06052154675126076,
-0.05786428600549698,
0.08339642733335495,
-0.024696109816432,
0.0095559973269701,
0.10661854594945908,
-0.0019927998073399067,
0.10465148836374283,
-0.001948907389305532,
0.02350146323442459,
-0.13040725886821747,
0.015144618228077888,
0.03832454979419708,
-0.01628652960062027,
0.020762190222740173,
-0.003108491888269782,
-0.06889605522155762,
0.054943013936281204,
-0.08972366899251938,
0.04232851043343544,
0.019797656685113907,
-0.09031690657138824,
-0.21749569475650787,
-0.13171334564685822,
0.11277510225772858,
0.06656597554683685,
0.11211667209863663,
-0.03980497643351555,
0.029687799513339996,
-0.17110420763492584,
0.009246911853551865,
0.11986864358186722,
-0.15790186822414398,
-0.008009441196918488,
0.028682425618171692,
-0.06038599833846092,
0.08397515118122101,
-0.10423099249601364,
-0.013240891508758068,
0.04829951748251915,
0.03575654327869415,
-0.006282308604568243,
-0.004910086747258902,
0.04149814322590828,
-0.01590385101735592,
-0.11103521287441254,
-0.04093609377741814,
0.1537487804889679,
-0.022046929225325584,
-0.09046914428472519,
-0.1811446249485016,
-0.028051801025867462,
-0.006840291433036327,
0.026560934260487556,
0.01325993612408638,
-0.017833828926086426,
-0.0027246875688433647,
-0.003284391714259982,
-0.04148311913013458,
-0.08749566972255707,
-0.007901481352746487,
-0.054125845432281494,
0.19170218706130981,
0.06655054539442062,
0.02301093004643917,
0.006702334154397249,
0.121492400765419,
-0.07891335338354111,
-0.10784827917814255,
-0.04075654596090317,
-0.06035676226019859,
0.004665293265134096,
0.013553038239479065,
-0.06168099865317345,
-0.13295653462409973,
0.023311369121074677,
0.15231987833976746,
-0.12034370005130768,
0.01712978072464466,
-0.050489265471696854,
0.02265063486993313,
0.06137024611234665,
0.14140421152114868,
-0.01966843754053116,
-0.047905053943395615,
0.03477180376648903,
0.00219280319288373,
0.08582045137882233,
0.012913662940263748,
-0.03759841248393059,
-0.01846187375485897,
0.016520027071237564,
0.10621421039104462,
-0.023351775482296944,
0.018200254067778587,
-0.07801998406648636,
-0.0338757187128067,
0.14665353298187256,
-0.15012522041797638,
-0.004183435346931219,
0.05636277422308922,
-0.04984792321920395,
0.08011519908905029,
-0.026059890165925026,
-0.040056534111499786,
-0.09251800924539566,
0.09711763262748718,
-0.016487495973706245,
-0.011238940991461277,
-0.12138725072145462,
-0.15769825875759125,
0.05728607997298241,
-0.016225574538111687,
-0.031945936381816864,
-0.07119477540254593,
-0.1837606430053711,
-0.028377927839756012,
0.035122714936733246,
0.009180029854178429,
0.08909913897514343,
-0.016753772273659706,
-0.018474610522389412,
0.040870022028684616,
-0.005327595863491297,
-0.13798221945762634,
-0.038542360067367554,
0.012299281544983387,
-0.06849666684865952,
0.031020496040582657,
-0.08431312441825867,
0.02958487905561924,
-0.1441338062286377,
-0.010213433764874935,
-0.2232445329427719,
0.07785138487815857,
-0.1431467980146408,
-0.06228005141019821,
-0.08355333656072617,
-0.06146974116563797,
-0.05235385149717331,
0.020150478929281235,
0.03589651361107826,
0.1743312031030655,
-0.18758103251457214,
-0.06626840680837631,
0.1572738140821457,
-0.20935003459453583,
-0.006065940484404564,
0.16796737909317017,
-0.05597270280122757,
-0.041399601846933365,
0.08575873076915741,
0.140426903963089,
0.09092875570058823,
-0.16585545241832733,
-0.0919327661395073,
-0.020130081102252007,
0.029134919866919518,
0.10702542215585709,
0.07845797389745712,
-0.0721321776509285,
0.05367414280772209,
0.005881595890969038,
-0.10059742629528046,
0.02973032556474209,
-0.005181485321372747,
-0.03837529569864273,
0.0007982024690136313,
-0.02266741916537285,
0.1802893728017807,
-0.004208357073366642,
-0.006327144801616669,
-0.07102968543767929,
-0.09693251550197601,
0.04772495478391647,
0.12215172499418259,
-0.09859321266412735,
0.025315819308161736,
-0.012865317054092884,
0.012379765510559082,
-0.023867987096309662,
-0.009586930274963379,
-0.11764226853847504,
-0.2144106924533844,
0.04985975846648216,
-0.1801731288433075,
0.020197536796331406,
0.028454888612031937,
0.05329402536153793,
0.04403090476989746,
-0.12361329793930054,
-0.05734134092926979,
-0.17243210971355438,
-0.01688087172806263,
0.02395845577120781,
-0.13855940103530884,
-0.010358338244259357,
-0.027290241792798042,
0.08471034467220306,
-0.03252899646759033,
-0.00466223293915391,
0.07114149630069733,
0.09255048632621765,
0.05018043518066406,
-0.04785631224513054,
-0.0190667025744915,
0.06858502328395844,
-0.010797585360705853,
-0.04079575464129448,
-0.025550711899995804,
-0.03902892395853996,
-0.02846318483352661,
0.11502061039209366,
-0.04357830062508583,
-0.005878599360585213,
0.1094907894730568,
0.13349944353103638,
-0.08199527859687805,
0.04846114292740822,
-0.0570746548473835,
-0.03166288509964943,
-0.133344367146492,
-0.0775105208158493,
0.19161061942577362,
-0.009778732433915138,
0.025681866332888603,
-0.046474043279886246,
-0.011227305978536606,
-0.05181720107793808,
-0.014606811106204987,
-0.054854616522789,
0.08224068582057953,
0.0003671989543363452,
-0.0801038146018982,
0.14474456012248993,
0.11491220444440842,
-0.06147877126932144,
0.22824494540691376,
-0.061670079827308655,
-0.1084466502070427,
0.009347565472126007,
0.03246558830142021,
-0.01821618527173996,
0.2040102779865265,
0.07191066443920135,
0.0035130411852151155,
0.03406307101249695,
0.057036835700273514,
0.05280556157231331,
-0.09309712797403336,
0.021727440878748894,
-0.03551024943590164,
-0.06612906605005264,
-0.04487429931759834,
0.02929232455790043,
0.013243358582258224,
0.10042157024145126,
-0.008430393412709236,
0.04855210706591606,
0.028943100944161415,
-0.03246002271771431,
-0.11589640378952026,
0.16992725431919098,
-0.13710005581378937,
-0.23731403052806854,
-0.11020385473966599,
0.13399705290794373,
-0.11486760526895523,
-0.004140973091125488,
0.04767850413918495,
-0.09147801250219345,
-0.09582383185625076,
-0.06880071759223938,
0.16191960871219635,
0.003323484444990754,
-0.05603718012571335,
-0.08663405478000641,
-0.010407664813101292,
0.005197818856686354,
-0.1557844877243042,
-0.022357799112796783,
-0.01295776292681694,
-0.04998314008116722,
0.026382839307188988,
-0.008513864129781723,
0.054462119936943054,
0.039459750056266785,
0.019232003018260002,
-0.04969516769051552,
-0.04632074385881424,
0.10053993761539459,
-0.12694981694221497,
0.053524311631917953,
0.12543681263923645,
-0.08589319884777069,
0.023657632991671562,
0.10557470470666885,
0.031088044866919518,
-0.06148412078619003,
0.03488968685269356,
0.06262895464897156,
-0.04818958416581154,
-0.22028951346874237,
-0.14497312903404236,
-0.07451839745044708,
0.011730216443538666,
0.027373839169740677,
0.043372415006160736,
-0.002891309093683958,
0.010469431057572365,
-0.10360527038574219,
-0.08248664438724518,
0.052509862929582596,
0.0557248517870903,
0.11309199780225754,
0.0031081452034413815,
0.04124058410525322,
-0.09041013568639755,
-0.03378571197390556,
0.11235073208808899,
-0.012909910641610622,
0.23226141929626465,
-0.05576932802796364,
0.05283080413937569,
0.06060750037431717,
0.10939739644527435,
0.04595193639397621,
0.0979907289147377,
-0.007144267670810223,
-0.015601000748574734,
-0.0037701684050261974,
-0.06812435388565063,
0.0027161939069628716,
0.07547809928655624,
0.02706715650856495,
-0.022295191884040833,
-0.051934048533439636,
-0.08333265036344528,
0.11798173189163208,
0.20153912901878357,
0.10326611250638962,
-0.11459587514400482,
-0.13861437141895294,
0.012808527797460556,
-0.020973440259695053,
0.011439929716289043,
0.0188557468354702,
0.10160651057958603,
-0.11251024901866913,
0.034769896417856216,
-0.013020235113799572,
0.052564457058906555,
-0.13185659050941467,
-0.012917703948915005,
-0.010999093763530254,
0.023938389495015144,
-0.03200450912117958,
0.07969991117715836,
-0.2567698657512665,
0.11694362759590149,
-0.00009599478653399274,
0.09213101118803024,
-0.07691434770822525,
0.02642964944243431,
0.05081721022725105,
-0.06291850656270981,
0.08447954058647156,
0.046041905879974365,
-0.11519363522529602,
-0.004856209270656109,
-0.11272457987070084,
0.031829629093408585,
0.1151561513543129,
-0.06033987179398537,
0.05674945190548897,
0.022981993854045868,
0.029483061283826828,
0.007944810204207897,
0.07687383890151978,
-0.15055449306964874,
-0.08396092802286148,
0.05811210349202156,
-0.017933251336216927,
-0.03807850554585457,
-0.06599552929401398,
-0.09536094963550568,
0.027046840637922287,
0.19446536898612976,
-0.10519777238368988,
-0.08952559530735016,
-0.1343638151884079,
0.004180009942501783,
0.14075122773647308,
-0.08254072070121765,
0.05161665752530098,
0.06362348794937134,
0.09668266028165817,
-0.05144340172410011,
-0.14961621165275574,
0.10012839734554291,
-0.11089161783456802,
-0.1478559970855713,
-0.050067901611328125,
0.12325446307659149,
0.1221429854631424,
0.03627108782529831,
-0.024765372276306152,
0.035237450152635574,
0.07672829926013947,
-0.13480278849601746,
0.035670213401317596,
0.11373768746852875,
-0.005426694173365831,
0.13342152535915375,
-0.14269866049289703,
-0.12979596853256226,
-0.01847067102789879,
-0.022743312641978264,
0.10614274442195892,
0.1387878805398941,
-0.027567757293581963,
0.12281474471092224,
0.13636921346187592,
-0.05873149633407593,
-0.318897545337677,
0.039852071553468704,
0.05582737550139427,
0.061058640480041504,
0.0012830909108743072,
-0.24794597923755646,
0.11092177778482437,
0.09467419236898422,
-0.048602309077978134,
0.057834409177303314,
-0.252049058675766,
-0.11991260945796967,
0.10601701587438583,
0.004995474126189947,
0.16943979263305664,
-0.06188138946890831,
0.0022238485980778933,
-0.008514625951647758,
-0.09338923543691635,
0.06769245117902756,
-0.11600527912378311,
0.05517951026558876,
0.011372909881174564,
0.053783152252435684,
0.024343743920326233,
-0.056857671588659286,
0.09642160683870316,
0.07953089475631714,
0.06380731612443924,
-0.0584421269595623,
0.010392541065812111,
0.042139049619436264,
-0.023391690105199814,
0.2037750482559204,
-0.012248918414115906,
0.04474836587905884,
-0.10840379446744919,
-0.019020896404981613,
-0.049342572689056396,
0.10669097304344177,
-0.03850105032324791,
-0.07107816636562347,
-0.09112239629030228,
0.11527331918478012,
0.0382593534886837,
-0.018611840903759003,
0.061799369752407074,
0.014850079081952572,
-0.002245573326945305,
-0.004957336466759443,
0.03981880471110344,
0.17480787634849548,
-0.1023147702217102,
0.0008626999915577471,
-0.0433015339076519,
0.08046329766511917,
-0.10131872445344925,
0.07213941216468811,
0.09687474370002747,
0.019296541810035706,
0.13389639556407928,
0.023661334067583084,
-0.10777246206998825,
0.05122494325041771,
0.04743160679936409,
-0.1488499492406845,
-0.07941189408302307,
-0.02310130186378956,
-0.12268602102994919,
-0.043409768491983414,
0.05973818898200989,
0.1892808973789215,
-0.0699232891201973,
-0.0021908553317189217,
-0.040513813495635986,
0.07707860320806503,
-0.030481401830911636,
0.12225262075662613,
0.023982597514986992,
0.01155801024287939,
-0.08129502832889557,
0.18262508511543274,
0.036704085767269135,
-0.17760591208934784,
0.08276205509901047,
0.04178126901388168,
-0.08316327631473541,
-0.03243811056017876,
-0.013949963264167309,
0.08598274737596512,
-0.04277872294187546,
-0.054397329688072205,
-0.08474715799093246,
-0.1182590201497078,
-0.009356924332678318,
-0.0045883143320679665,
0.056675396859645844,
0.09791363775730133,
-0.02022351510822773,
0.0013902746140956879,
-0.07722175866365433,
0.07388612627983093,
0.06354542076587677,
0.04520283639431,
0.021177729591727257,
-0.01685035414993763,
0.011043368838727474,
-0.031888049095869064,
-0.048661671578884125,
0.016145234927535057,
-0.11616060882806778,
-0.03939259052276611,
-0.12543636560440063,
-0.06364383548498154,
-0.0003902889438904822,
-0.018056634813547134,
-0.019057497382164,
-0.07114025950431824,
0.004241541959345341,
0.08360539376735687,
-0.08046640455722809,
-0.021651508286595345,
-0.034327246248722076,
0.052706990391016006,
-0.15626277029514313,
0.007670334540307522,
0.04965503141283989,
-0.06911269575357437,
0.1381700336933136,
0.12818460166454315,
0.007646182086318731,
0.10006105899810791,
-0.07491447031497955,
-0.025898300111293793,
0.0023384620435535908,
0.007465746719390154,
0.012781048193573952,
-0.0820125937461853,
-0.013952253386378288,
0.0567733496427536,
-0.031857892870903015,
0.03724857047200203,
0.009195641614496708,
-0.07081969082355499,
0.029642950743436813,
-0.00303914537653327,
-0.06784998625516891,
-0.036366842687129974,
0.07544640451669693,
0.11002355813980103,
0.05956251546740532,
0.1311125010251999,
-0.10420207679271698,
0.0552428774535656,
-0.06385162472724915,
-0.01719248667359352,
0.03871731832623482,
-0.03594424948096275,
-0.06015278398990631,
0.02230057679116726,
0.04734659940004349,
-0.04071523994207382,
0.20563869178295135,
-0.038287173956632614,
-0.01659451797604561,
0.04788142070174217,
-0.003267632331699133,
0.0018900787690654397,
0.05597703531384468,
0.11339960247278214,
-0.06530550122261047,
-0.028016556054353714,
-0.09598685055971146,
-0.025989258661866188,
-0.01774866320192814,
0.030966179445385933,
0.19560766220092773,
0.15877428650856018,
0.1028582900762558,
0.03963441401720047,
-0.004494187422096729,
-0.06124464049935341,
-0.0676359012722969,
-0.0008118816767819226,
-0.017208045348525047,
0.07780972868204117,
0.012971306219696999,
0.126509889960289,
0.1154426634311676,
-0.15654438734054565,
0.10763361304998398,
0.02529802732169628,
-0.046832501888275146,
-0.06092262268066406,
-0.07919983565807343,
-0.041500918567180634,
-0.09866497665643692,
0.04081088304519653,
-0.14213617146015167,
-0.026541758328676224,
0.10152287036180496,
0.08737025409936905,
-0.05292974412441254,
0.23061853647232056,
-0.03521135076880455,
-0.08705892413854599,
0.11071164160966873,
0.006847230717539787,
0.007809705566614866,
0.13018612563610077,
-0.043243251740932465,
0.015003213658928871,
0.04193020239472389,
0.07620682567358017,
0.035349469631910324,
0.01035721879452467,
-0.003515031887218356,
-0.0627736896276474,
-0.046190887689590454,
-0.012562451884150505,
-0.004468610975891352,
0.023475628346204758,
0.1869892030954361,
0.061164386570453644,
-0.016382597386837006,
-0.0033619229216128588,
0.10852428525686264,
-0.02450256422162056,
-0.0822499468922615,
-0.13852457702159882,
0.15864980220794678,
0.055271364748477936,
0.0321609266102314,
0.01316378265619278,
-0.11760701984167099,
0.01406899280846119,
0.026394294574856758,
0.25281283259391785,
-0.011555318720638752,
0.02578643336892128,
-0.01592174917459488,
0.0036911247298121452,
0.018710196018218994,
0.08213035762310028,
0.017898278310894966,
0.29601529240608215,
-0.024196594953536987,
0.0563105084002018,
-0.048651956021785736,
-0.03757691755890846,
-0.06007165461778641,
0.15379349887371063,
-0.03059040755033493,
-0.05720295011997223,
-0.15403488278388977,
0.1131877526640892,
0.015786608681082726,
-0.16920490562915802,
0.016752559691667557,
-0.1196541115641594,
-0.12637196481227875,
0.003914726432412863,
0.07255343347787857,
0.0663522481918335,
0.07373180240392685,
0.00924641266465187,
0.020377298817038536,
0.12215627729892731,
-0.005962285213172436,
-0.05576585978269577,
-0.0752912312746048,
0.07434543967247009,
-0.022125734016299248,
0.15938138961791992,
0.0130148371681571,
0.09499458968639374,
0.08543536067008972,
-0.0034578428603708744,
-0.07759035378694534,
0.05629700794816017,
0.024904629215598106,
-0.12583112716674805,
-0.027850806713104248,
0.1468798667192459,
-0.06638328731060028,
0.10753524303436279,
0.0835941806435585,
-0.0027628098614513874,
0.04737721383571625,
0.01640484295785427,
-0.08775404095649719,
-0.09460826963186264,
0.10618934035301208,
-0.09135408699512482,
0.12074866145849228,
0.19339148700237274,
-0.019011689350008965,
0.029996225610375404,
-0.05946729704737663,
0.04802798479795456,
0.019551169127225876,
0.059949494898319244,
0.03425826132297516,
-0.16742168366909027,
0.04830526188015938,
-0.00856985803693533,
0.021391289308667183,
-0.18082726001739502,
-0.032245028764009476,
-0.028732474893331528,
0.010821371339261532,
-0.059944998472929,
0.07260410487651825,
0.029567047953605652,
-0.016125725582242012,
-0.014441885985434055,
-0.10771819204092026,
0.021881552413105965,
0.015864858403801918,
-0.038738228380680084,
-0.07284032553434372
] |
null | null |
transformers
|
## Overview
**Language model:** gbert-base-germandpr-reranking
**Language:** German
**Training data:** GermanDPR train set (~ 56MB)
**Eval data:** GermanDPR test set (~ 6MB)
**Infrastructure**: 1x V100 GPU
**Published**: June 3rd, 2021
## Details
- We trained a text pair classification model in FARM that can be used for reranking in document retrieval tasks. To this end, the classifier calculates the similarity between the query and each of the top-k retrieved documents (e.g., k=10). The top-k documents are then re-sorted by their similarity scores, so the document most similar to the query is ranked first.
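For illustration, the same idea as a minimal sketch in plain Transformers (not the exact FARM inference code; in particular, treating logit index 1 as the "relevant" class is an assumption):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "deepset/gbert-base-germandpr-reranking"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

def rerank(query, passages):
    """Score each (query, passage) pair and re-sort passages by that score."""
    inputs = tokenizer(
        [query] * len(passages), passages,
        padding=True, truncation=True, max_length=512, return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumption: index 1 is the "relevant" class of the text pair classifier.
    scores = torch.softmax(logits, dim=-1)[:, 1]
    return [p for p, _ in sorted(zip(passages, scores.tolist()),
                                 key=lambda x: x[1], reverse=True)]
```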
## Hyperparameters
```
batch_size = 16
n_epochs = 2
max_seq_len = 512  # tokens, for question and passage concatenated
learning_rate = 2e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
```
## Performance
We use the GermanDPR test dataset as ground truth labels and run two experiments to compare how a BM25 retriever performs with and without reranking by our model. The first experiment runs retrieval on the full German Wikipedia (more than 2 million passages); the second runs retrieval on the GermanDPR dataset only (not more than 5000 passages). Both experiments use 1025 queries. Note that the second experiment evaluates a much simpler task because of the smaller dataset size, which explains the strong BM25 retrieval performance.
### Full German Wikipedia (more than 2 million passages):
BM25 Retriever without Reranking
- recall@3: 0.4088 (419 / 1025)
- mean_reciprocal_rank@3: 0.3322
BM25 Retriever with Reranking Top 10 Documents
- recall@3: 0.5200 (533 / 1025)
- mean_reciprocal_rank@3: 0.4800
### GermanDPR Test Dataset only (not more than 5000 passages):
BM25 Retriever without Reranking
- recall@3: 0.9102 (933 / 1025)
- mean_reciprocal_rank@3: 0.8528
BM25 Retriever with Reranking Top 10 Documents
- recall@3: 0.9298 (953 / 1025)
- mean_reciprocal_rank@3: 0.8813
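For reference, the two reported metrics can be computed with a small sketch like the following, assuming one gold passage per query and a ranked list of retrieved passage ids:
```python
def recall_at_k(gold_ids, ranked_ids_per_query, k=3):
    """Fraction of queries whose gold passage appears in the top k results."""
    hits = sum(gold in ranked[:k]
               for gold, ranked in zip(gold_ids, ranked_ids_per_query))
    return hits / len(gold_ids)

def mrr_at_k(gold_ids, ranked_ids_per_query, k=3):
    """Mean reciprocal rank, counting only matches within the top k."""
    total = 0.0
    for gold, ranked in zip(gold_ids, ranked_ids_per_query):
        for rank, pid in enumerate(ranked[:k], start=1):
            if pid == gold:
                total += 1.0 / rank
                break
    return total / len(gold_ids)
```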
## Usage
### In haystack
You can load the model in [haystack](https://github.com/deepset-ai/haystack/) for reranking the documents returned by a Retriever:
```python
...
retriever = ElasticsearchRetriever(document_store=document_store)
ranker = FARMRanker(model_name_or_path="deepset/gbert-base-germandpr-reranking")
...
p = Pipeline()
p.add_node(component=retriever, name="ESRetriever", inputs=["Query"])
p.add_node(component=ranker, name="Ranker", inputs=["ESRetriever"])
```
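Depending on your Haystack release, running the pipeline then looks roughly like this (parameter and attribute names are assumptions; check the documentation of your version):
```python
results = p.run(
    query="Wie hoch ist der höchste Berg Deutschlands?",
    params={"ESRetriever": {"top_k": 10}, "Ranker": {"top_k": 3}},
)
for doc in results["documents"]:
    print(doc.score, doc.content[:80])  # older releases expose doc.text instead
```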
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["deepset/germandpr"]}
|
text-classification
|
deepset/gbert-base-germandpr-reranking
|
[
"transformers",
"pytorch",
"safetensors",
"bert",
"text-classification",
"de",
"dataset:deepset/germandpr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #safetensors #bert #text-classification #de #dataset-deepset/germandpr #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
## Overview
Language model: gbert-base-germandpr-reranking
Language: German
Training data: GermanDPR train set (~ 56MB)
Eval data: GermanDPR test set (~ 6MB)
Infrastructure: 1x V100 GPU
Published: June 3rd, 2021
## Details
- We trained a text pair classification model in FARM that can be used for reranking in document retrieval tasks. To this end, the classifier calculates the similarity between the query and each of the top-k retrieved documents (e.g., k=10). The top-k documents are then re-sorted by their similarity scores, so the document most similar to the query is ranked first.
## Hyperparameters
## Performance
We use the GermanDPR test dataset as ground truth labels and run two experiments to compare how a BM25 retriever performs with and without reranking by our model. The first experiment runs retrieval on the full German Wikipedia (more than 2 million passages); the second runs retrieval on the GermanDPR dataset only (not more than 5000 passages). Both experiments use 1025 queries. Note that the second experiment evaluates a much simpler task because of the smaller dataset size, which explains the strong BM25 retrieval performance.
### Full German Wikipedia (more than 2 million passages):
BM25 Retriever without Reranking
- recall@3: 0.4088 (419 / 1025)
- mean_reciprocal_rank@3: 0.3322
BM25 Retriever with Reranking Top 10 Documents
- recall@3: 0.5200 (533 / 1025)
- mean_reciprocal_rank@3: 0.4800
### GermanDPR Test Dataset only (not more than 5000 passages):
BM25 Retriever without Reranking
- recall@3: 0.9102 (933 / 1025)
- mean_reciprocal_rank@3: 0.8528
BM25 Retriever with Reranking Top 10 Documents
- recall@3: 0.9298 (953 / 1025)
- mean_reciprocal_rank@3: 0.8813
## Usage
### In haystack
You can load the model in haystack for reranking the documents returned by a Retriever:
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gbert-base-germandpr-reranking \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 1x V100 GPU \nPublished: June 3rd, 2021",
"## Details\n- We trained a text pair classification model in FARM, which can be used for reranking in document retrieval tasks. To this end, the classifier calculates the similarity of the query and each retrieved top k document (e.g., k=10). The top k documents are then sorted by their similarity scores. The document most similar to the query is the best.",
"## Hyperparameters",
"## Performance\nWe use the GermanDPR test dataset as ground truth labels and run two experiments to compare how a BM25 retriever performs with or without reranking with our model. The first experiment runs retrieval on the full German Wikipedia (more than 2 million passages) and second experiment runs retrieval on the GermanDPR dataset only (not more than 5000 passages). Both experiments use 1025 queries. Note that the second experiment is evaluating on a much simpler task because of the smaller dataset size, which explains strong BM25 retrieval performance.",
"### Full German Wikipedia (more than 2 million passages):\nBM25 Retriever without Reranking\n- recall@3: 0.4088 (419 / 1025)\n- mean_reciprocal_rank@3: 0.3322\n\nBM25 Retriever with Reranking Top 10 Documents\n- recall@3: 0.5200 (533 / 1025)\n- mean_reciprocal_rank@3: 0.4800",
"### GermanDPR Test Dataset only (not more than 5000 passages):\nBM25 Retriever without Reranking\n- recall@3: 0.9102 (933 / 1025)\n- mean_reciprocal_rank@3: 0.8528\n\nBM25 Retriever with Reranking Top 10 Documents\n- recall@3: 0.9298 (953 / 1025)\n- mean_reciprocal_rank@3: 0.8813",
"## Usage",
"### In haystack\nYou can load the model in haystack for reranking the documents returned by a Retriever:",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #bert #text-classification #de #dataset-deepset/germandpr #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"## Overview\nLanguage model: gbert-base-germandpr-reranking \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 1x V100 GPU \nPublished: June 3rd, 2021",
"## Details\n- We trained a text pair classification model in FARM, which can be used for reranking in document retrieval tasks. To this end, the classifier calculates the similarity of the query and each retrieved top k document (e.g., k=10). The top k documents are then sorted by their similarity scores. The document most similar to the query is the best.",
"## Hyperparameters",
"## Performance\nWe use the GermanDPR test dataset as ground truth labels and run two experiments to compare how a BM25 retriever performs with or without reranking with our model. The first experiment runs retrieval on the full German Wikipedia (more than 2 million passages) and second experiment runs retrieval on the GermanDPR dataset only (not more than 5000 passages). Both experiments use 1025 queries. Note that the second experiment is evaluating on a much simpler task because of the smaller dataset size, which explains strong BM25 retrieval performance.",
"### Full German Wikipedia (more than 2 million passages):\nBM25 Retriever without Reranking\n- recall@3: 0.4088 (419 / 1025)\n- mean_reciprocal_rank@3: 0.3322\n\nBM25 Retriever with Reranking Top 10 Documents\n- recall@3: 0.5200 (533 / 1025)\n- mean_reciprocal_rank@3: 0.4800",
"### GermanDPR Test Dataset only (not more than 5000 passages):\nBM25 Retriever without Reranking\n- recall@3: 0.9102 (933 / 1025)\n- mean_reciprocal_rank@3: 0.8528\n\nBM25 Retriever with Reranking Top 10 Documents\n- recall@3: 0.9298 (953 / 1025)\n- mean_reciprocal_rank@3: 0.8813",
"## Usage",
"### In haystack\nYou can load the model in haystack for reranking the documents returned by a Retriever:",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
58,
63,
93,
5,
127,
89,
95,
3,
27,
118
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #bert #text-classification #de #dataset-deepset/germandpr #license-mit #autotrain_compatible #endpoints_compatible #region-us \n## Overview\nLanguage model: gbert-base-germandpr-reranking \nLanguage: German \nTraining data: GermanDPR train set (~ 56MB) \nEval data: GermanDPR test set (~ 6MB) \nInfrastructure: 1x V100 GPU \nPublished: June 3rd, 2021## Details\n- We trained a text pair classification model in FARM, which can be used for reranking in document retrieval tasks. To this end, the classifier calculates the similarity of the query and each retrieved top k document (e.g., k=10). The top k documents are then sorted by their similarity scores. The document most similar to the query is the best.## Hyperparameters## Performance\nWe use the GermanDPR test dataset as ground truth labels and run two experiments to compare how a BM25 retriever performs with or without reranking with our model. The first experiment runs retrieval on the full German Wikipedia (more than 2 million passages) and second experiment runs retrieval on the GermanDPR dataset only (not more than 5000 passages). Both experiments use 1025 queries. Note that the second experiment is evaluating on a much simpler task because of the smaller dataset size, which explains strong BM25 retrieval performance.### Full German Wikipedia (more than 2 million passages):\nBM25 Retriever without Reranking\n- recall@3: 0.4088 (419 / 1025)\n- mean_reciprocal_rank@3: 0.3322\n\nBM25 Retriever with Reranking Top 10 Documents\n- recall@3: 0.5200 (533 / 1025)\n- mean_reciprocal_rank@3: 0.4800"
] |
[
-0.053545139729976654,
0.08669358491897583,
-0.005339981988072395,
0.08563859015703201,
0.025446653366088867,
0.031404923647642136,
0.05595878139138222,
0.08389831334352493,
0.04609540104866028,
0.06213710084557533,
-0.047043848782777786,
-0.07770812511444092,
0.04967157915234566,
0.09737786650657654,
0.02449834905564785,
-0.1849087029695511,
0.03505464643239975,
-0.07977914065122604,
0.08814160525798798,
0.07295488566160202,
0.11736957728862762,
-0.10331736505031586,
0.03444405272603035,
-0.03551000356674194,
-0.07217416167259216,
0.11400685459375381,
0.02856799028813839,
0.015444988384842873,
0.10393974184989929,
-0.02366265282034874,
0.09130269289016724,
-0.029631948098540306,
0.07566627860069275,
-0.05725213885307312,
-0.005897794850170612,
0.028178442269563675,
-0.005116811953485012,
0.027796810492873192,
0.07340657711029053,
-0.07651495188474655,
0.06870386749505997,
-0.06831425428390503,
-0.0002441184624331072,
0.024155689403414726,
-0.07429414987564087,
-0.14194396138191223,
-0.14701688289642334,
0.13150449097156525,
0.00492568826302886,
0.015411978587508202,
-0.04028288275003433,
0.005163636989891529,
-0.11386402696371078,
0.03975341096520424,
0.20094192028045654,
-0.27439120411872864,
-0.029809657484292984,
0.06637772172689438,
0.0033561105374246836,
0.046019699424505234,
-0.07707949727773666,
0.0020009547006338835,
0.011966988444328308,
0.023310910910367966,
-0.038054488599300385,
-0.031327228993177414,
0.016537785530090332,
-0.010639063082635403,
-0.10316114127635956,
-0.035861462354660034,
0.09131815284490585,
0.005105266347527504,
-0.08873613178730011,
-0.07433047145605087,
-0.09803954511880875,
-0.01894710771739483,
0.023696834221482277,
-0.027807297185063362,
0.029767950996756554,
-0.027331341058015823,
0.0009630175190977752,
0.01968756690621376,
-0.08172894269227982,
-0.03231336548924446,
-0.11889480799436569,
0.19243058562278748,
0.04023858532309532,
0.036655720323324203,
0.04977116733789444,
0.14498133957386017,
-0.07775329798460007,
-0.11830472201108932,
-0.1113320142030716,
-0.05718359351158142,
-0.14417703449726105,
-0.02753721922636032,
-0.04832536727190018,
-0.12591493129730225,
0.0062561784870922565,
0.2201993465423584,
-0.027187101542949677,
0.042898572981357574,
-0.0454103946685791,
0.0079317856580019,
0.0908454954624176,
0.2186708003282547,
-0.053067896515131,
-0.08930648863315582,
0.02913915552198887,
-0.08200830221176147,
0.0433780662715435,
0.06271293759346008,
0.032202597707509995,
-0.05925881862640381,
0.1371913105249405,
0.06428361684083939,
-0.08844106644392014,
-0.005369895603507757,
-0.04636102914810181,
-0.01661747507750988,
-0.012262318283319473,
-0.18621030449867249,
0.0066364542581140995,
0.025775860995054245,
-0.08873288333415985,
0.08152788132429123,
0.011831008829176426,
-0.07588326930999756,
-0.06087404862046242,
0.10802572220563889,
-0.007134874351322651,
-0.024789046496152878,
-0.1124887764453888,
-0.12651780247688293,
0.025194354355335236,
-0.03214791417121887,
-0.04335577413439751,
-0.12369336932897568,
-0.16334468126296997,
-0.05445895716547966,
0.07429523766040802,
-0.07255556434392929,
0.06505434215068817,
0.01577366702258587,
0.0044667781330645084,
-0.010078882798552513,
-0.00449754111468792,
-0.024708891287446022,
-0.03280308470129967,
0.018654173240065575,
-0.0687454417347908,
0.06066736951470375,
0.04585333541035652,
-0.0016382826725021005,
-0.12512965500354767,
-0.016061194241046906,
-0.30979597568511963,
0.08475954830646515,
-0.12099138647317886,
-0.05435195192694664,
-0.0928976982831955,
-0.04241938889026642,
-0.010949390940368176,
-0.03734889626502991,
0.018586518242955208,
0.08802492916584015,
-0.20394474267959595,
-0.03290484473109245,
0.16771142184734344,
-0.11295823752880096,
0.028021296486258507,
0.1345217376947403,
-0.07033500075340271,
0.00676779355853796,
0.11143874377012253,
0.10744088888168335,
0.13880272209644318,
-0.1315445899963379,
-0.13487236201763153,
-0.002378201810643077,
-0.06471805274486542,
0.15607225894927979,
0.06662771105766296,
0.03982006013393402,
0.036301493644714355,
0.0330214649438858,
-0.03029315359890461,
-0.010094030760228634,
-0.016373978927731514,
-0.03130963072180748,
-0.022363005205988884,
-0.018863150849938393,
0.1209016665816307,
-0.015437992289662361,
0.006437376607209444,
-0.06666935980319977,
-0.08563855290412903,
0.03988754749298096,
0.09781701862812042,
-0.05005490034818649,
0.003104080678895116,
-0.036037784069776535,
-0.02466214820742607,
-0.036497052758932114,
0.012356105260550976,
-0.19718506932258606,
-0.05261006951332092,
0.09202728420495987,
-0.13330039381980896,
0.055370330810546875,
0.14863690733909607,
0.107199527323246,
0.034447602927684784,
-0.11204338818788528,
-0.026714323088526726,
-0.11952827125787735,
-0.03332044184207916,
-0.0004693883238360286,
-0.11511789262294769,
-0.02940983511507511,
-0.05260959267616272,
0.022475095465779305,
0.047229308634996414,
-0.0787564218044281,
0.04446128383278847,
0.19356437027454376,
0.024746866896748543,
-0.04727505147457123,
-0.07725074142217636,
-0.04404866695404053,
-0.014998753555119038,
-0.0219864621758461,
-0.048986803740262985,
-0.023544929921627045,
0.06789715588092804,
0.04353388026356697,
0.02604817971587181,
-0.039768118411302567,
0.036780670285224915,
0.15586774051189423,
-0.11223709583282471,
0.10557379573583603,
-0.09547851979732513,
0.013760782778263092,
-0.16175147891044617,
-0.14105850458145142,
0.2130400687456131,
0.03550379350781441,
0.043980710208415985,
-0.06714554131031036,
-0.11156188696622849,
-0.03825061023235321,
0.026322796940803528,
-0.03192991390824318,
0.13121744990348816,
-0.05691562965512276,
-0.21752844750881195,
0.04635830223560333,
0.0012063417816534638,
-0.04195162653923035,
0.16837261617183685,
-0.06918606162071228,
-0.10635682940483093,
-0.039039041846990585,
0.03193391487002373,
-0.015038693323731422,
0.159277081489563,
0.10266876965761185,
0.03432461991906166,
0.051803674548864365,
0.048340726643800735,
0.026427632197737694,
-0.037720125168561935,
0.08386939018964767,
-0.027042580768465996,
-0.052458539605140686,
0.01254005916416645,
0.0065746307373046875,
0.07006184011697769,
0.12741103768348694,
-0.008886015973985195,
-0.010002010501921177,
-0.04565367475152016,
-0.05339616909623146,
-0.060119930654764175,
0.16329032182693481,
-0.05618883669376373,
-0.17876875400543213,
-0.12305112928152084,
0.05031118541955948,
-0.11981114745140076,
0.017675835639238358,
0.012404167093336582,
-0.034312278032302856,
-0.11608971655368805,
-0.04094703868031502,
0.13289391994476318,
0.0796990767121315,
-0.02737308479845524,
-0.06047692894935608,
0.007083606440573931,
0.05459503084421158,
-0.19106963276863098,
-0.018009405583143234,
-0.05017292872071266,
-0.09286385774612427,
0.006002844776958227,
0.05772079899907112,
0.0445157065987587,
0.044001732021570206,
-0.07185611873865128,
-0.007760823238641024,
0.014446420595049858,
0.16743096709251404,
-0.06354983896017075,
0.09513869136571884,
0.04472190514206886,
-0.05251915007829666,
0.024128548800945282,
0.094500832259655,
0.020822420716285706,
-0.03222598135471344,
0.04165268689393997,
0.05433225631713867,
-0.026473799720406532,
-0.23096966743469238,
-0.06126362457871437,
-0.027659926563501358,
-0.02280581369996071,
0.058161087334156036,
0.05168743431568146,
-0.12340863794088364,
-0.010379769839346409,
-0.06391274929046631,
0.009778997860848904,
0.06272050738334656,
-0.0031765324529260397,
0.028711948543787003,
0.008187878876924515,
0.060760315507650375,
-0.08195550739765167,
-0.0847804993391037,
0.09618383646011353,
-0.007048369850963354,
0.21549053490161896,
-0.05722767859697342,
-0.06630050390958786,
0.054930511862039566,
-0.01664677821099758,
0.014781523495912552,
0.21698063611984253,
-0.009510045871138573,
0.017175104469060898,
-0.019984420388936996,
-0.06616923958063126,
-0.0032264527399092913,
0.037456341087818146,
0.006533042993396521,
0.0588042251765728,
-0.005979514680802822,
-0.05239702761173248,
0.12269405275583267,
0.2130669504404068,
0.11712925136089325,
-0.04159065708518028,
-0.09017970412969589,
0.0350172258913517,
-0.11372259259223938,
-0.03330773115158081,
0.017126474529504776,
0.07879671454429626,
-0.12617388367652893,
0.03686841204762459,
-0.023705540224909782,
0.06224808841943741,
-0.12076225876808167,
0.0003116141015198082,
-0.03954968601465225,
0.08374150842428207,
-0.0014927109004929662,
0.07291806489229202,
-0.15668977797031403,
0.13558878004550934,
0.0027908883057534695,
0.11642064899206161,
-0.06756675243377686,
0.037819743156433105,
0.036427512764930725,
-0.08369266986846924,
0.1342954784631729,
0.028484245762228966,
-0.10514595359563828,
-0.021282339468598366,
-0.10285183042287827,
0.03188429772853851,
0.09335329383611679,
-0.08976901322603226,
0.09962274879217148,
-0.022008830681443214,
-0.02274138294160366,
-0.04644547030329704,
0.07056699693202972,
-0.15226349234580994,
-0.039056167006492615,
0.03960612043738365,
-0.14432910084724426,
-0.027233440428972244,
-0.02642221562564373,
-0.043988216668367386,
-0.10732928663492203,
0.17440839111804962,
-0.13757319748401642,
-0.05927643924951553,
-0.10695395618677139,
0.016660287976264954,
0.14831925928592682,
-0.0980900377035141,
0.02274971641600132,
0.036938492208719254,
0.10567241162061691,
-0.021502748131752014,
-0.1984199583530426,
0.0026675467379391193,
-0.05798953026533127,
-0.10822303593158722,
-0.009231640957295895,
0.10566740483045578,
0.08599019795656204,
0.07521732896566391,
0.05541140213608742,
0.06834764778614044,
0.028403889387845993,
-0.12632353603839874,
0.042771827429533005,
0.06469379365444183,
-0.004412731155753136,
0.14210301637649536,
-0.1370587795972824,
-0.06926512718200684,
0.006640995386987925,
0.07833396643400192,
0.09196546673774719,
0.14585895836353302,
-0.10090792924165726,
0.10629843920469284,
0.1012999415397644,
-0.05542639270424843,
-0.3040061593055725,
-0.00042167649371549487,
0.0815378800034523,
0.06869402527809143,
-0.00826676283031702,
-0.13082121312618256,
0.16977928578853607,
0.14604505896568298,
-0.032890480011701584,
-0.09010642021894455,
-0.17365382611751556,
-0.13839861750602722,
0.12096019089221954,
-0.0025345482863485813,
0.11612038314342499,
-0.005117018707096577,
-0.015339657664299011,
-0.07356449216604233,
-0.1610216349363327,
0.009839408099651337,
-0.13228625059127808,
0.11065345257520676,
0.010155349969863892,
0.005808362737298012,
0.03568083047866821,
-0.048238400369882584,
0.08246546238660812,
0.1033819317817688,
0.09133207052946091,
-0.01976480893790722,
0.04327315464615822,
0.047901369631290436,
-0.022658821195364,
0.19291183352470398,
0.05225984379649162,
0.08072949200868607,
-0.05347489193081856,
-0.0045335860922932625,
-0.050849877297878265,
0.08906521648168564,
-0.019511662423610687,
-0.05683788284659386,
-0.1090002954006195,
0.11320588737726212,
0.052131522446870804,
-0.015457382425665855,
0.012266230769455433,
0.03072371520102024,
-0.01473400741815567,
0.08642025291919708,
0.07150989770889282,
0.06253079324960709,
-0.11737900972366333,
-0.008266604505479336,
-0.022734254598617554,
0.027911746874451637,
0.0042035579681396484,
0.12534071505069733,
0.16858159005641937,
0.02968079410493374,
0.13175885379314423,
0.00823567807674408,
-0.13191305100917816,
-0.00007454635488102213,
0.09932149946689606,
-0.1673438996076584,
-0.22372066974639893,
-0.015365606173872948,
-0.1545867919921875,
-0.05608195438981056,
0.05386887118220329,
0.20349273085594177,
-0.049764491617679596,
-0.007792482618242502,
0.00458238460123539,
0.04283576458692551,
-0.025760728865861893,
0.20390376448631287,
0.007797550410032272,
0.038774434477090836,
-0.09796176850795746,
0.18937629461288452,
0.07751855254173279,
-0.015835968777537346,
0.0033426322042942047,
0.022362546995282173,
-0.0487285852432251,
0.00013719097478315234,
-0.004930565599352121,
0.11566616594791412,
-0.03444213792681694,
0.020314233377575874,
-0.11113517731428146,
-0.11828106641769409,
0.012279579415917397,
0.09565487504005432,
0.06465910375118256,
0.08479616791009903,
-0.03277207165956497,
0.025308050215244293,
-0.015191502869129181,
0.1219002828001976,
0.1530531346797943,
0.05602142587304115,
-0.030195876955986023,
0.09381494671106339,
-0.09891033917665482,
0.004244453739374876,
-0.06946033984422684,
0.02611680142581463,
-0.11839824914932251,
-0.03397344797849655,
-0.1960468590259552,
-0.02657320350408554,
0.055751264095306396,
-0.0049646878615021706,
-0.019165055826306343,
-0.04867115989327431,
-0.03244679421186447,
0.05747503042221069,
-0.09670697152614594,
-0.003338316222652793,
-0.021673385053873062,
0.026069287210702896,
-0.16692061722278595,
-0.0057428753934800625,
0.08612226694822311,
-0.07752124965190887,
0.13028588891029358,
0.12421765178442001,
0.00787494145333767,
0.09774786978960037,
-0.0821356549859047,
-0.10434632003307343,
0.02912043407559395,
0.0363369882106781,
0.05343089997768402,
-0.10113795846700668,
0.04657446965575218,
0.03635062277317047,
0.0005978179397061467,
0.05798019841313362,
-0.03505780175328255,
-0.08629177510738373,
0.08075675368309021,
-0.038214970380067825,
-0.05077538266777992,
-0.10498204082250595,
0.028112653642892838,
0.06599365919828415,
0.11141408979892731,
0.13459081947803497,
-0.1014336496591568,
0.003390687284991145,
-0.13751614093780518,
0.0034810048528015614,
0.04566584527492523,
-0.1105668917298317,
-0.05055368319153786,
0.01165627222508192,
0.03449889272451401,
-0.015401038341224194,
0.16669613122940063,
0.05859360471367836,
0.005163490306586027,
0.05825742334127426,
0.038591571152210236,
-0.08060254901647568,
0.0630846917629242,
0.027640419080853462,
-0.015828466042876244,
-0.022924110293388367,
-0.09611731022596359,
-0.012230529449880123,
-0.016344353556632996,
-0.09873991459608078,
0.2591569721698761,
0.20094804465770721,
0.05189753696322441,
0.0229959599673748,
0.04291735589504242,
-0.015912633389234543,
-0.057976219803094864,
0.012070830911397934,
-0.023813243955373764,
0.04494081065058708,
-0.011934998445212841,
0.061357371509075165,
0.05594081059098244,
-0.16508258879184723,
0.13887189328670502,
-0.04551039636135101,
-0.021551145240664482,
-0.0852881371974945,
0.006153633818030357,
-0.07772251218557358,
-0.05356467142701149,
0.019459329545497894,
-0.1265648901462555,
0.024466712027788162,
0.005537637509405613,
0.06443119794130325,
-0.05710510164499283,
0.12844042479991913,
-0.1822100579738617,
-0.03841663897037506,
0.13259771466255188,
0.021955113857984543,
-0.005356297828257084,
0.09482358396053314,
0.004300395958125591,
0.00012937694555148482,
0.11319895088672638,
0.08042312413454056,
0.0384662039577961,
0.030636988580226898,
-0.08092427998781204,
-0.043553464114665985,
-0.03561576083302498,
-0.012080642394721508,
-0.03585435450077057,
-0.009215441532433033,
0.041821036487817764,
0.05009967461228371,
0.002868272364139557,
-0.02607072703540325,
0.19954946637153625,
-0.06408893316984177,
-0.12282740324735641,
-0.14786109328269958,
0.2003469616174698,
0.07351025938987732,
0.1108289584517479,
0.06295262277126312,
-0.1558016836643219,
-0.06546477973461151,
0.044532474130392075,
0.09244232624769211,
0.029454514384269714,
0.019652610644698143,
-0.007133197039365768,
0.010691789910197258,
0.048124656081199646,
0.046902041882276535,
-0.0002943290746770799,
0.26420754194259644,
-0.0030735593754798174,
0.13496658205986023,
0.03531855717301369,
-0.020943820476531982,
-0.0077021243050694466,
0.15279005467891693,
-0.04893083497881889,
-0.01859034039080143,
-0.14029109477996826,
0.06299345195293427,
-0.001449137693271041,
-0.24435822665691376,
-0.10166705399751663,
-0.06867514550685883,
-0.1555510312318802,
-0.020725058391690254,
0.073524110019207,
0.024386728182435036,
0.05836394429206848,
0.01619045063853264,
0.017572971060872078,
0.1353931874036789,
0.016579875722527504,
-0.03871206194162369,
-0.10473530739545822,
0.08485770970582962,
0.00935457181185484,
0.21672473847866058,
0.036818843334913254,
0.03128610551357269,
0.08668147772550583,
-0.06040330231189728,
-0.10649365931749344,
0.023474786430597305,
0.010941418819129467,
-0.23726172745227814,
0.019302688539028168,
0.21147535741329193,
-0.014826030470430851,
0.03811274468898773,
0.03699609637260437,
-0.006262755487114191,
0.005872556008398533,
0.0902668759226799,
0.01758033037185669,
-0.13752123713493347,
0.12411078810691833,
-0.06882951408624649,
0.08249858766794205,
0.23011845350265503,
0.010174588300287724,
0.08901460468769073,
-0.09678653627634048,
0.07988519221544266,
0.05086246505379677,
0.044878482818603516,
0.024093283340334892,
-0.14799202978610992,
0.01241061370819807,
-0.0029092493932694197,
-0.026456227526068687,
-0.1586587280035019,
-0.1081295907497406,
0.028112689033150673,
0.01281447522342205,
-0.03888540342450142,
0.11448674649000168,
0.024593887850642204,
0.04279227927327156,
0.024001315236091614,
-0.18143585324287415,
-0.00451451912522316,
0.003944863099604845,
-0.07694262266159058,
-0.05254555121064186
] |
null | null |
transformers
|
# German BERT base
Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** BERT base
**Language:** German
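A minimal usage sketch with the Transformers fill-mask pipeline (the example sentence is illustrative):
```python
from transformers import pipeline

# gbert-base is a masked language model, so it can be queried via fill-mask.
fill_mask = pipeline("fill-mask", model="deepset/gbert-base")
for pred in fill_mask("Die Hauptstadt von Deutschland ist [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```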
## Performance
```
GermEval18 Coarse: 78.17
GermEval18 Fine: 50.90
GermEval14: 87.98
```
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData"]}
|
fill-mask
|
deepset/gbert-base
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# German BERT base
Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
Paper: here
Architecture: BERT base
Language: German
## Performance
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: 'URL [at] URL'
Stefan Schweter: 'stefan [at] URL'
Timo Möller: 'timo.moeller [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German BERT base\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.",
"## Overview \nPaper: here \nArchitecture: BERT base \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# German BERT base\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.",
"## Overview \nPaper: here \nArchitecture: BERT base \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
80,
99,
15,
60,
40,
129
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# German BERT base\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.## Overview \nPaper: here \nArchitecture: BERT base \nLanguage: German## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.021730495616793633,
0.12227428704500198,
-0.003641229821369052,
0.04392511397600174,
0.03829647973179817,
-0.01627953164279461,
0.09598879516124725,
0.09882965683937073,
0.05981605499982834,
0.09190554171800613,
-0.0012881456641480327,
0.00749401468783617,
0.08051788061857224,
0.1449557989835739,
0.030373647809028625,
-0.2296418696641922,
0.02815416269004345,
-0.1177922934293747,
-0.06297411024570465,
0.07684134691953659,
0.14988811314105988,
-0.09358829259872437,
0.09263899177312851,
0.004656004719436169,
-0.000717829680070281,
0.028245408087968826,
-0.06324293464422226,
-0.020233256742358208,
0.0495930090546608,
0.026642650365829468,
0.04525839164853096,
-0.044263724237680435,
-0.005944399628788233,
-0.15574006736278534,
0.022457459941506386,
0.04803037643432617,
0.03205563873052597,
0.048067327588796616,
0.056143004447221756,
-0.029715005308389664,
-0.028820617124438286,
-0.14308440685272217,
0.00586595106869936,
0.07218316197395325,
-0.10309343785047531,
-0.14736583828926086,
-0.08878816664218903,
0.13067953288555145,
0.04319285973906517,
0.015910912305116653,
-0.039658866822719574,
0.051587942987680435,
-0.07381032407283783,
0.00672679441049695,
0.09614329040050507,
-0.2655613124370575,
-0.09050382673740387,
-0.007120739668607712,
-0.004091614857316017,
-0.00881607923656702,
-0.1502775102853775,
0.06745252013206482,
-0.0033284672535955906,
0.022560087963938713,
0.01905481517314911,
-0.021759796887636185,
0.07528914511203766,
0.0026152776554226875,
-0.09200921654701233,
0.030150411650538445,
0.11245699971914291,
0.006372545380145311,
-0.05174059793353081,
-0.23109404742717743,
0.0000999101102934219,
0.16858746111392975,
-0.02379002794623375,
-0.04732983559370041,
0.023249447345733643,
-0.017854789271950722,
-0.0042166681960225105,
-0.09091775119304657,
-0.0936804860830307,
0.05245482176542282,
0.004000706598162651,
0.18305233120918274,
0.01937655359506607,
-0.005691794212907553,
0.05914195999503136,
0.06698907911777496,
-0.04030679538846016,
-0.16457271575927734,
-0.027181196957826614,
-0.12832413613796234,
-0.04323675110936165,
-0.012047525495290756,
-0.026559561491012573,
-0.07232172787189484,
0.12141741812229156,
0.21749670803546906,
0.026778118684887886,
0.02642395719885826,
-0.011396350339055061,
-0.0021939268335700035,
0.0861879512667656,
0.17348498106002808,
-0.05435286834836006,
-0.19907207787036896,
-0.001040568226017058,
-0.044795989990234375,
0.028613435104489326,
-0.011552823707461357,
-0.05286850780248642,
-0.027908191084861755,
-0.027664789929986,
0.00009771638724487275,
0.06084425002336502,
0.06499235332012177,
-0.08692963421344757,
-0.1002892330288887,
0.08022702485322952,
-0.1371159851551056,
0.0554957389831543,
0.03613469749689102,
-0.05664694309234619,
0.05737316235899925,
-0.06494295597076416,
0.01744043454527855,
-0.041580680757761,
0.06189563870429993,
-0.01699812337756157,
-0.03658158332109451,
-0.11398665606975555,
-0.08267820626497269,
0.05149700865149498,
-0.0476226881146431,
-0.03887351602315903,
-0.06096995249390602,
-0.018590310588479042,
-0.05973608046770096,
0.12623797357082367,
-0.04826505482196808,
-0.015765707939863205,
-0.04891305789351463,
-0.011743711307644844,
0.055947378277778625,
0.017769254744052887,
-0.04754406213760376,
-0.004625609610229731,
0.020271580666303635,
-0.10147970914840698,
0.005928710103034973,
-0.031609006226062775,
0.031137581914663315,
-0.06230832636356354,
-0.00711400993168354,
-0.2993262708187103,
0.053237609565258026,
-0.14395299553871155,
0.07048269361257553,
-0.13392901420593262,
-0.005323090124875307,
-0.006189304403960705,
0.043936531990766525,
0.004368975292891264,
0.09976932406425476,
-0.06822247803211212,
-0.062419138848781586,
0.1333637833595276,
-0.0571046844124794,
-0.023234747350215912,
0.12876887619495392,
-0.06411570310592651,
0.027853211387991905,
0.08702102303504944,
0.22636301815509796,
0.1517876386642456,
-0.1201087087392807,
-0.03570457920432091,
-0.03623432293534279,
-0.01945876143872738,
0.09400039166212082,
0.08951475471258163,
-0.09285181760787964,
0.08547361195087433,
0.022837843745946884,
-0.09195691347122192,
0.010163605213165283,
0.006032265722751617,
-0.03974468633532524,
0.06075183302164078,
-0.03021552599966526,
0.13062739372253418,
-0.031233515590429306,
-0.020868120715022087,
-0.08924296498298645,
-0.12510241568088531,
0.0838206559419632,
0.017239797860383987,
-0.004372819792479277,
-0.015816019847989082,
-0.07308724522590637,
0.0204131081700325,
0.07278576493263245,
0.01628001406788826,
-0.056473009288311005,
-0.13947586715221405,
0.07022339850664139,
-0.058224618434906006,
0.15861178934574127,
0.017190681770443916,
0.09292540699243546,
-0.008787273429334164,
-0.029816754162311554,
-0.022403186187148094,
-0.046014729887247086,
-0.023661259561777115,
0.04374850168824196,
-0.1955118179321289,
-0.005036730784922838,
-0.043888889253139496,
0.049171723425388336,
0.0011133860098198056,
-0.022762369364500046,
0.03565868362784386,
0.16210044920444489,
0.04300902038812637,
-0.04068095237016678,
-0.005862157326191664,
-0.0007724941824562848,
0.05788553133606911,
-0.01913394406437874,
0.009936366230249405,
-0.014151510782539845,
-0.04116300120949745,
0.053893353790044785,
-0.009374058805406094,
0.021644188091158867,
0.03921917825937271,
-0.023770179599523544,
-0.06727147102355957,
-0.0630922019481659,
-0.05359441041946411,
-0.022833289578557014,
-0.021525077521800995,
-0.1208239421248436,
0.2087220400571823,
0.012131704948842525,
-0.019898155704140663,
-0.06242287531495094,
-0.09919147193431854,
-0.066660076379776,
-0.03252296894788742,
-0.03132700175046921,
0.08688170462846756,
-0.07466527074575424,
-0.19047628343105316,
0.13085156679153442,
0.13090616464614868,
0.056974031031131744,
0.26331058144569397,
-0.061641860753297806,
-0.039935093373060226,
-0.02287842147052288,
0.05893683061003685,
-0.03858840838074684,
0.10721445083618164,
-0.01746263913810253,
-0.027936633676290512,
0.046008702367544174,
-0.029115114361047745,
0.004145507700741291,
-0.0034223010297864676,
0.05949319526553154,
-0.027420902624726295,
-0.008092047646641731,
0.10539616644382477,
0.0175380390137434,
0.039995186030864716,
0.06838198006153107,
0.1153281033039093,
0.015809059143066406,
0.010490513406693935,
-0.0473107248544693,
-0.029862865805625916,
0.09181264042854309,
-0.14477133750915527,
-0.19560128450393677,
-0.1426362246274948,
-0.07114952057600021,
-0.12119680643081665,
-0.015424411743879318,
-0.008715811185538769,
-0.05680505558848381,
-0.11664135009050369,
-0.005505747627466917,
0.10344673693180084,
0.08891694992780685,
-0.07438813149929047,
-0.013403045013546944,
-0.004610877484083176,
0.03229189291596413,
-0.1345692276954651,
-0.04110757261514664,
0.0017919210949912667,
-0.03549307584762573,
-0.038260724395513535,
0.07775779813528061,
0.03116735629737377,
0.018701177090406418,
0.026366885751485825,
0.02157319150865078,
-0.013786565512418747,
0.1919948309659958,
-0.1387343406677246,
0.10861878842115402,
0.09854370355606079,
-0.08907041698694229,
0.06766265630722046,
0.19587987661361694,
0.08245053142309189,
0.0056116580963134766,
-0.002054807497188449,
0.06316297501325607,
0.03944714367389679,
-0.19206726551055908,
-0.10761409997940063,
-0.05201883241534233,
-0.03024638257920742,
-0.021416477859020233,
0.046809036284685135,
-0.002675108378753066,
-0.009921833872795105,
-0.10143037885427475,
-0.060938794165849686,
0.08847224712371826,
0.06122428923845291,
0.08452115207910538,
-0.0035230088979005814,
0.030068766325712204,
-0.01862488128244877,
-0.0864621251821518,
0.10092147439718246,
0.025789175182580948,
0.08536643534898758,
0.05196032673120499,
0.14161156117916107,
0.05128571018576622,
0.06372597068548203,
0.010644147172570229,
0.003089735982939601,
-0.03657164052128792,
0.013213057070970535,
-0.027084948495030403,
-0.10936141759157181,
0.029116181656718254,
0.08818238228559494,
0.09990626573562622,
-0.0532170832157135,
0.04754309356212616,
-0.06578157097101212,
0.15935175120830536,
0.2022685557603836,
0.03293484449386597,
-0.04791302606463432,
-0.052033767104148865,
0.035069335252046585,
-0.07341885566711426,
-0.06802679598331451,
-0.014693538658320904,
0.03698058798909187,
-0.15645986795425415,
0.07990133762359619,
-0.01581396535038948,
0.07477479428052902,
-0.03344981372356415,
0.03789164870977402,
0.03990877792239189,
0.1366475522518158,
-0.012167563661932945,
0.07401690632104874,
-0.18241916596889496,
0.10396543890237808,
0.030039489269256592,
0.07281754910945892,
-0.05716448649764061,
0.04726412892341614,
0.048320088535547256,
-0.13915985822677612,
0.12297867983579636,
0.00466192839667201,
-0.007947132922708988,
0.02667051926255226,
-0.1377502828836441,
0.008560148999094963,
0.17553934454917908,
-0.13822497427463531,
0.05447578802704811,
-0.02116447500884533,
-0.0327182374894619,
-0.049651581794023514,
0.05156259983778,
-0.11780499666929245,
-0.10200471431016922,
0.024526944383978844,
-0.11325555294752121,
0.019666638225317,
-0.04053197801113129,
0.00362935452722013,
-0.12257671356201172,
0.26932501792907715,
-0.14589504897594452,
-0.07820741832256317,
-0.11530665308237076,
-0.030293740332126617,
0.07556883990764618,
-0.07613785564899445,
0.07687227427959442,
0.005337256006896496,
0.08326500654220581,
-0.017312686890363693,
-0.12087506800889969,
0.07806600630283356,
-0.06231661140918732,
-0.09957792609930038,
-0.007001028396189213,
0.17014151811599731,
0.08344809710979462,
0.020957432687282562,
0.005513105075806379,
0.019906440749764442,
-0.008540808223187923,
-0.10751611739397049,
0.04360714182257652,
0.09691990166902542,
-0.009748177602887154,
0.07465842366218567,
-0.12404434382915497,
-0.1744718998670578,
-0.043502327054739,
0.048120953142642975,
0.09362000226974487,
0.10111384093761444,
-0.02464882656931877,
0.18736937642097473,
0.17096386849880219,
-0.03140544518828392,
-0.27825358510017395,
-0.013116654939949512,
0.061313554644584656,
0.035846129059791565,
0.0048894560895860195,
-0.24202416837215424,
0.14561067521572113,
0.004218199755996466,
-0.057258836925029755,
0.027505578473210335,
-0.10592487454414368,
-0.10686993598937988,
0.09598840773105621,
-0.052879780530929565,
-0.049542199820280075,
-0.049571726471185684,
-0.09158343076705933,
-0.037989914417266846,
-0.053896449506282806,
0.0898791179060936,
-0.08584880828857422,
0.0521932989358902,
0.06517394632101059,
0.037721287459135056,
0.0477621965110302,
-0.02182958275079727,
0.07224904000759125,
-0.024008000269532204,
0.03931521624326706,
-0.0909537523984909,
-0.039052098989486694,
0.020398693159222603,
-0.0384446457028389,
0.08762852847576141,
-0.039405301213264465,
-0.0328311063349247,
-0.0929911658167839,
-0.017871495336294174,
-0.07265282422304153,
0.1283695101737976,
-0.050055742263793945,
-0.08946815133094788,
-0.07109510898590088,
0.15176750719547272,
0.02137642167508602,
0.029199285432696342,
0.07363855093717575,
-0.026468733325600624,
0.0001553164329379797,
0.07440750300884247,
0.17352981865406036,
0.08974123746156693,
-0.04657522216439247,
-0.03881867229938507,
-0.02772228978574276,
0.06452960520982742,
0.004783236421644688,
0.0530368946492672,
0.09426550567150116,
-0.011718525551259518,
0.10263131558895111,
-0.029072070494294167,
-0.13567779958248138,
-0.03881429135799408,
0.10948999226093292,
-0.14225338399410248,
-0.15467636287212372,
-0.08295058459043503,
-0.028454720973968506,
-0.046458836644887924,
0.029131760820746422,
0.16887661814689636,
0.016877660527825356,
-0.04065480828285217,
0.045569196343421936,
0.05908476933836937,
-0.043363265693187714,
0.028559766709804535,
0.055405013263225555,
0.01582885906100273,
-0.0823599100112915,
0.10225913673639297,
0.0646071583032608,
0.029878918081521988,
0.06859938055276871,
0.10423911362886429,
-0.031228870153427124,
-0.011346234008669853,
0.07041846215724945,
0.15389521420001984,
0.016613850370049477,
-0.028051692992448807,
0.009325796738266945,
-0.09415964782238007,
-0.02559797279536724,
0.052152637392282486,
0.043397922068834305,
0.032011978328228,
0.03578978031873703,
0.048460427671670914,
0.07108553498983383,
0.11348523199558258,
0.05721549317240715,
-0.0006082175532355905,
-0.025672292336821556,
-0.010788924060761929,
-0.0702938660979271,
-0.03682590276002884,
-0.03972839564085007,
-0.01828065514564514,
-0.1659264862537384,
-0.04926709458231926,
-0.07478224486112595,
-0.03798028454184532,
-0.03622129559516907,
-0.0186537504196167,
-0.0012669421266764402,
-0.04154296591877937,
0.020849984139204025,
0.0224230345338583,
-0.07085161656141281,
-0.03038663975894451,
0.0015916448319330812,
0.11511123925447464,
-0.19764649868011475,
0.003692804602906108,
0.08836855739355087,
-0.04771999642252922,
0.10665854811668396,
0.025493746623396873,
-0.009350989013910294,
0.035721879452466965,
-0.16156214475631714,
-0.07473693788051605,
-0.04496539384126663,
0.017216825857758522,
0.030359389260411263,
-0.06936904042959213,
-0.031376712024211884,
-0.025747064501047134,
-0.054816968739032745,
0.0016784551553428173,
0.03275338187813759,
-0.08392000198364258,
0.1197558268904686,
0.03392111137509346,
-0.1037202924489975,
-0.0375453419983387,
0.04731252044439316,
0.08870136737823486,
0.022232796996831894,
0.07834906876087189,
-0.07911527901887894,
0.022363608703017235,
-0.06651363521814346,
0.033729150891304016,
0.05273909121751785,
-0.04916803166270256,
-0.10635831952095032,
-0.010957762598991394,
0.041057392954826355,
-0.025819214060902596,
0.08457304537296295,
0.014805346727371216,
-0.1089002937078476,
0.04246991127729416,
-0.02213723212480545,
-0.04533524438738823,
0.08607769012451172,
0.04090892896056175,
-0.03671020269393921,
-0.01074296422302723,
-0.06663000583648682,
-0.05609883740544319,
-0.05247984454035759,
-0.06320898979902267,
0.15492750704288483,
0.2700105905532837,
0.13311561942100525,
0.009747620671987534,
0.16669908165931702,
-0.021329110488295555,
-0.13807877898216248,
0.08596925437450409,
0.05354073643684387,
0.08142270147800446,
-0.09700102359056473,
0.056146495044231415,
0.06834138184785843,
-0.21387256681919098,
0.10481277853250504,
-0.013045885600149632,
-0.016480280086398125,
-0.0304033774882555,
-0.22266380488872528,
-0.07079959660768509,
-0.030869029462337494,
0.02061716467142105,
-0.10524095594882965,
0.04080899432301521,
-0.01953205280005932,
0.07435783743858337,
-0.07435405254364014,
0.1658918708562851,
-0.17283877730369568,
-0.061780884861946106,
0.15452232956886292,
0.026911085471510887,
0.024540916085243225,
0.04481964558362961,
-0.056237682700157166,
-0.08901142328977585,
0.14929869771003723,
0.04202752560377121,
0.047017112374305725,
-0.013325444422662258,
-0.07440829277038574,
-0.04824850708246231,
-0.10128404200077057,
-0.00033115455880761147,
-0.05684232711791992,
-0.025889992713928223,
0.09356468915939331,
0.012434341944754124,
-0.03664172440767288,
-0.021690528839826584,
0.16015292704105377,
-0.03724199905991554,
-0.10092397779226303,
-0.14853638410568237,
0.02813045307993889,
-0.03107152320444584,
0.03919500485062599,
-0.0015849348856136203,
-0.11922407150268555,
-0.04024519398808479,
0.10175653547048569,
0.24593591690063477,
-0.08217739313840866,
0.02742638997733593,
0.012306938879191875,
0.02464589662849903,
0.013619188219308853,
0.11286890506744385,
-0.052853479981422424,
0.26470181345939636,
-0.010645393282175064,
0.03590140491724014,
0.03665415570139885,
-0.06684892624616623,
-0.13830214738845825,
0.08848182111978531,
0.048829130828380585,
-0.05001963675022125,
-0.09455829113721848,
0.13349004089832306,
-0.09952212125062943,
-0.15946276485919952,
-0.08864513784646988,
-0.10253077000379562,
-0.1672501415014267,
-0.06009834259748459,
0.06303351372480392,
0.0644456297159195,
0.07495473325252533,
0.04449935629963875,
-0.03001791052520275,
0.0826278105378151,
0.0227036215364933,
0.026918893679976463,
-0.0022665844298899174,
0.1300950050354004,
-0.07219137251377106,
0.18539707362651825,
0.036494143307209015,
-0.00026928962324745953,
0.09157148003578186,
-0.0352352000772953,
-0.033019233494997025,
-0.07336152344942093,
0.04379948973655701,
-0.17596988379955292,
-0.03371026739478111,
0.11361221969127655,
0.006816274952143431,
0.08927454799413681,
0.07179086655378342,
-0.022472580894827843,
0.029799342155456543,
0.11076059937477112,
-0.04242178797721863,
-0.10305453091859818,
0.13561475276947021,
-0.1406831294298172,
0.13830676674842834,
0.1801130175590515,
-0.0071042682975530624,
-0.01765286922454834,
-0.053491584956645966,
0.04045912250876427,
0.04495983570814133,
0.040892478078603745,
-0.06825906783342361,
-0.15943124890327454,
-0.009438326582312584,
0.023391544818878174,
0.04327443987131119,
-0.08585835248231888,
-0.08693894743919373,
-0.020919576287269592,
0.12093472480773926,
-0.05825795978307724,
0.07522988319396973,
0.11758964508771896,
-0.0025894558057188988,
0.01199264358729124,
-0.039458341896533966,
0.004888373427093029,
0.028647907078266144,
-0.06872793287038803,
0.008008845150470734
] |
null | null |
transformers
|
## Overview
**Language model:** gbert-large-sts
**Language:** German
**Training data:** German STS benchmark train and dev set
**Eval data:** German STS benchmark test set
**Infrastructure**: 1x V100 GPU
**Published**: August 12th, 2021
## Details
- We trained a gbert-large model on the task of estimating semantic similarity of German-language text pairs. The dataset is a machine-translated version of the [STS benchmark](https://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark), which is available [here](https://github.com/t-systems-on-site-services-gmbh/german-STSbenchmark).
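Illustratively, scoring a German sentence pair could look as follows (a sketch: whether the head outputs a single regression logit or class logits depends on the training config; the snippet assumes one regression logit encoding the similarity score):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "deepset/gbert-large-sts"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

inputs = tokenizer(
    "Der Hund spielt im Garten.",
    "Ein Hund tobt draußen im Grünen.",
    return_tensors="pt", truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits
# Assumption: the model was trained as STS regression, i.e. a single logit
# that directly encodes the similarity score of the pair.
print(float(logits.squeeze()))
```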
## Hyperparameters
```
batch_size = 16
n_epochs = 4
warmup_ratio = 0.1
learning_rate = 2e-5
lr_schedule = LinearWarmup
```
## Performance
Stay tuned... and watch out for new papers on arxiv.org ;)
## Authors
- Julian Risch: `julian.risch [at] deepset.ai`
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Gutsch: `julian.gutsch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"]}
|
text-classification
|
deepset/gbert-large-sts
|
[
"transformers",
"pytorch",
"safetensors",
"bert",
"text-classification",
"exbert",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #safetensors #bert #text-classification #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
## Overview
Language model: gbert-large-sts
Language: German
Training data: German STS benchmark train and dev set
Eval data: German STS benchmark test set
Infrastructure: 1x V100 GPU
Published: August 12th, 2021
## Details
- We trained a gbert-large model on the task of estimating semantic similarity of German-language text pairs. The dataset is a machine-translated version of the STS benchmark, which is available here.
## Hyperparameters
## Performance
Stay tuned... and watch out for new papers on URL ;)
## Authors
- Julian Risch: 'URL [at] URL'
- Timo Möller: 'timo.moeller [at] URL'
- Julian Gutsch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gbert-large-sts\n\nLanguage: German \nTraining data: German STS benchmark train and dev set \nEval data: German STS benchmark test set \nInfrastructure: 1x V100 GPU \nPublished: August 12th, 2021",
"## Details\n- We trained a gbert-large model on the task of estimating semantic similarity of German-language text pairs. The dataset is a machine-translated version of the STS benchmark, which is available here.",
"## Hyperparameters",
"## Performance\nStay tuned... and watch out for new papers on URL ;)",
"## Authors\n- Julian Risch: 'URL [at] URL'\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Gutsch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #bert #text-classification #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"## Overview\nLanguage model: gbert-large-sts\n\nLanguage: German \nTraining data: German STS benchmark train and dev set \nEval data: German STS benchmark test set \nInfrastructure: 1x V100 GPU \nPublished: August 12th, 2021",
"## Details\n- We trained a gbert-large model on the task of estimating semantic similarity of German-language text pairs. The dataset is a machine-translated version of the STS benchmark, which is available here.",
"## Hyperparameters",
"## Performance\nStay tuned... and watch out for new papers on URL ;)",
"## Authors\n- Julian Risch: 'URL [at] URL'\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Gutsch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
51,
53,
53,
5,
16,
59,
118
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #bert #text-classification #exbert #de #license-mit #autotrain_compatible #endpoints_compatible #region-us \n## Overview\nLanguage model: gbert-large-sts\n\nLanguage: German \nTraining data: German STS benchmark train and dev set \nEval data: German STS benchmark test set \nInfrastructure: 1x V100 GPU \nPublished: August 12th, 2021## Details\n- We trained a gbert-large model on the task of estimating semantic similarity of German-language text pairs. The dataset is a machine-translated version of the STS benchmark, which is available here.## Hyperparameters## Performance\nStay tuned... and watch out for new papers on URL ;)## Authors\n- Julian Risch: 'URL [at] URL'\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Gutsch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Website \n\nBy the way: we're hiring!"
] |
[
-0.09705337882041931,
0.09625989198684692,
-0.0015123882330954075,
0.057734351605176926,
0.07128582149744034,
0.024909701198339462,
0.1384689062833786,
0.08852546662092209,
0.1352076679468155,
0.054683711379766464,
-0.012547680176794529,
-0.08179748058319092,
0.09760435670614243,
0.21521109342575073,
0.049958568066358566,
-0.20106522738933563,
0.004050216171890497,
-0.09478729218244553,
-0.10579641908407211,
0.08687257021665573,
0.13079774379730225,
-0.09923761337995529,
0.11479352414608002,
-0.023351535201072693,
-0.008847842924296856,
0.045974258333444595,
-0.0223548524081707,
-0.04649348929524422,
0.11970409750938416,
0.031122641637921333,
0.04678812623023987,
-0.010620140470564365,
0.03267316147685051,
-0.16379156708717346,
0.025773262605071068,
0.030428536236286163,
0.03484150022268295,
0.03419393673539162,
0.08692575991153717,
-0.034625813364982605,
-0.005922443233430386,
-0.12752015888690948,
-0.008782632648944855,
0.055019624531269073,
-0.07510968297719955,
-0.12722299993038177,
-0.13536250591278076,
0.058823052793741226,
0.0014532118802890182,
0.038598984479904175,
-0.023404184728860855,
0.07314736396074295,
-0.12372815608978271,
0.031574103981256485,
0.10260143131017685,
-0.2606385052204132,
-0.04397028684616089,
0.009136929176747799,
0.0351736918091774,
0.09512647986412048,
-0.12953834235668182,
0.024500466883182526,
0.03285401687026024,
0.03221392259001732,
0.0027652238495647907,
-0.029963241890072823,
0.044659871608018875,
0.007467753253877163,
-0.070161372423172,
-0.009337082505226135,
0.17214439809322357,
0.005622332450002432,
-0.07266783714294434,
-0.1786397397518158,
0.024192480370402336,
0.14166294038295746,
0.0052976603619754314,
-0.04545433074235916,
-0.0014414405450224876,
-0.017207643017172813,
-0.05128961056470871,
-0.046877846121788025,
-0.0936545878648758,
0.014449316076934338,
-0.006270080339163542,
0.08310612291097641,
0.020797427743673325,
0.029868485406041145,
0.007126218639314175,
0.0916125550866127,
-0.0066564856097102165,
-0.12542220950126648,
-0.0354418009519577,
-0.12811778485774994,
-0.07580365240573883,
-0.0013167660217732191,
-0.013340095989406109,
-0.027207227423787117,
0.11912327259778976,
0.15424971282482147,
-0.08227697014808655,
0.0008157757110893726,
0.031084619462490082,
-0.014928753487765789,
0.04762747883796692,
0.21184034645557404,
-0.041811004281044006,
-0.2706005275249481,
0.002519287634640932,
-0.046713028103113174,
-0.034317754209041595,
0.00115548400208354,
-0.03830918297171593,
-0.020788609981536865,
-0.031179217621684074,
0.04466121643781662,
0.0162653811275959,
0.04591242969036102,
-0.0630863830447197,
-0.07144208252429962,
0.13530142605304718,
-0.12186560779809952,
0.03846467286348343,
0.026687197387218475,
-0.04894139617681503,
0.09418672323226929,
-0.08550478518009186,
-0.007351213600486517,
-0.055234938859939575,
0.0618244931101799,
-0.025883324444293976,
-0.04326057434082031,
-0.07916347682476044,
-0.09870868921279907,
0.05216363072395325,
-0.023666515946388245,
-0.034596413373947144,
-0.08031463623046875,
-0.09897805005311966,
-0.06378798931837082,
0.04719923436641693,
-0.012131782248616219,
0.004687605891376734,
-0.02953711897134781,
-0.011549982242286205,
0.048656873404979706,
-0.009185549803078175,
-0.05346853658556938,
-0.0600413978099823,
0.03371758386492729,
-0.09833274036645889,
0.016857918351888657,
-0.024899577721953392,
0.014560840092599392,
-0.1212795153260231,
-0.028288455680012703,
-0.17049619555473328,
0.09637533128261566,
-0.16600532829761505,
0.0783325657248497,
-0.11660253256559372,
-0.01629958301782608,
0.03168782591819763,
0.038075849413871765,
0.015080064535140991,
0.13580207526683807,
-0.16687148809432983,
-0.08578592538833618,
0.18448609113693237,
-0.0986652597784996,
-0.0917806401848793,
0.15742707252502441,
-0.0497872531414032,
-0.0039022022392600775,
0.12631504237651825,
0.1788693070411682,
0.16742707788944244,
-0.07794170826673508,
-0.07737988233566284,
-0.018479006364941597,
0.00840948149561882,
0.01582690328359604,
0.0905773788690567,
-0.02758798934519291,
0.049629807472229004,
0.04267431050539017,
-0.10386750102043152,
0.00550075015053153,
-0.030566472560167313,
-0.040616147220134735,
0.0262107253074646,
-0.024082712829113007,
0.08796553313732147,
-0.040113747119903564,
-0.012961196713149548,
-0.07243996113538742,
-0.09565871208906174,
0.000018131602701032534,
0.03391816094517708,
0.0000061520509007095825,
-0.01711178570985794,
-0.0383773073554039,
-0.004044880624860525,
0.05626815930008888,
0.02569643408060074,
-0.08283234387636185,
-0.15392827987670898,
0.0802283063530922,
-0.05149597302079201,
0.1181291937828064,
0.04603518918156624,
0.0764860138297081,
-0.00990754459053278,
-0.046071555465459824,
-0.021670697256922722,
-0.08152208477258682,
-0.03835514187812805,
0.011449743993580341,
-0.16493770480155945,
0.00832151435315609,
-0.07232394814491272,
0.002974546980112791,
-0.037109583616256714,
-0.04130842164158821,
0.11712386459112167,
0.10759953409433365,
0.06455383449792862,
-0.02124776318669319,
-0.02583218924701214,
0.036445192992687225,
0.013765837997198105,
-0.023293882608413696,
-0.026606371626257896,
-0.01096607930958271,
-0.017572108656167984,
0.1174381822347641,
0.040254492312669754,
0.074542336165905,
0.06729982048273087,
0.09753678739070892,
-0.05686785653233528,
-0.08056805282831192,
-0.07764918357133865,
0.0028884210623800755,
-0.07916677743196487,
-0.09755905717611313,
0.16518086194992065,
0.021792331710457802,
0.018744470551609993,
-0.06680174916982651,
-0.07925146073102951,
-0.061205171048641205,
0.009221827611327171,
-0.028037823736667633,
0.12163147330284119,
-0.07356128096580505,
-0.06163938716053963,
0.11436426639556885,
0.12118691951036453,
0.041198134422302246,
0.26412343978881836,
-0.0629759281873703,
-0.06828976422548294,
0.001521041733212769,
-0.00728400144726038,
-0.041440848261117935,
0.14815202355384827,
0.0016125335823744535,
-0.0351562462747097,
0.06414080411195755,
0.012531882151961327,
0.003687097690999508,
-0.0477156788110733,
0.028779730200767517,
-0.04158448055386543,
-0.039538782089948654,
0.02159171923995018,
0.006517267320305109,
0.06024201214313507,
0.10719974339008331,
0.05241541191935539,
0.006486873142421246,
-0.026357430964708328,
-0.05697016790509224,
-0.08557574450969696,
0.15779942274093628,
-0.10629202425479889,
-0.20187239348888397,
-0.14008671045303345,
0.03798891603946686,
-0.1024743989109993,
-0.024380363523960114,
0.040169768035411835,
-0.08025594055652618,
-0.10677191615104675,
-0.02495475858449936,
0.07456003129482269,
0.1198301836848259,
-0.027404088526964188,
-0.047249648720026016,
0.0012319451197981834,
0.022738609462976456,
-0.12891149520874023,
-0.028523817658424377,
-0.014825370162725449,
-0.054091572761535645,
-0.003859224496409297,
0.042314477264881134,
0.0703398659825325,
0.013549256138503551,
0.005616230424493551,
0.03181415796279907,
-0.05576471611857414,
0.23989565670490265,
-0.13740189373493195,
0.11417562514543533,
0.043564315885305405,
0.01353888213634491,
0.05256602168083191,
0.14556480944156647,
0.07576438784599304,
-0.006361682899296284,
0.018193315714597702,
0.029129313305020332,
-0.004658055957406759,
-0.27204352617263794,
-0.12890346348285675,
-0.03607283532619476,
-0.009709157980978489,
-0.014289596118032932,
0.055792778730392456,
-0.059847667813301086,
0.013523275963962078,
-0.09895796328783035,
-0.00871502235531807,
0.04573861137032509,
0.016032706946134567,
0.010945039801299572,
0.03694019466638565,
0.03420903533697128,
-0.06568741798400879,
-0.04618043079972267,
0.15275762975215912,
0.09609662741422653,
0.1572733223438263,
0.010244995355606079,
0.10348940640687943,
0.057021308690309525,
0.05454627051949501,
0.00451697176322341,
0.023977410048246384,
-0.0022005445789545774,
0.02556416019797325,
-0.02621261402964592,
-0.07817351073026657,
0.02553543634712696,
0.0900249257683754,
0.055165935307741165,
-0.044158242642879486,
-0.015377802774310112,
-0.05521923676133156,
0.15622472763061523,
0.2312009185552597,
-0.007753583136945963,
-0.11742834001779556,
-0.09287538379430771,
0.026820510625839233,
-0.05364178493618965,
-0.006706038024276495,
0.01587841473519802,
0.09324830770492554,
-0.16125859320163727,
0.07556075602769852,
-0.011386770755052567,
0.07902242988348007,
-0.037933770567178726,
0.012061472982168198,
0.04372410848736763,
0.06940989196300507,
-0.03267587348818779,
0.0928749367594719,
-0.2496338039636612,
0.2025928944349289,
0.01005827821791172,
0.08245891332626343,
-0.05151556804776192,
0.02300962060689926,
0.0673331618309021,
-0.060845647007226944,
0.1286630928516388,
0.02228032611310482,
-0.08100973814725876,
-0.07868070155382156,
-0.10770703107118607,
0.051706742495298386,
0.09460686892271042,
-0.09964380413293839,
0.07501346617937088,
-0.030353017151355743,
-0.00793666485697031,
-0.05566976219415665,
0.0705297663807869,
-0.10465928167104721,
-0.10299571603536606,
0.006921045016497374,
-0.038495879620313644,
0.009987208060920238,
-0.039936501532793045,
-0.057041093707084656,
-0.10557334125041962,
0.17904208600521088,
-0.1537773609161377,
-0.07623214274644852,
-0.1249883845448494,
-0.057525184005498886,
0.06930435448884964,
-0.11198621988296509,
0.00746671250090003,
0.030746053904294968,
0.10041894763708115,
-0.031191613525152206,
-0.11541163176298141,
0.04122619330883026,
-0.09360012412071228,
-0.14137190580368042,
-0.03161109238862991,
0.1418287605047226,
0.07611226290464401,
0.04902785271406174,
0.03611733019351959,
-0.005000067409127951,
-0.023120926693081856,
-0.1439460664987564,
0.01856413669884205,
0.12174534052610397,
-0.0265701524913311,
0.09324506670236588,
-0.08860443532466888,
-0.15057317912578583,
-0.042777422815561295,
0.006040194071829319,
0.07348735630512238,
0.08555153757333755,
-0.06846912205219269,
0.1699250340461731,
0.1699255257844925,
-0.05138678103685379,
-0.28536269068717957,
-0.0014303104253485799,
0.0523156002163887,
0.031106403097510338,
0.0071248384192585945,
-0.12992307543754578,
0.1296757459640503,
0.030394837260246277,
-0.04192034900188446,
-0.040713340044021606,
-0.15455158054828644,
-0.11604053527116776,
0.06339874863624573,
-0.04393293708562851,
-0.023422081023454666,
-0.08537708222866058,
-0.09042557328939438,
-0.0472511351108551,
-0.03694175183773041,
0.10686769336462021,
-0.07563747465610504,
0.042538587003946304,
0.021268285810947418,
0.054665692150592804,
0.004332290962338448,
-0.03932827711105347,
0.09663130342960358,
0.06557802855968475,
0.03777070716023445,
-0.04393458366394043,
0.007560077589005232,
0.06976521760225296,
-0.04411564767360687,
0.10888297855854034,
-0.030963502824306488,
0.021567538380622864,
-0.16715888679027557,
-0.020911293104290962,
-0.0714130848646164,
0.14210928976535797,
-0.013732081279158592,
-0.060884181410074234,
-0.10210397839546204,
0.13001027703285217,
0.04359772801399231,
-0.010969088412821293,
0.004189119208604097,
0.009496806189417839,
0.02565052919089794,
0.07058792561292648,
0.15111464262008667,
-0.019993307068943977,
-0.06340378522872925,
-0.013813486322760582,
-0.023279685527086258,
0.07434019446372986,
-0.0316687636077404,
0.051666270941495895,
0.1543661504983902,
0.007591819390654564,
0.08966708928346634,
-0.009908145293593407,
-0.11845279484987259,
0.02585678920149803,
0.10326802730560303,
-0.14141465723514557,
-0.1703118532896042,
-0.10191982239484787,
-0.07739383727312088,
-0.0493825301527977,
0.060406893491744995,
0.16293735802173615,
-0.01197187602519989,
-0.029036393389105797,
-0.004563949070870876,
0.05130787193775177,
-0.0015744707779958844,
0.1580931842327118,
0.03725139796733856,
0.0315404087305069,
-0.10050206631422043,
0.11145027726888657,
0.07148570567369461,
-0.05525204539299011,
0.04195445403456688,
0.1060948297381401,
-0.0782851055264473,
-0.017669714987277985,
0.058463551104068756,
0.05516016483306885,
-0.0805220901966095,
-0.06406709551811218,
-0.03206785395741463,
-0.11303896456956863,
0.007800730876624584,
0.03982751443982124,
0.03647831454873085,
0.054323721677064896,
0.02141365222632885,
-0.0009704448166303337,
0.05283515155315399,
0.12462934851646423,
0.09659399092197418,
0.016214152798056602,
-0.04772548750042915,
0.036452244967222214,
-0.04906541109085083,
0.029700567945837975,
-0.03440454602241516,
-0.029411353170871735,
-0.14228013157844543,
-0.03439786285161972,
-0.11806140094995499,
0.04027178883552551,
-0.016897959634661674,
0.019440224394202232,
-0.04804019257426262,
-0.09993056207895279,
0.0003103417402599007,
0.043748192489147186,
-0.0701451227068901,
-0.0073461649008095264,
-0.02705630101263523,
0.14972703158855438,
-0.18437765538692474,
0.018365681171417236,
0.06985791772603989,
-0.02461261861026287,
0.11251860111951828,
0.020869983360171318,
-0.04145007207989693,
0.043133996427059174,
-0.1374727487564087,
-0.04125652089715004,
-0.03720993921160698,
0.046532511711120605,
0.0636097863316536,
-0.03760723024606705,
-0.0008269849931821227,
0.03951537609100342,
-0.029898274689912796,
0.026079216971993446,
-0.0036783437244594097,
-0.06923367083072662,
0.09440115839242935,
0.012428772635757923,
-0.11161667853593826,
-0.011827218346297741,
0.03534704074263573,
0.1478452831506729,
0.020169513300061226,
0.11337479203939438,
-0.10640794038772583,
0.01709834858775139,
-0.12343316525220871,
-0.005260435398668051,
0.05021636188030243,
-0.0519295260310173,
-0.1754811406135559,
0.004359045531600714,
0.06197177991271019,
0.0008395844488404691,
0.20336972177028656,
0.04557956010103226,
0.012127348221838474,
0.03764225170016289,
0.004237745888531208,
-0.062008436769247055,
0.05973207205533981,
0.06375091522932053,
-0.039602115750312805,
0.019354475662112236,
-0.051138512790203094,
-0.04753349721431732,
-0.049874696880578995,
-0.012328657321631908,
0.19160591065883636,
0.19524981081485748,
0.12243402749300003,
0.017580749467015266,
0.11591900885105133,
-0.0026070312596857548,
-0.13549378514289856,
-0.068294957280159,
-0.006112131290137768,
0.06539758294820786,
-0.08045137673616409,
0.08422253280878067,
0.11973560601472855,
-0.21057046949863434,
0.09814377874135971,
-0.017568083480000496,
-0.023767517879605293,
-0.03124876320362091,
-0.13131646811962128,
-0.08143922686576843,
-0.08897282183170319,
0.007185798604041338,
-0.13561353087425232,
0.016301102936267853,
0.0605195127427578,
0.07567217946052551,
-0.0747697502374649,
0.13687588274478912,
-0.13619056344032288,
-0.05302803963422775,
0.1371772736310959,
0.041167158633470535,
0.0326249897480011,
0.05213707685470581,
-0.03469544276595116,
-0.06827231496572495,
0.10642940551042557,
0.018914105370640755,
0.06040401756763458,
-0.02298661880195141,
-0.0832066461443901,
-0.055000435560941696,
-0.044455938041210175,
-0.00829387828707695,
-0.0327356792986393,
-0.019620496779680252,
0.05531969666481018,
0.05110757425427437,
-0.037454646080732346,
0.020233722403645515,
0.19608983397483826,
-0.04515137895941734,
-0.09738770872354507,
-0.16331399977207184,
0.08408654481172562,
0.0020059675443917513,
0.04687075689435005,
0.016544567421078682,
-0.11886385083198547,
-0.0351799875497818,
0.09453979134559631,
0.24630896747112274,
-0.05971139296889305,
0.01609107106924057,
-0.03724634647369385,
0.01973053626716137,
0.025074921548366547,
0.158946692943573,
-0.03070790134370327,
0.309996098279953,
-0.0042894878424704075,
0.03223954141139984,
0.025484517216682434,
-0.028349431231617928,
-0.07175469398498535,
0.1619570255279541,
0.002134582493454218,
-0.06419651210308075,
-0.12889327108860016,
0.15350094437599182,
-0.0517379529774189,
-0.1688675731420517,
-0.09067760407924652,
-0.1444419026374817,
-0.1490480750799179,
-0.04693109542131424,
0.012521937489509583,
0.06527911126613617,
0.038909975439310074,
0.05951392650604248,
0.002245549578219652,
0.10323934257030487,
0.010374805890023708,
-0.05202432721853256,
-0.03185536339879036,
0.10158286243677139,
-0.07844902575016022,
0.2341461181640625,
0.029660029336810112,
0.04493715614080429,
0.07527606189250946,
0.0052522774785757065,
-0.07533597946166992,
-0.06105804443359375,
0.04984379932284355,
-0.1365474909543991,
-0.001373938168399036,
0.04125935584306717,
-0.013233249075710773,
0.07221750915050507,
0.08593103289604187,
-0.00256629497744143,
0.049178071320056915,
0.13323743641376495,
-0.021777860820293427,
-0.12367848306894302,
0.10006158798933029,
-0.10410046577453613,
0.12460745126008987,
0.14580869674682617,
-0.018078215420246124,
0.024301720783114433,
-0.04567757248878479,
0.06075894087553024,
0.004017956089228392,
0.07518324255943298,
-0.019670620560646057,
-0.1906527429819107,
-0.009243757463991642,
-0.04075399413704872,
0.037517137825489044,
-0.13069745898246765,
-0.07877696305513382,
-0.02392236515879631,
0.10040508210659027,
-0.021146021783351898,
0.1602938324213028,
0.07923097908496857,
-0.010756423696875572,
0.0016091304132714868,
-0.0745561271905899,
0.004538621753454208,
0.07503429055213928,
-0.09440456330776215,
-0.013118872418999672
] |
null | null |
transformers
|
# German BERT large
Released in Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** BERT large
**Language:** German
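For quick experimentation, a minimal fill-mask sketch (added for illustration; the example sentence is our own):
```python
from transformers import pipeline

# BERT-style models use [MASK] as the mask token
fill_mask = pipeline("fill-mask", model="deepset/gbert-large")
for prediction in fill_mask("Die Hauptstadt von Deutschland ist [MASK]."):
    print(prediction["token_str"], prediction["score"])
```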
## Performance
```
GermEval18 Coarse: 80.08
GermEval18 Fine: 52.48
GermEval14: 88.16
```
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
**Branden Chan:** [email protected]
**Stefan Schweter:** [email protected]
**Timo Möller:** [email protected]
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems that use question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData", "oscar"]}
|
fill-mask
|
deepset/gbert-large
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"dataset:oscar",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# German BERT large
Released in Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
Paper: here
Architecture: BERT large
Language: German
## Performance
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: URL@URL
Stefan Schweter: stefan@URL
Timo Möller: timo.moeller@URL
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German BERT large\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.",
"## Overview \nPaper: here \nArchitecture: BERT large \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base \ndeepset/gbert-large \ndeepset/gelectra-base \ndeepset/gelectra-large \ndeepset/gelectra-base-generator \ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: URL@URL \nStefan Schweter: stefan@URL \nTimo Möller: timo.moeller@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# German BERT large\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.",
"## Overview \nPaper: here \nArchitecture: BERT large \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base \ndeepset/gbert-large \ndeepset/gelectra-base \ndeepset/gelectra-large \ndeepset/gelectra-base-generator \ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: URL@URL \nStefan Schweter: stefan@URL \nTimo Möller: timo.moeller@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
86,
99,
15,
60,
29,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# German BERT large\n\nReleased, Oct 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.## Overview \nPaper: here \nArchitecture: BERT large \nLanguage: German## Performance \n\n\nSee also: \ndeepset/gbert-base \ndeepset/gbert-large \ndeepset/gelectra-base \ndeepset/gelectra-large \ndeepset/gelectra-base-generator \ndeepset/gelectra-large-generator## Authors\nBranden Chan: URL@URL \nStefan Schweter: stefan@URL \nTimo Möller: timo.moeller@URL"
] |
[
-0.04289546236395836,
0.1553763896226883,
-0.002303042449057102,
0.09324605762958527,
0.025813328102231026,
-0.034023936837911606,
0.1343233585357666,
0.07569053769111633,
-0.014769393019378185,
0.09826390445232391,
0.051240064203739166,
0.004097283352166414,
0.04774203151464462,
0.1495743989944458,
-0.0040818070992827415,
-0.23199816048145294,
0.012081943452358246,
-0.05763553828001022,
-0.11315209418535233,
0.030862057581543922,
0.14175660908222198,
-0.13014115393161774,
0.08713750541210175,
0.0021059459540992975,
-0.019265616312623024,
0.04165275767445564,
-0.08101482689380646,
-0.024514446035027504,
0.03959614038467407,
0.012856470420956612,
0.0445016548037529,
-0.01190387923270464,
0.02932616136968136,
-0.051906805485486984,
0.016422517597675323,
0.002167894272133708,
0.034574925899505615,
0.06126575171947479,
0.05074695125222206,
-0.011999009177088737,
-0.06332719326019287,
-0.13967260718345642,
-0.006318485829979181,
0.04551447182893753,
-0.06633929908275604,
-0.18376202881336212,
-0.06554587930440903,
0.13339218497276306,
0.02767667919397354,
0.02796768583357334,
-0.028995823115110397,
0.05738266557455063,
-0.07234383374452591,
0.02100854553282261,
0.1346394270658493,
-0.2890651524066925,
-0.09306935966014862,
0.0411917120218277,
0.0026933823246508837,
-0.01477156300097704,
-0.10792253911495209,
0.04156580939888954,
0.041372623294591904,
0.011304112151265144,
0.06426230818033218,
-0.04167403280735016,
0.061683010309934616,
-0.014936074614524841,
-0.10808853805065155,
0.04112344980239868,
0.17156364023685455,
0.024152522906661034,
-0.04509731009602547,
-0.14447425305843353,
-0.014638098888099194,
0.15664318203926086,
-0.005260754842311144,
-0.049147773534059525,
0.07485711574554443,
-0.031930871307849884,
-0.015646981075406075,
-0.07710646092891693,
-0.09740282595157623,
0.027893323451280594,
-0.007589041255414486,
0.1427697390317917,
0.01333186961710453,
-0.02166302129626274,
0.01741524413228035,
0.04053783416748047,
-0.08759956061840057,
-0.20327900350093842,
-0.01832384429872036,
-0.08388225734233856,
0.029107725247740746,
-0.0034577692858874798,
-0.03561290353536606,
-0.06409502774477005,
0.11531245708465576,
0.2018938511610031,
0.09332067519426346,
-0.00016289990162476897,
-0.023331576958298683,
0.025801315903663635,
0.0824882835149765,
0.17506390810012817,
-0.08035795390605927,
-0.13934703171253204,
0.04242271929979324,
-0.021289804950356483,
0.07745195925235748,
0.0013764810282737017,
-0.099760502576828,
-0.04679659754037857,
-0.024779556319117546,
-0.018594807013869286,
0.045130591839551926,
0.052946899086236954,
-0.08875256776809692,
-0.06423161178827286,
0.08317357301712036,
-0.12970654666423798,
0.04956800118088722,
0.020198088139295578,
-0.05966294929385185,
0.042870186269283295,
-0.05257681757211685,
0.024118755012750626,
-0.026880940422415733,
0.0926247388124466,
-0.02487827278673649,
-0.043660521507263184,
-0.09932924062013626,
-0.09972599148750305,
0.052902091294527054,
-0.09662044048309326,
-0.0023995875380933285,
-0.08436450362205505,
-0.008736622519791126,
-0.03677208349108696,
0.13897287845611572,
-0.035821668803691864,
-0.0142488032579422,
-0.08244801312685013,
-0.0073063792660832405,
0.04570317268371582,
0.0009828145848587155,
0.004892551805824041,
-0.0027880866546183825,
0.022663358598947525,
-0.06666427105665207,
0.05055021122097969,
-0.10381104052066803,
0.04291999712586403,
-0.06976215541362762,
0.002025023801252246,
-0.2494180053472519,
0.011722029186785221,
-0.1177753135561943,
0.04662306606769562,
-0.1220850944519043,
-0.035463497042655945,
0.0006722137331962585,
0.04759678617119789,
0.04474300891160965,
0.1261790543794632,
-0.07511813193559647,
-0.05357638746500015,
0.1508045196533203,
-0.02597161754965782,
-0.05093726888298988,
0.13924206793308258,
-0.05985104665160179,
0.03386539965867996,
0.06875289231538773,
0.21026578545570374,
0.08861685544252396,
-0.09617780894041061,
-0.08053254336118698,
-0.0027656874153763056,
0.02443145401775837,
0.01989728957414627,
0.1232159212231636,
-0.05005963146686554,
0.10165797173976898,
0.006577593740075827,
-0.0827173963189125,
0.005891175474971533,
0.016765279695391655,
-0.031167136505246162,
0.03254984691739082,
-0.06096517667174339,
0.09903685748577118,
-0.04172028601169586,
0.04099634289741516,
-0.08938311785459518,
-0.09929442405700684,
0.07452332973480225,
0.03511772304773331,
-0.02602197416126728,
-0.013537932187318802,
-0.060419075191020966,
0.07040088623762131,
-0.012773229740560055,
-0.0000724956626072526,
-0.0626172199845314,
-0.04032634198665619,
0.06371603906154633,
-0.03458058089017868,
0.12416472285985947,
0.09564771503210068,
0.08296111971139908,
0.00817089807242155,
-0.023054439574480057,
-0.025193501263856888,
-0.02447182685136795,
-0.04268329218029976,
0.0069580995477736,
-0.15473201870918274,
-0.005131092853844166,
-0.04846452921628952,
-0.00043521379120647907,
-0.043765995651483536,
-0.0036597715225070715,
0.019700486212968826,
0.12360155582427979,
0.03200497105717659,
-0.0350872240960598,
-0.002652975032106042,
-0.004977324046194553,
0.002455086214467883,
-0.0220223069190979,
0.01136302761733532,
0.006181182339787483,
-0.07358291745185852,
0.03496960178017616,
-0.02619991824030876,
0.04086121916770935,
0.06840074062347412,
-0.05674852058291435,
-0.031701140105724335,
-0.014174869284033775,
-0.04607313871383667,
-0.008011521771550179,
0.009788978844881058,
-0.08567912876605988,
0.18111738562583923,
-0.030901264399290085,
0.05115244537591934,
-0.08044737577438354,
-0.08148805797100067,
-0.04124552011489868,
-0.048544108867645264,
-0.04555021598935127,
0.1145537719130516,
-0.055068887770175934,
-0.1491202861070633,
0.1244659572839737,
0.1693451851606369,
-0.027806811034679413,
0.23535075783729553,
-0.039731454104185104,
-0.037356361746788025,
-0.005601499229669571,
0.052277565002441406,
0.006567387375980616,
0.10706689208745956,
-0.07033151388168335,
-0.035668931901454926,
0.050137538462877274,
-0.017890186980366707,
-0.013258433900773525,
-0.027706071734428406,
0.05062570422887802,
0.0051000104285776615,
0.002206393051892519,
0.058587055653333664,
0.0038010962307453156,
-0.016261715441942215,
0.07837744802236557,
0.1068677231669426,
-0.004261513240635395,
0.0251300148665905,
-0.039796292781829834,
-0.02744799107313156,
0.11113705486059189,
-0.09962117671966553,
-0.15904051065444946,
-0.1370663046836853,
-0.09280011802911758,
-0.12050776928663254,
0.023080192506313324,
0.023844623938202858,
-0.07142239809036255,
-0.09649237990379333,
-0.009222650900483131,
0.05269210785627365,
0.04402710124850273,
-0.04030977562069893,
0.022682491689920425,
0.00457494892179966,
0.022147303447127342,
-0.14442574977874756,
-0.044401220977306366,
-0.007145232520997524,
-0.06596262007951736,
-0.010184111073613167,
0.062220584601163864,
0.05222846195101738,
-0.0034378436394035816,
0.014389321208000183,
0.012110772542655468,
0.005965034011751413,
0.12968632578849792,
-0.0954451709985733,
0.0620880126953125,
0.14949876070022583,
-0.06586192548274994,
0.05468976870179176,
0.14642281830310822,
0.07185571640729904,
-0.021167350932955742,
0.004038608632981777,
0.026346303522586823,
0.009159618988633156,
-0.15521030128002167,
-0.09811265766620636,
-0.05843034386634827,
0.01667880453169346,
0.02428923361003399,
0.036765385419130325,
-0.1014302670955658,
0.051744136959314346,
-0.05763839930295944,
0.012020846828818321,
0.03466613218188286,
0.06425296515226364,
0.028918810188770294,
0.0029001219663769007,
0.04937281832098961,
-0.030336283147335052,
-0.10241058468818665,
0.11843231320381165,
0.0056221564300358295,
0.05298410728573799,
0.009555093012750149,
0.1049310564994812,
0.032848943024873734,
0.08334444463253021,
0.0010586775606498122,
0.07282953709363937,
-0.031657204031944275,
0.01264894101768732,
-0.04447389394044876,
-0.12582574784755707,
0.019830333068966866,
0.0879461020231247,
0.05056440457701683,
-0.009664753451943398,
-0.0014020571252331138,
-0.05256153270602226,
0.09211812168359756,
0.2160911113023758,
0.0823090523481369,
-0.13360914587974548,
-0.038458436727523804,
0.05916981399059296,
-0.008723254315555096,
-0.0560736320912838,
0.0005789022543467581,
0.06095615774393082,
-0.11295954883098602,
0.08273886144161224,
0.011989099904894829,
0.07609612494707108,
-0.03607219457626343,
0.05752580985426903,
-0.027797527611255646,
0.09628681093454361,
0.009265677072107792,
0.08463463932275772,
-0.23401440680027008,
0.14101554453372955,
0.0409759059548378,
0.05166139081120491,
-0.09010452777147293,
0.048069462180137634,
0.05655061826109886,
-0.13129065930843353,
0.14662052690982819,
-0.004958062432706356,
0.10537800937891006,
0.01435411162674427,
-0.18406186997890472,
0.042949289083480835,
0.13093078136444092,
-0.12341250479221344,
0.04599539190530777,
0.006786872632801533,
-0.037974707782268524,
-0.03390210494399071,
0.08952085673809052,
-0.11532928794622421,
-0.08959010243415833,
-0.004967410583049059,
-0.035700369626283646,
-0.0010799207957461476,
-0.03055717423558235,
-0.04225748032331467,
-0.10412590950727463,
0.2833673059940338,
-0.09506043046712875,
-0.043203938752412796,
-0.09769473224878311,
0.0175478532910347,
0.028910549357533455,
-0.07227323949337006,
0.05104505270719528,
-0.007391889579594135,
0.07414905726909637,
-0.0253499336540699,
-0.11422128975391388,
0.1152552142739296,
-0.07529839873313904,
-0.07165362685918808,
-0.012316478416323662,
0.1101999282836914,
0.06164482980966568,
0.03420943021774292,
0.0005107231554575264,
0.04143722727894783,
-0.037764061242341995,
-0.08150236308574677,
0.03792254626750946,
0.08604258298873901,
0.0819380059838295,
0.08502072095870972,
-0.15547850728034973,
-0.17061401903629303,
0.024410000070929527,
0.04359208419919014,
0.10316798090934753,
0.18913178145885468,
-0.03872518241405487,
0.12349510192871094,
0.16183042526245117,
-0.012548535130918026,
-0.34088781476020813,
0.011266948655247688,
0.054448384791612625,
0.07188576459884644,
0.012815553694963455,
-0.229374498128891,
0.15598203241825104,
0.021246837452054024,
-0.06949247419834137,
0.004057134501636028,
-0.05134476348757744,
-0.1096620038151741,
0.12470962107181549,
-0.022155648097395897,
0.0887589082121849,
-0.05910571292042732,
-0.05199519544839859,
-0.07805654406547546,
-0.04552275314927101,
0.057639870792627335,
-0.06555241346359253,
0.07275019586086273,
0.05126165598630905,
-0.03177356719970703,
0.03901723772287369,
-0.042527973651885986,
0.06642501056194305,
-0.03335396200418472,
0.03563351184129715,
-0.09804290533065796,
-0.054214123636484146,
0.02450747787952423,
-0.04515434429049492,
0.11820538341999054,
-0.057753290981054306,
-0.01157322432845831,
-0.10425668209791183,
-0.013836721889674664,
-0.03073796071112156,
0.11812800168991089,
-0.03296740725636482,
-0.10784829407930374,
-0.057008832693099976,
0.12970538437366486,
0.02734832465648651,
0.05420670658349991,
0.12536530196666718,
-0.015043635852634907,
-0.058024194091558456,
0.05172384902834892,
0.20783168077468872,
0.03546534478664398,
0.004603986162692308,
-0.03294682875275612,
-0.041362300515174866,
0.08375626057386398,
-0.040828999131917953,
0.048606183379888535,
0.09287349879741669,
-0.008013459853827953,
0.05544139817357063,
0.01995049975812435,
-0.08054101467132568,
0.0024225241504609585,
0.1021803691983223,
-0.14144578576087952,
-0.11227543652057648,
-0.07737889140844345,
-0.044066816568374634,
-0.047875579446554184,
0.04822875186800957,
0.15931513905525208,
-0.021805256605148315,
-0.026218194514513016,
0.03395785018801689,
0.03111366368830204,
-0.05395183339715004,
0.019521113485097885,
0.0399465374648571,
0.010377395898103714,
-0.06338758021593094,
0.10525296628475189,
0.075502909719944,
0.013511180877685547,
0.09568049013614655,
0.04657137766480446,
-0.02752227708697319,
-0.00436954153701663,
0.09218713641166687,
0.18633542954921722,
0.02118474431335926,
-0.02758178487420082,
-0.032248228788375854,
-0.050038259476423264,
-0.00883362628519535,
0.030828459188342094,
0.075405053794384,
0.025886641815304756,
0.010831886902451515,
0.050051890313625336,
0.025917859748005867,
0.08425400406122208,
0.050171684473752975,
0.033337753266096115,
-0.04366952180862427,
-0.015003500506281853,
-0.08996270596981049,
0.0034519725013524294,
-0.05645320937037468,
0.013495635241270065,
-0.16197569668293,
-0.05671123042702675,
-0.10956797748804092,
-0.03293103352189064,
-0.014967677183449268,
-0.04145658761262894,
-0.04250910133123398,
-0.029451340436935425,
0.023741770535707474,
0.016792181879281998,
-0.06942270696163177,
-0.013809272088110447,
-0.018114451318979263,
0.09929141402244568,
-0.1698947250843048,
-0.008527865633368492,
0.04675088822841644,
-0.04647759348154068,
0.10913655906915665,
-0.0059618866071105,
-0.014204522594809532,
-0.0029390479903668165,
-0.1843462884426117,
-0.11200553178787231,
-0.02009723335504532,
0.038349591195583344,
0.05488386005163193,
-0.019172443076968193,
-0.02976161055266857,
-0.00467841187492013,
-0.026731090620160103,
-0.013298136182129383,
0.055137749761343,
-0.07028309255838394,
0.12551474571228027,
-0.0008373624878004193,
-0.0757831484079361,
-0.05126231536269188,
0.06070535257458687,
0.052231092005968094,
0.036000438034534454,
0.07367874681949615,
-0.07121684402227402,
0.005495324265211821,
-0.08316893130540848,
0.03476312384009361,
0.03591480478644371,
-0.09575416892766953,
-0.1502656191587448,
-0.03261756896972656,
0.052009448409080505,
-0.02939148247241974,
0.0669291615486145,
0.03313443809747696,
-0.1654042750597,
0.04009974002838135,
-0.0381787084043026,
0.07112372666597366,
0.05307392776012421,
0.10049837827682495,
-0.013163737021386623,
-0.019921982660889626,
-0.04332362860441208,
0.019350290298461914,
0.019245482981204987,
0.028193121775984764,
0.14311976730823517,
0.2522919178009033,
0.07869439572095871,
0.04866452515125275,
0.14258071780204773,
-0.01167053822427988,
-0.13818860054016113,
0.05159788578748703,
0.07437483966350555,
0.11762338131666183,
-0.08161507546901703,
0.05169647932052612,
0.03786719590425491,
-0.17740362882614136,
0.0568816140294075,
-0.030160417780280113,
-0.0018836113158613443,
-0.0710706040263176,
-0.22465448081493378,
-0.0952642410993576,
-0.005411876831203699,
0.019419269636273384,
-0.10838858783245087,
0.009973878972232342,
0.00039484474109485745,
0.08213972300291061,
-0.054022178053855896,
0.1504688709974289,
-0.20582370460033417,
0.006692339666187763,
0.14360490441322327,
-0.010958973318338394,
0.006318851374089718,
0.028616569936275482,
-0.08059794455766678,
-0.07802515476942062,
0.13478054106235504,
0.04443471133708954,
0.04058222472667694,
-0.0026359143666923046,
-0.05863811448216438,
-0.020369723439216614,
-0.09302259236574173,
-0.024788103997707367,
-0.03622301295399666,
0.038144346326589584,
0.08543738722801208,
0.0038847769610583782,
-0.04210802540183067,
-0.010211212560534477,
0.16826075315475464,
-0.06598580628633499,
-0.04946304112672806,
-0.10067235678434372,
-0.0008080762927420437,
-0.04183114692568779,
0.0427224338054657,
-0.00428454065695405,
-0.09753527492284775,
-0.056238677352666855,
0.14687079191207886,
0.27080392837524414,
-0.07609955221414566,
0.023037558421492577,
0.009937618859112263,
0.020823311060667038,
-0.013735795393586159,
0.107918880879879,
0.01465121004730463,
0.2592301368713379,
-0.043866533786058426,
-0.013161190785467625,
0.02760692499577999,
-0.02101183496415615,
-0.17060014605522156,
0.06066765636205673,
0.06092154607176781,
-0.05042978748679161,
-0.060769688338041306,
0.07767915725708008,
-0.03244900703430176,
-0.11481408774852753,
-0.1145220547914505,
-0.1409415304660797,
-0.17634375393390656,
-0.049620624631643295,
0.041345905512571335,
0.05885665491223335,
0.09105642139911652,
0.007997415959835052,
0.009279956109821796,
0.017236093059182167,
-0.01188192144036293,
0.018309561535716057,
0.013491574674844742,
0.10028006881475449,
-0.0754111036658287,
0.1967741847038269,
0.03365272283554077,
-0.005215892102569342,
0.10649780184030533,
-0.04536876082420349,
-0.03437746688723564,
-0.05293780565261841,
0.03350028395652771,
-0.1485016644001007,
-0.012617215514183044,
0.07983160018920898,
-0.05044029280543327,
0.059935446828603745,
0.04813327267765999,
-0.051325585693120956,
0.01874629594385624,
0.10847566276788712,
-0.05291010066866875,
-0.12065199017524719,
0.11192556470632553,
-0.14307935535907745,
0.13924026489257812,
0.18515753746032715,
-0.018203917890787125,
-0.050551626831293106,
-0.04304720088839531,
0.07323505729436874,
0.05625985935330391,
0.012866045348346233,
-0.049011245369911194,
-0.16393789649009705,
-0.013884114101529121,
-0.0085697490721941,
0.013735141605138779,
-0.17889012396335602,
-0.050513625144958496,
-0.06941397488117218,
0.1011885404586792,
-0.0808091014623642,
0.010529372841119766,
0.10283482819795609,
0.01078187208622694,
-0.01666232943534851,
-0.0637575015425682,
-0.011462788097560406,
0.006999595556408167,
-0.06502816826105118,
-0.04762302339076996
] |
null | null |
transformers
|
# German ELECTRA base generator
Released in Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model.
The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction or for downstream tasks like NER, classification, or QA, please use deepset/gelectra-base.
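As a hedged illustration of such a masking experiment (the example sentence is our own, not from the card):
```python
from transformers import pipeline

# The ELECTRA generator is itself a small masked language model,
# so it can be queried through the fill-mask pipeline.
generator = pipeline("fill-mask", model="deepset/gelectra-base-generator")
for prediction in generator("Der [MASK] bellt laut."):
    print(prediction["token_str"], prediction["score"])
```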
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** ELECTRA base (generator)
**Language:** German
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData"]}
|
fill-mask
|
deepset/gelectra-base-generator
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# German ELECTRA base generator
Released in Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.
The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-base.
## Overview
Paper: here
Architecture: ELECTRA base (generator)
Language: German
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: 'URL [at] URL'
Stefan Schweter: 'stefan [at] URL'
Timo Möller: 'timo.moeller [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German ELECTRA base generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-base.",
"## Overview \nPaper: here \nArchitecture: ELECTRA base (generator)\nLanguage: German \n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# German ELECTRA base generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-base.",
"## Overview \nPaper: here \nArchitecture: ELECTRA base (generator)\nLanguage: German \n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
79,
147,
78,
40,
129
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# German ELECTRA base generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-base.## Overview \nPaper: here \nArchitecture: ELECTRA base (generator)\nLanguage: German \n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.02845471352338791,
0.18034368753433228,
-0.004699418321251869,
0.03879307955503464,
0.03695755451917648,
-0.005506027024239302,
0.06674375385046005,
0.09157684445381165,
0.05946332588791847,
0.10797607153654099,
0.007902253419160843,
0.00963008776307106,
0.09255615621805191,
0.11301461607217789,
0.03355332463979721,
-0.2252361923456192,
0.017733080312609673,
-0.10857079923152924,
-0.035055454820394516,
0.07543953508138657,
0.14673443138599396,
-0.09300576150417328,
0.09021205455064774,
-0.006006044335663319,
0.0037400596775114536,
0.04804255813360214,
-0.05168834701180458,
-0.02514636144042015,
0.03546127304434776,
0.046477340161800385,
0.0506020151078701,
-0.06345556676387787,
-0.0072330208495259285,
-0.1717040240764618,
0.025773921981453896,
0.057122889906167984,
0.0021556762512773275,
0.05155211314558983,
0.05904630199074745,
-0.04332815110683441,
-0.039136726409196854,
-0.1610802859067917,
0.011153582483530045,
0.07333965599536896,
-0.09069810062646866,
-0.15867829322814941,
-0.0684494599699974,
0.15371684730052948,
0.061402756720781326,
0.011784618720412254,
-0.03799106553196907,
0.039128899574279785,
-0.036426741629838943,
0.013751787133514881,
0.08630923926830292,
-0.26166439056396484,
-0.09316614270210266,
-0.02845802530646324,
0.008259081281721592,
0.020867375656962395,
-0.1339111030101776,
0.0349295511841774,
-0.015879591926932335,
0.02214943617582321,
0.03565693274140358,
-0.029118461534380913,
0.04834362491965294,
0.002285531722009182,
-0.07706169784069061,
0.039688412100076675,
0.06242012977600098,
-0.007471347227692604,
-0.04883778095245361,
-0.23358403146266937,
-0.01590108498930931,
0.10772500932216644,
-0.013897063210606575,
-0.05364694073796272,
0.027681702747941017,
-0.026237476617097855,
0.01811179332435131,
-0.10101374238729477,
-0.08922868221998215,
0.041611261665821075,
-0.023549353703856468,
0.15659938752651215,
0.03294866532087326,
-0.0039580995216965675,
0.06295661628246307,
0.0779929906129837,
-0.0721898302435875,
-0.1710909903049469,
-0.025694970041513443,
-0.10605709999799728,
-0.0822691097855568,
-0.008468583226203918,
-0.023334959521889687,
-0.08085381239652634,
0.0922803059220314,
0.22268694639205933,
0.005881681106984615,
0.04760510101914406,
-0.01455229427665472,
-0.025826994329690933,
0.08867921680212021,
0.2023419588804245,
-0.06622307747602463,
-0.19907249510288239,
0.023216350004076958,
-0.0693477913737297,
0.047129932790994644,
0.0011631647357717156,
-0.06136702373623848,
-0.04197951406240463,
-0.0009107348741963506,
0.028458410874009132,
0.05719136819243431,
0.024393022060394287,
-0.05778377875685692,
-0.11377862095832825,
0.10366754233837128,
-0.13717657327651978,
0.06384771317243576,
0.05917314440011978,
-0.04373474046587944,
0.06825024634599686,
-0.06557415425777435,
0.006974958349019289,
-0.057455744594335556,
0.034128762781620026,
-0.01691758632659912,
-0.02297850511968136,
-0.1152065098285675,
-0.07901137322187424,
0.038079675287008286,
-0.03464837372303009,
-0.04685508459806442,
-0.07350282371044159,
-0.0635247752070427,
-0.058744873851537704,
0.11828931421041489,
-0.05667341873049736,
-0.021662000566720963,
-0.028221067041158676,
-0.00030752617749385536,
0.03960828855633736,
0.008260251954197884,
-0.04785645753145218,
0.005962451454252005,
0.020886944606900215,
-0.08650350570678711,
0.003758820937946439,
-0.005777521058917046,
0.031113795936107635,
-0.05914538353681564,
-0.003670823061838746,
-0.31471046805381775,
0.0738278478384018,
-0.11330121010541916,
0.04461261257529259,
-0.13976052403450012,
-0.02498132735490799,
0.022010456770658493,
0.01763216033577919,
-0.006858418229967356,
0.07234445959329605,
-0.08599472045898438,
-0.04949295148253441,
0.1121862381696701,
-0.04859452322125435,
-0.025214681401848793,
0.13290713727474213,
-0.0749259665608406,
0.011785664595663548,
0.1040777787566185,
0.19394612312316895,
0.19677165150642395,
-0.09495615214109421,
-0.06757182627916336,
-0.06208450719714165,
-0.027856862172484398,
0.09046826511621475,
0.09150910377502441,
-0.06555719673633575,
0.08344001322984695,
0.01717015728354454,
-0.07113701850175858,
0.0033723788801580667,
0.029425792396068573,
-0.04265584051609039,
0.0471307672560215,
-0.04049910604953766,
0.11737976223230362,
-0.031463779509067535,
-0.024755338206887245,
-0.06330560892820358,
-0.14089073240756989,
0.04229527339339256,
0.023710299283266068,
-0.019629759714007378,
-0.016886837780475616,
-0.06992439925670624,
0.0144001804292202,
0.046522777527570724,
0.010778825730085373,
-0.05213788524270058,
-0.1308862864971161,
0.07822244614362717,
-0.09508431702852249,
0.15362896025180817,
-0.06498688459396362,
0.08151347190141678,
0.01092911884188652,
-0.04303838312625885,
-0.04205704107880592,
-0.006507533602416515,
-0.0428178645670414,
0.045031990855932236,
-0.15077726542949677,
-0.014362618327140808,
-0.033470913767814636,
0.06200660765171051,
0.0106870261952281,
-0.006559695117175579,
-0.022119199857115746,
0.15068253874778748,
0.0450245626270771,
-0.035980772227048874,
0.004027181304991245,
-0.007307711057364941,
0.050432562828063965,
-0.027579568326473236,
-0.01597261242568493,
-0.03152190148830414,
-0.05434907600283623,
0.02122286520898342,
-0.03190965950489044,
-0.019532646983861923,
0.015487498603761196,
0.047037459909915924,
-0.06274427473545074,
-0.054924942553043365,
-0.01573786325752735,
-0.021976986899971962,
-0.01911342144012451,
-0.10499320179224014,
0.24645738303661346,
0.031183786690235138,
-0.019045328721404076,
-0.07689617574214935,
-0.09757234156131744,
-0.08237060159444809,
-0.022091887891292572,
-0.01358883362263441,
0.0827290341258049,
-0.06497067958116531,
-0.1764475256204605,
0.11419562995433807,
0.08899190276861191,
0.012506851926445961,
0.25533512234687805,
-0.041964106261730194,
-0.03774365410208702,
-0.037253156304359436,
0.03797656670212746,
-0.017609579488635063,
0.07839237153530121,
0.049056027084589005,
0.01713176816701889,
0.05669117346405983,
-0.033925365656614304,
-0.001336496090516448,
0.014868751168251038,
0.06536222994327545,
-0.02542477659881115,
-0.01042488869279623,
0.09321211278438568,
0.013934586197137833,
0.06009306758642197,
0.050162989646196365,
0.07802452892065048,
0.06130389869213104,
-0.0009519457817077637,
-0.04122622683644295,
-0.03298064321279526,
0.10795606672763824,
-0.15615566074848175,
-0.1997176557779312,
-0.16657358407974243,
-0.06426814943552017,
-0.13054528832435608,
-0.023116908967494965,
-0.007578342221677303,
-0.05384279042482376,
-0.09829802811145782,
-0.013277145102620125,
0.10008952021598816,
0.11082124710083008,
-0.08910282701253891,
-0.08114822953939438,
-0.006191698834300041,
0.03522992134094238,
-0.14349059760570526,
-0.02433093823492527,
0.013823694549500942,
-0.06172969192266464,
-0.026570646092295647,
0.10473451018333435,
0.03067075088620186,
0.050709184259176254,
0.02059360034763813,
-0.002270245458930731,
-0.004941048566251993,
0.1860731691122055,
-0.13349828124046326,
0.12998805940151215,
0.1312703639268875,
-0.07194127142429352,
0.06745028495788574,
0.19543364644050598,
0.07028888911008835,
0.014700431376695633,
-0.0030176423024386168,
0.06474503129720688,
0.03306518867611885,
-0.2037918120622635,
-0.09176237881183624,
-0.03259638696908951,
-0.04909498989582062,
-0.010998054407536983,
0.04088228568434715,
0.012536092661321163,
-0.0017600252758711576,
-0.09278852492570877,
-0.07432743906974792,
0.06304287165403366,
0.06779489666223526,
0.09432685375213623,
0.024790676310658455,
0.04230832681059837,
-0.02384898066520691,
-0.06837955862283707,
0.0897197350859642,
0.02511587180197239,
0.11028163880109787,
0.0347442701458931,
0.15300674736499786,
0.021373342722654343,
0.06636299192905426,
0.0093889981508255,
0.009871558286249638,
-0.04643353447318077,
0.009715494699776173,
-0.02031560428440571,
-0.09467840939760208,
0.023712627589702606,
0.07186112552881241,
0.10929325222969055,
-0.008384202606976032,
0.06497541069984436,
-0.056126076728105545,
0.15290766954421997,
0.1843542903661728,
0.0250814538449049,
-0.04622010886669159,
-0.04593779891729355,
0.047850728034973145,
-0.10000545531511307,
-0.08596553653478622,
-0.02206205017864704,
0.019273197278380394,
-0.15736986696720123,
0.07892482727766037,
-0.028305968269705772,
0.07045748829841614,
-0.0396704338490963,
0.01824835129082203,
0.01359399501234293,
0.1514711081981659,
-0.009719764813780785,
0.07400672882795334,
-0.13713067770004272,
0.0488346591591835,
0.03564155474305153,
0.09096676856279373,
-0.05194302275776863,
0.049511197954416275,
0.04112709313631058,
-0.1190839484333992,
0.10071533173322678,
-0.008249333128333092,
-0.018981866538524628,
0.04406779631972313,
-0.15705712139606476,
0.020511453971266747,
0.1316862553358078,
-0.14110787212848663,
0.06460016220808029,
-0.006003448739647865,
-0.029492434114217758,
-0.07744138687849045,
0.028059329837560654,
-0.113493412733078,
-0.11193516850471497,
-0.007420324254781008,
-0.08986659348011017,
0.07253576815128326,
-0.033969633281230927,
0.026518482714891434,
-0.1017688438296318,
0.21766892075538635,
-0.1901436150074005,
-0.09124679863452911,
-0.10321208089590073,
-0.035815898329019547,
0.10829197615385056,
-0.09973111003637314,
0.07708875089883804,
-0.004035052843391895,
0.10404852777719498,
-0.015219555236399174,
-0.11437632888555527,
0.04813409596681595,
-0.054919082671403885,
-0.0853213295340538,
0.004597263410687447,
0.16435350477695465,
0.08946177363395691,
0.036351218819618225,
0.0011973227374255657,
0.03304153308272362,
-0.030378658324480057,
-0.11013294756412506,
0.027662746608257294,
0.09223349392414093,
-0.012904997915029526,
0.10202477127313614,
-0.10707467049360275,
-0.19001014530658722,
-0.060260023921728134,
0.06297635287046432,
0.12725792825222015,
0.08182106167078018,
-0.040933456271886826,
0.17758506536483765,
0.17341600358486176,
-0.03851834312081337,
-0.2319825142621994,
-0.030222948640584946,
0.09190918505191803,
0.04783226549625397,
0.01516981702297926,
-0.26198309659957886,
0.16135548055171967,
0.03396584838628769,
-0.040853485465049744,
0.025903712958097458,
-0.10454126447439194,
-0.11926238983869553,
0.09541046619415283,
-0.0545184463262558,
-0.12078233808279037,
-0.06158226355910301,
-0.07726728171110153,
-0.05608471855521202,
-0.0613093338906765,
0.09351585805416107,
-0.08746584504842758,
0.047544535249471664,
0.04459824413061142,
0.07182860374450684,
0.04257746413350105,
-0.018472274765372276,
0.09011467546224594,
-0.019044263288378716,
0.053140029311180115,
-0.05817348137497902,
-0.029892483726143837,
-0.0006594397127628326,
-0.04663717746734619,
0.10742172598838806,
-0.057598814368247986,
-0.024221397936344147,
-0.07583733648061752,
-0.02237018011510372,
-0.07220166176557541,
0.11558303982019424,
-0.043694693595170975,
-0.07328613102436066,
-0.0651090070605278,
0.13958677649497986,
0.03614211082458496,
0.018940115347504616,
0.058568719774484634,
-0.03825829550623894,
-0.007034467998892069,
0.10266421735286713,
0.1695038229227066,
0.10142190009355545,
-0.07890308648347855,
-0.031955916434526443,
-0.01986197754740715,
0.06586236506700516,
-0.006851113401353359,
0.0589669868350029,
0.07911111414432526,
-0.015060336329042912,
0.10062547773122787,
-0.04772218316793442,
-0.12278714030981064,
-0.022192571312189102,
0.09351757168769836,
-0.10693812370300293,
-0.17230768501758575,
-0.06138996779918671,
-0.008784390054643154,
-0.08889146894216537,
0.006930551026016474,
0.14963991940021515,
0.03414987400174141,
-0.06521991640329361,
0.04788137227296829,
0.05512325465679169,
-0.03307420015335083,
0.03576270118355751,
0.06326974928379059,
0.006752911023795605,
-0.08523734658956528,
0.09483254700899124,
0.08125264197587967,
0.03322792425751686,
0.06147810444235802,
0.12315306067466736,
-0.0038546284195035696,
-0.019274596124887466,
0.06632283329963684,
0.1604156792163849,
-0.009984791278839111,
-0.027679208666086197,
0.021079184487462044,
-0.06585767865180969,
-0.013056011870503426,
0.0549318864941597,
0.033063165843486786,
-0.00626270892098546,
0.010457171127200127,
0.028150387108325958,
0.05423598363995552,
0.12665370106697083,
0.06825339794158936,
0.018781235441565514,
-0.02648533135652542,
-0.008991164155304432,
-0.056024327874183655,
-0.04379093647003174,
-0.02402014471590519,
-0.016660859808325768,
-0.13901856541633606,
-0.046009842306375504,
-0.11144237965345383,
-0.04083643853664398,
-0.02848890982568264,
0.0023140846751630306,
0.0068188440054655075,
-0.011597373522818089,
0.0017813603626564145,
0.020820943638682365,
-0.05361684784293175,
-0.04839057847857475,
-0.003260156372562051,
0.1145116463303566,
-0.19192692637443542,
0.0047637661918997765,
0.0886056199669838,
-0.037540458142757416,
0.12171613425016403,
0.03826174512505531,
-0.00421752268448472,
0.03488444164395332,
-0.14182229340076447,
-0.06866753101348877,
-0.06227642670273781,
0.022748645395040512,
0.022112522274255753,
-0.12250389158725739,
-0.008635084144771099,
-0.02874857373535633,
-0.08155644685029984,
0.004631134681403637,
0.022085897624492645,
-0.08723326027393341,
0.1482793539762497,
0.02268947847187519,
-0.07782317698001862,
-0.06024833396077156,
0.043474093079566956,
0.060148052871227264,
0.011660752817988396,
0.08863882720470428,
-0.08779258280992508,
0.03729688748717308,
-0.07660387456417084,
0.018575597554445267,
0.03355460613965988,
-0.03185376897454262,
-0.11221563071012497,
-0.013093012385070324,
0.04039076715707779,
-0.008855888620018959,
0.06618263572454453,
-0.005391324870288372,
-0.08893773704767227,
0.035979125648736954,
0.013605880551040173,
-0.055616624653339386,
0.07305994629859924,
0.02103155478835106,
-0.03794220834970474,
-0.016553837805986404,
-0.03161334991455078,
-0.04100510850548744,
-0.05715106055140495,
-0.035319652408361435,
0.13140755891799927,
0.22834613919258118,
0.12155528366565704,
0.005477042868733406,
0.14087419211864471,
-0.00917440839111805,
-0.08571774512529373,
0.12599192559719086,
0.02333191968500614,
0.03792686387896538,
-0.07065664976835251,
0.04983806237578392,
0.06637369096279144,
-0.21311484277248383,
0.10329752415418625,
0.020123226568102837,
-0.018702438101172447,
-0.042016007006168365,
-0.20425215363502502,
-0.08153191953897476,
-0.02460259012877941,
0.013428820297122002,
-0.10919757187366486,
0.03663553297519684,
-0.024383436888456345,
0.05934178829193115,
-0.06856956332921982,
0.1801551878452301,
-0.14306707680225372,
-0.07579931616783142,
0.14201834797859192,
0.04424377903342247,
0.037300966680049896,
0.06397885084152222,
-0.007943746633827686,
-0.07512615621089935,
0.10230819880962372,
0.059338755905628204,
0.049498043954372406,
-0.01114527229219675,
-0.05372907221317291,
-0.06744305044412613,
-0.09826251864433289,
0.004778604488819838,
-0.0376022644340992,
-0.030129337683320045,
0.14509214460849762,
0.020684722810983658,
-0.036331720650196075,
-0.0340915210545063,
0.14649255573749542,
-0.03433394432067871,
-0.08863077312707901,
-0.1598692387342453,
0.04521279036998749,
-0.02349299006164074,
0.025505386292934418,
0.006709488108754158,
-0.13036873936653137,
-0.038541700690984726,
0.06150539591908455,
0.20868784189224243,
-0.08266131579875946,
0.01772942766547203,
0.00509417662397027,
0.02030271105468273,
-0.0068778591230511665,
0.08661124110221863,
-0.031148692592978477,
0.26053386926651,
-0.029197080060839653,
0.07886842638254166,
0.035789456218481064,
-0.0661495253443718,
-0.15834477543830872,
0.10145421326160431,
0.01559967640787363,
-0.03672775253653526,
-0.05675168335437775,
0.1377895027399063,
-0.10171128064393997,
-0.14754702150821686,
-0.07974880188703537,
-0.075556680560112,
-0.15803909301757812,
-0.049036655575037,
0.07297331839799881,
0.053056709468364716,
0.06353358924388885,
0.051812395453453064,
-0.03623998910188675,
0.10032191127538681,
0.016223736107349396,
0.0352814756333828,
-0.010958550497889519,
0.14037492871284485,
-0.07920625060796738,
0.19814594089984894,
0.03863468021154404,
0.0013279286213219166,
0.10293461382389069,
-0.03930314630270004,
-0.04264216497540474,
-0.04235371574759483,
0.044800542294979095,
-0.16453799605369568,
-0.0060887327417731285,
0.13282246887683868,
0.02504883147776127,
0.10571850091218948,
0.08100336045026779,
-0.0009533868287689984,
0.02018028125166893,
0.06454197317361832,
-0.021804578602313995,
-0.11360737681388855,
0.15852925181388855,
-0.12145712971687317,
0.12919820845127106,
0.16909217834472656,
-0.011322311125695705,
-0.0148009043186903,
-0.05674736574292183,
0.043997086584568024,
0.06265432387590408,
0.08434715121984482,
-0.04969708248972893,
-0.14571644365787506,
0.01465770322829485,
0.042030248790979385,
0.054976534098386765,
-0.10845411568880081,
-0.08486001938581467,
-0.008869537152349949,
0.08832363039255142,
-0.05366462096571922,
0.07523998618125916,
0.10197185724973679,
-0.004234050866216421,
0.014705067500472069,
-0.04105531424283981,
-0.01267929095774889,
0.022380638867616653,
-0.07781114429235458,
0.004538693930953741
] |
null | null |
transformers
|

## Overview
**Language model:** gelectra-base-germanquad-distilled
**Language:** German
**Training data:** GermanQuAD train set (~ 12MB)
**Eval data:** GermanQuAD test set (~ 5MB)
**Infrastructure**: 1x V100 GPU
**Published**: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-base model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published [online](https://deepset.ai/germanquad).
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions and 2204·3−76 = 6536 answers (we removed 76 wrong answers).
- In addition to the annotations in GermanQuAD, haystack's distillation feature was used for training. deepset/gelectra-large-germanquad was used as the teacher model.
See https://deepset.ai/germanquad for more details and dataset download in SQuAD format.
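For a quick start, the model can be queried through the Transformers question-answering pipeline. This is a minimal sketch; the German question/context pair is made up for illustration and is not taken from GermanQuAD:
```python
from transformers import pipeline

# Load the distilled German QA model from the Hugging Face hub.
qa_pipeline = pipeline(
    "question-answering",
    model="deepset/gelectra-base-germanquad-distilled",
)

# Illustrative input (English: "When was the model published?").
result = qa_pipeline(
    question="Wann wurde das Modell veröffentlicht?",
    context="Das destillierte GermanQuAD-Modell wurde am 21. April 2021 veröffentlicht.",
)
print(result["answer"], result["score"])
```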
## Hyperparameters
```
batch_size = 24
n_epochs = 6
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 2
distillation_loss_weight = 0.75
```
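The last two values configure the distillation objective. As a rough illustration of how a temperature and a distillation loss weight typically combine the teacher's soft targets with the ordinary hard-label loss, here is a generic PyTorch sketch of the standard formulation — not necessarily haystack's exact implementation:
```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, distillation_loss_weight=0.75):
    # Soft-target term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard-target term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return distillation_loss_weight * soft + (1 - distillation_loss_weight) * hard
```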
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on [GermanQuAD](https://deepset.ai/germanquad).
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.
```
"exact": 62.4773139745916
"f1": 80.9488017070188
```

## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "datasets": ["deepset/germanquad"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg"}
|
question-answering
|
deepset/gelectra-base-germanquad-distilled
|
[
"transformers",
"pytorch",
"safetensors",
"electra",
"question-answering",
"exbert",
"de",
"dataset:deepset/germanquad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #region-us
|
!bert_image
## Overview
Language model: gelectra-base-germanquad-distilled
Language: German
Training data: GermanQuAD train set (~ 12MB)
Eval data: GermanQuAD test set (~ 5MB)
Infrastructure: 1x V100 GPU
Published: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-base model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions and 2204·3−76 = 6536 answers (we removed 76 wrong answers).
- In addition to the annotations in GermanQuAD, haystack's distillation feature was used for training. deepset/gelectra-large-germanquad was used as the teacher model.
See URL for more details and dataset download in SQuAD format.
## Hyperparameters
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.
!performancetable
## Authors
- Timo Möller: 'timo.moeller [at] URL'
- Julian Risch: 'URL [at] URL'
- Malte Pietsch: 'malte.pietsch [at] URL'
- Michel Bartels: 'michel.bartels [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gelectra-base-germanquad-distilled \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n- In addition to the annotations in GermanQuAD, haystack's distillation feature was used for training. deepset/gelectra-large-germanquad was used as the teacher model.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on \\\\\\\\germanquad. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n\n!performancetable",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #region-us \n",
"## Overview\nLanguage model: gelectra-base-germanquad-distilled \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n- In addition to the annotations in GermanQuAD, haystack's distillation feature was used for training. deepset/gelectra-large-germanquad was used as the teacher model.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on \\\\\\\\germanquad. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n\n!performancetable",
"## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
55,
64,
179,
5,
117,
63,
129
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #region-us \n## Overview\nLanguage model: gelectra-base-germanquad-distilled \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n- In addition to the annotations in GermanQuAD, haystack's distillation feature was used for training. deepset/gelectra-large-germanquad was used as the teacher model.\n\nSee URL for more details and dataset download in SQuAD format.## Hyperparameters## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on \\\\\\\\germanquad. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n\n!performancetable## Authors\n- Timo Möller: 'timo.moeller [at] URL'\n- Julian Risch: 'URL [at] URL'\n- Malte Pietsch: 'malte.pietsch [at] URL'\n- Michel Bartels: 'michel.bartels [at] URL'"
] |
[
-0.06607553362846375,
0.15553893148899078,
-0.0034133640583604574,
0.09133598208427429,
0.11527649313211441,
0.029385237023234367,
0.1307239681482315,
0.0699366107583046,
-0.01008154358714819,
0.09735699743032455,
0.017135068774223328,
-0.06149810552597046,
0.08624418079853058,
0.06062275543808937,
0.03844572976231575,
-0.1917555183172226,
0.039908021688461304,
-0.1032692939043045,
-0.009767751209437847,
0.10712885111570358,
0.12314338982105255,
-0.08529828488826752,
0.09230797737836838,
-0.020925696939229965,
-0.017609568312764168,
0.048376549035310745,
-0.015803318470716476,
-0.009608864784240723,
0.06967681646347046,
0.04622236639261246,
0.08112021535634995,
-0.04010481387376785,
0.061316411942243576,
-0.15227818489074707,
0.011865920387208462,
0.05933310091495514,
-0.00747069763019681,
0.05168614909052849,
0.03981434553861618,
0.006350455339998007,
0.04379298910498619,
-0.07228939980268478,
0.027495218440890312,
0.02473687380552292,
-0.11043459922075272,
-0.06010346859693527,
-0.19738955795764923,
0.12964077293872833,
0.035788312554359436,
0.0954221561551094,
-0.01977286860346794,
0.06291653960943222,
-0.05487027019262314,
0.025008147582411766,
0.16019907593727112,
-0.2048434317111969,
-0.05903778597712517,
0.04905746877193451,
-0.02487695775926113,
0.08890442550182343,
-0.11429133266210556,
0.027783628553152084,
0.026617880910634995,
0.024409687146544456,
-0.026345491409301758,
-0.011659946292638779,
0.02976846881210804,
-0.018725894391536713,
-0.09658166766166687,
-0.0579962320625782,
0.11315281689167023,
0.017372654750943184,
-0.0814477801322937,
-0.14971423149108887,
-0.02218240685760975,
0.07071594148874283,
0.03361174091696739,
-0.04494742676615715,
-0.009279643185436726,
-0.01482363510876894,
0.015071315690875053,
-0.11570437997579575,
-0.08357211947441101,
0.018029995262622833,
-0.012912381440401077,
0.18672320246696472,
0.012666916474699974,
0.052497923374176025,
0.011342811398208141,
0.09734150767326355,
-0.0767657682299614,
-0.08534656465053558,
-0.04740620777010918,
-0.08123590052127838,
-0.04526050388813019,
-0.01280180923640728,
-0.05499007925391197,
-0.06924564391374588,
-0.015398483723402023,
0.18754509091377258,
-0.04002096876502037,
0.04169653728604317,
-0.008594983257353306,
-0.010695080272853374,
0.051987286657094955,
0.21287858486175537,
-0.05528932809829712,
-0.13934524357318878,
-0.023482872173190117,
0.02169373817741871,
0.06135816127061844,
-0.028376523405313492,
-0.0057329218834638596,
-0.026159541681408882,
0.0499078631401062,
0.05567914992570877,
0.030265802517533302,
0.0018873297376558185,
-0.0628436729311943,
-0.07376962900161743,
0.09077279269695282,
-0.1377333551645279,
0.04686254635453224,
0.029320722445845604,
-0.0660269632935524,
0.09522034227848053,
0.0009049157379195094,
-0.015478541143238544,
-0.080058254301548,
0.07502251118421555,
-0.032601140439510345,
-0.007127915509045124,
-0.09439757466316223,
-0.10376700013875961,
0.0276379082351923,
-0.07245831191539764,
-0.06943504512310028,
-0.047329943627119064,
-0.12129098922014236,
-0.08846965432167053,
0.05702400580048561,
-0.02587740123271942,
0.03784078732132912,
-0.02464640885591507,
-0.013863950036466122,
0.019725138321518898,
0.0008586667245253921,
-0.05086923763155937,
0.016023168340325356,
0.04417632147669792,
-0.07504991441965103,
-0.001691003912128508,
-0.03773389756679535,
0.03261300548911095,
-0.07746555656194687,
-0.02561618573963642,
-0.20683039724826813,
0.09376588463783264,
-0.10786578059196472,
-0.04113852605223656,
-0.0951743870973587,
-0.07022236287593842,
0.00019293816876597703,
0.005790445953607559,
0.08571299910545349,
0.11896415799856186,
-0.11547886580228806,
-0.042711324989795685,
0.0845610722899437,
-0.12736834585666656,
-0.048699356615543365,
0.11543851345777512,
-0.06572563201189041,
0.020713848993182182,
0.10413452237844467,
0.1355828493833542,
0.11530432850122452,
-0.13708364963531494,
-0.11858190596103668,
-0.07921358197927475,
0.023531557992100716,
0.13050773739814758,
0.05824917554855347,
-0.0581294447183609,
-0.02168056182563305,
0.02046472392976284,
-0.13596150279045105,
-0.03441562131047249,
-0.01181798055768013,
-0.05820482224225998,
0.010127102956175804,
-0.03312087059020996,
0.10438905656337738,
0.008314582519233227,
-0.04126961529254913,
-0.053505703806877136,
-0.07839009910821915,
0.031482893973588943,
0.061916425824165344,
-0.05315593630075455,
0.031319305300712585,
-0.011282361112535,
0.008959335274994373,
0.040896810591220856,
-0.012775174342095852,
-0.14723339676856995,
-0.1874387413263321,
0.04568139463663101,
-0.07091381400823593,
0.08019590377807617,
-0.026797378435730934,
0.06394997984170914,
0.02488519623875618,
-0.0934198871254921,
-0.04483315721154213,
-0.11625181883573532,
-0.026382770389318466,
-0.030803602188825607,
-0.09853611886501312,
-0.055093131959438324,
-0.011160704307258129,
0.06420430541038513,
-0.039592307060956955,
-0.012853918597102165,
-0.0030805100686848164,
0.1051211804151535,
0.028992261737585068,
-0.07677999138832092,
-0.045938342809677124,
0.007725576404482126,
0.021500393748283386,
-0.041144873946905136,
-0.02795444428920746,
0.016085531562566757,
-0.02166522666811943,
0.00832595955580473,
-0.08978765457868576,
-0.10625268518924713,
0.05704548582434654,
0.10911468416452408,
-0.0823749452829361,
-0.008575350977480412,
-0.041889142245054245,
0.016014201566576958,
-0.09314881265163422,
-0.09806308150291443,
0.12517112493515015,
0.026303328573703766,
0.03893103823065758,
-0.05554770678281784,
-0.03483480587601662,
-0.04124060273170471,
0.0052335551008582115,
-0.06562026590108871,
0.0920361801981926,
-0.029794568195939064,
-0.11551697552204132,
0.11094025522470474,
0.08716639131307602,
0.027816923335194588,
0.15910233557224274,
-0.019280990585684776,
-0.07584044337272644,
-0.06532193720340729,
0.01958324946463108,
-0.014030192978680134,
0.11503374576568604,
0.08941342681646347,
0.0506758913397789,
0.041653864085674286,
0.04978952184319496,
0.005200900603085756,
-0.07546360045671463,
0.03128477931022644,
0.0029032034799456596,
-0.02114209719002247,
-0.01944895088672638,
-0.013005906715989113,
0.021271035075187683,
0.09173937141895294,
-0.0019963679369539022,
0.06296377629041672,
-0.012579436413943768,
-0.03877708688378334,
-0.10254482179880142,
0.16040144860744476,
-0.11241426318883896,
-0.20255252718925476,
-0.15347038209438324,
0.03626143932342529,
-0.14176030457019806,
-0.005878710187971592,
0.012213320471346378,
-0.046167805790901184,
-0.09365565329790115,
-0.07751604914665222,
0.0595918707549572,
0.036735888570547104,
-0.06912414729595184,
0.010712571442127228,
-0.04287944734096527,
0.019317176192998886,
-0.1614350974559784,
-0.01576610468327999,
-0.03806715086102486,
-0.058182243257761,
-0.002784350188449025,
0.03836330026388168,
0.050090353935956955,
0.026764696463942528,
-0.012591227889060974,
0.0028200934175401926,
-0.0016424818895757198,
0.25626930594444275,
-0.10970412194728851,
0.10449390858411789,
0.07633136957883835,
-0.02229188196361065,
0.08382123708724976,
0.09200675040483475,
0.037638597190380096,
-0.028999216854572296,
0.041452113538980484,
0.08003876358270645,
-0.02355695515871048,
-0.256764680147171,
-0.10092636942863464,
-0.031116940081119537,
-0.05650004372000694,
0.04236986115574837,
0.017296290025115013,
0.0181314405053854,
0.022903496399521828,
-0.11201835423707962,
-0.06098494306206703,
0.08245564997196198,
0.040631841868162155,
0.07220553606748581,
0.0034352310467511415,
0.058726467192173004,
-0.054786525666713715,
-0.012681257911026478,
0.12866739928722382,
0.0032533660996705294,
0.18340082466602325,
-0.018511449918150902,
0.043853819370269775,
0.014432287774980068,
0.07417961210012436,
-0.05680495873093605,
0.10227375477552414,
-0.013964763842523098,
0.018175821751356125,
-0.010424538515508175,
-0.07177630066871643,
-0.0012571605620905757,
0.05864257365465164,
0.0551813580095768,
-0.008313664235174656,
-0.05158507823944092,
-0.017036885023117065,
0.09965791553258896,
0.19186657667160034,
0.023308441042900085,
-0.08016601949930191,
-0.09685801714658737,
0.011198053136467934,
-0.018971776589751244,
-0.0618620328605175,
-0.013693897053599358,
0.10551990568637848,
-0.1622237116098404,
0.07129638642072678,
-0.012386503629386425,
0.07371097803115845,
-0.0012157667661085725,
-0.026222672313451767,
0.031664103269577026,
0.04751552641391754,
-0.04433182626962662,
0.0730031281709671,
-0.19599685072898865,
0.1027432307600975,
0.015887178480625153,
0.08314395695924759,
-0.04708065092563629,
0.03559776023030281,
0.04537784308195114,
-0.044101595878601074,
0.11911436915397644,
0.05453818291425705,
-0.13017867505550385,
-0.023268694058060646,
-0.09147638827562332,
0.007644645869731903,
0.1623084843158722,
-0.0867609903216362,
0.08315112441778183,
-0.03213939443230629,
0.01977367140352726,
-0.01180318184196949,
0.02861405722796917,
-0.1413121372461319,
-0.14663827419281006,
0.03989344462752342,
-0.038007933646440506,
0.006410154048353434,
-0.03697361424565315,
-0.07060471177101135,
-0.08315405249595642,
0.15200230479240417,
-0.11673025041818619,
-0.06553709506988525,
-0.11977572739124298,
-0.0025202317629009485,
0.1336173266172409,
-0.09418972581624985,
0.02669311873614788,
0.015549997799098492,
0.08423992246389389,
-0.050790444016456604,
-0.1134396642446518,
0.04793490469455719,
-0.07864117622375488,
-0.15472353994846344,
-0.005961052607744932,
0.13115814328193665,
0.1082567423582077,
0.040136076509952545,
-0.00526598934084177,
0.0524778887629509,
0.009263758547604084,
-0.15208758413791656,
0.012275198474526405,
0.10613657534122467,
-0.0017617200501263142,
0.1551692932844162,
-0.07496931403875351,
-0.15081122517585754,
-0.05152092128992081,
0.019633980467915535,
0.07023990154266357,
0.11240334063768387,
-0.04933473467826843,
0.12731575965881348,
0.1706283688545227,
-0.07371586561203003,
-0.2545013427734375,
0.017303207889199257,
0.07656358182430267,
0.0213350597769022,
0.06480851024389267,
-0.1912820041179657,
0.08979462087154388,
0.059854038059711456,
-0.021414268761873245,
-0.02959699183702469,
-0.22459322214126587,
-0.11322150379419327,
0.019163338467478752,
0.003017089329659939,
0.05403260886669159,
-0.062232907861471176,
-0.03882378339767456,
-0.029980339109897614,
-0.13425090909004211,
0.11578836292028427,
0.0026199796702712774,
0.08238799124956131,
0.031527578830718994,
0.054958321154117584,
0.03913097083568573,
-0.04520002007484436,
0.11971680074930191,
0.050126783549785614,
0.05808737501502037,
-0.054989080876111984,
-0.049827441573143005,
0.08199933171272278,
-0.0523095540702343,
0.14027230441570282,
0.0034241166431456804,
0.07369279116392136,
-0.11517857760190964,
-0.01307682879269123,
-0.11085302382707596,
0.0768095925450325,
-0.08579013496637344,
-0.023582978174090385,
-0.09485946595668793,
0.12729007005691528,
0.07995136082172394,
0.005922912620007992,
-0.010497174225747585,
0.02755328267812729,
0.018840331584215164,
0.08439819514751434,
0.07064329832792282,
0.11925582587718964,
-0.103178970515728,
-0.03519358113408089,
-0.010898741893470287,
0.06918325275182724,
0.007408845704048872,
0.0719454362988472,
0.12955690920352936,
0.03416725993156433,
0.133092001080513,
-0.012050898745656013,
-0.10271837562322617,
-0.005868739914149046,
0.030642492696642876,
-0.1520949900150299,
-0.19743315875530243,
-0.056679658591747284,
-0.0098905423656106,
-0.10064490139484406,
0.03207683190703392,
0.16282807290554047,
0.00397142069414258,
-0.0387999527156353,
-0.02517041005194187,
0.07772520929574966,
-0.0008177092531695962,
0.08361297100782394,
0.04794187843799591,
0.008277224376797676,
-0.07316611707210541,
0.14467689394950867,
0.057918108999729156,
-0.057655882090330124,
0.06680764257907867,
0.061408236622810364,
-0.0373699776828289,
0.0072544333525002,
0.006034442689269781,
0.027274280786514282,
-0.11943509429693222,
-0.03130786120891571,
-0.009215564467012882,
-0.06364854425191879,
-0.004440483171492815,
0.013971175067126751,
0.027702637016773224,
0.04598189517855644,
-0.0014327674871310592,
0.0017862190725281835,
-0.028946856036782265,
0.07732873409986496,
0.0718051865696907,
0.01981368288397789,
0.003808513982221484,
-0.002066829241812229,
-0.022424204275012016,
-0.020496893674135208,
-0.0030551108065992594,
-0.028323762118816376,
-0.09644603729248047,
-0.036669690161943436,
-0.055621594190597534,
-0.02945845201611519,
-0.006457716692239046,
-0.003350408049300313,
-0.012602400034666061,
-0.08217832446098328,
0.017525585368275642,
0.06891938298940659,
-0.0605371929705143,
-0.0363166481256485,
-0.015032890252768993,
0.05955434963107109,
-0.21865829825401306,
-0.007001139223575592,
0.08524605631828308,
-0.060707058757543564,
0.1087801605463028,
0.08192849159240723,
0.00586699740961194,
0.06409405171871185,
-0.1093834713101387,
-0.04434297978878021,
-0.07435353100299835,
0.024113887920975685,
0.017581075429916382,
-0.14490540325641632,
0.013416304253041744,
0.007304804865270853,
-0.016598166897892952,
0.02912534959614277,
0.014236541464924812,
-0.07730849087238312,
0.08079256117343903,
0.00097559584537521,
-0.09657424688339233,
-0.06010371074080467,
0.09152781218290329,
0.05303812026977539,
0.02845781296491623,
0.1358707845211029,
-0.06471466273069382,
0.04238109290599823,
-0.12921515107154846,
0.009161405265331268,
0.03874371945858002,
-0.033103328198194504,
-0.11095836758613586,
0.0048481919802725315,
0.04503760486841202,
-0.0066313426941633224,
0.06474559754133224,
-0.011495763435959816,
0.019499994814395905,
0.0325276218354702,
0.020199334248900414,
-0.07489238679409027,
0.01590089127421379,
0.07562392950057983,
-0.056239053606987,
-0.028284883126616478,
-0.010275561362504959,
-0.03299515321850777,
-0.03771079331636429,
0.039031982421875,
0.20837508141994476,
0.15701909363269806,
0.09064800292253494,
0.0360245406627655,
0.04032011330127716,
-0.011106832884252071,
-0.10994116216897964,
-0.014990393072366714,
-0.016655590385198593,
0.031098010018467903,
-0.045427411794662476,
0.056905798614025116,
0.057039838284254074,
-0.21624280512332916,
0.118550144135952,
-0.047241222113370895,
-0.0633956640958786,
-0.05756659060716629,
-0.12484416365623474,
-0.04996027052402496,
-0.03301103785634041,
0.02629339136183262,
-0.15141858160495758,
0.05923715978860855,
0.08487962186336517,
0.051434602588415146,
-0.06850511580705643,
0.12478385865688324,
-0.1412169188261032,
-0.04713815078139305,
0.08114167302846909,
0.03279460221529007,
0.05354594811797142,
0.09014587849378586,
-0.03364000469446182,
0.0004300993459764868,
0.061420563608407974,
0.10074276477098465,
0.058532606810331345,
0.05310369282960892,
-0.026096783578395844,
-0.02207152731716633,
-0.0702909305691719,
-0.023275863379240036,
-0.03851968050003052,
0.041142821311950684,
0.19521364569664001,
0.03262265399098396,
0.020302167162299156,
-0.02987336739897728,
0.18746134638786316,
-0.06423818320035934,
-0.06278069317340851,
-0.18024297058582306,
0.10625077039003372,
0.012295442633330822,
0.025456197559833527,
0.03110707551240921,
-0.12739834189414978,
-0.00443519651889801,
0.0713782086968422,
0.13067838549613953,
-0.0378529317677021,
0.006077314727008343,
-0.013767186552286148,
0.02104831486940384,
0.0665770173072815,
0.08474235981702805,
-0.003761331317946315,
0.20438747107982635,
-0.023757414892315865,
0.10304340720176697,
-0.03638538345694542,
-0.035360187292099,
-0.019015489146113396,
0.1660653054714203,
-0.04141034185886383,
-0.05253253132104874,
-0.11031703650951385,
0.12183893471956253,
0.013793754391372204,
-0.23301059007644653,
0.008810186758637428,
-0.11652988195419312,
-0.12790949642658234,
0.006167196203023195,
0.08017414063215256,
0.054748065769672394,
0.09661248326301575,
0.03834440931677818,
-0.001744258450344205,
0.18245618045330048,
0.008756021969020367,
-0.01861725188791752,
-0.11740569025278091,
0.10352044552564621,
-0.05236535146832466,
0.24131911993026733,
0.05786696448922157,
0.05626307427883148,
0.0654512271285057,
-0.022388147190213203,
-0.07344058156013489,
0.0007133827311918139,
0.05862050876021385,
-0.08912275731563568,
-0.034641802310943604,
0.12459149211645126,
-0.013963482342660427,
0.1122179701924324,
0.052734676748514175,
0.0022634773049503565,
0.05474381893873215,
0.11965961009263992,
-0.020615682005882263,
-0.12665525078773499,
0.11132091283798218,
-0.08643728494644165,
0.1644294559955597,
0.17932374775409698,
-0.018439939245581627,
0.018485382199287415,
-0.047776103019714355,
0.026323651894927025,
0.013913556933403015,
0.10363630950450897,
0.01621808111667633,
-0.15861442685127258,
0.04360371455550194,
-0.04252549260854721,
0.0556354857981205,
-0.16088160872459412,
-0.05594758689403534,
0.002923174761235714,
-0.006834954489022493,
-0.017031684517860413,
0.10979905724525452,
0.013508249074220657,
-0.013337274082005024,
0.011456101201474667,
-0.02770971693098545,
-0.015090779401361942,
0.07918859273195267,
-0.02628454566001892,
-0.026717357337474823
] |
null | null |
transformers
|

## Overview
**Language model:** gelectra-base-germanquad
**Language:** German
**Training data:** GermanQuAD train set (~ 12MB)
**Eval data:** GermanQuAD test set (~ 5MB)
**Infrastructure**: 1x V100 GPU
**Published**: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-base model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published [online](https://deepset.ai/germanquad).
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions and 2204·3−76 = 6536 answers (we removed 76 wrong answers).
See https://deepset.ai/germanquad for more details and dataset download in SQuAD format.
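Below is a minimal inference sketch without the pipeline abstraction, decoding the answer span from the start/end logits; the question/context pair is made up for illustration:
```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "deepset/gelectra-base-germanquad"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "Wie viele Fragen enthält der GermanQuAD-Trainingsdatensatz?"
context = "Der Trainingsdatensatz von GermanQuAD enthält 11518 Fragen und 11518 Antworten."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode the span.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits) + 1
print(tokenizer.decode(inputs["input_ids"][0][start:end]))  # e.g. "11518"
```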
## Hyperparameters
```
batch_size = 24
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
```
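`lr_schedule = LinearWarmup` denotes a linear warmup followed by linear decay. A sketch of the equivalent setup with the Transformers scheduler helper; the step counts and warmup fraction below are assumptions, not values stated in this card:
```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(10, 2)  # placeholder standing in for the QA model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

num_training_steps = 1000  # in practice: (dataset size / batch_size) * n_epochs
num_warmup_steps = 100     # assumed warmup fraction

scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=num_warmup_steps,
    num_training_steps=num_training_steps,
)
# In the training loop, call optimizer.step() followed by scheduler.step().
```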
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on [GermanQuAD](https://deepset.ai/germanquad).
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.

## Authors
**Timo Möller:** [email protected]
**Julian Risch:** [email protected]
**Malte Pietsch:** [email protected]
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/) which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "datasets": ["deepset/germanquad"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg"}
|
question-answering
|
deepset/gelectra-base-germanquad
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"question-answering",
"exbert",
"de",
"dataset:deepset/germanquad",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us
|
!bert_image
## Overview
Language model: gelectra-base-germanquad
Language: German
Training data: GermanQuAD train set (~ 12MB)
Eval data: GermanQuAD test set (~ 5MB)
Infrastructure: 1x V100 GPU
Published: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-base model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions and 2204·3−76 = 6536 answers (we removed 76 wrong answers).
See URL for more details and dataset download in SQuAD format.
## Hyperparameters
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.
!performancetable
## Authors
Timo Möller: timo.moeller@URL
Julian Risch: URL@URL
Malte Pietsch: malte.pietsch@URL
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gelectra-base-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.\nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth. \n!performancetable",
"## Authors\nTimo Möller: timo.moeller@URL \nJulian Risch: URL@URL \nMalte Pietsch: malte.pietsch@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us \n",
"## Overview\nLanguage model: gelectra-base-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.\nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth. \n!performancetable",
"## Authors\nTimo Möller: timo.moeller@URL \nJulian Risch: URL@URL \nMalte Pietsch: malte.pietsch@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
62,
60,
132,
5,
115,
33,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us \n## Overview\nLanguage model: gelectra-base-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021## Details\n- We trained a German question answering model with a gelectra-base model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.## Hyperparameters## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.\nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth. \n!performancetable## Authors\nTimo Möller: timo.moeller@URL \nJulian Risch: URL@URL \nMalte Pietsch: malte.pietsch@URL"
] |
[
-0.08874672651290894,
0.15541905164718628,
-0.0012320472160354257,
0.10432808846235275,
0.1287650316953659,
0.057236701250076294,
0.15258322656154633,
0.08059009164571762,
0.014325186610221863,
0.08539535105228424,
0.026836752891540527,
-0.08863363415002823,
0.06550692766904831,
0.08701556921005249,
0.04557039216160774,
-0.22219941020011902,
0.03800331801176071,
-0.09890283644199371,
0.0224844329059124,
0.09833109378814697,
0.1077498123049736,
-0.08453793823719025,
0.08295431733131409,
-0.009885870851576328,
-0.034905750304460526,
0.06401398032903671,
-0.01086756307631731,
-0.008289477787911892,
0.10333733260631561,
0.04981975257396698,
0.07991239428520203,
-0.0391707643866539,
0.06609567999839783,
-0.12634126842021942,
0.009740954264998436,
0.030753854662179947,
-0.019712967798113823,
0.04730051010847092,
0.056233420968055725,
0.002762994496151805,
0.08711924403905869,
-0.025833534076809883,
0.009875960648059845,
0.044399287551641464,
-0.08947930485010147,
-0.09997310489416122,
-0.18164606392383575,
0.18097306787967682,
0.005717088468372822,
0.12171421945095062,
-0.02440119907259941,
0.09966900944709778,
-0.058817967772483826,
0.032088521867990494,
0.14568258821964264,
-0.20840391516685486,
-0.05567437782883644,
0.041065316647291183,
-0.005856458097696304,
0.08650541305541992,
-0.09369852393865585,
0.012098775245249271,
0.03593752905726433,
0.040854137390851974,
-0.07301099598407745,
-0.03183945268392563,
-0.016602322459220886,
-0.018243631348013878,
-0.09643730521202087,
-0.038390740752220154,
0.10887813568115234,
0.030706509947776794,
-0.09487871825695038,
-0.14138495922088623,
-0.014923532493412495,
0.06876625865697861,
0.05111458897590637,
-0.05627430975437164,
-0.014709556475281715,
-0.026911605149507523,
0.02965184487402439,
-0.09563148021697998,
-0.09312071651220322,
-0.01267936360090971,
-0.004490922670811415,
0.2264060080051422,
0.028330253437161446,
0.04994950070977211,
-0.006940918508917093,
0.07392558455467224,
-0.13802391290664673,
-0.08374196290969849,
-0.06952188163995743,
-0.0946747362613678,
-0.0756630152463913,
-0.01553218811750412,
-0.0512247160077095,
-0.08457352221012115,
0.00934590958058834,
0.16170130670070648,
-0.09420793503522873,
0.03139925375580788,
0.00508772162720561,
-0.00829586572945118,
0.05773571878671646,
0.19449201226234436,
-0.07489057630300522,
-0.10924607515335083,
-0.031974710524082184,
-0.015168819576501846,
0.04234892502427101,
-0.027319753542542458,
-0.033789053559303284,
-0.030742287635803223,
0.05185991898179054,
0.06782013177871704,
0.01219946052879095,
0.017504816874861717,
-0.049164775758981705,
-0.0629318431019783,
0.09635823965072632,
-0.1138397753238678,
0.028001246973872185,
0.022141605615615845,
-0.05443074181675911,
0.04960716515779495,
-0.025694869458675385,
-0.019160043448209763,
-0.07419943809509277,
0.06810831278562546,
-0.04213029518723488,
-0.03574627637863159,
-0.08275223523378372,
-0.13440024852752686,
0.02849353849887848,
-0.061648063361644745,
-0.038784317672252655,
-0.1074252650141716,
-0.1146865263581276,
-0.08519269526004791,
0.049276143312454224,
-0.029373522847890854,
0.05794044956564903,
0.011078487150371075,
-0.009822751395404339,
0.008091341704130173,
-0.0012238294584676623,
-0.055870480835437775,
0.006512819789350033,
0.03926936537027359,
-0.06188367307186127,
0.011847403831779957,
-0.06548985093832016,
0.028528684750199318,
-0.10730928927659988,
-0.05088178813457489,
-0.17333364486694336,
0.061224181205034256,
-0.08994686603546143,
-0.040090762078762054,
-0.08146682381629944,
-0.07699792087078094,
-0.003138829953968525,
0.014965604990720749,
0.0859462171792984,
0.1277930587530136,
-0.14469949901103973,
-0.04579966515302658,
0.13060250878334045,
-0.16457751393318176,
-0.06336306780576706,
0.11367344856262207,
-0.03327210992574692,
-0.004523079842329025,
0.09305419027805328,
0.16568627953529358,
0.12188268452882767,
-0.13934986293315887,
-0.10285685211420059,
-0.08861806243658066,
-0.014425602741539478,
0.09804630279541016,
0.06501004844903946,
-0.062067266553640366,
-0.03894611820578575,
0.02339928410947323,
-0.12448129057884216,
-0.0011323537910357118,
-0.037453558295965195,
-0.04498012363910675,
0.011023364029824734,
-0.04963625222444534,
0.10128556936979294,
0.033701252192258835,
-0.04140198603272438,
-0.06792972981929779,
-0.0814216136932373,
0.020746484398841858,
0.0848945900797844,
-0.047443587332963943,
0.003133554710075259,
-0.011650051921606064,
0.048937272280454636,
0.03241431340575218,
-0.006793645676225424,
-0.1507568210363388,
-0.1918666958808899,
0.05130982771515846,
-0.0646933987736702,
0.07997418195009232,
0.056180112063884735,
0.07040576636791229,
0.04798031598329544,
-0.10922945290803909,
-0.059614233672618866,
-0.12033738195896149,
-0.01455695927143097,
-0.034699682146310806,
-0.1229640543460846,
-0.06742504239082336,
-0.030933761969208717,
0.05042770132422447,
-0.07769239693880081,
-0.03213127329945564,
-0.010184071958065033,
0.08352479338645935,
0.04035567864775658,
-0.07374551892280579,
-0.03709207847714424,
0.028594667091965675,
0.0003578059549909085,
-0.04122409224510193,
-0.026950106024742126,
0.004909509792923927,
-0.02349422499537468,
0.03593301773071289,
-0.07245447486639023,
-0.07398742437362671,
0.05071781203150749,
0.14395572245121002,
-0.11565025895833969,
-0.00271386350505054,
-0.07058794051408768,
0.012087286449968815,
-0.08807452023029327,
-0.09238456189632416,
0.13529731333255768,
0.018465228378772736,
0.039820510894060135,
-0.0411246232688427,
-0.010722196660935879,
-0.030500788241624832,
-0.0006354014622047544,
-0.06939520686864853,
0.08565568923950195,
-0.055448390543460846,
-0.13890616595745087,
0.09498725086450577,
0.04907354712486267,
0.005604672245681286,
0.19168922305107117,
-0.014483623206615448,
-0.09600398689508438,
-0.04511750489473343,
0.01265863236039877,
0.00066273013362661,
0.1346425861120224,
0.07229012995958328,
0.04449449107050896,
0.05167184770107269,
0.045147959142923355,
0.01635112799704075,
-0.03885255753993988,
0.01203186810016632,
-0.0015696423361077905,
-0.023840246722102165,
-0.036671072244644165,
0.004361467901617289,
0.007844537496566772,
0.09565412253141403,
-0.009474394842982292,
0.037911124527454376,
-0.005335760768502951,
-0.033528152853250504,
-0.13005121052265167,
0.16366475820541382,
-0.09867347776889801,
-0.1874413937330246,
-0.13840697705745697,
0.05848807469010353,
-0.12564870715141296,
0.012968203984200954,
0.039668258279561996,
-0.06663358956575394,
-0.09037130326032639,
-0.05903215706348419,
0.05665389820933342,
0.03827391564846039,
-0.04390911012887955,
-0.03342738375067711,
-0.05296853184700012,
0.022158928215503693,
-0.1469927728176117,
-0.01058119535446167,
-0.051237642765045166,
-0.05024419724941254,
0.03707362338900566,
0.004370450507849455,
0.0866849422454834,
0.05097038671374321,
-0.05537373945116997,
0.011951543390750885,
-0.016886506229639053,
0.23345859348773956,
-0.10470801591873169,
0.09917127341032028,
0.1236020028591156,
-0.0018774429336190224,
0.06515778601169586,
0.08668922632932663,
0.021205028519034386,
-0.032248854637145996,
0.0508551225066185,
0.06224356219172478,
-0.050315406173467636,
-0.2524568736553192,
-0.11017227917909622,
-0.03806537389755249,
-0.06120046228170395,
0.030618632212281227,
0.031472306698560715,
0.06920800358057022,
0.03025936894118786,
-0.1155005544424057,
-0.0971735492348671,
0.04508650302886963,
0.05875303968787193,
0.01762416400015354,
0.009616799652576447,
0.054350849241018295,
-0.044688571244478226,
-0.010482371784746647,
0.14252039790153503,
-0.014692995697259903,
0.19078859686851501,
-0.0108732208609581,
0.003939957357943058,
0.025486212223768234,
0.06731002032756805,
-0.023032866418361664,
0.11546840518712997,
-0.02101743593811989,
-0.013905622996389866,
-0.006985146086663008,
-0.08418263494968414,
0.008270751684904099,
0.09524702280759811,
0.06523292511701584,
-0.007258962374180555,
-0.09093043208122253,
-0.013463852927088737,
0.07362816482782364,
0.16520531475543976,
0.05209093168377876,
-0.09411218762397766,
-0.11069398373365402,
-0.007043204270303249,
-0.024834075942635536,
-0.06469991058111191,
-0.012468681670725346,
0.14139169454574585,
-0.1558847427368164,
0.059361156076192856,
0.002270430326461792,
0.08391642570495605,
0.012335404753684998,
-0.006909288465976715,
0.04413300380110741,
0.039804041385650635,
-0.04205900430679321,
0.09240487217903137,
-0.17823155224323273,
0.13564087450504303,
0.01792951300740242,
0.07401867210865021,
-0.08118200302124023,
0.01846705935895443,
0.030981453135609627,
-0.04545038193464279,
0.13537202775478363,
0.04328478127717972,
-0.12047646939754486,
-0.057999081909656525,
-0.055159419775009155,
0.017313694581389427,
0.1415773332118988,
-0.06604620069265366,
0.07551046460866928,
-0.020765038207173347,
0.02162426896393299,
-0.013311606831848621,
0.07417374849319458,
-0.16241437196731567,
-0.11895222216844559,
0.027738874778151512,
-0.0026318938471376896,
0.02066906914114952,
-0.054204750806093216,
-0.07900076359510422,
-0.06359577178955078,
0.15691916644573212,
-0.08495380729436874,
-0.06782130151987076,
-0.14543698728084564,
0.030830085277557373,
0.11886942386627197,
-0.10600621998310089,
0.017765376716852188,
0.03210398927330971,
0.08521842211484909,
-0.05012005567550659,
-0.10424403846263885,
0.0583549402654171,
-0.09125455468893051,
-0.16353465616703033,
-0.018831348046660423,
0.12083550542593002,
0.12341316789388657,
0.04832169786095619,
0.0050299265421926975,
0.03539060428738594,
0.02623172663152218,
-0.16280414164066315,
0.007551989052444696,
0.06625887751579285,
0.025905512273311615,
0.13297007977962494,
-0.07935528457164764,
-0.13742318749427795,
-0.02452879212796688,
0.005692383274435997,
0.08329642564058304,
0.09786850959062576,
-0.06646400690078735,
0.10439996421337128,
0.19770312309265137,
-0.043918270617723465,
-0.2548733651638031,
0.022491781041026115,
0.0798821896314621,
0.030166763812303543,
0.06912336498498917,
-0.16731204092502594,
0.06658603996038437,
0.05916505306959152,
-0.016527390107512474,
-0.04925226420164108,
-0.18218787014484406,
-0.11339867115020752,
0.05887367203831673,
0.02886025607585907,
0.037174418568611145,
-0.07192382961511612,
-0.04483559727668762,
-0.02187192067503929,
-0.11276570707559586,
0.06035174801945686,
-0.04273536428809166,
0.08396361768245697,
0.009806741029024124,
0.04130121320486069,
0.04496936500072479,
-0.05629643052816391,
0.10556270182132721,
0.07350250333547592,
0.0652007982134819,
-0.06256398558616638,
-0.04372049495577812,
0.09217268228530884,
-0.022280670702457428,
0.16493642330169678,
0.005078269634395838,
0.0884372740983963,
-0.12751354277133942,
-0.02090495638549328,
-0.10053848475217819,
0.09622543305158615,
-0.06619910150766373,
-0.0641808956861496,
-0.0754396840929985,
0.13734173774719238,
0.044667862355709076,
-0.007394126616418362,
-0.044132329523563385,
-0.018706930801272392,
0.03927656263113022,
0.10028127580881119,
0.06595179438591003,
0.09696493297815323,
-0.06632877886295319,
-0.018521877005696297,
-0.026844171807169914,
0.07664552330970764,
0.03673039749264717,
0.060124486684799194,
0.1365422010421753,
0.03811071813106537,
0.11806230992078781,
-0.012955259531736374,
-0.10787581652402878,
-0.0048662470653653145,
0.03782095015048981,
-0.15425245463848114,
-0.20034478604793549,
-0.06720742583274841,
-0.017904220148921013,
-0.09423313289880753,
0.04979989677667618,
0.15612296760082245,
-0.01353866420686245,
-0.03275696933269501,
-0.03236394375562668,
0.043914251029491425,
-0.023761684074997902,
0.1185201033949852,
0.08312413096427917,
0.021866265684366226,
-0.07356013357639313,
0.10195405036211014,
0.04578554257750511,
-0.036166366189718246,
0.08040104061365128,
0.06510414183139801,
-0.06116615608334541,
-0.016573429107666016,
0.01603992097079754,
0.07380606979131699,
-0.15962077677249908,
-0.07548438012599945,
0.002011363161727786,
-0.062256503850221634,
-0.014978121966123581,
0.019749268889427185,
0.021983511745929718,
0.041379738599061966,
0.004481120500713587,
-0.01369953341782093,
-0.05572149530053139,
0.06401269882917404,
0.04466279596090317,
0.026234909892082214,
-0.023757807910442352,
-0.01905575953423977,
-0.047011878341436386,
0.006483270321041346,
-0.026325657963752747,
-0.023641323670744896,
-0.12077092379331589,
-0.03552139177918434,
-0.10573657602071762,
-0.027464013546705246,
-0.005088220350444317,
-0.012990789487957954,
-0.014042973518371582,
-0.10186824202537537,
0.032493364065885544,
0.06458161771297455,
-0.05236148089170456,
-0.014873809181153774,
-0.0037490238901227713,
0.051138319075107574,
-0.21645216643810272,
-0.004257949069142342,
0.04712869971990585,
-0.036148328334093094,
0.13150690495967865,
0.07794664055109024,
0.0031796202529221773,
0.06516807526350021,
-0.10688064992427826,
-0.026642290875315666,
-0.05404426530003548,
0.029173247516155243,
0.03530626744031906,
-0.12087643146514893,
0.009817481972277164,
0.014245042577385902,
-0.024544348940253258,
0.04563542455434799,
0.016087224707007408,
-0.059902552515268326,
0.07189207524061203,
-0.011026495136320591,
-0.09953953325748444,
-0.0646585077047348,
0.10158730298280716,
0.07587321847677231,
0.008090192452073097,
0.11139101535081863,
-0.06938986480236053,
0.06046516075730324,
-0.13425713777542114,
0.01109302043914795,
0.032278645783662796,
-0.01414593867957592,
-0.10077407956123352,
0.023204496130347252,
0.043822113424539566,
-0.007013955153524876,
0.07604134827852249,
-0.003844037652015686,
0.07078185677528381,
0.032420042902231216,
0.02944311499595642,
-0.009991057217121124,
0.004178556147962809,
0.08078718185424805,
-0.06352633982896805,
-0.0283083263784647,
-0.031925372779369354,
-0.016463855281472206,
-0.07039175927639008,
0.012336306273937225,
0.242676243185997,
0.1633036732673645,
0.0754292756319046,
0.025257110595703125,
0.05400175601243973,
-0.02584325522184372,
-0.1139620691537857,
-0.09225307404994965,
0.0096176378428936,
0.025178130716085434,
-0.027459215372800827,
0.06856615096330643,
0.09918257594108582,
-0.2019132375717163,
0.10570736974477768,
-0.054025039076805115,
-0.062141720205545425,
-0.0520176999270916,
-0.1299404799938202,
-0.04506334662437439,
-0.04758036136627197,
0.02750379778444767,
-0.13811588287353516,
0.030141571536660194,
0.047959521412849426,
0.056763894855976105,
-0.07083463668823242,
0.14367429912090302,
-0.12414245307445526,
-0.032395053654909134,
0.054632458835840225,
0.027607474476099014,
0.05521874502301216,
0.07818949967622757,
-0.0220868531614542,
0.0022506783716380596,
0.05647069960832596,
0.07566389441490173,
0.04979996383190155,
0.04599805921316147,
-0.03082217276096344,
-0.02348148450255394,
-0.06600077450275421,
-0.024919545277953148,
-0.023191669955849648,
0.04980739951133728,
0.18270732462406158,
0.025249026715755463,
0.006216367240995169,
-0.025970853865146637,
0.18031300604343414,
-0.06520793586969376,
-0.08192529529333115,
-0.15111123025417328,
0.1689349114894867,
0.0216852817684412,
0.04173567518591881,
0.02148084156215191,
-0.1402912735939026,
0.009113816544413567,
0.09803709387779236,
0.17001605033874512,
-0.010133850388228893,
0.011891815811395645,
-0.01012948714196682,
0.009158855304121971,
0.03533903881907463,
0.0754668191075325,
-0.015660317614674568,
0.24648506939411163,
-0.012496539391577244,
0.14010055363178253,
-0.018766440451145172,
-0.003738309955224395,
0.0006404845044016838,
0.20721641182899475,
-0.012203183025121689,
-0.06507664173841476,
-0.1050969809293747,
0.11431378126144409,
0.011788562871515751,
-0.21017660200595856,
0.026693442836403847,
-0.08551166951656342,
-0.13081970810890198,
0.001006434322334826,
0.04985117167234421,
0.07174541056156158,
0.10895396769046783,
0.026631591841578484,
0.01575162075459957,
0.15104036033153534,
0.008385873399674892,
-0.05187346786260605,
-0.1377122402191162,
0.09514632076025009,
-0.03467284515500069,
0.2287897914648056,
0.03232932835817337,
0.07860265672206879,
0.08179043978452682,
-0.009858888573944569,
-0.09206835180521011,
0.015938082709908485,
0.05397738888859749,
-0.11585395783185959,
-0.01829678937792778,
0.12314207851886749,
-0.021848760545253754,
0.08489733189344406,
0.04564063623547554,
-0.05358150601387024,
0.03449712693691254,
0.07881088554859161,
-0.004652131348848343,
-0.1297537386417389,
0.13309454917907715,
-0.07650646567344666,
0.17358486354351044,
0.15623098611831665,
-0.02551635354757309,
0.00823601707816124,
-0.049207545816898346,
0.05973406136035919,
0.04402341693639755,
0.10362124443054199,
0.03047814965248108,
-0.1601710468530655,
0.035995420068502426,
-0.05474425107240677,
0.02015053853392601,
-0.20497120916843414,
-0.052973970770835876,
-0.028504541143774986,
-0.005016813054680824,
-0.012571608647704124,
0.09970103949308395,
0.042657848447561264,
0.0008564770687371492,
0.017114708200097084,
-0.029832905158400536,
-0.012412965297698975,
0.0757383480668068,
-0.047218553721904755,
-0.041687577962875366
] |
null | null |
transformers
|
# German ELECTRA base
Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base-sized model, we recommend deepset/gbert-base.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** ELECTRA base (discriminator)
**Language:** German
## Performance
```
GermEval18 Coarse: 76.02
GermEval18 Fine: 42.22
GermEval14: 86.02
```
See also:
- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator
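## Usage
Like the other checkpoints referenced in this document, the discriminator can be loaded through the Transformers auto classes. The snippet below is a minimal sketch for extracting contextual embeddings; the example sentence is illustrative and not from the original card:
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepset/gelectra-base")
model = AutoModel.from_pretrained("deepset/gelectra-base")

# Tokenize a German sentence and run it through the discriminator
inputs = tokenizer("Die Katze sitzt auf der Matte.", return_tensors="pt")
outputs = model(**inputs)

# The last hidden states serve as contextual token embeddings: (1, seq_len, 768)
token_embeddings = outputs.last_hidden_state
```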
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData"]}
| null |
deepset/gelectra-base
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"arxiv:2010.10906",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us
|
# German ELECTRA base
Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base-sized model, we recommend deepset/gbert-base.
## Overview
Paper: here
Architecture: ELECTRA base (discriminator)
Language: German
## Performance
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: 'URL [at] URL'
Stefan Schweter: 'stefan [at] URL'
Timo Möller: 'timo.moeller [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German ELECTRA base\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base sized model, we recommend deepset/gbert-base",
"## Overview \nPaper: here \nArchitecture: ELECTRA base (discriminator)\nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us \n",
"# German ELECTRA base\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base sized model, we recommend deepset/gbert-base",
"## Overview \nPaper: here \nArchitecture: ELECTRA base (discriminator)\nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
68,
120,
21,
60,
40,
129
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us \n# German ELECTRA base\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model. Our evaluation suggests that this model is somewhat undertrained. For best performance from a base sized model, we recommend deepset/gbert-base## Overview \nPaper: here \nArchitecture: ELECTRA base (discriminator)\nLanguage: German## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.0350506529211998,
0.11492346972227097,
-0.0019879022147506475,
0.053520843386650085,
0.05289936438202858,
-0.01355165708810091,
0.09938298910856247,
0.09136351943016052,
0.07379617542028427,
0.09251753985881805,
-0.000005092054379929323,
-0.04079625383019447,
0.07474660873413086,
0.15288296341896057,
0.0593036524951458,
-0.23560619354248047,
0.013012206181883812,
-0.10484832525253296,
-0.04952587932348251,
0.0737520158290863,
0.15882550179958344,
-0.09611755609512329,
0.10014393925666809,
0.003374149091541767,
-0.02782982401549816,
0.034692686051130295,
-0.06916533410549164,
-0.009201806038618088,
0.06895262748003006,
0.03265981748700142,
0.038925882428884506,
-0.03672655671834946,
-0.005318049341440201,
-0.1541822999715805,
0.02867577224969864,
0.06423681974411011,
0.04415997862815857,
0.04388362914323807,
0.05555303394794464,
-0.01979030668735504,
0.021073967218399048,
-0.13965651392936707,
0.019696736708283424,
0.0630265399813652,
-0.09814436733722687,
-0.1563660204410553,
-0.08218355476856232,
0.16772939264774323,
0.0635211169719696,
0.006090056616812944,
-0.03902493417263031,
0.03451008349657059,
-0.07266078144311905,
0.004233705345541239,
0.09387005120515823,
-0.2636394798755646,
-0.07588895410299301,
0.02352074161171913,
-0.007588929031044245,
0.04897269979119301,
-0.15443049371242523,
0.04519174247980118,
0.02417568489909172,
0.014140668325126171,
0.010541807860136032,
-0.0066568488255143166,
0.06863056123256683,
0.012435651384294033,
-0.08308779448270798,
0.02254975400865078,
0.1417780965566635,
0.008854467421770096,
-0.07126286625862122,
-0.23065975308418274,
0.020105767995119095,
0.12664340436458588,
-0.014885619282722473,
-0.07872757315635681,
0.026409318670630455,
-0.008307432755827904,
-0.010885201394557953,
-0.08449795097112656,
-0.1051274985074997,
0.03936414048075676,
0.025588154792785645,
0.1385948807001114,
0.02236161008477211,
-0.008866836316883564,
0.04927113279700279,
0.09710472822189331,
-0.07001643627882004,
-0.15323764085769653,
-0.03289991617202759,
-0.1252262145280838,
-0.04027944430708885,
-0.02263212390244007,
-0.03730189800262451,
-0.04313642531633377,
0.10721073299646378,
0.23142389953136444,
0.029874205589294434,
0.022860992699861526,
0.017885984852910042,
-0.012379029765725136,
0.07710474729537964,
0.19337104260921478,
-0.05298931524157524,
-0.2154180407524109,
0.010571272112429142,
-0.041625380516052246,
0.037385400384664536,
-0.016117822378873825,
-0.058291129767894745,
-0.042104363441467285,
-0.012409384362399578,
0.001788525260053575,
0.030760345980525017,
0.04105624929070473,
-0.06912937760353088,
-0.09854541718959808,
0.10606546700000763,
-0.11982300877571106,
0.045963723212480545,
0.02029803767800331,
-0.06207416579127312,
0.0686725303530693,
-0.06475131213665009,
0.03235676512122154,
-0.03430010750889778,
0.06577557325363159,
-0.03571522235870361,
-0.03450849652290344,
-0.0948590338230133,
-0.06350044161081314,
0.045038703829050064,
-0.046960435807704926,
-0.03194347769021988,
-0.07150408625602722,
-0.04523437097668648,
-0.07078660279512405,
0.10464935004711151,
-0.030101027339696884,
-0.042071662843227386,
-0.05402908846735954,
0.006064053624868393,
0.04305994510650635,
0.022738641127943993,
-0.05306728184223175,
-0.008789773099124432,
0.021534627303481102,
-0.08848653733730316,
-0.005567031912505627,
-0.03643724322319031,
0.02861873060464859,
-0.08638771623373032,
-0.013198118656873703,
-0.2938533425331116,
0.06782250851392746,
-0.1200585663318634,
0.10467106848955154,
-0.12401978671550751,
-0.010261771269142628,
-0.008303786627948284,
0.045670103281736374,
-0.0006927711656317115,
0.09424932301044464,
-0.0672396570444107,
-0.08529838919639587,
0.10634610801935196,
-0.0630558505654335,
-0.04695949703454971,
0.1522057205438614,
-0.06715516000986099,
0.043977051973342896,
0.11394824087619781,
0.24730178713798523,
0.20418699085712433,
-0.09348601847887039,
-0.04531696066260338,
-0.04105411469936371,
-0.0030218278989195824,
0.06714967638254166,
0.08694273233413696,
-0.08207187801599503,
0.09248370677232742,
0.028629016131162643,
-0.0938710942864418,
-0.0036530063953250647,
0.011316130869090557,
-0.03832884505391121,
0.03149627149105072,
-0.046241000294685364,
0.10797291249036789,
-0.0297479759901762,
-0.025913918390870094,
-0.09778671711683273,
-0.12815062701702118,
0.057863689959049225,
0.03920581191778183,
-0.01635454222559929,
-0.005392793565988541,
-0.05384870246052742,
0.02341459132730961,
0.07508819550275803,
0.007270433474332094,
-0.06212667375802994,
-0.12073838710784912,
0.07900454103946686,
-0.05868128314614296,
0.16047106683254242,
0.023717427626252174,
0.06678251177072525,
-0.012490607798099518,
-0.03435307741165161,
-0.02311268262565136,
-0.057047922164201736,
-0.0422959104180336,
0.0204825010150671,
-0.17560577392578125,
-0.002210761420428753,
-0.04564309120178223,
0.06694531440734863,
-0.03153223544359207,
-0.02020713873207569,
0.026359211653470993,
0.14443930983543396,
0.04427715763449669,
-0.05326998233795166,
0.0058402204886078835,
0.0018423418514430523,
0.048503320664167404,
-0.03586293384432793,
-0.00016657279047649354,
-0.0094990199431777,
-0.06201322376728058,
0.006847902666777372,
-0.02905435860157013,
0.03423575684428215,
0.01959364116191864,
0.02651798538863659,
-0.06182018294930458,
-0.06446933001279831,
-0.05800136923789978,
-0.015916435047984123,
-0.04143458232283592,
-0.08601771295070648,
0.24198134243488312,
0.003756182035431266,
-0.017560109496116638,
-0.06663713604211807,
-0.09615786373615265,
-0.07387571781873703,
-0.02176365815103054,
-0.016460126265883446,
0.0922260731458664,
-0.0957699567079544,
-0.16248056292533875,
0.10787179321050644,
0.11087071895599365,
0.06297378242015839,
0.2507705092430115,
-0.058870941400527954,
-0.03973720595240593,
-0.00010293415107298642,
0.03978915885090828,
-0.05293664336204529,
0.11168588697910309,
-0.026181861758232117,
-0.03264237195253372,
0.04031490162014961,
-0.02469368278980255,
0.001106903189793229,
-0.004514180123806,
0.06423121690750122,
-0.01714223623275757,
-0.018632184714078903,
0.09528542309999466,
0.017531441524624825,
0.02910151332616806,
0.06945382803678513,
0.10335667431354523,
0.04117628186941147,
0.013303062878549099,
-0.03852842375636101,
-0.04921739175915718,
0.11953091621398926,
-0.12090105563402176,
-0.1903637796640396,
-0.14771510660648346,
-0.019749218598008156,
-0.1232462152838707,
-0.016849763691425323,
0.008646964095532894,
-0.06615911424160004,
-0.11163068562746048,
-0.015223028138279915,
0.07353515923023224,
0.0876680538058281,
-0.07398378103971481,
-0.029794320464134216,
-0.015910541638731956,
0.025886012241244316,
-0.14115338027477264,
-0.03343454375863075,
0.0037717714440077543,
-0.04882112890481949,
-0.05869930237531662,
0.07714571803808212,
0.020349621772766113,
0.05984726548194885,
0.024477731436491013,
0.02069140411913395,
-0.026341596618294716,
0.19091516733169556,
-0.14798086881637573,
0.11553388833999634,
0.11309169232845306,
-0.05078858509659767,
0.060576871037483215,
0.19168147444725037,
0.05976187810301781,
0.0002081741695292294,
0.021207226440310478,
0.05032750591635704,
0.026028132066130638,
-0.18661831319332123,
-0.10690904408693314,
-0.03665173053741455,
-0.03643180429935455,
-0.016032623127102852,
0.05195387452840805,
0.007130431476980448,
-0.002319167833775282,
-0.11041435599327087,
-0.025582360103726387,
0.05800444632768631,
0.07374153286218643,
0.07117602974176407,
0.024433456361293793,
0.03280675411224365,
-0.04481683298945427,
-0.05749174579977989,
0.10638918727636337,
0.020157665014266968,
0.07239499688148499,
0.036354146897792816,
0.14709728956222534,
0.04324296489357948,
0.0794144868850708,
0.014390549622476101,
-0.01816767081618309,
-0.025693345814943314,
0.015236674807965755,
-0.02876896783709526,
-0.09308226406574249,
0.007681040093302727,
0.08019161969423294,
0.06427895277738571,
-0.03387073427438736,
0.014987769536674023,
-0.06147824227809906,
0.1482856422662735,
0.1751164197921753,
0.027986450120806694,
-0.07059410214424133,
-0.08684588223695755,
0.03238818421959877,
-0.079301618039608,
-0.05582242086529732,
-0.009739043191075325,
0.044076625257730484,
-0.16081976890563965,
0.06663335859775543,
0.00347139872610569,
0.0784781351685524,
-0.043739352375268936,
0.04883771389722824,
0.044445909559726715,
0.12985870242118835,
-0.02452397532761097,
0.0788504108786583,
-0.20661568641662598,
0.07437283545732498,
0.013847323134541512,
0.08671152591705322,
-0.04559162259101868,
0.036511875689029694,
0.06083479896187782,
-0.08412032574415207,
0.09139727801084518,
0.012491361238062382,
-0.02059861272573471,
0.023507997393608093,
-0.12685926258563995,
0.028583936393260956,
0.14063948392868042,
-0.1511334776878357,
0.05149482190608978,
-0.023877384141087532,
-0.017911111935973167,
-0.06347062438726425,
0.0626310482621193,
-0.12794367969036102,
-0.11373625695705414,
0.013499743305146694,
-0.10567284375429153,
0.042483579367399216,
-0.04647355154156685,
0.0001244760787812993,
-0.07552817463874817,
0.23693495988845825,
-0.13887999951839447,
-0.10327372699975967,
-0.10841641575098038,
-0.01941298134624958,
0.06733038276433945,
-0.084810771048069,
0.05718256160616875,
0.014407593756914139,
0.0972539633512497,
-0.009112882427871227,
-0.10412203520536423,
0.04447375610470772,
-0.061187487095594406,
-0.11326718330383301,
0.012619085609912872,
0.17389409244060516,
0.07343383133411407,
0.025104787200689316,
0.014331876300275326,
0.005220354069024324,
-0.028802046552300453,
-0.11488006263971329,
0.020972153171896935,
0.09931860119104385,
-0.027115345001220703,
0.07090339064598083,
-0.08709504455327988,
-0.14091117680072784,
-0.057682909071445465,
0.031080549582839012,
0.07370752841234207,
0.07706034183502197,
-0.033575233072042465,
0.19437600672245026,
0.18849262595176697,
-0.04442186653614044,
-0.23039469122886658,
-0.014079337008297443,
0.0550697036087513,
0.0372382290661335,
0.004524927120655775,
-0.2341434508562088,
0.17582623660564423,
0.011542104184627533,
-0.04503357410430908,
0.025464044883847237,
-0.14146649837493896,
-0.12162765860557556,
0.09387417137622833,
-0.05059428885579109,
-0.05425393208861351,
-0.06436777114868164,
-0.08072365075349808,
-0.05077282339334488,
-0.054158661514520645,
0.09326676279306412,
-0.07939673215150833,
0.05432421341538429,
0.045097917318344116,
0.05745839700102806,
0.03905905410647392,
-0.00806876178830862,
0.074418805539608,
0.008804067969322205,
0.058473553508520126,
-0.08236221969127655,
-0.010865326039493084,
0.004226713441312313,
-0.05898094177246094,
0.09920936077833176,
-0.04010989889502525,
-0.015103083103895187,
-0.14546437561511993,
-0.020053820684552193,
-0.05562499165534973,
0.12017902731895447,
-0.051103610545396805,
-0.08763498812913895,
-0.10246523469686508,
0.13832755386829376,
0.009391803294420242,
0.02709401771426201,
0.0423235259950161,
-0.02698042057454586,
0.038152892142534256,
-0.0035885784309357405,
0.19451114535331726,
0.036995332688093185,
-0.07043998688459396,
-0.03668084740638733,
-0.024153804406523705,
0.07716196775436401,
-0.05910823121666908,
0.04761119931936264,
0.11050774902105331,
-0.005064511671662331,
0.09845977276563644,
-0.025703400373458862,
-0.1410476416349411,
-0.02156166359782219,
0.10891332477331161,
-0.1419299840927124,
-0.135544091463089,
-0.09022285789251328,
-0.02400139905512333,
-0.05431453883647919,
0.03622535616159439,
0.15606890618801117,
-0.0017389861168339849,
-0.04844657704234123,
0.03790279105305672,
0.04601733386516571,
-0.037964828312397,
0.034051425755023956,
0.059332624077796936,
0.02134770154953003,
-0.07491834461688995,
0.11804001033306122,
0.08617179095745087,
0.01097018551081419,
0.06113702058792114,
0.11272133141756058,
-0.047927070409059525,
-0.009943922981619835,
0.04986049607396126,
0.1553223431110382,
-0.015124725177884102,
-0.04508267715573311,
0.012686658650636673,
-0.08063231408596039,
-0.03164144977927208,
0.05323939025402069,
0.03791480511426926,
0.031138068065047264,
0.01899643987417221,
0.04360715672373772,
0.03543320298194885,
0.1112341582775116,
0.07442358136177063,
0.008333059959113598,
-0.054770536720752716,
-0.000598645128775388,
-0.06044677644968033,
-0.030347425490617752,
-0.029176780954003334,
-0.025302227586507797,
-0.14309503138065338,
-0.05458477512001991,
-0.07717017084360123,
-0.040832579135894775,
-0.00022476777667179704,
-0.002124929102137685,
-0.0043067424558103085,
-0.03792108595371246,
0.022910093888640404,
0.023172061890363693,
-0.06022506207227707,
-0.03374090790748596,
0.006465703248977661,
0.1252581775188446,
-0.1937401294708252,
0.019267573952674866,
0.08723671734333038,
-0.039176419377326965,
0.12489213049411774,
0.024767789989709854,
-0.010980131104588509,
0.03904499113559723,
-0.17715533077716827,
-0.04758263751864433,
-0.06614828109741211,
0.012606094591319561,
0.020702464506030083,
-0.06903202086687088,
-0.021253865212202072,
-0.02741539478302002,
-0.04423604905605316,
0.010212198831140995,
0.025442713871598244,
-0.07123792171478271,
0.11782029271125793,
0.025288723409175873,
-0.08522692322731018,
-0.059373144060373306,
0.03326384350657463,
0.071736179292202,
0.04345032572746277,
0.06505652517080307,
-0.09204407781362534,
0.0046160463243722916,
-0.07556035369634628,
0.01828555203974247,
0.04529076814651489,
-0.03536886349320412,
-0.13739097118377686,
-0.021531859412789345,
0.03679920360445976,
-0.0065620895475149155,
0.09083911031484604,
0.004833784885704517,
-0.09586849063634872,
0.0395086407661438,
0.003113344544544816,
-0.022543421015143394,
0.05950424075126648,
0.05085708200931549,
-0.03016207553446293,
-0.004494711756706238,
-0.029200755059719086,
-0.04569847136735916,
-0.031100085005164146,
-0.04650145024061203,
0.17863242328166962,
0.23490844666957855,
0.0985105037689209,
0.019716493785381317,
0.10598840564489365,
-0.017882363870739937,
-0.10651323199272156,
0.0706254318356514,
0.06001830846071243,
0.06278547644615173,
-0.08760188519954681,
0.07428595423698425,
0.09828947484493256,
-0.21219642460346222,
0.08537544310092926,
-0.009299863129854202,
-0.022511377930641174,
-0.031192569062113762,
-0.20086173713207245,
-0.06256972253322601,
-0.06581605970859528,
0.00656069116666913,
-0.11913682520389557,
0.023682858794927597,
0.014378725551068783,
0.0708802342414856,
-0.08215229213237762,
0.15951615571975708,
-0.15411067008972168,
-0.05786185711622238,
0.16190044581890106,
0.01825409010052681,
0.02610243298113346,
0.0521722286939621,
-0.040655821561813354,
-0.07627440989017487,
0.12061123549938202,
0.042365770787000656,
0.03596338629722595,
0.000060023045080015436,
-0.06845914572477341,
-0.03891945630311966,
-0.08837254345417023,
0.001603047945536673,
-0.04375144839286804,
-0.022679544985294342,
0.07847844809293747,
0.017309652641415596,
-0.0450507253408432,
-0.026134878396987915,
0.2086629420518875,
-0.028340943157672882,
-0.08310578018426895,
-0.13276784121990204,
0.04174760356545448,
-0.02627861499786377,
0.009099871851503849,
0.005001736339181662,
-0.13067886233329773,
-0.02692728117108345,
0.08187785744667053,
0.20682615041732788,
-0.07211654633283615,
0.015926282852888107,
-0.01588483713567257,
0.021220626309514046,
0.005967669654637575,
0.1250731498003006,
-0.03178096562623978,
0.24344101548194885,
-0.015184570103883743,
0.054513949900865555,
0.05655821040272713,
-0.05809508636593819,
-0.14661963284015656,
0.10469532757997513,
0.03523607924580574,
-0.05295844003558159,
-0.09037131071090698,
0.14222541451454163,
-0.10917148739099503,
-0.15072095394134521,
-0.08059177547693253,
-0.10724657773971558,
-0.16230228543281555,
-0.060121964663267136,
0.05108996853232384,
0.05216934531927109,
0.06624957174062729,
0.044698476791381836,
-0.02114889957010746,
0.07649695128202438,
0.020465755835175514,
0.007038766518235207,
-0.020447351038455963,
0.13477809727191925,
-0.05883730202913284,
0.19341851770877838,
0.030619269236922264,
0.01101964246481657,
0.08849362283945084,
-0.032944053411483765,
-0.0466785803437233,
-0.0784570649266243,
0.02912733517587185,
-0.13806301355361938,
-0.03211842477321625,
0.11016734689474106,
0.025443201884627342,
0.0713965967297554,
0.07981990277767181,
-0.03495179861783981,
0.025817696005105972,
0.111317940056324,
-0.016180677339434624,
-0.11860689520835876,
0.12215439230203629,
-0.1339360773563385,
0.1537495106458664,
0.1533457338809967,
-0.022991877049207687,
-0.014528488740324974,
-0.050694338977336884,
0.04784197732806206,
0.0501222237944603,
0.06629933416843414,
-0.04090523719787598,
-0.1537577211856842,
0.003021667944267392,
0.004731052555143833,
0.0174384955316782,
-0.12058772891759872,
-0.07371822744607925,
-0.010404725559055805,
0.12456101924180984,
-0.046784140169620514,
0.1068873181939125,
0.09992743283510208,
-0.0022265869192779064,
0.008947940543293953,
-0.04235232621431351,
-0.008203720673918724,
0.05519063025712967,
-0.07844473421573639,
0.0009721179376356304
] |
null | null |
transformers
|
# German ELECTRA large generator
Released in October 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model.
The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** ELECTRA large (generator)
**Language:** German
## Performance
```
GermEval18 Coarse: 80.70
GermEval18 Fine: 55.16
GermEval14: 88.95
```
See also:
- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator
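## Usage
Because the generator carries a masked-language-modelling head, the masking experiments mentioned above can be run with the Transformers fill-mask pipeline. A minimal sketch; the example sentence is illustrative and not from the original card:
```python
from transformers import pipeline

# The generator predicts plausible fillers for [MASK] tokens
fill_mask = pipeline("fill-mask", model="deepset/gelectra-large-generator")

for prediction in fill_mask("Die Hauptstadt von Deutschland ist [MASK]."):
    print(prediction["token_str"], prediction["score"])
```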
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData", "oscar"]}
|
fill-mask
|
deepset/gelectra-large-generator
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"dataset:oscar",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# German ELECTRA large generator
Released in October 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model.
The generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.
## Overview
Paper: here
Architecture: ELECTRA large (generator)
Language: German
## Performance
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: 'URL [at] URL'
Stefan Schweter: 'stefan [at] URL'
Timo Möller: 'timo.moeller [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German ELECTRA large generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.",
"## Overview \nPaper: here \nArchitecture: ELECTRA large (generator) \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# German ELECTRA large generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.",
"## Overview \nPaper: here \nArchitecture: ELECTRA large (generator) \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
85,
148,
20,
60,
40,
129
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #fill-mask #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# German ELECTRA large generator\n\nReleased, Oct 2020, this is the generator component of the German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model.\n\nThe generator is useful for performing masking experiments. If you are looking for a regular language model for embedding extraction, or downstream tasks like NER, classification or QA, please use deepset/gelectra-large.## Overview \nPaper: here \nArchitecture: ELECTRA large (generator) \nLanguage: German## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator## Authors\nBranden Chan: 'URL [at] URL'\nStefan Schweter: 'stefan [at] URL'\nTimo Möller: 'timo.moeller [at] URL'## About us\n!deepset logo\n\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Slack | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.027217773720622063,
0.14159603416919708,
-0.00549551285803318,
0.04321419075131416,
0.019071560353040695,
-0.006323746871203184,
0.040698833763599396,
0.09484990686178207,
0.08418390899896622,
0.11754840612411499,
-0.010946175083518028,
0.020115312188863754,
0.09343983978033066,
0.09828399121761322,
0.013439781032502651,
-0.20821669697761536,
0.0037743793800473213,
-0.12152353674173355,
-0.05578793212771416,
0.07488591969013214,
0.14596211910247803,
-0.09312400221824646,
0.08940885215997696,
0.004005583003163338,
0.027763238176703453,
0.03481024131178856,
-0.04303719103336334,
-0.028601596131920815,
0.025612378492951393,
0.050109971314668655,
0.05556648224592209,
-0.048451442271471024,
-0.015443471260368824,
-0.17980816960334778,
0.028811952099204063,
0.04731901362538338,
0.004359642043709755,
0.03283918276429176,
0.06510154902935028,
-0.04675092175602913,
-0.04244111478328705,
-0.1655454784631729,
0.024353696033358574,
0.07969189435243607,
-0.08049891144037247,
-0.1486593782901764,
-0.06731220334768295,
0.12130900472402573,
0.07503769546747208,
0.013950983062386513,
-0.041419610381126404,
0.02881423383951187,
-0.06826654821634293,
0.017769398167729378,
0.07144015282392502,
-0.24999280273914337,
-0.09070565551519394,
-0.01176358200609684,
0.01438827719539404,
0.02322070114314556,
-0.14532899856567383,
0.039930764585733414,
-0.018135657534003258,
0.019780151546001434,
0.04589571803808212,
-0.025670591741800308,
0.0471043735742569,
0.010747094638645649,
-0.07470301538705826,
0.03690880909562111,
0.04129208251833916,
-0.003636670531705022,
-0.0420355349779129,
-0.2555457353591919,
-0.008787417784333229,
0.07428780943155289,
-0.009728326462209225,
-0.06281223148107529,
0.033138588070869446,
-0.012767814099788666,
0.019808296114206314,
-0.08743127435445786,
-0.09159470349550247,
0.03679690137505531,
-0.04768450930714607,
0.13971561193466187,
0.03522045165300369,
0.0003434896352700889,
0.05121336877346039,
0.06399001181125641,
-0.04988301545381546,
-0.1663416028022766,
-0.040668074041604996,
-0.10587750375270844,
-0.10153212398290634,
-0.009201317094266415,
-0.014113240875303745,
-0.06702672690153122,
0.09215711057186127,
0.21752387285232544,
-0.015711698681116104,
0.05437503755092621,
0.0016099406639114022,
-0.0281320009380579,
0.07473339140415192,
0.22548088431358337,
-0.047772157937288284,
-0.18564771115779877,
0.019633183255791664,
-0.06889508664608002,
0.03150513768196106,
-0.0036661960184574127,
-0.05535385385155678,
-0.04132673144340515,
-0.013806594535708427,
0.041790541261434555,
0.04744676128029823,
0.024296347051858902,
-0.06693894416093826,
-0.11020416021347046,
0.10209792852401733,
-0.13059568405151367,
0.06951714307069778,
0.07790841907262802,
-0.05055229738354683,
0.048639632761478424,
-0.07148728519678116,
0.020010126754641533,
-0.04111615568399429,
0.05142674967646599,
-0.002789640799164772,
-0.018742507323622704,
-0.10011956095695496,
-0.06141424551606178,
0.03858669847249985,
-0.007428101729601622,
-0.05366133525967598,
-0.08022864162921906,
-0.030348656699061394,
-0.05600477382540703,
0.10372650623321533,
-0.058376118540763855,
-0.02888571284711361,
-0.02031218633055687,
-0.004026937764137983,
0.042992789298295975,
0.016338126733899117,
-0.04100818932056427,
-0.0004267647454980761,
0.023245614022016525,
-0.08304538577795029,
0.0038034000899642706,
0.00629795715212822,
0.027614207938313484,
-0.04310910031199455,
-0.0037260286044329405,
-0.30398571491241455,
0.09320855885744095,
-0.11366676539182663,
0.048223160207271576,
-0.12587425112724304,
-0.033858973532915115,
0.023020606487989426,
0.011716142296791077,
-0.02760503999888897,
0.07186312228441238,
-0.0884581133723259,
-0.04941905289888382,
0.11019795387983322,
-0.03128686174750328,
-0.034293290227651596,
0.14633341133594513,
-0.07231955975294113,
-0.013571998104453087,
0.10173624008893967,
0.19121840596199036,
0.2025398313999176,
-0.09777001291513443,
-0.07337646186351776,
-0.05937127023935318,
-0.022265955805778503,
0.11136998981237411,
0.11008106172084808,
-0.0629081130027771,
0.10337477922439575,
0.01868910901248455,
-0.049705762416124344,
-0.0024230051785707474,
0.03312200307846069,
-0.05868086218833923,
0.04908350110054016,
-0.04156765714287758,
0.11289077997207642,
-0.0410122312605381,
-0.023537052795290947,
-0.05744098126888275,
-0.14616142213344574,
0.0091059235855937,
0.03370983526110649,
0.0020433191675692797,
-0.026750678196549416,
-0.09843789041042328,
0.012082896195352077,
0.054308030754327774,
0.005638750735670328,
-0.04300588369369507,
-0.13294030725955963,
0.08333903551101685,
-0.08917129784822464,
0.13719633221626282,
-0.06538520008325577,
0.08726096898317337,
-0.0058440170250833035,
-0.04643790423870087,
-0.03550906106829643,
-0.004167862236499786,
-0.05552559718489647,
0.03825317695736885,
-0.13174133002758026,
-0.0016079751076176763,
-0.029352616518735886,
0.07513809204101562,
0.01216183789074421,
-0.014661997556686401,
0.028423897922039032,
0.1413005292415619,
0.05375386402010918,
-0.039719484746456146,
0.008187010884284973,
-0.010976826772093773,
0.05427854135632515,
-0.02223527804017067,
-0.01942119561135769,
-0.03097410872578621,
-0.06507356464862823,
0.0225578173995018,
-0.031070010736584663,
0.0012952886754646897,
0.014346064068377018,
0.06090933829545975,
-0.051034312695264816,
-0.05422545224428177,
-0.020815113559365273,
-0.011265694163739681,
-0.0339738130569458,
-0.09083250164985657,
0.2688104212284088,
0.02473554015159607,
-0.01812063343822956,
-0.0845617949962616,
-0.11362294107675552,
-0.08825040608644485,
0.0038527781143784523,
-0.008582626469433308,
0.10076935589313507,
-0.06721939146518707,
-0.17100413143634796,
0.10058345645666122,
0.08479564636945724,
0.01654132455587387,
0.25274595618247986,
-0.0454377606511116,
-0.03266981616616249,
-0.04120564088225365,
0.060688916593790054,
-0.01060221716761589,
0.06234519183635712,
0.04193630814552307,
0.02159608155488968,
0.050685904920101166,
-0.03275051340460777,
0.0011529732728376985,
0.014182624407112598,
0.05615123361349106,
-0.02323339693248272,
-0.016168352216482162,
0.09427821636199951,
0.022309765219688416,
0.0751415491104126,
0.04848054423928261,
0.08646789938211441,
0.06716249138116837,
-0.005901729222387075,
-0.051170412451028824,
-0.0249621719121933,
0.10284010320901871,
-0.1510632336139679,
-0.20821289718151093,
-0.15199299156665802,
-0.033105261623859406,
-0.13217222690582275,
-0.03589703142642975,
-0.0008287791279144585,
-0.07966045290231705,
-0.09451940655708313,
-0.010365759022533894,
0.106996551156044,
0.09017771482467651,
-0.07864321023225784,
-0.06767809391021729,
0.010639842599630356,
0.03193845972418785,
-0.13936711847782135,
-0.028286470100283623,
0.008141309954226017,
-0.0731084942817688,
-0.03206354379653931,
0.10939580947160721,
0.03418576717376709,
0.05170758441090584,
0.030223827809095383,
0.009849285706877708,
-0.01299650315195322,
0.1909915804862976,
-0.14748233556747437,
0.1181945949792862,
0.1271793246269226,
-0.06761067360639572,
0.06433278322219849,
0.2013596147298813,
0.07365993410348892,
0.005888448562473059,
0.001428774674423039,
0.07719379663467407,
0.03698813170194626,
-0.21275053918361664,
-0.1000460758805275,
-0.03678989037871361,
-0.05030899867415428,
-0.03180301561951637,
0.052487920969724655,
0.001995483646169305,
-0.01540966983884573,
-0.0965355858206749,
-0.07609013468027115,
0.06520196795463562,
0.04958492890000343,
0.08608680218458176,
0.037187907844781876,
0.04166702553629875,
-0.018673861399292946,
-0.07890074700117111,
0.10097847878932953,
0.023165879771113396,
0.09913481026887894,
0.019665254279971123,
0.17380231618881226,
0.026039069518446922,
0.08135081827640533,
0.021109316498041153,
-0.024115443229675293,
-0.04190737381577492,
0.007600911892950535,
-0.016269581392407417,
-0.09523285925388336,
0.02366841770708561,
0.08051614463329315,
0.14242874085903168,
-0.007727626245468855,
0.06038069352507591,
-0.036681607365608215,
0.16222301125526428,
0.20258094370365143,
0.005784617271274328,
-0.030143123120069504,
-0.04339645430445671,
0.05571065470576286,
-0.1117507666349411,
-0.07440900057554245,
-0.007381409872323275,
0.037461962550878525,
-0.15358871221542358,
0.08024521172046661,
-0.018404753878712654,
0.07295370101928711,
-0.05173792317509651,
0.012099009938538074,
0.020277371630072594,
0.13796968758106232,
0.00037791571230627596,
0.0737529844045639,
-0.15178370475769043,
0.034313324838876724,
0.03210200369358063,
0.08782708644866943,
-0.05164168402552605,
0.05777869001030922,
0.05970248952507973,
-0.11527930945158005,
0.11294135451316833,
-0.007319670170545578,
-0.02333209291100502,
0.023575343191623688,
-0.16091233491897583,
0.025967296212911606,
0.13543589413166046,
-0.15538880228996277,
0.06220579519867897,
-0.015549363568425179,
-0.03326175734400749,
-0.07971256226301193,
0.03021986037492752,
-0.09122205525636673,
-0.10816871374845505,
-0.00224788929335773,
-0.08746577799320221,
0.06281735748052597,
-0.0359397754073143,
0.02953987941145897,
-0.12086788564920425,
0.21122008562088013,
-0.2017410844564438,
-0.09366847574710846,
-0.1067853793501854,
-0.06289945542812347,
0.10358685255050659,
-0.11137021332979202,
0.06991830468177795,
0.0029773451387882233,
0.1187291294336319,
-0.010022196918725967,
-0.0937642827630043,
0.04501398652791977,
-0.05531393736600876,
-0.09176897257566452,
0.01683635637164116,
0.16386224329471588,
0.06778713315725327,
0.03860345855355263,
0.0038216912653297186,
0.02474294789135456,
-0.03152910992503166,
-0.11679963022470474,
0.03533698245882988,
0.1054413914680481,
-0.017017025500535965,
0.09314650297164917,
-0.09063113480806351,
-0.17885154485702515,
-0.06725028157234192,
0.05093767121434212,
0.13485175371170044,
0.08405929803848267,
-0.05039770156145096,
0.1861451417207718,
0.13394272327423096,
-0.025561179965734482,
-0.22175681591033936,
-0.04048203304409981,
0.08851104974746704,
0.039066337049007416,
0.01766330376267433,
-0.22904270887374878,
0.1472441405057907,
0.0343918539583683,
-0.045650288462638855,
0.015610797330737114,
-0.12040954828262329,
-0.11725720763206482,
0.06710919737815857,
-0.05746321380138397,
-0.14847609400749207,
-0.08105736970901489,
-0.0887385755777359,
-0.04765833541750908,
-0.050080303102731705,
0.07831650972366333,
-0.0745050460100174,
0.04604032635688782,
0.043487563729286194,
0.06654631346464157,
0.04214578866958618,
-0.007517107296735048,
0.1066681370139122,
-0.04502298682928085,
0.039878565818071365,
-0.058610547333955765,
-0.04328642785549164,
-0.01606534607708454,
-0.05732288211584091,
0.08660022914409637,
-0.07260835915803909,
-0.02122161164879799,
-0.09018635004758835,
-0.03465963155031204,
-0.0688425600528717,
0.10563582181930542,
-0.04602587968111038,
-0.06620706617832184,
-0.07409384846687317,
0.13287949562072754,
0.05232889950275421,
0.012408386915922165,
0.04277925193309784,
-0.042517539113759995,
-0.00977551843971014,
0.07895654439926147,
0.1783350706100464,
0.08894682675600052,
-0.10437947511672974,
-0.032270532101392746,
-0.00858767144382,
0.06627398729324341,
-0.015924805775284767,
0.07087542116641998,
0.08428680896759033,
-0.020208202302455902,
0.10131184756755829,
-0.04384747892618179,
-0.10305731743574142,
-0.010833973065018654,
0.09283314645290375,
-0.0885685384273529,
-0.1763366013765335,
-0.07071848958730698,
-0.008939092047512531,
-0.0788642093539238,
-0.007732267025858164,
0.13310106098651886,
0.05014130845665932,
-0.06700492650270462,
0.04840017482638359,
0.059422459453344345,
-0.01996515318751335,
0.03284863382577896,
0.0481833852827549,
0.01742541417479515,
-0.09219254553318024,
0.08202751725912094,
0.09000755101442337,
0.020661871880292892,
0.053297799080610275,
0.13709408044815063,
-0.006773789878934622,
-0.02121853642165661,
0.06226305663585663,
0.14813761413097382,
0.016722403466701508,
-0.019990649074316025,
0.016498329117894173,
-0.078680619597435,
-0.0006086451467126608,
0.05995940789580345,
0.0299142487347126,
-0.00838822778314352,
0.021894119679927826,
0.03309686481952667,
0.06587143242359161,
0.14143216609954834,
0.0600975826382637,
0.0029144538566470146,
-0.04783715680241585,
-0.009038534946739674,
-0.05879632383584976,
-0.03440707549452782,
-0.018685143440961838,
-0.011190575547516346,
-0.13796362280845642,
-0.03468722850084305,
-0.11994066089391708,
-0.01593172550201416,
-0.019553927704691887,
0.015596020966768265,
-0.0027163082268089056,
-0.013347890228033066,
-0.01715291291475296,
0.029345672577619553,
-0.057663772255182266,
-0.055547263473272324,
-0.016370505094528198,
0.11968351155519485,
-0.18385225534439087,
-0.004011130426079035,
0.09822402894496918,
-0.040139809250831604,
0.12920048832893372,
0.02374444156885147,
-0.012743642553687096,
0.016694318503141403,
-0.11840042471885681,
-0.07730451226234436,
-0.054709579795598984,
0.03153659775853157,
0.01727999560534954,
-0.1255442500114441,
-0.006272534839808941,
-0.032277803868055344,
-0.08892569690942764,
0.0047655263915658,
0.027136288583278656,
-0.08475834131240845,
0.12081141769886017,
0.023913433775305748,
-0.0807797834277153,
-0.052280258387327194,
0.030386554077267647,
0.059631410986185074,
0.0065979124046862125,
0.07636144757270813,
-0.07475092262029648,
0.040968235582113266,
-0.07908367365598679,
0.014992774464190006,
0.04020027443766594,
-0.017785079777240753,
-0.1259520798921585,
-0.010564862750470638,
0.04848920926451683,
-0.01125227939337492,
0.0702260360121727,
0.0024691284634172916,
-0.09854491055011749,
0.03875553235411644,
0.013405025005340576,
-0.04433498531579971,
0.07110749930143356,
0.015483565628528595,
-0.025825489312410355,
-0.01929684355854988,
-0.005503776948899031,
-0.06526783853769302,
-0.04982122406363487,
-0.03480380401015282,
0.1344965398311615,
0.24698567390441895,
0.14129473268985748,
-0.0011408244026824832,
0.13805945217609406,
-0.01616181991994381,
-0.09818762540817261,
0.11373359709978104,
-0.015462052077054977,
0.024179255589842796,
-0.07313453406095505,
0.02617824636399746,
0.07131816446781158,
-0.20606011152267456,
0.11661434918642044,
0.012526542879641056,
-0.01505634468048811,
-0.0488557331264019,
-0.1834266185760498,
-0.0944477990269661,
-0.023508433252573013,
0.018539374694228172,
-0.10365065932273865,
0.036344949156045914,
-0.018488343805074692,
0.06088573858141899,
-0.06423485279083252,
0.18240562081336975,
-0.1668827086687088,
-0.08607383072376251,
0.15293696522712708,
0.040625277906656265,
0.02796577475965023,
0.07339800149202347,
-0.00006529263191623613,
-0.07192911207675934,
0.09309348464012146,
0.05895334482192993,
0.037282317876815796,
-0.038121242076158524,
-0.06187593191862106,
-0.06777466833591461,
-0.08812019228935242,
0.006833349820226431,
-0.04483826830983162,
-0.02689981646835804,
0.1425120085477829,
0.01906440034508705,
-0.029579434543848038,
-0.027730345726013184,
0.15390123426914215,
-0.03696916252374649,
-0.08404102176427841,
-0.16420990228652954,
0.019400641322135925,
-0.035315077751874924,
0.02723112516105175,
0.01484938245266676,
-0.12553992867469788,
-0.04320693388581276,
0.08653987944126129,
0.1846584975719452,
-0.09243100136518478,
0.025398818776011467,
0.015570792369544506,
0.022979989647865295,
-0.010210487060248852,
0.08580072969198227,
-0.025865212082862854,
0.2848008871078491,
-0.02875501662492752,
0.060014836490154266,
0.0481376014649868,
-0.06617872416973114,
-0.14137856662273407,
0.11104286462068558,
0.0057707056403160095,
-0.023394687101244926,
-0.06124347075819969,
0.13368526101112366,
-0.11222042888402939,
-0.12573528289794922,
-0.05932873487472534,
-0.1020616963505745,
-0.165882408618927,
-0.04512055218219757,
0.078140988945961,
0.04769906774163246,
0.057036012411117554,
0.06659109145402908,
-0.05351526290178299,
0.07820126414299011,
0.016491837799549103,
0.027046924456954002,
0.0001462514337617904,
0.13723814487457275,
-0.0640023723244667,
0.2064232975244522,
0.03890233859419823,
0.009099336341023445,
0.10898695886135101,
-0.043349739164114,
-0.02701091393828392,
-0.038358498364686966,
0.05731416866183281,
-0.18140535056591034,
-0.00401239562779665,
0.14908160269260406,
0.01953488402068615,
0.11781452596187592,
0.10101059824228287,
0.008556835353374481,
0.029505157843232155,
0.07987205684185028,
-0.016551269218325615,
-0.12408121675252914,
0.16472387313842773,
-0.13491666316986084,
0.11495378613471985,
0.17333197593688965,
-0.0070901894941926,
-0.005252702161669731,
-0.050844691693782806,
0.0550866462290287,
0.059368740767240524,
0.10137384384870529,
-0.05102105438709259,
-0.1466771960258484,
0.019888006150722504,
0.044226717203855515,
0.06507419794797897,
-0.10247211158275604,
-0.09358923137187958,
-0.009043144062161446,
0.08851931989192963,
-0.04866965115070343,
0.0840119943022728,
0.11081838607788086,
-0.011262115091085434,
0.018320515751838684,
-0.047888144850730896,
-0.006958398036658764,
0.017578525468707085,
-0.0897107645869255,
0.0229408610612154
] |
null | null |
transformers
|

## Overview
**Language model:** gelectra-large-germanquad
**Language:** German
**Training data:** GermanQuAD train set (~ 12MB)
**Eval data:** GermanQuAD test set (~ 5MB)
**Infrastructure**: 1x V100 GPU
**Published**: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-large model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published [online](https://deepset.ai/germanquad).
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions with 2204·3−76 = 6536 answers (we removed 76 wrong answers).
See https://deepset.ai/germanquad for more details and to download the dataset in SQuAD format.
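For quick experimentation, the model can be loaded through the Hugging Face `pipeline` API. A minimal sketch, not part of the original card; the question and context below are made-up German examples:
```python
from transformers import pipeline

# Load the fine-tuned German extractive QA model from the Hub.
qa = pipeline(
    "question-answering",
    model="deepset/gelectra-large-germanquad",
    tokenizer="deepset/gelectra-large-germanquad",
)

# Illustrative inputs, not taken from GermanQuAD itself.
result = qa(
    question="Wann wurde GermanQuAD veröffentlicht?",
    context=(
        "GermanQuAD ist ein deutscher Frage-Antwort-Datensatz, "
        "der im April 2021 von deepset veröffentlicht wurde."
    ),
)
print(result["answer"], result["score"])
```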
## Hyperparameters
```
batch_size = 24
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
```
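The card does not state the exact training invocation; as a rough, hedged mapping, the values above correspond to the following `transformers` `TrainingArguments`. The output directory and warmup share are assumptions, and `max_seq_len`/`embeds_dropout_prob` are applied via the tokenizer and model config rather than here:
```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="gelectra-large-germanquad",  # assumption: any local path
    per_device_train_batch_size=24,          # batch_size = 24
    num_train_epochs=2,                      # n_epochs = 2
    learning_rate=3e-5,                      # learning_rate = 3e-5
    lr_scheduler_type="linear",              # lr_schedule = LinearWarmup
    warmup_ratio=0.1,                        # assumption: warmup share not given
)
```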
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For fine-tuning XLM-RoBERTa, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm-started on the German translation of SQuAD v1.1 and fine-tuned on [GermanQuAD](https://deepset.ai/germanquad).
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.

## Authors
**Timo Möller:** [email protected]
**Julian Risch:** [email protected]
**Malte Pietsch:** [email protected]
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/haystack-logo-colored.svg" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/deepset-logo-colored.svg" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "tags": ["exbert"], "datasets": ["deepset/germanquad"], "thumbnail": "https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg"}
|
question-answering
|
deepset/gelectra-large-germanquad
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"electra",
"question-answering",
"exbert",
"de",
"dataset:deepset/germanquad",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us
|
!bert_image
## Overview
Language model: gelectra-large-germanquad
Language: German
Training data: GermanQuAD train set (~ 12MB)
Eval data: GermanQuAD test set (~ 5MB)
Infrastructure: 1x V100 GPU
Published: Apr 21st, 2021
## Details
- We trained a German question answering model with a gelectra-large model as its basis.
- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.
- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated, so there are 2204 questions with 2204·3−76 = 6536 answers (we removed 76 wrong answers).
See URL for more details and to download the dataset in SQuAD format.
## Hyperparameters
## Performance
We evaluated the extractive question answering performance on our GermanQuAD test set.
Model types and training data are included in the model name.
For finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.
The GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD.
The human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.
!performancetable
## Authors
Timo Möller: timo.moeller@URL
Julian Risch: URL@URL
Malte Pietsch: malte.pietsch@URL
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack, which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"## Overview\nLanguage model: gelectra-large-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-large model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536 answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n!performancetable",
"## Authors\n Timo Möller: timo.moeller@URL \n Julian Risch: URL@URL \n Malte Pietsch: malte.pietsch@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us \n",
"## Overview\nLanguage model: gelectra-large-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021",
"## Details\n- We trained a German question answering model with a gelectra-large model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536 answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.",
"## Hyperparameters",
"## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n!performancetable",
"## Authors\n Timo Möller: timo.moeller@URL \n Julian Risch: URL@URL \n Malte Pietsch: malte.pietsch@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
62,
61,
133,
5,
115,
33,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #electra #question-answering #exbert #de #dataset-deepset/germanquad #license-mit #endpoints_compatible #has_space #region-us \n## Overview\nLanguage model: gelectra-large-germanquad \nLanguage: German \nTraining data: GermanQuAD train set (~ 12MB) \nEval data: GermanQuAD test set (~ 5MB) \nInfrastructure: 1x V100 GPU \nPublished: Apr 21st, 2021## Details\n- We trained a German question answering model with a gelectra-large model as its basis.\n- The dataset is GermanQuAD, a new, German language dataset, which we hand-annotated and published online.\n- The training dataset is one-way annotated and contains 11518 questions and 11518 answers, while the test dataset is three-way annotated so that there are 2204 questions and with 2204·3−76 = 6536 answers, because we removed 76 wrong answers.\n\nSee URL for more details and dataset download in SQuAD format.## Hyperparameters## Performance\nWe evaluated the extractive question answering performance on our GermanQuAD test set.\nModel types and training data are included in the model name. \nFor finetuning XLM-Roberta, we use the English SQuAD v2.0 dataset.\nThe GELECTRA models are warm started on the German translation of SQuAD v1.1 and finetuned on GermanQuAD. \nThe human baseline was computed for the 3-way test set by taking one answer as prediction and the other two as ground truth.\n!performancetable## Authors\n Timo Möller: timo.moeller@URL \n Julian Risch: URL@URL \n Malte Pietsch: malte.pietsch@URL"
] |
[
-0.09818603843450546,
0.1492626816034317,
-0.0013435237342491746,
0.10949873179197311,
0.13704214990139008,
0.050265807658433914,
0.145344540476799,
0.08129026740789413,
-0.0006415314855985343,
0.08637528121471405,
0.01880992390215397,
-0.07926326245069504,
0.06501516699790955,
0.10044968873262405,
0.045342497527599335,
-0.2220786064863205,
0.02824706770479679,
-0.09389699250459671,
0.030058272182941437,
0.0962240919470787,
0.10784035921096802,
-0.09018997848033905,
0.07461174577474594,
-0.010409542359411716,
-0.03523841127753258,
0.05801272392272949,
-0.01939142867922783,
-0.01798877865076065,
0.10297681391239166,
0.05240930989384651,
0.07947422564029694,
-0.0363236740231514,
0.05695049464702606,
-0.13021306693553925,
0.008950875140726566,
0.027775218710303307,
-0.010043423622846603,
0.04676460102200508,
0.05075482651591301,
0.001988374162465334,
0.09870625287294388,
-0.030558999627828598,
0.010908662341535091,
0.04061239957809448,
-0.09525812417268753,
-0.10970057547092438,
-0.17305636405944824,
0.1731143295764923,
0.013647675514221191,
0.1240258514881134,
-0.03066161461174488,
0.08826875686645508,
-0.06356868147850037,
0.036831073462963104,
0.13928280770778656,
-0.20548714697360992,
-0.05549829080700874,
0.036145538091659546,
0.0006581793422810733,
0.09279419481754303,
-0.09301965683698654,
0.008361081592738628,
0.03887849673628807,
0.03624984622001648,
-0.07574839890003204,
-0.024633586406707764,
-0.00846251193434,
-0.008961782790720463,
-0.10156580805778503,
-0.04587290436029434,
0.12576517462730408,
0.024911969900131226,
-0.09705273061990738,
-0.1502504199743271,
0.003114970400929451,
0.06696702539920807,
0.049375612288713455,
-0.05521440505981445,
-0.010651752352714539,
-0.029027801007032394,
0.028098635375499725,
-0.08392108976840973,
-0.09653919190168381,
-0.011319204233586788,
-0.013781728222966194,
0.20588596165180206,
0.03088146634399891,
0.03980068862438202,
-0.010036977007985115,
0.0804281160235405,
-0.11882973462343216,
-0.09106887131929398,
-0.07005834579467773,
-0.08525350689888,
-0.07980114221572876,
-0.026332521811127663,
-0.06264133751392365,
-0.08583074063062668,
0.008577247150242329,
0.16353270411491394,
-0.09391744434833527,
0.02693210542201996,
0.006868013646453619,
-0.005488060414791107,
0.0668458640575409,
0.19889234006404877,
-0.0734805092215538,
-0.10689127445220947,
-0.040878742933273315,
-0.019543731585144997,
0.033879365772008896,
-0.028493059799075127,
-0.03398766741156578,
-0.02259628288447857,
0.04515523836016655,
0.07239799946546555,
0.013611703179776669,
0.00814047735184431,
-0.05903514847159386,
-0.06199820712208748,
0.07445802539587021,
-0.11355318129062653,
0.027149800211191177,
0.020295996218919754,
-0.07057344913482666,
0.061654746532440186,
-0.03034987859427929,
-0.012242738157510757,
-0.07923303544521332,
0.07464191317558289,
-0.04399555176496506,
-0.028424261137843132,
-0.08211465179920197,
-0.13515156507492065,
0.03262262046337128,
-0.05542109161615372,
-0.03570500388741493,
-0.10084143280982971,
-0.11440421640872955,
-0.08758974820375443,
0.05185527354478836,
-0.03822457417845726,
0.052549008280038834,
-0.0035507152788341045,
-0.005582761485129595,
0.005937611218541861,
-0.006716357544064522,
-0.04015045613050461,
-0.0017602368025109172,
0.041159193962812424,
-0.05569447949528694,
0.015576151199638844,
-0.040568575263023376,
0.03695215657353401,
-0.10496173053979874,
-0.05429228022694588,
-0.15979677438735962,
0.06784721463918686,
-0.08984460681676865,
-0.04010137915611267,
-0.08970586955547333,
-0.0842590481042862,
0.003958507906645536,
0.019825728610157967,
0.09234727919101715,
0.13481172919273376,
-0.14523039758205414,
-0.051091477274894714,
0.13337108492851257,
-0.15801149606704712,
-0.07007738947868347,
0.12049058824777603,
-0.030130483210086823,
0.000041258565033786,
0.09604337066411972,
0.17112688720226288,
0.10116931796073914,
-0.14155885577201843,
-0.10833931714296341,
-0.08869357407093048,
-0.008392881602048874,
0.09695733338594437,
0.07609035074710846,
-0.0547228641808033,
-0.012297299690544605,
0.019550247117877007,
-0.1177452951669693,
0.004313724115490913,
-0.04419509693980217,
-0.05059678480029106,
0.017236527055501938,
-0.039720743894577026,
0.08729078620672226,
0.036610934883356094,
-0.03762037679553032,
-0.06965290755033493,
-0.0852331593632698,
0.029707934707403183,
0.0838618278503418,
-0.05165952071547508,
0.004185959696769714,
-0.022378556430339813,
0.051375385373830795,
0.01898697018623352,
-0.007260976824909449,
-0.14914102852344513,
-0.18414759635925293,
0.04893159866333008,
-0.07091262191534042,
0.07455532252788544,
0.07973875105381012,
0.06360926479101181,
0.04911799728870392,
-0.10230565816164017,
-0.05997757986187935,
-0.12756121158599854,
-0.01269177719950676,
-0.041680708527565,
-0.13553385436534882,
-0.054057151079177856,
-0.03342048078775406,
0.0664968341588974,
-0.08180943131446838,
-0.0338401235640049,
0.0025590111035853624,
0.08289837837219238,
0.037027277052402496,
-0.07284564524888992,
-0.0344354510307312,
0.03677396848797798,
-0.0004541562229860574,
-0.03716959059238434,
-0.020410995930433273,
0.008245798759162426,
-0.024774765595793724,
0.03273371234536171,
-0.05660798400640488,
-0.07241406291723251,
0.056851621717214584,
0.13895711302757263,
-0.11001711338758469,
-0.015676070004701614,
-0.07510367035865784,
0.004587098956108093,
-0.089650958776474,
-0.08600661158561707,
0.16268664598464966,
0.022255342453718185,
0.04486539214849472,
-0.042142800986766815,
-0.015499578788876534,
-0.028628811240196228,
-0.006151055917143822,
-0.06476401537656784,
0.09840275347232819,
-0.05163579434156418,
-0.12598098814487457,
0.09306825697422028,
0.05155494436621666,
0.006889783311635256,
0.19270683825016022,
-0.021901696920394897,
-0.10900002717971802,
-0.0481858029961586,
0.015668785199522972,
0.002920460654422641,
0.13429385423660278,
0.05370459705591202,
0.04125373810529709,
0.05168488994240761,
0.044263992458581924,
0.02363622561097145,
-0.04286539554595947,
0.00961328949779272,
0.006905540358275175,
-0.023713883012533188,
-0.044059596955776215,
0.010071196593344212,
0.004206305369734764,
0.10070402175188065,
-0.0070608933456242085,
0.03329300135374069,
-0.009995068423449993,
-0.036606136709451675,
-0.12579266726970673,
0.16196496784687042,
-0.09698624163866043,
-0.1996842324733734,
-0.12693142890930176,
0.05887714400887489,
-0.11490177363157272,
0.009465018287301064,
0.04496188834309578,
-0.0701521560549736,
-0.09118825942277908,
-0.058726970106363297,
0.05867263302206993,
0.0407235324382782,
-0.0442703515291214,
-0.02327781356871128,
-0.051148511469364166,
0.012615624815225601,
-0.1495858132839203,
-0.010885084979236126,
-0.051856573671102524,
-0.05221658572554588,
0.04015699028968811,
0.0006040561711415648,
0.09481072425842285,
0.055862847715616226,
-0.044335950165987015,
0.014396811835467815,
-0.019176535308361053,
0.2294643074274063,
-0.11126972734928131,
0.0767681673169136,
0.12380299717187881,
0.009329047054052353,
0.061246857047080994,
0.07865852862596512,
0.021357066929340363,
-0.03520074859261513,
0.05111176520586014,
0.05347824841737747,
-0.05034312605857849,
-0.25635746121406555,
-0.11508125811815262,
-0.038082946091890335,
-0.058928828686475754,
0.030344046652317047,
0.027154456824064255,
0.06424310058355331,
0.019819391891360283,
-0.11047600954771042,
-0.09212283790111542,
0.03636060282588005,
0.05303327366709709,
0.027732405811548233,
0.006031973287463188,
0.058458223938941956,
-0.04423312470316887,
-0.010196340270340443,
0.1412678211927414,
-0.01623915322124958,
0.19871199131011963,
-0.01381891593337059,
0.01013276632875204,
0.030467933043837547,
0.057501647621393204,
-0.029771754518151283,
0.1115509495139122,
-0.0219186469912529,
-0.01398693211376667,
-0.011468276381492615,
-0.0823713168501854,
0.006100836209952831,
0.09773709625005722,
0.07604361325502396,
-0.009610784240067005,
-0.0982913225889206,
-0.014231519773602486,
0.08549824357032776,
0.14691932499408722,
0.04782314598560333,
-0.09898058325052261,
-0.11505211889743805,
-0.005763237830251455,
-0.024750424548983574,
-0.05490345135331154,
-0.017873123288154602,
0.13911806046962738,
-0.15533867478370667,
0.043475162237882614,
0.004049699753522873,
0.08873845636844635,
0.003080629510805011,
-0.005446261260658503,
0.04322172328829765,
0.0366349071264267,
-0.03886451944708824,
0.0994805172085762,
-0.19862154126167297,
0.13305523991584778,
0.008661911822855473,
0.07872644811868668,
-0.08657149225473404,
0.024234790354967117,
0.024423690512776375,
-0.054465215653181076,
0.14090928435325623,
0.042615585029125214,
-0.09946726262569427,
-0.06787218898534775,
-0.053936511278152466,
0.016841154545545578,
0.1309109926223755,
-0.07656453549861908,
0.07603123784065247,
-0.011708339676260948,
0.029166921973228455,
-0.015910597518086433,
0.07883529365062714,
-0.14581288397312164,
-0.12131093442440033,
0.03889565169811249,
-0.01662396267056465,
0.023104112595319748,
-0.05204600468277931,
-0.09055895358324051,
-0.06437400728464127,
0.1573244035243988,
-0.08928149938583374,
-0.07356110960245132,
-0.14367561042308807,
0.04476102069020271,
0.11374437808990479,
-0.10885274410247803,
0.014400612562894821,
0.04327130317687988,
0.07862875610589981,
-0.0416780449450016,
-0.08943115919828415,
0.06455139815807343,
-0.09230490773916245,
-0.16084051132202148,
-0.022588761523365974,
0.11232753098011017,
0.12066509574651718,
0.05409126356244087,
0.005116632673889399,
0.03940770775079727,
0.01580328494310379,
-0.16240328550338745,
0.003418002277612686,
0.06574475020170212,
0.036520227789878845,
0.13066719472408295,
-0.07225989550352097,
-0.12281160056591034,
-0.030961375683546066,
-0.009168706834316254,
0.08388710767030716,
0.09455808252096176,
-0.06347335129976273,
0.10503358393907547,
0.18515536189079285,
-0.04684386029839516,
-0.2472599297761917,
0.019798830151557922,
0.08454770594835281,
0.03354315087199211,
0.0684659481048584,
-0.17685729265213013,
0.07329457253217697,
0.06390443444252014,
-0.01264626532793045,
-0.0565342977643013,
-0.2070053219795227,
-0.1100657656788826,
0.04759986698627472,
0.03227689862251282,
0.027600139379501343,
-0.07763050496578217,
-0.0429290346801281,
-0.013820350170135498,
-0.09373705834150314,
0.05395660176873207,
-0.04322072118520737,
0.08642588555812836,
0.011746385134756565,
0.038068875670433044,
0.04444025829434395,
-0.05889430642127991,
0.11312969028949738,
0.08163879066705704,
0.055187806487083435,
-0.06365729868412018,
-0.03854084759950638,
0.09566587209701538,
-0.015119095332920551,
0.16694509983062744,
-0.0030174816492944956,
0.08363205939531326,
-0.1410183608531952,
-0.024518106132745743,
-0.09097832441329956,
0.09497623145580292,
-0.07016047090291977,
-0.06681279838085175,
-0.07007114589214325,
0.1406698077917099,
0.05139411613345146,
-0.00920794252306223,
-0.04819023981690407,
-0.01736869476735592,
0.025050994008779526,
0.07749130576848984,
0.06425011903047562,
0.09749855846166611,
-0.08305474370718002,
-0.019019976258277893,
-0.01579076237976551,
0.0736384391784668,
0.018449990078806877,
0.059690650552511215,
0.14944861829280853,
0.03739053010940552,
0.12707269191741943,
-0.012048762291669846,
-0.10630100965499878,
0.005370095372200012,
0.045667748898267746,
-0.15046338737010956,
-0.18911504745483398,
-0.06920688599348068,
-0.019264793023467064,
-0.08759427070617676,
0.04775267466902733,
0.15557169914245605,
-0.01804860681295395,
-0.02472832426428795,
-0.03204841911792755,
0.04567955806851387,
-0.023370305076241493,
0.13667236268520355,
0.07216352224349976,
0.02559235505759716,
-0.07328636944293976,
0.09101742506027222,
0.05170252174139023,
-0.06058076024055481,
0.08150334656238556,
0.0594257153570652,
-0.05734512582421303,
-0.017797067761421204,
0.0010362057946622372,
0.07809947431087494,
-0.15822719037532806,
-0.06996459513902664,
-0.010603963397443295,
-0.05430667847394943,
-0.007982360199093819,
0.0018801302649080753,
0.027192460373044014,
0.04500015452504158,
-0.0032121094409376383,
-0.019116563722491264,
-0.06003594398498535,
0.07108619809150696,
0.060423072427511215,
0.016230333596467972,
-0.016263052821159363,
-0.010550342500209808,
-0.042613908648490906,
0.00273729651235044,
-0.028334304690361023,
-0.03142114356160164,
-0.11994686722755432,
-0.02515576034784317,
-0.10007130354642868,
-0.027913622558116913,
-0.011405568569898605,
-0.014202945865690708,
-0.01950518786907196,
-0.1076035350561142,
0.03110557608306408,
0.0660700798034668,
-0.0510895699262619,
-0.01164944190531969,
0.0002706487139221281,
0.06695781648159027,
-0.21691831946372986,
-0.00873135682195425,
0.04100296273827553,
-0.04158512130379677,
0.1347724348306656,
0.08226189017295837,
0.002092010574415326,
0.062324024736881256,
-0.09639241546392441,
-0.015602804720401764,
-0.05547061935067177,
0.03570687025785446,
0.03679337725043297,
-0.1319780796766281,
0.009967312216758728,
0.016997072845697403,
-0.02372284047305584,
0.047306377440690994,
0.0057318368926644325,
-0.0646236315369606,
0.06462451815605164,
-0.0059524355456233025,
-0.10159865021705627,
-0.06997688859701157,
0.10676460713148117,
0.0773090049624443,
0.020820463076233864,
0.12059767544269562,
-0.06515992432832718,
0.0656014233827591,
-0.12825295329093933,
0.008327674120664597,
0.03462088480591774,
-0.006775392685085535,
-0.09669442474842072,
0.010254018008708954,
0.047967731952667236,
-0.013889024965465069,
0.0809386745095253,
0.0031969822011888027,
0.07235056906938553,
0.034249067306518555,
0.025038661435246468,
-0.0021278790663927794,
0.0071675218641757965,
0.08721770346164703,
-0.052664197981357574,
-0.01647978276014328,
-0.032952774316072464,
-0.013117849826812744,
-0.06876780092716217,
0.014349781908094883,
0.22912070155143738,
0.1711426079273224,
0.08953796327114105,
0.02603451907634735,
0.051463644951581955,
-0.03409441560506821,
-0.12197133153676987,
-0.07831265032291412,
0.011842661537230015,
0.030728625133633614,
-0.024514349177479744,
0.07274451851844788,
0.09075571596622467,
-0.2060040384531021,
0.10032736510038376,
-0.05301636829972267,
-0.06411764025688171,
-0.04223652184009552,
-0.12841282784938812,
-0.04357685148715973,
-0.04300815984606743,
0.026479007676243782,
-0.14401675760746002,
0.03152846544981003,
0.04829126223921776,
0.0642005056142807,
-0.07288631051778793,
0.1465582549571991,
-0.11531636863946915,
-0.035065315663814545,
0.06143486872315407,
0.018967757001519203,
0.05747150629758835,
0.06317997723817825,
-0.013743936084210873,
-0.009038173593580723,
0.05858256295323372,
0.07434699684381485,
0.05173797905445099,
0.03782011196017265,
-0.03755226731300354,
-0.02296149730682373,
-0.06389515846967697,
-0.023524025455117226,
-0.024933654814958572,
0.045545462518930435,
0.1723988652229309,
0.027937771752476692,
0.0042113978415727615,
-0.023087050765752792,
0.18800070881843567,
-0.062457162886857986,
-0.08359424024820328,
-0.15423396229743958,
0.16508732736110687,
0.041667792946100235,
0.040202546864748,
0.024120289832353592,
-0.1356373429298401,
0.005902059376239777,
0.09688028693199158,
0.16028782725334167,
-0.014404685236513615,
0.008598226122558117,
-0.004013759549707174,
0.010786579921841621,
0.03144760802388191,
0.06324662268161774,
-0.008034816011786461,
0.2456882894039154,
-0.016037756577134132,
0.14241014420986176,
-0.026401462033391,
-0.004394856747239828,
0.008705577813088894,
0.19375529885292053,
-0.013416914269328117,
-0.06488841027021408,
-0.1063975989818573,
0.12416978925466537,
0.01045980304479599,
-0.21556051075458527,
0.022327443584799767,
-0.09590741991996765,
-0.13683752715587616,
0.0008311193669214845,
0.06441719084978104,
0.07191785424947739,
0.11207957565784454,
0.02803381159901619,
0.013370527885854244,
0.14095571637153625,
0.012429898604750633,
-0.04649069905281067,
-0.12851306796073914,
0.09908000379800797,
-0.051748860627412796,
0.21530528366565704,
0.026883549988269806,
0.09287784993648529,
0.08364081382751465,
0.001830689376220107,
-0.0877174511551857,
0.01476150844246149,
0.05935228615999222,
-0.12037206441164017,
-0.008245419710874557,
0.1207033172249794,
-0.021390974521636963,
0.08815272152423859,
0.055196769535541534,
-0.05465220287442207,
0.03466726839542389,
0.07390416413545609,
0.0013777537969872355,
-0.127746120095253,
0.12629909813404083,
-0.07562761008739471,
0.1753273904323578,
0.1556096374988556,
-0.02688971348106861,
0.005927663296461105,
-0.04724635183811188,
0.06081409007310867,
0.04104672744870186,
0.11178480833768845,
0.025940189138054848,
-0.17011065781116486,
0.03490007668733597,
-0.061477795243263245,
0.02493705041706562,
-0.1914544254541397,
-0.0576988086104393,
-0.021324817091226578,
-0.008390222676098347,
-0.010696148499846458,
0.10615941137075424,
0.04941713437438011,
-0.0022648414596915245,
0.012174146249890327,
-0.027157893404364586,
-0.01342708244919777,
0.08586402982473373,
-0.04987889900803566,
-0.043265219777822495
] |
null | null |
transformers
|
# German ELECTRA large
Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train the model and show that it is the state-of-the-art German language model.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** ELECTRA large (discriminator)
**Language:** German
## Performance
```
GermEval18 Coarse: 80.70
GermEval18 Fine: 55.16
GermEval14: 88.95
```
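Because this checkpoint is the raw discriminator without a task head, it is typically fine-tuned before use. A minimal sketch, assuming a sequence-classification setup in the spirit of GermEval18; the label count of 2 is an assumption:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepset/gelectra-large")
# num_labels=2 is an assumption (e.g. GermEval18 coarse: offensive vs. other).
model = AutoModelForSequenceClassification.from_pretrained(
    "deepset/gelectra-large", num_labels=2
)

inputs = tokenizer("Das ist ein Beispielsatz.", return_tensors="pt")
outputs = model(**inputs)  # head is untrained: logits are meaningless until fine-tuning
print(outputs.logits.shape)  # torch.Size([1, 2])
```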
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "de", "license": "mit", "datasets": ["wikipedia", "OPUS", "OpenLegalData", "oscar"]}
| null |
deepset/gelectra-large
|
[
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"dataset:oscar",
"arxiv:2010.10906",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.10906"
] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us
|
# German ELECTRA large
Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train the model and show that it is the state-of-the-art German language model.
## Overview
Paper: here
Architecture: ELECTRA large (discriminator)
Language: German
## Performance
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
## Authors
Branden Chan: 'URL [at] URL'
Stefan Schweter: 'stefan [at] URL'
Timo Möller: 'timo.moeller [at] URL'
## About us
!deepset logo
We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
- FARM
- Haystack
Get in touch:
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# German ELECTRA large\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that this is the state of the art German language model.",
"## Overview \nPaper: here \nArchitecture: ELECTRA large (discriminator) \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL' \nStefan Schweter: 'stefan [at] URL' \nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us \n",
"# German ELECTRA large\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that this is the state of the art German language model.",
"## Overview \nPaper: here \nArchitecture: ELECTRA large (discriminator) \nLanguage: German",
"## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator",
"## Authors\nBranden Chan: 'URL [at] URL' \nStefan Schweter: 'stefan [at] URL' \nTimo Möller: 'timo.moeller [at] URL'",
"## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
74,
101,
21,
60,
40,
129
] |
[
"passage: TAGS\n#transformers #pytorch #tf #electra #pretraining #de #dataset-wikipedia #dataset-OPUS #dataset-OpenLegalData #dataset-oscar #arxiv-2010.10906 #license-mit #endpoints_compatible #has_space #region-us \n# German ELECTRA large\n\nReleased, Oct 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka \"bert-base-german-cased\") and the dbmdz BERT (aka bert-base-german-dbmdz-cased). In our paper, we outline the steps taken to train our model and show that this is the state of the art German language model.## Overview \nPaper: here \nArchitecture: ELECTRA large (discriminator) \nLanguage: German## Performance \n\n\nSee also: \ndeepset/gbert-base\ndeepset/gbert-large\ndeepset/gelectra-base\ndeepset/gelectra-large\ndeepset/gelectra-base-generator\ndeepset/gelectra-large-generator## Authors\nBranden Chan: 'URL [at] URL' \nStefan Schweter: 'stefan [at] URL' \nTimo Möller: 'timo.moeller [at] URL'## About us\n!deepset logo\nWe bring NLP to the industry via open source! \nOur focus: Industry specific language models & large scale QA systems. \n \nSome of our work: \n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")\n- FARM\n- Haystack\n\nGet in touch:\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
-0.010076262056827545,
0.11799279600381851,
-0.00463507417589426,
0.048264652490615845,
0.044729456305503845,
-0.008288214914500713,
0.09648457169532776,
0.09708728641271591,
0.06407333165407181,
0.10703910142183304,
-0.0005532560753636062,
0.0061661615036427975,
0.08590066432952881,
0.11267413944005966,
0.04347671568393707,
-0.23401200771331787,
0.015163620933890343,
-0.10678724199533463,
-0.07356675714254379,
0.07438330352306366,
0.16259293258190155,
-0.09545612335205078,
0.09972570836544037,
-0.007201106753200293,
-0.007562923710793257,
0.032095782458782196,
-0.06088601425290108,
-0.026577886193990707,
0.05430798977613449,
0.027709880843758583,
0.03742998093366623,
-0.04188358038663864,
-0.003510046051815152,
-0.14969460666179657,
0.01949239894747734,
0.050266847014427185,
0.03036799468100071,
0.03989797085523605,
0.055419567972421646,
-0.03952804207801819,
0.005664209369570017,
-0.15092749893665314,
0.010795543901622295,
0.06560848653316498,
-0.1151990294456482,
-0.1520126610994339,
-0.09839203953742981,
0.13833163678646088,
0.048550430685281754,
0.017797471955418587,
-0.04488559439778328,
0.03730298951268196,
-0.09847472608089447,
0.0044957841746509075,
0.09551957994699478,
-0.2536218762397766,
-0.07734165340662003,
0.0075294626876711845,
0.002975424285978079,
0.026867590844631195,
-0.1503300666809082,
0.05693362280726433,
-0.0012981094187125564,
0.028099944815039635,
0.002294316655024886,
-0.026103144511580467,
0.059663716703653336,
0.005745722446590662,
-0.09165142476558685,
0.024391258135437965,
0.14293697476387024,
0.015043729916214943,
-0.04613365978002548,
-0.21122126281261444,
0.014688264578580856,
0.14160552620887756,
-0.026168933138251305,
-0.06078527867794037,
0.03188931941986084,
-0.0157578494399786,
-0.0035696448758244514,
-0.08680811524391174,
-0.09725692868232727,
0.06289225816726685,
0.003594671143218875,
0.15313312411308289,
0.012919100932776928,
-0.008581559173762798,
0.04803206026554108,
0.05989942327141762,
-0.03951193392276764,
-0.14348141849040985,
-0.027921240776777267,
-0.11092864722013474,
-0.040155280381441116,
-0.013827389106154442,
-0.025438645854592323,
-0.04614876210689545,
0.11167967319488525,
0.21570537984371185,
0.03628223389387131,
0.019428197294473648,
0.004804856609553099,
-0.004131913650780916,
0.08236522227525711,
0.18189270794391632,
-0.06891247630119324,
-0.2147095948457718,
-0.0002161872835131362,
-0.04551784321665764,
0.028925729915499687,
-0.014443091116845608,
-0.06291615962982178,
-0.03372548148036003,
-0.024800851941108704,
-0.002720785792917013,
0.05476142466068268,
0.045982424169778824,
-0.09388863295316696,
-0.0945952981710434,
0.06729324907064438,
-0.11897947639226913,
0.06573428213596344,
0.044595640152692795,
-0.0657271072268486,
0.08184625953435898,
-0.0785689502954483,
0.03896258771419525,
-0.03647231683135033,
0.07194720208644867,
-0.02013142965734005,
-0.023019155487418175,
-0.10018777847290039,
-0.0743463858962059,
0.055883269757032394,
-0.048305023461580276,
-0.034956976771354675,
-0.0606108233332634,
-0.022861484438180923,
-0.07327204197645187,
0.10330880433320999,
-0.052212368696928024,
-0.03320029750466347,
-0.04814865067601204,
-0.003213894320651889,
0.050336774438619614,
0.018077975139021873,
-0.05859726667404175,
-0.0024056928232312202,
0.03374353423714638,
-0.10297326743602753,
0.009759807959198952,
-0.022869346663355827,
0.03490648418664932,
-0.06743128597736359,
-0.011890358291566372,
-0.2852066159248352,
0.06687425822019577,
-0.13129998743534088,
0.09234815090894699,
-0.13160771131515503,
-0.020264901220798492,
0.0001537799835205078,
0.04713129997253418,
-0.0010846024379134178,
0.0949888676404953,
-0.07948573678731918,
-0.07646845281124115,
0.12209782004356384,
-0.044718340039253235,
-0.032155800610780716,
0.14408451318740845,
-0.06487710028886795,
0.028322232887148857,
0.10317129641771317,
0.22611968219280243,
0.14217853546142578,
-0.09534378349781036,
-0.05066347122192383,
-0.03658769652247429,
-0.026312436908483505,
0.09255701303482056,
0.0980081707239151,
-0.08876539766788483,
0.08978766947984695,
0.01115621067583561,
-0.07890351861715317,
0.0029769758693873882,
0.010428168810904026,
-0.04381026700139046,
0.05543450266122818,
-0.018847256898880005,
0.12788201868534088,
-0.020450616255402565,
-0.02807011269032955,
-0.09062781929969788,
-0.11252442747354507,
0.0848461166024208,
0.022809801623225212,
0.010196492075920105,
-0.01605512946844101,
-0.07003390789031982,
0.027918871492147446,
0.07586270570755005,
0.009173990227282047,
-0.05362250283360481,
-0.11681714653968811,
0.06926081329584122,
-0.026927370578050613,
0.17231404781341553,
0.03216147422790527,
0.0847582072019577,
-0.019832735881209373,
-0.025075867772102356,
-0.024691130965948105,
-0.05181621387600899,
-0.032345522195100784,
0.039837464690208435,
-0.1808004379272461,
0.010420347563922405,
-0.03528551757335663,
0.07656282931566238,
-0.022803837433457375,
-0.03444923087954521,
0.05044126510620117,
0.1525917500257492,
0.04552382603287697,
-0.03326473757624626,
-0.009181858040392399,
0.005765443667769432,
0.056994933634996414,
-0.018532810732722282,
0.007479802705347538,
-0.017289364710450172,
-0.04598889872431755,
0.03435473144054413,
0.0013232467463240027,
0.023341968655586243,
0.032360244542360306,
-0.027453411370515823,
-0.06857527792453766,
-0.05426488444209099,
-0.054848168045282364,
-0.0158951748162508,
-0.01800333335995674,
-0.11434856057167053,
0.2175167202949524,
0.0020589642226696014,
-0.025011589750647545,
-0.0675828754901886,
-0.09258227795362473,
-0.07373686134815216,
-0.04451398923993111,
-0.03220265731215477,
0.10409439355134964,
-0.06235402822494507,
-0.18154948949813843,
0.13424110412597656,
0.12817998230457306,
0.04545621573925018,
0.2510734498500824,
-0.060267671942710876,
-0.03521435707807541,
-0.027205325663089752,
0.04983197897672653,
-0.03951272740960121,
0.07735079526901245,
-0.006478053983300924,
-0.017299834638834,
0.04431562498211861,
-0.029128869995474815,
-0.010214206762611866,
-0.011796334758400917,
0.050310760736465454,
-0.026205353438854218,
-0.009491601958870888,
0.1126420721411705,
0.01953934319317341,
0.050958920270204544,
0.07717464119195938,
0.11511968821287155,
0.011609761975705624,
0.009743551723659039,
-0.0399806834757328,
-0.03392130136489868,
0.0979132354259491,
-0.14705756306648254,
-0.19046978652477264,
-0.13702794909477234,
-0.046185802668333054,
-0.12061180919408798,
-0.015200498513877392,
-0.002545437077060342,
-0.0663178414106369,
-0.1046430766582489,
-0.0033340987283736467,
0.09950821101665497,
0.0829620510339737,
-0.08301146328449249,
-0.005476564634591341,
-0.0018325427081435919,
0.01539270207285881,
-0.13247157633304596,
-0.042663704603910446,
-0.002010271418839693,
-0.041121140122413635,
-0.03752922639250755,
0.07714998722076416,
0.028309732675552368,
0.03797279670834541,
0.035803087055683136,
0.018949324265122414,
-0.017209507524967194,
0.21121159195899963,
-0.15726761519908905,
0.10990045964717865,
0.09747610241174698,
-0.06208135560154915,
0.0644933357834816,
0.19247718155384064,
0.07950887084007263,
0.008672802709043026,
-0.0011392426677048206,
0.06209087744355202,
0.04129830747842789,
-0.198526531457901,
-0.11779658496379852,
-0.043911729007959366,
-0.023528827354311943,
-0.020885854959487915,
0.047522131353616714,
0.009206089191138744,
-0.010452556423842907,
-0.10042396187782288,
-0.03772057220339775,
0.07893664389848709,
0.05061057582497597,
0.09456230700016022,
0.0010743399616330862,
0.022183142602443695,
-0.031716518104076385,
-0.0911175087094307,
0.10575278103351593,
0.03902414068579674,
0.09564540535211563,
0.05312160775065422,
0.16391004621982574,
0.043920449912548065,
0.043965984135866165,
0.00114324816968292,
-0.016470665112137794,
-0.04251149669289589,
0.008184507489204407,
-0.029630420729517937,
-0.10010498762130737,
0.03371138870716095,
0.09075086563825607,
0.12706835567951202,
-0.05340001359581947,
0.04518405720591545,
-0.06131775677204132,
0.1620912253856659,
0.19120514392852783,
0.025794070214033127,
-0.05541146546602249,
-0.05132489278912544,
0.0409780777990818,
-0.05651609227061272,
-0.062025777995586395,
-0.015256133861839771,
0.028263164684176445,
-0.1595904380083084,
0.07358015328645706,
-0.004819048102945089,
0.07092692703008652,
-0.03981637954711914,
0.03123539686203003,
0.02773606963455677,
0.13103169202804565,
-0.010035164654254913,
0.07794313877820969,
-0.21377204358577728,
0.09511581808328629,
0.022668106481432915,
0.07043289393186569,
-0.0518704392015934,
0.03903113305568695,
0.03557456284761429,
-0.1088491901755333,
0.11719895154237747,
0.014624495059251785,
-0.008681736886501312,
0.015795690938830376,
-0.12065969407558441,
-0.005808557383716106,
0.16455744206905365,
-0.13698498904705048,
0.04693540930747986,
-0.0249300766736269,
-0.026804573833942413,
-0.05570037290453911,
0.02870034985244274,
-0.10139648616313934,
-0.10336356610059738,
0.034683872014284134,
-0.10674478113651276,
0.03703491762280464,
-0.032760653644800186,
0.004241324029862881,
-0.11073895543813705,
0.23462732136249542,
-0.15274986624717712,
-0.07573467493057251,
-0.10872924327850342,
-0.03364962339401245,
0.06525330990552902,
-0.08194896578788757,
0.06418594717979431,
0.008314438164234161,
0.0778558999300003,
-0.021873073652386665,
-0.09774989634752274,
0.07235918194055557,
-0.05836812034249306,
-0.11607681214809418,
0.0007611957844346762,
0.16824759542942047,
0.08238458633422852,
0.0397082194685936,
0.006699766963720322,
0.024490924552083015,
-0.010951966978609562,
-0.1102900356054306,
0.04951261356472969,
0.09009779244661331,
-0.011639726348221302,
0.08703865110874176,
-0.10081376880407333,
-0.1559784710407257,
-0.05452911555767059,
0.03340559080243111,
0.11985572427511215,
0.09752178937196732,
-0.030700087547302246,
0.18822218477725983,
0.16902601718902588,
-0.0392671674489975,
-0.2672116756439209,
-0.005951021332293749,
0.0632006898522377,
0.02562275342643261,
0.0049868859350681305,
-0.2478373795747757,
0.15033511817455292,
0.018105696886777878,
-0.0490347258746624,
0.025326499715447426,
-0.15600627660751343,
-0.10429031401872635,
0.07114414870738983,
-0.06718683242797852,
-0.05678022280335426,
-0.06273342669010162,
-0.08942040801048279,
-0.04242338985204697,
-0.06673683971166611,
0.09098754078149796,
-0.04989638924598694,
0.052128810435533524,
0.05818437039852142,
0.050185803323984146,
0.04428565874695778,
-0.013867378234863281,
0.08696603775024414,
-0.002765684388577938,
0.04321378096938133,
-0.09569669514894485,
-0.06650309264659882,
0.021308034658432007,
-0.035674188286066055,
0.09172506630420685,
-0.04374014586210251,
-0.037152331322431564,
-0.13244162499904633,
-0.015313117764890194,
-0.08241789788007736,
0.13122627139091492,
-0.06084924191236496,
-0.08714477717876434,
-0.09609789401292801,
0.14908331632614136,
0.01738405041396618,
0.023002121597528458,
0.06578230112791061,
-0.014802765101194382,
0.0005768226692453027,
0.02032296359539032,
0.18451587855815887,
0.08673439919948578,
-0.07786227017641068,
-0.039984554052352905,
-0.021857794374227524,
0.06284316629171371,
0.0011499077081680298,
0.04544489458203316,
0.10945631563663483,
-0.012538917362689972,
0.12569449841976166,
-0.03154495358467102,
-0.13993452489376068,
-0.026927167549729347,
0.111336350440979,
-0.12424270063638687,
-0.16065344214439392,
-0.08368698507547379,
-0.01688036136329174,
-0.04646962136030197,
0.013098168186843395,
0.1583913415670395,
0.016089340671896935,
-0.04641351103782654,
0.04721221700310707,
0.06378050148487091,
-0.04357834532856941,
0.017851032316684723,
0.06516104191541672,
0.018519900739192963,
-0.07864509522914886,
0.09885233640670776,
0.08031526952981949,
0.00870531052350998,
0.06317084282636642,
0.117925263941288,
-0.02149968035519123,
-0.012396504171192646,
0.061647552996873856,
0.15365968644618988,
-0.01358597818762064,
-0.026700884103775024,
0.011396203190088272,
-0.10342574864625931,
-0.01828380674123764,
0.033785417675971985,
0.03525371849536896,
0.027899114415049553,
0.03292413055896759,
0.041593968868255615,
0.05907027795910835,
0.11471550911664963,
0.05654902383685112,
0.0039255861192941666,
-0.032465703785419464,
0.012980161234736443,
-0.06591929495334625,
-0.027293933555483818,
-0.02697344683110714,
-0.020429417490959167,
-0.15535752475261688,
-0.045078400522470474,
-0.06519296020269394,
-0.04823625087738037,
-0.03537594899535179,
-0.009415313601493835,
-0.0034652629401534796,
-0.042895425111055374,
0.02073248289525509,
0.015607710927724838,
-0.07328343391418457,
-0.02744656801223755,
0.002918931422755122,
0.1254069060087204,
-0.18962283432483673,
-0.002409343607723713,
0.09315413236618042,
-0.03641340881586075,
0.11497252434492111,
0.02645433135330677,
-0.00584683520719409,
0.02741636522114277,
-0.1461411416530609,
-0.058474913239479065,
-0.03915352374315262,
0.018178829923272133,
0.029811911284923553,
-0.06992508471012115,
-0.02289935015141964,
-0.0331365205347538,
-0.05193560943007469,
0.0075509860180318356,
-0.0031362713780254126,
-0.08608588576316833,
0.1319718360900879,
0.014921060763299465,
-0.11415012180805206,
-0.05418340489268303,
0.04493483901023865,
0.07566209882497787,
0.019485531374812126,
0.08605777472257614,
-0.08143837749958038,
0.034848652780056,
-0.06277897208929062,
0.02487202361226082,
0.043279532343149185,
-0.04576931893825531,
-0.1314527988433838,
-0.01733124442398548,
0.04124100133776665,
-0.015490954741835594,
0.08228694647550583,
0.021919259801506996,
-0.12201227247714996,
0.039780210703611374,
-0.007505152374505997,
-0.04999559000134468,
0.08293037861585617,
0.028049875050783157,
-0.03169306367635727,
-0.016510246321558952,
-0.06845242530107498,
-0.0560295395553112,
-0.040178071707487106,
-0.03798379376530647,
0.16057711839675903,
0.26118069887161255,
0.13252556324005127,
0.010100631043314934,
0.15246281027793884,
-0.03384814038872719,
-0.1384815275669098,
0.0809774100780487,
0.04530653357505798,
0.06997280567884445,
-0.09900085628032684,
0.06763248890638351,
0.07971484214067459,
-0.21198192238807678,
0.09828577190637589,
-0.024918928742408752,
-0.015692593529820442,
-0.02536407858133316,
-0.1961783766746521,
-0.0741412490606308,
-0.038212791085243225,
0.020338842645287514,
-0.11449131369590759,
0.05138474330306053,
-0.018490854650735855,
0.07060602307319641,
-0.08556463569402695,
0.14588500559329987,
-0.1632848083972931,
-0.06018175557255745,
0.15338775515556335,
0.02350192703306675,
0.03258044272661209,
0.04431097209453583,
-0.04925580322742462,
-0.08606718480587006,
0.1292455494403839,
0.04390767216682434,
0.05361088365316391,
-0.017207162454724312,
-0.07841021567583084,
-0.05158967524766922,
-0.09634403139352798,
0.006949125323444605,
-0.05434761568903923,
-0.028080357238650322,
0.09307447820901871,
0.011582284234464169,
-0.03182411938905716,
-0.01719639077782631,
0.15919217467308044,
-0.03337603807449341,
-0.09535780549049377,
-0.1375117003917694,
0.025893788784742355,
-0.02854527346789837,
0.04352722689509392,
-0.005868234671652317,
-0.12370892614126205,
-0.04080852121114731,
0.0856277272105217,
0.22269664704799652,
-0.0794607624411583,
0.02374427765607834,
0.0030227857641875744,
0.02954898774623871,
0.013348494656383991,
0.11132489889860153,
-0.04456549137830734,
0.28536224365234375,
-0.020825447514653206,
0.03647946938872337,
0.01893310807645321,
-0.05166807398200035,
-0.14389856159687042,
0.09518919140100479,
0.05116274952888489,
-0.04573938623070717,
-0.08648153394460678,
0.13774868845939636,
-0.0994146317243576,
-0.15492358803749084,
-0.0735657587647438,
-0.12430807203054428,
-0.16738136112689972,
-0.05945909768342972,
0.05951033532619476,
0.0749952420592308,
0.07899538427591324,
0.05101965367794037,
-0.0564342699944973,
0.0783759206533432,
0.026933232322335243,
0.006838582456111908,
0.009573022834956646,
0.12648789584636688,
-0.055659063160419464,
0.18031086027622223,
0.03322230651974678,
0.01586860604584217,
0.09125618636608124,
-0.03190752491354942,
-0.04355287924408913,
-0.0681413933634758,
0.040712371468544006,
-0.15112854540348053,
-0.035537559539079666,
0.10239584743976593,
0.009971576742827892,
0.09057474136352539,
0.0844484344124794,
-0.027188215404748917,
0.037065327167510986,
0.12562406063079834,
-0.03229586407542229,
-0.11167768388986588,
0.1382594108581543,
-0.1430405080318451,
0.1365891695022583,
0.17476306855678558,
-0.016451645642518997,
-0.02482530102133751,
-0.05047597736120224,
0.03210129588842392,
0.05752192437648773,
0.06536311656236649,
-0.06301984935998917,
-0.1585555374622345,
0.0017345277592539787,
0.0006121656042523682,
0.057313889265060425,
-0.07732192426919937,
-0.08493008464574814,
-0.017527086660265923,
0.11741141974925995,
-0.056772757321596146,
0.08720961958169937,
0.11325816065073013,
-0.007477071136236191,
0.013347581960260868,
-0.03704746812582016,
-0.001641679904423654,
0.051686566323041916,
-0.07057751715183258,
0.015274904668331146
] |
null | null |
transformers
|
# MiniLM-L12-H384-uncased for QA
## Overview
**Language model:** microsoft/MiniLM-L12-H384-uncased
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See an [example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/01_basic_qa_pipeline)
**Infrastructure:** 1x Tesla V100
## Hyperparameters
```
seed = 42
batch_size = 12
n_epochs = 4
base_LM_model = "microsoft/MiniLM-L12-H384-uncased"
max_seq_len = 384
learning_rate = 4e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride = 128
max_query_length = 64
grad_acc_steps = 4
```
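For reference, these hyperparameters roughly map onto a Haystack fine-tuning call as in the sketch below. This is a sketch only, assuming Haystack 1.x's `FARMReader.train` API; the data paths are placeholders:

```python
from haystack.nodes import FARMReader

reader = FARMReader(model_name_or_path="microsoft/MiniLM-L12-H384-uncased",
                    max_seq_len=384, doc_stride=128)
reader.train(
    data_dir="data/squad20",           # placeholder path
    train_filename="train-v2.0.json",  # SQuAD 2.0 training file
    n_epochs=4,
    batch_size=12,
    learning_rate=4e-5,
    warmup_proportion=0.2,
    save_dir="minilm-uncased-squad2",
)
```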
## Performance
Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).
```
"exact": 76.13071675229513,
"f1": 79.49786500219953,
"total": 11873,
"HasAns_exact": 78.35695006747639,
"HasAns_f1": 85.10090269418276,
"HasAns_total": 5928,
"NoAns_exact": 73.91084945332211,
"NoAns_f1": 73.91084945332211,
"NoAns_total": 5945
```
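If you want to recompute these metrics without the CodaLab worksheet, the Hugging Face `evaluate` library ships a `squad_v2` metric with the same output fields. A minimal sketch (the prediction/reference pair is made up for illustration):

```python
import evaluate

squad_v2 = evaluate.load("squad_v2")
predictions = [{"id": "q1", "prediction_text": "Paris",
                "no_answer_probability": 0.0}]
references = [{"id": "q1",
               "answers": {"text": ["Paris"], "answer_start": [0]}}]
print(squad_v2.compute(predictions=predictions, references=references))
# Reports exact, f1, and HasAns_*/NoAns_* splits like the official script.
```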
## Usage
### In Haystack
To do QA at scale (i.e., over many documents instead of a single paragraph), you can also load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
# Imports assume Haystack 1.x, where both reader classes live in haystack.nodes.
from haystack.nodes import FARMReader, TransformersReader

reader = FARMReader(model_name_or_path="deepset/minilm-uncased-squad2")
# or
reader = TransformersReader(model="deepset/minilm-uncased-squad2", tokenizer="deepset/minilm-uncased-squad2")
```
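For an end-to-end pipeline over a document store, a minimal sketch (assuming Haystack 1.x APIs: `InMemoryDocumentStore` with BM25 enabled, `BM25Retriever`, and `ExtractiveQAPipeline`; the document content is made up):

```python
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([{"content": "Berlin is the capital of Germany."}])

retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/minilm-uncased-squad2")

pipe = ExtractiveQAPipeline(reader, retriever)
result = pipe.run(query="What is the capital of Germany?",
                  params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 3}})
print(result["answers"][0].answer)
```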
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/minilm-uncased-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
    'question': 'Why is model conversion important?',
    'context': 'The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
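The pipeline in (a) handles answer decoding for you. If you load the model and tokenizer manually as in (b), a minimal greedy decode looks like the sketch below (it ignores SQuAD 2.0-style no-answer handling, where span scores are compared against the [CLS] score):

```python
import torch

inputs = tokenizer(QA_input["question"], QA_input["context"], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Greedy span: highest-scoring start and end positions.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```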
## Authors
**Vaishali Pal:** [email protected]
**Branden Chan:** [email protected]
**Timo Möller:** [email protected]
**Malte Pietsch:** [email protected]
**Tanay Soni:** [email protected]
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"], "model-index": [{"name": "deepset/minilm-uncased-squad2", "results": [{"task": {"type": "question-answering", "name": "Question Answering"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "config": "squad_v2", "split": "validation"}, "metrics": [{"type": "exact_match", "value": 76.1921, "name": "Exact Match", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmViZTQ3YTBjYTc3ZDQzYmI1Mzk3MTAxM2MzNjdmMTc0MWY4Yzg2MWU3NGQ1MDJhZWI2NzY0YWYxZTY2OTgzMiIsInZlcnNpb24iOjF9.s4XCRs_pvW__LJ57dpXAEHD6NRsQ3XaFrM1xaguS6oUs5fCN77wNNc97scnfoPXT18A8RAn0cLTNivfxZm0oBA"}, {"type": "f1", "value": 79.5483, "name": "F1", "verified": true, "verifyToken": "eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmJlYTIyOTg2NjMyMzg4NzNlNGIzMTY2NDVkMjg0ODdiOWRmYjVkZDYyZjBjNWNiNTBhNjcwOWUzMDM4ZWJiZiIsInZlcnNpb24iOjF9.gxpwIBBA3_5xPi-TaZcqWNnGgCiHzxaUNgrS2jucxoVWGxhBtnPdwKVCxLleQoDDZenAXB3Yh71zMP3xTSeHCw"}]}]}]}
|
question-answering
|
deepset/minilm-uncased-squad2
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us
|
# MiniLM-L12-H384-uncased for QA
## Overview
Language model: microsoft/MiniLM-L12-H384-uncased
Language: English
Downstream-task: Extractive QA
Training data: SQuAD 2.0
Eval data: SQuAD 2.0
Code: See an example QA pipeline on Haystack
Infrastructure: 1x Tesla v100
## Hyperparameters
## Performance
Evaluated on the SQuAD 2.0 dev set with the official eval script.
## Usage
### In Haystack
For doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in Haystack:
### In Transformers
## Authors
Vaishali Pal: URL@URL
Branden Chan: URL@URL
Timo Möller: timo.moeller@URL
Malte Pietsch: malte.pietsch@URL
Tanay Soni: URL@URL
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# MiniLM-L12-H384-uncased for QA",
"## Overview\nLanguage model: microsoft/MiniLM-L12-H384-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack\nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.",
"## Usage",
"### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in Haystack:",
"### In Transformers",
"## Authors\nVaishali Pal: URL@URL \nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n",
"# MiniLM-L12-H384-uncased for QA",
"## Overview\nLanguage model: microsoft/MiniLM-L12-H384-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack\nInfrastructure: 1x Tesla v100",
"## Hyperparameters",
"## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.",
"## Usage",
"### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in Haystack:",
"### In Transformers",
"## Authors\nVaishali Pal: URL@URL \nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
65,
16,
71,
5,
19,
3,
36,
6,
49,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #model-index #endpoints_compatible #has_space #region-us \n# MiniLM-L12-H384-uncased for QA## Overview\nLanguage model: microsoft/MiniLM-L12-H384-uncased \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD 2.0 \nEval data: SQuAD 2.0 \nCode: See an example QA pipeline on Haystack\nInfrastructure: 1x Tesla v100## Hyperparameters## Performance\nEvaluated on the SQuAD 2.0 dev set with the official eval script.## Usage### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in Haystack:### In Transformers## Authors\nVaishali Pal: URL@URL \nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL"
] |
[
-0.11835096776485443,
0.1658633053302765,
-0.005803135689347982,
0.06522215157747269,
0.12533648312091827,
0.01581316441297531,
0.1738072782754898,
0.09350883215665817,
-0.009541911073029041,
0.017498653382062912,
0.05007525905966759,
0.01752236671745777,
0.0701986774802208,
0.06797461211681366,
0.004205991048365831,
-0.21409845352172852,
-0.00875399075448513,
-0.0735253393650055,
-0.12368083000183105,
0.08797422051429749,
0.10909365862607956,
-0.08203265815973282,
0.08678104728460312,
0.031851109117269516,
-0.042619623243808746,
0.030537405982613564,
0.005551839247345924,
-0.07829708606004715,
0.10242323577404022,
0.05902274325489998,
0.10797973722219467,
0.013921800069510937,
0.07843536883592606,
-0.11329519003629684,
0.04138012230396271,
0.07784061878919601,
0.002665888983756304,
0.040177084505558014,
0.09182071685791016,
0.005714972037822008,
0.0643206536769867,
-0.003520750440657139,
0.003410671604797244,
0.06598544865846634,
-0.04821057617664337,
-0.15490436553955078,
-0.012983221560716629,
0.04257980361580849,
0.07806572318077087,
0.08716773241758347,
-0.00885866954922676,
0.1596180945634842,
-0.11165471374988556,
0.11323367059230804,
0.12426932901144028,
-0.2596088945865631,
-0.05976768583059311,
0.006712807342410088,
0.03589490056037903,
0.03307800367474556,
-0.07644928991794586,
-0.027649568393826485,
0.044334158301353455,
0.013977485708892345,
0.03028658591210842,
-0.0725819319486618,
-0.04189375415444374,
0.04339161515235901,
-0.09587379544973373,
0.004274282604455948,
0.1802472323179245,
0.009764951653778553,
-0.05128316208720207,
-0.0458916574716568,
-0.1176314651966095,
0.09260912239551544,
-0.026535119861364365,
0.028493110090494156,
0.0025424466002732515,
0.0073740603402256966,
-0.004967821761965752,
0.004641314502805471,
-0.12913750112056732,
-0.037296317517757416,
0.0027051731012761593,
-0.026678547263145447,
0.06659308820962906,
0.02690873108804226,
-0.11854805052280426,
0.06256020814180374,
0.09342136234045029,
-0.16246546804904938,
-0.034529175609350204,
-0.12397465854883194,
-0.06302303820848465,
0.033315010368824005,
-0.0033559321891516447,
0.0038617956452071667,
0.10483210533857346,
0.07309900969266891,
0.060727640986442566,
0.060825712978839874,
0.07385227084159851,
0.008296782150864601,
0.002597093116492033,
0.20814232528209686,
-0.09506566822528839,
-0.06525380164384842,
0.06002190709114075,
0.03613362833857536,
-0.018204743042588234,
-0.030710386112332344,
-0.06803393363952637,
0.007327904459089041,
0.03480984643101692,
0.09384752810001373,
0.02732517570257187,
0.016527613624930382,
-0.037190280854701996,
-0.029299374669790268,
0.056202638894319534,
-0.10156293213367462,
0.025014763697981834,
0.05925046280026436,
-0.013136797584593296,
0.18657788634300232,
-0.02506677247583866,
0.04047713428735733,
-0.10796674340963364,
0.02548428811132908,
-0.07373978197574615,
-0.021560339257121086,
-0.025891689583659172,
-0.06841522455215454,
0.022937318310141563,
0.003923384472727776,
0.04034821689128876,
-0.18833929300308228,
-0.09301044791936874,
-0.014255810528993607,
0.05682773143053055,
-0.01318920124322176,
-0.022418927401304245,
0.02167780138552189,
-0.10010667890310287,
0.021893229335546494,
0.007367817685008049,
0.003860423807054758,
-0.06869102269411087,
0.042866598814725876,
-0.00823211669921875,
0.03010961227118969,
0.006839483045041561,
0.004087402019649744,
-0.10148843377828598,
-0.008275520987808704,
-0.032509781420230865,
0.04548889026045799,
-0.0693620815873146,
0.08906124532222748,
-0.09243142604827881,
-0.052396390587091446,
0.07305749505758286,
0.021965529769659042,
0.031088443472981453,
0.16198278963565826,
-0.1786777526140213,
-0.05747107043862343,
0.1566738784313202,
-0.0480080172419548,
-0.15650536119937897,
0.12871676683425903,
0.023156503215432167,
0.04292399063706398,
0.08548960834741592,
0.09876077622175217,
0.13681596517562866,
-0.2289622575044632,
-0.05669873580336571,
0.07116824388504028,
0.05483490973711014,
-0.011138098314404488,
0.10604584962129593,
0.010441236197948456,
0.10552768409252167,
0.021663067862391472,
0.007298525422811508,
0.0033353958278894424,
-0.012217334471642971,
-0.07928666472434998,
-0.03275296837091446,
-0.04115133732557297,
-0.062173712998628616,
-0.007420964073389769,
0.00744905648753047,
0.003887192113325,
-0.11531289666891098,
-0.05378765985369682,
0.0965651124715805,
-0.04270274192094803,
0.021114531904459,
-0.1120169535279274,
0.028468318283557892,
-0.010098486207425594,
0.0306862760335207,
-0.12863411009311676,
-0.09567725658416748,
0.025922482833266258,
-0.04884905740618706,
0.0028233022894710302,
0.05039867013692856,
0.028254209086298943,
0.045013703405857086,
0.006074517033994198,
-0.06708493083715439,
-0.033992692828178406,
-0.021064797416329384,
-0.07169866561889648,
-0.14956916868686676,
-0.0170627161860466,
-0.05318382754921913,
-0.004421187564730644,
-0.07520455867052078,
0.02241625264286995,
0.08784128725528717,
0.08416200429201126,
0.07154970616102219,
0.03329254314303398,
0.013253191486001015,
0.024077214300632477,
-0.041426606476306915,
-0.06010928004980087,
0.03440134599804878,
0.014534968882799149,
-0.05218026414513588,
0.01254724059253931,
-0.024360105395317078,
0.14809775352478027,
0.09983182698488235,
-0.021588539704680443,
0.0258325282484293,
0.04517567902803421,
-0.034655455499887466,
-0.004474904388189316,
-0.03826514631509781,
-0.06396016478538513,
0.1245802715420723,
0.06549960374832153,
0.09772256016731262,
-0.11155331879854202,
-0.044013168662786484,
0.0061667729169130325,
-0.04226810112595558,
0.023757414892315865,
0.12354736030101776,
0.060712799429893494,
-0.03333874046802521,
0.022544993087649345,
0.16852310299873352,
-0.027504101395606995,
0.13720452785491943,
-0.06802365183830261,
-0.08058995008468628,
-0.01771172136068344,
-0.015204638242721558,
0.03571994975209236,
0.0910755917429924,
-0.10435906052589417,
0.019178342074155807,
0.045911721885204315,
0.05023137480020523,
0.01933298259973526,
-0.11605986207723618,
-0.009351721964776516,
-0.025508463382720947,
-0.014334297738969326,
-0.09801621735095978,
0.04614437371492386,
0.01494268886744976,
0.07699060440063477,
-0.010647108778357506,
0.012357081286609173,
-0.026367727667093277,
-0.059944167733192444,
-0.0918017327785492,
0.22655902802944183,
-0.0688638910651207,
-0.1661803275346756,
-0.10328938812017441,
-0.013150344602763653,
-0.04377312958240509,
-0.030728425830602646,
0.08660916239023209,
-0.13667601346969604,
-0.09265778958797455,
-0.042190298438072205,
0.040290817618370056,
0.03298095613718033,
0.004173357039690018,
-0.022903775796294212,
0.05283217504620552,
0.038255177438259125,
-0.14995278418064117,
0.01947060413658619,
0.04474909231066704,
-0.04577987641096115,
0.019580261781811714,
0.012375690042972565,
0.10589519888162613,
0.07754933834075928,
0.04104655608534813,
-0.0019133917521685362,
-0.007182934787124395,
0.34510180354118347,
-0.07242528349161148,
0.07860878854990005,
0.2379303276538849,
0.016533710062503815,
0.06924594938755035,
0.17510652542114258,
0.04464675486087799,
-0.05569842830300331,
0.020792175084352493,
0.0050205965526402,
-0.03336658701300621,
-0.2503873407840729,
-0.027149904519319534,
-0.08073685318231583,
0.04174989461898804,
-0.006113356910645962,
0.05745764449238777,
-0.1586243063211441,
0.039050061255693436,
-0.06279640644788742,
0.044451210647821426,
-0.04281258210539818,
0.08038422465324402,
0.05540912598371506,
0.05049082264304161,
0.07708509266376495,
-0.07260619848966599,
-0.01582300290465355,
0.09531743824481964,
0.14233998954296112,
0.1536836475133896,
-0.04709967225790024,
0.16094286739826202,
0.06483602523803711,
0.13692940771579742,
0.03744814917445183,
0.061063289642333984,
0.00813007727265358,
0.0036535633262246847,
-0.009366854093968868,
-0.07404768466949463,
0.01750798337161541,
0.09317295253276825,
-0.04557482898235321,
-0.0014212413225322962,
-0.04844195395708084,
0.10113447159528732,
0.06684792041778564,
0.2347378432750702,
0.06549222022294998,
-0.16240659356117249,
-0.07424893975257874,
0.0832693800330162,
-0.04051581770181656,
-0.03036913275718689,
0.06653063744306564,
0.052859432995319366,
-0.12649944424629211,
0.012553424574434757,
-0.021064670756459236,
0.12995894253253937,
-0.033659931272268295,
0.03457266092300415,
0.04505953565239906,
0.03395550698041916,
0.01122717373073101,
0.0816921815276146,
-0.267967164516449,
0.1369432508945465,
0.023942722007632256,
0.12267975509166718,
-0.035358645021915436,
0.05501782149076462,
0.04121178016066551,
0.05691469833254814,
0.10080169886350632,
-0.015669194981455803,
-0.03972437605261803,
-0.13069498538970947,
-0.07200349867343903,
0.11866466701030731,
0.04856650531291962,
-0.007669779472053051,
0.12424734979867935,
-0.0748198926448822,
-0.00007892644498497248,
-0.011434205807745457,
0.03684787452220917,
-0.13494983315467834,
-0.13691207766532898,
0.022077221423387527,
0.027099791914224625,
0.0099069494754076,
-0.08086847513914108,
-0.055677834898233414,
-0.050317876040935516,
0.05036075413227081,
-0.22102154791355133,
-0.1405894160270691,
-0.11449340730905533,
-0.07595893740653992,
0.06200908124446869,
-0.12612661719322205,
-0.024962279945611954,
-0.05593226104974747,
0.08497484028339386,
0.04693274199962616,
-0.08942554146051407,
0.05129457637667656,
-0.10385216772556305,
-0.1497582197189331,
-0.03464856743812561,
0.10476770997047424,
-0.03098205476999283,
0.016300952062010765,
0.015503840520977974,
-0.03594445437192917,
-0.1337457150220871,
-0.1274254024028778,
-0.025284431874752045,
0.06321655958890915,
0.041574351489543915,
0.0020445052068680525,
-0.04512389376759529,
-0.061224233359098434,
-0.012142601422965527,
0.011281775310635567,
0.08138974756002426,
0.20908109843730927,
-0.034345593303442,
0.05497584864497185,
0.17656682431697845,
-0.037469446659088135,
-0.21391935646533966,
-0.08722727000713348,
0.02203236147761345,
0.019275518134236336,
-0.009360315278172493,
-0.10080133378505707,
0.13162986934185028,
0.003537936368957162,
0.0007720540161244571,
0.00032422281219623983,
-0.2635834217071533,
-0.11059524118900299,
0.03172430023550987,
0.07025430351495743,
0.006100809201598167,
-0.17298276722431183,
-0.004958821460604668,
-0.0726812556385994,
-0.17826028168201447,
0.03511631116271019,
-0.16779081523418427,
0.08410144597291946,
0.0049156975001096725,
0.05979997292160988,
-0.024862326681613922,
-0.04441565275192261,
0.11790448427200317,
-0.018658243119716644,
0.00397747615352273,
0.00223738607019186,
0.010641152039170265,
0.12026643007993698,
-0.011294066905975342,
0.027088459581136703,
-0.06361071020364761,
0.08143985271453857,
-0.12292425334453583,
-0.01559192780405283,
-0.03135543316602707,
0.035908233374357224,
-0.08297253400087357,
-0.08613459020853043,
-0.04067309945821762,
0.031101612374186516,
0.059687066823244095,
-0.02466604858636856,
0.053364627063274384,
-0.029536264017224312,
0.05188556760549545,
0.17106559872627258,
0.10108216106891632,
-0.10139820724725723,
-0.10070148855447769,
-0.042173340916633606,
-0.02026963420212269,
0.11402982473373413,
-0.21957196295261383,
0.07467672228813171,
0.11760294437408447,
0.0033473072107881308,
0.01629151590168476,
0.03964914008975029,
-0.0732843279838562,
0.083383709192276,
0.046519968658685684,
-0.08696764707565308,
-0.1352674961090088,
-0.04773934930562973,
-0.011860806494951248,
-0.11900048702955246,
0.09008636325597763,
0.19614408910274506,
-0.03336583450436592,
-0.017985563725233078,
0.010971846990287304,
-0.005990968551486731,
0.007508807349950075,
0.13384731113910675,
0.08740344643592834,
0.06012096628546715,
-0.08713670074939728,
0.04838838800787926,
0.02139158546924591,
-0.029307076707482338,
-0.0008076821104623377,
0.11294694989919662,
-0.10358549654483795,
-0.07196581363677979,
-0.006191596854478121,
0.09015867859125137,
-0.052809182554483414,
-0.05263553559780121,
-0.09307420998811722,
-0.08294246345758438,
0.04685845971107483,
-0.002520971233025193,
0.0290527381002903,
0.007701055146753788,
-0.052963919937610626,
-0.017569521442055702,
-0.03855085372924805,
0.13944494724273682,
0.027464693412184715,
-0.003125283168628812,
-0.20257291197776794,
-0.007234860677272081,
-0.04116949066519737,
0.09616989642381668,
-0.02301926724612713,
-0.008711842820048332,
-0.14035257697105408,
0.010356027632951736,
-0.27334776520729065,
0.07797355949878693,
-0.015162963420152664,
0.02975936233997345,
-0.046114806085824966,
-0.057176847010850906,
-0.06367423385381699,
0.054733630269765854,
-0.05061521753668785,
-0.021656686440110207,
-0.03759125992655754,
0.0867874026298523,
-0.13826602697372437,
0.05180602893233299,
0.03160148113965988,
-0.05704959109425545,
0.11541559547185898,
-0.013347461819648743,
-0.09460199624300003,
0.079910509288311,
-0.02961169369518757,
-0.004463793244212866,
-0.01462454441934824,
0.06040911376476288,
0.08724911510944366,
-0.005528004840016365,
0.009638145565986633,
-0.01583021879196167,
-0.0017762419302016497,
-0.009091106243431568,
0.09319096058607101,
-0.06423594802618027,
0.01666000857949257,
-0.007118712179362774,
-0.048476435244083405,
-0.053653206676244736,
0.009487749077379704,
0.06481298804283142,
0.03800484538078308,
0.17090079188346863,
-0.03892574831843376,
0.07471746951341629,
-0.1624099165201187,
-0.008909881114959717,
0.05104798823595047,
-0.031124191358685493,
-0.1242002621293068,
-0.06971127539873123,
0.05570971220731735,
-0.0632585734128952,
0.07521145045757294,
-0.034066759049892426,
0.17184382677078247,
0.0477023646235466,
-0.027859756723046303,
-0.017403578385710716,
0.046125058084726334,
0.1314615160226822,
0.05451086536049843,
0.030437100678682327,
-0.005202777683734894,
-0.055677276104688644,
0.03422749042510986,
0.07616625726222992,
0.11836947500705719,
0.1298176646232605,
0.09162098914384842,
0.09221496433019638,
-0.0009990644175559282,
-0.004862592555582523,
-0.20094536244869232,
-0.052799563854932785,
-0.007951161824166775,
0.06624133884906769,
-0.06546033173799515,
-0.006045670248568058,
0.23213829100131989,
-0.095008984208107,
-0.017811957746744156,
-0.08559500426054001,
-0.061051033437252045,
-0.13005220890045166,
-0.08801772445440292,
-0.08004774153232574,
-0.07838528603315353,
-0.009905745275318623,
-0.13184133172035217,
-0.02207557111978531,
0.04044603928923607,
0.03132010996341705,
-0.027152031660079956,
0.10978801548480988,
0.04891073331236839,
-0.0577494241297245,
0.07267620414495468,
0.02496650069952011,
0.022019529715180397,
0.0993541032075882,
0.04328291863203049,
-0.02977299690246582,
-0.0034096206072717905,
-0.005317541770637035,
0.04734291508793831,
-0.054354842752218246,
0.023348521441221237,
-0.12962467968463898,
-0.06607905775308609,
-0.011157682165503502,
0.06505822390317917,
-0.03440633788704872,
0.08306317031383514,
0.050738316029310226,
-0.02270282618701458,
0.029599785804748535,
0.255255788564682,
-0.0858575627207756,
-0.1091814711689949,
-0.17630824446678162,
0.05881280452013016,
-0.06319396197795868,
0.031652335077524185,
0.029560832306742668,
-0.08255156129598618,
-0.07784775644540787,
0.25546786189079285,
0.11749988049268723,
-0.14698034524917603,
0.007847683504223824,
-0.013056203722953796,
0.005260915961116552,
-0.09496484696865082,
0.20584052801132202,
0.11351870000362396,
0.2456800490617752,
-0.049850091338157654,
0.01236662920564413,
-0.003410290228202939,
-0.047577958554029465,
-0.15827932953834534,
0.13145726919174194,
-0.007226066198199987,
-0.04770609363913536,
-0.04842084273695946,
0.12955284118652344,
-0.12455300241708755,
-0.09028462320566177,
-0.06804915517568588,
-0.07565333694219589,
-0.13393279910087585,
-0.02625906653702259,
0.11699697375297546,
0.046455346047878265,
0.06059665232896805,
-0.03805680572986603,
-0.0012915197294205427,
0.1771266758441925,
0.0016361142043024302,
-0.004206511192023754,
-0.0035603672731667757,
0.09412635862827301,
-0.11068002879619598,
0.11075349897146225,
0.008735126815736294,
0.05993376299738884,
0.1332756131887436,
0.0033833440393209457,
-0.12686270475387573,
0.027184562757611275,
0.10947216302156448,
-0.14192181825637817,
0.01708710752427578,
0.005376041866838932,
-0.007343594450503588,
0.036405064165592194,
0.09283889830112457,
-0.010866697877645493,
0.009388159960508347,
0.05123595520853996,
0.033854421228170395,
-0.1063380017876625,
0.06343230605125427,
-0.09750083833932877,
0.08028530329465866,
0.09305819123983383,
-0.07631374895572662,
-0.01955130696296692,
-0.008156669326126575,
0.10142980515956879,
0.008104439824819565,
-0.019202589988708496,
0.0012242518132552505,
-0.15179228782653809,
0.002936901990324259,
0.033555153757333755,
0.062448110431432724,
-0.12207722663879395,
-0.01963278092443943,
-0.02211284264922142,
0.016839951276779175,
-0.0840175524353981,
0.15215559303760529,
0.03822110593318939,
-0.024579457938671112,
-0.013242644257843494,
-0.08111178874969482,
-0.054809506982564926,
0.09465400129556656,
-0.11048994958400726,
-0.10232315957546234
] |
null | null |
transformers
|
This language model was trained using sentence_transformers (https://github.com/UKPLab/sentence-transformers).
Training started from bert-base-nli-stsb-mean-tokens and continued on the Quora question deduplication dataset (https://www.kaggle.com/c/quora-question-pairs).
See train_script.py for the exact script used.
Below is the performance over the course of training:
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1000,0.5944576426835938,0.6010801382777033,0.5942803776859142,0.5934485776801595,0.5939676679774666,0.593162725602328,0.5905591590826669,0.5921674789994058
0,2000,0.6404080440207146,0.6416811632113405,0.6384419354012121,0.6352050423100778,0.6379917744471867,0.6347884067391001,0.6410544760582826,0.6379252046791412
0,3000,0.6710168301884945,0.6676529324662036,0.6660195209784969,0.6618423144808695,0.6656461098096684,0.6615366331956389,0.6724401903484759,0.666073727723655
0,4000,0.6886373265097949,0.6808948140300153,0.67907655686838,0.6714218133850957,0.6786809551564443,0.6711577956884357,0.6926435869763303,0.68190855298609
0,5000,0.6991409753700026,0.6919630610321864,0.6991041519437052,0.6868961486499775,0.6987076032270729,0.6865385550504007,0.7035518148330993,0.6916275246101342
0,6000,0.7120367327025509,0.6975005265298305,0.7065567493967201,0.6922375503495235,0.7060005509843024,0.6916475765570651,0.7147094303373102,0.6981390706722722
0,7000,0.7254672394728687,0.7130118465900485,0.7261844956277705,0.7086213543110718,0.7257479964972307,0.7079315661881832,0.728729909455115,0.7122743793160531
0,8000,0.7402421930101399,0.7216774208330149,0.7367901914441078,0.7166256588352043,0.7362607046874481,0.7158881916281887,0.7433902441373252,0.7220998491980078
0,9000,0.7381005358120434,0.7197216844469877,0.7343228719349923,0.7139462687943793,0.7345247569255238,0.7145106206467152,0.7421843672419275,0.720686853053079
0,10000,0.7465436564646095,0.7260327107480364,0.7467524239596304,0.7230195666847953,0.7467721566237211,0.7231367593302213,0.749792199122442,0.7263143296580317
0,11000,0.7521805421706547,0.7323771570146701,0.7530672061250105,0.729223203496722,0.7530616532823367,0.7293818369675622,0.7552399002305836,0.7320808333541338
0,12000,0.7579359969644401,0.7340677616737238,0.7570017235719905,0.7305965412825544,0.7570601853520393,0.730718189957289,0.7611254136080384,0.7351501229591327
0,-1,0.7573407371218097,0.7329952035782198,0.755595312163209,0.7291445551777086,0.7557737117990928,0.7295404703700227,0.7607276219361719,0.7342415455980179
1,1000,0.7619907683805341,0.7374667949734767,0.7629820517114324,0.7330364216044966,0.7628369522755882,0.7331912674450544,0.7658583898073758,0.7381503446695727
1,2000,0.7618972640071228,0.7362151058969478,0.764582212425539,0.7335856230046062,0.7643125513700815,0.7334501607097152,0.7652852805583232,0.7369104639809163
1,3000,0.7687362955240467,0.7404674623181671,0.7708304819979073,0.7380959815601529,0.7707835692712482,0.7379796800453193,0.772074854759756,0.7414513460702766
1,4000,0.7685047787908202,0.7403088288815168,0.7703522257474043,0.7379787888808298,0.7701221475099808,0.7377898546753812,0.7713755359045312,0.7409415801952219
1,5000,0.7696438109797803,0.7410393893292365,0.773270389327895,0.7392953127251652,0.7729880866533291,0.7389853982789335,0.7726236305835863,0.7416278035580925
1,6000,0.7749538363837081,0.7436499342062207,0.774879168058157,0.7401827241766746,0.7745754601165837,0.739763415043146,0.7788801166152383,0.7446249060022169
1,7000,0.7794560817870597,0.7480970176267153,0.7803506944510302,0.7453305130502859,0.7799867949176531,0.7447100155494814,0.7828208193123926,0.7486740690324809
1,8000,0.7855844359073243,0.7496742172376921,0.7828816645965887,0.747176409009761,0.7827584875358967,0.7471037762845532,0.7879159073496309,0.7507349669102151
1,9000,0.7844110753729492,0.7507746252693759,0.7847208586489722,0.7485172180290892,0.7846408087474059,0.748491818820158,0.7872061334510225,0.7514470349769437
1,10000,0.7881311227435004,0.7530048509727403,0.7886917756879734,0.7508018068765787,0.7883332502188707,0.7505037008187275,0.7910707228932787,0.7537200382362567
1,11000,0.7883300109606874,0.7513494487126553,0.7879329130497712,0.749818368689255,0.7876525616593218,0.7494872882301785,0.7911454269743292,0.7522843165147303
1,12000,0.7853334933336618,0.7516809747712728,0.7893895316714998,0.749780492728257,0.7890075986655403,0.7494079715118533,0.7885959664070629,0.7523827940133203
1,-1,0.7887529238148887,0.7534076729932393,0.7896864404801204,0.7513080079201105,0.7894077512343298,0.7510009899066772,0.7919617393746149,0.7542173273241598
2,1000,0.7919209063905188,0.7550167329363414,0.7917464066515253,0.7523043685293455,0.7914371703225378,0.7520285423781206,0.7950297421784158,0.7562599556207076
2,2000,0.7924507768792486,0.7542908512484463,0.7934519001953887,0.7517491515010692,0.7931885648751081,0.751521004535999,0.7951637852162545,0.7551495215642072
2,3000,0.7937606244038364,0.755599577136169,0.7933633347508111,0.7527922999916203,0.7931581019714242,0.7527132061436363,0.797275652800117,0.7569827180764233
2,4000,0.7938389298721445,0.7578716892320315,0.7963783770097079,0.7555928931784702,0.796150381773947,0.7555438771581088,0.7972911620482322,0.759178632650707
2,5000,0.7935330563129844,0.7551129824372304,0.7970775059297484,0.7527285792572385,0.7967359830546507,0.7524478515463257,0.7966395126138969,0.756319220359678
2,6000,0.7929852776759999,0.7525490026774382,0.7952484474454824,0.7503695753216607,0.7950784132079611,0.7503677929234961,0.7956152082976395,0.7535275392698093
2,7000,0.794956504054517,0.756119591765251,0.7982025041673655,0.7532521587180684,0.7980261618830962,0.7532107179960499,0.7983222918908033,0.7571226363678287
2,8000,0.7934568432535339,0.7538336661192452,0.797015698241178,0.7514773358161916,0.7968076980315735,0.7513458838811067,0.7960694134685949,0.754143803399873
2,9000,0.7970040626682157,0.7576497805894974,0.7987855332059015,0.7550996144509958,0.7984693921009676,0.7548260162973456,0.7999509314900626,0.758347143906916
2,10000,0.7979442987735523,0.7585338500791028,0.8018677081664496,0.7557412777548302,0.8015397301245205,0.7552916678886369,0.8007921348414564,0.7589772216225288
2,11000,0.7985519561040211,0.7579986850302035,0.8021236875460913,0.7555826443181872,0.8019861620475348,0.7553763317660516,0.8009230128897853,0.7586541619907702
2,12000,0.7986842143860736,0.7599570950134775,0.8029131054823838,0.7577678644678973,0.8027922603736795,0.7575152095990927,0.8020896747930555,0.7608540869254408
2,-1,0.7994135319568432,0.7596286881516635,0.8022087183675333,0.7570593611974978,0.8020218401019292,0.7567291719729909,0.8026346812258125,0.7603928913647044
3,1000,0.7985505039929134,0.7592588405681144,0.8023296699449267,0.7569345933969436,0.8023622066009718,0.7570237132696928,0.8013054275981851,0.759643838536062
3,2000,0.7995482191699455,0.759205368623176,0.8026859405513612,0.7565709841358819,0.8024845263367439,0.7562920388231202,0.8021318586127523,0.7596496313300967
3,3000,0.7991070423195897,0.7582027696555826,0.8016352550470427,0.7555585819429662,0.8014268261947898,0.7551838327642736,0.8013136081494014,0.7584429477727118
3,4000,0.7999188836884763,0.7586764419322649,0.802987646214278,0.7561111254802977,0.8026549791861386,0.7556463650525692,0.8024068858366156,0.7591238238715613
3,5000,0.7988075932525881,0.7583533823004922,0.8019498750207454,0.755792967372457,0.8016459824731964,0.7553834613587099,0.8015528810821693,0.7589527136833425
3,6000,0.8003341798460688,0.7585432077405799,0.8032464035902267,0.7563722467405277,0.8028695045742804,0.7557626665682309,0.8027937010871594,0.7590404967573696
3,7000,0.799187592384933,0.7579358555659604,0.8028413548398412,0.7555875459131398,0.8025187078191003,0.7551196665011402,0.8018680475193432,0.7585565756912578
3,8000,0.797725037202641,0.757439012042047,0.802048241301358,0.7548888458326453,0.8017608103042271,0.7544606246736175,0.8005479449399782,0.758037452190282
3,9000,0.7990232649360067,0.7573703896772077,0.8021375332910405,0.754873027155089,0.8018733796679427,0.7545680141630304,0.8016400687760605,0.7579461042843499
3,10000,0.7994934439260372,0.758368978248884,0.8035693504115055,0.75619400688862,0.8032990505007025,0.7559016935896375,0.8022819185772518,0.7589558328445544
3,11000,0.8002954591825011,0.758710753096932,0.8043310859792212,0.7566387152306694,0.8040865016706966,0.7564221538891368,0.8030873114870971,0.7592722085543488
3,12000,0.8003726616196549,0.7588056657991931,0.8044000317617518,0.7566146528909147,0.8041705213966136,0.7563419459362758,0.8031760015719815,0.7593194421057111
3,-1,0.8004926728141455,0.7587192194882135,0.8043340929890026,0.756546030526114,0.8041028559910275,0.7563103085106637,0.8032542493776693,0.7592325501951863
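To use the trained model for duplicate-question detection, you can mean-pool the token embeddings and compare questions by cosine similarity. A minimal sketch with the plain `transformers` API (the example questions are made up; mean pooling mirrors the bert-base-nli-stsb-mean-tokens starting point):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepset/quora_dedup_bert_base")
model = AutoModel.from_pretrained("deepset/quora_dedup_bert_base")

def embed(sentences):
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state          # (batch, seq, hidden)
    mask = enc["attention_mask"].unsqueeze(-1).float()   # ignore padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)          # mean pooling

a, b = embed(["How can I learn Python quickly?",
              "What is the fastest way to learn Python?"])
print(float(torch.nn.functional.cosine_similarity(a, b, dim=0)))
```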
|
{"license": "apache-2.0"}
|
feature-extraction
|
deepset/quora_dedup_bert_base
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"feature-extraction",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #safetensors #bert #feature-extraction #license-apache-2.0 #endpoints_compatible #region-us
|
This language model is trained using sentence_transformers (URL
Started with bert-base-nli-stsb-mean-tokens
Continue training on quora questions deduplication dataset (URL
See train_script.py for script used
Below is the performance over the course of training
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1000,0.5944576426835938,0.6010801382777033,0.5942803776859142,0.5934485776801595,0.5939676679774666,0.593162725602328,0.5905591590826669,0.5921674789994058
0,2000,0.6404080440207146,0.6416811632113405,0.6384419354012121,0.6352050423100778,0.6379917744471867,0.6347884067391001,0.6410544760582826,0.6379252046791412
0,3000,0.6710168301884945,0.6676529324662036,0.6660195209784969,0.6618423144808695,0.6656461098096684,0.6615366331956389,0.6724401903484759,0.666073727723655
0,4000,0.6886373265097949,0.6808948140300153,0.67907655686838,0.6714218133850957,0.6786809551564443,0.6711577956884357,0.6926435869763303,0.68190855298609
0,5000,0.6991409753700026,0.6919630610321864,0.6991041519437052,0.6868961486499775,0.6987076032270729,0.6865385550504007,0.7035518148330993,0.6916275246101342
0,6000,0.7120367327025509,0.6975005265298305,0.7065567493967201,0.6922375503495235,0.7060005509843024,0.6916475765570651,0.7147094303373102,0.6981390706722722
0,7000,0.7254672394728687,0.7130118465900485,0.7261844956277705,0.7086213543110718,0.7257479964972307,0.7079315661881832,0.728729909455115,0.7122743793160531
0,8000,0.7402421930101399,0.7216774208330149,0.7367901914441078,0.7166256588352043,0.7362607046874481,0.7158881916281887,0.7433902441373252,0.7220998491980078
0,9000,0.7381005358120434,0.7197216844469877,0.7343228719349923,0.7139462687943793,0.7345247569255238,0.7145106206467152,0.7421843672419275,0.720686853053079
0,10000,0.7465436564646095,0.7260327107480364,0.7467524239596304,0.7230195666847953,0.7467721566237211,0.7231367593302213,0.749792199122442,0.7263143296580317
0,11000,0.7521805421706547,0.7323771570146701,0.7530672061250105,0.729223203496722,0.7530616532823367,0.7293818369675622,0.7552399002305836,0.7320808333541338
0,12000,0.7579359969644401,0.7340677616737238,0.7570017235719905,0.7305965412825544,0.7570601853520393,0.730718189957289,0.7611254136080384,0.7351501229591327
0,-1,0.7573407371218097,0.7329952035782198,0.755595312163209,0.7291445551777086,0.7557737117990928,0.7295404703700227,0.7607276219361719,0.7342415455980179
1,1000,0.7619907683805341,0.7374667949734767,0.7629820517114324,0.7330364216044966,0.7628369522755882,0.7331912674450544,0.7658583898073758,0.7381503446695727
1,2000,0.7618972640071228,0.7362151058969478,0.764582212425539,0.7335856230046062,0.7643125513700815,0.7334501607097152,0.7652852805583232,0.7369104639809163
1,3000,0.7687362955240467,0.7404674623181671,0.7708304819979073,0.7380959815601529,0.7707835692712482,0.7379796800453193,0.772074854759756,0.7414513460702766
1,4000,0.7685047787908202,0.7403088288815168,0.7703522257474043,0.7379787888808298,0.7701221475099808,0.7377898546753812,0.7713755359045312,0.7409415801952219
1,5000,0.7696438109797803,0.7410393893292365,0.773270389327895,0.7392953127251652,0.7729880866533291,0.7389853982789335,0.7726236305835863,0.7416278035580925
1,6000,0.7749538363837081,0.7436499342062207,0.774879168058157,0.7401827241766746,0.7745754601165837,0.739763415043146,0.7788801166152383,0.7446249060022169
1,7000,0.7794560817870597,0.7480970176267153,0.7803506944510302,0.7453305130502859,0.7799867949176531,0.7447100155494814,0.7828208193123926,0.7486740690324809
1,8000,0.7855844359073243,0.7496742172376921,0.7828816645965887,0.747176409009761,0.7827584875358967,0.7471037762845532,0.7879159073496309,0.7507349669102151
1,9000,0.7844110753729492,0.7507746252693759,0.7847208586489722,0.7485172180290892,0.7846408087474059,0.748491818820158,0.7872061334510225,0.7514470349769437
1,10000,0.7881311227435004,0.7530048509727403,0.7886917756879734,0.7508018068765787,0.7883332502188707,0.7505037008187275,0.7910707228932787,0.7537200382362567
1,11000,0.7883300109606874,0.7513494487126553,0.7879329130497712,0.749818368689255,0.7876525616593218,0.7494872882301785,0.7911454269743292,0.7522843165147303
1,12000,0.7853334933336618,0.7516809747712728,0.7893895316714998,0.749780492728257,0.7890075986655403,0.7494079715118533,0.7885959664070629,0.7523827940133203
1,-1,0.7887529238148887,0.7534076729932393,0.7896864404801204,0.7513080079201105,0.7894077512343298,0.7510009899066772,0.7919617393746149,0.7542173273241598
2,1000,0.7919209063905188,0.7550167329363414,0.7917464066515253,0.7523043685293455,0.7914371703225378,0.7520285423781206,0.7950297421784158,0.7562599556207076
2,2000,0.7924507768792486,0.7542908512484463,0.7934519001953887,0.7517491515010692,0.7931885648751081,0.751521004535999,0.7951637852162545,0.7551495215642072
2,3000,0.7937606244038364,0.755599577136169,0.7933633347508111,0.7527922999916203,0.7931581019714242,0.7527132061436363,0.797275652800117,0.7569827180764233
2,4000,0.7938389298721445,0.7578716892320315,0.7963783770097079,0.7555928931784702,0.796150381773947,0.7555438771581088,0.7972911620482322,0.759178632650707
2,5000,0.7935330563129844,0.7551129824372304,0.7970775059297484,0.7527285792572385,0.7967359830546507,0.7524478515463257,0.7966395126138969,0.756319220359678
2,6000,0.7929852776759999,0.7525490026774382,0.7952484474454824,0.7503695753216607,0.7950784132079611,0.7503677929234961,0.7956152082976395,0.7535275392698093
2,7000,0.794956504054517,0.756119591765251,0.7982025041673655,0.7532521587180684,0.7980261618830962,0.7532107179960499,0.7983222918908033,0.7571226363678287
2,8000,0.7934568432535339,0.7538336661192452,0.797015698241178,0.7514773358161916,0.7968076980315735,0.7513458838811067,0.7960694134685949,0.754143803399873
2,9000,0.7970040626682157,0.7576497805894974,0.7987855332059015,0.7550996144509958,0.7984693921009676,0.7548260162973456,0.7999509314900626,0.758347143906916
2,10000,0.7979442987735523,0.7585338500791028,0.8018677081664496,0.7557412777548302,0.8015397301245205,0.7552916678886369,0.8007921348414564,0.7589772216225288
2,11000,0.7985519561040211,0.7579986850302035,0.8021236875460913,0.7555826443181872,0.8019861620475348,0.7553763317660516,0.8009230128897853,0.7586541619907702
2,12000,0.7986842143860736,0.7599570950134775,0.8029131054823838,0.7577678644678973,0.8027922603736795,0.7575152095990927,0.8020896747930555,0.7608540869254408
2,-1,0.7994135319568432,0.7596286881516635,0.8022087183675333,0.7570593611974978,0.8020218401019292,0.7567291719729909,0.8026346812258125,0.7603928913647044
3,1000,0.7985505039929134,0.7592588405681144,0.8023296699449267,0.7569345933969436,0.8023622066009718,0.7570237132696928,0.8013054275981851,0.759643838536062
3,2000,0.7995482191699455,0.759205368623176,0.8026859405513612,0.7565709841358819,0.8024845263367439,0.7562920388231202,0.8021318586127523,0.7596496313300967
3,3000,0.7991070423195897,0.7582027696555826,0.8016352550470427,0.7555585819429662,0.8014268261947898,0.7551838327642736,0.8013136081494014,0.7584429477727118
3,4000,0.7999188836884763,0.7586764419322649,0.802987646214278,0.7561111254802977,0.8026549791861386,0.7556463650525692,0.8024068858366156,0.7591238238715613
3,5000,0.7988075932525881,0.7583533823004922,0.8019498750207454,0.755792967372457,0.8016459824731964,0.7553834613587099,0.8015528810821693,0.7589527136833425
3,6000,0.8003341798460688,0.7585432077405799,0.8032464035902267,0.7563722467405277,0.8028695045742804,0.7557626665682309,0.8027937010871594,0.7590404967573696
3,7000,0.799187592384933,0.7579358555659604,0.8028413548398412,0.7555875459131398,0.8025187078191003,0.7551196665011402,0.8018680475193432,0.7585565756912578
3,8000,0.797725037202641,0.757439012042047,0.802048241301358,0.7548888458326453,0.8017608103042271,0.7544606246736175,0.8005479449399782,0.758037452190282
3,9000,0.7990232649360067,0.7573703896772077,0.8021375332910405,0.754873027155089,0.8018733796679427,0.7545680141630304,0.8016400687760605,0.7579461042843499
3,10000,0.7994934439260372,0.758368978248884,0.8035693504115055,0.75619400688862,0.8032990505007025,0.7559016935896375,0.8022819185772518,0.7589558328445544
3,11000,0.8002954591825011,0.758710753096932,0.8043310859792212,0.7566387152306694,0.8040865016706966,0.7564221538891368,0.8030873114870971,0.7592722085543488
3,12000,0.8003726616196549,0.7588056657991931,0.8044000317617518,0.7566146528909147,0.8041705213966136,0.7563419459362758,0.8031760015719815,0.7593194421057111
3,-1,0.8004926728141455,0.7587192194882135,0.8043340929890026,0.756546030526114,0.8041028559910275,0.7563103085106637,0.8032542493776693,0.7592325501951863
|
[] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #bert #feature-extraction #license-apache-2.0 #endpoints_compatible #region-us \n"
] |
[
45
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #feature-extraction #license-apache-2.0 #endpoints_compatible #region-us \n"
] |
[ …768-dimensional embedding vector omitted… ] |
null | null |
transformers
|
# roberta-base-squad2 for QA on COVID-19
## Overview
**Language model:** deepset/roberta-base-squad2
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** [SQuAD-style CORD-19 annotations from 23rd April](https://github.com/deepset-ai/COVID-QA/blob/master/data/question-answering/200423_covidQA.json)
**Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/01_basic_qa_pipeline)
**Infrastructure**: Tesla V100
## Hyperparameters
```
batch_size = 24
n_epochs = 3
base_LM_model = "deepset/roberta-base-squad2"
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.1
doc_stride = 128
xval_folds = 5
dev_split = 0
no_ans_boost = -100
```
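For readers who want to reproduce a comparable fine-tuning run, the sketch below maps these hyperparameters onto Haystack's `FARMReader.train()`. It is a minimal illustration under stated assumptions, not the exact training script: the data paths are placeholders, and `xval_folds` / `no_ans_boost` are handled by FARM's cross-validation and prediction-head settings rather than by this single call.
```python
# Minimal sketch (Haystack v1.x / FARM): fine-tune the base model on the
# CORD-19 SQuAD-style annotations with the hyperparameters listed above.
from haystack.nodes import FARMReader

reader = FARMReader(
    model_name_or_path="deepset/roberta-base-squad2",
    max_seq_len=384,
    doc_stride=128,
)
reader.train(
    data_dir="data/covid-qa",              # placeholder path
    train_filename="200423_covidQA.json",  # the annotations linked above
    n_epochs=3,
    batch_size=24,
    learning_rate=3e-5,
    warmup_proportion=0.1,
    dev_split=0,
    save_dir="roberta-base-squad2-covid",
)
```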
## Performance
5-fold cross-validation on the dataset led to the following results:
**Single EM-Scores:** [0.222, 0.123, 0.234, 0.159, 0.158]
**Single F1-Scores:** [0.476, 0.493, 0.599, 0.461, 0.465]
**Single top_3_recall Scores:** [0.827, 0.776, 0.860, 0.771, 0.777]
**XVAL EM:** 0.17890995260663506
**XVAL F1:** 0.49925444207319924
**XVAL top_3_recall:** 0.8021327014218009
This model is the one obtained from the **third** fold of the cross-validation.
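As a quick sanity check, the plain (macro) mean of the per-fold scores is close to, but not identical with, the reported XVAL numbers, which is consistent with the XVAL figures being pooled over all examples across folds of slightly different sizes:
```python
# Macro-average the per-fold scores and compare with the reported XVAL values.
em = [0.222, 0.123, 0.234, 0.159, 0.158]
f1 = [0.476, 0.493, 0.599, 0.461, 0.465]
top3 = [0.827, 0.776, 0.860, 0.771, 0.777]

for name, scores in [("EM", em), ("F1", f1), ("top_3_recall", top3)]:
    print(f"macro-mean {name}: {sum(scores) / len(scores):.4f}")
# macro-mean EM: 0.1792            (reported XVAL EM: 0.1789...)
# macro-mean F1: 0.4988            (reported XVAL F1: 0.4993...)
# macro-mean top_3_recall: 0.8022  (reported:         0.8021...)
```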
## Usage
### In Haystack
For doing QA at scale (i.e., over many documents instead of a single paragraph), you can also load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
from haystack.nodes import FARMReader, TransformersReader  # Haystack v1.x import path

reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2-covid")
# or
reader = TransformersReader(model_name_or_path="deepset/roberta-base-squad2-covid",
                            tokenizer="deepset/roberta-base-squad2-covid")
```
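Below is a hedged end-to-end sketch of plugging the reader into a retriever-reader pipeline (Haystack v1.x API); the document store contents, retriever choice, and `top_k` values are illustrative only:
```python
# End-to-end sketch (Haystack v1.x): BM25 retriever + this reader over an
# in-memory store. The sample document and top_k values are placeholders.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

document_store = InMemoryDocumentStore(use_bm25=True)  # use_bm25 requires Haystack >= 1.15
document_store.write_documents(
    [{"content": "SARS-CoV-2 is transmitted mainly via respiratory droplets."}]
)

retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2-covid")
pipe = ExtractiveQAPipeline(reader=reader, retriever=retriever)

prediction = pipe.run(
    query="How is SARS-CoV-2 transmitted?",
    params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 3}},
)
print(prediction["answers"][0].answer)
```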
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/roberta-base-squad2-covid"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
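One detail worth noting: the model was trained with `no_ans_boost = -100`, so it is strongly biased toward always returning an answer span. If you want SQuAD-2.0-style "no answer" predictions at inference time, the `transformers` pipeline lets you opt in:
```python
# Optional: allow "no answer" predictions. Given no_ans_boost = -100 during
# training, the model will still rarely abstain, but the option is available.
res = nlp(QA_input, handle_impossible_answer=True)
print(res)
```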
## Authors
**Branden Chan:** [email protected]
**Timo Möller:** [email protected]
**Malte Pietsch:** [email protected]
**Tanay Soni:** [email protected]
**Bogdan Kostić:** [email protected]
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems that use question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
|
{"language": "en", "license": "cc-by-4.0", "datasets": ["squad_v2"]}
|
question-answering
|
deepset/roberta-base-squad2-covid
|
[
"transformers",
"pytorch",
"jax",
"safetensors",
"roberta",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #safetensors #roberta #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #endpoints_compatible #has_space #region-us
|
# roberta-base-squad2 for QA on COVID-19
## Overview
Language model: deepset/roberta-base-squad2
Language: English
Downstream-task: Extractive QA
Training data: SQuAD-style CORD-19 annotations from 23rd April
Code: See an example QA pipeline on Haystack
Infrastructure: Tesla v100
## Hyperparameters
---
license: cc-by-4.0
---
## Performance
5-fold cross-validation on the data set led to the following results:
Single EM-Scores: [0.222, 0.123, 0.234, 0.159, 0.158]
Single F1-Scores: [0.476, 0.493, 0.599, 0.461, 0.465]
Single top\\_3\\_recall Scores: [0.827, 0.776, 0.860, 0.771, 0.777]
XVAL EM: 0.17890995260663506
XVAL f1: 0.49925444207319924
XVAL top\\_3\\_recall: 0.8021327014218009
This model is the model obtained from the third fold of the cross-validation.
## Usage
### In Haystack
For doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in haystack:
### In Transformers
## Authors
Branden Chan: URL@URL
Timo Möller: timo.moeller@URL
Malte Pietsch: malte.pietsch@URL
Tanay Soni: URL@URL
Bogdan Kostić: URL@URL
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="URL class="w-40"/>
</div>
</div>
deepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT (aka "bert-base-german-cased")
- GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="URL repo and <strong><a href="URL">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="URL community open to everyone!</a></strong></p>
Twitter | LinkedIn | Discord | GitHub Discussions | Website
By the way: we're hiring!
|
[
"# roberta-base-squad2 for QA on COVID-19",
"## Overview\nLanguage model: deepset/roberta-base-squad2 \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD-style CORD-19 annotations from 23rd April \nCode: See an example QA pipeline on Haystack \nInfrastructure: Tesla v100",
"## Hyperparameters\n\n---\nlicense: cc-by-4.0\n---",
"## Performance\n5-fold cross-validation on the data set led to the following results: \n\nSingle EM-Scores: [0.222, 0.123, 0.234, 0.159, 0.158] \nSingle F1-Scores: [0.476, 0.493, 0.599, 0.461, 0.465] \nSingle top\\\\_3\\\\_recall Scores: [0.827, 0.776, 0.860, 0.771, 0.777] \nXVAL EM: 0.17890995260663506 \nXVAL f1: 0.49925444207319924 \nXVAL top\\\\_3\\\\_recall: 0.8021327014218009\n\nThis model is the model obtained from the third fold of the cross-validation.",
"## Usage",
"### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in haystack:",
"### In Transformers",
"## Authors\nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL \nBogdan Kostić: URL@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
"TAGS\n#transformers #pytorch #jax #safetensors #roberta #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #endpoints_compatible #has_space #region-us \n",
"# roberta-base-squad2 for QA on COVID-19",
"## Overview\nLanguage model: deepset/roberta-base-squad2 \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD-style CORD-19 annotations from 23rd April \nCode: See an example QA pipeline on Haystack \nInfrastructure: Tesla v100",
"## Hyperparameters\n\n---\nlicense: cc-by-4.0\n---",
"## Performance\n5-fold cross-validation on the data set led to the following results: \n\nSingle EM-Scores: [0.222, 0.123, 0.234, 0.159, 0.158] \nSingle F1-Scores: [0.476, 0.493, 0.599, 0.461, 0.465] \nSingle top\\\\_3\\\\_recall Scores: [0.827, 0.776, 0.860, 0.771, 0.777] \nXVAL EM: 0.17890995260663506 \nXVAL f1: 0.49925444207319924 \nXVAL top\\\\_3\\\\_recall: 0.8021327014218009\n\nThis model is the model obtained from the third fold of the cross-validation.",
"## Usage",
"### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in haystack:",
"### In Transformers",
"## Authors\nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL \nBogdan Kostić: URL@URL",
"## About us\n<div class=\"grid lg:grid-cols-2 gap-x-4 gap-y-3\">\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n <div class=\"w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center\">\n <img alt=\"\" src=\"URL class=\"w-40\"/>\n </div>\n</div>\n\ndeepset is the company behind the open-source NLP framework Haystack which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.\n\n\nSome of our other work: \n- Distilled roberta-base-squad2 (aka \"tinyroberta-squad2\")\n- German BERT (aka \"bert-base-german-cased\")\n- GermanQuAD and GermanDPR datasets and models (aka \"gelectra-base-germanquad\", \"gbert-base-germandpr\")",
"## Get in touch and join the Haystack community\n\n<p>For more info on Haystack, visit our <strong><a href=\"URL repo and <strong><a href=\"URL\">Documentation</a></strong>. \n\nWe also have a <strong><a class=\"h-7\" href=\"URL community open to everyone!</a></strong></p>\n\nTwitter | LinkedIn | Discord | GitHub Discussions | Website\n\nBy the way: we're hiring!"
] |
[
62,
16,
68,
15,
162,
3,
36,
6,
48,
251,
113
] |
[
"passage: TAGS\n#transformers #pytorch #jax #safetensors #roberta #question-answering #en #dataset-squad_v2 #license-cc-by-4.0 #endpoints_compatible #has_space #region-us \n# roberta-base-squad2 for QA on COVID-19## Overview\nLanguage model: deepset/roberta-base-squad2 \nLanguage: English \nDownstream-task: Extractive QA \nTraining data: SQuAD-style CORD-19 annotations from 23rd April \nCode: See an example QA pipeline on Haystack \nInfrastructure: Tesla v100## Hyperparameters\n\n---\nlicense: cc-by-4.0\n---## Performance\n5-fold cross-validation on the data set led to the following results: \n\nSingle EM-Scores: [0.222, 0.123, 0.234, 0.159, 0.158] \nSingle F1-Scores: [0.476, 0.493, 0.599, 0.461, 0.465] \nSingle top\\\\_3\\\\_recall Scores: [0.827, 0.776, 0.860, 0.771, 0.777] \nXVAL EM: 0.17890995260663506 \nXVAL f1: 0.49925444207319924 \nXVAL top\\\\_3\\\\_recall: 0.8021327014218009\n\nThis model is the model obtained from the third fold of the cross-validation.## Usage### In Haystack\nFor doing QA at scale (i.e. many docs instead of single paragraph), you can load the model also in haystack:### In Transformers## Authors\nBranden Chan: URL@URL \nTimo Möller: timo.moeller@URL \nMalte Pietsch: malte.pietsch@URL \nTanay Soni: URL@URL \nBogdan Kostić: URL@URL"
] |
[ …768-dimensional embedding vector omitted… ] |